Oct 08 12:55:49 crc systemd[1]: Starting Kubernetes Kubelet...
Oct 08 12:55:49 crc restorecon[4724]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 08 12:55:49 crc restorecon[4724]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc 
restorecon[4724]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 12:55:49 crc 
restorecon[4724]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 08 
12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 08 12:55:49 crc 
restorecon[4724]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 08 12:55:49 crc 
restorecon[4724]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49
crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 
12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 08 12:55:49 crc 
restorecon[4724]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc 
restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 12:55:49 crc restorecon[4724]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc 
restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 
08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 
crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc 
restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc 
restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc 
restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:49 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 12:55:50 crc restorecon[4724]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 12:55:50 crc 
restorecon[4724]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 12:55:50 crc restorecon[4724]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 08 12:55:50 crc restorecon[4724]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 08 12:55:50 crc restorecon[4724]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Oct 08 12:55:50 crc kubenswrapper[4735]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 08 12:55:50 crc kubenswrapper[4735]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Oct 08 12:55:50 crc kubenswrapper[4735]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 08 12:55:50 crc kubenswrapper[4735]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Oct 08 12:55:50 crc kubenswrapper[4735]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Oct 08 12:55:50 crc kubenswrapper[4735]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.621469 4735 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627234 4735 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627267 4735 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627277 4735 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627287 4735 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627297 4735 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627308 4735 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627317 4735 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627326 4735 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627334 4735 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 
12:55:50.627342 4735 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627351 4735 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627359 4735 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627367 4735 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627375 4735 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627383 4735 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627398 4735 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627407 4735 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627416 4735 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627424 4735 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627432 4735 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627440 4735 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627448 4735 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627456 4735 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627464 4735 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer 
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627472 4735 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627508 4735 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627519 4735 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627528 4735 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627537 4735 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627546 4735 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627554 4735 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627563 4735 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627571 4735 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627578 4735 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627586 4735 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627597 4735 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627608 4735 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627618 4735 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627627 4735 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627636 4735 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627645 4735 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627654 4735 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627663 4735 feature_gate.go:330] unrecognized feature gate: Example Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627675 4735 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627686 4735 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627694 4735 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627705 4735 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627717 4735 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627726 4735 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627735 4735 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627744 4735 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627754 4735 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627763 4735 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627771 4735 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627779 4735 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627788 4735 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627797 4735 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627806 4735 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627814 4735 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627822 4735 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627829 4735 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627837 4735 feature_gate.go:330] 
unrecognized feature gate: RouteAdvertisements Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627849 4735 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627857 4735 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627864 4735 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627872 4735 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627880 4735 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627889 4735 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627897 4735 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627904 4735 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.627912 4735 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.628890 4735 flags.go:64] FLAG: --address="0.0.0.0" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.628913 4735 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.628930 4735 flags.go:64] FLAG: --anonymous-auth="true" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.628942 4735 flags.go:64] FLAG: --application-metrics-count-limit="100" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.628953 4735 flags.go:64] FLAG: --authentication-token-webhook="false" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.628962 4735 flags.go:64] FLAG: 
--authentication-token-webhook-cache-ttl="2m0s" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.628973 4735 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.628984 4735 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.628994 4735 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629004 4735 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629014 4735 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629023 4735 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629033 4735 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629043 4735 flags.go:64] FLAG: --cgroup-root="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629052 4735 flags.go:64] FLAG: --cgroups-per-qos="true" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629060 4735 flags.go:64] FLAG: --client-ca-file="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629070 4735 flags.go:64] FLAG: --cloud-config="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629079 4735 flags.go:64] FLAG: --cloud-provider="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629088 4735 flags.go:64] FLAG: --cluster-dns="[]" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629099 4735 flags.go:64] FLAG: --cluster-domain="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629108 4735 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629118 4735 flags.go:64] FLAG: --config-dir="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629126 
4735 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629137 4735 flags.go:64] FLAG: --container-log-max-files="5" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629149 4735 flags.go:64] FLAG: --container-log-max-size="10Mi" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629160 4735 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629169 4735 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629179 4735 flags.go:64] FLAG: --containerd-namespace="k8s.io" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629188 4735 flags.go:64] FLAG: --contention-profiling="false" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629197 4735 flags.go:64] FLAG: --cpu-cfs-quota="true" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629206 4735 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629216 4735 flags.go:64] FLAG: --cpu-manager-policy="none" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629225 4735 flags.go:64] FLAG: --cpu-manager-policy-options="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629237 4735 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629246 4735 flags.go:64] FLAG: --enable-controller-attach-detach="true" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629255 4735 flags.go:64] FLAG: --enable-debugging-handlers="true" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629264 4735 flags.go:64] FLAG: --enable-load-reader="false" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629273 4735 flags.go:64] FLAG: --enable-server="true" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629284 4735 flags.go:64] FLAG: 
--enforce-node-allocatable="[pods]" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629297 4735 flags.go:64] FLAG: --event-burst="100" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629306 4735 flags.go:64] FLAG: --event-qps="50" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629316 4735 flags.go:64] FLAG: --event-storage-age-limit="default=0" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629325 4735 flags.go:64] FLAG: --event-storage-event-limit="default=0" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629335 4735 flags.go:64] FLAG: --eviction-hard="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629345 4735 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629354 4735 flags.go:64] FLAG: --eviction-minimum-reclaim="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629362 4735 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629372 4735 flags.go:64] FLAG: --eviction-soft="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629381 4735 flags.go:64] FLAG: --eviction-soft-grace-period="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629391 4735 flags.go:64] FLAG: --exit-on-lock-contention="false" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629399 4735 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629408 4735 flags.go:64] FLAG: --experimental-mounter-path="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629417 4735 flags.go:64] FLAG: --fail-cgroupv1="false" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629426 4735 flags.go:64] FLAG: --fail-swap-on="true" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629440 4735 flags.go:64] FLAG: --feature-gates="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629451 4735 flags.go:64] FLAG: 
--file-check-frequency="20s" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629459 4735 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629470 4735 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629507 4735 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629516 4735 flags.go:64] FLAG: --healthz-port="10248" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629525 4735 flags.go:64] FLAG: --help="false" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629534 4735 flags.go:64] FLAG: --hostname-override="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629542 4735 flags.go:64] FLAG: --housekeeping-interval="10s" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629551 4735 flags.go:64] FLAG: --http-check-frequency="20s" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629561 4735 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629569 4735 flags.go:64] FLAG: --image-credential-provider-config="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629577 4735 flags.go:64] FLAG: --image-gc-high-threshold="85" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629586 4735 flags.go:64] FLAG: --image-gc-low-threshold="80" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629595 4735 flags.go:64] FLAG: --image-service-endpoint="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629603 4735 flags.go:64] FLAG: --kernel-memcg-notification="false" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629612 4735 flags.go:64] FLAG: --kube-api-burst="100" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629621 4735 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629630 
4735 flags.go:64] FLAG: --kube-api-qps="50" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629639 4735 flags.go:64] FLAG: --kube-reserved="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629648 4735 flags.go:64] FLAG: --kube-reserved-cgroup="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629657 4735 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629666 4735 flags.go:64] FLAG: --kubelet-cgroups="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629674 4735 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629683 4735 flags.go:64] FLAG: --lock-file="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629692 4735 flags.go:64] FLAG: --log-cadvisor-usage="false" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629702 4735 flags.go:64] FLAG: --log-flush-frequency="5s" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629711 4735 flags.go:64] FLAG: --log-json-info-buffer-size="0" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629724 4735 flags.go:64] FLAG: --log-json-split-stream="false" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629846 4735 flags.go:64] FLAG: --log-text-info-buffer-size="0" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629859 4735 flags.go:64] FLAG: --log-text-split-stream="false" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629868 4735 flags.go:64] FLAG: --logging-format="text" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629880 4735 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629890 4735 flags.go:64] FLAG: --make-iptables-util-chains="true" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629899 4735 flags.go:64] FLAG: --manifest-url="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629910 4735 
flags.go:64] FLAG: --manifest-url-header="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629922 4735 flags.go:64] FLAG: --max-housekeeping-interval="15s" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629933 4735 flags.go:64] FLAG: --max-open-files="1000000" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629944 4735 flags.go:64] FLAG: --max-pods="110" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629954 4735 flags.go:64] FLAG: --maximum-dead-containers="-1" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629964 4735 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629974 4735 flags.go:64] FLAG: --memory-manager-policy="None" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629984 4735 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.629993 4735 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.630001 4735 flags.go:64] FLAG: --node-ip="192.168.126.11" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.630010 4735 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.630030 4735 flags.go:64] FLAG: --node-status-max-images="50" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.630040 4735 flags.go:64] FLAG: --node-status-update-frequency="10s" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.630049 4735 flags.go:64] FLAG: --oom-score-adj="-999" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.630057 4735 flags.go:64] FLAG: --pod-cidr="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.630066 4735 flags.go:64] FLAG: 
--pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.630079 4735 flags.go:64] FLAG: --pod-manifest-path="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.630088 4735 flags.go:64] FLAG: --pod-max-pids="-1" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.630097 4735 flags.go:64] FLAG: --pods-per-core="0" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.630106 4735 flags.go:64] FLAG: --port="10250" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.630115 4735 flags.go:64] FLAG: --protect-kernel-defaults="false" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.630123 4735 flags.go:64] FLAG: --provider-id="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.630133 4735 flags.go:64] FLAG: --qos-reserved="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.630143 4735 flags.go:64] FLAG: --read-only-port="10255" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.630151 4735 flags.go:64] FLAG: --register-node="true" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.630160 4735 flags.go:64] FLAG: --register-schedulable="true" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.630168 4735 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.630189 4735 flags.go:64] FLAG: --registry-burst="10" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.630199 4735 flags.go:64] FLAG: --registry-qps="5" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.630211 4735 flags.go:64] FLAG: --reserved-cpus="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.630220 4735 flags.go:64] FLAG: --reserved-memory="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.630232 4735 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 
12:55:50.630241 4735 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.630250 4735 flags.go:64] FLAG: --rotate-certificates="false" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.630259 4735 flags.go:64] FLAG: --rotate-server-certificates="false" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.630268 4735 flags.go:64] FLAG: --runonce="false" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.630277 4735 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.630286 4735 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.630295 4735 flags.go:64] FLAG: --seccomp-default="false" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.630304 4735 flags.go:64] FLAG: --serialize-image-pulls="true" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.630312 4735 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.630321 4735 flags.go:64] FLAG: --storage-driver-db="cadvisor" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.630330 4735 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.630340 4735 flags.go:64] FLAG: --storage-driver-password="root" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.630348 4735 flags.go:64] FLAG: --storage-driver-secure="false" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.630358 4735 flags.go:64] FLAG: --storage-driver-table="stats" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.630367 4735 flags.go:64] FLAG: --storage-driver-user="root" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.630376 4735 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.630385 4735 flags.go:64] FLAG: --sync-frequency="1m0s" Oct 08 
12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.630394 4735 flags.go:64] FLAG: --system-cgroups="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.630403 4735 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.630416 4735 flags.go:64] FLAG: --system-reserved-cgroup="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.630425 4735 flags.go:64] FLAG: --tls-cert-file="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.630434 4735 flags.go:64] FLAG: --tls-cipher-suites="[]" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.630445 4735 flags.go:64] FLAG: --tls-min-version="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.630453 4735 flags.go:64] FLAG: --tls-private-key-file="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.630462 4735 flags.go:64] FLAG: --topology-manager-policy="none" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.630471 4735 flags.go:64] FLAG: --topology-manager-policy-options="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.630504 4735 flags.go:64] FLAG: --topology-manager-scope="container" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.630513 4735 flags.go:64] FLAG: --v="2" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.630525 4735 flags.go:64] FLAG: --version="false" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.630539 4735 flags.go:64] FLAG: --vmodule="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.630550 4735 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.630559 4735 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.630766 4735 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.630778 4735 feature_gate.go:351] Setting deprecated 
feature gate KMSv1=true. It will be removed in a future release. Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.630788 4735 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.630798 4735 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.630807 4735 feature_gate.go:330] unrecognized feature gate: Example Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.630815 4735 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.630827 4735 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.630837 4735 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.630847 4735 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.630857 4735 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.630866 4735 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.630875 4735 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.630882 4735 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.630891 4735 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.630899 4735 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.630907 4735 feature_gate.go:330] unrecognized feature 
gate: AdditionalRoutingCapabilities Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.630915 4735 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.630923 4735 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.630930 4735 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.630938 4735 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.630946 4735 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.630954 4735 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.630961 4735 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.630970 4735 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.630978 4735 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.630985 4735 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.630993 4735 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.631001 4735 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.631009 4735 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.631022 4735 feature_gate.go:353] Setting GA feature gate 
DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.631032 4735 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.631042 4735 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.631050 4735 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.631058 4735 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.631066 4735 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.631074 4735 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.631082 4735 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.631090 4735 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.631097 4735 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.631106 4735 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.631113 4735 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.631122 4735 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.631129 4735 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.631137 4735 feature_gate.go:330] unrecognized feature gate: 
NutanixMultiSubnets Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.631144 4735 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.631152 4735 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.631160 4735 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.631168 4735 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.631176 4735 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.631183 4735 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.631192 4735 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.631199 4735 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.631210 4735 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.631219 4735 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.631227 4735 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.631235 4735 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.631243 4735 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.631252 4735 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.631261 4735 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.631269 4735 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.631277 4735 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.631284 4735 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.631292 4735 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.631300 4735 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.631307 4735 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.631317 4735 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.631325 4735 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.631333 4735 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.631340 4735 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.631348 4735 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.631356 4735 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.631381 4735 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.646003 4735 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.646170 4735 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.646407 4735 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.646436 4735 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.646450 4735 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.646463 4735 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.646502 4735 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.646515 4735 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.646536 4735 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.646550 4735 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.646565 4735 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.646579 4735 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.646591 4735 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.646599 4735 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.646663 4735 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.646674 4735 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647007 4735 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647036 4735 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647046 4735 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647055 4735 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647067 4735 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647077 4735 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647086 4735 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647094 4735 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647102 4735 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647111 4735 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647118 4735 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647126 4735 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647134 4735 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647142 4735 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647150 4735 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647159 4735 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647167 4735 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647176 4735 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647186 4735 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647197 4735 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647206 4735 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647215 4735 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647222 4735 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647230 4735 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647238 4735 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647247 4735 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647254 4735 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647262 4735 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647270 4735 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647278 4735 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647286 4735 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647297 4735 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647309 4735 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647318 4735 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647326 4735 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647345 4735 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647355 4735 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647364 4735 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647373 4735 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647382 4735 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647391 4735 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647400 4735 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647408 4735 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647416 4735 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647424 4735 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647432 4735 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647440 4735 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647448 4735 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647455 4735 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647463 4735 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647471 4735 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647523 4735 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647537 4735 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647547 4735 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647557 4735 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647567 4735 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647576 4735 feature_gate.go:330] unrecognized feature gate: Example
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.647593 4735 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647844 4735 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647857 4735 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647867 4735 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647875 4735 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647884 4735 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647892 4735 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647900 4735 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647908 4735 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647916 4735 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647924 4735 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647932 4735 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647941 4735 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647948 4735 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647956 4735 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647964 4735 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647972 4735 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647980 4735 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647988 4735 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.647995 4735 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.648003 4735 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.648012 4735 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.648020 4735 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.648028 4735 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.648036 4735 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.648043 4735 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.648051 4735 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.648059 4735 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.648068 4735 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.648076 4735 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.648084 4735 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.648091 4735 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.648100 4735 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.648108 4735 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.648115 4735 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.648126 4735 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.648135 4735 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.648145 4735 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.648153 4735 feature_gate.go:330] unrecognized feature gate: Example
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.648163 4735 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.648171 4735 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.648179 4735 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.648189 4735 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.648200 4735 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.648210 4735 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.648219 4735 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.648228 4735 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.648237 4735 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.648245 4735 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.648253 4735 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.648261 4735 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.648268 4735 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.648277 4735 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.648284 4735 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.648292 4735 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.648302 4735 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.648310 4735 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.648318 4735 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.648326 4735 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.648336 4735 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.648344 4735 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.648352 4735 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.648360 4735 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.648368 4735 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.648376 4735 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.648384 4735 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.648392 4735 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.648402 4735 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.648412 4735 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.648421 4735 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.648430 4735 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.648439 4735 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.648451 4735 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.649965 4735 server.go:940] "Client rotation is on, will bootstrap in background"
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.656767 4735 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.656897 4735 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.658665 4735 server.go:997] "Starting client certificate rotation"
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.658715 4735 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.659009 4735 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-10 22:41:27.820708914 +0000 UTC
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.659092 4735 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 801h45m37.161621817s for next certificate rotation
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.687116 4735 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.690195 4735 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.713558 4735 log.go:25] "Validated CRI v1 runtime API"
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.748980 4735 log.go:25] "Validated CRI v1 image API"
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.753021 4735 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.759977 4735 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-10-08-12-51-16-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.760142 4735 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:44 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:42 fsType:tmpfs blockSize:0}]
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.784335 4735 manager.go:217] Machine: {Timestamp:2025-10-08 12:55:50.780187626 +0000 UTC m=+0.588492250 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:14046600-b993-4087-9f7d-570e1efe09a8 BootID:0d020dc1-103d-4051-9317-7ecb0eb982ec Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:44 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:42 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:ee:dd:70 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:ee:dd:70 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:44:37:6f Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:db:f6:3d Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:8b:45:8d Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:45:ce:7a Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:73:01:eb Speed:-1 Mtu:1496} {Name:eth10 MacAddress:3a:31:ab:29:85:8b Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:ae:fa:20:92:91:95 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.784983 4735 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.785284 4735 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.787101 4735 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.787348 4735 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.787396 4735 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.787673 4735 topology_manager.go:138] "Creating topology manager with none policy"
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.787685 4735 container_manager_linux.go:303] "Creating device plugin manager"
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.788158 4735 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.788183 4735 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.788524 4735 state_mem.go:36] "Initialized new in-memory state store"
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.788781 4735 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.795051 4735 kubelet.go:418] "Attempting to sync node with API server"
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.795074 4735 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.795100 4735 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.795117 4735 kubelet.go:324] "Adding apiserver pod source"
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.795130 4735 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.799240 4735 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.800753 4735 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.803029 4735 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.803841 4735 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused
Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.803916 4735 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused
Oct 08 12:55:50 crc kubenswrapper[4735]: E1008 12:55:50.803962 4735 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.194:6443: connect: connection refused" logger="UnhandledError"
Oct 08 12:55:50 crc kubenswrapper[4735]: E1008 12:55:50.804003 4735 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.194:6443: connect: connection refused" logger="UnhandledError"
Oct 08 12:55:50
crc kubenswrapper[4735]: I1008 12:55:50.804612 4735 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.804661 4735 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.804677 4735 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.804691 4735 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.804713 4735 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.804727 4735 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.804742 4735 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.804764 4735 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.804780 4735 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.804795 4735 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.804832 4735 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.804851 4735 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.807514 4735 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.808692 4735 server.go:1280] "Started kubelet" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 
12:55:50.811069 4735 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.811507 4735 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Oct 08 12:55:50 crc systemd[1]: Started Kubernetes Kubelet. Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.811522 4735 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.811663 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.811944 4735 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.811995 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 07:26:51.237288583 +0000 UTC Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.812066 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 954h31m0.425245722s for next certificate rotation Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.812228 4735 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Oct 08 12:55:50 crc kubenswrapper[4735]: E1008 12:55:50.812720 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.813318 4735 volume_manager.go:287] "The desired_state_of_world populator starts" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.813366 4735 volume_manager.go:289] "Starting Kubelet Volume Manager" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 
12:55:50.813530 4735 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Oct 08 12:55:50 crc kubenswrapper[4735]: E1008 12:55:50.814254 4735 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" interval="200ms" Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.814423 4735 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused Oct 08 12:55:50 crc kubenswrapper[4735]: E1008 12:55:50.814529 4735 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.194:6443: connect: connection refused" logger="UnhandledError" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.815681 4735 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.815706 4735 factory.go:55] Registering systemd factory Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.815717 4735 factory.go:221] Registration of the systemd container factory successfully Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.816135 4735 factory.go:153] Registering CRI-O factory Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.816183 4735 factory.go:221] Registration of the crio container factory successfully Oct 08 12:55:50 crc 
kubenswrapper[4735]: I1008 12:55:50.816213 4735 factory.go:103] Registering Raw factory Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.816249 4735 manager.go:1196] Started watching for new ooms in manager Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.823416 4735 manager.go:319] Starting recovery of all containers Oct 08 12:55:50 crc kubenswrapper[4735]: E1008 12:55:50.822086 4735 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.194:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186c85542328b7c1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-08 12:55:50.808639425 +0000 UTC m=+0.616943999,LastTimestamp:2025-10-08 12:55:50.808639425 +0000 UTC m=+0.616943999,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.828265 4735 server.go:460] "Adding debug handlers to kubelet server" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.836244 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.836333 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Oct 08 12:55:50 crc 
kubenswrapper[4735]: I1008 12:55:50.836358 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.836379 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.836400 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.836419 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.836441 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.836460 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.836512 4735 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.836534 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.836555 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.837704 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.837727 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.837756 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.837778 4735 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.837800 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.837824 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.837843 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.837896 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.837917 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.837936 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" 
volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.837954 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.837973 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.837991 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.838012 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.838029 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.838051 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" 
volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.838079 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.838109 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.838157 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.838178 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.838196 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.838216 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Oct 08 
12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.838234 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.838254 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.838274 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.838295 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.838314 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.838335 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.838354 4735 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.838373 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.838392 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.838412 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.838432 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.838453 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.838472 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.838519 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.838585 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.838654 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.838675 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.838698 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.838719 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.838746 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.838770 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.838792 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.838815 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.838835 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.838856 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" 
volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.838877 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.838900 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.838920 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.838939 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.838959 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.838980 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.839012 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.839032 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.839052 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.839071 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.839089 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.839113 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.839140 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.839167 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.839185 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.839206 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.839226 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.839244 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.839265 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.839283 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.839303 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.839330 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.839350 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.839369 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.839392 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.839411 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.839429 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.839449 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.839467 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.839515 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.839535 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.839554 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.839572 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.839591 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.839611 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.839630 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.839651 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.839670 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.839691 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.839710 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.839729 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.839748 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.839768 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.839787 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.839805 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.839824 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.839914 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.839938 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.839961 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.839983 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.840004 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.840027 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.840049 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.840070 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.840092 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.840112 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.840135 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.840162 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.840188 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.840209 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.840228 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.840247 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.840266 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.840287 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.840306 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.840325 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.840343 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.840361 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.840420 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.840441 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.840462 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.840508 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.840530 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.840550 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.840568 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.840635 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.840659 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.840679 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.840699 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.840721 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.840740 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.840759 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.840784 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.840804 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.840823 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.840845 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.840865 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.840885 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.840906 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.840927 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.840948 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.840973 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.840993 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.841017 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.841036 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.841054 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.841074 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.841095 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.841115 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.841137 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.841156 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.841185 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.841214 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.841239 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.841266 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.841287 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.841307 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.841329 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.841349 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.841370 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.841393 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.841413 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.841434 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.841468 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.841516 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.841536 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.841557 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.841578 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.841602 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.841623 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.841643 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.841664 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.841684 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.841704 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.841725 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.841745 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.841766 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.841787 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.841807 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.841826 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod=""
podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.841845 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.841867 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.841908 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.841945 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.841991 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.842020 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.842043 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.842064 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.842087 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.842107 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.842130 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.842149 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.842170 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.842189 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.842212 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.842251 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.842282 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.842303 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.845504 4735 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.845574 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.845596 4735 reconstruct.go:97] "Volume reconstruction finished" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.845604 4735 reconciler.go:26] "Reconciler: start to sync state" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.855149 4735 manager.go:324] Recovery completed Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.867568 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.870189 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.870245 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.870259 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.880186 4735 cpu_manager.go:225] 
"Starting CPU manager" policy="none" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.880239 4735 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.880280 4735 state_mem.go:36] "Initialized new in-memory state store" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.891109 4735 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.893248 4735 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.893335 4735 status_manager.go:217] "Starting to sync pod status with apiserver" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.893372 4735 kubelet.go:2335] "Starting kubelet main sync loop" Oct 08 12:55:50 crc kubenswrapper[4735]: E1008 12:55:50.893444 4735 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Oct 08 12:55:50 crc kubenswrapper[4735]: W1008 12:55:50.895141 4735 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused Oct 08 12:55:50 crc kubenswrapper[4735]: E1008 12:55:50.895231 4735 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.194:6443: connect: connection refused" logger="UnhandledError" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.903208 4735 policy_none.go:49] "None policy: Start" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.904282 4735 memory_manager.go:170] "Starting 
memorymanager" policy="None" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.904332 4735 state_mem.go:35] "Initializing new in-memory state store" Oct 08 12:55:50 crc kubenswrapper[4735]: E1008 12:55:50.913572 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.978272 4735 manager.go:334] "Starting Device Plugin manager" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.978362 4735 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.978381 4735 server.go:79] "Starting device plugin registration server" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.979088 4735 eviction_manager.go:189] "Eviction manager: starting control loop" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.979115 4735 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.979455 4735 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.979700 4735 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.979708 4735 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Oct 08 12:55:50 crc kubenswrapper[4735]: E1008 12:55:50.993515 4735 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.993605 4735 kubelet.go:2421] "SyncLoop ADD" source="file" 
pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.993681 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.995071 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.995117 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.995131 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.995253 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.995518 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.995568 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.995855 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.995878 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.995886 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.996007 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.996163 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.996203 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.996752 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.996782 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.996793 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.996866 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.996922 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.996972 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.996987 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.997125 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.997167 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.997160 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.997243 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.997263 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.997590 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.997627 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.997639 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.997756 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.997832 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.997856 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.997871 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:55:50 crc kubenswrapper[4735]: 
I1008 12:55:50.998027 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.998108 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.998363 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.998380 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.998392 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.998550 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.998580 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.999268 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.999309 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.999320 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.999512 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.999551 4735 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:55:50 crc kubenswrapper[4735]: I1008 12:55:50.999568 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:55:51 crc kubenswrapper[4735]: E1008 12:55:51.015039 4735 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" interval="400ms" Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.051573 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.051611 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.051627 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.051643 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.051659 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.051679 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.051695 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.051725 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.051760 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 12:55:51 
crc kubenswrapper[4735]: I1008 12:55:51.051797 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.051820 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.051852 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.051868 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.051888 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.051907 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.080321 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.082307 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.082369 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.082393 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.082439 4735 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 08 12:55:51 crc kubenswrapper[4735]: E1008 12:55:51.083332 4735 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.194:6443: connect: connection refused" node="crc" Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.152812 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.152875 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.152899 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.152921 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.152942 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.152966 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.152986 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 12:55:51 crc 
kubenswrapper[4735]: I1008 12:55:51.153007 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.153029 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.153051 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.153073 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.153094 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.153095 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.153146 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.153146 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.153184 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.153112 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.153220 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.153234 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.153283 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.153282 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.153311 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.153317 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.153286 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.153291 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.153198 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.153224 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.153352 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.153253 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.153250 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.283736 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.286516 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.286583 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.286607 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.286650 4735 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 08 12:55:51 crc kubenswrapper[4735]: E1008 12:55:51.287396 4735 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.194:6443: connect: connection refused" node="crc" Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.335320 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.342365 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.362293 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 12:55:51 crc kubenswrapper[4735]: W1008 12:55:51.388519 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-c5356d3d1e1cd2533d3e8bf304a1abf7c1975338ad560208590af94fa6a974f9 WatchSource:0}: Error finding container c5356d3d1e1cd2533d3e8bf304a1abf7c1975338ad560208590af94fa6a974f9: Status 404 returned error can't find the container with id c5356d3d1e1cd2533d3e8bf304a1abf7c1975338ad560208590af94fa6a974f9 Oct 08 12:55:51 crc kubenswrapper[4735]: W1008 12:55:51.390922 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-688fa625af715d3ff82a753c6b605d6ce728ac83afb328de19510992dc5f31ca WatchSource:0}: Error finding container 688fa625af715d3ff82a753c6b605d6ce728ac83afb328de19510992dc5f31ca: Status 404 returned error can't find the container with id 688fa625af715d3ff82a753c6b605d6ce728ac83afb328de19510992dc5f31ca Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.393919 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 08 12:55:51 crc kubenswrapper[4735]: W1008 12:55:51.394895 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-86bca132b23754f21f19253d2a780f34f326494b286ab95ea66f699a2da72847 WatchSource:0}: Error finding container 86bca132b23754f21f19253d2a780f34f326494b286ab95ea66f699a2da72847: Status 404 returned error can't find the container with id 86bca132b23754f21f19253d2a780f34f326494b286ab95ea66f699a2da72847 Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.405859 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 08 12:55:51 crc kubenswrapper[4735]: W1008 12:55:51.413591 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-6e150e39d6a65472debb215cbffc66549584a3eb6b3fa44114f473983f8927fe WatchSource:0}: Error finding container 6e150e39d6a65472debb215cbffc66549584a3eb6b3fa44114f473983f8927fe: Status 404 returned error can't find the container with id 6e150e39d6a65472debb215cbffc66549584a3eb6b3fa44114f473983f8927fe Oct 08 12:55:51 crc kubenswrapper[4735]: E1008 12:55:51.415893 4735 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" interval="800ms" Oct 08 12:55:51 crc kubenswrapper[4735]: W1008 12:55:51.428106 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-9c0e173302f9a52201d033cd92ec344ed4357f12af9ab41e569db7466d177b8b 
WatchSource:0}: Error finding container 9c0e173302f9a52201d033cd92ec344ed4357f12af9ab41e569db7466d177b8b: Status 404 returned error can't find the container with id 9c0e173302f9a52201d033cd92ec344ed4357f12af9ab41e569db7466d177b8b Oct 08 12:55:51 crc kubenswrapper[4735]: W1008 12:55:51.632020 4735 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused Oct 08 12:55:51 crc kubenswrapper[4735]: E1008 12:55:51.632122 4735 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.194:6443: connect: connection refused" logger="UnhandledError" Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.688216 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.689421 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.689462 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.689522 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.689557 4735 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 08 12:55:51 crc kubenswrapper[4735]: E1008 12:55:51.689976 4735 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 
38.102.83.194:6443: connect: connection refused" node="crc" Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.812071 4735 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.898094 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"86bca132b23754f21f19253d2a780f34f326494b286ab95ea66f699a2da72847"} Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.899468 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"688fa625af715d3ff82a753c6b605d6ce728ac83afb328de19510992dc5f31ca"} Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.900684 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"c5356d3d1e1cd2533d3e8bf304a1abf7c1975338ad560208590af94fa6a974f9"} Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.901820 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9c0e173302f9a52201d033cd92ec344ed4357f12af9ab41e569db7466d177b8b"} Oct 08 12:55:51 crc kubenswrapper[4735]: I1008 12:55:51.903023 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"6e150e39d6a65472debb215cbffc66549584a3eb6b3fa44114f473983f8927fe"} Oct 08 12:55:52 crc 
kubenswrapper[4735]: W1008 12:55:52.097268 4735 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused Oct 08 12:55:52 crc kubenswrapper[4735]: E1008 12:55:52.097746 4735 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.194:6443: connect: connection refused" logger="UnhandledError" Oct 08 12:55:52 crc kubenswrapper[4735]: W1008 12:55:52.147199 4735 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused Oct 08 12:55:52 crc kubenswrapper[4735]: E1008 12:55:52.147269 4735 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.194:6443: connect: connection refused" logger="UnhandledError" Oct 08 12:55:52 crc kubenswrapper[4735]: E1008 12:55:52.217357 4735 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" interval="1.6s" Oct 08 12:55:52 crc kubenswrapper[4735]: W1008 12:55:52.325538 4735 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get 
"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused Oct 08 12:55:52 crc kubenswrapper[4735]: E1008 12:55:52.325673 4735 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.194:6443: connect: connection refused" logger="UnhandledError" Oct 08 12:55:52 crc kubenswrapper[4735]: I1008 12:55:52.490552 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 12:55:52 crc kubenswrapper[4735]: I1008 12:55:52.492795 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:55:52 crc kubenswrapper[4735]: I1008 12:55:52.492864 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:55:52 crc kubenswrapper[4735]: I1008 12:55:52.492885 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:55:52 crc kubenswrapper[4735]: I1008 12:55:52.492930 4735 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 08 12:55:52 crc kubenswrapper[4735]: E1008 12:55:52.493641 4735 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.194:6443: connect: connection refused" node="crc" Oct 08 12:55:52 crc kubenswrapper[4735]: I1008 12:55:52.812395 4735 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused Oct 08 12:55:52 crc kubenswrapper[4735]: I1008 
12:55:52.908067 4735 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="e77c463c65fc07f8e0c2c8dd054b9dd96103da7426bbf1cd6e002436018231b2" exitCode=0 Oct 08 12:55:52 crc kubenswrapper[4735]: I1008 12:55:52.908200 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"e77c463c65fc07f8e0c2c8dd054b9dd96103da7426bbf1cd6e002436018231b2"} Oct 08 12:55:52 crc kubenswrapper[4735]: I1008 12:55:52.908435 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 12:55:52 crc kubenswrapper[4735]: I1008 12:55:52.909898 4735 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f" exitCode=0 Oct 08 12:55:52 crc kubenswrapper[4735]: I1008 12:55:52.910007 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f"} Oct 08 12:55:52 crc kubenswrapper[4735]: I1008 12:55:52.910062 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 12:55:52 crc kubenswrapper[4735]: I1008 12:55:52.910782 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:55:52 crc kubenswrapper[4735]: I1008 12:55:52.910829 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:55:52 crc kubenswrapper[4735]: I1008 12:55:52.910844 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:55:52 crc kubenswrapper[4735]: I1008 
12:55:52.911572 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:55:52 crc kubenswrapper[4735]: I1008 12:55:52.911640 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:55:52 crc kubenswrapper[4735]: I1008 12:55:52.911663 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:55:52 crc kubenswrapper[4735]: I1008 12:55:52.913112 4735 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c" exitCode=0 Oct 08 12:55:52 crc kubenswrapper[4735]: I1008 12:55:52.913279 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 12:55:52 crc kubenswrapper[4735]: I1008 12:55:52.913272 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c"} Oct 08 12:55:52 crc kubenswrapper[4735]: I1008 12:55:52.914085 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 12:55:52 crc kubenswrapper[4735]: I1008 12:55:52.915321 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:55:52 crc kubenswrapper[4735]: I1008 12:55:52.915363 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:55:52 crc kubenswrapper[4735]: I1008 12:55:52.915402 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:55:52 crc kubenswrapper[4735]: I1008 12:55:52.915526 4735 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Oct 08 12:55:52 crc kubenswrapper[4735]: I1008 12:55:52.915567 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:55:52 crc kubenswrapper[4735]: I1008 12:55:52.915589 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:55:52 crc kubenswrapper[4735]: I1008 12:55:52.916301 4735 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="d60ea228b3e0b96a8cdc6daa4a9e832d6b356c3e2b4413812cf1c03e9de76da2" exitCode=0 Oct 08 12:55:52 crc kubenswrapper[4735]: I1008 12:55:52.916350 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"d60ea228b3e0b96a8cdc6daa4a9e832d6b356c3e2b4413812cf1c03e9de76da2"} Oct 08 12:55:52 crc kubenswrapper[4735]: I1008 12:55:52.916409 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 12:55:52 crc kubenswrapper[4735]: I1008 12:55:52.918608 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:55:52 crc kubenswrapper[4735]: I1008 12:55:52.918661 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:55:52 crc kubenswrapper[4735]: I1008 12:55:52.918685 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:55:52 crc kubenswrapper[4735]: I1008 12:55:52.921856 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9ce3b507093adcc6becc1a3b28e836174ee50ff4d3f583aa84badf4c1f3cb1a4"} Oct 08 12:55:52 
crc kubenswrapper[4735]: I1008 12:55:52.921905 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"61e511b82acaea281ca468fbdce1eaa7ec3a5df59d91dc90fc500bebb08d51b5"} Oct 08 12:55:52 crc kubenswrapper[4735]: I1008 12:55:52.921931 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"1f9030479e031ff1768e372e6da2a0806dc393db70aed31961e3a9c89a9e789e"} Oct 08 12:55:52 crc kubenswrapper[4735]: I1008 12:55:52.921951 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"852a4dea1c1d8a40fc2ba89661ff17845ee58b62e45f4a7df9a8af505d361dd6"} Oct 08 12:55:52 crc kubenswrapper[4735]: I1008 12:55:52.922072 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 12:55:52 crc kubenswrapper[4735]: I1008 12:55:52.923638 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:55:52 crc kubenswrapper[4735]: I1008 12:55:52.923690 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:55:52 crc kubenswrapper[4735]: I1008 12:55:52.923714 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:55:53 crc kubenswrapper[4735]: I1008 12:55:53.202146 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 08 12:55:53 crc kubenswrapper[4735]: I1008 12:55:53.812138 4735 csi_plugin.go:884] Failed to contact API server when waiting for 
CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused Oct 08 12:55:53 crc kubenswrapper[4735]: E1008 12:55:53.818704 4735 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" interval="3.2s" Oct 08 12:55:53 crc kubenswrapper[4735]: I1008 12:55:53.927056 4735 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a" exitCode=0 Oct 08 12:55:53 crc kubenswrapper[4735]: I1008 12:55:53.927288 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 12:55:53 crc kubenswrapper[4735]: I1008 12:55:53.927883 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a"} Oct 08 12:55:53 crc kubenswrapper[4735]: I1008 12:55:53.933116 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:55:53 crc kubenswrapper[4735]: I1008 12:55:53.933158 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:55:53 crc kubenswrapper[4735]: I1008 12:55:53.933172 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:55:53 crc kubenswrapper[4735]: I1008 12:55:53.934105 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"fa7259e4ff7dc49440079659b12b25117daf1733407e99f44156ec0c1165bd5d"} Oct 08 12:55:53 crc kubenswrapper[4735]: I1008 12:55:53.934135 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 12:55:53 crc kubenswrapper[4735]: I1008 12:55:53.935027 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:55:53 crc kubenswrapper[4735]: I1008 12:55:53.935047 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:55:53 crc kubenswrapper[4735]: I1008 12:55:53.935057 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:55:53 crc kubenswrapper[4735]: I1008 12:55:53.938383 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"f9cf1a23dd55079fc635d7547125bc23b35d999c371c0bc3dd9e7e4a7c0c9fcf"} Oct 08 12:55:53 crc kubenswrapper[4735]: I1008 12:55:53.938456 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"2326bc324b30300d3c85d19e5434de503b355b4e29fb72a13c8ab8d4884899c7"} Oct 08 12:55:53 crc kubenswrapper[4735]: I1008 12:55:53.938470 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"af1083a706e809f20ad64584fcbdfee479bb7014566eab93c89d94c67e013c89"} Oct 08 12:55:53 crc kubenswrapper[4735]: I1008 12:55:53.938746 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 12:55:53 crc 
kubenswrapper[4735]: I1008 12:55:53.944503 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:55:53 crc kubenswrapper[4735]: I1008 12:55:53.944546 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:55:53 crc kubenswrapper[4735]: I1008 12:55:53.944556 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:55:53 crc kubenswrapper[4735]: I1008 12:55:53.949186 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f517b378c42cea209cb39c077a6162a3569671ed2c8f91ae7b0b2c411af287a8"} Oct 08 12:55:53 crc kubenswrapper[4735]: I1008 12:55:53.950771 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7e2c83b125f3789eeff231ab5416af4461e2443e6abaf28dac1cd5a8d5356287"} Oct 08 12:55:53 crc kubenswrapper[4735]: I1008 12:55:53.950870 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"302c54532d7a142b5a9483334e74f1d96b0f07920d9c3fcb589bf2b52d1c7124"} Oct 08 12:55:53 crc kubenswrapper[4735]: I1008 12:55:53.950873 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 12:55:53 crc kubenswrapper[4735]: I1008 12:55:53.950890 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ebc2f1c56c7be63150786ec037b77127e2df6071efa7d736d402bca8a011ea32"} Oct 08 12:55:53 crc kubenswrapper[4735]: I1008 12:55:53.952346 4735 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:55:53 crc kubenswrapper[4735]: I1008 12:55:53.952386 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:55:53 crc kubenswrapper[4735]: I1008 12:55:53.952403 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:55:54 crc kubenswrapper[4735]: I1008 12:55:54.093764 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 12:55:54 crc kubenswrapper[4735]: I1008 12:55:54.098112 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:55:54 crc kubenswrapper[4735]: I1008 12:55:54.098165 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:55:54 crc kubenswrapper[4735]: I1008 12:55:54.098177 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:55:54 crc kubenswrapper[4735]: I1008 12:55:54.098205 4735 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 08 12:55:54 crc kubenswrapper[4735]: E1008 12:55:54.099054 4735 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.194:6443: connect: connection refused" node="crc" Oct 08 12:55:54 crc kubenswrapper[4735]: I1008 12:55:54.956292 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0a92ee1544b61eda7e486e807f489c1f07cfc70870a2413575ac443203670542"} Oct 08 12:55:54 crc kubenswrapper[4735]: I1008 12:55:54.956434 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Oct 08 12:55:54 crc kubenswrapper[4735]: I1008 12:55:54.957698 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:55:54 crc kubenswrapper[4735]: I1008 12:55:54.957731 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:55:54 crc kubenswrapper[4735]: I1008 12:55:54.957743 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:55:54 crc kubenswrapper[4735]: I1008 12:55:54.960330 4735 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3" exitCode=0 Oct 08 12:55:54 crc kubenswrapper[4735]: I1008 12:55:54.960411 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 12:55:54 crc kubenswrapper[4735]: I1008 12:55:54.960519 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 12:55:54 crc kubenswrapper[4735]: I1008 12:55:54.960719 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 12:55:54 crc kubenswrapper[4735]: I1008 12:55:54.960736 4735 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 08 12:55:54 crc kubenswrapper[4735]: I1008 12:55:54.960763 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 12:55:54 crc kubenswrapper[4735]: I1008 12:55:54.960790 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3"} Oct 08 12:55:54 crc kubenswrapper[4735]: I1008 12:55:54.961274 4735 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:55:54 crc kubenswrapper[4735]: I1008 12:55:54.961307 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:55:54 crc kubenswrapper[4735]: I1008 12:55:54.961324 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:55:54 crc kubenswrapper[4735]: I1008 12:55:54.961767 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:55:54 crc kubenswrapper[4735]: I1008 12:55:54.961796 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:55:54 crc kubenswrapper[4735]: I1008 12:55:54.961805 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:55:54 crc kubenswrapper[4735]: I1008 12:55:54.962030 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:55:54 crc kubenswrapper[4735]: I1008 12:55:54.962124 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:55:54 crc kubenswrapper[4735]: I1008 12:55:54.962207 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:55:54 crc kubenswrapper[4735]: I1008 12:55:54.962255 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:55:54 crc kubenswrapper[4735]: I1008 12:55:54.962325 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:55:54 crc kubenswrapper[4735]: I1008 12:55:54.962335 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:55:55 crc kubenswrapper[4735]: I1008 
12:55:55.965715 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"21e7006d4b199ac7450c0e968adf9714db905f5378e279db304db244d0e5e566"} Oct 08 12:55:55 crc kubenswrapper[4735]: I1008 12:55:55.965759 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e620e3a2b5b27ac9469d3ca045be90d19b04f969422d4c2a453650a7d06f0918"} Oct 08 12:55:55 crc kubenswrapper[4735]: I1008 12:55:55.965769 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e268579ace7ae7961414b2650b6b55471a9fd6574196c5fa2cefbf6979464b97"} Oct 08 12:55:55 crc kubenswrapper[4735]: I1008 12:55:55.965779 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"fa627e3bfab38d53beee2a3201f6856009919d567108614667503273d11d0f15"} Oct 08 12:55:55 crc kubenswrapper[4735]: I1008 12:55:55.965866 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 12:55:55 crc kubenswrapper[4735]: I1008 12:55:55.965789 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 12:55:55 crc kubenswrapper[4735]: I1008 12:55:55.966755 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:55:55 crc kubenswrapper[4735]: I1008 12:55:55.966787 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:55:55 crc kubenswrapper[4735]: I1008 12:55:55.966799 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" 
Oct 08 12:55:56 crc kubenswrapper[4735]: I1008 12:55:56.974012 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"72f60bc244cbde5bb8033c24051f07fd6a2a168d97d89d3abcfaac69e6d75878"} Oct 08 12:55:56 crc kubenswrapper[4735]: I1008 12:55:56.974082 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 12:55:56 crc kubenswrapper[4735]: I1008 12:55:56.974142 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 12:55:56 crc kubenswrapper[4735]: I1008 12:55:56.975106 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:55:56 crc kubenswrapper[4735]: I1008 12:55:56.975132 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:55:56 crc kubenswrapper[4735]: I1008 12:55:56.975144 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:55:56 crc kubenswrapper[4735]: I1008 12:55:56.975445 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:55:56 crc kubenswrapper[4735]: I1008 12:55:56.975494 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:55:56 crc kubenswrapper[4735]: I1008 12:55:56.975508 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:55:57 crc kubenswrapper[4735]: I1008 12:55:57.136754 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 12:55:57 crc kubenswrapper[4735]: I1008 12:55:57.299231 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Oct 08 12:55:57 crc kubenswrapper[4735]: I1008 12:55:57.300944 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:55:57 crc kubenswrapper[4735]: I1008 12:55:57.301003 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:55:57 crc kubenswrapper[4735]: I1008 12:55:57.301021 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:55:57 crc kubenswrapper[4735]: I1008 12:55:57.301055 4735 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 08 12:55:57 crc kubenswrapper[4735]: I1008 12:55:57.978229 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 12:55:57 crc kubenswrapper[4735]: I1008 12:55:57.978416 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 12:55:57 crc kubenswrapper[4735]: I1008 12:55:57.980041 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:55:57 crc kubenswrapper[4735]: I1008 12:55:57.980113 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:55:57 crc kubenswrapper[4735]: I1008 12:55:57.980138 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:55:57 crc kubenswrapper[4735]: I1008 12:55:57.980163 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:55:57 crc kubenswrapper[4735]: I1008 12:55:57.980189 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:55:57 crc kubenswrapper[4735]: I1008 12:55:57.980201 4735 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 08 12:55:58 crc kubenswrapper[4735]: I1008 12:55:58.567822 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 08 12:55:58 crc kubenswrapper[4735]: I1008 12:55:58.568072 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 12:55:58 crc kubenswrapper[4735]: I1008 12:55:58.569329 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:55:58 crc kubenswrapper[4735]: I1008 12:55:58.569364 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:55:58 crc kubenswrapper[4735]: I1008 12:55:58.569376 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:55:58 crc kubenswrapper[4735]: I1008 12:55:58.576252 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 08 12:55:58 crc kubenswrapper[4735]: I1008 12:55:58.981389 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 12:55:58 crc kubenswrapper[4735]: I1008 12:55:58.982573 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:55:58 crc kubenswrapper[4735]: I1008 12:55:58.982628 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:55:58 crc kubenswrapper[4735]: I1008 12:55:58.982646 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:55:59 crc kubenswrapper[4735]: I1008 12:55:59.514812 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 
08 12:55:59 crc kubenswrapper[4735]: I1008 12:55:59.515156 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 12:55:59 crc kubenswrapper[4735]: I1008 12:55:59.516973 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:55:59 crc kubenswrapper[4735]: I1008 12:55:59.517043 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:55:59 crc kubenswrapper[4735]: I1008 12:55:59.517068 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:55:59 crc kubenswrapper[4735]: I1008 12:55:59.859058 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 08 12:55:59 crc kubenswrapper[4735]: I1008 12:55:59.859338 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 12:55:59 crc kubenswrapper[4735]: I1008 12:55:59.860940 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:55:59 crc kubenswrapper[4735]: I1008 12:55:59.861011 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:55:59 crc kubenswrapper[4735]: I1008 12:55:59.861040 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:00 crc kubenswrapper[4735]: I1008 12:56:00.218048 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Oct 08 12:56:00 crc kubenswrapper[4735]: I1008 12:56:00.218327 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 12:56:00 crc kubenswrapper[4735]: I1008 12:56:00.219911 4735 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:00 crc kubenswrapper[4735]: I1008 12:56:00.219953 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:00 crc kubenswrapper[4735]: I1008 12:56:00.219964 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:00 crc kubenswrapper[4735]: I1008 12:56:00.707143 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Oct 08 12:56:00 crc kubenswrapper[4735]: I1008 12:56:00.986661 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 12:56:00 crc kubenswrapper[4735]: I1008 12:56:00.988134 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:00 crc kubenswrapper[4735]: I1008 12:56:00.988187 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:00 crc kubenswrapper[4735]: I1008 12:56:00.988200 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:00 crc kubenswrapper[4735]: E1008 12:56:00.993625 4735 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 08 12:56:01 crc kubenswrapper[4735]: I1008 12:56:01.274944 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 08 12:56:01 crc kubenswrapper[4735]: I1008 12:56:01.275197 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 12:56:01 crc kubenswrapper[4735]: I1008 12:56:01.276730 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 
12:56:01 crc kubenswrapper[4735]: I1008 12:56:01.276764 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:01 crc kubenswrapper[4735]: I1008 12:56:01.276774 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:02 crc kubenswrapper[4735]: I1008 12:56:02.686232 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 08 12:56:02 crc kubenswrapper[4735]: I1008 12:56:02.686438 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 12:56:02 crc kubenswrapper[4735]: I1008 12:56:02.688147 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:02 crc kubenswrapper[4735]: I1008 12:56:02.688210 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:02 crc kubenswrapper[4735]: I1008 12:56:02.688221 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:02 crc kubenswrapper[4735]: I1008 12:56:02.693731 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 08 12:56:02 crc kubenswrapper[4735]: I1008 12:56:02.993634 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 12:56:02 crc kubenswrapper[4735]: I1008 12:56:02.994535 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:02 crc kubenswrapper[4735]: I1008 12:56:02.994601 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:02 crc kubenswrapper[4735]: I1008 12:56:02.994663 4735 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:04 crc kubenswrapper[4735]: I1008 12:56:04.275755 4735 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 08 12:56:04 crc kubenswrapper[4735]: I1008 12:56:04.275848 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 08 12:56:04 crc kubenswrapper[4735]: W1008 12:56:04.357589 4735 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 08 12:56:04 crc kubenswrapper[4735]: I1008 12:56:04.357730 4735 trace.go:236] Trace[1910441827]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (08-Oct-2025 12:55:54.355) (total time: 10001ms): Oct 08 12:56:04 crc kubenswrapper[4735]: Trace[1910441827]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (12:56:04.357) Oct 08 12:56:04 crc kubenswrapper[4735]: Trace[1910441827]: [10.001963819s] [10.001963819s] END Oct 08 12:56:04 crc kubenswrapper[4735]: E1008 12:56:04.357789 4735 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get 
\"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 08 12:56:04 crc kubenswrapper[4735]: W1008 12:56:04.728369 4735 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 08 12:56:04 crc kubenswrapper[4735]: I1008 12:56:04.728573 4735 trace.go:236] Trace[1778904091]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (08-Oct-2025 12:55:54.726) (total time: 10001ms): Oct 08 12:56:04 crc kubenswrapper[4735]: Trace[1778904091]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (12:56:04.728) Oct 08 12:56:04 crc kubenswrapper[4735]: Trace[1778904091]: [10.001620231s] [10.001620231s] END Oct 08 12:56:04 crc kubenswrapper[4735]: E1008 12:56:04.728608 4735 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 08 12:56:04 crc kubenswrapper[4735]: I1008 12:56:04.814439 4735 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Oct 08 12:56:05 crc kubenswrapper[4735]: I1008 12:56:05.033667 4735 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" 
start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 08 12:56:05 crc kubenswrapper[4735]: I1008 12:56:05.033774 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 08 12:56:05 crc kubenswrapper[4735]: I1008 12:56:05.044660 4735 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 08 12:56:05 crc kubenswrapper[4735]: I1008 12:56:05.044735 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 08 12:56:07 crc kubenswrapper[4735]: I1008 12:56:07.144563 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 12:56:07 crc kubenswrapper[4735]: I1008 12:56:07.144806 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 12:56:07 crc kubenswrapper[4735]: I1008 12:56:07.146652 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:07 crc kubenswrapper[4735]: I1008 12:56:07.146723 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 
12:56:07 crc kubenswrapper[4735]: I1008 12:56:07.146748 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:07 crc kubenswrapper[4735]: I1008 12:56:07.152081 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 12:56:08 crc kubenswrapper[4735]: I1008 12:56:08.007895 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 12:56:08 crc kubenswrapper[4735]: I1008 12:56:08.009937 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:08 crc kubenswrapper[4735]: I1008 12:56:08.010044 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:08 crc kubenswrapper[4735]: I1008 12:56:08.010072 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:09 crc kubenswrapper[4735]: I1008 12:56:09.900339 4735 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Oct 08 12:56:10 crc kubenswrapper[4735]: E1008 12:56:10.035177 4735 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.038061 4735 trace.go:236] Trace[132816911]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (08-Oct-2025 12:55:55.296) (total time: 14741ms): Oct 08 12:56:10 crc kubenswrapper[4735]: Trace[132816911]: ---"Objects listed" error: 14741ms (12:56:10.037) Oct 08 12:56:10 crc kubenswrapper[4735]: Trace[132816911]: [14.741397758s] [14.741397758s] END Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.038098 4735 
reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Oct 08 12:56:10 crc kubenswrapper[4735]: E1008 12:56:10.041618 4735 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.041963 4735 trace.go:236] Trace[2007776968]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (08-Oct-2025 12:55:55.315) (total time: 14726ms): Oct 08 12:56:10 crc kubenswrapper[4735]: Trace[2007776968]: ---"Objects listed" error: 14726ms (12:56:10.041) Oct 08 12:56:10 crc kubenswrapper[4735]: Trace[2007776968]: [14.726400563s] [14.726400563s] END Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.041990 4735 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.043564 4735 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.089867 4735 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:60266->192.168.126.11:17697: read: connection reset by peer" start-of-body= Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.089890 4735 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:60270->192.168.126.11:17697: read: connection reset by peer" start-of-body= Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.089941 4735 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:60266->192.168.126.11:17697: read: connection reset by peer" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.089999 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:60270->192.168.126.11:17697: read: connection reset by peer" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.090450 4735 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.090608 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.246405 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.246592 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.247686 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:10 crc 
kubenswrapper[4735]: I1008 12:56:10.247724 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.247735 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.260429 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.705883 4735 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.807947 4735 apiserver.go:52] "Watching apiserver" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.810658 4735 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.810983 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.811500 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 12:56:10 crc kubenswrapper[4735]: E1008 12:56:10.811557 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.811730 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.811839 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:56:10 crc kubenswrapper[4735]: E1008 12:56:10.812144 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.812184 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.812242 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.812610 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:56:10 crc kubenswrapper[4735]: E1008 12:56:10.812704 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.814451 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.814975 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.815049 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.815181 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.815856 4735 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.816933 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.817196 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.817226 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.819191 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.819649 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Oct 08 12:56:10 crc kubenswrapper[4735]: 
I1008 12:56:10.846623 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.846668 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.846691 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.846709 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.846729 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.846745 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.846763 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.846780 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.846797 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.846837 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.846853 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.846870 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.846888 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.846904 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.846944 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.847012 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.847069 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: 
\"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.847093 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.847112 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.847127 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.847143 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.847158 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.847173 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.847169 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.847188 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.847291 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.847318 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.847342 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod 
\"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.847298 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.847370 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.847395 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.847419 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.847422 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). 
InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.847443 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.847467 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.847514 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.847540 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.847564 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.847587 4735 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.847611 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.847639 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.847662 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.847687 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.847709 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.847731 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.847755 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.847779 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.847805 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.847829 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: 
\"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.847853 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.847877 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.847900 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.847923 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.847945 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.847968 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.847995 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.848019 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.848073 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.848095 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.848118 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.848143 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.848165 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.848191 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.848212 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.848234 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.848257 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.848279 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.848301 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.848324 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.848347 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.848370 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod 
\"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.848396 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.848420 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.848444 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.848492 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.848517 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.848538 4735 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.848560 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.848584 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.848635 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.848659 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.848681 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: 
\"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.848706 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.848729 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.848756 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.848779 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.848801 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.848828 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: 
\"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.848854 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.848878 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.848906 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.848930 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.848957 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 08 12:56:10 crc 
kubenswrapper[4735]: I1008 12:56:10.848980 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.847661 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.847749 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.848073 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.848121 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.848139 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.848301 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.848385 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.848415 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.855657 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.855722 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.848422 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.848541 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.848614 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.848706 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.848817 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.848904 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.848941 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.848970 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.848974 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.849010 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: E1008 12:56:10.849027 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 12:56:11.348978251 +0000 UTC m=+21.157282795 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.849266 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.850107 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.850423 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.850627 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.850877 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.850949 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.850815 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.851205 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.851439 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.851556 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.851665 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.851810 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.851820 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.851973 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.852103 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.852533 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.852517 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.852589 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.852790 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.852839 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.852880 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.852975 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.853244 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.853390 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.853390 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.853540 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.853595 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.853708 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.853808 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.853827 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.853900 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.854971 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.854916 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.855126 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.855122 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.855272 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.855328 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.855801 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.856138 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.856177 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.856179 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.856208 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.856323 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.856377 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.856419 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.856460 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.856529 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.856566 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.856601 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.857791 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.857854 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.857895 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.857925 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.857947 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.857980 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.858009 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.858074 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.858241 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.858278 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.858310 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.858335 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.858362 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.860396 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: 
\"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.860470 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.860545 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.860572 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.860601 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.860628 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.860658 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: 
\"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.860691 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.860718 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.860750 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.860783 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.860812 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " 
Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.860836 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.860865 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.860895 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.860923 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.860951 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.860984 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.861007 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.861037 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.861062 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.861094 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.861120 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.861151 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.861178 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.861202 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.861229 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.861258 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.861282 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" 
(UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.861310 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.861339 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.861369 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.861394 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.861422 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.861451 4735 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.861493 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.861526 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.861555 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.856437 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.861584 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.857195 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.857554 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.861630 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.857569 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.857577 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.857598 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.861662 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.861689 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.861713 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.861741 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.861809 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.861835 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: 
\"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.861890 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.861922 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.861951 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.861977 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.862010 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 08 12:56:10 crc 
kubenswrapper[4735]: I1008 12:56:10.862041 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.862068 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.862096 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.862126 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.862156 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.862181 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: 
\"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.862208 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.862237 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.862261 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.862291 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.862318 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.862340 4735 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.862544 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.862730 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.862777 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.862825 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.862841 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.862873 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.863258 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.863304 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.863345 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.863381 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.863419 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.863452 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.863496 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.863528 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 08 
12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.863566 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.863601 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.863633 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.863665 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.863694 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.863805 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.864269 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.864324 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.864366 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.864405 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.864429 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.864458 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.864776 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.864839 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.864892 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.864934 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.864973 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.865013 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.865112 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.865776 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.865923 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.865953 4735 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.865981 4735 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.865998 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.866112 4735 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.866136 4735 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.866156 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: 
\"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.867538 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.868226 4735 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.868741 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.868769 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.868790 4735 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.868804 4735 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.868818 4735 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc 
kubenswrapper[4735]: I1008 12:56:10.868833 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.868850 4735 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.868864 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.869672 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.869803 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.870614 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.870647 4735 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath 
\"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.857966 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.858023 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.858042 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.858500 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.858585 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.858762 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.859518 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.859545 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.859906 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.871236 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.860067 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.871880 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.872526 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.860198 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.860304 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.860329 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.860570 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.860806 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.861077 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.861292 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.861415 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.861561 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.861879 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.862264 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.862438 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.862733 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.862770 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.862790 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.862792 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.862860 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.863247 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.864061 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.864083 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.864080 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.864117 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.864314 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.864331 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.864698 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.864743 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.864931 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.866028 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.866156 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.866228 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.866290 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.866613 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.866652 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.866763 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.866918 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.866973 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.866991 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.867587 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.867665 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.867759 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.867805 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.867915 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.868583 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.868635 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.868799 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.868817 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.868935 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.869243 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.869638 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: E1008 12:56:10.870187 4735 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.869509 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.864633 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.870202 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: E1008 12:56:10.873691 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 12:56:11.373450296 +0000 UTC m=+21.181754840 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.870889 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.870960 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.870684 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.870981 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: E1008 12:56:10.870991 4735 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.871009 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.871512 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.871790 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.871237 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.872672 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: E1008 12:56:10.873972 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 12:56:11.373947878 +0000 UTC m=+21.182252432 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.871000 4735 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.879941 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.879960 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.879971 4735 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.879985 4735 reconciler_common.go:293] "Volume 
detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.879996 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.880007 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.880019 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.880030 4735 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.880045 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.870897 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). 
InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.859920 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.880058 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.880301 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.880231 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.880313 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.880329 4735 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.880779 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.880811 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.880831 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.880847 4735 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.880866 4735 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.880883 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" 
DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.880902 4735 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.880920 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.880935 4735 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.880951 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.880967 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.880982 4735 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.880999 4735 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.881014 
4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.881030 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.881045 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.881059 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.881073 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.881090 4735 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.881105 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.881119 4735 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.881135 4735 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.881150 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.881164 4735 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.881177 4735 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.881180 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.881192 4735 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.881255 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.881272 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.881286 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.881299 4735 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.881310 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.881321 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.881332 4735 
reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.881344 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.881357 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.881353 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.881368 4735 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.881375 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.881801 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.882467 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.882988 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.883300 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: E1008 12:56:10.883552 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 12:56:10 crc kubenswrapper[4735]: E1008 12:56:10.883581 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 12:56:10 crc kubenswrapper[4735]: E1008 12:56:10.883599 4735 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 12:56:10 crc kubenswrapper[4735]: E1008 12:56:10.883669 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-08 12:56:11.383648698 +0000 UTC m=+21.191953482 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.883711 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.884382 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.884690 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.884825 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.885396 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.885599 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.885616 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.885680 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.885931 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.886565 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.886628 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.886872 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.886939 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.887152 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.887408 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.887778 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.888643 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.888730 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.888960 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.888983 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.889900 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.889973 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.891104 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.891159 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.892080 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.893213 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.893615 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.893893 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.894132 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.894249 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.894454 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.894596 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.896274 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.896668 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.899070 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.900079 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.900335 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.900443 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.901013 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.900968 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.901441 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). 
InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.901556 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.901632 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.902520 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.902615 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.903926 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Oct 08 12:56:10 crc kubenswrapper[4735]: E1008 12:56:10.904731 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 12:56:10 crc kubenswrapper[4735]: E1008 12:56:10.904764 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 12:56:10 crc kubenswrapper[4735]: E1008 12:56:10.904784 4735 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 12:56:10 crc kubenswrapper[4735]: E1008 12:56:10.904854 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-08 12:56:11.404835081 +0000 UTC m=+21.213139825 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.906392 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.911703 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.911873 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.912733 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.912856 4735 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.915550 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.915770 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.917764 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.919020 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.920224 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.920912 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.921620 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.923644 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.924643 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.925290 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.925319 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: 
"ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.927263 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.928070 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.929401 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.930108 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.930727 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.932263 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.933070 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Oct 08 12:56:10 crc 
kubenswrapper[4735]: I1008 12:56:10.933642 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.935140 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.935711 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.936529 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.937052 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.937305 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.937797 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.939689 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.940298 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.941288 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.941762 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.942230 4735 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.942332 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.944610 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.945135 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.946329 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.948047 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.948103 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.949381 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.949997 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.951394 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.952102 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.953059 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.953671 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.954650 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.955762 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.956258 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.957304 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.957961 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.958173 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.959336 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.959890 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.960435 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.961320 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.961866 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.963178 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.963733 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.968468 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.978160 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.982565 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.982602 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.982676 4735 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.982688 4735 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.982698 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.982707 4735 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.982716 4735 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.982725 4735 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.982735 
4735 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.982744 4735 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.982753 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.982762 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.982771 4735 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.982779 4735 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.982789 4735 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.982798 4735 reconciler_common.go:293] "Volume detached for volume 
\"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.982806 4735 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.982816 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.982826 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.982834 4735 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.982842 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.982851 4735 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.982860 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc 
kubenswrapper[4735]: I1008 12:56:10.982870 4735 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.982879 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.982888 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.982897 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.982911 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.982920 4735 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.982927 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.982937 4735 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.982945 4735 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.982953 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.982964 4735 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.982973 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.982982 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.982991 4735 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983000 4735 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983009 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983017 4735 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983028 4735 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983036 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983045 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983053 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983062 4735 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983072 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983081 4735 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983114 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983122 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983130 4735 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983138 4735 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983147 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node 
\"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983157 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983167 4735 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983175 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983184 4735 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983193 4735 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983201 4735 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983211 4735 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983219 4735 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983227 4735 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983236 4735 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983244 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983252 4735 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983261 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983270 4735 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983279 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983287 4735 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983295 4735 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983303 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983311 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983319 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983327 4735 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983336 4735 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc 
kubenswrapper[4735]: I1008 12:56:10.983344 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983352 4735 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983360 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983368 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983377 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983386 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983395 4735 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983403 4735 reconciler_common.go:293] "Volume detached for volume 
\"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983411 4735 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983420 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983429 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983436 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983445 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983455 4735 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983464 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" 
DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983485 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983493 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983502 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983510 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983519 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983528 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983537 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 
12:56:10.983546 4735 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983554 4735 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983563 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983571 4735 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983580 4735 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983589 4735 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983598 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983607 4735 reconciler_common.go:293] "Volume 
detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983621 4735 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983630 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983638 4735 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983647 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983655 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983663 4735 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983671 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983680 4735 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983688 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983697 4735 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983705 4735 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983714 4735 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983722 4735 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983730 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" 
Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983739 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983747 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983757 4735 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983766 4735 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983775 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983783 4735 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983791 4735 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983799 
4735 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983807 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.983816 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.984929 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.985108 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 08 12:56:10 crc kubenswrapper[4735]: I1008 12:56:10.988081 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 12:56:11 crc kubenswrapper[4735]: I1008 12:56:11.000253 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 12:56:11 crc kubenswrapper[4735]: I1008 12:56:11.010940 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 12:56:11 crc kubenswrapper[4735]: I1008 12:56:11.017906 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 08 12:56:11 crc kubenswrapper[4735]: I1008 12:56:11.020101 4735 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0a92ee1544b61eda7e486e807f489c1f07cfc70870a2413575ac443203670542" exitCode=255 Oct 08 12:56:11 crc kubenswrapper[4735]: I1008 12:56:11.020170 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"0a92ee1544b61eda7e486e807f489c1f07cfc70870a2413575ac443203670542"} Oct 08 12:56:11 crc kubenswrapper[4735]: I1008 12:56:11.027087 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 08 12:56:11 crc kubenswrapper[4735]: I1008 12:56:11.032177 4735 scope.go:117] "RemoveContainer" containerID="0a92ee1544b61eda7e486e807f489c1f07cfc70870a2413575ac443203670542" Oct 08 12:56:11 crc kubenswrapper[4735]: I1008 12:56:11.032771 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Oct 08 12:56:11 crc kubenswrapper[4735]: I1008 12:56:11.036898 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Oct 08 12:56:11 crc kubenswrapper[4735]: I1008 12:56:11.037449 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 12:56:11 crc kubenswrapper[4735]: I1008 12:56:11.049652 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 12:56:11 crc kubenswrapper[4735]: I1008 12:56:11.061466 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 08 12:56:11 crc kubenswrapper[4735]: I1008 12:56:11.069557 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 12:56:11 crc kubenswrapper[4735]: I1008 12:56:11.080966 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 12:56:11 crc kubenswrapper[4735]: I1008 12:56:11.091577 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 12:56:11 crc kubenswrapper[4735]: I1008 12:56:11.132712 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 08 12:56:11 crc kubenswrapper[4735]: I1008 12:56:11.137492 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 08 12:56:11 crc kubenswrapper[4735]: I1008 12:56:11.146697 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 08 12:56:11 crc kubenswrapper[4735]: W1008 12:56:11.149311 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-1127e75a690296be8cffdf25b7997a2c09ae89c39930f2ddfccbbd1269b2befe WatchSource:0}: Error finding container 1127e75a690296be8cffdf25b7997a2c09ae89c39930f2ddfccbbd1269b2befe: Status 404 returned error can't find the container with id 1127e75a690296be8cffdf25b7997a2c09ae89c39930f2ddfccbbd1269b2befe Oct 08 12:56:11 crc kubenswrapper[4735]: W1008 12:56:11.156644 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-1ed4798c9c161ba5bfb8887776d21fb21d3da8bfc1021c3e99647d32cdf6e3f2 WatchSource:0}: Error finding container 1ed4798c9c161ba5bfb8887776d21fb21d3da8bfc1021c3e99647d32cdf6e3f2: Status 404 returned error can't find the container with id 1ed4798c9c161ba5bfb8887776d21fb21d3da8bfc1021c3e99647d32cdf6e3f2 Oct 08 12:56:11 crc kubenswrapper[4735]: W1008 12:56:11.174810 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-21d9c25588881cc77448435edb39d46fee7aef9f95923208113ad3574716b81d WatchSource:0}: Error finding container 21d9c25588881cc77448435edb39d46fee7aef9f95923208113ad3574716b81d: Status 404 returned error can't find the container with id 21d9c25588881cc77448435edb39d46fee7aef9f95923208113ad3574716b81d Oct 08 12:56:11 crc kubenswrapper[4735]: I1008 12:56:11.279759 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 08 12:56:11 crc kubenswrapper[4735]: I1008 12:56:11.283916 4735 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 08 12:56:11 crc kubenswrapper[4735]: I1008 12:56:11.297042 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Oct 08 12:56:11 crc kubenswrapper[4735]: I1008 12:56:11.298771 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 12:56:11 crc kubenswrapper[4735]: I1008 12:56:11.320208 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 12:56:11 crc kubenswrapper[4735]: I1008 12:56:11.339706 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 12:56:11 crc kubenswrapper[4735]: I1008 12:56:11.362617 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"844f0143-e7a8-48c9-ae8e-753c5c293f9d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e268579ace7ae7961414b2650b6b55471a9fd6574196c5fa2cefbf6979464b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e620e3a2b5b27ac9469d3ca045be90d19b04f969422d4c2a453650a7d06f0918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e7006d4b199ac7450c0e968adf9714db905f5378e279db304db244d0e5e566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72f60bc244cbde5bb8033c24051f07fd6a2a168d97d89d3abcfaac69e6d75878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa627e3bfab38d53beee2a3201f6856009919d567108614667503273d11d0f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 12:56:11 crc kubenswrapper[4735]: I1008 12:56:11.375029 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d7f6479-86cc-40be-8b44-cb1c10627aa2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc2f1c56c7be63150786ec037b77127e2df6071efa7d736d402bca8a011ea32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e2c83b125f3789eeff231ab5416af4461e2443e6abaf28dac1cd5a8d5356287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://302c54532d7a142b5a9483334e74f1d96b0f07920d9c3fcb589bf2b52d1c7124\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a92ee1544b61eda7e486e807f489c1f07cfc70870a2413575ac443203670542\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a92ee1544b61eda7e486e807f489c1f07cfc70870a2413575ac443203670542\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 12:56:04.373062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 12:56:04.379553 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3954726057/tls.crt::/tmp/serving-cert-3954726057/tls.key\\\\\\\"\\\\nI1008 12:56:10.064922 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 12:56:10.069814 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 12:56:10.069844 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 12:56:10.069873 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 12:56:10.069878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 12:56:10.082030 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 12:56:10.082115 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082144 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 12:56:10.082156 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 12:56:10.082166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 12:56:10.082177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1008 12:56:10.082039 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1008 12:56:10.085350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f517b378c42cea209cb39c077a6162a3569671ed2c8f91ae7b0b2c411af287a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 12:56:11 crc kubenswrapper[4735]: I1008 12:56:11.384863 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 12:56:11 crc kubenswrapper[4735]: I1008 12:56:11.387056 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 12:56:11 crc kubenswrapper[4735]: I1008 12:56:11.387123 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:56:11 crc kubenswrapper[4735]: I1008 12:56:11.387144 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" 
(UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:56:11 crc kubenswrapper[4735]: E1008 12:56:11.387209 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 12:56:12.387185951 +0000 UTC m=+22.195490505 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:56:11 crc kubenswrapper[4735]: E1008 12:56:11.387243 4735 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 12:56:11 crc kubenswrapper[4735]: I1008 12:56:11.387252 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:56:11 crc kubenswrapper[4735]: E1008 12:56:11.387277 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-10-08 12:56:12.387269693 +0000 UTC m=+22.195574237 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 12:56:11 crc kubenswrapper[4735]: E1008 12:56:11.387387 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 12:56:11 crc kubenswrapper[4735]: E1008 12:56:11.387404 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 12:56:11 crc kubenswrapper[4735]: E1008 12:56:11.387416 4735 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 12:56:11 crc kubenswrapper[4735]: E1008 12:56:11.387449 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-08 12:56:12.387439427 +0000 UTC m=+22.195743981 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 12:56:11 crc kubenswrapper[4735]: E1008 12:56:11.387509 4735 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 12:56:11 crc kubenswrapper[4735]: E1008 12:56:11.387539 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 12:56:12.38753101 +0000 UTC m=+22.195835554 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 12:56:11 crc kubenswrapper[4735]: I1008 12:56:11.405539 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 12:56:11 crc kubenswrapper[4735]: I1008 12:56:11.420337 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 12:56:11 crc kubenswrapper[4735]: I1008 12:56:11.430487 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d289b739-3ef2-4ee7-bb9a-b5cd8ab2b999\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f9030479e031ff1768e372e6da2a0806dc393db70aed31961e3a9c89a9e789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://852a4dea1c1d8a40fc2ba89661ff17845ee58b62e45f4a7df9a8af505d361dd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e511b82acaea281ca468fbdce1eaa7ec3a5df59d91dc90fc500bebb08d51b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ce3b507093adcc6becc1a3b28e836174ee50ff4d3f583aa84badf4c1f3cb1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 12:56:11 crc kubenswrapper[4735]: I1008 12:56:11.443398 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 12:56:11 crc kubenswrapper[4735]: I1008 12:56:11.453663 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 12:56:11 crc kubenswrapper[4735]: I1008 12:56:11.470145 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 12:56:11 crc kubenswrapper[4735]: I1008 12:56:11.481176 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 12:56:11 crc kubenswrapper[4735]: I1008 12:56:11.487711 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 12:56:11 crc kubenswrapper[4735]: E1008 
12:56:11.487871 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 12:56:11 crc kubenswrapper[4735]: E1008 12:56:11.487908 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 12:56:11 crc kubenswrapper[4735]: E1008 12:56:11.487932 4735 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 12:56:11 crc kubenswrapper[4735]: E1008 12:56:11.487986 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-08 12:56:12.487969942 +0000 UTC m=+22.296274496 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 12:56:11 crc kubenswrapper[4735]: I1008 12:56:11.508691 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"844f0143-e7a8-48c9-ae8e-753c5c293f9d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e268579ace7ae7961414b2650b6b55471a9fd6574196c5fa2cefbf6979464b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e620e3a2b5b27ac9469d3ca045be90d19b04f969422d4c2a453650a7d06f0918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e7006d4b199ac7450c0e968adf9714db905f5378e279db304db244d0e5e566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPa
th\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72f60bc244cbde5bb8033c24051f07fd6a2a168d97d89d3abcfaac69e6d75878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa627e3bfab38d53beee2a3201f6856009919d567108614667503273d11d0f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f4709799099
4d9e63c42a7043d7871bd9ea45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 12:56:11 crc kubenswrapper[4735]: I1008 12:56:11.530087 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-jnmqc"] Oct 08 12:56:11 crc kubenswrapper[4735]: I1008 12:56:11.530575 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-jnmqc" Oct 08 12:56:11 crc kubenswrapper[4735]: I1008 12:56:11.532179 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Oct 08 12:56:11 crc kubenswrapper[4735]: I1008 12:56:11.533851 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Oct 08 12:56:11 crc kubenswrapper[4735]: I1008 12:56:11.536532 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Oct 08 12:56:11 crc kubenswrapper[4735]: I1008 12:56:11.542034 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d7f6479-86cc-40be-8b44-cb1c10627aa2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc2f1c56c7be63150786ec037b77127e2df6071efa7d736d402bca8a011ea32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e2c83b125f3789eeff231ab5416af4461e2443e6abaf28dac1cd5a8d5356287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://302c54532d7a142b5a9483334e74f1d96b0f07920d9c3fcb589bf2b52d1c7124\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a92ee1544b61eda7e486e807f489c1f07cfc70870a2413575ac443203670542\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a92ee1544b61eda7e486e807f489c1f07cfc70870a2413575ac443203670542\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 12:56:04.373062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 12:56:04.379553 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3954726057/tls.crt::/tmp/serving-cert-3954726057/tls.key\\\\\\\"\\\\nI1008 12:56:10.064922 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 12:56:10.069814 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 12:56:10.069844 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 12:56:10.069873 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 12:56:10.069878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 12:56:10.082030 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 12:56:10.082115 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082144 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 12:56:10.082156 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 12:56:10.082166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 12:56:10.082177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1008 12:56:10.082039 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1008 12:56:10.085350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f517b378c42cea209cb39c077a6162a3569671ed2c8f91ae7b0b2c411af287a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 12:56:11 crc kubenswrapper[4735]: I1008 12:56:11.558545 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 12:56:11 crc kubenswrapper[4735]: I1008 12:56:11.577291 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 12:56:11 crc kubenswrapper[4735]: I1008 12:56:11.635288 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"844f0143-e7a8-48c9-ae8e-753c5c293f9d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e268579ace7ae7961414b2650b6b55471a9fd6574196c5fa2cefbf6979464b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e620e3a2b5b27ac9469d3ca045be90d19b04f969422d4c2a453650a7d06f0918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e7006d4b199ac7450c0e968adf9714db905f5378e279db304db244d0e5e566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72f60bc244cbde5bb8033c24051f07fd6a2a168d97d89d3abcfaac69e6d75878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa627e3bfab38d53beee2a3201f6856009919d567108614667503273d11d0f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 12:56:11 crc kubenswrapper[4735]: I1008 12:56:11.659508 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d7f6479-86cc-40be-8b44-cb1c10627aa2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc2f1c56c7be63150786ec037b77127e2df6071efa7d736d402bca8a011ea32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e2c83b125f3789eeff231ab5416af4461e2443e6abaf28dac1cd5a8d5356287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://302c54532d7a142b5a9483334e74f1d96b0f07920d9c3fcb589bf2b52d1c7124\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a92ee1544b61eda7e486e807f489c1f07cfc70870a2413575ac443203670542\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a92ee1544b61eda7e486e807f489c1f07cfc70870a2413575ac443203670542\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 12:56:04.373062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 12:56:04.379553 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3954726057/tls.crt::/tmp/serving-cert-3954726057/tls.key\\\\\\\"\\\\nI1008 12:56:10.064922 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 12:56:10.069814 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 12:56:10.069844 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 12:56:10.069873 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 12:56:10.069878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 12:56:10.082030 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 12:56:10.082115 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082144 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 12:56:10.082156 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 12:56:10.082166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 12:56:10.082177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1008 12:56:10.082039 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1008 12:56:10.085350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f517b378c42cea209cb39c077a6162a3569671ed2c8f91ae7b0b2c411af287a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 12:56:11 crc kubenswrapper[4735]: I1008 12:56:11.682693 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 12:56:11 crc kubenswrapper[4735]: I1008 12:56:11.688620 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/bed550b4-694a-4a43-99fc-f869224d6cca-hosts-file\") pod \"node-resolver-jnmqc\" (UID: \"bed550b4-694a-4a43-99fc-f869224d6cca\") " pod="openshift-dns/node-resolver-jnmqc" Oct 08 12:56:11 crc kubenswrapper[4735]: I1008 12:56:11.688653 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qjms\" (UniqueName: \"kubernetes.io/projected/bed550b4-694a-4a43-99fc-f869224d6cca-kube-api-access-7qjms\") pod \"node-resolver-jnmqc\" (UID: \"bed550b4-694a-4a43-99fc-f869224d6cca\") " pod="openshift-dns/node-resolver-jnmqc" Oct 08 12:56:11 crc kubenswrapper[4735]: I1008 12:56:11.716054 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 12:56:11 crc kubenswrapper[4735]: I1008 12:56:11.731926 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 12:56:11 crc kubenswrapper[4735]: I1008 12:56:11.747188 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 12:56:11 crc kubenswrapper[4735]: I1008 12:56:11.758514 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d289b739-3ef2-4ee7-bb9a-b5cd8ab2b999\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f9030479e031ff1768e372e6da2a0806dc393db70aed31961e3a9c89a9e789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://852a4dea1c1d8a40fc2ba89661ff17845ee58b62e45f4a7df9a8af505d361dd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e511b82acaea281ca468fbdce1eaa7ec3a5df59d91dc90fc500bebb08d51b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ce3b507093adcc6becc1a3b28e836174ee50ff4d3f583aa84badf4c1f3cb1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 12:56:11 crc kubenswrapper[4735]: I1008 12:56:11.776783 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 12:56:11 crc kubenswrapper[4735]: I1008 12:56:11.789165 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/bed550b4-694a-4a43-99fc-f869224d6cca-hosts-file\") pod \"node-resolver-jnmqc\" (UID: \"bed550b4-694a-4a43-99fc-f869224d6cca\") " pod="openshift-dns/node-resolver-jnmqc" Oct 08 12:56:11 crc kubenswrapper[4735]: I1008 12:56:11.789215 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-7qjms\" (UniqueName: \"kubernetes.io/projected/bed550b4-694a-4a43-99fc-f869224d6cca-kube-api-access-7qjms\") pod \"node-resolver-jnmqc\" (UID: \"bed550b4-694a-4a43-99fc-f869224d6cca\") " pod="openshift-dns/node-resolver-jnmqc" Oct 08 12:56:11 crc kubenswrapper[4735]: I1008 12:56:11.789318 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/bed550b4-694a-4a43-99fc-f869224d6cca-hosts-file\") pod \"node-resolver-jnmqc\" (UID: \"bed550b4-694a-4a43-99fc-f869224d6cca\") " pod="openshift-dns/node-resolver-jnmqc" Oct 08 12:56:11 crc kubenswrapper[4735]: I1008 12:56:11.792018 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 12:56:11 crc kubenswrapper[4735]: I1008 12:56:11.801027 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jnmqc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed550b4-694a-4a43-99fc-f869224d6cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qjms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jnmqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 12:56:11 crc kubenswrapper[4735]: I1008 12:56:11.809715 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qjms\" (UniqueName: \"kubernetes.io/projected/bed550b4-694a-4a43-99fc-f869224d6cca-kube-api-access-7qjms\") pod \"node-resolver-jnmqc\" (UID: \"bed550b4-694a-4a43-99fc-f869224d6cca\") " pod="openshift-dns/node-resolver-jnmqc" Oct 08 12:56:11 crc kubenswrapper[4735]: I1008 12:56:11.844803 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-jnmqc" Oct 08 12:56:11 crc kubenswrapper[4735]: W1008 12:56:11.897762 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbed550b4_694a_4a43_99fc_f869224d6cca.slice/crio-d7634fafe1608b783fb60de2e99e89713638c67cbd44d66349461a05309ed295 WatchSource:0}: Error finding container d7634fafe1608b783fb60de2e99e89713638c67cbd44d66349461a05309ed295: Status 404 returned error can't find the container with id d7634fafe1608b783fb60de2e99e89713638c67cbd44d66349461a05309ed295 Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.012211 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-9qwgd"] Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.012584 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.014197 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.014259 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.014542 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.014993 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.016087 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 
12:56:12.023856 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"21d9c25588881cc77448435edb39d46fee7aef9f95923208113ad3574716b81d"} Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.024822 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-jnmqc" event={"ID":"bed550b4-694a-4a43-99fc-f869224d6cca","Type":"ContainerStarted","Data":"d7634fafe1608b783fb60de2e99e89713638c67cbd44d66349461a05309ed295"} Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.026882 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"28bb195cc66fff1263a0e92d27d57e97c81b63f873cb5e85d5271ac4fad40a25"} Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.026919 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"e88fb0202a08e60ff9cb6528688a4312ec5433491918326a5b46a5fb5e1de888"} Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.026932 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"1ed4798c9c161ba5bfb8887776d21fb21d3da8bfc1021c3e99647d32cdf6e3f2"} Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.028042 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.028832 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"9b69f0d46aba2faee01906db32a5a6785e755b2eee94ccbaf1e438c63798f3ec"} Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.028889 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"1127e75a690296be8cffdf25b7997a2c09ae89c39930f2ddfccbbd1269b2befe"} Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.031162 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.037287 4735 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-dns/node-resolver-jnmqc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed550b4-694a-4a43-99fc-f869224d6cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qjms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jnmqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.037317 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1a6272fc7157705e3f1cd86a22b80f722dc7d152785bcc8029e131cd03568a19"} Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.053048 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d289b739-3ef2-4ee7-bb9a-b5cd8ab2b999\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f9030479e031ff1768e372e6da2a0806dc393db70aed31961e3a9c89a9e789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://852a4dea1c1d8a40fc2ba89661ff17845ee58b62e45f4a7df9a8af505d361dd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e511b82acaea281ca468fbdce1eaa7ec3a5df59d91dc90fc500bebb08d51b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ce3b507093adcc6becc1a3b28e836174ee50ff4d3f583aa84badf4c1f3cb1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.067517 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.088154 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.092268 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2dfd0474-f64a-4d3b-9990-df182d079b10-mcd-auth-proxy-config\") pod \"machine-config-daemon-9qwgd\" (UID: \"2dfd0474-f64a-4d3b-9990-df182d079b10\") " pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.092319 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rl6np\" (UniqueName: \"kubernetes.io/projected/2dfd0474-f64a-4d3b-9990-df182d079b10-kube-api-access-rl6np\") pod \"machine-config-daemon-9qwgd\" (UID: \"2dfd0474-f64a-4d3b-9990-df182d079b10\") " pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.092340 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2dfd0474-f64a-4d3b-9990-df182d079b10-proxy-tls\") pod 
\"machine-config-daemon-9qwgd\" (UID: \"2dfd0474-f64a-4d3b-9990-df182d079b10\") " pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.092368 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/2dfd0474-f64a-4d3b-9990-df182d079b10-rootfs\") pod \"machine-config-daemon-9qwgd\" (UID: \"2dfd0474-f64a-4d3b-9990-df182d079b10\") " pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.104132 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.113107 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.127415 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.153508 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.173384 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2dfd0474-f64a-4d3b-9990-df182d079b10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with 
unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\"
:[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9qwgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.192887 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2dfd0474-f64a-4d3b-9990-df182d079b10-mcd-auth-proxy-config\") pod \"machine-config-daemon-9qwgd\" (UID: \"2dfd0474-f64a-4d3b-9990-df182d079b10\") " pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.193433 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rl6np\" (UniqueName: \"kubernetes.io/projected/2dfd0474-f64a-4d3b-9990-df182d079b10-kube-api-access-rl6np\") pod \"machine-config-daemon-9qwgd\" (UID: \"2dfd0474-f64a-4d3b-9990-df182d079b10\") " pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.193513 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2dfd0474-f64a-4d3b-9990-df182d079b10-proxy-tls\") pod \"machine-config-daemon-9qwgd\" (UID: \"2dfd0474-f64a-4d3b-9990-df182d079b10\") " pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.194032 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/2dfd0474-f64a-4d3b-9990-df182d079b10-rootfs\") pod \"machine-config-daemon-9qwgd\" (UID: \"2dfd0474-f64a-4d3b-9990-df182d079b10\") " 
pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.194117 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/2dfd0474-f64a-4d3b-9990-df182d079b10-rootfs\") pod \"machine-config-daemon-9qwgd\" (UID: \"2dfd0474-f64a-4d3b-9990-df182d079b10\") " pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.212826 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"844f0143-e7a8-48c9-ae8e-753c5c293f9d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e268579ace7ae7961414b2650b6b55471a9fd6574196c5fa2cefbf6979464b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\
\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e620e3a2b5b27ac9469d3ca045be90d19b04f969422d4c2a453650a7d06f0918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e7006d4b199ac7450c0e968adf9714db905f5378e279db304db244d0e5e566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\
\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72f60bc244cbde5bb8033c24051f07fd6a2a168d97d89d3abcfaac69e6d75878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa627e3bfab38d53beee2a3201f6856009919d567108614667503273d11d0f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o
://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90
092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:12Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.229603 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2dfd0474-f64a-4d3b-9990-df182d079b10-mcd-auth-proxy-config\") pod \"machine-config-daemon-9qwgd\" (UID: \"2dfd0474-f64a-4d3b-9990-df182d079b10\") " pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.233284 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d7f6479-86cc-40be-8b44-cb1c10627aa2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc2f1c56c7be63150786ec037b77127e2df6071efa7d736d402bca8a011ea32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e2c83b125f3789eeff231ab5416af4461e2443e6abaf28dac1cd5a8d5356287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://302c54532d7a142b5a9483334e74f1d96b0f07920d9c3fcb589bf2b52d1c7124\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a92ee1544b61eda7e486e807f489c1f07cfc70870a2413575ac443203670542\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a92ee1544b61eda7e486e807f489c1f07cfc70870a2413575ac443203670542\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 12:56:04.373062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 12:56:04.379553 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3954726057/tls.crt::/tmp/serving-cert-3954726057/tls.key\\\\\\\"\\\\nI1008 12:56:10.064922 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 12:56:10.069814 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 12:56:10.069844 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 12:56:10.069873 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 12:56:10.069878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 12:56:10.082030 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 12:56:10.082115 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082144 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 12:56:10.082156 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 12:56:10.082166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 12:56:10.082177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1008 12:56:10.082039 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1008 12:56:10.085350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f517b378c42cea209cb39c077a6162a3569671ed2c8f91ae7b0b2c411af287a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:12Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.233582 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rl6np\" (UniqueName: \"kubernetes.io/projected/2dfd0474-f64a-4d3b-9990-df182d079b10-kube-api-access-rl6np\") pod \"machine-config-daemon-9qwgd\" (UID: \"2dfd0474-f64a-4d3b-9990-df182d079b10\") " pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.233425 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2dfd0474-f64a-4d3b-9990-df182d079b10-proxy-tls\") pod \"machine-config-daemon-9qwgd\" (UID: \"2dfd0474-f64a-4d3b-9990-df182d079b10\") " pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.247543 4735 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d7f6479-86cc-40be-8b44-cb1c10627aa2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc2f1c56c7be63150786ec037b77127e2df6071efa7d736d402bca8a011ea32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"
name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e2c83b125f3789eeff231ab5416af4461e2443e6abaf28dac1cd5a8d5356287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://302c54532d7a142b5a9483334e74f1d96b0f07920d9c3fcb589bf2b52d1c7124\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6272fc7157705e3f1cd86a22b80f722dc7d152785bcc8029e131cd03568a19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"qua
y.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a92ee1544b61eda7e486e807f489c1f07cfc70870a2413575ac443203670542\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 12:56:04.373062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 12:56:04.379553 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3954726057/tls.crt::/tmp/serving-cert-3954726057/tls.key\\\\\\\"\\\\nI1008 12:56:10.064922 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 12:56:10.069814 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 12:56:10.069844 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 12:56:10.069873 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 12:56:10.069878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 12:56:10.082030 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 12:56:10.082115 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082144 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 12:56:10.082156 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 12:56:10.082166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 12:56:10.082177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1008 12:56:10.082039 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1008 12:56:10.085350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f517b378c42cea209cb39c077a6162a3569671ed2c8f91ae7b0b2c411af287a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:12Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.262602 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:12Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.280808 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:12Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.302510 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28bb195cc66fff1263a0e92d27d57e97c81b63f873cb5e85d5271ac4fad40a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e88fb0202a08e60ff9cb6528688a4312ec5433491918326a5b46a5fb5e1de888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:12Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.313010 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:12Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.325301 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" Oct 08 12:56:12 crc kubenswrapper[4735]: W1008 12:56:12.337986 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2dfd0474_f64a_4d3b_9990_df182d079b10.slice/crio-e98b994405e3d67933262c20deab57e742c42f1e7ef381789ecd7399ae900c43 WatchSource:0}: Error finding container e98b994405e3d67933262c20deab57e742c42f1e7ef381789ecd7399ae900c43: Status 404 returned error can't find the container with id e98b994405e3d67933262c20deab57e742c42f1e7ef381789ecd7399ae900c43 Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.338270 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2dfd0474-f64a-4d3b-9990-df182d079b10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9qwgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:12Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.365086 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"844f0143-e7a8-48c9-ae8e-753c5c293f9d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e268579ace7ae7961414b2650b6b55471a9fd6574196c5fa2cefbf6979464b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-p
od-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e620e3a2b5b27ac9469d3ca045be90d19b04f969422d4c2a453650a7d06f0918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e7006d4b199ac7450c0e968adf9714db905f5378e279db304db244d0e5e566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72f60bc244cbde5bb8033c24051f07fd6a2a168d97d89d3abcfaac69e6d75878\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa627e3bfab38d53beee2a3201f6856009919d567108614667503273d11d0f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdf
dd3a910ad6e597160bd295be2db5a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:12Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.382973 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d289b739-3ef2-4ee7-bb9a-b5cd8ab2b999\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f9030479e031ff1768e372e6da2a0806dc393db70aed31961e3a9c89a9e789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://852a4dea1c1d8a40fc2ba89661ff17845ee58b62e45f4a7df9a8af505d361dd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e511b82acaea281ca468fbdce1eaa7ec3a5df59d91dc90fc500bebb08d51b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ce3b507093adcc6becc1a3b28e836174ee50ff4d3f583aa84badf4c1f3cb1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:12Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.395819 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.395973 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.396008 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.396027 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:56:12 crc kubenswrapper[4735]: E1008 12:56:12.396088 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 12:56:14.396050622 +0000 UTC m=+24.204355326 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:56:12 crc kubenswrapper[4735]: E1008 12:56:12.396142 4735 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 12:56:12 crc kubenswrapper[4735]: E1008 12:56:12.396182 4735 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 12:56:12 crc kubenswrapper[4735]: E1008 12:56:12.396220 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 12:56:14.396203535 +0000 UTC m=+24.204508079 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 12:56:12 crc kubenswrapper[4735]: E1008 12:56:12.396190 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 12:56:12 crc kubenswrapper[4735]: E1008 12:56:12.396320 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 12:56:12 crc kubenswrapper[4735]: E1008 12:56:12.396338 4735 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 12:56:12 crc kubenswrapper[4735]: E1008 12:56:12.396282 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 12:56:14.396258587 +0000 UTC m=+24.204563131 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 12:56:12 crc kubenswrapper[4735]: E1008 12:56:12.396392 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-08 12:56:14.39638287 +0000 UTC m=+24.204687414 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.400720 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b69f0d46aba2faee01906db32a5a6785e755b2eee94ccbaf1e438c63798f3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:12Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.425006 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:12Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.428779 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-wckcs"] Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.429532 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.432138 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-tsff9"] Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.432663 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-tfdqx"] Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.432836 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-tsff9" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.433899 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-tfdqx" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.440891 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.443457 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.443745 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.443910 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.444045 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.444168 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.444266 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.444366 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.444719 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.444773 4735 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.446752 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.446775 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.446940 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.447001 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.447123 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jnmqc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed550b4-694a-4a43-99fc-f869224d6cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qjms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jnmqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:12Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.480696 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2dfd0474-f64a-4d3b-9990-df182d079b10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9qwgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:12Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.496863 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 12:56:12 crc kubenswrapper[4735]: E1008 12:56:12.497117 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 12:56:12 crc kubenswrapper[4735]: E1008 12:56:12.497331 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 12:56:12 crc kubenswrapper[4735]: E1008 12:56:12.497420 4735 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 12:56:12 crc kubenswrapper[4735]: E1008 12:56:12.497582 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-08 12:56:14.49755512 +0000 UTC m=+24.305859664 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.520720 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653370-cd31-4487-8b63-50b145f55853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wckcs\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:12Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.541737 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tsff9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0464374-65ce-46f6-9e7e-79018b92aff4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tsff9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:12Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.587096 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"844f0143-e7a8-48c9-ae8e-753c5c293f9d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e268579ace7ae7961414b2650b6b55471a9fd6574196c5fa2cefbf6979464b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e620e3a2b5b27ac9469d3ca045be90d19b04f969422d4c2a453650a7d06f0918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e7006d4b199ac7450c0e968adf9714db905f5378e279db304db244d0e5e566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72f60bc244cbde5bb8033c24051f07fd6a2a168d97d89d3abcfaac69e6d75878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa627e3bfab38d53beee2a3201f6856009919d567108614667503273d11d0f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var
/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:12Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.597995 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ef6f0aff-a0e5-477a-9392-71bc49bfa7aa-host-run-multus-certs\") pod \"multus-tfdqx\" (UID: \"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\") " pod="openshift-multus/multus-tfdqx" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.598037 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-run-systemd\") pod \"ovnkube-node-wckcs\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") " pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.598058 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-host-cni-netd\") pod \"ovnkube-node-wckcs\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") " pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.598077 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wckcs\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") " pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.598096 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ef6f0aff-a0e5-477a-9392-71bc49bfa7aa-cnibin\") pod \"multus-tfdqx\" (UID: \"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\") " pod="openshift-multus/multus-tfdqx" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.598152 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ef6f0aff-a0e5-477a-9392-71bc49bfa7aa-cni-binary-copy\") pod \"multus-tfdqx\" (UID: \"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\") " pod="openshift-multus/multus-tfdqx" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.598182 4735 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-host-slash\") pod \"ovnkube-node-wckcs\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") " pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.598197 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c4653370-cd31-4487-8b63-50b145f55853-env-overrides\") pod \"ovnkube-node-wckcs\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") " pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.598271 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ef6f0aff-a0e5-477a-9392-71bc49bfa7aa-os-release\") pod \"multus-tfdqx\" (UID: \"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\") " pod="openshift-multus/multus-tfdqx" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.598317 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ef6f0aff-a0e5-477a-9392-71bc49bfa7aa-multus-daemon-config\") pod \"multus-tfdqx\" (UID: \"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\") " pod="openshift-multus/multus-tfdqx" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.598341 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-systemd-units\") pod \"ovnkube-node-wckcs\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") " pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.598355 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c4653370-cd31-4487-8b63-50b145f55853-ovnkube-config\") pod \"ovnkube-node-wckcs\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") " pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.598370 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c4653370-cd31-4487-8b63-50b145f55853-ovnkube-script-lib\") pod \"ovnkube-node-wckcs\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") " pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.598387 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ef6f0aff-a0e5-477a-9392-71bc49bfa7aa-host-run-k8s-cni-cncf-io\") pod \"multus-tfdqx\" (UID: \"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\") " pod="openshift-multus/multus-tfdqx" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.598402 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ef6f0aff-a0e5-477a-9392-71bc49bfa7aa-host-run-netns\") pod \"multus-tfdqx\" (UID: \"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\") " pod="openshift-multus/multus-tfdqx" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.598418 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ef6f0aff-a0e5-477a-9392-71bc49bfa7aa-host-var-lib-kubelet\") pod \"multus-tfdqx\" (UID: \"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\") " pod="openshift-multus/multus-tfdqx" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.598462 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ef6f0aff-a0e5-477a-9392-71bc49bfa7aa-hostroot\") pod \"multus-tfdqx\" (UID: \"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\") " pod="openshift-multus/multus-tfdqx" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.598509 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ef6f0aff-a0e5-477a-9392-71bc49bfa7aa-multus-conf-dir\") pod \"multus-tfdqx\" (UID: \"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\") " pod="openshift-multus/multus-tfdqx" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.598553 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d0464374-65ce-46f6-9e7e-79018b92aff4-cnibin\") pod \"multus-additional-cni-plugins-tsff9\" (UID: \"d0464374-65ce-46f6-9e7e-79018b92aff4\") " pod="openshift-multus/multus-additional-cni-plugins-tsff9" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.598575 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-etc-openvswitch\") pod \"ovnkube-node-wckcs\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") " pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.598612 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d0464374-65ce-46f6-9e7e-79018b92aff4-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tsff9\" (UID: \"d0464374-65ce-46f6-9e7e-79018b92aff4\") " pod="openshift-multus/multus-additional-cni-plugins-tsff9" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.598657 4735 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hmnq\" (UniqueName: \"kubernetes.io/projected/c4653370-cd31-4487-8b63-50b145f55853-kube-api-access-9hmnq\") pod \"ovnkube-node-wckcs\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") " pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.598683 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ef6f0aff-a0e5-477a-9392-71bc49bfa7aa-host-var-lib-cni-bin\") pod \"multus-tfdqx\" (UID: \"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\") " pod="openshift-multus/multus-tfdqx" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.598701 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d0464374-65ce-46f6-9e7e-79018b92aff4-os-release\") pod \"multus-additional-cni-plugins-tsff9\" (UID: \"d0464374-65ce-46f6-9e7e-79018b92aff4\") " pod="openshift-multus/multus-additional-cni-plugins-tsff9" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.598716 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ef6f0aff-a0e5-477a-9392-71bc49bfa7aa-multus-cni-dir\") pod \"multus-tfdqx\" (UID: \"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\") " pod="openshift-multus/multus-tfdqx" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.598735 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ef6f0aff-a0e5-477a-9392-71bc49bfa7aa-etc-kubernetes\") pod \"multus-tfdqx\" (UID: \"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\") " pod="openshift-multus/multus-tfdqx" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.598751 4735 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c4653370-cd31-4487-8b63-50b145f55853-ovn-node-metrics-cert\") pod \"ovnkube-node-wckcs\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") " pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.598802 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ef6f0aff-a0e5-477a-9392-71bc49bfa7aa-multus-socket-dir-parent\") pod \"multus-tfdqx\" (UID: \"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\") " pod="openshift-multus/multus-tfdqx" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.598850 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-host-cni-bin\") pod \"ovnkube-node-wckcs\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") " pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.598868 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-host-run-ovn-kubernetes\") pod \"ovnkube-node-wckcs\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") " pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.598886 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qwbk\" (UniqueName: \"kubernetes.io/projected/ef6f0aff-a0e5-477a-9392-71bc49bfa7aa-kube-api-access-7qwbk\") pod \"multus-tfdqx\" (UID: \"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\") " pod="openshift-multus/multus-tfdqx" Oct 08 12:56:12 crc 
kubenswrapper[4735]: I1008 12:56:12.598903 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d0464374-65ce-46f6-9e7e-79018b92aff4-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tsff9\" (UID: \"d0464374-65ce-46f6-9e7e-79018b92aff4\") " pod="openshift-multus/multus-additional-cni-plugins-tsff9" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.598918 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-run-openvswitch\") pod \"ovnkube-node-wckcs\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") " pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.598942 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-host-run-netns\") pod \"ovnkube-node-wckcs\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") " pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.598959 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-var-lib-openvswitch\") pod \"ovnkube-node-wckcs\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") " pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.598976 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-run-ovn\") pod \"ovnkube-node-wckcs\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.598990 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-log-socket\") pod \"ovnkube-node-wckcs\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") " pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.599014 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d0464374-65ce-46f6-9e7e-79018b92aff4-system-cni-dir\") pod \"multus-additional-cni-plugins-tsff9\" (UID: \"d0464374-65ce-46f6-9e7e-79018b92aff4\") " pod="openshift-multus/multus-additional-cni-plugins-tsff9" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.599039 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ef6f0aff-a0e5-477a-9392-71bc49bfa7aa-host-var-lib-cni-multus\") pod \"multus-tfdqx\" (UID: \"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\") " pod="openshift-multus/multus-tfdqx" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.599080 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ef6f0aff-a0e5-477a-9392-71bc49bfa7aa-system-cni-dir\") pod \"multus-tfdqx\" (UID: \"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\") " pod="openshift-multus/multus-tfdqx" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.599107 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-node-log\") pod \"ovnkube-node-wckcs\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.599133 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d0464374-65ce-46f6-9e7e-79018b92aff4-cni-binary-copy\") pod \"multus-additional-cni-plugins-tsff9\" (UID: \"d0464374-65ce-46f6-9e7e-79018b92aff4\") " pod="openshift-multus/multus-additional-cni-plugins-tsff9" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.599149 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzwrg\" (UniqueName: \"kubernetes.io/projected/d0464374-65ce-46f6-9e7e-79018b92aff4-kube-api-access-lzwrg\") pod \"multus-additional-cni-plugins-tsff9\" (UID: \"d0464374-65ce-46f6-9e7e-79018b92aff4\") " pod="openshift-multus/multus-additional-cni-plugins-tsff9" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.599166 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-host-kubelet\") pod \"ovnkube-node-wckcs\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") " pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.621868 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d7f6479-86cc-40be-8b44-cb1c10627aa2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc2f1c56c7be63150786ec037b77127e2df6071efa7d736d402bca8a011ea32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e2c83b125f3789eeff231ab5416af4461e2443e6abaf28dac1cd5a8d5356287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://302c54532d7a142b5a9483334e74f1d96b0f07920d9c3fcb589bf2b52d1c7124\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6272fc7157705e3f1cd86a22b80f722dc7d152785bcc8029e131cd03568a19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a92ee1544b61eda7e486e807f489c1f07cfc70870a2413575ac443203670542\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 12:56:04.373062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 12:56:04.379553 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3954726057/tls.crt::/tmp/serving-cert-3954726057/tls.key\\\\\\\"\\\\nI1008 12:56:10.064922 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 12:56:10.069814 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 12:56:10.069844 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 12:56:10.069873 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 12:56:10.069878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 12:56:10.082030 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 12:56:10.082115 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082144 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 12:56:10.082156 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 12:56:10.082166 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 12:56:10.082177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1008 12:56:10.082039 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1008 12:56:10.085350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f517b378c42cea209cb39c077a6162a3569671ed2c8f91ae7b0b2c411af287a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:12Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.659786 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:12Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.700388 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ef6f0aff-a0e5-477a-9392-71bc49bfa7aa-host-run-multus-certs\") pod \"multus-tfdqx\" (UID: 
\"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\") " pod="openshift-multus/multus-tfdqx" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.700426 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ef6f0aff-a0e5-477a-9392-71bc49bfa7aa-cni-binary-copy\") pod \"multus-tfdqx\" (UID: \"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\") " pod="openshift-multus/multus-tfdqx" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.700443 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-run-systemd\") pod \"ovnkube-node-wckcs\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") " pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.700458 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-host-cni-netd\") pod \"ovnkube-node-wckcs\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") " pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.700689 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wckcs\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") " pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.700713 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ef6f0aff-a0e5-477a-9392-71bc49bfa7aa-cnibin\") pod \"multus-tfdqx\" (UID: \"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\") " pod="openshift-multus/multus-tfdqx" Oct 08 
12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.700733 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ef6f0aff-a0e5-477a-9392-71bc49bfa7aa-os-release\") pod \"multus-tfdqx\" (UID: \"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\") " pod="openshift-multus/multus-tfdqx" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.700754 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-host-slash\") pod \"ovnkube-node-wckcs\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") " pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.700776 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c4653370-cd31-4487-8b63-50b145f55853-env-overrides\") pod \"ovnkube-node-wckcs\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") " pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.700776 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wckcs\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") " pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.700817 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-host-cni-netd\") pod \"ovnkube-node-wckcs\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") " pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.700845 4735 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ef6f0aff-a0e5-477a-9392-71bc49bfa7aa-cnibin\") pod \"multus-tfdqx\" (UID: \"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\") " pod="openshift-multus/multus-tfdqx" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.700793 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ef6f0aff-a0e5-477a-9392-71bc49bfa7aa-multus-daemon-config\") pod \"multus-tfdqx\" (UID: \"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\") " pod="openshift-multus/multus-tfdqx" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.700909 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-host-slash\") pod \"ovnkube-node-wckcs\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") " pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.700943 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-systemd-units\") pod \"ovnkube-node-wckcs\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") " pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.700980 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c4653370-cd31-4487-8b63-50b145f55853-ovnkube-config\") pod \"ovnkube-node-wckcs\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") " pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.701002 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/c4653370-cd31-4487-8b63-50b145f55853-ovnkube-script-lib\") pod \"ovnkube-node-wckcs\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") " pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.700986 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-run-systemd\") pod \"ovnkube-node-wckcs\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") " pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.701038 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-systemd-units\") pod \"ovnkube-node-wckcs\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") " pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.701059 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d0464374-65ce-46f6-9e7e-79018b92aff4-cnibin\") pod \"multus-additional-cni-plugins-tsff9\" (UID: \"d0464374-65ce-46f6-9e7e-79018b92aff4\") " pod="openshift-multus/multus-additional-cni-plugins-tsff9" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.701029 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d0464374-65ce-46f6-9e7e-79018b92aff4-cnibin\") pod \"multus-additional-cni-plugins-tsff9\" (UID: \"d0464374-65ce-46f6-9e7e-79018b92aff4\") " pod="openshift-multus/multus-additional-cni-plugins-tsff9" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.701121 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ef6f0aff-a0e5-477a-9392-71bc49bfa7aa-os-release\") pod \"multus-tfdqx\" 
(UID: \"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\") " pod="openshift-multus/multus-tfdqx" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.701141 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ef6f0aff-a0e5-477a-9392-71bc49bfa7aa-host-run-k8s-cni-cncf-io\") pod \"multus-tfdqx\" (UID: \"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\") " pod="openshift-multus/multus-tfdqx" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.701205 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ef6f0aff-a0e5-477a-9392-71bc49bfa7aa-host-run-netns\") pod \"multus-tfdqx\" (UID: \"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\") " pod="openshift-multus/multus-tfdqx" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.701223 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ef6f0aff-a0e5-477a-9392-71bc49bfa7aa-host-run-k8s-cni-cncf-io\") pod \"multus-tfdqx\" (UID: \"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\") " pod="openshift-multus/multus-tfdqx" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.701234 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ef6f0aff-a0e5-477a-9392-71bc49bfa7aa-host-var-lib-kubelet\") pod \"multus-tfdqx\" (UID: \"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\") " pod="openshift-multus/multus-tfdqx" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.701259 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ef6f0aff-a0e5-477a-9392-71bc49bfa7aa-host-var-lib-kubelet\") pod \"multus-tfdqx\" (UID: \"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\") " pod="openshift-multus/multus-tfdqx" Oct 08 12:56:12 crc kubenswrapper[4735]: 
I1008 12:56:12.701246 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ef6f0aff-a0e5-477a-9392-71bc49bfa7aa-host-run-netns\") pod \"multus-tfdqx\" (UID: \"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\") " pod="openshift-multus/multus-tfdqx" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.701280 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ef6f0aff-a0e5-477a-9392-71bc49bfa7aa-hostroot\") pod \"multus-tfdqx\" (UID: \"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\") " pod="openshift-multus/multus-tfdqx" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.701305 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ef6f0aff-a0e5-477a-9392-71bc49bfa7aa-multus-conf-dir\") pod \"multus-tfdqx\" (UID: \"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\") " pod="openshift-multus/multus-tfdqx" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.701342 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d0464374-65ce-46f6-9e7e-79018b92aff4-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tsff9\" (UID: \"d0464374-65ce-46f6-9e7e-79018b92aff4\") " pod="openshift-multus/multus-additional-cni-plugins-tsff9" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.701364 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-etc-openvswitch\") pod \"ovnkube-node-wckcs\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") " pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.701385 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" 
(UniqueName: \"kubernetes.io/host-path/ef6f0aff-a0e5-477a-9392-71bc49bfa7aa-host-var-lib-cni-bin\") pod \"multus-tfdqx\" (UID: \"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\") " pod="openshift-multus/multus-tfdqx" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.701397 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ef6f0aff-a0e5-477a-9392-71bc49bfa7aa-multus-conf-dir\") pod \"multus-tfdqx\" (UID: \"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\") " pod="openshift-multus/multus-tfdqx" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.701406 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hmnq\" (UniqueName: \"kubernetes.io/projected/c4653370-cd31-4487-8b63-50b145f55853-kube-api-access-9hmnq\") pod \"ovnkube-node-wckcs\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") " pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.701399 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ef6f0aff-a0e5-477a-9392-71bc49bfa7aa-hostroot\") pod \"multus-tfdqx\" (UID: \"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\") " pod="openshift-multus/multus-tfdqx" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.701431 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-etc-openvswitch\") pod \"ovnkube-node-wckcs\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") " pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.701437 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d0464374-65ce-46f6-9e7e-79018b92aff4-os-release\") pod \"multus-additional-cni-plugins-tsff9\" (UID: 
\"d0464374-65ce-46f6-9e7e-79018b92aff4\") " pod="openshift-multus/multus-additional-cni-plugins-tsff9" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.701455 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ef6f0aff-a0e5-477a-9392-71bc49bfa7aa-host-var-lib-cni-bin\") pod \"multus-tfdqx\" (UID: \"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\") " pod="openshift-multus/multus-tfdqx" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.701487 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ef6f0aff-a0e5-477a-9392-71bc49bfa7aa-multus-cni-dir\") pod \"multus-tfdqx\" (UID: \"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\") " pod="openshift-multus/multus-tfdqx" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.701526 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ef6f0aff-a0e5-477a-9392-71bc49bfa7aa-etc-kubernetes\") pod \"multus-tfdqx\" (UID: \"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\") " pod="openshift-multus/multus-tfdqx" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.701531 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ef6f0aff-a0e5-477a-9392-71bc49bfa7aa-cni-binary-copy\") pod \"multus-tfdqx\" (UID: \"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\") " pod="openshift-multus/multus-tfdqx" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.701553 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c4653370-cd31-4487-8b63-50b145f55853-ovn-node-metrics-cert\") pod \"ovnkube-node-wckcs\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") " pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 
12:56:12.701581 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ef6f0aff-a0e5-477a-9392-71bc49bfa7aa-multus-socket-dir-parent\") pod \"multus-tfdqx\" (UID: \"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\") " pod="openshift-multus/multus-tfdqx" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.701598 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ef6f0aff-a0e5-477a-9392-71bc49bfa7aa-etc-kubernetes\") pod \"multus-tfdqx\" (UID: \"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\") " pod="openshift-multus/multus-tfdqx" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.701610 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-host-cni-bin\") pod \"ovnkube-node-wckcs\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") " pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.701614 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c4653370-cd31-4487-8b63-50b145f55853-env-overrides\") pod \"ovnkube-node-wckcs\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") " pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.701636 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qwbk\" (UniqueName: \"kubernetes.io/projected/ef6f0aff-a0e5-477a-9392-71bc49bfa7aa-kube-api-access-7qwbk\") pod \"multus-tfdqx\" (UID: \"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\") " pod="openshift-multus/multus-tfdqx" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.701673 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" 
(UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-host-run-ovn-kubernetes\") pod \"ovnkube-node-wckcs\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") " pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.701691 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-host-cni-bin\") pod \"ovnkube-node-wckcs\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") " pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.701716 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d0464374-65ce-46f6-9e7e-79018b92aff4-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tsff9\" (UID: \"d0464374-65ce-46f6-9e7e-79018b92aff4\") " pod="openshift-multus/multus-additional-cni-plugins-tsff9" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.701729 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-host-run-ovn-kubernetes\") pod \"ovnkube-node-wckcs\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") " pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.701745 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-run-openvswitch\") pod \"ovnkube-node-wckcs\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") " pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.701752 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/ef6f0aff-a0e5-477a-9392-71bc49bfa7aa-multus-socket-dir-parent\") pod \"multus-tfdqx\" (UID: \"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\") " pod="openshift-multus/multus-tfdqx" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.701758 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c4653370-cd31-4487-8b63-50b145f55853-ovnkube-script-lib\") pod \"ovnkube-node-wckcs\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") " pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.701544 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d0464374-65ce-46f6-9e7e-79018b92aff4-os-release\") pod \"multus-additional-cni-plugins-tsff9\" (UID: \"d0464374-65ce-46f6-9e7e-79018b92aff4\") " pod="openshift-multus/multus-additional-cni-plugins-tsff9" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.701770 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ef6f0aff-a0e5-477a-9392-71bc49bfa7aa-host-var-lib-cni-multus\") pod \"multus-tfdqx\" (UID: \"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\") " pod="openshift-multus/multus-tfdqx" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.701799 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-host-run-netns\") pod \"ovnkube-node-wckcs\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") " pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.701815 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ef6f0aff-a0e5-477a-9392-71bc49bfa7aa-multus-cni-dir\") pod 
\"multus-tfdqx\" (UID: \"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\") " pod="openshift-multus/multus-tfdqx" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.701828 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-host-run-netns\") pod \"ovnkube-node-wckcs\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") " pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.701850 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-run-openvswitch\") pod \"ovnkube-node-wckcs\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") " pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.701851 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-var-lib-openvswitch\") pod \"ovnkube-node-wckcs\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") " pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.701863 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c4653370-cd31-4487-8b63-50b145f55853-ovnkube-config\") pod \"ovnkube-node-wckcs\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") " pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.701871 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-var-lib-openvswitch\") pod \"ovnkube-node-wckcs\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.701893 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ef6f0aff-a0e5-477a-9392-71bc49bfa7aa-host-var-lib-cni-multus\") pod \"multus-tfdqx\" (UID: \"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\") " pod="openshift-multus/multus-tfdqx" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.701951 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-run-ovn\") pod \"ovnkube-node-wckcs\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") " pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.701963 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d0464374-65ce-46f6-9e7e-79018b92aff4-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tsff9\" (UID: \"d0464374-65ce-46f6-9e7e-79018b92aff4\") " pod="openshift-multus/multus-additional-cni-plugins-tsff9" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.701982 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-run-ovn\") pod \"ovnkube-node-wckcs\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") " pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.701982 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-log-socket\") pod \"ovnkube-node-wckcs\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") " pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.702007 
4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:12Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.702032 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d0464374-65ce-46f6-9e7e-79018b92aff4-system-cni-dir\") pod \"multus-additional-cni-plugins-tsff9\" (UID: \"d0464374-65ce-46f6-9e7e-79018b92aff4\") " pod="openshift-multus/multus-additional-cni-plugins-tsff9" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.702087 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-log-socket\") pod \"ovnkube-node-wckcs\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") " pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.702092 4735 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d0464374-65ce-46f6-9e7e-79018b92aff4-system-cni-dir\") pod \"multus-additional-cni-plugins-tsff9\" (UID: \"d0464374-65ce-46f6-9e7e-79018b92aff4\") " pod="openshift-multus/multus-additional-cni-plugins-tsff9" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.702143 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ef6f0aff-a0e5-477a-9392-71bc49bfa7aa-system-cni-dir\") pod \"multus-tfdqx\" (UID: \"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\") " pod="openshift-multus/multus-tfdqx" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.702173 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-node-log\") pod \"ovnkube-node-wckcs\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") " pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.702201 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d0464374-65ce-46f6-9e7e-79018b92aff4-cni-binary-copy\") pod \"multus-additional-cni-plugins-tsff9\" (UID: \"d0464374-65ce-46f6-9e7e-79018b92aff4\") " pod="openshift-multus/multus-additional-cni-plugins-tsff9" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.702222 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-node-log\") pod \"ovnkube-node-wckcs\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") " pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.702224 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzwrg\" (UniqueName: 
\"kubernetes.io/projected/d0464374-65ce-46f6-9e7e-79018b92aff4-kube-api-access-lzwrg\") pod \"multus-additional-cni-plugins-tsff9\" (UID: \"d0464374-65ce-46f6-9e7e-79018b92aff4\") " pod="openshift-multus/multus-additional-cni-plugins-tsff9" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.702240 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ef6f0aff-a0e5-477a-9392-71bc49bfa7aa-system-cni-dir\") pod \"multus-tfdqx\" (UID: \"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\") " pod="openshift-multus/multus-tfdqx" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.702262 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-host-kubelet\") pod \"ovnkube-node-wckcs\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") " pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.702290 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-host-kubelet\") pod \"ovnkube-node-wckcs\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") " pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.702317 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ef6f0aff-a0e5-477a-9392-71bc49bfa7aa-multus-daemon-config\") pod \"multus-tfdqx\" (UID: \"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\") " pod="openshift-multus/multus-tfdqx" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.702364 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ef6f0aff-a0e5-477a-9392-71bc49bfa7aa-host-run-multus-certs\") pod 
\"multus-tfdqx\" (UID: \"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\") " pod="openshift-multus/multus-tfdqx" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.702776 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d0464374-65ce-46f6-9e7e-79018b92aff4-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tsff9\" (UID: \"d0464374-65ce-46f6-9e7e-79018b92aff4\") " pod="openshift-multus/multus-additional-cni-plugins-tsff9" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.702983 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d0464374-65ce-46f6-9e7e-79018b92aff4-cni-binary-copy\") pod \"multus-additional-cni-plugins-tsff9\" (UID: \"d0464374-65ce-46f6-9e7e-79018b92aff4\") " pod="openshift-multus/multus-additional-cni-plugins-tsff9" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.705030 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c4653370-cd31-4487-8b63-50b145f55853-ovn-node-metrics-cert\") pod \"ovnkube-node-wckcs\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") " pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.724181 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hmnq\" (UniqueName: \"kubernetes.io/projected/c4653370-cd31-4487-8b63-50b145f55853-kube-api-access-9hmnq\") pod \"ovnkube-node-wckcs\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") " pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.743601 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.746253 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qwbk\" (UniqueName: \"kubernetes.io/projected/ef6f0aff-a0e5-477a-9392-71bc49bfa7aa-kube-api-access-7qwbk\") pod \"multus-tfdqx\" (UID: \"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\") " pod="openshift-multus/multus-tfdqx" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.776425 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-tfdqx" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.786194 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzwrg\" (UniqueName: \"kubernetes.io/projected/d0464374-65ce-46f6-9e7e-79018b92aff4-kube-api-access-lzwrg\") pod \"multus-additional-cni-plugins-tsff9\" (UID: \"d0464374-65ce-46f6-9e7e-79018b92aff4\") " pod="openshift-multus/multus-additional-cni-plugins-tsff9" Oct 08 12:56:12 crc kubenswrapper[4735]: W1008 12:56:12.793833 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef6f0aff_a0e5_477a_9392_71bc49bfa7aa.slice/crio-9b6c7119a2c39513d44420935ddccc3e8f7e40a1251eca6ba75666287a783c9c WatchSource:0}: Error finding container 9b6c7119a2c39513d44420935ddccc3e8f7e40a1251eca6ba75666287a783c9c: Status 404 returned error can't find the container with id 9b6c7119a2c39513d44420935ddccc3e8f7e40a1251eca6ba75666287a783c9c Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.799790 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d289b739-3ef2-4ee7-bb9a-b5cd8ab2b999\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f9030479e031ff1768e372e6da2a0806dc393db70aed31961e3a9c89a9e789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://852a4dea1c1d8a40fc2ba89661ff17845ee58b62e45f4a7df9a8af505d361dd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e511b82acaea281ca468fbdce1eaa7ec3a5df59d91dc90fc500bebb08d51b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ce3b507093adcc6becc1a3b28e836174ee50ff4d3f583aa84badf4c1f3cb1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:12Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.838268 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b69f0d46aba2faee01906db32a5a6785e755b2eee94ccbaf1e438c63798f3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:12Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.879674 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:12Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.893981 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.893981 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:56:12 crc kubenswrapper[4735]: E1008 12:56:12.894115 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 12:56:12 crc kubenswrapper[4735]: E1008 12:56:12.894170 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.894000 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 12:56:12 crc kubenswrapper[4735]: E1008 12:56:12.894264 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.897715 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.917829 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:12Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:12 crc kubenswrapper[4735]: I1008 12:56:12.960638 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28bb195cc66fff1263a0e92d27d57e97c81b63f873cb5e85d5271ac4fad40a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e88fb0202a08e60ff9cb6528688a4312ec5433491918326a5b46a5fb5e1de888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:12Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:13 crc kubenswrapper[4735]: I1008 12:56:13.000299 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tfdqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qwbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tfdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:12Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:13 crc kubenswrapper[4735]: I1008 12:56:13.039005 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jnmqc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed550b4-694a-4a43-99fc-f869224d6cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qjms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jnmqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:13Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:13 crc kubenswrapper[4735]: I1008 12:56:13.041008 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" event={"ID":"2dfd0474-f64a-4d3b-9990-df182d079b10","Type":"ContainerStarted","Data":"4321c24e125d489ad4c28ca906b5143f7017d14e0424cb188a1a031fd07de603"} Oct 08 12:56:13 crc kubenswrapper[4735]: I1008 12:56:13.041061 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" 
event={"ID":"2dfd0474-f64a-4d3b-9990-df182d079b10","Type":"ContainerStarted","Data":"7fefb46d9936f914f5667f539f69ddaaf85342cc341db62c3eda312aa07a78d9"} Oct 08 12:56:13 crc kubenswrapper[4735]: I1008 12:56:13.041076 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" event={"ID":"2dfd0474-f64a-4d3b-9990-df182d079b10","Type":"ContainerStarted","Data":"e98b994405e3d67933262c20deab57e742c42f1e7ef381789ecd7399ae900c43"} Oct 08 12:56:13 crc kubenswrapper[4735]: I1008 12:56:13.042598 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-jnmqc" event={"ID":"bed550b4-694a-4a43-99fc-f869224d6cca","Type":"ContainerStarted","Data":"b0a97189f5acb11a4baa7838d67360d6606941daf7520c583dd596cfab265203"} Oct 08 12:56:13 crc kubenswrapper[4735]: I1008 12:56:13.043879 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tfdqx" event={"ID":"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa","Type":"ContainerStarted","Data":"33cd5d891ac473da94cdc0e5c4bcfcbe2ef79a76d13f87ad077df153496ea515"} Oct 08 12:56:13 crc kubenswrapper[4735]: I1008 12:56:13.043910 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tfdqx" event={"ID":"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa","Type":"ContainerStarted","Data":"9b6c7119a2c39513d44420935ddccc3e8f7e40a1251eca6ba75666287a783c9c"} Oct 08 12:56:13 crc kubenswrapper[4735]: I1008 12:56:13.045300 4735 generic.go:334] "Generic (PLEG): container finished" podID="c4653370-cd31-4487-8b63-50b145f55853" containerID="5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2" exitCode=0 Oct 08 12:56:13 crc kubenswrapper[4735]: I1008 12:56:13.045955 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" event={"ID":"c4653370-cd31-4487-8b63-50b145f55853","Type":"ContainerDied","Data":"5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2"} Oct 08 
12:56:13 crc kubenswrapper[4735]: I1008 12:56:13.045985 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" event={"ID":"c4653370-cd31-4487-8b63-50b145f55853","Type":"ContainerStarted","Data":"65557cc3370cfa21b560bb9b626c7859da246de7ce6ca40443e1098741e4ca9b"} Oct 08 12:56:13 crc kubenswrapper[4735]: I1008 12:56:13.056891 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-tsff9" Oct 08 12:56:13 crc kubenswrapper[4735]: I1008 12:56:13.079403 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d289b739-3ef2-4ee7-bb9a-b5cd8ab2b999\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f9030479e031ff1768e372e6da2a0806dc393db70aed31961e3a9c89a9e789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\
\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://852a4dea1c1d8a40fc2ba89661ff17845ee58b62e45f4a7df9a8af505d361dd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e511b82acaea281ca468fbdce1eaa7ec3a5df59d91dc90fc500bebb08d51b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"res
ource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ce3b507093adcc6becc1a3b28e836174ee50ff4d3f583aa84badf4c1f3cb1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:13Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:13 crc kubenswrapper[4735]: I1008 12:56:13.124995 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b69f0d46aba2faee01906db32a5a6785e755b2eee94ccbaf1e438c63798f3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:13Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:13 crc kubenswrapper[4735]: I1008 12:56:13.159526 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:13Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:13 crc kubenswrapper[4735]: I1008 12:56:13.199602 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:13Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:13 crc kubenswrapper[4735]: I1008 12:56:13.248367 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28bb195cc66fff1263a0e92d27d57e97c81b63f873cb5e85d5271ac4fad40a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e88fb0202a08e60ff9cb6528688a4312ec5433491918326a5b46a5fb5e1de888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:13Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:13 crc kubenswrapper[4735]: I1008 12:56:13.294613 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tfdqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qwbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tfdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:13Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:13 crc kubenswrapper[4735]: I1008 12:56:13.319777 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jnmqc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed550b4-694a-4a43-99fc-f869224d6cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qjms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jnmqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:13Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:13 crc kubenswrapper[4735]: I1008 12:56:13.358814 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2dfd0474-f64a-4d3b-9990-df182d079b10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4321c24e125d489ad4c28ca906b5143f7017d14e0424cb188a1a031fd07de603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fefb46d9936f914f5667f539f69ddaaf85342cc
341db62c3eda312aa07a78d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9qwgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:13Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:13 crc kubenswrapper[4735]: I1008 12:56:13.405021 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653370-cd31-4487-8b63-50b145f55853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wckcs\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:13Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:13 crc kubenswrapper[4735]: I1008 12:56:13.446733 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tsff9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0464374-65ce-46f6-9e7e-79018b92aff4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tsff9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:13Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:13 crc kubenswrapper[4735]: I1008 12:56:13.489275 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"844f0143-e7a8-48c9-ae8e-753c5c293f9d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e268579ace7ae7961414b2650b6b55471a9fd6574196c5fa2cefbf6979464b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e620e3a2b5b27ac9469d3ca045be90d19b04f969422d4c2a453650a7d06f0918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e7006d4b199ac7450c0e968adf9714db905f5378e279db304db244d0e5e566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72f60bc244cbde5bb8033c24051f07fd6a2a168d97d89d3abcfaac69e6d75878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa627e3bfab38d53beee2a3201f6856009919d567108614667503273d11d0f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var
/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:13Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:13 crc kubenswrapper[4735]: I1008 12:56:13.525010 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d7f6479-86cc-40be-8b44-cb1c10627aa2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc2f1c56c7be63150786ec037b77127e2df6071efa7d736d402bca8a011ea32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e2c83b125f3789eeff231ab5416af4461e2443e6abaf28dac1cd5a8d5356287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://302c54532d7a142b5a9483334e74f1d96b0f07920d9c3fcb589bf2b52d1c7124\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6272fc7157705e3f1cd86a22b80f722dc7d152785bcc8029e131cd03568a19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a92ee1544b61eda7e486e807f489c1f07cfc70870a2413575ac443203670542\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 12:56:04.373062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 12:56:04.379553 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3954726057/tls.crt::/tmp/serving-cert-3954726057/tls.key\\\\\\\"\\\\nI1008 12:56:10.064922 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 12:56:10.069814 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 12:56:10.069844 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 12:56:10.069873 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 12:56:10.069878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 12:56:10.082030 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 12:56:10.082115 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082144 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 12:56:10.082156 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 12:56:10.082166 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 12:56:10.082177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1008 12:56:10.082039 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1008 12:56:10.085350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f517b378c42cea209cb39c077a6162a3569671ed2c8f91ae7b0b2c411af287a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:13Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:13 crc kubenswrapper[4735]: I1008 12:56:13.563157 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:13Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:13 crc kubenswrapper[4735]: I1008 12:56:13.596036 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:13Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:13 crc kubenswrapper[4735]: I1008 12:56:13.637553 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jnmqc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed550b4-694a-4a43-99fc-f869224d6cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0a97189f5acb11a4baa7838d67360d6606941daf7520c583dd596cfab265203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qjms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jnmqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:13Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:13 crc kubenswrapper[4735]: I1008 12:56:13.679416 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:13Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:13 crc kubenswrapper[4735]: I1008 12:56:13.719819 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:13Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:13 crc kubenswrapper[4735]: I1008 12:56:13.757382 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2dfd0474-f64a-4d3b-9990-df182d079b10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4321c24e125d489ad4c28ca906b5143f7017d14e0424cb188a1a031fd07de603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fefb46d9936f914f5667f539f69ddaaf85342cc
341db62c3eda312aa07a78d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9qwgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:13Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:13 crc kubenswrapper[4735]: I1008 12:56:13.804209 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653370-cd31-4487-8b63-50b145f55853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wckcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:13Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:13 crc kubenswrapper[4735]: I1008 12:56:13.840541 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tsff9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0464374-65ce-46f6-9e7e-79018b92aff4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tsff9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:13Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:13 crc kubenswrapper[4735]: I1008 12:56:13.889810 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"844f0143-e7a8-48c9-ae8e-753c5c293f9d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e268579ace7ae7961414b2650b6b55471a9fd6574196c5fa2cefbf6979464b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e620e3a2b5b27ac9469d3ca045be90d19b04f969422d4c2a453650a7d06f0918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e7006d4b199ac7450c0e968adf9714db905f5378e279db304db244d0e5e566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72f60bc244cbde5bb8033c24051f07fd6a2a168d97d89d3abcfaac69e6d75878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa627e3bfab38d53beee2a3201f6856009919d567108614667503273d11d0f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:13Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:13 crc kubenswrapper[4735]: I1008 12:56:13.925085 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d7f6479-86cc-40be-8b44-cb1c10627aa2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc2f1c56c7be63150786ec037b77127e2df6071efa7d736d402bca8a011ea32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e2c83b125f3789eeff231ab5416af4461e2443e6abaf28dac1cd5a8d5356287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://302c54532d7a142b5a9483334e74f1d96b0f07920d9c3fcb589bf2b52d1c7124\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6272fc7157705e3f1cd86a22b80f722dc7d152785bcc8029e131cd03568a19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a92ee1544b61eda7e486e807f489c1f07cfc70870a2413575ac443203670542\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 12:56:04.373062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 12:56:04.379553 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3954726057/tls.crt::/tmp/serving-cert-3954726057/tls.key\\\\\\\"\\\\nI1008 12:56:10.064922 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 12:56:10.069814 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 12:56:10.069844 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 12:56:10.069873 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 12:56:10.069878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 12:56:10.082030 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 12:56:10.082115 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082144 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 12:56:10.082156 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 12:56:10.082166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 12:56:10.082177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1008 12:56:10.082039 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1008 12:56:10.085350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f517b378c42cea209cb39c077a6162a3569671ed2c8f91ae7b0b2c411af287a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:13Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:13 crc kubenswrapper[4735]: I1008 12:56:13.934450 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-qdvjc"] Oct 08 12:56:13 crc kubenswrapper[4735]: I1008 12:56:13.934945 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-qdvjc" Oct 08 12:56:13 crc kubenswrapper[4735]: I1008 12:56:13.961281 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b69f0d46aba2faee01906db32a5a6785e755b2eee94ccbaf1e438c63798f3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:13Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:13 crc kubenswrapper[4735]: I1008 12:56:13.970598 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Oct 08 12:56:13 crc kubenswrapper[4735]: I1008 12:56:13.990775 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Oct 08 12:56:14 crc kubenswrapper[4735]: I1008 12:56:14.010607 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Oct 08 12:56:14 crc kubenswrapper[4735]: I1008 12:56:14.015231 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/28749736-d31e-40f2-84e8-a7839d8da376-host\") pod \"node-ca-qdvjc\" (UID: \"28749736-d31e-40f2-84e8-a7839d8da376\") " pod="openshift-image-registry/node-ca-qdvjc" Oct 08 12:56:14 crc kubenswrapper[4735]: I1008 12:56:14.015387 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/28749736-d31e-40f2-84e8-a7839d8da376-serviceca\") pod \"node-ca-qdvjc\" (UID: \"28749736-d31e-40f2-84e8-a7839d8da376\") " pod="openshift-image-registry/node-ca-qdvjc" Oct 08 12:56:14 crc kubenswrapper[4735]: I1008 12:56:14.015430 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcv62\" (UniqueName: \"kubernetes.io/projected/28749736-d31e-40f2-84e8-a7839d8da376-kube-api-access-xcv62\") pod \"node-ca-qdvjc\" (UID: \"28749736-d31e-40f2-84e8-a7839d8da376\") " pod="openshift-image-registry/node-ca-qdvjc" Oct 08 12:56:14 crc kubenswrapper[4735]: I1008 12:56:14.030858 4735 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-image-registry"/"kube-root-ca.crt" Oct 08 12:56:14 crc kubenswrapper[4735]: I1008 12:56:14.053996 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"bc6814f3ae991f4c4576eaafb7f325021fe4ef3387b84948efb093150a7974d3"} Oct 08 12:56:14 crc kubenswrapper[4735]: I1008 12:56:14.058400 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" event={"ID":"c4653370-cd31-4487-8b63-50b145f55853","Type":"ContainerStarted","Data":"f1fc9ee7ab9490b5462704528e6923ee420e0c9f3c0ed22e7041fedc8dfdfeb3"} Oct 08 12:56:14 crc kubenswrapper[4735]: I1008 12:56:14.058441 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" event={"ID":"c4653370-cd31-4487-8b63-50b145f55853","Type":"ContainerStarted","Data":"b6da8f9945b0943287cecc7782253cd88835d3a5a823ae5e2107423b9bdc4e42"} Oct 08 12:56:14 crc kubenswrapper[4735]: I1008 12:56:14.058460 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" event={"ID":"c4653370-cd31-4487-8b63-50b145f55853","Type":"ContainerStarted","Data":"cb1bc51dca9b3bf6c136fe146aa422fa40eb512385617607f682723053988f7b"} Oct 08 12:56:14 crc kubenswrapper[4735]: I1008 12:56:14.058499 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" event={"ID":"c4653370-cd31-4487-8b63-50b145f55853","Type":"ContainerStarted","Data":"cc88f8b567ef0f69f9df25a237c40b32bbd518680da75fe3cabd546713423bf2"} Oct 08 12:56:14 crc kubenswrapper[4735]: I1008 12:56:14.059905 4735 generic.go:334] "Generic (PLEG): container finished" podID="d0464374-65ce-46f6-9e7e-79018b92aff4" containerID="e01ac6683194ace06c37ecdbcda8584329034bc8fd0e6c532dcc6d7a036a72b3" exitCode=0 Oct 08 12:56:14 crc kubenswrapper[4735]: I1008 12:56:14.060256 4735 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tsff9" event={"ID":"d0464374-65ce-46f6-9e7e-79018b92aff4","Type":"ContainerDied","Data":"e01ac6683194ace06c37ecdbcda8584329034bc8fd0e6c532dcc6d7a036a72b3"} Oct 08 12:56:14 crc kubenswrapper[4735]: I1008 12:56:14.060292 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tsff9" event={"ID":"d0464374-65ce-46f6-9e7e-79018b92aff4","Type":"ContainerStarted","Data":"3ba4d3460eba5c830e20ea060149d7ffaf077274c7922da9614e8ec90ad6472a"} Oct 08 12:56:14 crc kubenswrapper[4735]: I1008 12:56:14.077686 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could 
not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:14Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:14 crc kubenswrapper[4735]: I1008 12:56:14.118509 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/28749736-d31e-40f2-84e8-a7839d8da376-host\") pod \"node-ca-qdvjc\" (UID: \"28749736-d31e-40f2-84e8-a7839d8da376\") " pod="openshift-image-registry/node-ca-qdvjc" Oct 08 12:56:14 crc kubenswrapper[4735]: I1008 12:56:14.118565 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/28749736-d31e-40f2-84e8-a7839d8da376-serviceca\") pod \"node-ca-qdvjc\" (UID: \"28749736-d31e-40f2-84e8-a7839d8da376\") " pod="openshift-image-registry/node-ca-qdvjc" Oct 08 12:56:14 crc kubenswrapper[4735]: I1008 12:56:14.118583 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcv62\" 
(UniqueName: \"kubernetes.io/projected/28749736-d31e-40f2-84e8-a7839d8da376-kube-api-access-xcv62\") pod \"node-ca-qdvjc\" (UID: \"28749736-d31e-40f2-84e8-a7839d8da376\") " pod="openshift-image-registry/node-ca-qdvjc" Oct 08 12:56:14 crc kubenswrapper[4735]: I1008 12:56:14.118624 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/28749736-d31e-40f2-84e8-a7839d8da376-host\") pod \"node-ca-qdvjc\" (UID: \"28749736-d31e-40f2-84e8-a7839d8da376\") " pod="openshift-image-registry/node-ca-qdvjc" Oct 08 12:56:14 crc kubenswrapper[4735]: I1008 12:56:14.118885 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d289b739-3ef2-4ee7-bb9a-b5cd8ab2b999\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f9030479e031ff1768e372e6da2a0806dc393db70aed31961e3a9c89a9e789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07
372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://852a4dea1c1d8a40fc2ba89661ff17845ee58b62e45f4a7df9a8af505d361dd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e511b82acaea281ca468fbdce1eaa7ec3a5df59d91dc90fc500bebb08d51b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ce3b507093adcc6becc1a3b28e836174ee50ff4d3f583aa84badf4c1f3cb1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:14Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:14 crc kubenswrapper[4735]: I1008 12:56:14.119460 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/28749736-d31e-40f2-84e8-a7839d8da376-serviceca\") pod \"node-ca-qdvjc\" (UID: 
\"28749736-d31e-40f2-84e8-a7839d8da376\") " pod="openshift-image-registry/node-ca-qdvjc" Oct 08 12:56:14 crc kubenswrapper[4735]: I1008 12:56:14.152328 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcv62\" (UniqueName: \"kubernetes.io/projected/28749736-d31e-40f2-84e8-a7839d8da376-kube-api-access-xcv62\") pod \"node-ca-qdvjc\" (UID: \"28749736-d31e-40f2-84e8-a7839d8da376\") " pod="openshift-image-registry/node-ca-qdvjc" Oct 08 12:56:14 crc kubenswrapper[4735]: I1008 12:56:14.180386 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tfdqx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33cd5d891ac473da94cdc0e5c4bcfcbe2ef79a76d13f87ad077df153496ea515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b6
7b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qwbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"p
odIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tfdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:14Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:14 crc kubenswrapper[4735]: I1008 12:56:14.218810 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:14Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:14 crc kubenswrapper[4735]: I1008 12:56:14.260332 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28bb195cc66fff1263a0e92d27d57e97c81b63f873cb5e85d5271ac4fad40a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e88fb0202a08e60ff9cb6528688a4312ec5433491918326a5b46a5fb5e1de888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:14Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:14 crc kubenswrapper[4735]: I1008 12:56:14.305097 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"844f0143-e7a8-48c9-ae8e-753c5c293f9d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e268579ace7ae7961414b2650b6b55471a9fd6574196c5fa2cefbf6979464b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e620e3a2b5b27ac9469d3ca045be90d19b04f969422d4c2a453650a7d06f0918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e7006d4b199ac7450c0e968adf9714db905f5378e279db304db244d0e5e566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72f60bc244cbde5bb8033c24051f07fd6a2a168d97d89d3abcfaac69e6d75878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa627e3bfab38d53beee2a3201f6856009919d567108614667503273d11d0f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:14Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:14 crc kubenswrapper[4735]: I1008 12:56:14.317117 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-qdvjc" Oct 08 12:56:14 crc kubenswrapper[4735]: I1008 12:56:14.343359 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d7f6479-86cc-40be-8b44-cb1c10627aa2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc2f1c56c7be63150786ec037b77127e2df6071efa7d736d402bca8a011ea32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e2c83b125f3789eeff231ab5416af4461e2443e6abaf28dac1cd5a8d5356287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://302c54532d7a142b5a9483334e74f1d96b0f07920d9c3fcb589bf2b52d1c7124\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6272fc7157705e3f1cd86a22b80f722dc7d152785bcc8029e131cd03568a19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a92ee1544b61eda7e486e807f489c1f07cfc70870a2413575ac443203670542\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 12:56:04.373062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 12:56:04.379553 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3954726057/tls.crt::/tmp/serving-cert-3954726057/tls.key\\\\\\\"\\\\nI1008 12:56:10.064922 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 12:56:10.069814 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 12:56:10.069844 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 12:56:10.069873 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 12:56:10.069878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 12:56:10.082030 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 12:56:10.082115 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082144 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 12:56:10.082156 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 12:56:10.082166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 12:56:10.082177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1008 12:56:10.082039 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1008 12:56:10.085350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f517b378c42cea209cb39c077a6162a3569671ed2c8f91ae7b0b2c411af287a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:14Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:14 crc kubenswrapper[4735]: I1008 12:56:14.377962 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:14Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:14 crc kubenswrapper[4735]: I1008 12:56:14.419084 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc6814f3ae991f4c4576eaafb7f325021fe4ef3387b84948efb093150a7974d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T12:56:14Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:14 crc kubenswrapper[4735]: I1008 12:56:14.421637 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 12:56:14 crc kubenswrapper[4735]: I1008 12:56:14.421776 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:56:14 crc kubenswrapper[4735]: E1008 12:56:14.421843 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 12:56:18.421813822 +0000 UTC m=+28.230118376 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:56:14 crc kubenswrapper[4735]: E1008 12:56:14.421904 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 12:56:14 crc kubenswrapper[4735]: I1008 12:56:14.421905 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:56:14 crc kubenswrapper[4735]: E1008 12:56:14.421922 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 12:56:14 crc kubenswrapper[4735]: E1008 12:56:14.421935 4735 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 12:56:14 crc kubenswrapper[4735]: I1008 12:56:14.421972 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:56:14 crc kubenswrapper[4735]: E1008 12:56:14.422012 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-08 12:56:18.422002587 +0000 UTC m=+28.230307141 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 12:56:14 crc kubenswrapper[4735]: E1008 12:56:14.422022 4735 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 12:56:14 crc kubenswrapper[4735]: E1008 12:56:14.422073 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 12:56:18.422061469 +0000 UTC m=+28.230366013 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 12:56:14 crc kubenswrapper[4735]: E1008 12:56:14.422080 4735 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 12:56:14 crc kubenswrapper[4735]: E1008 12:56:14.422113 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 12:56:18.42210533 +0000 UTC m=+28.230409894 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 12:56:14 crc kubenswrapper[4735]: I1008 12:56:14.457777 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2dfd0474-f64a-4d3b-9990-df182d079b10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4321c24e125d489ad4c28ca906b5143f7017d14e0424cb188a1a031fd07de603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fefb46d9936f914f5667f539f69ddaaf85342cc
341db62c3eda312aa07a78d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9qwgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:14Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:14 crc kubenswrapper[4735]: I1008 12:56:14.503275 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653370-cd31-4487-8b63-50b145f55853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wckcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:14Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:14 crc kubenswrapper[4735]: I1008 12:56:14.522795 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 12:56:14 crc kubenswrapper[4735]: E1008 12:56:14.522968 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 12:56:14 crc kubenswrapper[4735]: E1008 12:56:14.522991 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 12:56:14 crc kubenswrapper[4735]: E1008 12:56:14.523004 4735 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 
08 12:56:14 crc kubenswrapper[4735]: E1008 12:56:14.523063 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-08 12:56:18.523048394 +0000 UTC m=+28.331352938 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 12:56:14 crc kubenswrapper[4735]: I1008 12:56:14.540090 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tsff9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0464374-65ce-46f6-9e7e-79018b92aff4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e01ac6683194ace06c37ecdbcda8584329034bc8fd0e6c532dcc6d7a036a72b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e01ac6683194ace06c37ecdbcda8584329034bc8fd0e6c532dcc6d7a036a72b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/b
in\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tsff9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:14Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:14 crc kubenswrapper[4735]: I1008 12:56:14.579888 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d289b739-3ef2-4ee7-bb9a-b5cd8ab2b999\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f9030479e031ff1768e372e6da2a0806dc393db70aed31961e3a9c89a9e789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://852a4dea1c1d8a40fc2ba89661ff17845ee58b62e45f4a7df9a8af505d361dd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e511b82acaea281ca468fbdce1eaa7ec3a5df59d91dc90fc500bebb08d51b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ce3b507093adcc6becc1a3b28e836174ee50ff4d3f583aa84badf4c1f3cb1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:14Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:14 crc kubenswrapper[4735]: I1008 12:56:14.617805 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b69f0d46aba2faee01906db32a5a6785e755b2eee94ccbaf1e438c63798f3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:14Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:14 crc kubenswrapper[4735]: I1008 12:56:14.658156 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:14Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:14 crc kubenswrapper[4735]: I1008 12:56:14.698605 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qdvjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28749736-d31e-40f2-84e8-a7839d8da376\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcv62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qdvjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:14Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:14 crc kubenswrapper[4735]: I1008 12:56:14.739236 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:14Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:14 crc kubenswrapper[4735]: I1008 12:56:14.795462 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28bb195cc66fff1263a0e92d27d57e97c81b63f873cb5e85d5271ac4fad40a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e88fb0202a08e60ff9cb6528688a4312ec5433491918326a5b46a5fb5e1de888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:14Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:14 crc kubenswrapper[4735]: I1008 12:56:14.831146 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tfdqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33cd5d891ac473da94cdc0e5c4bcfcbe2ef79a76d13f87ad077df153496ea515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qwbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tfdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:14Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:14 crc kubenswrapper[4735]: I1008 12:56:14.856307 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jnmqc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed550b4-694a-4a43-99fc-f869224d6cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0a97189f5acb11a4baa7838d67360d6606941daf7520c583dd596cfab265203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qjms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jnmqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:14Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:14 crc kubenswrapper[4735]: I1008 12:56:14.894636 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:56:14 crc kubenswrapper[4735]: E1008 12:56:14.894757 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 12:56:14 crc kubenswrapper[4735]: I1008 12:56:14.894635 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 12:56:14 crc kubenswrapper[4735]: I1008 12:56:14.894657 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:56:14 crc kubenswrapper[4735]: E1008 12:56:14.894929 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 12:56:14 crc kubenswrapper[4735]: E1008 12:56:14.894937 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 12:56:15 crc kubenswrapper[4735]: I1008 12:56:15.066948 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" event={"ID":"c4653370-cd31-4487-8b63-50b145f55853","Type":"ContainerStarted","Data":"492da61345c2220eef06ac2e1979e5402ebb8deb0e3747bb3620ecccd4e83a3b"} Oct 08 12:56:15 crc kubenswrapper[4735]: I1008 12:56:15.066994 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" event={"ID":"c4653370-cd31-4487-8b63-50b145f55853","Type":"ContainerStarted","Data":"58cc5c76f1748baef586618dd49d3392757ef32e4f28a019979f84366e8879c4"} Oct 08 12:56:15 crc kubenswrapper[4735]: I1008 12:56:15.068786 4735 generic.go:334] "Generic (PLEG): container finished" podID="d0464374-65ce-46f6-9e7e-79018b92aff4" containerID="48e086e1263cfb47daf4b8a395b1335f2e575f9d8c060983ba7090bbe27033d7" exitCode=0 Oct 08 12:56:15 crc kubenswrapper[4735]: I1008 12:56:15.068831 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tsff9" event={"ID":"d0464374-65ce-46f6-9e7e-79018b92aff4","Type":"ContainerDied","Data":"48e086e1263cfb47daf4b8a395b1335f2e575f9d8c060983ba7090bbe27033d7"} Oct 08 12:56:15 crc kubenswrapper[4735]: I1008 12:56:15.070686 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-qdvjc" 
event={"ID":"28749736-d31e-40f2-84e8-a7839d8da376","Type":"ContainerStarted","Data":"680500d4224fd49593a4fa7294d1c49f059f330b2be33e3ef9ceccc6bd9b9a6a"} Oct 08 12:56:15 crc kubenswrapper[4735]: I1008 12:56:15.070734 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-qdvjc" event={"ID":"28749736-d31e-40f2-84e8-a7839d8da376","Type":"ContainerStarted","Data":"8efa95fb21ec1f5dfd8ff30d35688bb34b0485b5ca4fe1d930d08437b58f61bb"} Oct 08 12:56:15 crc kubenswrapper[4735]: I1008 12:56:15.083208 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jnmqc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed550b4-694a-4a43-99fc-f869224d6cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0a97189f5acb11a4baa7838d67360d6606941daf7520c583dd596cfab265203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb27
6703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qjms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jnmqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:15Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:15 crc kubenswrapper[4735]: I1008 12:56:15.096168 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2dfd0474-f64a-4d3b-9990-df182d079b10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4321c24e125d489ad4c28ca906b5143f7017d14e0424cb188a1a031fd07de603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fefb46d9936f914f5667f539f69ddaaf85342cc
341db62c3eda312aa07a78d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9qwgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:15Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:15 crc kubenswrapper[4735]: I1008 12:56:15.113890 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653370-cd31-4487-8b63-50b145f55853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wckcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:15Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:15 crc kubenswrapper[4735]: I1008 12:56:15.130224 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tsff9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0464374-65ce-46f6-9e7e-79018b92aff4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e01ac6683194ace06c37ecdbcda8584329034bc8fd0e6c532dcc6d7a036a72b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e01ac6683194ace06c37ecdbcda8584329034bc8fd0e6c532dcc6d7a036a72b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e086e1263cfb47daf4b8a395b1335f2e575f9d8c060983ba7090bbe27033d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48e086e1263cfb47daf4b8a395b1335f2e575f9d8c060983ba7090bbe27033d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"re
ason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tsff9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:15Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:15 crc kubenswrapper[4735]: I1008 12:56:15.150153 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"844f0143-e7a8-48c9-ae8e-753c5c293f9d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e268579ace7ae7961414b2650b6b55471a9fd6574196c5fa2cefbf6979464b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e620e3a2b5b27ac9469d3ca045be90d19b04f969422d4c2a453650a7d06f0918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e7006d4b199ac7450c0e968adf9714db905f5378e279db304db244d0e5e566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72f60bc244cbde5bb8033c24051f07fd6a2a168d97d89d3abcfaac69e6d75878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa627e3bfab38d53beee2a3201f6856009919d567108614667503273d11d0f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:15Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:15 crc kubenswrapper[4735]: I1008 12:56:15.166299 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d7f6479-86cc-40be-8b44-cb1c10627aa2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc2f1c56c7be63150786ec037b77127e2df6071efa7d736d402bca8a011ea32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e2c83b125f3789eeff231ab5416af4461e2443e6abaf28dac1cd5a8d5356287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://302c54532d7a142b5a9483334e74f1d96b0f07920d9c3fcb589bf2b52d1c7124\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6272fc7157705e3f1cd86a22b80f722dc7d152785bcc8029e131cd03568a19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a92ee1544b61eda7e486e807f489c1f07cfc70870a2413575ac443203670542\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 12:56:04.373062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 12:56:04.379553 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3954726057/tls.crt::/tmp/serving-cert-3954726057/tls.key\\\\\\\"\\\\nI1008 12:56:10.064922 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 12:56:10.069814 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 12:56:10.069844 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 12:56:10.069873 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 12:56:10.069878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 12:56:10.082030 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 12:56:10.082115 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082144 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 12:56:10.082156 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 12:56:10.082166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 12:56:10.082177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1008 12:56:10.082039 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1008 12:56:10.085350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f517b378c42cea209cb39c077a6162a3569671ed2c8f91ae7b0b2c411af287a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:15Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:15 crc kubenswrapper[4735]: I1008 12:56:15.179821 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:15Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:15 crc kubenswrapper[4735]: I1008 12:56:15.190751 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc6814f3ae991f4c4576eaafb7f325021fe4ef3387b84948efb093150a7974d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T12:56:15Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:15 crc kubenswrapper[4735]: I1008 12:56:15.220519 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d289b739-3ef2-4ee7-bb9a-b5cd8ab2b999\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f9030479e031ff1768e372e6da2a0806dc393db70aed31961e3a9c89a9e789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://852a4dea1c1d8a40fc2ba89661ff17845ee58b62e45f4a7df9a8af505d361dd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e511b82acaea281ca468fbdce1eaa7ec3a5df59d91dc90fc500bebb08d51b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ce3b507093adcc6becc1a3b28e836174ee50ff4d3f583aa84badf4c1f3cb1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshif
t-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:15Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:15 crc kubenswrapper[4735]: I1008 12:56:15.260436 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b69f0d46aba2faee01906db32a5a6785e755b2eee94ccbaf1e438c63798f3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:15Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:15 crc kubenswrapper[4735]: I1008 12:56:15.299161 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:15Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:15 crc kubenswrapper[4735]: I1008 12:56:15.336998 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qdvjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28749736-d31e-40f2-84e8-a7839d8da376\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcv62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qdvjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:15Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:15 crc kubenswrapper[4735]: I1008 12:56:15.378687 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:15Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:15 crc kubenswrapper[4735]: I1008 12:56:15.419245 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28bb195cc66fff1263a0e92d27d57e97c81b63f873cb5e85d5271ac4fad40a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e88fb0202a08e60ff9cb6528688a4312ec5433491918326a5b46a5fb5e1de888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:15Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:15 crc kubenswrapper[4735]: I1008 12:56:15.458539 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tfdqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33cd5d891ac473da94cdc0e5c4bcfcbe2ef79a76d13f87ad077df153496ea515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qwbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tfdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:15Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:15 crc kubenswrapper[4735]: I1008 12:56:15.498354 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jnmqc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed550b4-694a-4a43-99fc-f869224d6cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0a97189f5acb11a4baa7838d67360d6606941daf7520c583dd596cfab265203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qjms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jnmqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:15Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:15 crc kubenswrapper[4735]: I1008 12:56:15.537576 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2dfd0474-f64a-4d3b-9990-df182d079b10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4321c24e125d489ad4c28ca906b5143f7017d14e0424cb188a1a031fd07de603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fefb46d9936f914f5667f539f69ddaaf85342cc341db62c3eda312aa07a78d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9qwgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:15Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:15 crc kubenswrapper[4735]: I1008 12:56:15.584774 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653370-cd31-4487-8b63-50b145f55853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wckcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:15Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:15 crc kubenswrapper[4735]: I1008 12:56:15.619977 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tsff9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0464374-65ce-46f6-9e7e-79018b92aff4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e01ac6683194ace06c37ecdbcda8584329034bc8fd0e6c532dcc6d7a036a72b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e01ac6683194ace06c37ecdbcda8584329034bc8fd0e6c532dcc6d7a036a72b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e086e1263cfb47daf4b8a395b1335f2e575f9d8c060983ba7090bbe27033d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48e086e1263cfb47daf4b8a395b1335f2e575f9d8c060983ba7090bbe27033d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"re
ason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tsff9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:15Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:15 crc kubenswrapper[4735]: I1008 12:56:15.663955 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"844f0143-e7a8-48c9-ae8e-753c5c293f9d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e268579ace7ae7961414b2650b6b55471a9fd6574196c5fa2cefbf6979464b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e620e3a2b5b27ac9469d3ca045be90d19b04f969422d4c2a453650a7d06f0918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e7006d4b199ac7450c0e968adf9714db905f5378e279db304db244d0e5e566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72f60bc244cbde5bb8033c24051f07fd6a2a168d97d89d3abcfaac69e6d75878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa627e3bfab38d53beee2a3201f6856009919d567108614667503273d11d0f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:15Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:15 crc kubenswrapper[4735]: I1008 12:56:15.698728 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d7f6479-86cc-40be-8b44-cb1c10627aa2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc2f1c56c7be63150786ec037b77127e2df6071efa7d736d402bca8a011ea32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e2c83b125f3789eeff231ab5416af4461e2443e6abaf28dac1cd5a8d5356287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://302c54532d7a142b5a9483334e74f1d96b0f07920d9c3fcb589bf2b52d1c7124\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6272fc7157705e3f1cd86a22b80f722dc7d152785bcc8029e131cd03568a19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a92ee1544b61eda7e486e807f489c1f07cfc70870a2413575ac443203670542\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 12:56:04.373062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 12:56:04.379553 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3954726057/tls.crt::/tmp/serving-cert-3954726057/tls.key\\\\\\\"\\\\nI1008 12:56:10.064922 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 12:56:10.069814 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 12:56:10.069844 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 12:56:10.069873 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 12:56:10.069878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 12:56:10.082030 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 12:56:10.082115 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082144 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 12:56:10.082156 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 12:56:10.082166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 12:56:10.082177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1008 12:56:10.082039 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1008 12:56:10.085350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f517b378c42cea209cb39c077a6162a3569671ed2c8f91ae7b0b2c411af287a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:15Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:15 crc kubenswrapper[4735]: I1008 12:56:15.738320 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:15Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:15 crc kubenswrapper[4735]: I1008 12:56:15.783092 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc6814f3ae991f4c4576eaafb7f325021fe4ef3387b84948efb093150a7974d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T12:56:15Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:15 crc kubenswrapper[4735]: I1008 12:56:15.818418 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d289b739-3ef2-4ee7-bb9a-b5cd8ab2b999\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f9030479e031ff1768e372e6da2a0806dc393db70aed31961e3a9c89a9e789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://852a4dea1c1d8a40fc2ba89661ff17845ee58b62e45f4a7df9a8af505d361dd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e511b82acaea281ca468fbdce1eaa7ec3a5df59d91dc90fc500bebb08d51b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ce3b507093adcc6becc1a3b28e836174ee50ff4d3f583aa84badf4c1f3cb1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshif
t-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:15Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:15 crc kubenswrapper[4735]: I1008 12:56:15.860413 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b69f0d46aba2faee01906db32a5a6785e755b2eee94ccbaf1e438c63798f3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:15Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:15 crc kubenswrapper[4735]: I1008 12:56:15.897974 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:15Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:15 crc kubenswrapper[4735]: I1008 12:56:15.936787 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qdvjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28749736-d31e-40f2-84e8-a7839d8da376\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://680500d4224fd49593a4fa7294d1c49f059f330b2be33e3ef9ceccc6bd9b9a6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcv62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qdvjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:15Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:15 crc kubenswrapper[4735]: I1008 12:56:15.978377 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:15Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.021917 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28bb195cc66fff1263a0e92d27d57e97c81b63f873cb5e85d5271ac4fad40a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e88fb0202a08e60ff9cb6528688a4312ec5433491918326a5b46a5fb5e1de888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:16Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.061507 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tfdqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33cd5d891ac473da94cdc0e5c4bcfcbe2ef79a76d13f87ad077df153496ea515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qwbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tfdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:16Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.076141 4735 generic.go:334] "Generic (PLEG): container finished" podID="d0464374-65ce-46f6-9e7e-79018b92aff4" 
containerID="aceb87b6bfa4020ca777a104430a285d886fdd8fb37b710f754eac6ad1b4304d" exitCode=0 Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.076198 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tsff9" event={"ID":"d0464374-65ce-46f6-9e7e-79018b92aff4","Type":"ContainerDied","Data":"aceb87b6bfa4020ca777a104430a285d886fdd8fb37b710f754eac6ad1b4304d"} Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.100501 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d289b739-3ef2-4ee7-bb9a-b5cd8ab2b999\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f9030479e031ff1768e372e6da2a0806dc393db70aed31961e3a9c89a9e789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://852a4dea1c1d8a40fc2ba89661ff17845ee58b62e45f4a7df9a8af505d361dd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e511b82acaea281ca468fbdce1eaa7ec3a5df59d91dc90fc500bebb08d51b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ce3b507093adcc6becc1a3b28e836174ee50ff4d3f583aa84badf4c1f3cb1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:16Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.138980 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b69f0d46aba2faee01906db32a5a6785e755b2eee94ccbaf1e438c63798f3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:16Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.178281 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:16Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.216699 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qdvjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28749736-d31e-40f2-84e8-a7839d8da376\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://680500d4224fd49593a4fa7294d1c49f059f330b2be33e3ef9ceccc6bd9b9a6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcv62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qdvjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:16Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.258315 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:16Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.299177 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28bb195cc66fff1263a0e92d27d57e97c81b63f873cb5e85d5271ac4fad40a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e88fb0202a08e60ff9cb6528688a4312ec5433491918326a5b46a5fb5e1de888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:16Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.338424 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tfdqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33cd5d891ac473da94cdc0e5c4bcfcbe2ef79a76d13f87ad077df153496ea515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qwbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tfdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:16Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.378214 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jnmqc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed550b4-694a-4a43-99fc-f869224d6cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0a97189f5acb11a4baa7838d67360d6606941daf7520c583dd596cfab265203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qjms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jnmqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:16Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.426423 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tsff9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0464374-65ce-46f6-9e7e-79018b92aff4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e01ac6683194ace06c37ecdbcda8584329034bc8fd0e6c532dcc6d7a036a72b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e01ac6683194ace06c37ecdbcda8584329034bc8fd0e6c532dcc6d7a036a72b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e086e1263cfb47daf4b8a395b1335f2e575f9d8c060983ba7090bbe27033d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48e086e1263cfb47daf4b8a395b1335f2e575f9d8c060983ba7090bbe27033d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aceb87b6bfa4020ca777a104430a285d886fdd8fb37b710f754eac6ad1b4304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aceb87b6bfa4020ca777a104430a285d886fdd8fb37b710f754eac6ad1b4304d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tsff9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:16Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 
12:56:16.441875 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.443898 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.443940 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.443950 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.444009 4735 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.474284 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"844f0143-e7a8-48c9-ae8e-753c5c293f9d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e268579ace7ae7961414b2650b6b55471a9fd6574196c5fa2cefbf6979464b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e620e3a2b5b27ac9469d3ca045be90d19b04f969422d4c2a453650a7d06f0918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e7006d4b199ac7450c0e968adf9714db905f5378e279db304db244d0e5e566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1
847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72f60bc244cbde5bb8033c24051f07fd6a2a168d97d89d3abcfaac69e6d75878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa627e3bfab38d53beee2a3201f6856009919d567108614667503273d11d0f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7b7c1
89373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:16Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.511695 4735 kubelet_node_status.go:115] "Node was previously registered" node="crc" Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.512143 4735 kubelet_node_status.go:79] "Successfully registered node" node="crc" Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.513594 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.513658 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.513671 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.513694 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.513708 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:16Z","lastTransitionTime":"2025-10-08T12:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:16 crc kubenswrapper[4735]: E1008 12:56:16.528222 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0d020dc1-103d-4051-9317-7ecb0eb982ec\\\",\\\"systemUUID\\\":\\\"14046600-b993-4087-9f7d-570e1efe09a8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:16Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.532086 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.532121 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.532132 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.532150 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.532164 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:16Z","lastTransitionTime":"2025-10-08T12:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.541625 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d7f6479-86cc-40be-8b44-cb1c10627aa2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc2f1c56c7be63150786ec037b77127e2df6071efa7d736d402bca8a011ea32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e2c83b125f3789eeff231ab5416af4461e2443e6abaf28dac1cd5a8d5356287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://302c54532d7a142b5a9483334e74f1d96b0f07920d9c3fcb589bf2b52d1c7124\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6272fc7157705e3f1cd86a22b80f722dc7d152785bcc8029e131cd03568a19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a92ee1544b61eda7e486e807f489c1f07cfc70870a2413575ac443203670542\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 12:56:04.373062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 12:56:04.379553 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3954726057/tls.crt::/tmp/serving-cert-3954726057/tls.key\\\\\\\"\\\\nI1008 12:56:10.064922 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 12:56:10.069814 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 12:56:10.069844 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 12:56:10.069873 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 12:56:10.069878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 12:56:10.082030 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 12:56:10.082115 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082144 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 12:56:10.082156 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 12:56:10.082166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 12:56:10.082177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1008 12:56:10.082039 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1008 12:56:10.085350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f517b378c42cea209cb39c077a6162a3569671ed2c8f91ae7b0b2c411af287a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:16Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:16 crc kubenswrapper[4735]: E1008 12:56:16.546356 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:16Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0d020dc1-103d-4051-9317-7ecb0eb982ec\\\",\\\"systemUUID\\\":\\\"14046600-b993-4087-9f7d-570e1efe09a8\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:16Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.549637 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.549666 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.549675 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.549691 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.549702 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:16Z","lastTransitionTime":"2025-10-08T12:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:16 crc kubenswrapper[4735]: E1008 12:56:16.561010 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0d020dc1-103d-4051-9317-7ecb0eb982ec\\\",\\\"systemUUID\\\":\\\"14046600-b993-4087-9f7d-570e1efe09a8\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:16Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.564178 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.564215 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.564227 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.564243 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.564255 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:16Z","lastTransitionTime":"2025-10-08T12:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:16 crc kubenswrapper[4735]: E1008 12:56:16.575742 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0d020dc1-103d-4051-9317-7ecb0eb982ec\\\",\\\"systemUUID\\\":\\\"14046600-b993-4087-9f7d-570e1efe09a8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:16Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.578829 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:16Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.579405 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.579429 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.579438 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.579452 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.579462 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:16Z","lastTransitionTime":"2025-10-08T12:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:16 crc kubenswrapper[4735]: E1008 12:56:16.590564 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0d020dc1-103d-4051-9317-7ecb0eb982ec\\\",\\\"systemUUID\\\":\\\"14046600-b993-4087-9f7d-570e1efe09a8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:16Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:16 crc kubenswrapper[4735]: E1008 12:56:16.590670 4735 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.592129 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.592165 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.592176 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.592195 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.592207 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:16Z","lastTransitionTime":"2025-10-08T12:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.616910 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc6814f3ae991f4c4576eaafb7f325021fe4ef3387b84948efb093150a7974d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:16Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.658747 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2dfd0474-f64a-4d3b-9990-df182d079b10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4321c24e125d489ad4c28ca906b5143f7017d14e0424cb188a1a031fd07de603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fefb46d9936f914f5667f539f69ddaaf85342cc341db62c3eda312aa07a78d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9qwgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-08T12:56:16Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.694601 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.694646 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.694655 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.694671 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.694682 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:16Z","lastTransitionTime":"2025-10-08T12:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.703641 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653370-cd31-4487-8b63-50b145f55853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wckcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:16Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.796936 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.796990 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.797002 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.797021 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.797038 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:16Z","lastTransitionTime":"2025-10-08T12:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.894030 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.894227 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:56:16 crc kubenswrapper[4735]: E1008 12:56:16.894305 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 12:56:16 crc kubenswrapper[4735]: E1008 12:56:16.894400 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.895219 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 12:56:16 crc kubenswrapper[4735]: E1008 12:56:16.895405 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.899033 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.899078 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.899092 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.899111 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:16 crc kubenswrapper[4735]: I1008 12:56:16.899124 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:16Z","lastTransitionTime":"2025-10-08T12:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:17 crc kubenswrapper[4735]: I1008 12:56:17.001498 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:17 crc kubenswrapper[4735]: I1008 12:56:17.001562 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:17 crc kubenswrapper[4735]: I1008 12:56:17.001572 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:17 crc kubenswrapper[4735]: I1008 12:56:17.001592 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:17 crc kubenswrapper[4735]: I1008 12:56:17.001604 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:17Z","lastTransitionTime":"2025-10-08T12:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:17 crc kubenswrapper[4735]: I1008 12:56:17.084392 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" event={"ID":"c4653370-cd31-4487-8b63-50b145f55853","Type":"ContainerStarted","Data":"1758c6a3e050fee720b9cae06332bc1d34000d2a61aa27470c483dc8757a5546"} Oct 08 12:56:17 crc kubenswrapper[4735]: I1008 12:56:17.087113 4735 generic.go:334] "Generic (PLEG): container finished" podID="d0464374-65ce-46f6-9e7e-79018b92aff4" containerID="895f5b597faf09af3dfd2688b3e42c2f1713b07e7d78c8d81e856dc3364727f4" exitCode=0 Oct 08 12:56:17 crc kubenswrapper[4735]: I1008 12:56:17.087193 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tsff9" event={"ID":"d0464374-65ce-46f6-9e7e-79018b92aff4","Type":"ContainerDied","Data":"895f5b597faf09af3dfd2688b3e42c2f1713b07e7d78c8d81e856dc3364727f4"} Oct 08 12:56:17 crc kubenswrapper[4735]: I1008 12:56:17.102264 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:17Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:17 crc kubenswrapper[4735]: I1008 12:56:17.103831 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:17 crc kubenswrapper[4735]: I1008 12:56:17.103913 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:17 crc kubenswrapper[4735]: I1008 12:56:17.103925 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:17 crc kubenswrapper[4735]: I1008 
12:56:17.103941 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:17 crc kubenswrapper[4735]: I1008 12:56:17.103952 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:17Z","lastTransitionTime":"2025-10-08T12:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:17 crc kubenswrapper[4735]: I1008 12:56:17.119139 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28bb195cc66fff1263a0e92d27d57e97c81b63f873cb5e85d5271ac4fad40a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e88fb0202a08e60ff9cb6528688a4312ec5433491918326a5b46a5fb5e1de888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:17Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:17 crc kubenswrapper[4735]: I1008 12:56:17.134580 4735 
status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tfdqx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33cd5d891ac473da94cdc0e5c4bcfcbe2ef79a76d13f87ad077df153496ea515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\"
:\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qwbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tfdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:17Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:17 crc kubenswrapper[4735]: I1008 12:56:17.152218 4735 
status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jnmqc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed550b4-694a-4a43-99fc-f869224d6cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0a97189f5acb11a4baa7838d67360d6606941daf7520c583dd596cfab265203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qjms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"
hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jnmqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:17Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:17 crc kubenswrapper[4735]: I1008 12:56:17.174277 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653370-cd31-4487-8b63-50b145f55853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wai
ting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":
0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\
\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wckcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:17Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:17 crc kubenswrapper[4735]: I1008 12:56:17.193080 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tsff9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0464374-65ce-46f6-9e7e-79018b92aff4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e01ac6683194ace06c37ecdbcda8584329034bc8fd0e6c532dcc6d7a036a72b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e01ac6683194ace06c37ecdbcda8584329034bc8fd0e6c532dcc6d7a036a72b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e086e1263cfb47daf4b8a395b1335f2e575f9d8c060983ba7090bbe27033d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48e086e1263cfb47daf4b8a395b1335f2e575f9d8c060983ba7090bbe27033d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aceb87b6bfa4020ca777a104430a285d886fd
d8fb37b710f754eac6ad1b4304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aceb87b6bfa4020ca777a104430a285d886fdd8fb37b710f754eac6ad1b4304d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895f5b597faf09af3dfd2688b3e42c2f1713b07e7d78c8d81e856dc3364727f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895f5b597faf09af3dfd2688b3e42c2f1713b07e7d78c8d81e856dc3364727f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-1
0-08T12:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tsff9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:17Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:17 crc kubenswrapper[4735]: I1008 12:56:17.206749 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:17 crc kubenswrapper[4735]: I1008 12:56:17.206790 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:17 crc kubenswrapper[4735]: I1008 12:56:17.206800 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:17 crc kubenswrapper[4735]: I1008 12:56:17.206816 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:17 crc kubenswrapper[4735]: I1008 12:56:17.206828 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:17Z","lastTransitionTime":"2025-10-08T12:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:17 crc kubenswrapper[4735]: I1008 12:56:17.215123 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"844f0143-e7a8-48c9-ae8e-753c5c293f9d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e268579ace7ae7961414b2650b6b55471a9fd6574196c5fa2cefbf6979464b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e620e3a2b5b27ac9469d3ca045be90d19b04f969422d4c2a453650a7d06f0918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e7006d4b199ac7450c0e968adf9714db905f5378e279db304db244d0e5e566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72f60bc244cbde5bb8033c24051f07fd6a2a168d97d89d3abcfaac69e6d75878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa627e3bfab38d53beee2a3201f6856009919d567108614667503273d11d0f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-08T12:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:17Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:17 crc kubenswrapper[4735]: I1008 12:56:17.230672 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d7f6479-86cc-40be-8b44-cb1c10627aa2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc2f1c56c7be63150786ec037b77127e2df6071efa7d736d402bca8a011ea32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e2c83b125f3789eeff231ab5416af4461e2443e6abaf28dac1cd5a8d5356287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://302c54532d7a142b5a9483334e74f1d96b0f07920d9c3fcb589bf2b52d1c7124\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6272fc7157705e3f1cd86a22b80f722dc7d152785bcc8029e131cd03568a19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a92ee1544b61eda7e486e807f489c1f07cfc70870a2413575ac443203670542\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 12:56:04.373062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 12:56:04.379553 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3954726057/tls.crt::/tmp/serving-cert-3954726057/tls.key\\\\\\\"\\\\nI1008 12:56:10.064922 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 12:56:10.069814 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 12:56:10.069844 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 12:56:10.069873 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 12:56:10.069878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 12:56:10.082030 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 12:56:10.082115 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082144 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 12:56:10.082156 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 12:56:10.082166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 12:56:10.082177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1008 12:56:10.082039 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1008 12:56:10.085350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f517b378c42cea209cb39c077a6162a3569671ed2c8f91ae7b0b2c411af287a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:17Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:17 crc kubenswrapper[4735]: I1008 12:56:17.245861 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:17Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:17 crc kubenswrapper[4735]: I1008 12:56:17.257469 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc6814f3ae991f4c4576eaafb7f325021fe4ef3387b84948efb093150a7974d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T12:56:17Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:17 crc kubenswrapper[4735]: I1008 12:56:17.273309 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2dfd0474-f64a-4d3b-9990-df182d079b10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4321c24e125d489ad4c28ca906b5143f7017d14e0424cb188a1a031fd07de603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fefb46d9936f914f5667f539f69ddaaf85342cc341db62c3eda312aa07a78d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9qwgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:17Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:17 crc kubenswrapper[4735]: I1008 12:56:17.284851 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d289b739-3ef2-4ee7-bb9a-b5cd8ab2b999\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f9030479e031ff1768e372e6da2a0806dc393db70aed31961e3a9c89a9e789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://852a4dea1c1d8a40fc2ba89661ff17845ee58b62e45f4a7df9a8af505d361dd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e511b82acaea281ca468fbdce1eaa7ec3a5df59d91dc90fc500bebb08d51b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ce3b507093adcc6becc1a3b28e836174ee50ff4d3f583aa84badf4c1f3cb1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:17Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:17 crc kubenswrapper[4735]: I1008 12:56:17.300087 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b69f0d46aba2faee01906db32a5a6785e755b2eee94ccbaf1e438c63798f3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:17Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:17 crc kubenswrapper[4735]: I1008 12:56:17.309786 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:17 crc kubenswrapper[4735]: I1008 12:56:17.309837 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:17 crc kubenswrapper[4735]: I1008 12:56:17.309851 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:17 crc kubenswrapper[4735]: I1008 12:56:17.309869 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:17 crc kubenswrapper[4735]: I1008 12:56:17.309881 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:17Z","lastTransitionTime":"2025-10-08T12:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:17 crc kubenswrapper[4735]: I1008 12:56:17.312980 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:17Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:17 crc kubenswrapper[4735]: I1008 12:56:17.323507 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qdvjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28749736-d31e-40f2-84e8-a7839d8da376\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://680500d4224fd49593a4fa7294d1c49f059f330b2be33e3ef9ceccc6bd9b9a6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcv62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qdvjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:17Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:17 crc kubenswrapper[4735]: I1008 12:56:17.413238 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:17 crc kubenswrapper[4735]: I1008 12:56:17.413288 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:17 crc kubenswrapper[4735]: I1008 12:56:17.413300 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:17 crc kubenswrapper[4735]: I1008 12:56:17.413319 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:17 crc kubenswrapper[4735]: I1008 12:56:17.413336 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:17Z","lastTransitionTime":"2025-10-08T12:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:17 crc kubenswrapper[4735]: I1008 12:56:17.516541 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:17 crc kubenswrapper[4735]: I1008 12:56:17.516589 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:17 crc kubenswrapper[4735]: I1008 12:56:17.516599 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:17 crc kubenswrapper[4735]: I1008 12:56:17.516617 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:17 crc kubenswrapper[4735]: I1008 12:56:17.516629 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:17Z","lastTransitionTime":"2025-10-08T12:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:17 crc kubenswrapper[4735]: I1008 12:56:17.619640 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:17 crc kubenswrapper[4735]: I1008 12:56:17.619690 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:17 crc kubenswrapper[4735]: I1008 12:56:17.619707 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:17 crc kubenswrapper[4735]: I1008 12:56:17.619729 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:17 crc kubenswrapper[4735]: I1008 12:56:17.619743 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:17Z","lastTransitionTime":"2025-10-08T12:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:17 crc kubenswrapper[4735]: I1008 12:56:17.722066 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:17 crc kubenswrapper[4735]: I1008 12:56:17.722111 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:17 crc kubenswrapper[4735]: I1008 12:56:17.722119 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:17 crc kubenswrapper[4735]: I1008 12:56:17.722134 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:17 crc kubenswrapper[4735]: I1008 12:56:17.722144 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:17Z","lastTransitionTime":"2025-10-08T12:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:17 crc kubenswrapper[4735]: I1008 12:56:17.824132 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:17 crc kubenswrapper[4735]: I1008 12:56:17.824167 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:17 crc kubenswrapper[4735]: I1008 12:56:17.824176 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:17 crc kubenswrapper[4735]: I1008 12:56:17.824191 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:17 crc kubenswrapper[4735]: I1008 12:56:17.824200 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:17Z","lastTransitionTime":"2025-10-08T12:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:17 crc kubenswrapper[4735]: I1008 12:56:17.926111 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:17 crc kubenswrapper[4735]: I1008 12:56:17.926163 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:17 crc kubenswrapper[4735]: I1008 12:56:17.926181 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:17 crc kubenswrapper[4735]: I1008 12:56:17.926206 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:17 crc kubenswrapper[4735]: I1008 12:56:17.926225 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:17Z","lastTransitionTime":"2025-10-08T12:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.029741 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.030070 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.030229 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.030392 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.030539 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:18Z","lastTransitionTime":"2025-10-08T12:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.092351 4735 generic.go:334] "Generic (PLEG): container finished" podID="d0464374-65ce-46f6-9e7e-79018b92aff4" containerID="196d2d4913959ac507043de8431ea9294020c429fa0810c9985c77a23f52ad78" exitCode=0 Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.092468 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tsff9" event={"ID":"d0464374-65ce-46f6-9e7e-79018b92aff4","Type":"ContainerDied","Data":"196d2d4913959ac507043de8431ea9294020c429fa0810c9985c77a23f52ad78"} Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.117729 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"844f0143-e7a8-48c9-ae8e-753c5c293f9d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e268579ace7ae7961414b2650b6b55471a9fd6574196c5fa2cefbf6979464b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e620e3a2b5b27ac9469d3ca045be90d19b04f969422d4c2a453650a7d06f0918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e7006d4b199ac7450c0e968adf9714db905f5378e279db304db244d0e5e566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72f60bc244cbde5bb8033c24051f07fd6a2a168d97d89d3abcfaac69e6d75878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa627e3bfab38d53beee2a3201f6856009919d567108614667503273d11d0f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-d
ir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd
6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:18Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.130655 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d7f6479-86cc-40be-8b44-cb1c10627aa2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc2f1c56c7be63150786ec037b77127e2df6071efa7d736d402bca8a011ea32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e2c83b125f3789eeff231ab5416af4461e2443e6abaf28dac1cd5a8d5356287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://302c54532d7a142b5a9483334e74f1d96b0f07920d9c3fcb589bf2b52d1c7124\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6272fc7157705e3f1cd86a22b80f722dc7d152785bcc8029e131cd03568a19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a92ee1544b61eda7e486e807f489c1f07cfc70870a2413575ac443203670542\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 12:56:04.373062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 12:56:04.379553 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3954726057/tls.crt::/tmp/serving-cert-3954726057/tls.key\\\\\\\"\\\\nI1008 12:56:10.064922 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 12:56:10.069814 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 12:56:10.069844 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 12:56:10.069873 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 12:56:10.069878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 12:56:10.082030 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 12:56:10.082115 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082144 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 12:56:10.082156 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 12:56:10.082166 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 12:56:10.082177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1008 12:56:10.082039 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1008 12:56:10.085350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f517b378c42cea209cb39c077a6162a3569671ed2c8f91ae7b0b2c411af287a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:18Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.132340 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.132382 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.132393 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.132412 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.132426 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:18Z","lastTransitionTime":"2025-10-08T12:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.146233 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:18Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.159220 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc6814f3ae991f4c4576eaafb7f325021fe4ef3387b84948efb093150a7974d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T12:56:18Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.172094 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2dfd0474-f64a-4d3b-9990-df182d079b10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4321c24e125d489ad4c28ca906b5143f7017d14e0424cb188a1a031fd07de603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fefb46d9936f914f5667f539f69ddaaf85342cc341db62c3eda312aa07a78d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9qwgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:18Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.201650 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653370-cd31-4487-8b63-50b145f55853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wckcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:18Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.217871 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tsff9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0464374-65ce-46f6-9e7e-79018b92aff4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e01ac6683194ace06c37ecdbcda8584329034bc8fd0e6c532dcc6d7a036a72b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e01ac6683194ace06c37ecdbcda8584329034bc8fd0e6c532dcc6d7a036a72b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e086e1263cfb47daf4b8a395b1335f2e575f9d8c060983ba7090bbe27033d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48e086e1263cfb47daf4b8a395b1335f2e575f9d8c060983ba7090bbe27033d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aceb87b6bfa4020ca777a104430a285d886fd
d8fb37b710f754eac6ad1b4304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aceb87b6bfa4020ca777a104430a285d886fdd8fb37b710f754eac6ad1b4304d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895f5b597faf09af3dfd2688b3e42c2f1713b07e7d78c8d81e856dc3364727f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895f5b597faf09af3dfd2688b3e42c2f1713b07e7d78c8d81e856dc3364727f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-1
0-08T12:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://196d2d4913959ac507043de8431ea9294020c429fa0810c9985c77a23f52ad78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://196d2d4913959ac507043de8431ea9294020c429fa0810c9985c77a23f52ad78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tsff9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:18Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.232358 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d289b739-3ef2-4ee7-bb9a-b5cd8ab2b999\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f9030479e031ff1768e372e6da2a0806dc393db70aed31961e3a9c89a9e789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://852a4dea1c1d8a40fc2ba89661ff17845ee58b62e45f4a7df9a8af505d361dd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e511b82acaea281ca468fbdce1eaa7ec3a5df59d91dc90fc500bebb08d51b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ce3b507093adcc6becc1a3b28e836174ee50ff4d3f583aa84badf4c1f3cb1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:18Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.234620 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.234654 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.234671 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.234689 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.234698 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:18Z","lastTransitionTime":"2025-10-08T12:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.247343 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b69f0d46aba2faee01906db32a5a6785e755b2eee94ccbaf1e438c63798f3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:18Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.261204 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:18Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.271461 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qdvjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28749736-d31e-40f2-84e8-a7839d8da376\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://680500d4224fd49593a4fa7294d1c49f059f330b2be33e3ef9ceccc6bd9b9a6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcv62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qdvjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:18Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.287324 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:18Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.300218 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28bb195cc66fff1263a0e92d27d57e97c81b63f873cb5e85d5271ac4fad40a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e88fb0202a08e60ff9cb6528688a4312ec5433491918326a5b46a5fb5e1de888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:18Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.323257 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tfdqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33cd5d891ac473da94cdc0e5c4bcfcbe2ef79a76d13f87ad077df153496ea515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qwbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tfdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:18Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.335296 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jnmqc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed550b4-694a-4a43-99fc-f869224d6cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0a97189f5acb11a4baa7838d67360d6606941daf7520c583dd596cfab265203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qjms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jnmqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:18Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.336861 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.336915 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.336928 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.336947 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.336959 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:18Z","lastTransitionTime":"2025-10-08T12:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.439783 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.440144 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.440153 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.440167 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.440178 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:18Z","lastTransitionTime":"2025-10-08T12:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.462432 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.462574 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.462626 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.462649 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:56:18 crc kubenswrapper[4735]: E1008 12:56:18.462732 4735 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 12:56:18 crc 
kubenswrapper[4735]: E1008 12:56:18.462783 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 12:56:18 crc kubenswrapper[4735]: E1008 12:56:18.462805 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 12:56:18 crc kubenswrapper[4735]: E1008 12:56:18.462817 4735 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 12:56:18 crc kubenswrapper[4735]: E1008 12:56:18.462828 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 12:56:26.462807373 +0000 UTC m=+36.271111917 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 12:56:18 crc kubenswrapper[4735]: E1008 12:56:18.462867 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-08 12:56:26.462851105 +0000 UTC m=+36.271155699 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 12:56:18 crc kubenswrapper[4735]: E1008 12:56:18.462907 4735 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 12:56:18 crc kubenswrapper[4735]: E1008 12:56:18.463005 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 12:56:26.462984448 +0000 UTC m=+36.271288992 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 12:56:18 crc kubenswrapper[4735]: E1008 12:56:18.463251 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 12:56:26.463209443 +0000 UTC m=+36.271514067 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.543321 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.543385 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.543397 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.543419 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.543438 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:18Z","lastTransitionTime":"2025-10-08T12:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.564006 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 12:56:18 crc kubenswrapper[4735]: E1008 12:56:18.564212 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 12:56:18 crc kubenswrapper[4735]: E1008 12:56:18.564236 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 12:56:18 crc kubenswrapper[4735]: E1008 12:56:18.564249 4735 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 12:56:18 crc kubenswrapper[4735]: E1008 12:56:18.564310 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-08 12:56:26.564295031 +0000 UTC m=+36.372599575 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.645994 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.646027 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.646036 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.646050 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.646059 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:18Z","lastTransitionTime":"2025-10-08T12:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.748139 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.748174 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.748184 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.748201 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.748211 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:18Z","lastTransitionTime":"2025-10-08T12:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.851185 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.851246 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.851257 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.851272 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.851284 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:18Z","lastTransitionTime":"2025-10-08T12:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.893927 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:56:18 crc kubenswrapper[4735]: E1008 12:56:18.894072 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.894140 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 12:56:18 crc kubenswrapper[4735]: E1008 12:56:18.894295 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.894561 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:56:18 crc kubenswrapper[4735]: E1008 12:56:18.894686 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.953928 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.953971 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.953982 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.953998 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:18 crc kubenswrapper[4735]: I1008 12:56:18.954010 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:18Z","lastTransitionTime":"2025-10-08T12:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.056040 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.056085 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.056107 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.056126 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.056138 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:19Z","lastTransitionTime":"2025-10-08T12:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.105167 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" event={"ID":"c4653370-cd31-4487-8b63-50b145f55853","Type":"ContainerStarted","Data":"39ce2ded565e91825f07f1edf0cf5e2de2a454e4d46cc8f1fd4c4d0a6ce47725"} Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.106617 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.106758 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.116112 4735 generic.go:334] "Generic (PLEG): container finished" podID="d0464374-65ce-46f6-9e7e-79018b92aff4" containerID="0b5ebc8f08568ab230481a7570dcdb0c6ed926f2216ef10de368d835811ef129" exitCode=0 Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.116197 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tsff9" event={"ID":"d0464374-65ce-46f6-9e7e-79018b92aff4","Type":"ContainerDied","Data":"0b5ebc8f08568ab230481a7570dcdb0c6ed926f2216ef10de368d835811ef129"} Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.129674 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d289b739-3ef2-4ee7-bb9a-b5cd8ab2b999\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f9030479e031ff1768e372e6da2a0806dc393db70aed31961e3a9c89a9e789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://852a4dea1c1d8a40fc2ba89661ff17845ee58b62e45f4a7df9a8af505d361dd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e511b82acaea281ca468fbdce1eaa7ec3a5df59d91dc90fc500bebb08d51b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ce3b507093adcc6becc1a3b28e836174ee50ff4d3f583aa84badf4c1f3cb1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:19Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.152128 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.154260 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.154274 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b69f0d46aba2faee01906db32a5a6785e755b2eee94ccbaf1e438c63798f3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:19Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.158280 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.158313 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.158325 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.158341 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.158352 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:19Z","lastTransitionTime":"2025-10-08T12:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.179594 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:19Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.193596 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qdvjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28749736-d31e-40f2-84e8-a7839d8da376\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://680500d4224fd49593a4fa7294d1c49f059f330b2be33e3ef9ceccc6bd9b9a6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcv62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qdvjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:19Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.207613 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:19Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.221198 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28bb195cc66fff1263a0e92d27d57e97c81b63f873cb5e85d5271ac4fad40a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e88fb0202a08e60ff9cb6528688a4312ec5433491918326a5b46a5fb5e1de888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:19Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.238666 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tfdqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33cd5d891ac473da94cdc0e5c4bcfcbe2ef79a76d13f87ad077df153496ea515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qwbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tfdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:19Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.255280 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jnmqc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed550b4-694a-4a43-99fc-f869224d6cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0a97189f5acb11a4baa7838d67360d6606941daf7520c583dd596cfab265203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qjms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jnmqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:19Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.260896 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.260957 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.260987 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.261012 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.261032 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:19Z","lastTransitionTime":"2025-10-08T12:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.291308 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653370-cd31-4487-8b63-50b145f55853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6da8f9945b0943287cecc7782253cd88835d3a5a823ae5e2107423b9bdc4e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fc9ee7ab9490b5462704528e6923ee420e0c9f3c0ed22e7041fedc8dfdfeb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://492da61345c2220eef06ac2e1979e5402ebb8deb0e3747bb3620ecccd4e83a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58cc5c76f1748baef586618dd49d3392757ef32e4f28a019979f84366e8879c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb1bc51dca9b3bf6c136fe146aa422fa40eb512385617607f682723053988f7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc88f8b567ef0f69f9df25a237c40b32bbd518680da75fe3cabd546713423bf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39ce2ded565e91825f07f1edf0cf5e2de2a454e4d46cc8f1fd4c4d0a6ce47725\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1758c6a3e050fee720b9cae06332bc1d34000d2a61aa27470c483dc8757a5546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wckcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:19Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.308779 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tsff9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0464374-65ce-46f6-9e7e-79018b92aff4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e01ac6683194ace06c37ecdbcda8584329034bc8fd0e6c532dcc6d7a036a72b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e01ac6683194ace06c37ecdbcda8584329034bc8fd0e6c532dcc6d7a036a72b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e086e1263cfb47daf4b8a395b1335f2e575f9d8c060983ba7090bbe27033d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48e086e1263cfb47daf4b8a395b1335f2e575f9d8c060983ba7090bbe27033d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aceb87b6bfa4020ca777a104430a285d886fdd8fb37b710f754eac6ad1b4304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aceb87b6bfa4020ca777a104430a285d886fdd8fb37b710f754eac6ad1b4304d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895f5b597faf09af3dfd2688b3e42c2f1713b07e7d78c8d81e856dc3364727f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895f5b597faf09af3dfd2688b3e42c2f1713b07e7d78c8d81e856dc3364727f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://196d2d4913959ac507043de8431ea9294020c429fa0810c9985c77a23f52ad78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://196d2d4913959ac507043de8431ea9294020c429fa0810c9985c77a23f52ad78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tsff9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:19Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.337809 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"844f0143-e7a8-48c9-ae8e-753c5c293f9d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e268579ace7ae7961414b2650b6b55471a9fd6574196c5fa2cefbf6979464b97\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e620e3a2b5b27ac9469d3ca045be90d19b04f969422d4c2a453650a7d06f0918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e7006d4b199ac7450c0e968adf9714db905f5378e279db304db244d0e5e566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72f60bc244cbde5bb8033c24051f07fd6a2a168d97d89d3abcfaac69e6d75878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa627e3bfab38d53beee2a3201f6856009919d567108614667503273d11d0f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\
"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}},{
\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:19Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.357334 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d7f6479-86cc-40be-8b44-cb1c10627aa2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc2f1c56c7be63150786ec037b77127e2df6071efa7d736d402bca8a011ea32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e2c83b125f3789eeff231ab5416af4461e2443e6abaf28dac1cd5a8d5356287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://302c54532d7a142b5a9483334e74f1d96b0f07920d9c3fcb589bf2b52d1c7124\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6272fc7157705e3f1cd86a22b80f722dc7d152785bcc8029e131cd03568a19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a92ee1544b61eda7e486e807f489c1f07cfc70870a2413575ac443203670542\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 12:56:04.373062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 12:56:04.379553 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3954726057/tls.crt::/tmp/serving-cert-3954726057/tls.key\\\\\\\"\\\\nI1008 12:56:10.064922 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 12:56:10.069814 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 12:56:10.069844 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 12:56:10.069873 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 12:56:10.069878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 12:56:10.082030 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 12:56:10.082115 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082144 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 12:56:10.082156 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 12:56:10.082166 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 12:56:10.082177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1008 12:56:10.082039 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1008 12:56:10.085350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f517b378c42cea209cb39c077a6162a3569671ed2c8f91ae7b0b2c411af287a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:19Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.364264 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.364325 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.364351 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.364381 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.364403 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:19Z","lastTransitionTime":"2025-10-08T12:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.375136 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:19Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.390576 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc6814f3ae991f4c4576eaafb7f325021fe4ef3387b84948efb093150a7974d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T12:56:19Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.403740 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2dfd0474-f64a-4d3b-9990-df182d079b10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4321c24e125d489ad4c28ca906b5143f7017d14e0424cb188a1a031fd07de603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fefb46d9936f914f5667f539f69ddaaf85342cc341db62c3eda312aa07a78d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9qwgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:19Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.420425 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:19Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.432873 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc6814f3ae991f4c4576eaafb7f325021fe4ef3387b84948efb093150a7974d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T12:56:19Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.446166 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2dfd0474-f64a-4d3b-9990-df182d079b10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4321c24e125d489ad4c28ca906b5143f7017d14e0424cb188a1a031fd07de603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fefb46d9936f914f5667f539f69ddaaf85342cc341db62c3eda312aa07a78d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9qwgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:19Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.467332 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 
12:56:19.467574 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.467586 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.467606 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.467620 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:19Z","lastTransitionTime":"2025-10-08T12:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.469678 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653370-cd31-4487-8b63-50b145f55853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6da8f9945b0943287cecc7782253cd88835d3a5a823ae5e2107423b9bdc4e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fc9ee7ab9490b5462704528e6923ee420e0c9f3c0ed22e7041fedc8dfdfeb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://492da61345c2220eef06ac2e1979e5402ebb8deb0e3747bb3620ecccd4e83a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58cc5c76f1748baef586618dd49d3392757ef32e4f28a019979f84366e8879c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb1bc51dca9b3bf6c136fe146aa422fa40eb512385617607f682723053988f7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc88f8b567ef0f69f9df25a237c40b32bbd518680da75fe3cabd546713423bf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39ce2ded565e91825f07f1edf0cf5e2de2a454e4d46cc8f1fd4c4d0a6ce47725\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mou
ntPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1758c6a3e050fee720b9cae06332bc1d34000d2a61aa27470c483dc8757a5546\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wckcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:19Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.492113 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tsff9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0464374-65ce-46f6-9e7e-79018b92aff4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e01ac6683194ace06c37ecdbcda8584329034bc8fd0e6c532dcc6d7a036a72b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e01ac6683194ace06c37ecdbcda8584329034bc8fd0e6c532dcc6d7a036a72b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e086e1263cfb47daf4b8a395b1335f2e575f9d8c060983ba7090bbe27033d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48e086e1263cfb47daf4b8a395b1335f2e575f9d8c060983ba7090bbe27033d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aceb87b6bfa4020ca777a104430a285d886fd
d8fb37b710f754eac6ad1b4304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aceb87b6bfa4020ca777a104430a285d886fdd8fb37b710f754eac6ad1b4304d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895f5b597faf09af3dfd2688b3e42c2f1713b07e7d78c8d81e856dc3364727f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895f5b597faf09af3dfd2688b3e42c2f1713b07e7d78c8d81e856dc3364727f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-1
0-08T12:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://196d2d4913959ac507043de8431ea9294020c429fa0810c9985c77a23f52ad78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://196d2d4913959ac507043de8431ea9294020c429fa0810c9985c77a23f52ad78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b5ebc8f08568ab230481a7570dcdb0c6ed926f2216ef10de368d835811ef129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b5ebc8f08568ab230481a7570dcdb0c6ed926f2216ef10de368d835811ef129\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tsff9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:19Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.515338 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"844f0143-e7a8-48c9-ae8e-753c5c293f9d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e268579ace7ae7961414b2650b6b55471a9fd6574196c5fa2cefbf6979464b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e620e3a2b5b27ac9469d3ca045be90d19b04f969422d4c2a453650a7d06f0918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e7006d4b199ac7450c0e968adf9714db905f5378e279db304db244d0e5e566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72f60bc244cbde5bb8033c24051f07fd6a2a168d97d89d3abcfaac69e6d75878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa627e3bfab38d53beee2a3201f6856009919d567108614667503273d11d0f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:19Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.534329 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d7f6479-86cc-40be-8b44-cb1c10627aa2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc2f1c56c7be63150786ec037b77127e2df6071efa7d736d402bca8a011ea32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e2c83b125f3789eeff231ab5416af4461e2443e6abaf28dac1cd5a8d5356287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://302c54532d7a142b5a9483334e74f1d96b0f07920d9c3fcb589bf2b52d1c7124\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6272fc7157705e3f1cd86a22b80f722dc7d152785bcc8029e131cd03568a19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a92ee1544b61eda7e486e807f489c1f07cfc70870a2413575ac443203670542\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 12:56:04.373062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 12:56:04.379553 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3954726057/tls.crt::/tmp/serving-cert-3954726057/tls.key\\\\\\\"\\\\nI1008 12:56:10.064922 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 12:56:10.069814 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 12:56:10.069844 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 12:56:10.069873 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 12:56:10.069878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 12:56:10.082030 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 12:56:10.082115 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082144 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 12:56:10.082156 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 12:56:10.082166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 12:56:10.082177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1008 12:56:10.082039 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1008 12:56:10.085350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f517b378c42cea209cb39c077a6162a3569671ed2c8f91ae7b0b2c411af287a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:19Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.547250 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:19Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.558450 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qdvjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28749736-d31e-40f2-84e8-a7839d8da376\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://680500d4224fd49593a4fa7294d1c49f059f330b2be33e3ef9ceccc6bd9b9a6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcv62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qdvjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:19Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.570526 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.570575 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.570588 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.570608 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.570624 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:19Z","lastTransitionTime":"2025-10-08T12:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.573056 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d289b739-3ef2-4ee7-bb9a-b5cd8ab2b999\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f9030479e031ff1768e372e6da2a0806dc393db70aed31961e3a9c89a9e789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://852a4dea1c1
d8a40fc2ba89661ff17845ee58b62e45f4a7df9a8af505d361dd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e511b82acaea281ca468fbdce1eaa7ec3a5df59d91dc90fc500bebb08d51b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ce3b507093adcc6becc1a3b28e836174ee50ff4d3f583aa84badf4c1f3cb1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:19Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.588907 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b69f0d46aba2faee01906db32a5a6785e755b2eee94ccbaf1e438c63798f3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:19Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.603760 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:19Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.615380 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28bb195cc66fff1263a0e92d27d57e97c81b63f873cb5e85d5271ac4fad40a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e88fb0202a08e60ff9cb6528688a4312ec5433491918326a5b46a5fb5e1de888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:19Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.626681 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tfdqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33cd5d891ac473da94cdc0e5c4bcfcbe2ef79a76d13f87ad077df153496ea515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qwbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tfdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:19Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.643732 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jnmqc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed550b4-694a-4a43-99fc-f869224d6cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0a97189f5acb11a4baa7838d67360d6606941daf7520c583dd596cfab265203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qjms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jnmqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:19Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.673722 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.673758 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.673766 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.673780 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.673789 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:19Z","lastTransitionTime":"2025-10-08T12:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.776289 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.776349 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.776361 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.776376 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.776387 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:19Z","lastTransitionTime":"2025-10-08T12:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.878271 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.878308 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.878321 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.878338 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.878350 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:19Z","lastTransitionTime":"2025-10-08T12:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.980669 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.980695 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.980702 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.980716 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:19 crc kubenswrapper[4735]: I1008 12:56:19.980726 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:19Z","lastTransitionTime":"2025-10-08T12:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.083068 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.083096 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.083103 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.083116 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.083125 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:20Z","lastTransitionTime":"2025-10-08T12:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.122431 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tsff9" event={"ID":"d0464374-65ce-46f6-9e7e-79018b92aff4","Type":"ContainerStarted","Data":"ce56f02b4d3ab37c171c10113c79b74660cbdc570b55bd951baae65ef4276946"} Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.122502 4735 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.134256 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jnmqc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed550b4-694a-4a43-99fc-f869224d6cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0a97189f5acb11a4baa7838d67360d6606941daf7520c583dd596cfab265203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f
55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qjms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jnmqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:20Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.151111 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"844f0143-e7a8-48c9-ae8e-753c5c293f9d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e268579ace7ae7961414b2650b6b55471a9fd6574196c5fa2cefbf6979464b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e620e3a2b5b27ac9469d3ca045be90d19b04f969422d4c2a453650a7d06f0918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e7006d4b199ac7450c0e968adf9714db905f5378e279db304db244d0e5e566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72f60bc244cbde5bb8033c24051f07fd6a2a168d97d89d3abcfaac69e6d75878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa627e3bfab38d53beee2a3201f6856009919d567108614667503273d11d0f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:20Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.165393 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d7f6479-86cc-40be-8b44-cb1c10627aa2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc2f1c56c7be63150786ec037b77127e2df6071efa7d736d402bca8a011ea32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e2c83b125f3789eeff231ab5416af4461e2443e6abaf28dac1cd5a8d5356287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://302c54532d7a142b5a9483334e74f1d96b0f07920d9c3fcb589bf2b52d1c7124\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6272fc7157705e3f1cd86a22b80f722dc7d152785bcc8029e131cd03568a19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a92ee1544b61eda7e486e807f489c1f07cfc70870a2413575ac443203670542\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 12:56:04.373062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 12:56:04.379553 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3954726057/tls.crt::/tmp/serving-cert-3954726057/tls.key\\\\\\\"\\\\nI1008 12:56:10.064922 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 12:56:10.069814 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 12:56:10.069844 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 12:56:10.069873 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 12:56:10.069878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 12:56:10.082030 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 12:56:10.082115 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082144 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 12:56:10.082156 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 12:56:10.082166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 12:56:10.082177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1008 12:56:10.082039 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1008 12:56:10.085350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f517b378c42cea209cb39c077a6162a3569671ed2c8f91ae7b0b2c411af287a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:20Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.176117 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:20Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.184811 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.184844 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.184857 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:20 crc 
kubenswrapper[4735]: I1008 12:56:20.184874 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.184887 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:20Z","lastTransitionTime":"2025-10-08T12:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.188595 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc6814f3ae991f4c4576eaafb7f325021fe4ef3387b84948efb093150a7974d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:20Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.199677 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2dfd0474-f64a-4d3b-9990-df182d079b10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4321c24e125d489ad4c28ca906b5143f7017d14e0424cb188a1a031fd07de603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fefb46d9936f914f5667f539f69ddaaf85342cc
341db62c3eda312aa07a78d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9qwgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:20Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.217620 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653370-cd31-4487-8b63-50b145f55853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6da8f9945b0943287cecc7782253cd88835d3a5a823ae5e2107423b9bdc4e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fc9ee7ab9490b5462704528e6923ee420e0c9f3c0ed22e7041fedc8dfdfeb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://492da61345c2220eef06ac2e1979e5402ebb8deb0e3747bb3620ecccd4e83a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58cc5c76f1748baef586618dd49d3392757ef32e4f28a019979f84366e8879c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb1bc51dca9b3bf6c136fe146aa422fa40eb512385617607f682723053988f7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc88f8b567ef0f69f9df25a237c40b32bbd518680da75fe3cabd546713423bf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39ce2ded565e91825f07f1edf0cf5e2de2a454e4d46cc8f1fd4c4d0a6ce47725\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1758c6a3e050fee720b9cae06332bc1d34000d2a61aa27470c483dc8757a5546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wckcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:20Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.232215 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tsff9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0464374-65ce-46f6-9e7e-79018b92aff4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce56f02b4d3ab37c171c10113c79b74660cbdc570b55bd951baae65ef4276946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e01ac6683194ace06c37ecdbcda8584329034bc8fd0e6c532dcc6d7a036a72b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e01ac6683194ace06c37ecdbcda8584329034bc8fd0e6c532dcc6d7a036a72b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e086e1263cfb47daf4
b8a395b1335f2e575f9d8c060983ba7090bbe27033d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48e086e1263cfb47daf4b8a395b1335f2e575f9d8c060983ba7090bbe27033d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aceb87b6bfa4020ca777a104430a285d886fdd8fb37b710f754eac6ad1b4304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aceb87b6bfa4020ca777
a104430a285d886fdd8fb37b710f754eac6ad1b4304d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895f5b597faf09af3dfd2688b3e42c2f1713b07e7d78c8d81e856dc3364727f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895f5b597faf09af3dfd2688b3e42c2f1713b07e7d78c8d81e856dc3364727f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{
\\\"containerID\\\":\\\"cri-o://196d2d4913959ac507043de8431ea9294020c429fa0810c9985c77a23f52ad78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://196d2d4913959ac507043de8431ea9294020c429fa0810c9985c77a23f52ad78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b5ebc8f08568ab230481a7570dcdb0c6ed926f2216ef10de368d835811ef129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b5ebc8f08568ab230481a7570dcdb0c6ed926f2216ef10de368d835811ef129\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T
12:56:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tsff9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:20Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.245511 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d289b739-3ef2-4ee7-bb9a-b5cd8ab2b999\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f9030479e031ff1768e372e6da2a0806dc393db70aed31961e3a9c89a9e789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://852a4dea1c1d8a40fc2ba89661ff17845ee58b62e45f4a7df9a8af505d361dd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e511b82acaea281ca468fbdce1eaa7ec3a5df59d91dc90fc500bebb08d51b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ce3b507093adcc6becc1a3b28e836174ee50ff4d3f583aa84badf4c1f3cb1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:20Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.262986 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b69f0d46aba2faee01906db32a5a6785e755b2eee94ccbaf1e438c63798f3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:20Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.275622 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:20Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.287356 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.287388 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.287397 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.287412 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.287422 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:20Z","lastTransitionTime":"2025-10-08T12:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.288633 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qdvjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28749736-d31e-40f2-84e8-a7839d8da376\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://680500d4224fd49593a4fa7294d1c49f059f330b2be33e3ef9ceccc6bd9b9a6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcv62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qdvjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:20Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.300449 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:20Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.313604 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28bb195cc66fff1263a0e92d27d57e97c81b63f873cb5e85d5271ac4fad40a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e88fb0202a08e60ff9cb6528688a4312ec5433491918326a5b46a5fb5e1de888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:20Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.329403 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tfdqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33cd5d891ac473da94cdc0e5c4bcfcbe2ef79a76d13f87ad077df153496ea515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qwbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tfdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:20Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.389522 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:20 crc 
kubenswrapper[4735]: I1008 12:56:20.389586 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.389598 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.389615 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.389626 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:20Z","lastTransitionTime":"2025-10-08T12:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.492687 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.492744 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.492763 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.492787 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.492805 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:20Z","lastTransitionTime":"2025-10-08T12:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.594882 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.594968 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.594981 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.594999 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.595012 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:20Z","lastTransitionTime":"2025-10-08T12:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.697717 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.697793 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.697806 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.697825 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.697862 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:20Z","lastTransitionTime":"2025-10-08T12:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.800535 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.800588 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.800603 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.800624 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.800637 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:20Z","lastTransitionTime":"2025-10-08T12:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.894496 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.894568 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:56:20 crc kubenswrapper[4735]: E1008 12:56:20.894645 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.894659 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:56:20 crc kubenswrapper[4735]: E1008 12:56:20.894811 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 12:56:20 crc kubenswrapper[4735]: E1008 12:56:20.894917 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.903285 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.903313 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.903321 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.903334 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.903343 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:20Z","lastTransitionTime":"2025-10-08T12:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.912473 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28bb195cc66fff1263a0e92d27d57e97c81b63f873cb5e85d5271ac4fad40a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e88fb0202a08e60ff9cb6528688a4312ec5433491918326a5b46a5fb5e1de888\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:20Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.934363 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tfdqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33cd5d891ac473da94cdc0e5c4bcfcbe2ef79a76d13f87ad077df153496ea515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qwbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tfdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:20Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.948077 4735 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:20Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.958113 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jnmqc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed550b4-694a-4a43-99fc-f869224d6cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0a97189f5acb11a4baa7838d67360d6606941daf7520c583dd596cfab265203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qjms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jnmqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:20Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.971538 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d7f6479-86cc-40be-8b44-cb1c10627aa2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc2f1c56c7be63150786ec037b77127e2df6071efa7d736d402bca8a011ea32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e2c83b125f3789eeff231ab5416af4461e2443e6abaf28dac1cd5a8d5356287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://302c54532d7a142b5a9483334e74f1d96b0f07920d9c3fcb589bf2b52d1c7124\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6272fc7157705e3f1cd86a22b80f722dc7d152785bcc8029e131cd03568a19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a92ee1544b61eda7e486e807f489c1f07cfc70870a2413575ac443203670542\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 12:56:04.373062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 12:56:04.379553 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3954726057/tls.crt::/tmp/serving-cert-3954726057/tls.key\\\\\\\"\\\\nI1008 12:56:10.064922 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 12:56:10.069814 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 12:56:10.069844 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 12:56:10.069873 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 12:56:10.069878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 12:56:10.082030 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 12:56:10.082115 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082144 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 12:56:10.082156 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 12:56:10.082166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 12:56:10.082177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1008 12:56:10.082039 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1008 12:56:10.085350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f517b378c42cea209cb39c077a6162a3569671ed2c8f91ae7b0b2c411af287a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:20Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.983635 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:20Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:20 crc kubenswrapper[4735]: I1008 12:56:20.998948 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc6814f3ae991f4c4576eaafb7f325021fe4ef3387b84948efb093150a7974d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T12:56:20Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:21 crc kubenswrapper[4735]: I1008 12:56:21.005217 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:21 crc kubenswrapper[4735]: I1008 12:56:21.005268 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:21 crc kubenswrapper[4735]: I1008 12:56:21.005280 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:21 crc kubenswrapper[4735]: I1008 12:56:21.005299 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:21 crc kubenswrapper[4735]: I1008 12:56:21.005315 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:21Z","lastTransitionTime":"2025-10-08T12:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:21 crc kubenswrapper[4735]: I1008 12:56:21.013803 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2dfd0474-f64a-4d3b-9990-df182d079b10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4321c24e125d489ad4c28ca906b5143f7017d14e0424cb188a1a031fd07de603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fefb46d9936f914f5667f539f69ddaaf85342cc341db62c3eda312aa07a78d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9qwgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:21Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:21 crc kubenswrapper[4735]: I1008 12:56:21.032183 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653370-cd31-4487-8b63-50b145f55853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6da8f9945b0943287cecc7782253cd88835d3a5a823ae5e2107423b9bdc4e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fc9ee7ab9490b5462704528e6923ee420e0c9f3c0ed22e7041fedc8dfdfeb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://492da61345c2220eef06ac2e1979e5402ebb8deb0e3747bb3620ecccd4e83a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58cc5c76f1748baef586618dd49d3392757ef32e4f28a019979f84366e8879c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb1bc51dca9b3bf6c136fe146aa422fa40eb512385617607f682723053988f7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc88f8b567ef0f69f9df25a237c40b32bbd518680da75fe3cabd546713423bf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39ce2ded565e91825f07f1edf0cf5e2de2a454e4d46cc8f1fd4c4d0a6ce47725\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1758c6a3e050fee720b9cae06332bc1d34000d2a61aa27470c483dc8757a5546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wckcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:21Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:21 crc kubenswrapper[4735]: I1008 12:56:21.048109 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tsff9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0464374-65ce-46f6-9e7e-79018b92aff4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce56f02b4d3ab37c171c10113c79b74660cbdc570b55bd951baae65ef4276946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e01ac6683194ace06c37ecdbcda8584329034bc8fd0e6c532dcc6d7a036a72b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e01ac6683194ace06c37ecdbcda8584329034bc8fd0e6c532dcc6d7a036a72b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e086e1263cfb47daf4
b8a395b1335f2e575f9d8c060983ba7090bbe27033d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48e086e1263cfb47daf4b8a395b1335f2e575f9d8c060983ba7090bbe27033d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aceb87b6bfa4020ca777a104430a285d886fdd8fb37b710f754eac6ad1b4304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aceb87b6bfa4020ca777
a104430a285d886fdd8fb37b710f754eac6ad1b4304d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895f5b597faf09af3dfd2688b3e42c2f1713b07e7d78c8d81e856dc3364727f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895f5b597faf09af3dfd2688b3e42c2f1713b07e7d78c8d81e856dc3364727f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{
\\\"containerID\\\":\\\"cri-o://196d2d4913959ac507043de8431ea9294020c429fa0810c9985c77a23f52ad78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://196d2d4913959ac507043de8431ea9294020c429fa0810c9985c77a23f52ad78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b5ebc8f08568ab230481a7570dcdb0c6ed926f2216ef10de368d835811ef129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b5ebc8f08568ab230481a7570dcdb0c6ed926f2216ef10de368d835811ef129\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T
12:56:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tsff9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:21Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:21 crc kubenswrapper[4735]: I1008 12:56:21.079802 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"844f0143-e7a8-48c9-ae8e-753c5c293f9d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e268579ace7ae7961414b2650b6b55471a9fd6574196c5fa2cefbf6979464b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e620e3a2b5b27ac9469d3ca045be90d19b04f969422d4c2a453650a7d06f0918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e7006d4b199ac7450c0e968adf9714db905f5378e279db304db244d0e5e566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72f60bc244cbde5bb8033c24051f07fd6a2a168d97d89d3abcfaac69e6d75878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa627e3bfab38d53beee2a3201f6856009919d567108614667503273d11d0f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:21Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:21 crc kubenswrapper[4735]: I1008 12:56:21.091575 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d289b739-3ef2-4ee7-bb9a-b5cd8ab2b999\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f9030479e031ff1768e372e6da2a0806dc393db70aed31961e3a9c89a9e789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://852a4dea1c1d8a40fc2ba89661ff17845ee58b62e45f4a7df9a8af505d361dd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e511b82acaea281ca468fbdce1eaa7ec3a5df59d91dc90fc500bebb08d51b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:
52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ce3b507093adcc6becc1a3b28e836174ee50ff4d3f583aa84badf4c1f3cb1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:21Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:21 crc kubenswrapper[4735]: I1008 12:56:21.107035 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b69f0d46aba2faee01906db32a5a6785e755b2eee94ccbaf1e438c63798f3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:21Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:21 crc kubenswrapper[4735]: I1008 12:56:21.107640 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:21 crc kubenswrapper[4735]: I1008 12:56:21.107682 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:21 crc kubenswrapper[4735]: I1008 12:56:21.107693 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:21 crc kubenswrapper[4735]: I1008 12:56:21.107707 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:21 crc kubenswrapper[4735]: I1008 12:56:21.107717 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:21Z","lastTransitionTime":"2025-10-08T12:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:21 crc kubenswrapper[4735]: I1008 12:56:21.120873 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:21Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:21 crc kubenswrapper[4735]: I1008 12:56:21.124926 4735 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 08 12:56:21 crc kubenswrapper[4735]: I1008 12:56:21.136971 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qdvjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28749736-d31e-40f2-84e8-a7839d8da376\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://680500d4224fd49593a4fa7294d1c49f059f330b2be33e3ef9ceccc6bd9b9a6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcv62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qdvjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:21Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:21 crc kubenswrapper[4735]: I1008 12:56:21.210326 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:21 crc kubenswrapper[4735]: I1008 12:56:21.210366 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:21 crc kubenswrapper[4735]: I1008 12:56:21.210374 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:21 crc kubenswrapper[4735]: I1008 12:56:21.210411 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:21 crc kubenswrapper[4735]: I1008 12:56:21.210421 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:21Z","lastTransitionTime":"2025-10-08T12:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:21 crc kubenswrapper[4735]: I1008 12:56:21.312468 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:21 crc kubenswrapper[4735]: I1008 12:56:21.312552 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:21 crc kubenswrapper[4735]: I1008 12:56:21.312563 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:21 crc kubenswrapper[4735]: I1008 12:56:21.312580 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:21 crc kubenswrapper[4735]: I1008 12:56:21.312589 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:21Z","lastTransitionTime":"2025-10-08T12:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:21 crc kubenswrapper[4735]: I1008 12:56:21.415208 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:21 crc kubenswrapper[4735]: I1008 12:56:21.415239 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:21 crc kubenswrapper[4735]: I1008 12:56:21.415248 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:21 crc kubenswrapper[4735]: I1008 12:56:21.415261 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:21 crc kubenswrapper[4735]: I1008 12:56:21.415271 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:21Z","lastTransitionTime":"2025-10-08T12:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:21 crc kubenswrapper[4735]: I1008 12:56:21.517281 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:21 crc kubenswrapper[4735]: I1008 12:56:21.517324 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:21 crc kubenswrapper[4735]: I1008 12:56:21.517334 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:21 crc kubenswrapper[4735]: I1008 12:56:21.517350 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:21 crc kubenswrapper[4735]: I1008 12:56:21.517359 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:21Z","lastTransitionTime":"2025-10-08T12:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:21 crc kubenswrapper[4735]: I1008 12:56:21.619236 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:21 crc kubenswrapper[4735]: I1008 12:56:21.619296 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:21 crc kubenswrapper[4735]: I1008 12:56:21.619308 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:21 crc kubenswrapper[4735]: I1008 12:56:21.619328 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:21 crc kubenswrapper[4735]: I1008 12:56:21.619341 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:21Z","lastTransitionTime":"2025-10-08T12:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:21 crc kubenswrapper[4735]: I1008 12:56:21.721755 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:21 crc kubenswrapper[4735]: I1008 12:56:21.721798 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:21 crc kubenswrapper[4735]: I1008 12:56:21.721814 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:21 crc kubenswrapper[4735]: I1008 12:56:21.721843 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:21 crc kubenswrapper[4735]: I1008 12:56:21.721860 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:21Z","lastTransitionTime":"2025-10-08T12:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:21 crc kubenswrapper[4735]: I1008 12:56:21.824930 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:21 crc kubenswrapper[4735]: I1008 12:56:21.824984 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:21 crc kubenswrapper[4735]: I1008 12:56:21.824997 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:21 crc kubenswrapper[4735]: I1008 12:56:21.825017 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:21 crc kubenswrapper[4735]: I1008 12:56:21.825037 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:21Z","lastTransitionTime":"2025-10-08T12:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:21 crc kubenswrapper[4735]: I1008 12:56:21.928344 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:21 crc kubenswrapper[4735]: I1008 12:56:21.928495 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:21 crc kubenswrapper[4735]: I1008 12:56:21.928515 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:21 crc kubenswrapper[4735]: I1008 12:56:21.928541 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:21 crc kubenswrapper[4735]: I1008 12:56:21.928559 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:21Z","lastTransitionTime":"2025-10-08T12:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.031380 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.031427 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.031437 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.031452 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.031462 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:22Z","lastTransitionTime":"2025-10-08T12:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.108310 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.129662 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wckcs_c4653370-cd31-4487-8b63-50b145f55853/ovnkube-controller/0.log" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.131949 4735 generic.go:334] "Generic (PLEG): container finished" podID="c4653370-cd31-4487-8b63-50b145f55853" containerID="39ce2ded565e91825f07f1edf0cf5e2de2a454e4d46cc8f1fd4c4d0a6ce47725" exitCode=1 Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.132021 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" event={"ID":"c4653370-cd31-4487-8b63-50b145f55853","Type":"ContainerDied","Data":"39ce2ded565e91825f07f1edf0cf5e2de2a454e4d46cc8f1fd4c4d0a6ce47725"} Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.132860 4735 scope.go:117] "RemoveContainer" containerID="39ce2ded565e91825f07f1edf0cf5e2de2a454e4d46cc8f1fd4c4d0a6ce47725" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.132916 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.132982 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.132994 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.133005 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.133015 4735 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:22Z","lastTransitionTime":"2025-10-08T12:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.133100 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653370-cd31-4487-8b63-50b145f55853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6da8f9945b0943287cecc7782253cd88835d3a5a823ae5e2107423b9bdc4e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fc9ee7ab9490b5462704528e6923ee420e0c9f3c0ed22e7041fedc8dfdfeb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://492da61345c2220eef06ac2e1979e5402ebb8deb0e3747bb3620ecccd4e83a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58cc5c76f1748baef586618dd49d3392757ef32e4f28a019979f84366e8879c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb1bc51dca9b3bf6c136fe146aa422fa40eb512385617607f682723053988f7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc88f8b567ef0f69f9df25a237c40b32bbd518680da75fe3cabd546713423bf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39ce2ded565e91825f07f1edf0cf5e2de2a454e4d46cc8f1fd4c4d0a6ce47725\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1758c6a3e050fee720b9cae06332bc1d34000d2a61aa27470c483dc8757a5546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wckcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:22Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.153447 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tsff9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0464374-65ce-46f6-9e7e-79018b92aff4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce56f02b4d3ab37c171c10113c79b74660cbdc570b55bd951baae65ef4276946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e01ac6683194ace06c37ecdbcda8584329034bc8fd0e6c532dcc6d7a036a72b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e01ac6683194ace06c37ecdbcda8584329034bc8fd0e6c532dcc6d7a036a72b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e086e1263cfb47daf4
b8a395b1335f2e575f9d8c060983ba7090bbe27033d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48e086e1263cfb47daf4b8a395b1335f2e575f9d8c060983ba7090bbe27033d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aceb87b6bfa4020ca777a104430a285d886fdd8fb37b710f754eac6ad1b4304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aceb87b6bfa4020ca777
a104430a285d886fdd8fb37b710f754eac6ad1b4304d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895f5b597faf09af3dfd2688b3e42c2f1713b07e7d78c8d81e856dc3364727f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895f5b597faf09af3dfd2688b3e42c2f1713b07e7d78c8d81e856dc3364727f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{
\\\"containerID\\\":\\\"cri-o://196d2d4913959ac507043de8431ea9294020c429fa0810c9985c77a23f52ad78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://196d2d4913959ac507043de8431ea9294020c429fa0810c9985c77a23f52ad78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b5ebc8f08568ab230481a7570dcdb0c6ed926f2216ef10de368d835811ef129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b5ebc8f08568ab230481a7570dcdb0c6ed926f2216ef10de368d835811ef129\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T
12:56:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tsff9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:22Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.175456 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"844f0143-e7a8-48c9-ae8e-753c5c293f9d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e268579ace7ae7961414b2650b6b55471a9fd6574196c5fa2cefbf6979464b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e620e3a2b5b27ac9469d3ca045be90d19b04f969422d4c2a453650a7d06f0918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e7006d4b199ac7450c0e968adf9714db905f5378e279db304db244d0e5e566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72f60bc244cbde5bb8033c24051f07fd6a2a168d97d89d3abcfaac69e6d75878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa627e3bfab38d53beee2a3201f6856009919d567108614667503273d11d0f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:22Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.191826 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d7f6479-86cc-40be-8b44-cb1c10627aa2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc2f1c56c7be63150786ec037b77127e2df6071efa7d736d402bca8a011ea32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e2c83b125f3789eeff231ab5416af4461e2443e6abaf28dac1cd5a8d5356287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://302c54532d7a142b5a9483334e74f1d96b0f07920d9c3fcb589bf2b52d1c7124\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6272fc7157705e3f1cd86a22b80f722dc7d152785bcc8029e131cd03568a19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a92ee1544b61eda7e486e807f489c1f07cfc70870a2413575ac443203670542\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 12:56:04.373062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 12:56:04.379553 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3954726057/tls.crt::/tmp/serving-cert-3954726057/tls.key\\\\\\\"\\\\nI1008 12:56:10.064922 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 12:56:10.069814 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 12:56:10.069844 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 12:56:10.069873 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 12:56:10.069878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 12:56:10.082030 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 12:56:10.082115 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082144 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 12:56:10.082156 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 12:56:10.082166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 12:56:10.082177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1008 12:56:10.082039 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1008 12:56:10.085350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f517b378c42cea209cb39c077a6162a3569671ed2c8f91ae7b0b2c411af287a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:22Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.205531 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:22Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.216394 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc6814f3ae991f4c4576eaafb7f325021fe4ef3387b84948efb093150a7974d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T12:56:22Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.227620 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2dfd0474-f64a-4d3b-9990-df182d079b10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4321c24e125d489ad4c28ca906b5143f7017d14e0424cb188a1a031fd07de603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fefb46d9936f914f5667f539f69ddaaf85342cc341db62c3eda312aa07a78d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9qwgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:22Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.234739 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 
12:56:22.234774 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.234785 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.234802 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.234815 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:22Z","lastTransitionTime":"2025-10-08T12:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.238928 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d289b739-3ef2-4ee7-bb9a-b5cd8ab2b999\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f9030479e031ff1768e372e6da2a0806dc393db70aed31961e3a9c89a9e789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://852a4dea1c1d8a40fc2ba89661ff17845ee58b62e45f4a7df9a8af505d361dd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e511b82acaea281ca468fbdce1eaa7ec3a5df59d91dc90fc500bebb08d51b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ce3b507093adcc6becc1a3b28e836174ee50ff4d3f583aa84badf4c1f3cb1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:22Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.250635 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b69f0d46aba2faee01906db32a5a6785e755b2eee94ccbaf1e438c63798f3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:22Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.263311 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:22Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.273422 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qdvjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28749736-d31e-40f2-84e8-a7839d8da376\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://680500d4224fd49593a4fa7294d1c49f059f330b2be33e3ef9ceccc6bd9b9a6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcv62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qdvjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:22Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.284963 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:22Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.296231 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28bb195cc66fff1263a0e92d27d57e97c81b63f873cb5e85d5271ac4fad40a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e88fb0202a08e60ff9cb6528688a4312ec5433491918326a5b46a5fb5e1de888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:22Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.313757 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tfdqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33cd5d891ac473da94cdc0e5c4bcfcbe2ef79a76d13f87ad077df153496ea515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qwbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tfdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:22Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.324824 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jnmqc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed550b4-694a-4a43-99fc-f869224d6cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0a97189f5acb11a4baa7838d67360d6606941daf7520c583dd596cfab265203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qjms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jnmqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:22Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.343402 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.343436 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.343448 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.343465 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.343491 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:22Z","lastTransitionTime":"2025-10-08T12:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.346012 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653370-cd31-4487-8b63-50b145f55853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6da8f9945b0943287cecc7782253cd88835d3a5a823ae5e2107423b9bdc4e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fc9ee7ab9490b5462704528e6923ee420e0c9f3c0ed22e7041fedc8dfdfeb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://492da61345c2220eef06ac2e1979e5402ebb8deb0e3747bb3620ecccd4e83a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58cc5c76f1748baef586618dd49d3392757ef32e4f28a019979f84366e8879c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb1bc51dca9b3bf6c136fe146aa422fa40eb512385617607f682723053988f7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc88f8b567ef0f69f9df25a237c40b32bbd518680da75fe3cabd546713423bf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39ce2ded565e91825f07f1edf0cf5e2de2a454e4d46cc8f1fd4c4d0a6ce47725\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39ce2ded565e91825f07f1edf0cf5e2de2a454e4d46cc8f1fd4c4d0a6ce47725\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T12:56:21Z\\\",\\\"message\\\":\\\"1.Node event handler 2 for removal\\\\nI1008 12:56:21.651845 5995 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1008 12:56:21.651860 5995 handler.go:190] Sending *v1.Namespace event handler 1 for 
removal\\\\nI1008 12:56:21.651866 5995 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1008 12:56:21.651892 5995 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1008 12:56:21.651907 5995 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1008 12:56:21.651915 5995 handler.go:208] Removed *v1.Node event handler 2\\\\nI1008 12:56:21.651925 5995 handler.go:208] Removed *v1.Node event handler 7\\\\nI1008 12:56:21.651932 5995 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1008 12:56:21.652088 5995 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1008 12:56:21.652149 5995 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1008 12:56:21.652319 5995 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1008 12:56:21.652550 5995 factory.go:656] Stopping watch factory\\\\nI1008 12:56:21.652569 5995 handler.go:208] Removed *v1.EgressFirewall 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1758c6a3e050fee720b9cae06332bc1d34000d2a61aa27470c483dc8757a5546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wckcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:22Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.359020 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tsff9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0464374-65ce-46f6-9e7e-79018b92aff4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce56f02b4d3ab37c171c10113c79b74660cbdc570b55bd951baae65ef4276946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e01ac6683194ace06c37ecdbcda8584329034bc8fd0e6c532dcc6d7a036a72b3\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e01ac6683194ace06c37ecdbcda8584329034bc8fd0e6c532dcc6d7a036a72b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e086e1263cfb47daf4b8a395b1335f2e575f9d8c060983ba7090bbe27033d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48e086e1263cfb47daf4b8a395b1335f2e575f9d8c060983ba7090bbe27033d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aceb87b6bfa4020ca777a104430a285d886fdd8fb37b710f754eac6ad1b4304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aceb87b6bfa4020ca777a104430a285d886fdd8fb37b710f754eac6ad1b4304d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895f5
b597faf09af3dfd2688b3e42c2f1713b07e7d78c8d81e856dc3364727f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895f5b597faf09af3dfd2688b3e42c2f1713b07e7d78c8d81e856dc3364727f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://196d2d4913959ac507043de8431ea9294020c429fa0810c9985c77a23f52ad78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://196d2d4913959ac507043de8431ea9294020c429fa0810c9985c77a23f52ad78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:17Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b5ebc8f08568ab230481a7570dcdb0c6ed926f2216ef10de368d835811ef129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b5ebc8f08568ab230481a7570dcdb0c6ed926f2216ef10de368d835811ef129\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tsff9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:22Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.374905 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"844f0143-e7a8-48c9-ae8e-753c5c293f9d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e268579ace7ae7961414b2650b6b55471a9fd6574196c5fa2cefbf6979464b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e620e3a2b5b27ac9469d3ca045be90d19b04f969422d4c2a453650a7d06f0918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e7006d4b199ac7450c0e968adf9714db905f5378e279db304db244d0e5e566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72f60bc244cbde5bb
8033c24051f07fd6a2a168d97d89d3abcfaac69e6d75878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa627e3bfab38d53beee2a3201f6856009919d567108614667503273d11d0f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:22Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.386189 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d7f6479-86cc-40be-8b44-cb1c10627aa2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc2f1c56c7be63150786ec037b77127e2df6071efa7d736d402bca8a011ea32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e2c83b125f3789eeff231ab5416af4461e2443e6abaf28dac1cd5a8d5356287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://302c54532d7a142b5a9483334e74f1d96b0f07920d9c3fcb589bf2b52d1c7124\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6272fc7157705e3f1cd86a22b80f722dc7d152785bcc8029e131cd03568a19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a92ee1544b61eda7e486e807f489c1f07cfc70870a2413575ac443203670542\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T12:56:10Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 12:56:04.373062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 12:56:04.379553 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3954726057/tls.crt::/tmp/serving-cert-3954726057/tls.key\\\\\\\"\\\\nI1008 12:56:10.064922 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 12:56:10.069814 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 12:56:10.069844 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 12:56:10.069873 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 12:56:10.069878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 12:56:10.082030 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 12:56:10.082115 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082144 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 12:56:10.082156 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 12:56:10.082166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 12:56:10.082177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1008 12:56:10.082039 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1008 12:56:10.085350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f517b378c42cea209cb39c077a6162a3569671ed2c8f91ae7b0b2c411af287a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87
ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:22Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.395885 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:22Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.404906 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc6814f3ae991f4c4576eaafb7f325021fe4ef3387b84948efb093150a7974d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T12:56:22Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.413979 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2dfd0474-f64a-4d3b-9990-df182d079b10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4321c24e125d489ad4c28ca906b5143f7017d14e0424cb188a1a031fd07de603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fefb46d9936f914f5667f539f69ddaaf85342cc341db62c3eda312aa07a78d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9qwgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:22Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.423938 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d289b739-3ef2-4ee7-bb9a-b5cd8ab2b999\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f9030479e031ff1768e372e6da2a0806dc393db70aed31961e3a9c89a9e789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://852a4dea1c1d8a40fc2ba89661ff17845ee58b62e45f4a7df9a8af505d361dd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e511b82acaea281ca468fbdce1eaa7ec3a5df59d91dc90fc500bebb08d51b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ce3b507093adcc6becc1a3b28e836174ee50ff4d3f583aa84badf4c1f3cb1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:22Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.435159 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b69f0d46aba2faee01906db32a5a6785e755b2eee94ccbaf1e438c63798f3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:22Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.446144 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.446186 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.446197 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.446214 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.446227 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:22Z","lastTransitionTime":"2025-10-08T12:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.449546 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:22Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.461899 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qdvjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28749736-d31e-40f2-84e8-a7839d8da376\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://680500d4224fd49593a4fa7294d1c49f059f330b2be33e3ef9ceccc6bd9b9a6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcv62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qdvjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:22Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.475354 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:22Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.488914 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28bb195cc66fff1263a0e92d27d57e97c81b63f873cb5e85d5271ac4fad40a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e88fb0202a08e60ff9cb6528688a4312ec5433491918326a5b46a5fb5e1de888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:22Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.502404 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tfdqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33cd5d891ac473da94cdc0e5c4bcfcbe2ef79a76d13f87ad077df153496ea515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qwbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tfdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:22Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.517059 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jnmqc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed550b4-694a-4a43-99fc-f869224d6cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0a97189f5acb11a4baa7838d67360d6606941daf7520c583dd596cfab265203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qjms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jnmqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:22Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.549230 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.549280 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.549290 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.549304 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.549314 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:22Z","lastTransitionTime":"2025-10-08T12:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.651508 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.651539 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.651545 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.651559 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.651569 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:22Z","lastTransitionTime":"2025-10-08T12:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.754912 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.754961 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.754974 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.754993 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.755006 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:22Z","lastTransitionTime":"2025-10-08T12:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.857866 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.857899 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.857909 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.857925 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.857935 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:22Z","lastTransitionTime":"2025-10-08T12:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.894138 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.894177 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.894233 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 12:56:22 crc kubenswrapper[4735]: E1008 12:56:22.894281 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 12:56:22 crc kubenswrapper[4735]: E1008 12:56:22.894365 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 12:56:22 crc kubenswrapper[4735]: E1008 12:56:22.894437 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.960414 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.960457 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.960469 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.960526 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:22 crc kubenswrapper[4735]: I1008 12:56:22.960543 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:22Z","lastTransitionTime":"2025-10-08T12:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.062993 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.063051 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.063064 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.063085 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.063099 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:23Z","lastTransitionTime":"2025-10-08T12:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.138754 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wckcs_c4653370-cd31-4487-8b63-50b145f55853/ovnkube-controller/1.log" Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.140451 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wckcs_c4653370-cd31-4487-8b63-50b145f55853/ovnkube-controller/0.log" Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.146978 4735 generic.go:334] "Generic (PLEG): container finished" podID="c4653370-cd31-4487-8b63-50b145f55853" containerID="cee884b3ae862e7e9a856096614d0ff4e3386e9639fbaa8cdab587ea4f0a109e" exitCode=1 Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.147029 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" event={"ID":"c4653370-cd31-4487-8b63-50b145f55853","Type":"ContainerDied","Data":"cee884b3ae862e7e9a856096614d0ff4e3386e9639fbaa8cdab587ea4f0a109e"} Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.147098 4735 scope.go:117] "RemoveContainer" containerID="39ce2ded565e91825f07f1edf0cf5e2de2a454e4d46cc8f1fd4c4d0a6ce47725" Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.148035 4735 scope.go:117] "RemoveContainer" containerID="cee884b3ae862e7e9a856096614d0ff4e3386e9639fbaa8cdab587ea4f0a109e" Oct 08 12:56:23 crc kubenswrapper[4735]: E1008 12:56:23.148389 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-wckcs_openshift-ovn-kubernetes(c4653370-cd31-4487-8b63-50b145f55853)\"" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" podUID="c4653370-cd31-4487-8b63-50b145f55853" Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.164417 4735 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:23Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.165804 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.165856 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.165866 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.165882 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.165893 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:23Z","lastTransitionTime":"2025-10-08T12:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.179766 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc6814f3ae991f4c4576eaafb7f325021fe4ef3387b84948efb093150a7974d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:23Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.194178 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2dfd0474-f64a-4d3b-9990-df182d079b10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4321c24e125d489ad4c28ca906b5143f7017d14e0424cb188a1a031fd07de603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",
\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fefb46d9936f914f5667f539f69ddaaf85342cc341db62c3eda312aa07a78d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9qwgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-10-08T12:56:23Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.216350 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653370-cd31-4487-8b63-50b145f55853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6da8f9945b0943287cecc7782253cd88835d3a5a823ae5e2107423b9bdc4e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fc9ee7ab9490b5462704528e6923ee420e0c9f3c0ed22e7041fedc8dfdfeb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://492da61345c2220eef06ac2e1979e5402ebb8deb0e3747bb3620ecccd4e83a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58cc5c76f1748baef586618dd49d3392757ef32e4f28a019979f84366e8879c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb1bc51dca9b3bf6c136fe146aa422fa40eb512385617607f682723053988f7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc88f8b567ef0f69f9df25a237c40b32bbd518680da75fe3cabd546713423bf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cee884b3ae862e7e9a856096614d0ff4e3386e9639fbaa8cdab587ea4f0a109e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39ce2ded565e91825f07f1edf0cf5e2de2a454e4d46cc8f1fd4c4d0a6ce47725\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T12:56:21Z\\\",\\\"message\\\":\\\"1.Node event handler 2 for removal\\\\nI1008 12:56:21.651845 5995 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1008 12:56:21.651860 5995 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1008 12:56:21.651866 5995 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1008 12:56:21.651892 
5995 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1008 12:56:21.651907 5995 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1008 12:56:21.651915 5995 handler.go:208] Removed *v1.Node event handler 2\\\\nI1008 12:56:21.651925 5995 handler.go:208] Removed *v1.Node event handler 7\\\\nI1008 12:56:21.651932 5995 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1008 12:56:21.652088 5995 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1008 12:56:21.652149 5995 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1008 12:56:21.652319 5995 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1008 12:56:21.652550 5995 factory.go:656] Stopping watch factory\\\\nI1008 12:56:21.652569 5995 handler.go:208] Removed *v1.EgressFirewall ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cee884b3ae862e7e9a856096614d0ff4e3386e9639fbaa8cdab587ea4f0a109e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T12:56:23Z\\\",\\\"message\\\":\\\"ions generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1008 12:56:23.077887 6141 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} 
options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1758c6a3e050fee720b9cae06332bc1d34000d2a61aa27470c483dc8757a5546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\
\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wckcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:23Z is after 
2025-08-24T17:21:41Z" Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.231563 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tsff9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0464374-65ce-46f6-9e7e-79018b92aff4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce56f02b4d3ab37c171c10113c79b74660cbdc570b55bd951baae65ef4276946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e01ac6683194ace06c37ecdbcda8584329034bc8fd0e6c532dcc6d7a036a72b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e01ac6683194ace06c37ecdbcda8584329034bc8fd0e6c532dcc6d7a036a72b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e086e1263cfb47daf4b8a395b1335f2e575f9d8c060983ba7090bbe27033d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\"
:\\\"cri-o://48e086e1263cfb47daf4b8a395b1335f2e575f9d8c060983ba7090bbe27033d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aceb87b6bfa4020ca777a104430a285d886fdd8fb37b710f754eac6ad1b4304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aceb87b6bfa4020ca777a104430a285d886fdd8fb37b710f754eac6ad1b4304d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895f5b597faf09af3dfd2688b3e42c2f1713b07e7d78c8d81e856dc3364727f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895f5b597faf09af3dfd2688b3e42c2f1713b07e7d78c8d81e856dc3364727f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://196d2d4913959ac507043de8431ea9294020c429fa0810c9985c77a23f52ad78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://196d2d4913959ac507043de8431ea9294020c429fa0810c9985c77a23f52ad78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b5ebc8f08568ab230481a7570dcdb0c6ed926f2216ef10de368d835811ef129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b5ebc8f08568ab230481a7570dcdb0c6ed926f2216ef10de368d835811ef129\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":
\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tsff9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:23Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.261390 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"844f0143-e7a8-48c9-ae8e-753c5c293f9d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e268579ace7ae7961414b2650b6b55471a9fd6574196c5fa2cefbf6979464b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\
":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e620e3a2b5b27ac9469d3ca045be90d19b04f969422d4c2a453650a7d06f0918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e7006d4b199ac7450c0e968adf9714db905f5378e279db304db244d0e5e566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72f60bc244cbde5bb8033c24051f07fd6a2a168d97d89d3abcfaac69e6d75878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa627e3bfab38d53beee2a3201f6856009919d567108614667503273d11d0f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\
"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:23Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.268653 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.268719 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.268742 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.268774 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.268797 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:23Z","lastTransitionTime":"2025-10-08T12:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.278532 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d7f6479-86cc-40be-8b44-cb1c10627aa2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc2f1c56c7be63150786ec037b77127e2df6071efa7d736d402bca8a011ea32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",
\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e2c83b125f3789eeff231ab5416af4461e2443e6abaf28dac1cd5a8d5356287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://302c54532d7a142b5a9483334e74f1d96b0f07920d9c3fcb589bf2b52d1c7124\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resou
rce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6272fc7157705e3f1cd86a22b80f722dc7d152785bcc8029e131cd03568a19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a92ee1544b61eda7e486e807f489c1f07cfc70870a2413575ac443203670542\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 12:56:04.373062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 12:56:04.379553 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3954726057/tls.crt::/tmp/serving-cert-3954726057/tls.key\\\\\\\"\\\\nI1008 12:56:10.064922 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 12:56:10.069814 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 12:56:10.069844 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 12:56:10.069873 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 12:56:10.069878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 12:56:10.082030 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 12:56:10.082115 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082144 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 12:56:10.082156 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 12:56:10.082166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 12:56:10.082177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1008 12:56:10.082039 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1008 12:56:10.085350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f517b378c42cea209cb39c077a6162a3569671ed2c8f91ae7b0b2c411af287a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\
\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:23Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.293543 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b69f0d46aba2faee01906db32a5a6785e755b2eee94ccbaf1e438c63798f3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:23Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.305728 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:23Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.318397 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qdvjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28749736-d31e-40f2-84e8-a7839d8da376\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://680500d4224fd49593a4fa7294d1c49f059f330b2be33e3ef9ceccc6bd9b9a6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcv62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qdvjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:23Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.333875 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d289b739-3ef2-4ee7-bb9a-b5cd8ab2b999\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f9030479e031ff1768e372e6da2a0806dc393db70aed31961e3a9c89a9e789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://852a4dea1c1d8a40fc2ba89661ff17845ee58b62e45f4a7df9a8af505d361dd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e511b82acaea281ca468fbdce1eaa7ec3a5df59d91dc90fc500bebb08d51b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ce3b507093adcc6becc1a3b28e836174ee50ff4d3f583aa84badf4c1f3cb1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:23Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.355579 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tfdqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33cd5d891ac473da94cdc0e5c4bcfcbe2ef79a76d13f87ad077df153496ea515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qwbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tfdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:23Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.371024 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:23 crc 
kubenswrapper[4735]: I1008 12:56:23.371061 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.371069 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.371083 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.371093 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:23Z","lastTransitionTime":"2025-10-08T12:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.372273 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:23Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.385311 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28bb195cc66fff1263a0e92d27d57e97c81b63f873cb5e85d5271ac4fad40a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e88fb0202a08e60ff9cb6528688a4312ec5433491918326a5b46a5fb5e1de888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:23Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.398638 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jnmqc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed550b4-694a-4a43-99fc-f869224d6cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0a97189f5acb11a4baa7838d67360d6606941daf7520c583dd596cfab265203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qjms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jnmqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:23Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.474447 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.474779 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.474871 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.474941 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.475050 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:23Z","lastTransitionTime":"2025-10-08T12:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.577278 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.577579 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.577761 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.577845 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.577912 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:23Z","lastTransitionTime":"2025-10-08T12:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.680574 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.680615 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.680625 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.680644 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.680658 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:23Z","lastTransitionTime":"2025-10-08T12:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.782687 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.782723 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.782734 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.782749 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.782760 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:23Z","lastTransitionTime":"2025-10-08T12:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.885832 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.885945 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.885955 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.885971 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.885980 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:23Z","lastTransitionTime":"2025-10-08T12:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.989315 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.989379 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.989394 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.989418 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:23 crc kubenswrapper[4735]: I1008 12:56:23.989432 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:23Z","lastTransitionTime":"2025-10-08T12:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:24 crc kubenswrapper[4735]: I1008 12:56:24.092648 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:24 crc kubenswrapper[4735]: I1008 12:56:24.092714 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:24 crc kubenswrapper[4735]: I1008 12:56:24.092733 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:24 crc kubenswrapper[4735]: I1008 12:56:24.092760 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:24 crc kubenswrapper[4735]: I1008 12:56:24.092779 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:24Z","lastTransitionTime":"2025-10-08T12:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:24 crc kubenswrapper[4735]: I1008 12:56:24.153760 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wckcs_c4653370-cd31-4487-8b63-50b145f55853/ovnkube-controller/1.log" Oct 08 12:56:24 crc kubenswrapper[4735]: I1008 12:56:24.195294 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:24 crc kubenswrapper[4735]: I1008 12:56:24.195349 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:24 crc kubenswrapper[4735]: I1008 12:56:24.195367 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:24 crc kubenswrapper[4735]: I1008 12:56:24.195434 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:24 crc kubenswrapper[4735]: I1008 12:56:24.195458 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:24Z","lastTransitionTime":"2025-10-08T12:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:24 crc kubenswrapper[4735]: I1008 12:56:24.298299 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:24 crc kubenswrapper[4735]: I1008 12:56:24.298337 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:24 crc kubenswrapper[4735]: I1008 12:56:24.298344 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:24 crc kubenswrapper[4735]: I1008 12:56:24.298358 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:24 crc kubenswrapper[4735]: I1008 12:56:24.298370 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:24Z","lastTransitionTime":"2025-10-08T12:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:24 crc kubenswrapper[4735]: I1008 12:56:24.400774 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:24 crc kubenswrapper[4735]: I1008 12:56:24.400849 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:24 crc kubenswrapper[4735]: I1008 12:56:24.400870 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:24 crc kubenswrapper[4735]: I1008 12:56:24.400900 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:24 crc kubenswrapper[4735]: I1008 12:56:24.400921 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:24Z","lastTransitionTime":"2025-10-08T12:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:24 crc kubenswrapper[4735]: I1008 12:56:24.504347 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:24 crc kubenswrapper[4735]: I1008 12:56:24.504454 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:24 crc kubenswrapper[4735]: I1008 12:56:24.504518 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:24 crc kubenswrapper[4735]: I1008 12:56:24.504555 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:24 crc kubenswrapper[4735]: I1008 12:56:24.504585 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:24Z","lastTransitionTime":"2025-10-08T12:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:24 crc kubenswrapper[4735]: I1008 12:56:24.607749 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:24 crc kubenswrapper[4735]: I1008 12:56:24.607816 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:24 crc kubenswrapper[4735]: I1008 12:56:24.607835 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:24 crc kubenswrapper[4735]: I1008 12:56:24.607862 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:24 crc kubenswrapper[4735]: I1008 12:56:24.607896 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:24Z","lastTransitionTime":"2025-10-08T12:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:24 crc kubenswrapper[4735]: I1008 12:56:24.710993 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:24 crc kubenswrapper[4735]: I1008 12:56:24.711065 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:24 crc kubenswrapper[4735]: I1008 12:56:24.711084 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:24 crc kubenswrapper[4735]: I1008 12:56:24.711110 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:24 crc kubenswrapper[4735]: I1008 12:56:24.711129 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:24Z","lastTransitionTime":"2025-10-08T12:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:24 crc kubenswrapper[4735]: I1008 12:56:24.814633 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:24 crc kubenswrapper[4735]: I1008 12:56:24.814677 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:24 crc kubenswrapper[4735]: I1008 12:56:24.814689 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:24 crc kubenswrapper[4735]: I1008 12:56:24.814705 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:24 crc kubenswrapper[4735]: I1008 12:56:24.814716 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:24Z","lastTransitionTime":"2025-10-08T12:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:24 crc kubenswrapper[4735]: I1008 12:56:24.894657 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:56:24 crc kubenswrapper[4735]: I1008 12:56:24.894656 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:56:24 crc kubenswrapper[4735]: E1008 12:56:24.894808 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 12:56:24 crc kubenswrapper[4735]: E1008 12:56:24.894856 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 12:56:24 crc kubenswrapper[4735]: I1008 12:56:24.894656 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 12:56:24 crc kubenswrapper[4735]: E1008 12:56:24.894960 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 12:56:24 crc kubenswrapper[4735]: I1008 12:56:24.917909 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:24 crc kubenswrapper[4735]: I1008 12:56:24.917959 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:24 crc kubenswrapper[4735]: I1008 12:56:24.917972 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:24 crc kubenswrapper[4735]: I1008 12:56:24.917992 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:24 crc kubenswrapper[4735]: I1008 12:56:24.918005 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:24Z","lastTransitionTime":"2025-10-08T12:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.021336 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.021394 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.021408 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.021434 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.021449 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:25Z","lastTransitionTime":"2025-10-08T12:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.126680 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.126764 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.126781 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.126810 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.126833 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:25Z","lastTransitionTime":"2025-10-08T12:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.166632 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d8pqk"] Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.167162 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d8pqk" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.169432 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.169595 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.182524 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tfdqx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33cd5d891ac473da94cdc0e5c4bcfcbe2ef79a76d13f87ad077df153496ea515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0d
d97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qwbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.
168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tfdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:25Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.197822 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d8pqk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d50458a5-b349-4384-9451-69dcb6f4e3d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kls7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kls7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d8pqk\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:25Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.213304 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:25Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.229225 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.229288 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.229301 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.229325 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.229340 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:25Z","lastTransitionTime":"2025-10-08T12:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.230919 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28bb195cc66fff1263a0e92d27d57e97c81b63f873cb5e85d5271ac4fad40a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://e88fb0202a08e60ff9cb6528688a4312ec5433491918326a5b46a5fb5e1de888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:25Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.235416 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d50458a5-b349-4384-9451-69dcb6f4e3d4-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-d8pqk\" (UID: \"d50458a5-b349-4384-9451-69dcb6f4e3d4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d8pqk" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.235494 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kls7t\" (UniqueName: \"kubernetes.io/projected/d50458a5-b349-4384-9451-69dcb6f4e3d4-kube-api-access-kls7t\") pod \"ovnkube-control-plane-749d76644c-d8pqk\" (UID: \"d50458a5-b349-4384-9451-69dcb6f4e3d4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d8pqk" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.235522 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d50458a5-b349-4384-9451-69dcb6f4e3d4-env-overrides\") pod \"ovnkube-control-plane-749d76644c-d8pqk\" (UID: \"d50458a5-b349-4384-9451-69dcb6f4e3d4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d8pqk" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.235663 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d50458a5-b349-4384-9451-69dcb6f4e3d4-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-d8pqk\" (UID: \"d50458a5-b349-4384-9451-69dcb6f4e3d4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d8pqk" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.243319 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jnmqc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed550b4-694a-4a43-99fc-f869224d6cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0a97189f5acb11a4baa7838d67360d6606941daf7520c583dd596cfab265203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qjms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jnmqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:25Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.257671 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:25Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.277391 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc6814f3ae991f4c4576eaafb7f325021fe4ef3387b84948efb093150a7974d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T12:56:25Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.289733 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2dfd0474-f64a-4d3b-9990-df182d079b10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4321c24e125d489ad4c28ca906b5143f7017d14e0424cb188a1a031fd07de603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fefb46d9936f914f5667f539f69ddaaf85342cc341db62c3eda312aa07a78d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9qwgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:25Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.317935 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653370-cd31-4487-8b63-50b145f55853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6da8f9945b0943287cecc7782253cd88835d3a5a823ae5e2107423b9bdc4e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fc9ee7ab9490b5462704528e6923ee420e0c9f3c0ed22e7041fedc8dfdfeb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://492da61345c2220eef06ac2e1979e5402ebb8deb0e3747bb3620ecccd4e83a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58cc5c76f1748baef586618dd49d3392757ef32e4f28a019979f84366e8879c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb1bc51dca9b3bf6c136fe146aa422fa40eb512385617607f682723053988f7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc88f8b567ef0f69f9df25a237c40b32bbd518680da75fe3cabd546713423bf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cee884b3ae862e7e9a856096614d0ff4e3386e9639fbaa8cdab587ea4f0a109e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39ce2ded565e91825f07f1edf0cf5e2de2a454e4d46cc8f1fd4c4d0a6ce47725\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T12:56:21Z\\\",\\\"message\\\":\\\"1.Node event handler 2 for removal\\\\nI1008 12:56:21.651845 5995 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1008 12:56:21.651860 5995 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1008 12:56:21.651866 5995 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1008 12:56:21.651892 
5995 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1008 12:56:21.651907 5995 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1008 12:56:21.651915 5995 handler.go:208] Removed *v1.Node event handler 2\\\\nI1008 12:56:21.651925 5995 handler.go:208] Removed *v1.Node event handler 7\\\\nI1008 12:56:21.651932 5995 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1008 12:56:21.652088 5995 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1008 12:56:21.652149 5995 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1008 12:56:21.652319 5995 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1008 12:56:21.652550 5995 factory.go:656] Stopping watch factory\\\\nI1008 12:56:21.652569 5995 handler.go:208] Removed *v1.EgressFirewall ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cee884b3ae862e7e9a856096614d0ff4e3386e9639fbaa8cdab587ea4f0a109e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T12:56:23Z\\\",\\\"message\\\":\\\"ions generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1008 12:56:23.077887 6141 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} 
options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1758c6a3e050fee720b9cae06332bc1d34000d2a61aa27470c483dc8757a5546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\
\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wckcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:25Z is after 
2025-08-24T17:21:41Z" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.331728 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.331793 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.331806 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.331824 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.331839 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:25Z","lastTransitionTime":"2025-10-08T12:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.336351 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kls7t\" (UniqueName: \"kubernetes.io/projected/d50458a5-b349-4384-9451-69dcb6f4e3d4-kube-api-access-kls7t\") pod \"ovnkube-control-plane-749d76644c-d8pqk\" (UID: \"d50458a5-b349-4384-9451-69dcb6f4e3d4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d8pqk" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.336444 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d50458a5-b349-4384-9451-69dcb6f4e3d4-env-overrides\") pod \"ovnkube-control-plane-749d76644c-d8pqk\" (UID: \"d50458a5-b349-4384-9451-69dcb6f4e3d4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d8pqk" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.336579 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d50458a5-b349-4384-9451-69dcb6f4e3d4-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-d8pqk\" (UID: \"d50458a5-b349-4384-9451-69dcb6f4e3d4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d8pqk" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.336674 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d50458a5-b349-4384-9451-69dcb6f4e3d4-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-d8pqk\" (UID: \"d50458a5-b349-4384-9451-69dcb6f4e3d4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d8pqk" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.337103 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tsff9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0464374-65ce-46f6-9e7e-79018b92aff4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce56f02b4d3ab37c171c10113c79b74660cbdc570b55bd951baae65ef4276946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e01ac6683194ace06c37ecdbcda8584329034bc8fd0e6c532dcc6d7a036a72b3\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e01ac6683194ace06c37ecdbcda8584329034bc8fd0e6c532dcc6d7a036a72b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e086e1263cfb47daf4b8a395b1335f2e575f9d8c060983ba7090bbe27033d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48e086e1263cfb47daf4b8a395b1335f2e575f9d8c060983ba7090bbe27033d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aceb87b6bfa4020ca777a104430a285d886fdd8fb37b710f754eac6ad1b4304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aceb87b6bfa4020ca777a104430a285d886fdd8fb37b710f754eac6ad1b4304d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895f5
b597faf09af3dfd2688b3e42c2f1713b07e7d78c8d81e856dc3364727f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895f5b597faf09af3dfd2688b3e42c2f1713b07e7d78c8d81e856dc3364727f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://196d2d4913959ac507043de8431ea9294020c429fa0810c9985c77a23f52ad78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://196d2d4913959ac507043de8431ea9294020c429fa0810c9985c77a23f52ad78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:17Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b5ebc8f08568ab230481a7570dcdb0c6ed926f2216ef10de368d835811ef129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b5ebc8f08568ab230481a7570dcdb0c6ed926f2216ef10de368d835811ef129\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tsff9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:25Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.337435 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d50458a5-b349-4384-9451-69dcb6f4e3d4-env-overrides\") pod \"ovnkube-control-plane-749d76644c-d8pqk\" (UID: \"d50458a5-b349-4384-9451-69dcb6f4e3d4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d8pqk" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.337653 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d50458a5-b349-4384-9451-69dcb6f4e3d4-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-d8pqk\" (UID: \"d50458a5-b349-4384-9451-69dcb6f4e3d4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d8pqk" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.345628 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d50458a5-b349-4384-9451-69dcb6f4e3d4-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-d8pqk\" (UID: \"d50458a5-b349-4384-9451-69dcb6f4e3d4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d8pqk" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.359699 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kls7t\" (UniqueName: \"kubernetes.io/projected/d50458a5-b349-4384-9451-69dcb6f4e3d4-kube-api-access-kls7t\") pod \"ovnkube-control-plane-749d76644c-d8pqk\" (UID: \"d50458a5-b349-4384-9451-69dcb6f4e3d4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d8pqk" Oct 08 
12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.364063 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"844f0143-e7a8-48c9-ae8e-753c5c293f9d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e268579ace7ae7961414b2650b6b55471a9fd6574196c5fa2cefbf6979464b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"
data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e620e3a2b5b27ac9469d3ca045be90d19b04f969422d4c2a453650a7d06f0918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e7006d4b199ac7450c0e968adf9714db905f5378e279db304db244d0e5e566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72f60bc244cbde5bb8033c24051f07fd6a2a168d97d89d3abcfaac69e6d75878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26
702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa627e3bfab38d53beee2a3201f6856009919d567108614667503273d11d0f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:54Z\\\"}},\\\"volumeMounts\\
\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:25Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.380895 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d7f6479-86cc-40be-8b44-cb1c10627aa2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc2f1c56c7be63150786ec037b77127e2df6071efa7d736d402bca8a011ea32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e2c83b125f3789eeff231ab5416af4461e2443e6abaf28dac1cd5a8d5356287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://302c54532d7a142b5a9483334e74f1d96b0f07920d9c3fcb589bf2b52d1c7124\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6272fc7157705e3f1cd86a22b80f722dc7d152785bcc8029e131cd03568a19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a92ee1544b61eda7e486e807f489c1f07cfc70870a2413575ac443203670542\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T12:56:10Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 12:56:04.373062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 12:56:04.379553 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3954726057/tls.crt::/tmp/serving-cert-3954726057/tls.key\\\\\\\"\\\\nI1008 12:56:10.064922 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 12:56:10.069814 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 12:56:10.069844 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 12:56:10.069873 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 12:56:10.069878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 12:56:10.082030 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 12:56:10.082115 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082144 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 12:56:10.082156 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 12:56:10.082166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 12:56:10.082177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1008 12:56:10.082039 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1008 12:56:10.085350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f517b378c42cea209cb39c077a6162a3569671ed2c8f91ae7b0b2c411af287a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87
ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:25Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.399642 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b69f0d46aba2faee01906db32a5a6785e755b2eee94ccbaf1e438c63798f3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:25Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.411116 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:25Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.422798 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qdvjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28749736-d31e-40f2-84e8-a7839d8da376\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://680500d4224fd49593a4fa7294d1c49f059f330b2be33e3ef9ceccc6bd9b9a6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcv62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qdvjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:25Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.434488 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.434543 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.434554 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.434572 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.434587 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:25Z","lastTransitionTime":"2025-10-08T12:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.441965 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d289b739-3ef2-4ee7-bb9a-b5cd8ab2b999\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f9030479e031ff1768e372e6da2a0806dc393db70aed31961e3a9c89a9e789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://852a4dea1c1
d8a40fc2ba89661ff17845ee58b62e45f4a7df9a8af505d361dd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e511b82acaea281ca468fbdce1eaa7ec3a5df59d91dc90fc500bebb08d51b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ce3b507093adcc6becc1a3b28e836174ee50ff4d3f583aa84badf4c1f3cb1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:25Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.480180 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d8pqk" Oct 08 12:56:25 crc kubenswrapper[4735]: W1008 12:56:25.494521 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd50458a5_b349_4384_9451_69dcb6f4e3d4.slice/crio-b9a4f8c4a68c299c9a94f8dbf5568fc6dd2769bcad3a066a25c9246ef49594f1 WatchSource:0}: Error finding container b9a4f8c4a68c299c9a94f8dbf5568fc6dd2769bcad3a066a25c9246ef49594f1: Status 404 returned error can't find the container with id b9a4f8c4a68c299c9a94f8dbf5568fc6dd2769bcad3a066a25c9246ef49594f1 Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.537555 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.537930 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.538053 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.538149 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.538227 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:25Z","lastTransitionTime":"2025-10-08T12:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.642139 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.642198 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.642208 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.642223 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.642234 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:25Z","lastTransitionTime":"2025-10-08T12:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.744946 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.744996 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.745008 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.745030 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.745044 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:25Z","lastTransitionTime":"2025-10-08T12:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.847906 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.847950 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.847961 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.847981 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.847994 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:25Z","lastTransitionTime":"2025-10-08T12:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.951234 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.951290 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.951303 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.951322 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:25 crc kubenswrapper[4735]: I1008 12:56:25.951338 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:25Z","lastTransitionTime":"2025-10-08T12:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.054531 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.054579 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.054589 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.054606 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.054616 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:26Z","lastTransitionTime":"2025-10-08T12:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.157096 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.157144 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.157155 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.157171 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.157182 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:26Z","lastTransitionTime":"2025-10-08T12:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.167444 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d8pqk" event={"ID":"d50458a5-b349-4384-9451-69dcb6f4e3d4","Type":"ContainerStarted","Data":"1ecd6dd311d753fc93739fd3f2da3af63485387305d10887aa1b5425b0c7dbd3"} Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.167541 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d8pqk" event={"ID":"d50458a5-b349-4384-9451-69dcb6f4e3d4","Type":"ContainerStarted","Data":"29452c460590f3b977d7a9a86ec5bc26ed6d08054087cc55308bb8a0e6bcf966"} Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.167563 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d8pqk" event={"ID":"d50458a5-b349-4384-9451-69dcb6f4e3d4","Type":"ContainerStarted","Data":"b9a4f8c4a68c299c9a94f8dbf5568fc6dd2769bcad3a066a25c9246ef49594f1"} Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.184228 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tsff9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0464374-65ce-46f6-9e7e-79018b92aff4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce56f02b4d3ab37c171c10113c79b74660cbdc570b55bd951baae65ef4276946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e01ac6683194ace06c37ecdbcda8584329034bc8fd0e6c532dcc6d7a036a72b3\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e01ac6683194ace06c37ecdbcda8584329034bc8fd0e6c532dcc6d7a036a72b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e086e1263cfb47daf4b8a395b1335f2e575f9d8c060983ba7090bbe27033d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48e086e1263cfb47daf4b8a395b1335f2e575f9d8c060983ba7090bbe27033d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aceb87b6bfa4020ca777a104430a285d886fdd8fb37b710f754eac6ad1b4304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aceb87b6bfa4020ca777a104430a285d886fdd8fb37b710f754eac6ad1b4304d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895f5
b597faf09af3dfd2688b3e42c2f1713b07e7d78c8d81e856dc3364727f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895f5b597faf09af3dfd2688b3e42c2f1713b07e7d78c8d81e856dc3364727f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://196d2d4913959ac507043de8431ea9294020c429fa0810c9985c77a23f52ad78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://196d2d4913959ac507043de8431ea9294020c429fa0810c9985c77a23f52ad78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:17Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b5ebc8f08568ab230481a7570dcdb0c6ed926f2216ef10de368d835811ef129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b5ebc8f08568ab230481a7570dcdb0c6ed926f2216ef10de368d835811ef129\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tsff9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:26Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.208560 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"844f0143-e7a8-48c9-ae8e-753c5c293f9d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e268579ace7ae7961414b2650b6b55471a9fd6574196c5fa2cefbf6979464b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e620e3a2b5b27ac9469d3ca045be90d19b04f969422d4c2a453650a7d06f0918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e7006d4b199ac7450c0e968adf9714db905f5378e279db304db244d0e5e566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72f60bc244cbde5bb
8033c24051f07fd6a2a168d97d89d3abcfaac69e6d75878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa627e3bfab38d53beee2a3201f6856009919d567108614667503273d11d0f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:26Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.224152 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d7f6479-86cc-40be-8b44-cb1c10627aa2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc2f1c56c7be63150786ec037b77127e2df6071efa7d736d402bca8a011ea32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e2c83b125f3789eeff231ab5416af4461e2443e6abaf28dac1cd5a8d5356287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://302c54532d7a142b5a9483334e74f1d96b0f07920d9c3fcb589bf2b52d1c7124\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6272fc7157705e3f1cd86a22b80f722dc7d152785bcc8029e131cd03568a19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a92ee1544b61eda7e486e807f489c1f07cfc70870a2413575ac443203670542\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T12:56:10Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 12:56:04.373062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 12:56:04.379553 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3954726057/tls.crt::/tmp/serving-cert-3954726057/tls.key\\\\\\\"\\\\nI1008 12:56:10.064922 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 12:56:10.069814 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 12:56:10.069844 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 12:56:10.069873 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 12:56:10.069878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 12:56:10.082030 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 12:56:10.082115 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082144 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 12:56:10.082156 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 12:56:10.082166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 12:56:10.082177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1008 12:56:10.082039 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1008 12:56:10.085350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f517b378c42cea209cb39c077a6162a3569671ed2c8f91ae7b0b2c411af287a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87
ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:26Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.238997 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:26Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.252592 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc6814f3ae991f4c4576eaafb7f325021fe4ef3387b84948efb093150a7974d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T12:56:26Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.260428 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.260470 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.260505 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.260522 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.260535 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:26Z","lastTransitionTime":"2025-10-08T12:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.266359 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2dfd0474-f64a-4d3b-9990-df182d079b10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4321c24e125d489ad4c28ca906b5143f7017d14e0424cb188a1a031fd07de603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fefb46d9936f914f5667f539f69ddaaf85342cc341db62c3eda312aa07a78d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9qwgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:26Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.291953 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653370-cd31-4487-8b63-50b145f55853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6da8f9945b0943287cecc7782253cd88835d3a5a823ae5e2107423b9bdc4e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fc9ee7ab9490b5462704528e6923ee420e0c9f3c0ed22e7041fedc8dfdfeb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://492da61345c2220eef06ac2e1979e5402ebb8deb0e3747bb3620ecccd4e83a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58cc5c76f1748baef586618dd49d3392757ef32e4f28a019979f84366e8879c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb1bc51dca9b3bf6c136fe146aa422fa40eb512385617607f682723053988f7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc88f8b567ef0f69f9df25a237c40b32bbd518680da75fe3cabd546713423bf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cee884b3ae862e7e9a856096614d0ff4e3386e9639fbaa8cdab587ea4f0a109e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39ce2ded565e91825f07f1edf0cf5e2de2a454e4d46cc8f1fd4c4d0a6ce47725\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T12:56:21Z\\\",\\\"message\\\":\\\"1.Node event handler 2 for removal\\\\nI1008 12:56:21.651845 5995 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1008 12:56:21.651860 5995 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1008 12:56:21.651866 5995 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1008 12:56:21.651892 
5995 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1008 12:56:21.651907 5995 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1008 12:56:21.651915 5995 handler.go:208] Removed *v1.Node event handler 2\\\\nI1008 12:56:21.651925 5995 handler.go:208] Removed *v1.Node event handler 7\\\\nI1008 12:56:21.651932 5995 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1008 12:56:21.652088 5995 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1008 12:56:21.652149 5995 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1008 12:56:21.652319 5995 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1008 12:56:21.652550 5995 factory.go:656] Stopping watch factory\\\\nI1008 12:56:21.652569 5995 handler.go:208] Removed *v1.EgressFirewall ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cee884b3ae862e7e9a856096614d0ff4e3386e9639fbaa8cdab587ea4f0a109e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T12:56:23Z\\\",\\\"message\\\":\\\"ions generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1008 12:56:23.077887 6141 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} 
options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1758c6a3e050fee720b9cae06332bc1d34000d2a61aa27470c483dc8757a5546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\
\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wckcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:26Z is after 
2025-08-24T17:21:41Z" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.303753 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-mldxj"] Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.304360 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mldxj" Oct 08 12:56:26 crc kubenswrapper[4735]: E1008 12:56:26.304449 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mldxj" podUID="771d7ef1-3d58-44eb-89b5-43a5da3e8091" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.306750 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d289b739-3ef2-4ee7-bb9a-b5cd8ab2b999\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f9030479e031ff1768e372e6da2a0806dc393db70aed31961e3a9c89a9e789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://852a4dea1c1d8a40fc2ba89661ff17845ee58b62e45f4a7df9a8af505d361dd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e511b82acaea281ca468fbdce1eaa7ec3a5df59d91dc90fc500bebb08d51b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ce3b507093adcc6becc1a3b28e836174ee50ff4d3f583aa84badf4c1f3cb1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:26Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.320295 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b69f0d46aba2faee01906db32a5a6785e755b2eee94ccbaf1e438c63798f3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:26Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.332593 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:26Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.343947 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qdvjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28749736-d31e-40f2-84e8-a7839d8da376\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://680500d4224fd49593a4fa7294d1c49f059f330b2be33e3ef9ceccc6bd9b9a6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcv62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qdvjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:26Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.347624 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kk2wl\" (UniqueName: \"kubernetes.io/projected/771d7ef1-3d58-44eb-89b5-43a5da3e8091-kube-api-access-kk2wl\") pod \"network-metrics-daemon-mldxj\" (UID: \"771d7ef1-3d58-44eb-89b5-43a5da3e8091\") " pod="openshift-multus/network-metrics-daemon-mldxj" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.347662 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/771d7ef1-3d58-44eb-89b5-43a5da3e8091-metrics-certs\") pod \"network-metrics-daemon-mldxj\" (UID: \"771d7ef1-3d58-44eb-89b5-43a5da3e8091\") " pod="openshift-multus/network-metrics-daemon-mldxj" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.357936 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:26Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.362534 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.362579 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.362590 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.362607 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.362619 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:26Z","lastTransitionTime":"2025-10-08T12:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.375242 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28bb195cc66fff1263a0e92d27d57e97c81b63f873cb5e85d5271ac4fad40a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://e88fb0202a08e60ff9cb6528688a4312ec5433491918326a5b46a5fb5e1de888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:26Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.389936 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tfdqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33cd5d891ac473da94cdc0e5c4bcfcbe2ef79a76d13f87ad077df153496ea515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qwbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tfdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:26Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.403247 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d8pqk" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d50458a5-b349-4384-9451-69dcb6f4e3d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29452c460590f3b977d7a9a86ec5bc26ed6d08054087cc55308bb8a0e6bcf966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kls7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ecd6dd31
1d753fc93739fd3f2da3af63485387305d10887aa1b5425b0c7dbd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kls7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d8pqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:26Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.416405 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jnmqc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed550b4-694a-4a43-99fc-f869224d6cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0a97189f5acb11a4baa7838d67360d6606941daf7520c583dd596cfab265203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qjms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jnmqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:26Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.430506 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2dfd0474-f64a-4d3b-9990-df182d079b10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4321c24e125d489ad4c28ca906b5143f7017d14e0424cb188a1a031fd07de603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fefb46d9936f914f5667f539f69ddaaf85342cc341db62c3eda312aa07a78d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9qwgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:26Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.448122 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kk2wl\" (UniqueName: \"kubernetes.io/projected/771d7ef1-3d58-44eb-89b5-43a5da3e8091-kube-api-access-kk2wl\") pod \"network-metrics-daemon-mldxj\" (UID: \"771d7ef1-3d58-44eb-89b5-43a5da3e8091\") " pod="openshift-multus/network-metrics-daemon-mldxj" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.448164 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/771d7ef1-3d58-44eb-89b5-43a5da3e8091-metrics-certs\") pod \"network-metrics-daemon-mldxj\" (UID: \"771d7ef1-3d58-44eb-89b5-43a5da3e8091\") " pod="openshift-multus/network-metrics-daemon-mldxj" Oct 08 12:56:26 crc kubenswrapper[4735]: E1008 12:56:26.448293 4735 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 12:56:26 crc kubenswrapper[4735]: E1008 12:56:26.448345 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/771d7ef1-3d58-44eb-89b5-43a5da3e8091-metrics-certs podName:771d7ef1-3d58-44eb-89b5-43a5da3e8091 nodeName:}" failed. No retries permitted until 2025-10-08 12:56:26.94833224 +0000 UTC m=+36.756636784 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/771d7ef1-3d58-44eb-89b5-43a5da3e8091-metrics-certs") pod "network-metrics-daemon-mldxj" (UID: "771d7ef1-3d58-44eb-89b5-43a5da3e8091") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.449587 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653370-cd31-4487-8b63-50b145f55853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6da8f9945b0943287cecc7782253cd88835d3a5a823ae5e2107423b9bdc4e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fc9ee7ab9490b5462704528e6923ee420e0c9f3c0ed22e7041fedc8dfdfeb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://492da61345c2220eef06ac2e1979e5402ebb8deb0e3747bb3620ecccd4e83a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58cc5c76f1748baef586618dd49d3392757ef32e4f28a019979f84366e8879c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb1bc51dca9b3bf6c136fe146aa422fa40eb512385617607f682723053988f7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc88f8b567ef0f69f9df25a237c40b32bbd518680da75fe3cabd546713423bf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cee884b3ae862e7e9a856096614d0ff4e3386e9639fbaa8cdab587ea4f0a109e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39ce2ded565e91825f07f1edf0cf5e2de2a454e4d46cc8f1fd4c4d0a6ce47725\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T12:56:21Z\\\",\\\"message\\\":\\\"1.Node event handler 2 for removal\\\\nI1008 12:56:21.651845 5995 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1008 12:56:21.651860 5995 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1008 12:56:21.651866 5995 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1008 12:56:21.651892 
5995 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1008 12:56:21.651907 5995 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1008 12:56:21.651915 5995 handler.go:208] Removed *v1.Node event handler 2\\\\nI1008 12:56:21.651925 5995 handler.go:208] Removed *v1.Node event handler 7\\\\nI1008 12:56:21.651932 5995 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1008 12:56:21.652088 5995 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1008 12:56:21.652149 5995 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1008 12:56:21.652319 5995 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1008 12:56:21.652550 5995 factory.go:656] Stopping watch factory\\\\nI1008 12:56:21.652569 5995 handler.go:208] Removed *v1.EgressFirewall ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cee884b3ae862e7e9a856096614d0ff4e3386e9639fbaa8cdab587ea4f0a109e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T12:56:23Z\\\",\\\"message\\\":\\\"ions generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1008 12:56:23.077887 6141 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} 
options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1758c6a3e050fee720b9cae06332bc1d34000d2a61aa27470c483dc8757a5546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\
\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wckcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:26Z is after 
2025-08-24T17:21:41Z" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.465025 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.465063 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.465074 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.465093 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.465106 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:26Z","lastTransitionTime":"2025-10-08T12:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.470787 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tsff9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0464374-65ce-46f6-9e7e-79018b92aff4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce56f02b4d3ab37c171c10113c79b74660cbdc570b55bd951baae65ef4276946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e01ac6683194ace06c37ecdbcda8584329034bc8fd0e6c532dcc6d7a036a72b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e01ac6683194ace06c37ecdbcda8584329034bc8fd0e6c532dcc6d7a036a72b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e086e1263cfb47daf4b8a395b1335f2e575f9d8c060983ba7090bbe27033d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://48e086e1263cfb47daf4b8a395b1335f2e575f9d8c060983ba7090bbe27033d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aceb87b6bfa4020ca777a104430a285d886fdd8fb37b710f754eac6ad1b4304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aceb87b6bfa4020ca777a104430a285d886fdd8fb37b710f754eac6ad1b4304d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895f5b597faf09af3dfd2688b3e42c2f1713b07e7d78c8d81e856dc3364727f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895f5b597faf09af3dfd2688b3e42c2f1713b07e7d78c8d81e856dc3364727f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://196d2d4913959ac507043de8431ea9294020c429fa0810c9985c77a23f52ad78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://196d2d4913959ac507043de8431ea9294020c429fa0810c9985c77a23f52ad78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b5ebc8f08568ab230481a7570dcdb0c6ed926f2216ef10de368d835811ef129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b5ebc8f08568ab230481a7570dcdb0c6ed926f2216ef10de368d835811ef129\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tsff9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:26Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.471743 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kk2wl\" (UniqueName: \"kubernetes.io/projected/771d7ef1-3d58-44eb-89b5-43a5da3e8091-kube-api-access-kk2wl\") pod \"network-metrics-daemon-mldxj\" (UID: \"771d7ef1-3d58-44eb-89b5-43a5da3e8091\") " pod="openshift-multus/network-metrics-daemon-mldxj" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.495470 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"844f0143-e7a8-48c9-ae8e-753c5c293f9d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e268579ace7ae7961414b2650b6b55471a9fd6574196c5fa2cefbf6979464b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e620e3a2b5b27ac9469d3ca045be90d19b04f969422d4c2a453650a7d06f0918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e7006d4b199ac7450c0e968adf9714db905f5378e279db304db244d0e5e566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72f60bc244cbde5bb8033c24051f07fd6a2a168d97d89d3abcfaac69e6d75878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa627e3bfab38d53beee2a3201f6856009919d567108614667503273d11d0f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:26Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.513876 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d7f6479-86cc-40be-8b44-cb1c10627aa2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc2f1c56c7be63150786ec037b77127e2df6071efa7d736d402bca8a011ea32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e2c83b125f3789eeff231ab5416af4461e2443e6abaf28dac1cd5a8d5356287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://302c54532d7a142b5a9483334e74f1d96b0f07920d9c3fcb589bf2b52d1c7124\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6272fc7157705e3f1cd86a22b80f722dc7d152785bcc8029e131cd03568a19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a92ee1544b61eda7e486e807f489c1f07cfc70870a2413575ac443203670542\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 12:56:04.373062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 12:56:04.379553 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3954726057/tls.crt::/tmp/serving-cert-3954726057/tls.key\\\\\\\"\\\\nI1008 12:56:10.064922 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 12:56:10.069814 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 12:56:10.069844 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 12:56:10.069873 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 12:56:10.069878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 12:56:10.082030 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 12:56:10.082115 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082144 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 12:56:10.082156 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 12:56:10.082166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 12:56:10.082177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1008 12:56:10.082039 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1008 12:56:10.085350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f517b378c42cea209cb39c077a6162a3569671ed2c8f91ae7b0b2c411af287a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:26Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.525453 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:26Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.536984 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc6814f3ae991f4c4576eaafb7f325021fe4ef3387b84948efb093150a7974d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T12:56:26Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.549690 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.549856 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.549900 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:56:26 crc kubenswrapper[4735]: E1008 12:56:26.549932 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 12:56:42.54990889 +0000 UTC m=+52.358213434 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:56:26 crc kubenswrapper[4735]: E1008 12:56:26.549988 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 12:56:26 crc kubenswrapper[4735]: E1008 12:56:26.550009 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 12:56:26 crc kubenswrapper[4735]: E1008 12:56:26.550010 4735 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 12:56:26 crc kubenswrapper[4735]: E1008 12:56:26.550022 4735 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.550041 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" 
Oct 08 12:56:26 crc kubenswrapper[4735]: E1008 12:56:26.550068 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-08 12:56:42.550055524 +0000 UTC m=+52.358360068 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 12:56:26 crc kubenswrapper[4735]: E1008 12:56:26.550086 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 12:56:42.550079185 +0000 UTC m=+52.358383729 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 12:56:26 crc kubenswrapper[4735]: E1008 12:56:26.550134 4735 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 12:56:26 crc kubenswrapper[4735]: E1008 12:56:26.550165 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 12:56:42.550158617 +0000 UTC m=+52.358463161 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.550840 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d289b739-3ef2-4ee7-bb9a-b5cd8ab2b999\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f9030479e031ff1768e372e6da2a0806dc393db70aed31961e3a9c89a9e789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://852a4dea1c1d8a40fc2ba89661ff17845ee58b62e45f4a7df9a8af505d361dd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e511b82acaea281ca468fbdce1eaa7ec3a5df59d91dc90fc500bebb08d51b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ce3b507093adcc6becc1a3b28e836174ee50ff4d3f583aa84badf4c1f3cb1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:26Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.563560 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b69f0d46aba2faee01906db32a5a6785e755b2eee94ccbaf1e438c63798f3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:26Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.566823 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.566874 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.566890 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.566912 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.566929 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:26Z","lastTransitionTime":"2025-10-08T12:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.577796 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:26Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.588717 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qdvjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28749736-d31e-40f2-84e8-a7839d8da376\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://680500d4224fd49593a4fa7294d1c49f059f330b2be33e3ef9ceccc6bd9b9a6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcv62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qdvjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:26Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.602539 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:26Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.616569 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28bb195cc66fff1263a0e92d27d57e97c81b63f873cb5e85d5271ac4fad40a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e88fb0202a08e60ff9cb6528688a4312ec5433491918326a5b46a5fb5e1de888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:26Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.629279 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tfdqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33cd5d891ac473da94cdc0e5c4bcfcbe2ef79a76d13f87ad077df153496ea515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qwbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tfdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:26Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.639231 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d8pqk" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d50458a5-b349-4384-9451-69dcb6f4e3d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29452c460590f3b977d7a9a86ec5bc26ed6d08054087cc55308bb8a0e6bcf966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kls7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ecd6dd31
1d753fc93739fd3f2da3af63485387305d10887aa1b5425b0c7dbd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kls7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d8pqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:26Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.648798 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mldxj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"771d7ef1-3d58-44eb-89b5-43a5da3e8091\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk2wl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk2wl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mldxj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:26Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:26 crc 
kubenswrapper[4735]: I1008 12:56:26.651012 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 12:56:26 crc kubenswrapper[4735]: E1008 12:56:26.651249 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 12:56:26 crc kubenswrapper[4735]: E1008 12:56:26.651291 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 12:56:26 crc kubenswrapper[4735]: E1008 12:56:26.651332 4735 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 12:56:26 crc kubenswrapper[4735]: E1008 12:56:26.651390 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-08 12:56:42.651371268 +0000 UTC m=+52.459675882 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.657940 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jnmqc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed550b4-694a-4a43-99fc-f869224d6cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0a97189f5acb11a4baa7838d67360d6606941daf7520c583dd596cfab265203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns
-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qjms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jnmqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:26Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.669758 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.669793 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.669803 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.669834 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.669844 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:26Z","lastTransitionTime":"2025-10-08T12:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.773257 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.773313 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.773325 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.773345 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.773356 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:26Z","lastTransitionTime":"2025-10-08T12:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.876312 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.876360 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.876375 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.876395 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.876410 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:26Z","lastTransitionTime":"2025-10-08T12:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.894054 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.894104 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.894058 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:56:26 crc kubenswrapper[4735]: E1008 12:56:26.894321 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 12:56:26 crc kubenswrapper[4735]: E1008 12:56:26.894230 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 12:56:26 crc kubenswrapper[4735]: E1008 12:56:26.894547 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.954675 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/771d7ef1-3d58-44eb-89b5-43a5da3e8091-metrics-certs\") pod \"network-metrics-daemon-mldxj\" (UID: \"771d7ef1-3d58-44eb-89b5-43a5da3e8091\") " pod="openshift-multus/network-metrics-daemon-mldxj" Oct 08 12:56:26 crc kubenswrapper[4735]: E1008 12:56:26.954861 4735 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 12:56:26 crc kubenswrapper[4735]: E1008 12:56:26.954986 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/771d7ef1-3d58-44eb-89b5-43a5da3e8091-metrics-certs podName:771d7ef1-3d58-44eb-89b5-43a5da3e8091 nodeName:}" failed. No retries permitted until 2025-10-08 12:56:27.95496158 +0000 UTC m=+37.763266204 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/771d7ef1-3d58-44eb-89b5-43a5da3e8091-metrics-certs") pod "network-metrics-daemon-mldxj" (UID: "771d7ef1-3d58-44eb-89b5-43a5da3e8091") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.961144 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.961279 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.961307 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.961351 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.961381 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:26Z","lastTransitionTime":"2025-10-08T12:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:26 crc kubenswrapper[4735]: E1008 12:56:26.977263 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0d020dc1-103d-4051-9317-7ecb0eb982ec\\\",\\\"systemUUID\\\":\\\"14046600-b993-4087-9f7d-570e1efe09a8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:26Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.981961 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.982022 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.982037 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.982061 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:26 crc kubenswrapper[4735]: I1008 12:56:26.982076 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:26Z","lastTransitionTime":"2025-10-08T12:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:26 crc kubenswrapper[4735]: E1008 12:56:26.997164 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0d020dc1-103d-4051-9317-7ecb0eb982ec\\\",\\\"systemUUID\\\":\\\"14046600-b993-4087-9f7d-570e1efe09a8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:26Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:27 crc kubenswrapper[4735]: I1008 12:56:27.001049 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:27 crc kubenswrapper[4735]: I1008 12:56:27.001089 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:27 crc kubenswrapper[4735]: I1008 12:56:27.001100 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:27 crc kubenswrapper[4735]: I1008 12:56:27.001115 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:27 crc kubenswrapper[4735]: I1008 12:56:27.001125 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:27Z","lastTransitionTime":"2025-10-08T12:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:27 crc kubenswrapper[4735]: E1008 12:56:27.014761 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0d020dc1-103d-4051-9317-7ecb0eb982ec\\\",\\\"systemUUID\\\":\\\"14046600-b993-4087-9f7d-570e1efe09a8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:27Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:27 crc kubenswrapper[4735]: I1008 12:56:27.020995 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:27 crc kubenswrapper[4735]: I1008 12:56:27.021062 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:27 crc kubenswrapper[4735]: I1008 12:56:27.021076 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:27 crc kubenswrapper[4735]: I1008 12:56:27.021605 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:27 crc kubenswrapper[4735]: I1008 12:56:27.021634 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:27Z","lastTransitionTime":"2025-10-08T12:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:27 crc kubenswrapper[4735]: E1008 12:56:27.043357 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0d020dc1-103d-4051-9317-7ecb0eb982ec\\\",\\\"systemUUID\\\":\\\"14046600-b993-4087-9f7d-570e1efe09a8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:27Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:27 crc kubenswrapper[4735]: I1008 12:56:27.048777 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:27 crc kubenswrapper[4735]: I1008 12:56:27.048824 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:27 crc kubenswrapper[4735]: I1008 12:56:27.048841 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:27 crc kubenswrapper[4735]: I1008 12:56:27.048867 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:27 crc kubenswrapper[4735]: I1008 12:56:27.048886 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:27Z","lastTransitionTime":"2025-10-08T12:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:27 crc kubenswrapper[4735]: E1008 12:56:27.069102 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0d020dc1-103d-4051-9317-7ecb0eb982ec\\\",\\\"systemUUID\\\":\\\"14046600-b993-4087-9f7d-570e1efe09a8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:27Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:27 crc kubenswrapper[4735]: E1008 12:56:27.069334 4735 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 08 12:56:27 crc kubenswrapper[4735]: I1008 12:56:27.071103 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:27 crc kubenswrapper[4735]: I1008 12:56:27.071166 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:27 crc kubenswrapper[4735]: I1008 12:56:27.071184 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:27 crc kubenswrapper[4735]: I1008 12:56:27.071211 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:27 crc kubenswrapper[4735]: I1008 12:56:27.071229 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:27Z","lastTransitionTime":"2025-10-08T12:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:27 crc kubenswrapper[4735]: I1008 12:56:27.173468 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:27 crc kubenswrapper[4735]: I1008 12:56:27.173591 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:27 crc kubenswrapper[4735]: I1008 12:56:27.173618 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:27 crc kubenswrapper[4735]: I1008 12:56:27.173653 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:27 crc kubenswrapper[4735]: I1008 12:56:27.173679 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:27Z","lastTransitionTime":"2025-10-08T12:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:27 crc kubenswrapper[4735]: I1008 12:56:27.277618 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:27 crc kubenswrapper[4735]: I1008 12:56:27.277674 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:27 crc kubenswrapper[4735]: I1008 12:56:27.277687 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:27 crc kubenswrapper[4735]: I1008 12:56:27.277705 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:27 crc kubenswrapper[4735]: I1008 12:56:27.277717 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:27Z","lastTransitionTime":"2025-10-08T12:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:27 crc kubenswrapper[4735]: I1008 12:56:27.380043 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:27 crc kubenswrapper[4735]: I1008 12:56:27.380079 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:27 crc kubenswrapper[4735]: I1008 12:56:27.380090 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:27 crc kubenswrapper[4735]: I1008 12:56:27.380105 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:27 crc kubenswrapper[4735]: I1008 12:56:27.380117 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:27Z","lastTransitionTime":"2025-10-08T12:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:27 crc kubenswrapper[4735]: I1008 12:56:27.482593 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:27 crc kubenswrapper[4735]: I1008 12:56:27.482644 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:27 crc kubenswrapper[4735]: I1008 12:56:27.482653 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:27 crc kubenswrapper[4735]: I1008 12:56:27.482670 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:27 crc kubenswrapper[4735]: I1008 12:56:27.482680 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:27Z","lastTransitionTime":"2025-10-08T12:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:27 crc kubenswrapper[4735]: I1008 12:56:27.584573 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:27 crc kubenswrapper[4735]: I1008 12:56:27.584617 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:27 crc kubenswrapper[4735]: I1008 12:56:27.584624 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:27 crc kubenswrapper[4735]: I1008 12:56:27.584639 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:27 crc kubenswrapper[4735]: I1008 12:56:27.584648 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:27Z","lastTransitionTime":"2025-10-08T12:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:27 crc kubenswrapper[4735]: I1008 12:56:27.687720 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:27 crc kubenswrapper[4735]: I1008 12:56:27.687777 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:27 crc kubenswrapper[4735]: I1008 12:56:27.687811 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:27 crc kubenswrapper[4735]: I1008 12:56:27.687882 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:27 crc kubenswrapper[4735]: I1008 12:56:27.687908 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:27Z","lastTransitionTime":"2025-10-08T12:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:27 crc kubenswrapper[4735]: I1008 12:56:27.791626 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:27 crc kubenswrapper[4735]: I1008 12:56:27.791701 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:27 crc kubenswrapper[4735]: I1008 12:56:27.791722 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:27 crc kubenswrapper[4735]: I1008 12:56:27.791752 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:27 crc kubenswrapper[4735]: I1008 12:56:27.791777 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:27Z","lastTransitionTime":"2025-10-08T12:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:27 crc kubenswrapper[4735]: I1008 12:56:27.893791 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mldxj" Oct 08 12:56:27 crc kubenswrapper[4735]: E1008 12:56:27.893982 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mldxj" podUID="771d7ef1-3d58-44eb-89b5-43a5da3e8091" Oct 08 12:56:27 crc kubenswrapper[4735]: I1008 12:56:27.895472 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:27 crc kubenswrapper[4735]: I1008 12:56:27.895554 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:27 crc kubenswrapper[4735]: I1008 12:56:27.895572 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:27 crc kubenswrapper[4735]: I1008 12:56:27.895595 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:27 crc kubenswrapper[4735]: I1008 12:56:27.895615 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:27Z","lastTransitionTime":"2025-10-08T12:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:27 crc kubenswrapper[4735]: I1008 12:56:27.965658 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/771d7ef1-3d58-44eb-89b5-43a5da3e8091-metrics-certs\") pod \"network-metrics-daemon-mldxj\" (UID: \"771d7ef1-3d58-44eb-89b5-43a5da3e8091\") " pod="openshift-multus/network-metrics-daemon-mldxj" Oct 08 12:56:27 crc kubenswrapper[4735]: E1008 12:56:27.965955 4735 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 12:56:27 crc kubenswrapper[4735]: E1008 12:56:27.966090 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/771d7ef1-3d58-44eb-89b5-43a5da3e8091-metrics-certs podName:771d7ef1-3d58-44eb-89b5-43a5da3e8091 nodeName:}" failed. No retries permitted until 2025-10-08 12:56:29.966063247 +0000 UTC m=+39.774367781 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/771d7ef1-3d58-44eb-89b5-43a5da3e8091-metrics-certs") pod "network-metrics-daemon-mldxj" (UID: "771d7ef1-3d58-44eb-89b5-43a5da3e8091") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 12:56:27 crc kubenswrapper[4735]: I1008 12:56:27.999379 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:27 crc kubenswrapper[4735]: I1008 12:56:27.999550 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:27 crc kubenswrapper[4735]: I1008 12:56:27.999583 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:27 crc kubenswrapper[4735]: I1008 12:56:27.999617 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:27 crc kubenswrapper[4735]: I1008 12:56:27.999641 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:27Z","lastTransitionTime":"2025-10-08T12:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:28 crc kubenswrapper[4735]: I1008 12:56:28.102961 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:28 crc kubenswrapper[4735]: I1008 12:56:28.103003 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:28 crc kubenswrapper[4735]: I1008 12:56:28.103013 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:28 crc kubenswrapper[4735]: I1008 12:56:28.103030 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:28 crc kubenswrapper[4735]: I1008 12:56:28.103052 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:28Z","lastTransitionTime":"2025-10-08T12:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:28 crc kubenswrapper[4735]: I1008 12:56:28.206994 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:28 crc kubenswrapper[4735]: I1008 12:56:28.207033 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:28 crc kubenswrapper[4735]: I1008 12:56:28.207042 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:28 crc kubenswrapper[4735]: I1008 12:56:28.207056 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:28 crc kubenswrapper[4735]: I1008 12:56:28.207068 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:28Z","lastTransitionTime":"2025-10-08T12:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:28 crc kubenswrapper[4735]: I1008 12:56:28.309632 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:28 crc kubenswrapper[4735]: I1008 12:56:28.309684 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:28 crc kubenswrapper[4735]: I1008 12:56:28.309702 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:28 crc kubenswrapper[4735]: I1008 12:56:28.309737 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:28 crc kubenswrapper[4735]: I1008 12:56:28.309774 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:28Z","lastTransitionTime":"2025-10-08T12:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:28 crc kubenswrapper[4735]: I1008 12:56:28.412733 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:28 crc kubenswrapper[4735]: I1008 12:56:28.412773 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:28 crc kubenswrapper[4735]: I1008 12:56:28.412808 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:28 crc kubenswrapper[4735]: I1008 12:56:28.412828 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:28 crc kubenswrapper[4735]: I1008 12:56:28.412846 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:28Z","lastTransitionTime":"2025-10-08T12:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:28 crc kubenswrapper[4735]: I1008 12:56:28.515798 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:28 crc kubenswrapper[4735]: I1008 12:56:28.515867 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:28 crc kubenswrapper[4735]: I1008 12:56:28.515889 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:28 crc kubenswrapper[4735]: I1008 12:56:28.515918 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:28 crc kubenswrapper[4735]: I1008 12:56:28.515941 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:28Z","lastTransitionTime":"2025-10-08T12:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:28 crc kubenswrapper[4735]: I1008 12:56:28.618153 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:28 crc kubenswrapper[4735]: I1008 12:56:28.618210 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:28 crc kubenswrapper[4735]: I1008 12:56:28.618224 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:28 crc kubenswrapper[4735]: I1008 12:56:28.618238 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:28 crc kubenswrapper[4735]: I1008 12:56:28.618249 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:28Z","lastTransitionTime":"2025-10-08T12:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:28 crc kubenswrapper[4735]: I1008 12:56:28.721292 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:28 crc kubenswrapper[4735]: I1008 12:56:28.721337 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:28 crc kubenswrapper[4735]: I1008 12:56:28.721350 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:28 crc kubenswrapper[4735]: I1008 12:56:28.721368 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:28 crc kubenswrapper[4735]: I1008 12:56:28.721386 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:28Z","lastTransitionTime":"2025-10-08T12:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:28 crc kubenswrapper[4735]: I1008 12:56:28.825004 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:28 crc kubenswrapper[4735]: I1008 12:56:28.827803 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:28 crc kubenswrapper[4735]: I1008 12:56:28.827876 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:28 crc kubenswrapper[4735]: I1008 12:56:28.827916 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:28 crc kubenswrapper[4735]: I1008 12:56:28.827933 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:28Z","lastTransitionTime":"2025-10-08T12:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:28 crc kubenswrapper[4735]: I1008 12:56:28.894594 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 12:56:28 crc kubenswrapper[4735]: I1008 12:56:28.894639 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:56:28 crc kubenswrapper[4735]: E1008 12:56:28.894841 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 12:56:28 crc kubenswrapper[4735]: I1008 12:56:28.894894 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:56:28 crc kubenswrapper[4735]: E1008 12:56:28.895130 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 12:56:28 crc kubenswrapper[4735]: E1008 12:56:28.895327 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 12:56:28 crc kubenswrapper[4735]: I1008 12:56:28.930685 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:28 crc kubenswrapper[4735]: I1008 12:56:28.930729 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:28 crc kubenswrapper[4735]: I1008 12:56:28.930741 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:28 crc kubenswrapper[4735]: I1008 12:56:28.930761 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:28 crc kubenswrapper[4735]: I1008 12:56:28.930774 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:28Z","lastTransitionTime":"2025-10-08T12:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:29 crc kubenswrapper[4735]: I1008 12:56:29.033181 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:29 crc kubenswrapper[4735]: I1008 12:56:29.033221 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:29 crc kubenswrapper[4735]: I1008 12:56:29.033233 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:29 crc kubenswrapper[4735]: I1008 12:56:29.033252 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:29 crc kubenswrapper[4735]: I1008 12:56:29.033265 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:29Z","lastTransitionTime":"2025-10-08T12:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:29 crc kubenswrapper[4735]: I1008 12:56:29.136648 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:29 crc kubenswrapper[4735]: I1008 12:56:29.136704 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:29 crc kubenswrapper[4735]: I1008 12:56:29.136724 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:29 crc kubenswrapper[4735]: I1008 12:56:29.136748 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:29 crc kubenswrapper[4735]: I1008 12:56:29.136766 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:29Z","lastTransitionTime":"2025-10-08T12:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:29 crc kubenswrapper[4735]: I1008 12:56:29.239751 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:29 crc kubenswrapper[4735]: I1008 12:56:29.239821 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:29 crc kubenswrapper[4735]: I1008 12:56:29.239835 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:29 crc kubenswrapper[4735]: I1008 12:56:29.239851 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:29 crc kubenswrapper[4735]: I1008 12:56:29.239863 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:29Z","lastTransitionTime":"2025-10-08T12:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:29 crc kubenswrapper[4735]: I1008 12:56:29.343238 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:29 crc kubenswrapper[4735]: I1008 12:56:29.343322 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:29 crc kubenswrapper[4735]: I1008 12:56:29.343347 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:29 crc kubenswrapper[4735]: I1008 12:56:29.343379 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:29 crc kubenswrapper[4735]: I1008 12:56:29.343400 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:29Z","lastTransitionTime":"2025-10-08T12:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:29 crc kubenswrapper[4735]: I1008 12:56:29.446553 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:29 crc kubenswrapper[4735]: I1008 12:56:29.446599 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:29 crc kubenswrapper[4735]: I1008 12:56:29.446612 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:29 crc kubenswrapper[4735]: I1008 12:56:29.446631 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:29 crc kubenswrapper[4735]: I1008 12:56:29.446646 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:29Z","lastTransitionTime":"2025-10-08T12:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:29 crc kubenswrapper[4735]: I1008 12:56:29.549739 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:29 crc kubenswrapper[4735]: I1008 12:56:29.549811 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:29 crc kubenswrapper[4735]: I1008 12:56:29.549835 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:29 crc kubenswrapper[4735]: I1008 12:56:29.549865 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:29 crc kubenswrapper[4735]: I1008 12:56:29.549888 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:29Z","lastTransitionTime":"2025-10-08T12:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:29 crc kubenswrapper[4735]: I1008 12:56:29.653188 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:29 crc kubenswrapper[4735]: I1008 12:56:29.653251 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:29 crc kubenswrapper[4735]: I1008 12:56:29.653268 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:29 crc kubenswrapper[4735]: I1008 12:56:29.653292 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:29 crc kubenswrapper[4735]: I1008 12:56:29.653307 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:29Z","lastTransitionTime":"2025-10-08T12:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:29 crc kubenswrapper[4735]: I1008 12:56:29.756887 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:29 crc kubenswrapper[4735]: I1008 12:56:29.757003 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:29 crc kubenswrapper[4735]: I1008 12:56:29.757022 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:29 crc kubenswrapper[4735]: I1008 12:56:29.757048 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:29 crc kubenswrapper[4735]: I1008 12:56:29.757066 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:29Z","lastTransitionTime":"2025-10-08T12:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:29 crc kubenswrapper[4735]: I1008 12:56:29.860438 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:29 crc kubenswrapper[4735]: I1008 12:56:29.860561 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:29 crc kubenswrapper[4735]: I1008 12:56:29.860587 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:29 crc kubenswrapper[4735]: I1008 12:56:29.860619 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:29 crc kubenswrapper[4735]: I1008 12:56:29.860641 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:29Z","lastTransitionTime":"2025-10-08T12:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:29 crc kubenswrapper[4735]: I1008 12:56:29.893900 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mldxj" Oct 08 12:56:29 crc kubenswrapper[4735]: E1008 12:56:29.894184 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mldxj" podUID="771d7ef1-3d58-44eb-89b5-43a5da3e8091" Oct 08 12:56:29 crc kubenswrapper[4735]: I1008 12:56:29.963734 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:29 crc kubenswrapper[4735]: I1008 12:56:29.963798 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:29 crc kubenswrapper[4735]: I1008 12:56:29.963809 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:29 crc kubenswrapper[4735]: I1008 12:56:29.963824 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:29 crc kubenswrapper[4735]: I1008 12:56:29.963835 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:29Z","lastTransitionTime":"2025-10-08T12:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:29 crc kubenswrapper[4735]: I1008 12:56:29.989249 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/771d7ef1-3d58-44eb-89b5-43a5da3e8091-metrics-certs\") pod \"network-metrics-daemon-mldxj\" (UID: \"771d7ef1-3d58-44eb-89b5-43a5da3e8091\") " pod="openshift-multus/network-metrics-daemon-mldxj" Oct 08 12:56:29 crc kubenswrapper[4735]: E1008 12:56:29.989524 4735 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 12:56:29 crc kubenswrapper[4735]: E1008 12:56:29.989646 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/771d7ef1-3d58-44eb-89b5-43a5da3e8091-metrics-certs podName:771d7ef1-3d58-44eb-89b5-43a5da3e8091 nodeName:}" failed. No retries permitted until 2025-10-08 12:56:33.989612243 +0000 UTC m=+43.797916827 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/771d7ef1-3d58-44eb-89b5-43a5da3e8091-metrics-certs") pod "network-metrics-daemon-mldxj" (UID: "771d7ef1-3d58-44eb-89b5-43a5da3e8091") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 12:56:30 crc kubenswrapper[4735]: I1008 12:56:30.067422 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:30 crc kubenswrapper[4735]: I1008 12:56:30.067499 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:30 crc kubenswrapper[4735]: I1008 12:56:30.067511 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:30 crc kubenswrapper[4735]: I1008 12:56:30.067532 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:30 crc kubenswrapper[4735]: I1008 12:56:30.067545 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:30Z","lastTransitionTime":"2025-10-08T12:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:30 crc kubenswrapper[4735]: I1008 12:56:30.171610 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:30 crc kubenswrapper[4735]: I1008 12:56:30.171716 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:30 crc kubenswrapper[4735]: I1008 12:56:30.171792 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:30 crc kubenswrapper[4735]: I1008 12:56:30.171822 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:30 crc kubenswrapper[4735]: I1008 12:56:30.171846 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:30Z","lastTransitionTime":"2025-10-08T12:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:30 crc kubenswrapper[4735]: I1008 12:56:30.274503 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:30 crc kubenswrapper[4735]: I1008 12:56:30.274544 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:30 crc kubenswrapper[4735]: I1008 12:56:30.274553 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:30 crc kubenswrapper[4735]: I1008 12:56:30.274567 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:30 crc kubenswrapper[4735]: I1008 12:56:30.274579 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:30Z","lastTransitionTime":"2025-10-08T12:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:30 crc kubenswrapper[4735]: I1008 12:56:30.377316 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:30 crc kubenswrapper[4735]: I1008 12:56:30.377370 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:30 crc kubenswrapper[4735]: I1008 12:56:30.377387 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:30 crc kubenswrapper[4735]: I1008 12:56:30.377414 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:30 crc kubenswrapper[4735]: I1008 12:56:30.377434 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:30Z","lastTransitionTime":"2025-10-08T12:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:30 crc kubenswrapper[4735]: I1008 12:56:30.480805 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:30 crc kubenswrapper[4735]: I1008 12:56:30.480876 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:30 crc kubenswrapper[4735]: I1008 12:56:30.480904 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:30 crc kubenswrapper[4735]: I1008 12:56:30.480931 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:30 crc kubenswrapper[4735]: I1008 12:56:30.480954 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:30Z","lastTransitionTime":"2025-10-08T12:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:30 crc kubenswrapper[4735]: I1008 12:56:30.583565 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:30 crc kubenswrapper[4735]: I1008 12:56:30.583633 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:30 crc kubenswrapper[4735]: I1008 12:56:30.583650 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:30 crc kubenswrapper[4735]: I1008 12:56:30.583679 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:30 crc kubenswrapper[4735]: I1008 12:56:30.583697 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:30Z","lastTransitionTime":"2025-10-08T12:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:30 crc kubenswrapper[4735]: I1008 12:56:30.687042 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:30 crc kubenswrapper[4735]: I1008 12:56:30.687119 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:30 crc kubenswrapper[4735]: I1008 12:56:30.687134 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:30 crc kubenswrapper[4735]: I1008 12:56:30.687161 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:30 crc kubenswrapper[4735]: I1008 12:56:30.687176 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:30Z","lastTransitionTime":"2025-10-08T12:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:30 crc kubenswrapper[4735]: I1008 12:56:30.790008 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:30 crc kubenswrapper[4735]: I1008 12:56:30.790053 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:30 crc kubenswrapper[4735]: I1008 12:56:30.790062 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:30 crc kubenswrapper[4735]: I1008 12:56:30.790076 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:30 crc kubenswrapper[4735]: I1008 12:56:30.790088 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:30Z","lastTransitionTime":"2025-10-08T12:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:30 crc kubenswrapper[4735]: I1008 12:56:30.892788 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:30 crc kubenswrapper[4735]: I1008 12:56:30.892833 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:30 crc kubenswrapper[4735]: I1008 12:56:30.892845 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:30 crc kubenswrapper[4735]: I1008 12:56:30.892862 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:30 crc kubenswrapper[4735]: I1008 12:56:30.892874 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:30Z","lastTransitionTime":"2025-10-08T12:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:30 crc kubenswrapper[4735]: I1008 12:56:30.893808 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:56:30 crc kubenswrapper[4735]: I1008 12:56:30.893808 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 12:56:30 crc kubenswrapper[4735]: E1008 12:56:30.893981 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 12:56:30 crc kubenswrapper[4735]: I1008 12:56:30.893809 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:56:30 crc kubenswrapper[4735]: E1008 12:56:30.894139 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 12:56:30 crc kubenswrapper[4735]: E1008 12:56:30.894255 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 12:56:30 crc kubenswrapper[4735]: I1008 12:56:30.910013 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:30Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:30 crc kubenswrapper[4735]: I1008 12:56:30.924900 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc6814f3ae991f4c4576eaafb7f325021fe4ef3387b84948efb093150a7974d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T12:56:30Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:30 crc kubenswrapper[4735]: I1008 12:56:30.934514 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2dfd0474-f64a-4d3b-9990-df182d079b10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4321c24e125d489ad4c28ca906b5143f7017d14e0424cb188a1a031fd07de603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fefb46d9936f914f5667f539f69ddaaf85342cc341db62c3eda312aa07a78d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9qwgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:30Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:30 crc kubenswrapper[4735]: I1008 12:56:30.959617 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653370-cd31-4487-8b63-50b145f55853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6da8f9945b0943287cecc7782253cd88835d3a5a823ae5e2107423b9bdc4e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fc9ee7ab9490b5462704528e6923ee420e0c9f3c0ed22e7041fedc8dfdfeb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://492da61345c2220eef06ac2e1979e5402ebb8deb0e3747bb3620ecccd4e83a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58cc5c76f1748baef586618dd49d3392757ef32e4f28a019979f84366e8879c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb1bc51dca9b3bf6c136fe146aa422fa40eb512385617607f682723053988f7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc88f8b567ef0f69f9df25a237c40b32bbd518680da75fe3cabd546713423bf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cee884b3ae862e7e9a856096614d0ff4e3386e9639fbaa8cdab587ea4f0a109e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39ce2ded565e91825f07f1edf0cf5e2de2a454e4d46cc8f1fd4c4d0a6ce47725\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T12:56:21Z\\\",\\\"message\\\":\\\"1.Node event handler 2 for removal\\\\nI1008 12:56:21.651845 5995 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1008 12:56:21.651860 5995 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1008 12:56:21.651866 5995 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1008 12:56:21.651892 
5995 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1008 12:56:21.651907 5995 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1008 12:56:21.651915 5995 handler.go:208] Removed *v1.Node event handler 2\\\\nI1008 12:56:21.651925 5995 handler.go:208] Removed *v1.Node event handler 7\\\\nI1008 12:56:21.651932 5995 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1008 12:56:21.652088 5995 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1008 12:56:21.652149 5995 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1008 12:56:21.652319 5995 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1008 12:56:21.652550 5995 factory.go:656] Stopping watch factory\\\\nI1008 12:56:21.652569 5995 handler.go:208] Removed *v1.EgressFirewall ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cee884b3ae862e7e9a856096614d0ff4e3386e9639fbaa8cdab587ea4f0a109e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T12:56:23Z\\\",\\\"message\\\":\\\"ions generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1008 12:56:23.077887 6141 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} 
options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1758c6a3e050fee720b9cae06332bc1d34000d2a61aa27470c483dc8757a5546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\
\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wckcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:30Z is after 
2025-08-24T17:21:41Z" Oct 08 12:56:30 crc kubenswrapper[4735]: I1008 12:56:30.974997 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tsff9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0464374-65ce-46f6-9e7e-79018b92aff4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce56f02b4d3ab37c171c10113c79b74660cbdc570b55bd951baae65ef4276946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e01ac6683194ace06c37ecdbcda8584329034bc8fd0e6c532dcc6d7a036a72b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e01ac6683194ace06c37ecdbcda8584329034bc8fd0e6c532dcc6d7a036a72b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e086e1263cfb47daf4b8a395b1335f2e575f9d8c060983ba7090bbe27033d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\"
:\\\"cri-o://48e086e1263cfb47daf4b8a395b1335f2e575f9d8c060983ba7090bbe27033d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aceb87b6bfa4020ca777a104430a285d886fdd8fb37b710f754eac6ad1b4304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aceb87b6bfa4020ca777a104430a285d886fdd8fb37b710f754eac6ad1b4304d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895f5b597faf09af3dfd2688b3e42c2f1713b07e7d78c8d81e856dc3364727f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895f5b597faf09af3dfd2688b3e42c2f1713b07e7d78c8d81e856dc3364727f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://196d2d4913959ac507043de8431ea9294020c429fa0810c9985c77a23f52ad78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://196d2d4913959ac507043de8431ea9294020c429fa0810c9985c77a23f52ad78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b5ebc8f08568ab230481a7570dcdb0c6ed926f2216ef10de368d835811ef129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b5ebc8f08568ab230481a7570dcdb0c6ed926f2216ef10de368d835811ef129\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":
\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tsff9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:30Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:30 crc kubenswrapper[4735]: I1008 12:56:30.994567 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:30 crc kubenswrapper[4735]: I1008 12:56:30.994612 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:30 crc kubenswrapper[4735]: I1008 12:56:30.994626 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:30 crc kubenswrapper[4735]: I1008 12:56:30.994647 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:30 crc kubenswrapper[4735]: I1008 12:56:30.994663 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:30Z","lastTransitionTime":"2025-10-08T12:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:31 crc kubenswrapper[4735]: I1008 12:56:31.000443 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"844f0143-e7a8-48c9-ae8e-753c5c293f9d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e268579ace7ae7961414b2650b6b55471a9fd6574196c5fa2cefbf6979464b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e620e3a2b5b27ac9469d3ca045be90d19b04f969422d4c2a453650a7d06f0918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e7006d4b199ac7450c0e968adf9714db905f5378e279db304db244d0e5e566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72f60bc244cbde5bb8033c24051f07fd6a2a168d97d89d3abcfaac69e6d75878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa627e3bfab38d53beee2a3201f6856009919d567108614667503273d11d0f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-08T12:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:30Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:31 crc kubenswrapper[4735]: I1008 12:56:31.021338 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d7f6479-86cc-40be-8b44-cb1c10627aa2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc2f1c56c7be63150786ec037b77127e2df6071efa7d736d402bca8a011ea32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e2c83b125f3789eeff231ab5416af4461e2443e6abaf28dac1cd5a8d5356287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://302c54532d7a142b5a9483334e74f1d96b0f07920d9c3fcb589bf2b52d1c7124\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6272fc7157705e3f1cd86a22b80f722dc7d152785bcc8029e131cd03568a19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a92ee1544b61eda7e486e807f489c1f07cfc70870a2413575ac443203670542\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T12:56:10Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 12:56:04.373062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 12:56:04.379553 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3954726057/tls.crt::/tmp/serving-cert-3954726057/tls.key\\\\\\\"\\\\nI1008 12:56:10.064922 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 12:56:10.069814 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 12:56:10.069844 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 12:56:10.069873 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 12:56:10.069878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 12:56:10.082030 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 12:56:10.082115 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082144 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 12:56:10.082156 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 12:56:10.082166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 12:56:10.082177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1008 12:56:10.082039 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1008 12:56:10.085350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f517b378c42cea209cb39c077a6162a3569671ed2c8f91ae7b0b2c411af287a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87
ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:31Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:31 crc kubenswrapper[4735]: I1008 12:56:31.038431 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b69f0d46aba2faee01906db32a5a6785e755b2eee94ccbaf1e438c63798f3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:31Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:31 crc kubenswrapper[4735]: I1008 12:56:31.053357 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:31Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:31 crc kubenswrapper[4735]: I1008 12:56:31.065144 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qdvjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28749736-d31e-40f2-84e8-a7839d8da376\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://680500d4224fd49593a4fa7294d1c49f059f330b2be33e3ef9ceccc6bd9b9a6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcv62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qdvjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:31Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:31 crc kubenswrapper[4735]: I1008 12:56:31.080184 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d289b739-3ef2-4ee7-bb9a-b5cd8ab2b999\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f9030479e031ff1768e372e6da2a0806dc393db70aed31961e3a9c89a9e789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://852a4dea1c1d8a40fc2ba89661ff17845ee58b62e45f4a7df9a8af505d361dd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e511b82acaea281ca468fbdce1eaa7ec3a5df59d91dc90fc500bebb08d51b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ce3b507093adcc6becc1a3b28e836174ee50ff4d3f583aa84badf4c1f3cb1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:31Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:31 crc kubenswrapper[4735]: I1008 12:56:31.096746 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tfdqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33cd5d891ac473da94cdc0e5c4bcfcbe2ef79a76d13f87ad077df153496ea515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qwbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tfdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:31Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:31 crc kubenswrapper[4735]: I1008 12:56:31.097018 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:31 crc 
kubenswrapper[4735]: I1008 12:56:31.097077 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:31 crc kubenswrapper[4735]: I1008 12:56:31.097096 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:31 crc kubenswrapper[4735]: I1008 12:56:31.097118 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:31 crc kubenswrapper[4735]: I1008 12:56:31.097134 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:31Z","lastTransitionTime":"2025-10-08T12:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:31 crc kubenswrapper[4735]: I1008 12:56:31.114062 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d8pqk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d50458a5-b349-4384-9451-69dcb6f4e3d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29452c460590f3b977d7a9a86ec5bc26ed6d08054087cc55308bb8a0e6bcf966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kls7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ecd6dd311d753fc93739fd3f2da3af634853
87305d10887aa1b5425b0c7dbd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kls7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d8pqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:31Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:31 crc kubenswrapper[4735]: I1008 12:56:31.128920 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mldxj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"771d7ef1-3d58-44eb-89b5-43a5da3e8091\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk2wl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk2wl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mldxj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:31Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:31 crc 
kubenswrapper[4735]: I1008 12:56:31.142155 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:31Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:31 crc kubenswrapper[4735]: I1008 12:56:31.160167 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28bb195cc66fff1263a0e92d27d57e97c81b63f873cb5e85d5271ac4fad40a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e88fb0202a08e60ff9cb6528688a4312ec5433491918326a5b46a5fb5e1de888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:31Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:31 crc kubenswrapper[4735]: I1008 12:56:31.173858 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jnmqc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed550b4-694a-4a43-99fc-f869224d6cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0a97189f5acb11a4baa7838d67360d6606941daf7520c583dd596cfab265203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qjms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jnmqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:31Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:31 crc kubenswrapper[4735]: I1008 12:56:31.199189 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:31 crc kubenswrapper[4735]: I1008 12:56:31.199223 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:31 crc kubenswrapper[4735]: I1008 12:56:31.199231 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:31 crc kubenswrapper[4735]: I1008 12:56:31.199245 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:31 crc kubenswrapper[4735]: I1008 12:56:31.199253 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:31Z","lastTransitionTime":"2025-10-08T12:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:31 crc kubenswrapper[4735]: I1008 12:56:31.302878 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:31 crc kubenswrapper[4735]: I1008 12:56:31.302978 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:31 crc kubenswrapper[4735]: I1008 12:56:31.303002 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:31 crc kubenswrapper[4735]: I1008 12:56:31.303031 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:31 crc kubenswrapper[4735]: I1008 12:56:31.303054 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:31Z","lastTransitionTime":"2025-10-08T12:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:31 crc kubenswrapper[4735]: I1008 12:56:31.405679 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:31 crc kubenswrapper[4735]: I1008 12:56:31.405741 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:31 crc kubenswrapper[4735]: I1008 12:56:31.405756 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:31 crc kubenswrapper[4735]: I1008 12:56:31.405775 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:31 crc kubenswrapper[4735]: I1008 12:56:31.405786 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:31Z","lastTransitionTime":"2025-10-08T12:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:31 crc kubenswrapper[4735]: I1008 12:56:31.508713 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:31 crc kubenswrapper[4735]: I1008 12:56:31.508794 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:31 crc kubenswrapper[4735]: I1008 12:56:31.508805 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:31 crc kubenswrapper[4735]: I1008 12:56:31.508822 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:31 crc kubenswrapper[4735]: I1008 12:56:31.508832 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:31Z","lastTransitionTime":"2025-10-08T12:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:31 crc kubenswrapper[4735]: I1008 12:56:31.612118 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:31 crc kubenswrapper[4735]: I1008 12:56:31.612186 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:31 crc kubenswrapper[4735]: I1008 12:56:31.612202 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:31 crc kubenswrapper[4735]: I1008 12:56:31.612274 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:31 crc kubenswrapper[4735]: I1008 12:56:31.612338 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:31Z","lastTransitionTime":"2025-10-08T12:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:31 crc kubenswrapper[4735]: I1008 12:56:31.715275 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:31 crc kubenswrapper[4735]: I1008 12:56:31.715326 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:31 crc kubenswrapper[4735]: I1008 12:56:31.715339 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:31 crc kubenswrapper[4735]: I1008 12:56:31.715359 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:31 crc kubenswrapper[4735]: I1008 12:56:31.715373 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:31Z","lastTransitionTime":"2025-10-08T12:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:31 crc kubenswrapper[4735]: I1008 12:56:31.818724 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:31 crc kubenswrapper[4735]: I1008 12:56:31.818763 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:31 crc kubenswrapper[4735]: I1008 12:56:31.818776 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:31 crc kubenswrapper[4735]: I1008 12:56:31.818792 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:31 crc kubenswrapper[4735]: I1008 12:56:31.818805 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:31Z","lastTransitionTime":"2025-10-08T12:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:31 crc kubenswrapper[4735]: I1008 12:56:31.894555 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mldxj" Oct 08 12:56:31 crc kubenswrapper[4735]: E1008 12:56:31.894765 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mldxj" podUID="771d7ef1-3d58-44eb-89b5-43a5da3e8091" Oct 08 12:56:31 crc kubenswrapper[4735]: I1008 12:56:31.921099 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:31 crc kubenswrapper[4735]: I1008 12:56:31.921147 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:31 crc kubenswrapper[4735]: I1008 12:56:31.921158 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:31 crc kubenswrapper[4735]: I1008 12:56:31.921178 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:31 crc kubenswrapper[4735]: I1008 12:56:31.921192 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:31Z","lastTransitionTime":"2025-10-08T12:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:32 crc kubenswrapper[4735]: I1008 12:56:32.023636 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:32 crc kubenswrapper[4735]: I1008 12:56:32.023688 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:32 crc kubenswrapper[4735]: I1008 12:56:32.023700 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:32 crc kubenswrapper[4735]: I1008 12:56:32.023720 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:32 crc kubenswrapper[4735]: I1008 12:56:32.023734 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:32Z","lastTransitionTime":"2025-10-08T12:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:32 crc kubenswrapper[4735]: I1008 12:56:32.125974 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:32 crc kubenswrapper[4735]: I1008 12:56:32.126018 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:32 crc kubenswrapper[4735]: I1008 12:56:32.126035 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:32 crc kubenswrapper[4735]: I1008 12:56:32.126056 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:32 crc kubenswrapper[4735]: I1008 12:56:32.126071 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:32Z","lastTransitionTime":"2025-10-08T12:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:32 crc kubenswrapper[4735]: I1008 12:56:32.228273 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:32 crc kubenswrapper[4735]: I1008 12:56:32.228342 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:32 crc kubenswrapper[4735]: I1008 12:56:32.228351 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:32 crc kubenswrapper[4735]: I1008 12:56:32.228367 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:32 crc kubenswrapper[4735]: I1008 12:56:32.228377 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:32Z","lastTransitionTime":"2025-10-08T12:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:32 crc kubenswrapper[4735]: I1008 12:56:32.330643 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:32 crc kubenswrapper[4735]: I1008 12:56:32.330696 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:32 crc kubenswrapper[4735]: I1008 12:56:32.330708 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:32 crc kubenswrapper[4735]: I1008 12:56:32.330726 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:32 crc kubenswrapper[4735]: I1008 12:56:32.330739 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:32Z","lastTransitionTime":"2025-10-08T12:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:32 crc kubenswrapper[4735]: I1008 12:56:32.432551 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:32 crc kubenswrapper[4735]: I1008 12:56:32.432602 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:32 crc kubenswrapper[4735]: I1008 12:56:32.432615 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:32 crc kubenswrapper[4735]: I1008 12:56:32.432632 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:32 crc kubenswrapper[4735]: I1008 12:56:32.432644 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:32Z","lastTransitionTime":"2025-10-08T12:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:32 crc kubenswrapper[4735]: I1008 12:56:32.535689 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:32 crc kubenswrapper[4735]: I1008 12:56:32.535781 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:32 crc kubenswrapper[4735]: I1008 12:56:32.535810 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:32 crc kubenswrapper[4735]: I1008 12:56:32.535835 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:32 crc kubenswrapper[4735]: I1008 12:56:32.535852 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:32Z","lastTransitionTime":"2025-10-08T12:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:32 crc kubenswrapper[4735]: I1008 12:56:32.638142 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:32 crc kubenswrapper[4735]: I1008 12:56:32.638187 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:32 crc kubenswrapper[4735]: I1008 12:56:32.638198 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:32 crc kubenswrapper[4735]: I1008 12:56:32.638217 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:32 crc kubenswrapper[4735]: I1008 12:56:32.638230 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:32Z","lastTransitionTime":"2025-10-08T12:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:32 crc kubenswrapper[4735]: I1008 12:56:32.741082 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:32 crc kubenswrapper[4735]: I1008 12:56:32.741150 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:32 crc kubenswrapper[4735]: I1008 12:56:32.741174 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:32 crc kubenswrapper[4735]: I1008 12:56:32.741203 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:32 crc kubenswrapper[4735]: I1008 12:56:32.741223 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:32Z","lastTransitionTime":"2025-10-08T12:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:32 crc kubenswrapper[4735]: I1008 12:56:32.844144 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:32 crc kubenswrapper[4735]: I1008 12:56:32.844190 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:32 crc kubenswrapper[4735]: I1008 12:56:32.844203 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:32 crc kubenswrapper[4735]: I1008 12:56:32.844222 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:32 crc kubenswrapper[4735]: I1008 12:56:32.844234 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:32Z","lastTransitionTime":"2025-10-08T12:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:32 crc kubenswrapper[4735]: I1008 12:56:32.894848 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:56:32 crc kubenswrapper[4735]: I1008 12:56:32.894953 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 12:56:32 crc kubenswrapper[4735]: I1008 12:56:32.894865 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:56:32 crc kubenswrapper[4735]: E1008 12:56:32.895112 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 12:56:32 crc kubenswrapper[4735]: E1008 12:56:32.895936 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 12:56:32 crc kubenswrapper[4735]: E1008 12:56:32.896372 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 12:56:32 crc kubenswrapper[4735]: I1008 12:56:32.946747 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:32 crc kubenswrapper[4735]: I1008 12:56:32.946828 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:32 crc kubenswrapper[4735]: I1008 12:56:32.946849 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:32 crc kubenswrapper[4735]: I1008 12:56:32.946877 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:32 crc kubenswrapper[4735]: I1008 12:56:32.946899 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:32Z","lastTransitionTime":"2025-10-08T12:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:33 crc kubenswrapper[4735]: I1008 12:56:33.049751 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:33 crc kubenswrapper[4735]: I1008 12:56:33.049795 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:33 crc kubenswrapper[4735]: I1008 12:56:33.049806 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:33 crc kubenswrapper[4735]: I1008 12:56:33.049821 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:33 crc kubenswrapper[4735]: I1008 12:56:33.049832 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:33Z","lastTransitionTime":"2025-10-08T12:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:33 crc kubenswrapper[4735]: I1008 12:56:33.152917 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:33 crc kubenswrapper[4735]: I1008 12:56:33.152991 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:33 crc kubenswrapper[4735]: I1008 12:56:33.153010 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:33 crc kubenswrapper[4735]: I1008 12:56:33.153034 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:33 crc kubenswrapper[4735]: I1008 12:56:33.153055 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:33Z","lastTransitionTime":"2025-10-08T12:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:33 crc kubenswrapper[4735]: I1008 12:56:33.255547 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:33 crc kubenswrapper[4735]: I1008 12:56:33.255588 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:33 crc kubenswrapper[4735]: I1008 12:56:33.255600 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:33 crc kubenswrapper[4735]: I1008 12:56:33.255617 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:33 crc kubenswrapper[4735]: I1008 12:56:33.255629 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:33Z","lastTransitionTime":"2025-10-08T12:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:33 crc kubenswrapper[4735]: I1008 12:56:33.359057 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:33 crc kubenswrapper[4735]: I1008 12:56:33.359129 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:33 crc kubenswrapper[4735]: I1008 12:56:33.359155 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:33 crc kubenswrapper[4735]: I1008 12:56:33.359186 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:33 crc kubenswrapper[4735]: I1008 12:56:33.359208 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:33Z","lastTransitionTime":"2025-10-08T12:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:33 crc kubenswrapper[4735]: I1008 12:56:33.461714 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:33 crc kubenswrapper[4735]: I1008 12:56:33.461774 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:33 crc kubenswrapper[4735]: I1008 12:56:33.461788 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:33 crc kubenswrapper[4735]: I1008 12:56:33.461809 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:33 crc kubenswrapper[4735]: I1008 12:56:33.461823 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:33Z","lastTransitionTime":"2025-10-08T12:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:33 crc kubenswrapper[4735]: I1008 12:56:33.564455 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:33 crc kubenswrapper[4735]: I1008 12:56:33.564502 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:33 crc kubenswrapper[4735]: I1008 12:56:33.564513 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:33 crc kubenswrapper[4735]: I1008 12:56:33.564529 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:33 crc kubenswrapper[4735]: I1008 12:56:33.564541 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:33Z","lastTransitionTime":"2025-10-08T12:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:33 crc kubenswrapper[4735]: I1008 12:56:33.667463 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:33 crc kubenswrapper[4735]: I1008 12:56:33.667569 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:33 crc kubenswrapper[4735]: I1008 12:56:33.667592 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:33 crc kubenswrapper[4735]: I1008 12:56:33.667621 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:33 crc kubenswrapper[4735]: I1008 12:56:33.667642 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:33Z","lastTransitionTime":"2025-10-08T12:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:33 crc kubenswrapper[4735]: I1008 12:56:33.770445 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:33 crc kubenswrapper[4735]: I1008 12:56:33.770535 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:33 crc kubenswrapper[4735]: I1008 12:56:33.770551 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:33 crc kubenswrapper[4735]: I1008 12:56:33.770573 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:33 crc kubenswrapper[4735]: I1008 12:56:33.770590 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:33Z","lastTransitionTime":"2025-10-08T12:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:33 crc kubenswrapper[4735]: I1008 12:56:33.873093 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:33 crc kubenswrapper[4735]: I1008 12:56:33.873128 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:33 crc kubenswrapper[4735]: I1008 12:56:33.873138 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:33 crc kubenswrapper[4735]: I1008 12:56:33.873155 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:33 crc kubenswrapper[4735]: I1008 12:56:33.873167 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:33Z","lastTransitionTime":"2025-10-08T12:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:33 crc kubenswrapper[4735]: I1008 12:56:33.894662 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mldxj" Oct 08 12:56:33 crc kubenswrapper[4735]: E1008 12:56:33.894831 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mldxj" podUID="771d7ef1-3d58-44eb-89b5-43a5da3e8091" Oct 08 12:56:33 crc kubenswrapper[4735]: I1008 12:56:33.976051 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:33 crc kubenswrapper[4735]: I1008 12:56:33.976109 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:33 crc kubenswrapper[4735]: I1008 12:56:33.976129 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:33 crc kubenswrapper[4735]: I1008 12:56:33.976152 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:33 crc kubenswrapper[4735]: I1008 12:56:33.976168 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:33Z","lastTransitionTime":"2025-10-08T12:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:34 crc kubenswrapper[4735]: I1008 12:56:34.034801 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/771d7ef1-3d58-44eb-89b5-43a5da3e8091-metrics-certs\") pod \"network-metrics-daemon-mldxj\" (UID: \"771d7ef1-3d58-44eb-89b5-43a5da3e8091\") " pod="openshift-multus/network-metrics-daemon-mldxj" Oct 08 12:56:34 crc kubenswrapper[4735]: E1008 12:56:34.035000 4735 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 12:56:34 crc kubenswrapper[4735]: E1008 12:56:34.035103 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/771d7ef1-3d58-44eb-89b5-43a5da3e8091-metrics-certs podName:771d7ef1-3d58-44eb-89b5-43a5da3e8091 nodeName:}" failed. No retries permitted until 2025-10-08 12:56:42.035073334 +0000 UTC m=+51.843377918 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/771d7ef1-3d58-44eb-89b5-43a5da3e8091-metrics-certs") pod "network-metrics-daemon-mldxj" (UID: "771d7ef1-3d58-44eb-89b5-43a5da3e8091") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 12:56:34 crc kubenswrapper[4735]: I1008 12:56:34.078936 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:34 crc kubenswrapper[4735]: I1008 12:56:34.078997 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:34 crc kubenswrapper[4735]: I1008 12:56:34.079007 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:34 crc kubenswrapper[4735]: I1008 12:56:34.079026 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:34 crc kubenswrapper[4735]: I1008 12:56:34.079039 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:34Z","lastTransitionTime":"2025-10-08T12:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:34 crc kubenswrapper[4735]: I1008 12:56:34.182225 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:34 crc kubenswrapper[4735]: I1008 12:56:34.182267 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:34 crc kubenswrapper[4735]: I1008 12:56:34.182276 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:34 crc kubenswrapper[4735]: I1008 12:56:34.182292 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:34 crc kubenswrapper[4735]: I1008 12:56:34.182302 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:34Z","lastTransitionTime":"2025-10-08T12:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:34 crc kubenswrapper[4735]: I1008 12:56:34.284716 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:34 crc kubenswrapper[4735]: I1008 12:56:34.284773 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:34 crc kubenswrapper[4735]: I1008 12:56:34.284792 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:34 crc kubenswrapper[4735]: I1008 12:56:34.284814 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:34 crc kubenswrapper[4735]: I1008 12:56:34.284834 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:34Z","lastTransitionTime":"2025-10-08T12:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:34 crc kubenswrapper[4735]: I1008 12:56:34.387149 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:34 crc kubenswrapper[4735]: I1008 12:56:34.387203 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:34 crc kubenswrapper[4735]: I1008 12:56:34.387216 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:34 crc kubenswrapper[4735]: I1008 12:56:34.387236 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:34 crc kubenswrapper[4735]: I1008 12:56:34.387254 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:34Z","lastTransitionTime":"2025-10-08T12:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:34 crc kubenswrapper[4735]: I1008 12:56:34.490013 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:34 crc kubenswrapper[4735]: I1008 12:56:34.490054 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:34 crc kubenswrapper[4735]: I1008 12:56:34.490065 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:34 crc kubenswrapper[4735]: I1008 12:56:34.490080 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:34 crc kubenswrapper[4735]: I1008 12:56:34.490090 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:34Z","lastTransitionTime":"2025-10-08T12:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:34 crc kubenswrapper[4735]: I1008 12:56:34.592328 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:34 crc kubenswrapper[4735]: I1008 12:56:34.592373 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:34 crc kubenswrapper[4735]: I1008 12:56:34.592381 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:34 crc kubenswrapper[4735]: I1008 12:56:34.592395 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:34 crc kubenswrapper[4735]: I1008 12:56:34.592405 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:34Z","lastTransitionTime":"2025-10-08T12:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:34 crc kubenswrapper[4735]: I1008 12:56:34.695148 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:34 crc kubenswrapper[4735]: I1008 12:56:34.695215 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:34 crc kubenswrapper[4735]: I1008 12:56:34.695238 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:34 crc kubenswrapper[4735]: I1008 12:56:34.695267 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:34 crc kubenswrapper[4735]: I1008 12:56:34.695297 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:34Z","lastTransitionTime":"2025-10-08T12:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:34 crc kubenswrapper[4735]: I1008 12:56:34.797829 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:34 crc kubenswrapper[4735]: I1008 12:56:34.797872 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:34 crc kubenswrapper[4735]: I1008 12:56:34.797882 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:34 crc kubenswrapper[4735]: I1008 12:56:34.797900 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:34 crc kubenswrapper[4735]: I1008 12:56:34.797912 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:34Z","lastTransitionTime":"2025-10-08T12:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:34 crc kubenswrapper[4735]: I1008 12:56:34.894436 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:56:34 crc kubenswrapper[4735]: I1008 12:56:34.894626 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:56:34 crc kubenswrapper[4735]: I1008 12:56:34.894530 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 12:56:34 crc kubenswrapper[4735]: E1008 12:56:34.894797 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 12:56:34 crc kubenswrapper[4735]: E1008 12:56:34.894938 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 12:56:34 crc kubenswrapper[4735]: E1008 12:56:34.895122 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 12:56:34 crc kubenswrapper[4735]: I1008 12:56:34.901026 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:34 crc kubenswrapper[4735]: I1008 12:56:34.901060 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:34 crc kubenswrapper[4735]: I1008 12:56:34.901068 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:34 crc kubenswrapper[4735]: I1008 12:56:34.901082 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:34 crc kubenswrapper[4735]: I1008 12:56:34.901096 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:34Z","lastTransitionTime":"2025-10-08T12:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:35 crc kubenswrapper[4735]: I1008 12:56:35.004902 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:35 crc kubenswrapper[4735]: I1008 12:56:35.004972 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:35 crc kubenswrapper[4735]: I1008 12:56:35.004990 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:35 crc kubenswrapper[4735]: I1008 12:56:35.005016 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:35 crc kubenswrapper[4735]: I1008 12:56:35.005035 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:35Z","lastTransitionTime":"2025-10-08T12:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:35 crc kubenswrapper[4735]: I1008 12:56:35.108103 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:35 crc kubenswrapper[4735]: I1008 12:56:35.108357 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:35 crc kubenswrapper[4735]: I1008 12:56:35.108446 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:35 crc kubenswrapper[4735]: I1008 12:56:35.108579 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:35 crc kubenswrapper[4735]: I1008 12:56:35.108706 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:35Z","lastTransitionTime":"2025-10-08T12:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:35 crc kubenswrapper[4735]: I1008 12:56:35.210535 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:35 crc kubenswrapper[4735]: I1008 12:56:35.210907 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:35 crc kubenswrapper[4735]: I1008 12:56:35.210993 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:35 crc kubenswrapper[4735]: I1008 12:56:35.211093 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:35 crc kubenswrapper[4735]: I1008 12:56:35.211189 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:35Z","lastTransitionTime":"2025-10-08T12:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:35 crc kubenswrapper[4735]: I1008 12:56:35.314834 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:35 crc kubenswrapper[4735]: I1008 12:56:35.314879 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:35 crc kubenswrapper[4735]: I1008 12:56:35.314898 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:35 crc kubenswrapper[4735]: I1008 12:56:35.314924 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:35 crc kubenswrapper[4735]: I1008 12:56:35.314943 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:35Z","lastTransitionTime":"2025-10-08T12:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:35 crc kubenswrapper[4735]: I1008 12:56:35.417820 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:35 crc kubenswrapper[4735]: I1008 12:56:35.417964 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:35 crc kubenswrapper[4735]: I1008 12:56:35.417979 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:35 crc kubenswrapper[4735]: I1008 12:56:35.417996 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:35 crc kubenswrapper[4735]: I1008 12:56:35.418009 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:35Z","lastTransitionTime":"2025-10-08T12:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:35 crc kubenswrapper[4735]: I1008 12:56:35.521057 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:35 crc kubenswrapper[4735]: I1008 12:56:35.521216 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:35 crc kubenswrapper[4735]: I1008 12:56:35.521241 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:35 crc kubenswrapper[4735]: I1008 12:56:35.521266 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:35 crc kubenswrapper[4735]: I1008 12:56:35.521283 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:35Z","lastTransitionTime":"2025-10-08T12:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:35 crc kubenswrapper[4735]: I1008 12:56:35.624102 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:35 crc kubenswrapper[4735]: I1008 12:56:35.624143 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:35 crc kubenswrapper[4735]: I1008 12:56:35.624152 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:35 crc kubenswrapper[4735]: I1008 12:56:35.624183 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:35 crc kubenswrapper[4735]: I1008 12:56:35.624195 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:35Z","lastTransitionTime":"2025-10-08T12:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:35 crc kubenswrapper[4735]: I1008 12:56:35.726835 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:35 crc kubenswrapper[4735]: I1008 12:56:35.726959 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:35 crc kubenswrapper[4735]: I1008 12:56:35.726985 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:35 crc kubenswrapper[4735]: I1008 12:56:35.727015 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:35 crc kubenswrapper[4735]: I1008 12:56:35.727033 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:35Z","lastTransitionTime":"2025-10-08T12:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:35 crc kubenswrapper[4735]: I1008 12:56:35.830679 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:35 crc kubenswrapper[4735]: I1008 12:56:35.830840 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:35 crc kubenswrapper[4735]: I1008 12:56:35.830871 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:35 crc kubenswrapper[4735]: I1008 12:56:35.830903 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:35 crc kubenswrapper[4735]: I1008 12:56:35.830926 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:35Z","lastTransitionTime":"2025-10-08T12:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:35 crc kubenswrapper[4735]: I1008 12:56:35.894368 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mldxj" Oct 08 12:56:35 crc kubenswrapper[4735]: E1008 12:56:35.894675 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mldxj" podUID="771d7ef1-3d58-44eb-89b5-43a5da3e8091" Oct 08 12:56:35 crc kubenswrapper[4735]: I1008 12:56:35.933548 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:35 crc kubenswrapper[4735]: I1008 12:56:35.933602 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:35 crc kubenswrapper[4735]: I1008 12:56:35.933616 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:35 crc kubenswrapper[4735]: I1008 12:56:35.933635 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:35 crc kubenswrapper[4735]: I1008 12:56:35.933654 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:35Z","lastTransitionTime":"2025-10-08T12:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:36 crc kubenswrapper[4735]: I1008 12:56:36.037017 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:36 crc kubenswrapper[4735]: I1008 12:56:36.037082 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:36 crc kubenswrapper[4735]: I1008 12:56:36.037103 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:36 crc kubenswrapper[4735]: I1008 12:56:36.037126 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:36 crc kubenswrapper[4735]: I1008 12:56:36.037144 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:36Z","lastTransitionTime":"2025-10-08T12:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:36 crc kubenswrapper[4735]: I1008 12:56:36.141192 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:36 crc kubenswrapper[4735]: I1008 12:56:36.141668 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:36 crc kubenswrapper[4735]: I1008 12:56:36.141807 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:36 crc kubenswrapper[4735]: I1008 12:56:36.141964 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:36 crc kubenswrapper[4735]: I1008 12:56:36.142113 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:36Z","lastTransitionTime":"2025-10-08T12:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:36 crc kubenswrapper[4735]: I1008 12:56:36.245376 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:36 crc kubenswrapper[4735]: I1008 12:56:36.245445 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:36 crc kubenswrapper[4735]: I1008 12:56:36.245471 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:36 crc kubenswrapper[4735]: I1008 12:56:36.245544 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:36 crc kubenswrapper[4735]: I1008 12:56:36.245567 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:36Z","lastTransitionTime":"2025-10-08T12:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:36 crc kubenswrapper[4735]: I1008 12:56:36.348636 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:36 crc kubenswrapper[4735]: I1008 12:56:36.348684 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:36 crc kubenswrapper[4735]: I1008 12:56:36.348701 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:36 crc kubenswrapper[4735]: I1008 12:56:36.348724 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:36 crc kubenswrapper[4735]: I1008 12:56:36.348748 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:36Z","lastTransitionTime":"2025-10-08T12:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:36 crc kubenswrapper[4735]: I1008 12:56:36.451954 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:36 crc kubenswrapper[4735]: I1008 12:56:36.452004 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:36 crc kubenswrapper[4735]: I1008 12:56:36.452021 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:36 crc kubenswrapper[4735]: I1008 12:56:36.452040 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:36 crc kubenswrapper[4735]: I1008 12:56:36.452051 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:36Z","lastTransitionTime":"2025-10-08T12:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:36 crc kubenswrapper[4735]: I1008 12:56:36.555437 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:36 crc kubenswrapper[4735]: I1008 12:56:36.555583 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:36 crc kubenswrapper[4735]: I1008 12:56:36.555618 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:36 crc kubenswrapper[4735]: I1008 12:56:36.555653 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:36 crc kubenswrapper[4735]: I1008 12:56:36.555675 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:36Z","lastTransitionTime":"2025-10-08T12:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:36 crc kubenswrapper[4735]: I1008 12:56:36.659110 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:36 crc kubenswrapper[4735]: I1008 12:56:36.659175 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:36 crc kubenswrapper[4735]: I1008 12:56:36.659192 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:36 crc kubenswrapper[4735]: I1008 12:56:36.659218 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:36 crc kubenswrapper[4735]: I1008 12:56:36.659236 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:36Z","lastTransitionTime":"2025-10-08T12:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:36 crc kubenswrapper[4735]: I1008 12:56:36.762356 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:36 crc kubenswrapper[4735]: I1008 12:56:36.762413 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:36 crc kubenswrapper[4735]: I1008 12:56:36.762430 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:36 crc kubenswrapper[4735]: I1008 12:56:36.762471 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:36 crc kubenswrapper[4735]: I1008 12:56:36.762519 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:36Z","lastTransitionTime":"2025-10-08T12:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:36 crc kubenswrapper[4735]: I1008 12:56:36.865337 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:36 crc kubenswrapper[4735]: I1008 12:56:36.865527 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:36 crc kubenswrapper[4735]: I1008 12:56:36.865549 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:36 crc kubenswrapper[4735]: I1008 12:56:36.865575 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:36 crc kubenswrapper[4735]: I1008 12:56:36.865595 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:36Z","lastTransitionTime":"2025-10-08T12:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:36 crc kubenswrapper[4735]: I1008 12:56:36.894534 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 12:56:36 crc kubenswrapper[4735]: I1008 12:56:36.894616 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:56:36 crc kubenswrapper[4735]: I1008 12:56:36.894745 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:56:36 crc kubenswrapper[4735]: E1008 12:56:36.894742 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 12:56:36 crc kubenswrapper[4735]: E1008 12:56:36.894887 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 12:56:36 crc kubenswrapper[4735]: E1008 12:56:36.895045 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 12:56:36 crc kubenswrapper[4735]: I1008 12:56:36.969039 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:36 crc kubenswrapper[4735]: I1008 12:56:36.969102 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:36 crc kubenswrapper[4735]: I1008 12:56:36.969114 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:36 crc kubenswrapper[4735]: I1008 12:56:36.969136 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:36 crc kubenswrapper[4735]: I1008 12:56:36.969152 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:36Z","lastTransitionTime":"2025-10-08T12:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.072963 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.073023 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.073040 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.073070 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.073089 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:37Z","lastTransitionTime":"2025-10-08T12:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.175763 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.175847 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.175869 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.175898 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.175916 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:37Z","lastTransitionTime":"2025-10-08T12:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.279427 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.279514 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.279532 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.279557 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.279575 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:37Z","lastTransitionTime":"2025-10-08T12:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.323431 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.323513 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.323534 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.323572 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.323609 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:37Z","lastTransitionTime":"2025-10-08T12:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:37 crc kubenswrapper[4735]: E1008 12:56:37.346892 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0d020dc1-103d-4051-9317-7ecb0eb982ec\\\",\\\"systemUUID\\\":\\\"14046600-b993-4087-9f7d-570e1efe09a8\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:37Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.352569 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.352627 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.352642 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.352667 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.352683 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:37Z","lastTransitionTime":"2025-10-08T12:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:37 crc kubenswrapper[4735]: E1008 12:56:37.365868 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0d020dc1-103d-4051-9317-7ecb0eb982ec\\\",\\\"systemUUID\\\":\\\"14046600-b993-4087-9f7d-570e1efe09a8\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:37Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.369606 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.369652 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.369663 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.369680 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.369694 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:37Z","lastTransitionTime":"2025-10-08T12:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:37 crc kubenswrapper[4735]: E1008 12:56:37.381128 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0d020dc1-103d-4051-9317-7ecb0eb982ec\\\",\\\"systemUUID\\\":\\\"14046600-b993-4087-9f7d-570e1efe09a8\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:37Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.384042 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.384083 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.384094 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.384111 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.384125 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:37Z","lastTransitionTime":"2025-10-08T12:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:37 crc kubenswrapper[4735]: E1008 12:56:37.394827 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0d020dc1-103d-4051-9317-7ecb0eb982ec\\\",\\\"systemUUID\\\":\\\"14046600-b993-4087-9f7d-570e1efe09a8\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:37Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.398160 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.398270 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.398333 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.398399 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.398516 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:37Z","lastTransitionTime":"2025-10-08T12:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:37 crc kubenswrapper[4735]: E1008 12:56:37.409236 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0d020dc1-103d-4051-9317-7ecb0eb982ec\\\",\\\"systemUUID\\\":\\\"14046600-b993-4087-9f7d-570e1efe09a8\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:37Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:37 crc kubenswrapper[4735]: E1008 12:56:37.409533 4735 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.410647 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.410669 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.410678 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.410689 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.410697 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:37Z","lastTransitionTime":"2025-10-08T12:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.513571 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.513612 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.513623 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.513639 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.513651 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:37Z","lastTransitionTime":"2025-10-08T12:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.616340 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.616761 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.616898 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.617030 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.617204 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:37Z","lastTransitionTime":"2025-10-08T12:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.720197 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.720253 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.720272 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.720297 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.720318 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:37Z","lastTransitionTime":"2025-10-08T12:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.822638 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.822680 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.822696 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.822713 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.822727 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:37Z","lastTransitionTime":"2025-10-08T12:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.894138 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mldxj" Oct 08 12:56:37 crc kubenswrapper[4735]: E1008 12:56:37.894318 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mldxj" podUID="771d7ef1-3d58-44eb-89b5-43a5da3e8091" Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.895117 4735 scope.go:117] "RemoveContainer" containerID="cee884b3ae862e7e9a856096614d0ff4e3386e9639fbaa8cdab587ea4f0a109e" Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.909393 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2dfd0474-f64a-4d3b-9990-df182d079b10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4321c24e125d489ad4c28ca906b5143f7017d14e0424cb188a1a031fd07de603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\"
:{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fefb46d9936f914f5667f539f69ddaaf85342cc341db62c3eda312aa07a78d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9qwgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:37Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 
12:56:37.925463 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.925513 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.925545 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.925564 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.925576 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:37Z","lastTransitionTime":"2025-10-08T12:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.928668 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653370-cd31-4487-8b63-50b145f55853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6da8f9945b0943287cecc7782253cd88835d3a5a823ae5e2107423b9bdc4e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fc9ee7ab9490b5462704528e6923ee420e0c9f3c0ed22e7041fedc8dfdfeb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://492da61345c2220eef06ac2e1979e5402ebb8deb0e3747bb3620ecccd4e83a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58cc5c76f1748baef586618dd49d3392757ef32e4f28a019979f84366e8879c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb1bc51dca9b3bf6c136fe146aa422fa40eb512385617607f682723053988f7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc88f8b567ef0f69f9df25a237c40b32bbd518680da75fe3cabd546713423bf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cee884b3ae862e7e9a856096614d0ff4e3386e9639fbaa8cdab587ea4f0a109e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cee884b3ae862e7e9a856096614d0ff4e3386e9639fbaa8cdab587ea4f0a109e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T12:56:23Z\\\",\\\"message\\\":\\\"ions generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e 
Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1008 12:56:23.077887 6141 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wckcs_openshift-ovn-kubernetes(c4653370-cd31-4487-8b63-50b145f55853)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1758c6a3e050fee720b9cae06332bc1d34000d2a61aa27470c483dc8757a5546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e39f9fb2f655437c3
aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wckcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:37Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.949681 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tsff9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0464374-65ce-46f6-9e7e-79018b92aff4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce56f02b4d3ab37c171c10113c79b74660cbdc570b55bd951baae65ef4276946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e01ac6683194ace06c37ecdbcda8584329034bc8fd0e6c532dcc6d7a036a72b3\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e01ac6683194ace06c37ecdbcda8584329034bc8fd0e6c532dcc6d7a036a72b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e086e1263cfb47daf4b8a395b1335f2e575f9d8c060983ba7090bbe27033d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48e086e1263cfb47daf4b8a395b1335f2e575f9d8c060983ba7090bbe27033d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aceb87b6bfa4020ca777a104430a285d886fdd8fb37b710f754eac6ad1b4304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aceb87b6bfa4020ca777a104430a285d886fdd8fb37b710f754eac6ad1b4304d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895f5
b597faf09af3dfd2688b3e42c2f1713b07e7d78c8d81e856dc3364727f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895f5b597faf09af3dfd2688b3e42c2f1713b07e7d78c8d81e856dc3364727f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://196d2d4913959ac507043de8431ea9294020c429fa0810c9985c77a23f52ad78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://196d2d4913959ac507043de8431ea9294020c429fa0810c9985c77a23f52ad78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:17Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b5ebc8f08568ab230481a7570dcdb0c6ed926f2216ef10de368d835811ef129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b5ebc8f08568ab230481a7570dcdb0c6ed926f2216ef10de368d835811ef129\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tsff9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:37Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.977906 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"844f0143-e7a8-48c9-ae8e-753c5c293f9d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e268579ace7ae7961414b2650b6b55471a9fd6574196c5fa2cefbf6979464b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e620e3a2b5b27ac9469d3ca045be90d19b04f969422d4c2a453650a7d06f0918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e7006d4b199ac7450c0e968adf9714db905f5378e279db304db244d0e5e566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72f60bc244cbde5bb
8033c24051f07fd6a2a168d97d89d3abcfaac69e6d75878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa627e3bfab38d53beee2a3201f6856009919d567108614667503273d11d0f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:37Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:37 crc kubenswrapper[4735]: I1008 12:56:37.993729 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d7f6479-86cc-40be-8b44-cb1c10627aa2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc2f1c56c7be63150786ec037b77127e2df6071efa7d736d402bca8a011ea32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e2c83b125f3789eeff231ab5416af4461e2443e6abaf28dac1cd5a8d5356287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://302c54532d7a142b5a9483334e74f1d96b0f07920d9c3fcb589bf2b52d1c7124\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6272fc7157705e3f1cd86a22b80f722dc7d152785bcc8029e131cd03568a19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a92ee1544b61eda7e486e807f489c1f07cfc70870a2413575ac443203670542\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T12:56:10Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 12:56:04.373062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 12:56:04.379553 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3954726057/tls.crt::/tmp/serving-cert-3954726057/tls.key\\\\\\\"\\\\nI1008 12:56:10.064922 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 12:56:10.069814 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 12:56:10.069844 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 12:56:10.069873 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 12:56:10.069878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 12:56:10.082030 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 12:56:10.082115 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082144 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 12:56:10.082156 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 12:56:10.082166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 12:56:10.082177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1008 12:56:10.082039 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1008 12:56:10.085350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f517b378c42cea209cb39c077a6162a3569671ed2c8f91ae7b0b2c411af287a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87
ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:37Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.005020 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:38Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.016914 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc6814f3ae991f4c4576eaafb7f325021fe4ef3387b84948efb093150a7974d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T12:56:38Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.028934 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.029197 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.029317 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.029402 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.029470 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:38Z","lastTransitionTime":"2025-10-08T12:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.031295 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d289b739-3ef2-4ee7-bb9a-b5cd8ab2b999\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f9030479e031ff1768e372e6da2a0806dc393db70aed31961e3a9c89a9e789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://852a4dea1c1
d8a40fc2ba89661ff17845ee58b62e45f4a7df9a8af505d361dd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e511b82acaea281ca468fbdce1eaa7ec3a5df59d91dc90fc500bebb08d51b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ce3b507093adcc6becc1a3b28e836174ee50ff4d3f583aa84badf4c1f3cb1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:38Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.051438 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b69f0d46aba2faee01906db32a5a6785e755b2eee94ccbaf1e438c63798f3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:38Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.067775 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:38Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.078580 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qdvjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28749736-d31e-40f2-84e8-a7839d8da376\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://680500d4224fd49593a4fa7294d1c49f059f330b2be33e3ef9ceccc6bd9b9a6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcv62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qdvjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:38Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.090943 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:38Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.101255 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28bb195cc66fff1263a0e92d27d57e97c81b63f873cb5e85d5271ac4fad40a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e88fb0202a08e60ff9cb6528688a4312ec5433491918326a5b46a5fb5e1de888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:38Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.111977 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tfdqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33cd5d891ac473da94cdc0e5c4bcfcbe2ef79a76d13f87ad077df153496ea515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qwbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tfdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:38Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.127776 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d8pqk" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d50458a5-b349-4384-9451-69dcb6f4e3d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29452c460590f3b977d7a9a86ec5bc26ed6d08054087cc55308bb8a0e6bcf966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kls7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ecd6dd31
1d753fc93739fd3f2da3af63485387305d10887aa1b5425b0c7dbd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kls7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d8pqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:38Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.131427 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.131454 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.131463 4735 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.131493 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.131507 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:38Z","lastTransitionTime":"2025-10-08T12:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.142958 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mldxj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"771d7ef1-3d58-44eb-89b5-43a5da3e8091\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk2wl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk2wl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mldxj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:38Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.155320 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jnmqc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed550b4-694a-4a43-99fc-f869224d6cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0a97189f5acb11a4baa7838d67360d6606941daf7520c583dd596cfab265203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T
12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qjms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jnmqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:38Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.209848 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wckcs_c4653370-cd31-4487-8b63-50b145f55853/ovnkube-controller/1.log" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.212829 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" event={"ID":"c4653370-cd31-4487-8b63-50b145f55853","Type":"ContainerStarted","Data":"591cd8e8896add07ee229791aae75932033be8070379a70d7a0c3f42176c6357"} Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.212972 4735 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.228065 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28bb195cc66fff1263a0e92d27d57e97c81b63f873cb5e85d5271ac4fad40a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e88fb0202a08e60ff9cb6528688a4312ec5433491918326a5b46a5fb5e1de888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:38Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.234008 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.234053 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.234063 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.234081 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.234093 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:38Z","lastTransitionTime":"2025-10-08T12:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.241891 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tfdqx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33cd5d891ac473da94cdc0e5c4bcfcbe2ef79a76d13f87ad077df153496ea515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qwbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:
56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tfdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:38Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.257235 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d8pqk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d50458a5-b349-4384-9451-69dcb6f4e3d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29452c460590f3b977d7a9a86ec5bc26ed6d08054087cc55308bb8a0e6bcf966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-
proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kls7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ecd6dd311d753fc93739fd3f2da3af63485387305d10887aa1b5425b0c7dbd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kls7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d8pqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:38Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.270951 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mldxj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"771d7ef1-3d58-44eb-89b5-43a5da3e8091\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk2wl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk2wl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mldxj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:38Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:38 crc 
kubenswrapper[4735]: I1008 12:56:38.287083 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:38Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.298936 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jnmqc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed550b4-694a-4a43-99fc-f869224d6cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0a97189f5acb11a4baa7838d67360d6606941daf7520c583dd596cfab265203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qjms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jnmqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:38Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.312526 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d7f6479-86cc-40be-8b44-cb1c10627aa2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc2f1c56c7be63150786ec037b77127e2df6071efa7d736d402bca8a011ea32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e2c83b125f3789eeff231ab5416af4461e2443e6abaf28dac1cd5a8d5356287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://302c54532d7a142b5a9483334e74f1d96b0f07920d9c3fcb589bf2b52d1c7124\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6272fc7157705e3f1cd86a22b80f722dc7d152785bcc8029e131cd03568a19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a92ee1544b61eda7e486e807f489c1f07cfc70870a2413575ac443203670542\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 12:56:04.373062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 12:56:04.379553 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3954726057/tls.crt::/tmp/serving-cert-3954726057/tls.key\\\\\\\"\\\\nI1008 12:56:10.064922 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 12:56:10.069814 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 12:56:10.069844 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 12:56:10.069873 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 12:56:10.069878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 12:56:10.082030 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 12:56:10.082115 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nW1008 12:56:10.082129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082144 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 12:56:10.082156 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 12:56:10.082166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 12:56:10.082177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1008 12:56:10.082039 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1008 12:56:10.085350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f517b378c42cea209cb39c077a6162a3569671ed2c8f91ae7b0b2c411af287a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\
"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:38Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.322740 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:38Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.333633 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc6814f3ae991f4c4576eaafb7f325021fe4ef3387b84948efb093150a7974d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T12:56:38Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.336197 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.336242 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.336259 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.336281 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.336303 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:38Z","lastTransitionTime":"2025-10-08T12:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.346339 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2dfd0474-f64a-4d3b-9990-df182d079b10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4321c24e125d489ad4c28ca906b5143f7017d14e0424cb188a1a031fd07de603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fefb46d9936f914f5667f539f69ddaaf85342cc341db62c3eda312aa07a78d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9qwgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:38Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.387639 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653370-cd31-4487-8b63-50b145f55853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6da8f9945b0943287cecc7782253cd88835d3a5a823ae5e2107423b9bdc4e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fc9ee7ab9490b5462704528e6923ee420e0c9f3c0ed22e7041fedc8dfdfeb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://492da61345c2220eef06ac2e1979e5402ebb8deb0e3747bb3620ecccd4e83a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58cc5c76f1748baef586618dd49d3392757ef32e4f28a019979f84366e8879c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb1bc51dca9b3bf6c136fe146aa422fa40eb512385617607f682723053988f7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc88f8b567ef0f69f9df25a237c40b32bbd518680da75fe3cabd546713423bf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://591cd8e8896add07ee229791aae75932033be8070379a70d7a0c3f42176c6357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cee884b3ae862e7e9a856096614d0ff4e3386e9639fbaa8cdab587ea4f0a109e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T12:56:23Z\\\",\\\"message\\\":\\\"ions generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e 
Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1008 12:56:23.077887 6141 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{eb8eef51-1a8d-43f9-ae2e-3b2cc00ded\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/ru
n/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1758c6a3e050fee720b9cae06332bc1d34000d2a61aa27470c483dc8757a5546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":
\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wckcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:38Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.409003 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tsff9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0464374-65ce-46f6-9e7e-79018b92aff4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce56f02b4d3ab37c171c10113c79b74660cbdc570b55bd951baae65ef4276946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e01ac6683194ace06c37ecdbcda8584329034bc8fd0e6c532dcc6d7a036a72b3\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e01ac6683194ace06c37ecdbcda8584329034bc8fd0e6c532dcc6d7a036a72b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e086e1263cfb47daf4b8a395b1335f2e575f9d8c060983ba7090bbe27033d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48e086e1263cfb47daf4b8a395b1335f2e575f9d8c060983ba7090bbe27033d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aceb87b6bfa4020ca777a104430a285d886fdd8fb37b710f754eac6ad1b4304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aceb87b6bfa4020ca777a104430a285d886fdd8fb37b710f754eac6ad1b4304d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895f5
b597faf09af3dfd2688b3e42c2f1713b07e7d78c8d81e856dc3364727f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895f5b597faf09af3dfd2688b3e42c2f1713b07e7d78c8d81e856dc3364727f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://196d2d4913959ac507043de8431ea9294020c429fa0810c9985c77a23f52ad78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://196d2d4913959ac507043de8431ea9294020c429fa0810c9985c77a23f52ad78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:17Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b5ebc8f08568ab230481a7570dcdb0c6ed926f2216ef10de368d835811ef129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b5ebc8f08568ab230481a7570dcdb0c6ed926f2216ef10de368d835811ef129\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tsff9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:38Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.439156 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.439209 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.439225 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.439246 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.439261 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:38Z","lastTransitionTime":"2025-10-08T12:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.442719 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"844f0143-e7a8-48c9-ae8e-753c5c293f9d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e268579ace7ae7961414b2650b6b55471a9fd6574196c5fa2cefbf6979464b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e620e3a2b5b27ac9469d3ca045be90d19b04f969422d4c2a453650a7d06f0918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e7006d4b199ac7450c0e968adf9714db905f5378e279db304db244d0e5e566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72f60bc244cbde5bb8033c24051f07fd6a2a168d97d89d3abcfaac69e6d75878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa627e3bfab38d53beee2a3201f6856009919d567108614667503273d11d0f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-08T12:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:38Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.467699 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d289b739-3ef2-4ee7-bb9a-b5cd8ab2b999\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f9030479e031ff1768e372e6da2a0806dc393db70aed31961e3a9c89a9e789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://852a4dea1c1d8a40fc2ba89661ff17845ee58b62e45f4a7df9a8af505d361dd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e511b82acaea281ca468fbdce1eaa7ec3a5df59d91dc90fc500bebb08d51b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ce3b507093adcc6becc1a3b28e836174ee50ff4d3f583aa84badf4c1f3cb1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:38Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.483221 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b69f0d46aba2faee01906db32a5a6785e755b2eee94ccbaf1e438c63798f3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:38Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.502467 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:38Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.523378 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qdvjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28749736-d31e-40f2-84e8-a7839d8da376\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://680500d4224fd49593a4fa7294d1c49f059f330b2be33e3ef9ceccc6bd9b9a6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcv62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qdvjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:38Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.542103 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.542139 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.542148 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.542161 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.542170 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:38Z","lastTransitionTime":"2025-10-08T12:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.646983 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.647033 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.647148 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.647228 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.647246 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:38Z","lastTransitionTime":"2025-10-08T12:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.724464 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.749978 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.750031 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.750044 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.750065 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.750079 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:38Z","lastTransitionTime":"2025-10-08T12:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.852935 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.852967 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.852977 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.852991 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.853000 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:38Z","lastTransitionTime":"2025-10-08T12:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.894011 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.894065 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:56:38 crc kubenswrapper[4735]: E1008 12:56:38.894156 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.894011 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 12:56:38 crc kubenswrapper[4735]: E1008 12:56:38.894403 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 12:56:38 crc kubenswrapper[4735]: E1008 12:56:38.894458 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.956235 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.956313 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.956334 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.956363 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:38 crc kubenswrapper[4735]: I1008 12:56:38.956383 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:38Z","lastTransitionTime":"2025-10-08T12:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.059514 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.059604 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.059628 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.059653 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.059671 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:39Z","lastTransitionTime":"2025-10-08T12:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.163409 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.163463 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.163506 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.163528 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.163546 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:39Z","lastTransitionTime":"2025-10-08T12:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.219973 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wckcs_c4653370-cd31-4487-8b63-50b145f55853/ovnkube-controller/2.log" Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.221231 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wckcs_c4653370-cd31-4487-8b63-50b145f55853/ovnkube-controller/1.log" Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.225392 4735 generic.go:334] "Generic (PLEG): container finished" podID="c4653370-cd31-4487-8b63-50b145f55853" containerID="591cd8e8896add07ee229791aae75932033be8070379a70d7a0c3f42176c6357" exitCode=1 Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.225437 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" event={"ID":"c4653370-cd31-4487-8b63-50b145f55853","Type":"ContainerDied","Data":"591cd8e8896add07ee229791aae75932033be8070379a70d7a0c3f42176c6357"} Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.225567 4735 scope.go:117] "RemoveContainer" containerID="cee884b3ae862e7e9a856096614d0ff4e3386e9639fbaa8cdab587ea4f0a109e" Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.226976 4735 scope.go:117] "RemoveContainer" containerID="591cd8e8896add07ee229791aae75932033be8070379a70d7a0c3f42176c6357" Oct 08 12:56:39 crc kubenswrapper[4735]: E1008 12:56:39.227318 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-wckcs_openshift-ovn-kubernetes(c4653370-cd31-4487-8b63-50b145f55853)\"" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" podUID="c4653370-cd31-4487-8b63-50b145f55853" Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.253273 4735 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:39Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.266634 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.266695 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.266712 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.266735 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.266752 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:39Z","lastTransitionTime":"2025-10-08T12:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.274055 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28bb195cc66fff1263a0e92d27d57e97c81b63f873cb5e85d5271ac4fad40a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://e88fb0202a08e60ff9cb6528688a4312ec5433491918326a5b46a5fb5e1de888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:39Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.295767 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tfdqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33cd5d891ac473da94cdc0e5c4bcfcbe2ef79a76d13f87ad077df153496ea515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qwbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tfdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:39Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.310068 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d8pqk" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d50458a5-b349-4384-9451-69dcb6f4e3d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29452c460590f3b977d7a9a86ec5bc26ed6d08054087cc55308bb8a0e6bcf966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kls7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ecd6dd31
1d753fc93739fd3f2da3af63485387305d10887aa1b5425b0c7dbd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kls7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d8pqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:39Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.321780 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mldxj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"771d7ef1-3d58-44eb-89b5-43a5da3e8091\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk2wl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk2wl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mldxj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:39Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:39 crc 
kubenswrapper[4735]: I1008 12:56:39.333303 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jnmqc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed550b4-694a-4a43-99fc-f869224d6cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0a97189f5acb11a4baa7838d67360d6606941daf7520c583dd596cfab265203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qjms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jnmqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:39Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.357254 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653370-cd31-4487-8b63-50b145f55853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6da8f9945b0943287cecc7782253cd88835d3a5a823ae5e2107423b9bdc4e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fc9ee7ab9490b5462704528e6923ee420e0c9f3c0ed22e7041fedc8dfdfeb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://492da61345c2220eef06ac2e1979e5402ebb8deb0e3747bb3620ecccd4e83a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58cc5c76f1748baef586618dd49d3392757ef32e4f28a019979f84366e8879c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb1bc51dca9b3bf6c136fe146aa422fa40eb512385617607f682723053988f7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc88f8b567ef0f69f9df25a237c40b32bbd518680da75fe3cabd546713423bf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://591cd8e8896add07ee229791aae75932033be8070379a70d7a0c3f42176c6357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cee884b3ae862e7e9a856096614d0ff4e3386e9639fbaa8cdab587ea4f0a109e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T12:56:23Z\\\",\\\"message\\\":\\\"ions generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e 
Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1008 12:56:23.077887 6141 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://591cd8e8896add07ee229791aae75932033be8070379a70d7a0c3f42176c6357\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T12:56:38Z\\\",\\\"message\\\":\\\"ere column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1008 12:56:38.817327 6363 services_controller.go:454] Service openshift-authentication-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, 
making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1008 12:56:38.818364 6363 ovnkube_controller.go:900] Cache entry expected pod with UID \\\\\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\\\\\" but failed to find it\\\\nI1008 12:56:38.818678 6363 ovnkube_controller.go:804] Add Logical Switch Port event expected pod with UID \\\\\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\\\\\" in cache\\\\nI1008 12:56:38.818681 6363 port_cache.go:96] port-cache(openshift-network-console_networking-console-plugin-85b44fc459-gdk6g): added port \\\\u0026{name:openshift-network-console_networking-console-plugin-85b44fc459-gdk6g uuid:c94130be-172c-477c-88c4-40cc7eba30fe logicalSwitch:crc ips:[0xc008a013e0] mac:[10 88 10 217 0 92] expires:{wall:0 ext:0 loc:\\\\u003cnil\\\\u003e}} with IP: [10.217.0.92/23] and MAC: 0a:58:0a:d9:00:5c\\\\nI1008 12:56:38.818690 6363 ovnkube_controller.go:900] Cache entry expected pod with UID \\\\\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\\\\\" but failed to find it\\\\nI1008 12:56:38.818694 6363 pods.go:252] [openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] 
addLogi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1758c6a3e050fee720b9cae06332bc1d34000d2a61aa27470c483dc8757a5546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0
d7d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wckcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:39Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.369032 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.369071 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.369082 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.369098 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.369109 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:39Z","lastTransitionTime":"2025-10-08T12:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.374426 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tsff9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0464374-65ce-46f6-9e7e-79018b92aff4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce56f02b4d3ab37c171c10113c79b74660cbdc570b55bd951baae65ef4276946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e01ac6683194ace06c37ecdbcda8584329034bc8fd0e6c532dcc6d7a036a72b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e01ac6683194ace06c37ecdbcda8584329034bc8fd0e6c532dcc6d7a036a72b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e086e1263cfb47daf4b8a395b1335f2e575f9d8c060983ba7090bbe27033d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48e086e1263cfb47daf4b8a395b1335f2e575f9d8c060983ba7090bbe27033d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aceb87b6bfa4020ca777a104430a285d886fdd8fb37b710f754eac6ad1b4304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aceb87b6bfa4020ca777a104430a285d886fdd8fb37b710f754eac6ad1b4304d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"n
ame\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895f5b597faf09af3dfd2688b3e42c2f1713b07e7d78c8d81e856dc3364727f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895f5b597faf09af3dfd2688b3e42c2f1713b07e7d78c8d81e856dc3364727f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://196d2d4913959ac507043de8431ea9294020c429fa0810c9985c77a23f52ad78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3
bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://196d2d4913959ac507043de8431ea9294020c429fa0810c9985c77a23f52ad78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b5ebc8f08568ab230481a7570dcdb0c6ed926f2216ef10de368d835811ef129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b5ebc8f08568ab230481a7570dcdb0c6ed926f2216ef10de368d835811ef129\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\
",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tsff9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:39Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.395271 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"844f0143-e7a8-48c9-ae8e-753c5c293f9d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e268579ace7ae7961414b2650b6b55471a9fd6574196c5fa2cefbf6979464b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e620e3a2b5b27ac9469d3ca045be90d19b04f969422d4c2a453650a7d06f0918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e7006d4b199ac7450c0e968adf9714db905f5378e279db304db244d0e5e566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ready
z\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72f60bc244cbde5bb8033c24051f07fd6a2a168d97d89d3abcfaac69e6d75878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa627e3bfab38d53beee2a3201f6856009919d567108614667503273d11d0f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPat
h\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"image\\\":\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:39Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.411910 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d7f6479-86cc-40be-8b44-cb1c10627aa2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc2f1c56c7be63150786ec037b77127e2df6071efa7d736d402bca8a011ea32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e2c83b125f3789eeff231ab5416af4461e2443e6abaf28dac1cd5a8d5356287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://302c54532d7a142b5a9483334e74f1d96b0f07920d9c3fcb589bf2b52d1c7124\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6272fc7157705e3f1cd86a22b80f722dc7d152785bcc8029e131cd03568a19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a92ee1544b61eda7e486e807f489c1f07cfc70870a2413575ac443203670542\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T12:56:10Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 12:56:04.373062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 12:56:04.379553 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3954726057/tls.crt::/tmp/serving-cert-3954726057/tls.key\\\\\\\"\\\\nI1008 12:56:10.064922 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 12:56:10.069814 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 12:56:10.069844 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 12:56:10.069873 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 12:56:10.069878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 12:56:10.082030 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 12:56:10.082115 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082144 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 12:56:10.082156 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 12:56:10.082166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 12:56:10.082177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1008 12:56:10.082039 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1008 12:56:10.085350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f517b378c42cea209cb39c077a6162a3569671ed2c8f91ae7b0b2c411af287a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87
ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:39Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.424660 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:39Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.436404 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc6814f3ae991f4c4576eaafb7f325021fe4ef3387b84948efb093150a7974d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T12:56:39Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.448360 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2dfd0474-f64a-4d3b-9990-df182d079b10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4321c24e125d489ad4c28ca906b5143f7017d14e0424cb188a1a031fd07de603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fefb46d9936f914f5667f539f69ddaaf85342cc341db62c3eda312aa07a78d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9qwgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:39Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.463697 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d289b739-3ef2-4ee7-bb9a-b5cd8ab2b999\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f9030479e031ff1768e372e6da2a0806dc393db70aed31961e3a9c89a9e789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://852a4dea1c1d8a40fc2ba89661ff17845ee58b62e45f4a7df9a8af505d361dd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e511b82acaea281ca468fbdce1eaa7ec3a5df59d91dc90fc500bebb08d51b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ce3b507093adcc6becc1a3b28e836174ee50ff4d3f583aa84badf4c1f3cb1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:39Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.471034 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.471078 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.471092 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.471110 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.471123 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:39Z","lastTransitionTime":"2025-10-08T12:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.477502 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b69f0d46aba2faee01906db32a5a6785e755b2eee94ccbaf1e438c63798f3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:39Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.490805 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:39Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.502021 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qdvjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28749736-d31e-40f2-84e8-a7839d8da376\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://680500d4224fd49593a4fa7294d1c49f059f330b2be33e3ef9ceccc6bd9b9a6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcv62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qdvjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:39Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.573962 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.574016 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.574028 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.574047 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.574060 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:39Z","lastTransitionTime":"2025-10-08T12:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.676439 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.676515 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.676527 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.676549 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.676565 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:39Z","lastTransitionTime":"2025-10-08T12:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.779610 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.779673 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.779686 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.779710 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.779724 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:39Z","lastTransitionTime":"2025-10-08T12:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.867052 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.879268 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.883005 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.883077 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.883099 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.883124 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.883143 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:39Z","lastTransitionTime":"2025-10-08T12:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.890631 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jnmqc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed550b4-694a-4a43-99fc-f869224d6cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0a97189f5acb11a4baa7838d67360d6606941daf7520c583dd596cfab265203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qjms\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jnmqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:39Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.894046 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mldxj" Oct 08 12:56:39 crc kubenswrapper[4735]: E1008 12:56:39.894340 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mldxj" podUID="771d7ef1-3d58-44eb-89b5-43a5da3e8091" Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.927822 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"844f0143-e7a8-48c9-ae8e-753c5c293f9d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e268579ace7ae7961414b2650b6b55471a9fd6574196c5fa2cefbf6979464b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e620e3a2b5b27ac9469d3ca045be90d19b04f969422d4c2a453650a7d06f0918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e7006d4b199ac7450c0e968adf9714db905f5378e279db304db244d0e5e566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72f60bc244cbde5bb8033c24051f07fd6a2a168d97d89d3abcfaac69e6d75878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a
93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa627e3bfab38d53beee2a3201f6856009919d567108614667503273d11d0f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:54Z
\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:39Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.947729 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d7f6479-86cc-40be-8b44-cb1c10627aa2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc2f1c56c7be63150786ec037b77127e2df6071efa7d736d402bca8a011ea32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e2c83b125f3789eeff231ab5416af4461e2443e6abaf28dac1cd5a8d5356287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://302c54532d7a142b5a9483334e74f1d96b0f07920d9c3fcb589bf2b52d1c7124\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6272fc7157705e3f1cd86a22b80f722dc7d152785bcc8029e131cd03568a19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a92ee1544b61eda7e486e807f489c1f07cfc70870a2413575ac443203670542\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T12:56:10Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 12:56:04.373062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 12:56:04.379553 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3954726057/tls.crt::/tmp/serving-cert-3954726057/tls.key\\\\\\\"\\\\nI1008 12:56:10.064922 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 12:56:10.069814 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 12:56:10.069844 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 12:56:10.069873 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 12:56:10.069878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 12:56:10.082030 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 12:56:10.082115 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082144 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 12:56:10.082156 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 12:56:10.082166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 12:56:10.082177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1008 12:56:10.082039 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1008 12:56:10.085350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f517b378c42cea209cb39c077a6162a3569671ed2c8f91ae7b0b2c411af287a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87
ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:39Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.960508 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:39Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.972224 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc6814f3ae991f4c4576eaafb7f325021fe4ef3387b84948efb093150a7974d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T12:56:39Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.984050 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2dfd0474-f64a-4d3b-9990-df182d079b10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4321c24e125d489ad4c28ca906b5143f7017d14e0424cb188a1a031fd07de603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fefb46d9936f914f5667f539f69ddaaf85342cc341db62c3eda312aa07a78d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9qwgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:39Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.985366 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 
12:56:39.985422 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.985439 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.985462 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:39 crc kubenswrapper[4735]: I1008 12:56:39.985504 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:39Z","lastTransitionTime":"2025-10-08T12:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.009339 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653370-cd31-4487-8b63-50b145f55853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6da8f9945b0943287cecc7782253cd88835d3a5a823ae5e2107423b9bdc4e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fc9ee7ab9490b5462704528e6923ee420e0c9f3c0ed22e7041fedc8dfdfeb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://492da61345c2220eef06ac2e1979e5402ebb8deb0e3747bb3620ecccd4e83a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58cc5c76f1748baef586618dd49d3392757ef32e4f28a019979f84366e8879c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb1bc51dca9b3bf6c136fe146aa422fa40eb512385617607f682723053988f7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc88f8b567ef0f69f9df25a237c40b32bbd518680da75fe3cabd546713423bf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://591cd8e8896add07ee229791aae75932033be8070379a70d7a0c3f42176c6357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cee884b3ae862e7e9a856096614d0ff4e3386e9639fbaa8cdab587ea4f0a109e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T12:56:23Z\\\",\\\"message\\\":\\\"ions generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] 
Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1008 12:56:23.077887 6141 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://591cd8e8896add07ee229791aae75932033be8070379a70d7a0c3f42176c6357\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T12:56:38Z\\\",\\\"message\\\":\\\"ere column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: 
Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1008 12:56:38.817327 6363 services_controller.go:454] Service openshift-authentication-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1008 12:56:38.818364 6363 ovnkube_controller.go:900] Cache entry expected pod with UID \\\\\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\\\\\" but failed to find it\\\\nI1008 12:56:38.818678 6363 ovnkube_controller.go:804] Add Logical Switch Port event expected pod with UID \\\\\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\\\\\" in cache\\\\nI1008 12:56:38.818681 6363 port_cache.go:96] port-cache(openshift-network-console_networking-console-plugin-85b44fc459-gdk6g): added port \\\\u0026{name:openshift-network-console_networking-console-plugin-85b44fc459-gdk6g uuid:c94130be-172c-477c-88c4-40cc7eba30fe logicalSwitch:crc ips:[0xc008a013e0] mac:[10 88 10 217 0 92] expires:{wall:0 ext:0 loc:\\\\u003cnil\\\\u003e}} with IP: [10.217.0.92/23] and MAC: 0a:58:0a:d9:00:5c\\\\nI1008 12:56:38.818690 6363 ovnkube_controller.go:900] Cache entry expected pod with UID \\\\\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\\\\\" but failed to find it\\\\nI1008 12:56:38.818694 6363 pods.go:252] [openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] 
addLogi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1758c6a3e050fee720b9cae06332bc1d34000d2a61aa27470c483dc8757a5546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0
d7d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wckcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:40Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.026814 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tsff9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0464374-65ce-46f6-9e7e-79018b92aff4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce56f02b4d3ab37c171c10113c79b74660cbdc570b55bd951baae65ef4276946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e01ac6683194ace06c37ecdbcda8584329034bc8fd0e6c532dcc6d7a036a72b3\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e01ac6683194ace06c37ecdbcda8584329034bc8fd0e6c532dcc6d7a036a72b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e086e1263cfb47daf4b8a395b1335f2e575f9d8c060983ba7090bbe27033d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48e086e1263cfb47daf4b8a395b1335f2e575f9d8c060983ba7090bbe27033d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aceb87b6bfa4020ca777a104430a285d886fdd8fb37b710f754eac6ad1b4304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aceb87b6bfa4020ca777a104430a285d886fdd8fb37b710f754eac6ad1b4304d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895f5
b597faf09af3dfd2688b3e42c2f1713b07e7d78c8d81e856dc3364727f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895f5b597faf09af3dfd2688b3e42c2f1713b07e7d78c8d81e856dc3364727f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://196d2d4913959ac507043de8431ea9294020c429fa0810c9985c77a23f52ad78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://196d2d4913959ac507043de8431ea9294020c429fa0810c9985c77a23f52ad78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:17Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b5ebc8f08568ab230481a7570dcdb0c6ed926f2216ef10de368d835811ef129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b5ebc8f08568ab230481a7570dcdb0c6ed926f2216ef10de368d835811ef129\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tsff9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:40Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.045694 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d289b739-3ef2-4ee7-bb9a-b5cd8ab2b999\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f9030479e031ff1768e372e6da2a0806dc393db70aed31961e3a9c89a9e789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://852a4dea1c1d8a40fc2ba89661ff17845ee58b62e45f4a7df9a8af505d361dd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e511b82acaea281ca468fbdce1eaa7ec3a5df59d91dc90fc500bebb08d51b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ce3b507093adcc6becc1a3b28e836174ee50ff4d3f583aa84badf4c1f3cb1a4\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:40Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.060603 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b69f0d46aba2faee01906db32a5a6785e755b2eee94ccbaf1e438c63798f3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:40Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.073136 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:40Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.084355 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qdvjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28749736-d31e-40f2-84e8-a7839d8da376\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://680500d4224fd49593a4fa7294d1c49f059f330b2be33e3ef9ceccc6bd9b9a6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcv62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qdvjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:40Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.088294 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.088328 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.088339 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.088358 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.088370 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:40Z","lastTransitionTime":"2025-10-08T12:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.098004 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:40Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.111442 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28bb195cc66fff1263a0e92d27d57e97c81b63f873cb5e85d5271ac4fad40a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e88fb0202a08e60ff9cb6528688a4312ec5433491918326a5b46a5fb5e1de888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:40Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.123602 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tfdqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33cd5d891ac473da94cdc0e5c4bcfcbe2ef79a76d13f87ad077df153496ea515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qwbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tfdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:40Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.134917 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d8pqk" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d50458a5-b349-4384-9451-69dcb6f4e3d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29452c460590f3b977d7a9a86ec5bc26ed6d08054087cc55308bb8a0e6bcf966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kls7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ecd6dd31
1d753fc93739fd3f2da3af63485387305d10887aa1b5425b0c7dbd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kls7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d8pqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:40Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.148680 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mldxj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"771d7ef1-3d58-44eb-89b5-43a5da3e8091\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk2wl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk2wl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mldxj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:40Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:40 crc 
kubenswrapper[4735]: I1008 12:56:40.191114 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.191171 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.191187 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.191210 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.191230 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:40Z","lastTransitionTime":"2025-10-08T12:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.231856 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wckcs_c4653370-cd31-4487-8b63-50b145f55853/ovnkube-controller/2.log" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.237156 4735 scope.go:117] "RemoveContainer" containerID="591cd8e8896add07ee229791aae75932033be8070379a70d7a0c3f42176c6357" Oct 08 12:56:40 crc kubenswrapper[4735]: E1008 12:56:40.237439 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-wckcs_openshift-ovn-kubernetes(c4653370-cd31-4487-8b63-50b145f55853)\"" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" podUID="c4653370-cd31-4487-8b63-50b145f55853" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.254719 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:40Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.272066 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28bb195cc66fff1263a0e92d27d57e97c81b63f873cb5e85d5271ac4fad40a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e88fb0202a08e60ff9cb6528688a4312ec5433491918326a5b46a5fb5e1de888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:40Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.287438 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tfdqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33cd5d891ac473da94cdc0e5c4bcfcbe2ef79a76d13f87ad077df153496ea515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qwbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tfdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:40Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.293381 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:40 crc 
kubenswrapper[4735]: I1008 12:56:40.293424 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.293436 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.293457 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.293488 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:40Z","lastTransitionTime":"2025-10-08T12:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.304116 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d8pqk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d50458a5-b349-4384-9451-69dcb6f4e3d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29452c460590f3b977d7a9a86ec5bc26ed6d08054087cc55308bb8a0e6bcf966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kls7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ecd6dd311d753fc93739fd3f2da3af634853
87305d10887aa1b5425b0c7dbd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kls7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d8pqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:40Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.341803 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mldxj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"771d7ef1-3d58-44eb-89b5-43a5da3e8091\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk2wl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk2wl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mldxj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:40Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:40 crc 
kubenswrapper[4735]: I1008 12:56:40.356612 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jnmqc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed550b4-694a-4a43-99fc-f869224d6cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0a97189f5acb11a4baa7838d67360d6606941daf7520c583dd596cfab265203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qjms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jnmqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:40Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.376252 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"844f0143-e7a8-48c9-ae8e-753c5c293f9d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e268579ace7ae7961414b2650b6b55471a9fd6574196c5fa2cefbf6979464b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e620e3a2b5b27ac9469d3ca045be90d19b04f969422d4c2a453650a7d06f0918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e7006d4b199ac7450c0e968adf9714db905f5378e279db304db244d0e5e566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restar
tCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72f60bc244cbde5bb8033c24051f07fd6a2a168d97d89d3abcfaac69e6d75878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa627e3bfab38d53beee2a3201f6856009919d567108614667503273d11d0f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"na
me\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:40Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.390424 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d7f6479-86cc-40be-8b44-cb1c10627aa2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc2f1c56c7be63150786ec037b77127e2df6071efa7d736d402bca8a011ea32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e2c83b125f3789eeff231ab5416af4461e2443e6abaf28dac1cd5a8d5356287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://302c54532d7a142b5a9483334e74f1d96b0f07920d9c3fcb589bf2b52d1c7124\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6272fc7157705e3f1cd86a22b80f722dc7d152785bcc8029e131cd03568a19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a92ee1544b61eda7e486e807f489c1f07cfc70870a2413575ac443203670542\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T12:56:10Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 12:56:04.373062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 12:56:04.379553 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3954726057/tls.crt::/tmp/serving-cert-3954726057/tls.key\\\\\\\"\\\\nI1008 12:56:10.064922 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 12:56:10.069814 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 12:56:10.069844 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 12:56:10.069873 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 12:56:10.069878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 12:56:10.082030 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 12:56:10.082115 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082144 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 12:56:10.082156 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 12:56:10.082166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 12:56:10.082177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1008 12:56:10.082039 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1008 12:56:10.085350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f517b378c42cea209cb39c077a6162a3569671ed2c8f91ae7b0b2c411af287a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87
ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:40Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.395936 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.395973 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.395985 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.396003 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.396014 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:40Z","lastTransitionTime":"2025-10-08T12:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.413198 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aed4e152-1619-49cb-8449-bc39785f556b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af1083a706e809f20ad64584fcbdfee479bb7014566eab93c89d94c67e013c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2326bc324b30300d3c85d19e5434de
503b355b4e29fb72a13c8ab8d4884899c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9cf1a23dd55079fc635d7547125bc23b35d999c371c0bc3dd9e7e4a7c0c9fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e77c463c65fc07f8e0c2c8dd054b9dd96103da7426bbf1cd6e002436018231b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e77c463c65fc07f8e0c2c8dd054b9dd96103da7426bbf1cd6e002436018231b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:40Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.424512 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:40Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.436920 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc6814f3ae991f4c4576eaafb7f325021fe4ef3387b84948efb093150a7974d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T12:56:40Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.448606 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2dfd0474-f64a-4d3b-9990-df182d079b10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4321c24e125d489ad4c28ca906b5143f7017d14e0424cb188a1a031fd07de603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fefb46d9936f914f5667f539f69ddaaf85342cc341db62c3eda312aa07a78d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9qwgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:40Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.466518 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653370-cd31-4487-8b63-50b145f55853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6da8f9945b0943287cecc7782253cd88835d3a5a823ae5e2107423b9bdc4e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fc9ee7ab9490b5462704528e6923ee420e0c9f3c0ed22e7041fedc8dfdfeb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://492da61345c2220eef06ac2e1979e5402ebb8deb0e3747bb3620ecccd4e83a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58cc5c76f1748baef586618dd49d3392757ef32e4f28a019979f84366e8879c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb1bc51dca9b3bf6c136fe146aa422fa40eb512385617607f682723053988f7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc88f8b567ef0f69f9df25a237c40b32bbd518680da75fe3cabd546713423bf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://591cd8e8896add07ee229791aae75932033be8070379a70d7a0c3f42176c6357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://591cd8e8896add07ee229791aae75932033be8070379a70d7a0c3f42176c6357\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T12:56:38Z\\\",\\\"message\\\":\\\"ere column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1008 12:56:38.817327 6363 services_controller.go:454] Service openshift-authentication-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 
template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1008 12:56:38.818364 6363 ovnkube_controller.go:900] Cache entry expected pod with UID \\\\\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\\\\\" but failed to find it\\\\nI1008 12:56:38.818678 6363 ovnkube_controller.go:804] Add Logical Switch Port event expected pod with UID \\\\\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\\\\\" in cache\\\\nI1008 12:56:38.818681 6363 port_cache.go:96] port-cache(openshift-network-console_networking-console-plugin-85b44fc459-gdk6g): added port \\\\u0026{name:openshift-network-console_networking-console-plugin-85b44fc459-gdk6g uuid:c94130be-172c-477c-88c4-40cc7eba30fe logicalSwitch:crc ips:[0xc008a013e0] mac:[10 88 10 217 0 92] expires:{wall:0 ext:0 loc:\\\\u003cnil\\\\u003e}} with IP: [10.217.0.92/23] and MAC: 0a:58:0a:d9:00:5c\\\\nI1008 12:56:38.818690 6363 ovnkube_controller.go:900] Cache entry expected pod with UID \\\\\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\\\\\" but failed to find it\\\\nI1008 12:56:38.818694 6363 pods.go:252] [openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] addLogi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wckcs_openshift-ovn-kubernetes(c4653370-cd31-4487-8b63-50b145f55853)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1758c6a3e050fee720b9cae06332bc1d34000d2a61aa27470c483dc8757a5546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e39f9fb2f655437c3
aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wckcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:40Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.486004 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tsff9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0464374-65ce-46f6-9e7e-79018b92aff4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce56f02b4d3ab37c171c10113c79b74660cbdc570b55bd951baae65ef4276946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e01ac6683194ace06c37ecdbcda8584329034bc8fd0e6c532dcc6d7a036a72b3\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e01ac6683194ace06c37ecdbcda8584329034bc8fd0e6c532dcc6d7a036a72b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e086e1263cfb47daf4b8a395b1335f2e575f9d8c060983ba7090bbe27033d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48e086e1263cfb47daf4b8a395b1335f2e575f9d8c060983ba7090bbe27033d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aceb87b6bfa4020ca777a104430a285d886fdd8fb37b710f754eac6ad1b4304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aceb87b6bfa4020ca777a104430a285d886fdd8fb37b710f754eac6ad1b4304d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895f5
b597faf09af3dfd2688b3e42c2f1713b07e7d78c8d81e856dc3364727f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895f5b597faf09af3dfd2688b3e42c2f1713b07e7d78c8d81e856dc3364727f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://196d2d4913959ac507043de8431ea9294020c429fa0810c9985c77a23f52ad78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://196d2d4913959ac507043de8431ea9294020c429fa0810c9985c77a23f52ad78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:17Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b5ebc8f08568ab230481a7570dcdb0c6ed926f2216ef10de368d835811ef129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b5ebc8f08568ab230481a7570dcdb0c6ed926f2216ef10de368d835811ef129\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tsff9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:40Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.498517 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.498558 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.498570 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.498627 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.498640 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:40Z","lastTransitionTime":"2025-10-08T12:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.502712 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d289b739-3ef2-4ee7-bb9a-b5cd8ab2b999\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f9030479e031ff1768e372e6da2a0806dc393db70aed31961e3a9c89a9e789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://852a4dea1c1
d8a40fc2ba89661ff17845ee58b62e45f4a7df9a8af505d361dd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e511b82acaea281ca468fbdce1eaa7ec3a5df59d91dc90fc500bebb08d51b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ce3b507093adcc6becc1a3b28e836174ee50ff4d3f583aa84badf4c1f3cb1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:40Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.515022 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b69f0d46aba2faee01906db32a5a6785e755b2eee94ccbaf1e438c63798f3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:40Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.526864 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:40Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.537277 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qdvjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28749736-d31e-40f2-84e8-a7839d8da376\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://680500d4224fd49593a4fa7294d1c49f059f330b2be33e3ef9ceccc6bd9b9a6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcv62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qdvjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:40Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.600811 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.600880 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.600901 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.600925 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.600943 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:40Z","lastTransitionTime":"2025-10-08T12:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.704518 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.704570 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.704579 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.704593 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.704605 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:40Z","lastTransitionTime":"2025-10-08T12:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.807770 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.807814 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.807823 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.807841 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.807852 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:40Z","lastTransitionTime":"2025-10-08T12:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.894139 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:56:40 crc kubenswrapper[4735]: E1008 12:56:40.894338 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.894176 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.894399 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:56:40 crc kubenswrapper[4735]: E1008 12:56:40.894502 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 12:56:40 crc kubenswrapper[4735]: E1008 12:56:40.894627 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.910400 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.910457 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.910514 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.910550 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.910575 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:40Z","lastTransitionTime":"2025-10-08T12:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.917858 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:40Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.937589 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28bb195cc66fff1263a0e92d27d57e97c81b63f873cb5e85d5271ac4fad40a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e88fb0202a08e60ff9cb6528688a4312ec5433491918326a5b46a5fb5e1de888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:40Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.951379 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tfdqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33cd5d891ac473da94cdc0e5c4bcfcbe2ef79a76d13f87ad077df153496ea515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qwbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tfdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:40Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.962370 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d8pqk" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d50458a5-b349-4384-9451-69dcb6f4e3d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29452c460590f3b977d7a9a86ec5bc26ed6d08054087cc55308bb8a0e6bcf966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kls7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ecd6dd31
1d753fc93739fd3f2da3af63485387305d10887aa1b5425b0c7dbd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kls7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d8pqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:40Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:40 crc kubenswrapper[4735]: I1008 12:56:40.974693 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mldxj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"771d7ef1-3d58-44eb-89b5-43a5da3e8091\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk2wl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk2wl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mldxj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:40Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:40 crc 
kubenswrapper[4735]: I1008 12:56:40.986593 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jnmqc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed550b4-694a-4a43-99fc-f869224d6cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0a97189f5acb11a4baa7838d67360d6606941daf7520c583dd596cfab265203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qjms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jnmqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:40Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:41 crc kubenswrapper[4735]: I1008 12:56:41.013289 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:41 crc kubenswrapper[4735]: I1008 12:56:41.013353 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:41 crc kubenswrapper[4735]: I1008 12:56:41.013370 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:41 crc kubenswrapper[4735]: I1008 12:56:41.013394 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:41 crc kubenswrapper[4735]: I1008 12:56:41.013411 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:41Z","lastTransitionTime":"2025-10-08T12:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:41 crc kubenswrapper[4735]: I1008 12:56:41.022542 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653370-cd31-4487-8b63-50b145f55853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6da8f9945b0943287cecc7782253cd88835d3a5a823ae5e2107423b9bdc4e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fc9ee7ab9490b5462704528e6923ee420e0c9f3c0ed22e7041fedc8dfdfeb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://492da61345c2220eef06ac2e1979e5402ebb8deb0e3747bb3620ecccd4e83a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58cc5c76f1748baef586618dd49d3392757ef32e4f28a019979f84366e8879c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb1bc51dca9b3bf6c136fe146aa422fa40eb512385617607f682723053988f7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc88f8b567ef0f69f9df25a237c40b32bbd518680da75fe3cabd546713423bf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://591cd8e8896add07ee229791aae75932033be8070379a70d7a0c3f42176c6357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://591cd8e8896add07ee229791aae75932033be8070379a70d7a0c3f42176c6357\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T12:56:38Z\\\",\\\"message\\\":\\\"ere column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1008 12:56:38.817327 6363 services_controller.go:454] Service openshift-authentication-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 
template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1008 12:56:38.818364 6363 ovnkube_controller.go:900] Cache entry expected pod with UID \\\\\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\\\\\" but failed to find it\\\\nI1008 12:56:38.818678 6363 ovnkube_controller.go:804] Add Logical Switch Port event expected pod with UID \\\\\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\\\\\" in cache\\\\nI1008 12:56:38.818681 6363 port_cache.go:96] port-cache(openshift-network-console_networking-console-plugin-85b44fc459-gdk6g): added port \\\\u0026{name:openshift-network-console_networking-console-plugin-85b44fc459-gdk6g uuid:c94130be-172c-477c-88c4-40cc7eba30fe logicalSwitch:crc ips:[0xc008a013e0] mac:[10 88 10 217 0 92] expires:{wall:0 ext:0 loc:\\\\u003cnil\\\\u003e}} with IP: [10.217.0.92/23] and MAC: 0a:58:0a:d9:00:5c\\\\nI1008 12:56:38.818690 6363 ovnkube_controller.go:900] Cache entry expected pod with UID \\\\\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\\\\\" but failed to find it\\\\nI1008 12:56:38.818694 6363 pods.go:252] [openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] addLogi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wckcs_openshift-ovn-kubernetes(c4653370-cd31-4487-8b63-50b145f55853)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1758c6a3e050fee720b9cae06332bc1d34000d2a61aa27470c483dc8757a5546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e39f9fb2f655437c3
aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wckcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:41Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:41 crc kubenswrapper[4735]: I1008 12:56:41.038693 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tsff9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0464374-65ce-46f6-9e7e-79018b92aff4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce56f02b4d3ab37c171c10113c79b74660cbdc570b55bd951baae65ef4276946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e01ac6683194ace06c37ecdbcda8584329034bc8fd0e6c532dcc6d7a036a72b3\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e01ac6683194ace06c37ecdbcda8584329034bc8fd0e6c532dcc6d7a036a72b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e086e1263cfb47daf4b8a395b1335f2e575f9d8c060983ba7090bbe27033d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48e086e1263cfb47daf4b8a395b1335f2e575f9d8c060983ba7090bbe27033d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aceb87b6bfa4020ca777a104430a285d886fdd8fb37b710f754eac6ad1b4304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aceb87b6bfa4020ca777a104430a285d886fdd8fb37b710f754eac6ad1b4304d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895f5
b597faf09af3dfd2688b3e42c2f1713b07e7d78c8d81e856dc3364727f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895f5b597faf09af3dfd2688b3e42c2f1713b07e7d78c8d81e856dc3364727f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://196d2d4913959ac507043de8431ea9294020c429fa0810c9985c77a23f52ad78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://196d2d4913959ac507043de8431ea9294020c429fa0810c9985c77a23f52ad78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:17Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b5ebc8f08568ab230481a7570dcdb0c6ed926f2216ef10de368d835811ef129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b5ebc8f08568ab230481a7570dcdb0c6ed926f2216ef10de368d835811ef129\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tsff9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:41Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:41 crc kubenswrapper[4735]: I1008 12:56:41.061976 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"844f0143-e7a8-48c9-ae8e-753c5c293f9d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e268579ace7ae7961414b2650b6b55471a9fd6574196c5fa2cefbf6979464b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e620e3a2b5b27ac9469d3ca045be90d19b04f969422d4c2a453650a7d06f0918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e7006d4b199ac7450c0e968adf9714db905f5378e279db304db244d0e5e566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72f60bc244cbde5bb
8033c24051f07fd6a2a168d97d89d3abcfaac69e6d75878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa627e3bfab38d53beee2a3201f6856009919d567108614667503273d11d0f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:41Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:41 crc kubenswrapper[4735]: I1008 12:56:41.075209 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d7f6479-86cc-40be-8b44-cb1c10627aa2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc2f1c56c7be63150786ec037b77127e2df6071efa7d736d402bca8a011ea32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e2c83b125f3789eeff231ab5416af4461e2443e6abaf28dac1cd5a8d5356287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://302c54532d7a142b5a9483334e74f1d96b0f07920d9c3fcb589bf2b52d1c7124\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6272fc7157705e3f1cd86a22b80f722dc7d152785bcc8029e131cd03568a19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a92ee1544b61eda7e486e807f489c1f07cfc70870a2413575ac443203670542\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T12:56:10Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 12:56:04.373062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 12:56:04.379553 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3954726057/tls.crt::/tmp/serving-cert-3954726057/tls.key\\\\\\\"\\\\nI1008 12:56:10.064922 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 12:56:10.069814 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 12:56:10.069844 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 12:56:10.069873 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 12:56:10.069878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 12:56:10.082030 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 12:56:10.082115 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082144 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 12:56:10.082156 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 12:56:10.082166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 12:56:10.082177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1008 12:56:10.082039 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1008 12:56:10.085350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f517b378c42cea209cb39c077a6162a3569671ed2c8f91ae7b0b2c411af287a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87
ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:41Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:41 crc kubenswrapper[4735]: I1008 12:56:41.087662 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aed4e152-1619-49cb-8449-bc39785f556b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af1083a706e809f20ad64584fcbdfee479bb7014566eab93c89d94c67e013c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2326bc324b30300d3c85d19e5434de503b355b4e29fb72a13c8ab8d4884899c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9cf1a23dd55079fc635d7547125bc23b35d999c371c0bc3dd9e7e4a7c0c9fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e77c463c65fc07f8e0c2c8dd054b9dd96103da7426bbf1cd6e002436018231b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e77c463c65fc07f8e0c2c8dd054b9dd96103da7426bbf1cd6e002436018231b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:41Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:41 crc kubenswrapper[4735]: I1008 12:56:41.098860 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:41Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:41 crc kubenswrapper[4735]: I1008 12:56:41.111230 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc6814f3ae991f4c4576eaafb7f325021fe4ef3387b84948efb093150a7974d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T12:56:41Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:41 crc kubenswrapper[4735]: I1008 12:56:41.115181 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:41 crc kubenswrapper[4735]: I1008 12:56:41.115240 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:41 crc kubenswrapper[4735]: I1008 12:56:41.115259 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:41 crc kubenswrapper[4735]: I1008 12:56:41.115301 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:41 crc kubenswrapper[4735]: I1008 12:56:41.115341 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:41Z","lastTransitionTime":"2025-10-08T12:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:41 crc kubenswrapper[4735]: I1008 12:56:41.123841 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2dfd0474-f64a-4d3b-9990-df182d079b10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4321c24e125d489ad4c28ca906b5143f7017d14e0424cb188a1a031fd07de603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fefb46d9936f914f5667f539f69ddaaf85342cc341db62c3eda312aa07a78d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9qwgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:41Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:41 crc kubenswrapper[4735]: I1008 12:56:41.135895 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d289b739-3ef2-4ee7-bb9a-b5cd8ab2b999\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f9030479e031ff1768e372e6da2a0806dc393db70aed31961e3a9c89a9e789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://852a4dea1c1d8a40fc2ba89661ff17845ee58b62e45f4a7df9a8af505d361dd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e511b82acaea281ca468fbdce1eaa7ec3a5df59d91dc90fc500bebb08d51b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ce3b507093adcc6becc1a3b28e836174ee50ff4d3f583aa84badf4c1f3cb1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:41Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:41 crc kubenswrapper[4735]: I1008 12:56:41.151799 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b69f0d46aba2faee01906db32a5a6785e755b2eee94ccbaf1e438c63798f3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:41Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:41 crc kubenswrapper[4735]: I1008 12:56:41.163980 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:41Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:41 crc kubenswrapper[4735]: I1008 12:56:41.173572 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qdvjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28749736-d31e-40f2-84e8-a7839d8da376\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://680500d4224fd49593a4fa7294d1c49f059f330b2be33e3ef9ceccc6bd9b9a6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcv62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qdvjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:41Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:41 crc kubenswrapper[4735]: I1008 12:56:41.217912 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:41 crc kubenswrapper[4735]: I1008 12:56:41.217947 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:41 crc kubenswrapper[4735]: I1008 12:56:41.217958 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:41 crc kubenswrapper[4735]: I1008 12:56:41.217974 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:41 crc kubenswrapper[4735]: I1008 12:56:41.217985 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:41Z","lastTransitionTime":"2025-10-08T12:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:41 crc kubenswrapper[4735]: I1008 12:56:41.320771 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:41 crc kubenswrapper[4735]: I1008 12:56:41.320813 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:41 crc kubenswrapper[4735]: I1008 12:56:41.320824 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:41 crc kubenswrapper[4735]: I1008 12:56:41.320839 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:41 crc kubenswrapper[4735]: I1008 12:56:41.320850 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:41Z","lastTransitionTime":"2025-10-08T12:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:41 crc kubenswrapper[4735]: I1008 12:56:41.423663 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:41 crc kubenswrapper[4735]: I1008 12:56:41.423740 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:41 crc kubenswrapper[4735]: I1008 12:56:41.423763 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:41 crc kubenswrapper[4735]: I1008 12:56:41.423792 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:41 crc kubenswrapper[4735]: I1008 12:56:41.423812 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:41Z","lastTransitionTime":"2025-10-08T12:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:41 crc kubenswrapper[4735]: I1008 12:56:41.527013 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:41 crc kubenswrapper[4735]: I1008 12:56:41.527096 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:41 crc kubenswrapper[4735]: I1008 12:56:41.527181 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:41 crc kubenswrapper[4735]: I1008 12:56:41.527217 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:41 crc kubenswrapper[4735]: I1008 12:56:41.527242 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:41Z","lastTransitionTime":"2025-10-08T12:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:41 crc kubenswrapper[4735]: I1008 12:56:41.629737 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:41 crc kubenswrapper[4735]: I1008 12:56:41.629779 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:41 crc kubenswrapper[4735]: I1008 12:56:41.629791 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:41 crc kubenswrapper[4735]: I1008 12:56:41.629807 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:41 crc kubenswrapper[4735]: I1008 12:56:41.629819 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:41Z","lastTransitionTime":"2025-10-08T12:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:41 crc kubenswrapper[4735]: I1008 12:56:41.732702 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:41 crc kubenswrapper[4735]: I1008 12:56:41.732769 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:41 crc kubenswrapper[4735]: I1008 12:56:41.732788 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:41 crc kubenswrapper[4735]: I1008 12:56:41.732813 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:41 crc kubenswrapper[4735]: I1008 12:56:41.732832 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:41Z","lastTransitionTime":"2025-10-08T12:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:41 crc kubenswrapper[4735]: I1008 12:56:41.836727 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:41 crc kubenswrapper[4735]: I1008 12:56:41.836807 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:41 crc kubenswrapper[4735]: I1008 12:56:41.836831 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:41 crc kubenswrapper[4735]: I1008 12:56:41.836864 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:41 crc kubenswrapper[4735]: I1008 12:56:41.836888 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:41Z","lastTransitionTime":"2025-10-08T12:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:41 crc kubenswrapper[4735]: I1008 12:56:41.893831 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mldxj" Oct 08 12:56:41 crc kubenswrapper[4735]: E1008 12:56:41.893979 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mldxj" podUID="771d7ef1-3d58-44eb-89b5-43a5da3e8091" Oct 08 12:56:41 crc kubenswrapper[4735]: I1008 12:56:41.940107 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:41 crc kubenswrapper[4735]: I1008 12:56:41.940178 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:41 crc kubenswrapper[4735]: I1008 12:56:41.940197 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:41 crc kubenswrapper[4735]: I1008 12:56:41.940218 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:41 crc kubenswrapper[4735]: I1008 12:56:41.940236 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:41Z","lastTransitionTime":"2025-10-08T12:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:42 crc kubenswrapper[4735]: I1008 12:56:42.042163 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:42 crc kubenswrapper[4735]: I1008 12:56:42.042204 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:42 crc kubenswrapper[4735]: I1008 12:56:42.042216 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:42 crc kubenswrapper[4735]: I1008 12:56:42.042237 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:42 crc kubenswrapper[4735]: I1008 12:56:42.042250 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:42Z","lastTransitionTime":"2025-10-08T12:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:42 crc kubenswrapper[4735]: I1008 12:56:42.124356 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/771d7ef1-3d58-44eb-89b5-43a5da3e8091-metrics-certs\") pod \"network-metrics-daemon-mldxj\" (UID: \"771d7ef1-3d58-44eb-89b5-43a5da3e8091\") " pod="openshift-multus/network-metrics-daemon-mldxj" Oct 08 12:56:42 crc kubenswrapper[4735]: E1008 12:56:42.124529 4735 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 12:56:42 crc kubenswrapper[4735]: E1008 12:56:42.124585 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/771d7ef1-3d58-44eb-89b5-43a5da3e8091-metrics-certs podName:771d7ef1-3d58-44eb-89b5-43a5da3e8091 nodeName:}" failed. No retries permitted until 2025-10-08 12:56:58.124569712 +0000 UTC m=+67.932874266 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/771d7ef1-3d58-44eb-89b5-43a5da3e8091-metrics-certs") pod "network-metrics-daemon-mldxj" (UID: "771d7ef1-3d58-44eb-89b5-43a5da3e8091") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 12:56:42 crc kubenswrapper[4735]: I1008 12:56:42.145138 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:42 crc kubenswrapper[4735]: I1008 12:56:42.145209 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:42 crc kubenswrapper[4735]: I1008 12:56:42.145220 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:42 crc kubenswrapper[4735]: I1008 12:56:42.145236 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:42 crc kubenswrapper[4735]: I1008 12:56:42.145249 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:42Z","lastTransitionTime":"2025-10-08T12:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:42 crc kubenswrapper[4735]: I1008 12:56:42.247724 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:42 crc kubenswrapper[4735]: I1008 12:56:42.247769 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:42 crc kubenswrapper[4735]: I1008 12:56:42.247782 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:42 crc kubenswrapper[4735]: I1008 12:56:42.247806 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:42 crc kubenswrapper[4735]: I1008 12:56:42.247823 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:42Z","lastTransitionTime":"2025-10-08T12:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:42 crc kubenswrapper[4735]: I1008 12:56:42.350552 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:42 crc kubenswrapper[4735]: I1008 12:56:42.350619 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:42 crc kubenswrapper[4735]: I1008 12:56:42.350640 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:42 crc kubenswrapper[4735]: I1008 12:56:42.350665 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:42 crc kubenswrapper[4735]: I1008 12:56:42.350684 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:42Z","lastTransitionTime":"2025-10-08T12:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:42 crc kubenswrapper[4735]: I1008 12:56:42.453400 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:42 crc kubenswrapper[4735]: I1008 12:56:42.453446 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:42 crc kubenswrapper[4735]: I1008 12:56:42.453461 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:42 crc kubenswrapper[4735]: I1008 12:56:42.453517 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:42 crc kubenswrapper[4735]: I1008 12:56:42.453535 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:42Z","lastTransitionTime":"2025-10-08T12:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:42 crc kubenswrapper[4735]: I1008 12:56:42.556395 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:42 crc kubenswrapper[4735]: I1008 12:56:42.556439 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:42 crc kubenswrapper[4735]: I1008 12:56:42.556451 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:42 crc kubenswrapper[4735]: I1008 12:56:42.556470 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:42 crc kubenswrapper[4735]: I1008 12:56:42.556508 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:42Z","lastTransitionTime":"2025-10-08T12:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:42 crc kubenswrapper[4735]: I1008 12:56:42.629383 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 12:56:42 crc kubenswrapper[4735]: I1008 12:56:42.629556 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:56:42 crc kubenswrapper[4735]: E1008 12:56:42.629601 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 12:57:14.629567201 +0000 UTC m=+84.437871765 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:56:42 crc kubenswrapper[4735]: E1008 12:56:42.629641 4735 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 12:56:42 crc kubenswrapper[4735]: E1008 12:56:42.629705 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 12:57:14.629686974 +0000 UTC m=+84.437991538 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 12:56:42 crc kubenswrapper[4735]: I1008 12:56:42.629740 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:56:42 crc kubenswrapper[4735]: I1008 12:56:42.629773 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:56:42 crc kubenswrapper[4735]: E1008 12:56:42.629871 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 12:56:42 crc kubenswrapper[4735]: E1008 12:56:42.629894 4735 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 12:56:42 crc kubenswrapper[4735]: E1008 12:56:42.629940 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-10-08 12:57:14.62993088 +0000 UTC m=+84.438235434 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 12:56:42 crc kubenswrapper[4735]: E1008 12:56:42.629887 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 12:56:42 crc kubenswrapper[4735]: E1008 12:56:42.630003 4735 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 12:56:42 crc kubenswrapper[4735]: E1008 12:56:42.630039 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-08 12:57:14.630029313 +0000 UTC m=+84.438333877 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 12:56:42 crc kubenswrapper[4735]: I1008 12:56:42.659855 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:42 crc kubenswrapper[4735]: I1008 12:56:42.659906 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:42 crc kubenswrapper[4735]: I1008 12:56:42.659918 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:42 crc kubenswrapper[4735]: I1008 12:56:42.659933 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:42 crc kubenswrapper[4735]: I1008 12:56:42.659943 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:42Z","lastTransitionTime":"2025-10-08T12:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:42 crc kubenswrapper[4735]: I1008 12:56:42.731151 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 12:56:42 crc kubenswrapper[4735]: E1008 12:56:42.731311 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 12:56:42 crc kubenswrapper[4735]: E1008 12:56:42.731329 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 12:56:42 crc kubenswrapper[4735]: E1008 12:56:42.731343 4735 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 12:56:42 crc kubenswrapper[4735]: E1008 12:56:42.731387 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-08 12:57:14.731374767 +0000 UTC m=+84.539679311 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 12:56:42 crc kubenswrapper[4735]: I1008 12:56:42.762260 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:42 crc kubenswrapper[4735]: I1008 12:56:42.762684 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:42 crc kubenswrapper[4735]: I1008 12:56:42.762701 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:42 crc kubenswrapper[4735]: I1008 12:56:42.762723 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:42 crc kubenswrapper[4735]: I1008 12:56:42.762739 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:42Z","lastTransitionTime":"2025-10-08T12:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:42 crc kubenswrapper[4735]: I1008 12:56:42.864865 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:42 crc kubenswrapper[4735]: I1008 12:56:42.864907 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:42 crc kubenswrapper[4735]: I1008 12:56:42.864916 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:42 crc kubenswrapper[4735]: I1008 12:56:42.864931 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:42 crc kubenswrapper[4735]: I1008 12:56:42.864943 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:42Z","lastTransitionTime":"2025-10-08T12:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:42 crc kubenswrapper[4735]: I1008 12:56:42.893739 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 12:56:42 crc kubenswrapper[4735]: E1008 12:56:42.893882 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 12:56:42 crc kubenswrapper[4735]: I1008 12:56:42.894246 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:56:42 crc kubenswrapper[4735]: E1008 12:56:42.894324 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 12:56:42 crc kubenswrapper[4735]: I1008 12:56:42.894388 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:56:42 crc kubenswrapper[4735]: E1008 12:56:42.894469 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 12:56:42 crc kubenswrapper[4735]: I1008 12:56:42.966895 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:42 crc kubenswrapper[4735]: I1008 12:56:42.966925 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:42 crc kubenswrapper[4735]: I1008 12:56:42.966973 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:42 crc kubenswrapper[4735]: I1008 12:56:42.966988 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:42 crc kubenswrapper[4735]: I1008 12:56:42.966998 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:42Z","lastTransitionTime":"2025-10-08T12:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:43 crc kubenswrapper[4735]: I1008 12:56:43.069557 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:43 crc kubenswrapper[4735]: I1008 12:56:43.069621 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:43 crc kubenswrapper[4735]: I1008 12:56:43.069635 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:43 crc kubenswrapper[4735]: I1008 12:56:43.069651 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:43 crc kubenswrapper[4735]: I1008 12:56:43.069660 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:43Z","lastTransitionTime":"2025-10-08T12:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:43 crc kubenswrapper[4735]: I1008 12:56:43.171855 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:43 crc kubenswrapper[4735]: I1008 12:56:43.171898 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:43 crc kubenswrapper[4735]: I1008 12:56:43.171907 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:43 crc kubenswrapper[4735]: I1008 12:56:43.171921 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:43 crc kubenswrapper[4735]: I1008 12:56:43.171930 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:43Z","lastTransitionTime":"2025-10-08T12:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:43 crc kubenswrapper[4735]: I1008 12:56:43.273813 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:43 crc kubenswrapper[4735]: I1008 12:56:43.273851 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:43 crc kubenswrapper[4735]: I1008 12:56:43.273862 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:43 crc kubenswrapper[4735]: I1008 12:56:43.273921 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:43 crc kubenswrapper[4735]: I1008 12:56:43.273936 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:43Z","lastTransitionTime":"2025-10-08T12:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:43 crc kubenswrapper[4735]: I1008 12:56:43.376615 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:43 crc kubenswrapper[4735]: I1008 12:56:43.376682 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:43 crc kubenswrapper[4735]: I1008 12:56:43.376696 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:43 crc kubenswrapper[4735]: I1008 12:56:43.376715 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:43 crc kubenswrapper[4735]: I1008 12:56:43.376730 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:43Z","lastTransitionTime":"2025-10-08T12:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:43 crc kubenswrapper[4735]: I1008 12:56:43.479883 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:43 crc kubenswrapper[4735]: I1008 12:56:43.479956 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:43 crc kubenswrapper[4735]: I1008 12:56:43.479987 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:43 crc kubenswrapper[4735]: I1008 12:56:43.480014 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:43 crc kubenswrapper[4735]: I1008 12:56:43.480034 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:43Z","lastTransitionTime":"2025-10-08T12:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:43 crc kubenswrapper[4735]: I1008 12:56:43.584381 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:43 crc kubenswrapper[4735]: I1008 12:56:43.584423 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:43 crc kubenswrapper[4735]: I1008 12:56:43.584432 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:43 crc kubenswrapper[4735]: I1008 12:56:43.584444 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:43 crc kubenswrapper[4735]: I1008 12:56:43.584454 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:43Z","lastTransitionTime":"2025-10-08T12:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:43 crc kubenswrapper[4735]: I1008 12:56:43.686821 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:43 crc kubenswrapper[4735]: I1008 12:56:43.686879 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:43 crc kubenswrapper[4735]: I1008 12:56:43.686895 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:43 crc kubenswrapper[4735]: I1008 12:56:43.686916 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:43 crc kubenswrapper[4735]: I1008 12:56:43.686933 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:43Z","lastTransitionTime":"2025-10-08T12:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:43 crc kubenswrapper[4735]: I1008 12:56:43.790835 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:43 crc kubenswrapper[4735]: I1008 12:56:43.790930 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:43 crc kubenswrapper[4735]: I1008 12:56:43.790947 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:43 crc kubenswrapper[4735]: I1008 12:56:43.791400 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:43 crc kubenswrapper[4735]: I1008 12:56:43.791448 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:43Z","lastTransitionTime":"2025-10-08T12:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:43 crc kubenswrapper[4735]: I1008 12:56:43.894895 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:43 crc kubenswrapper[4735]: I1008 12:56:43.894981 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:43 crc kubenswrapper[4735]: I1008 12:56:43.894922 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mldxj" Oct 08 12:56:43 crc kubenswrapper[4735]: I1008 12:56:43.895007 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:43 crc kubenswrapper[4735]: I1008 12:56:43.895040 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:43 crc kubenswrapper[4735]: I1008 12:56:43.895062 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:43Z","lastTransitionTime":"2025-10-08T12:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:43 crc kubenswrapper[4735]: E1008 12:56:43.895192 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mldxj" podUID="771d7ef1-3d58-44eb-89b5-43a5da3e8091" Oct 08 12:56:43 crc kubenswrapper[4735]: I1008 12:56:43.998589 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:43 crc kubenswrapper[4735]: I1008 12:56:43.998650 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:43 crc kubenswrapper[4735]: I1008 12:56:43.998664 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:43 crc kubenswrapper[4735]: I1008 12:56:43.998683 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:43 crc kubenswrapper[4735]: I1008 12:56:43.998696 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:43Z","lastTransitionTime":"2025-10-08T12:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:44 crc kubenswrapper[4735]: I1008 12:56:44.102369 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:44 crc kubenswrapper[4735]: I1008 12:56:44.102446 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:44 crc kubenswrapper[4735]: I1008 12:56:44.102464 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:44 crc kubenswrapper[4735]: I1008 12:56:44.102838 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:44 crc kubenswrapper[4735]: I1008 12:56:44.103061 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:44Z","lastTransitionTime":"2025-10-08T12:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:44 crc kubenswrapper[4735]: I1008 12:56:44.206620 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:44 crc kubenswrapper[4735]: I1008 12:56:44.206670 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:44 crc kubenswrapper[4735]: I1008 12:56:44.206684 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:44 crc kubenswrapper[4735]: I1008 12:56:44.206704 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:44 crc kubenswrapper[4735]: I1008 12:56:44.206719 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:44Z","lastTransitionTime":"2025-10-08T12:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:44 crc kubenswrapper[4735]: I1008 12:56:44.309380 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:44 crc kubenswrapper[4735]: I1008 12:56:44.309432 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:44 crc kubenswrapper[4735]: I1008 12:56:44.309454 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:44 crc kubenswrapper[4735]: I1008 12:56:44.309503 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:44 crc kubenswrapper[4735]: I1008 12:56:44.309523 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:44Z","lastTransitionTime":"2025-10-08T12:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:44 crc kubenswrapper[4735]: I1008 12:56:44.412209 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:44 crc kubenswrapper[4735]: I1008 12:56:44.412277 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:44 crc kubenswrapper[4735]: I1008 12:56:44.412298 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:44 crc kubenswrapper[4735]: I1008 12:56:44.412322 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:44 crc kubenswrapper[4735]: I1008 12:56:44.412339 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:44Z","lastTransitionTime":"2025-10-08T12:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:44 crc kubenswrapper[4735]: I1008 12:56:44.515814 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:44 crc kubenswrapper[4735]: I1008 12:56:44.515881 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:44 crc kubenswrapper[4735]: I1008 12:56:44.515903 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:44 crc kubenswrapper[4735]: I1008 12:56:44.515932 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:44 crc kubenswrapper[4735]: I1008 12:56:44.515957 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:44Z","lastTransitionTime":"2025-10-08T12:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:44 crc kubenswrapper[4735]: I1008 12:56:44.618774 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:44 crc kubenswrapper[4735]: I1008 12:56:44.618821 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:44 crc kubenswrapper[4735]: I1008 12:56:44.618829 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:44 crc kubenswrapper[4735]: I1008 12:56:44.618845 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:44 crc kubenswrapper[4735]: I1008 12:56:44.618854 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:44Z","lastTransitionTime":"2025-10-08T12:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:44 crc kubenswrapper[4735]: I1008 12:56:44.721104 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:44 crc kubenswrapper[4735]: I1008 12:56:44.721154 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:44 crc kubenswrapper[4735]: I1008 12:56:44.721166 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:44 crc kubenswrapper[4735]: I1008 12:56:44.721185 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:44 crc kubenswrapper[4735]: I1008 12:56:44.721197 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:44Z","lastTransitionTime":"2025-10-08T12:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:44 crc kubenswrapper[4735]: I1008 12:56:44.823836 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:44 crc kubenswrapper[4735]: I1008 12:56:44.823910 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:44 crc kubenswrapper[4735]: I1008 12:56:44.823937 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:44 crc kubenswrapper[4735]: I1008 12:56:44.823968 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:44 crc kubenswrapper[4735]: I1008 12:56:44.823991 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:44Z","lastTransitionTime":"2025-10-08T12:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:44 crc kubenswrapper[4735]: I1008 12:56:44.894125 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 12:56:44 crc kubenswrapper[4735]: I1008 12:56:44.894178 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:56:44 crc kubenswrapper[4735]: E1008 12:56:44.894368 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 12:56:44 crc kubenswrapper[4735]: I1008 12:56:44.894413 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:56:44 crc kubenswrapper[4735]: E1008 12:56:44.894624 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 12:56:44 crc kubenswrapper[4735]: E1008 12:56:44.894791 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 12:56:44 crc kubenswrapper[4735]: I1008 12:56:44.927051 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:44 crc kubenswrapper[4735]: I1008 12:56:44.927093 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:44 crc kubenswrapper[4735]: I1008 12:56:44.927108 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:44 crc kubenswrapper[4735]: I1008 12:56:44.927126 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:44 crc kubenswrapper[4735]: I1008 12:56:44.927138 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:44Z","lastTransitionTime":"2025-10-08T12:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:45 crc kubenswrapper[4735]: I1008 12:56:45.030575 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:45 crc kubenswrapper[4735]: I1008 12:56:45.030618 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:45 crc kubenswrapper[4735]: I1008 12:56:45.030629 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:45 crc kubenswrapper[4735]: I1008 12:56:45.030645 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:45 crc kubenswrapper[4735]: I1008 12:56:45.030660 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:45Z","lastTransitionTime":"2025-10-08T12:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:45 crc kubenswrapper[4735]: I1008 12:56:45.134196 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:45 crc kubenswrapper[4735]: I1008 12:56:45.135130 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:45 crc kubenswrapper[4735]: I1008 12:56:45.135176 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:45 crc kubenswrapper[4735]: I1008 12:56:45.135200 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:45 crc kubenswrapper[4735]: I1008 12:56:45.135214 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:45Z","lastTransitionTime":"2025-10-08T12:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:45 crc kubenswrapper[4735]: I1008 12:56:45.238907 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:45 crc kubenswrapper[4735]: I1008 12:56:45.238975 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:45 crc kubenswrapper[4735]: I1008 12:56:45.238993 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:45 crc kubenswrapper[4735]: I1008 12:56:45.239016 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:45 crc kubenswrapper[4735]: I1008 12:56:45.239034 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:45Z","lastTransitionTime":"2025-10-08T12:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:45 crc kubenswrapper[4735]: I1008 12:56:45.342594 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:45 crc kubenswrapper[4735]: I1008 12:56:45.342665 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:45 crc kubenswrapper[4735]: I1008 12:56:45.342685 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:45 crc kubenswrapper[4735]: I1008 12:56:45.342713 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:45 crc kubenswrapper[4735]: I1008 12:56:45.342739 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:45Z","lastTransitionTime":"2025-10-08T12:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:45 crc kubenswrapper[4735]: I1008 12:56:45.446467 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:45 crc kubenswrapper[4735]: I1008 12:56:45.446603 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:45 crc kubenswrapper[4735]: I1008 12:56:45.446626 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:45 crc kubenswrapper[4735]: I1008 12:56:45.446655 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:45 crc kubenswrapper[4735]: I1008 12:56:45.446676 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:45Z","lastTransitionTime":"2025-10-08T12:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:45 crc kubenswrapper[4735]: I1008 12:56:45.550176 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:45 crc kubenswrapper[4735]: I1008 12:56:45.550257 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:45 crc kubenswrapper[4735]: I1008 12:56:45.550287 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:45 crc kubenswrapper[4735]: I1008 12:56:45.550324 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:45 crc kubenswrapper[4735]: I1008 12:56:45.550352 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:45Z","lastTransitionTime":"2025-10-08T12:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:45 crc kubenswrapper[4735]: I1008 12:56:45.653454 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:45 crc kubenswrapper[4735]: I1008 12:56:45.653547 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:45 crc kubenswrapper[4735]: I1008 12:56:45.653565 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:45 crc kubenswrapper[4735]: I1008 12:56:45.653589 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:45 crc kubenswrapper[4735]: I1008 12:56:45.653608 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:45Z","lastTransitionTime":"2025-10-08T12:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:45 crc kubenswrapper[4735]: I1008 12:56:45.756709 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:45 crc kubenswrapper[4735]: I1008 12:56:45.756777 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:45 crc kubenswrapper[4735]: I1008 12:56:45.756795 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:45 crc kubenswrapper[4735]: I1008 12:56:45.756821 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:45 crc kubenswrapper[4735]: I1008 12:56:45.756844 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:45Z","lastTransitionTime":"2025-10-08T12:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:45 crc kubenswrapper[4735]: I1008 12:56:45.861732 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:45 crc kubenswrapper[4735]: I1008 12:56:45.861815 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:45 crc kubenswrapper[4735]: I1008 12:56:45.861842 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:45 crc kubenswrapper[4735]: I1008 12:56:45.861877 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:45 crc kubenswrapper[4735]: I1008 12:56:45.861912 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:45Z","lastTransitionTime":"2025-10-08T12:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:45 crc kubenswrapper[4735]: I1008 12:56:45.893646 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mldxj" Oct 08 12:56:45 crc kubenswrapper[4735]: E1008 12:56:45.893824 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mldxj" podUID="771d7ef1-3d58-44eb-89b5-43a5da3e8091" Oct 08 12:56:45 crc kubenswrapper[4735]: I1008 12:56:45.964681 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:45 crc kubenswrapper[4735]: I1008 12:56:45.964752 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:45 crc kubenswrapper[4735]: I1008 12:56:45.964778 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:45 crc kubenswrapper[4735]: I1008 12:56:45.964808 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:45 crc kubenswrapper[4735]: I1008 12:56:45.964831 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:45Z","lastTransitionTime":"2025-10-08T12:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:46 crc kubenswrapper[4735]: I1008 12:56:46.067967 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:46 crc kubenswrapper[4735]: I1008 12:56:46.068019 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:46 crc kubenswrapper[4735]: I1008 12:56:46.068037 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:46 crc kubenswrapper[4735]: I1008 12:56:46.068066 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:46 crc kubenswrapper[4735]: I1008 12:56:46.068089 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:46Z","lastTransitionTime":"2025-10-08T12:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:46 crc kubenswrapper[4735]: I1008 12:56:46.170949 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:46 crc kubenswrapper[4735]: I1008 12:56:46.170993 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:46 crc kubenswrapper[4735]: I1008 12:56:46.171002 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:46 crc kubenswrapper[4735]: I1008 12:56:46.171014 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:46 crc kubenswrapper[4735]: I1008 12:56:46.171023 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:46Z","lastTransitionTime":"2025-10-08T12:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:46 crc kubenswrapper[4735]: I1008 12:56:46.274126 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:46 crc kubenswrapper[4735]: I1008 12:56:46.274186 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:46 crc kubenswrapper[4735]: I1008 12:56:46.274204 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:46 crc kubenswrapper[4735]: I1008 12:56:46.274230 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:46 crc kubenswrapper[4735]: I1008 12:56:46.274248 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:46Z","lastTransitionTime":"2025-10-08T12:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:46 crc kubenswrapper[4735]: I1008 12:56:46.376778 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:46 crc kubenswrapper[4735]: I1008 12:56:46.376831 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:46 crc kubenswrapper[4735]: I1008 12:56:46.376845 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:46 crc kubenswrapper[4735]: I1008 12:56:46.376866 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:46 crc kubenswrapper[4735]: I1008 12:56:46.376879 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:46Z","lastTransitionTime":"2025-10-08T12:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:46 crc kubenswrapper[4735]: I1008 12:56:46.478941 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:46 crc kubenswrapper[4735]: I1008 12:56:46.478976 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:46 crc kubenswrapper[4735]: I1008 12:56:46.478988 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:46 crc kubenswrapper[4735]: I1008 12:56:46.479005 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:46 crc kubenswrapper[4735]: I1008 12:56:46.479016 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:46Z","lastTransitionTime":"2025-10-08T12:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:46 crc kubenswrapper[4735]: I1008 12:56:46.582120 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:46 crc kubenswrapper[4735]: I1008 12:56:46.582195 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:46 crc kubenswrapper[4735]: I1008 12:56:46.582216 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:46 crc kubenswrapper[4735]: I1008 12:56:46.582245 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:46 crc kubenswrapper[4735]: I1008 12:56:46.582268 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:46Z","lastTransitionTime":"2025-10-08T12:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:46 crc kubenswrapper[4735]: I1008 12:56:46.685299 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:46 crc kubenswrapper[4735]: I1008 12:56:46.685372 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:46 crc kubenswrapper[4735]: I1008 12:56:46.685393 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:46 crc kubenswrapper[4735]: I1008 12:56:46.685422 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:46 crc kubenswrapper[4735]: I1008 12:56:46.685445 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:46Z","lastTransitionTime":"2025-10-08T12:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:46 crc kubenswrapper[4735]: I1008 12:56:46.787851 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:46 crc kubenswrapper[4735]: I1008 12:56:46.787926 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:46 crc kubenswrapper[4735]: I1008 12:56:46.787950 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:46 crc kubenswrapper[4735]: I1008 12:56:46.787979 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:46 crc kubenswrapper[4735]: I1008 12:56:46.788003 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:46Z","lastTransitionTime":"2025-10-08T12:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:46 crc kubenswrapper[4735]: I1008 12:56:46.891737 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:46 crc kubenswrapper[4735]: I1008 12:56:46.891787 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:46 crc kubenswrapper[4735]: I1008 12:56:46.891800 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:46 crc kubenswrapper[4735]: I1008 12:56:46.891823 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:46 crc kubenswrapper[4735]: I1008 12:56:46.891836 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:46Z","lastTransitionTime":"2025-10-08T12:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:46 crc kubenswrapper[4735]: I1008 12:56:46.893674 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:56:46 crc kubenswrapper[4735]: I1008 12:56:46.893778 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:56:46 crc kubenswrapper[4735]: I1008 12:56:46.894051 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 08 12:56:46 crc kubenswrapper[4735]: E1008 12:56:46.894037 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 08 12:56:46 crc kubenswrapper[4735]: E1008 12:56:46.894153 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 08 12:56:46 crc kubenswrapper[4735]: E1008 12:56:46.894257 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 08 12:56:46 crc kubenswrapper[4735]: I1008 12:56:46.994765 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 12:56:46 crc kubenswrapper[4735]: I1008 12:56:46.994811 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 12:56:46 crc kubenswrapper[4735]: I1008 12:56:46.994828 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 12:56:46 crc kubenswrapper[4735]: I1008 12:56:46.994850 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 12:56:46 crc kubenswrapper[4735]: I1008 12:56:46.994867 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:46Z","lastTransitionTime":"2025-10-08T12:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.098565 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.098648 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.098666 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.098695 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.098713 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:47Z","lastTransitionTime":"2025-10-08T12:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.201804 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.201870 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.201887 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.201910 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.201924 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:47Z","lastTransitionTime":"2025-10-08T12:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.304535 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.304635 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.304659 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.304690 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.304714 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:47Z","lastTransitionTime":"2025-10-08T12:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.407842 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.407922 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.407946 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.407970 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.407995 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:47Z","lastTransitionTime":"2025-10-08T12:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.511102 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.511181 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.511209 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.511240 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.511262 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:47Z","lastTransitionTime":"2025-10-08T12:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.613942 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.614004 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.614017 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.614033 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.614046 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:47Z","lastTransitionTime":"2025-10-08T12:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.661069 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.661120 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.661137 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.661154 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.661165 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:47Z","lastTransitionTime":"2025-10-08T12:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 08 12:56:47 crc kubenswrapper[4735]: E1008 12:56:47.681622 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0d020dc1-103d-4051-9317-7ecb0eb982ec\\\",\\\"systemUUID\\\":\\\"14046600-b993-4087-9f7d-570e1efe09a8\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:47Z is after 2025-08-24T17:21:41Z"
Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.687865 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.687926 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.687940 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.687962 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.687983 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:47Z","lastTransitionTime":"2025-10-08T12:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 08 12:56:47 crc kubenswrapper[4735]: E1008 12:56:47.706325 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0d020dc1-103d-4051-9317-7ecb0eb982ec\\\",\\\"systemUUID\\\":\\\"14046600-b993-4087-9f7d-570e1efe09a8\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:47Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.710949 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.711001 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.711016 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.711036 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.711053 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:47Z","lastTransitionTime":"2025-10-08T12:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:47 crc kubenswrapper[4735]: E1008 12:56:47.724198 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0d020dc1-103d-4051-9317-7ecb0eb982ec\\\",\\\"systemUUID\\\":\\\"14046600-b993-4087-9f7d-570e1efe09a8\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:47Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.728727 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.728769 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.728786 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.728804 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.728817 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:47Z","lastTransitionTime":"2025-10-08T12:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:47 crc kubenswrapper[4735]: E1008 12:56:47.744144 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0d020dc1-103d-4051-9317-7ecb0eb982ec\\\",\\\"systemUUID\\\":\\\"14046600-b993-4087-9f7d-570e1efe09a8\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:47Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.748512 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.748557 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.748572 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.748588 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.748600 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:47Z","lastTransitionTime":"2025-10-08T12:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:47 crc kubenswrapper[4735]: E1008 12:56:47.762516 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0d020dc1-103d-4051-9317-7ecb0eb982ec\\\",\\\"systemUUID\\\":\\\"14046600-b993-4087-9f7d-570e1efe09a8\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:47Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:47 crc kubenswrapper[4735]: E1008 12:56:47.762688 4735 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.764634 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.764667 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.764680 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.764696 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.764707 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:47Z","lastTransitionTime":"2025-10-08T12:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.866826 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.866870 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.866882 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.866899 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.866911 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:47Z","lastTransitionTime":"2025-10-08T12:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.894338 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mldxj" Oct 08 12:56:47 crc kubenswrapper[4735]: E1008 12:56:47.894557 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mldxj" podUID="771d7ef1-3d58-44eb-89b5-43a5da3e8091" Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.970704 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.970768 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.970790 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.970823 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:47 crc kubenswrapper[4735]: I1008 12:56:47.970846 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:47Z","lastTransitionTime":"2025-10-08T12:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:48 crc kubenswrapper[4735]: I1008 12:56:48.073139 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:48 crc kubenswrapper[4735]: I1008 12:56:48.073207 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:48 crc kubenswrapper[4735]: I1008 12:56:48.073226 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:48 crc kubenswrapper[4735]: I1008 12:56:48.073253 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:48 crc kubenswrapper[4735]: I1008 12:56:48.073272 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:48Z","lastTransitionTime":"2025-10-08T12:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:48 crc kubenswrapper[4735]: I1008 12:56:48.176188 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:48 crc kubenswrapper[4735]: I1008 12:56:48.176244 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:48 crc kubenswrapper[4735]: I1008 12:56:48.176253 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:48 crc kubenswrapper[4735]: I1008 12:56:48.176272 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:48 crc kubenswrapper[4735]: I1008 12:56:48.176282 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:48Z","lastTransitionTime":"2025-10-08T12:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:48 crc kubenswrapper[4735]: I1008 12:56:48.279208 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:48 crc kubenswrapper[4735]: I1008 12:56:48.279256 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:48 crc kubenswrapper[4735]: I1008 12:56:48.279269 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:48 crc kubenswrapper[4735]: I1008 12:56:48.279293 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:48 crc kubenswrapper[4735]: I1008 12:56:48.279307 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:48Z","lastTransitionTime":"2025-10-08T12:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:48 crc kubenswrapper[4735]: I1008 12:56:48.382847 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:48 crc kubenswrapper[4735]: I1008 12:56:48.382920 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:48 crc kubenswrapper[4735]: I1008 12:56:48.382947 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:48 crc kubenswrapper[4735]: I1008 12:56:48.382980 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:48 crc kubenswrapper[4735]: I1008 12:56:48.383001 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:48Z","lastTransitionTime":"2025-10-08T12:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:48 crc kubenswrapper[4735]: I1008 12:56:48.485241 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:48 crc kubenswrapper[4735]: I1008 12:56:48.485294 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:48 crc kubenswrapper[4735]: I1008 12:56:48.485305 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:48 crc kubenswrapper[4735]: I1008 12:56:48.485323 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:48 crc kubenswrapper[4735]: I1008 12:56:48.485335 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:48Z","lastTransitionTime":"2025-10-08T12:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:48 crc kubenswrapper[4735]: I1008 12:56:48.587608 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:48 crc kubenswrapper[4735]: I1008 12:56:48.587658 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:48 crc kubenswrapper[4735]: I1008 12:56:48.587670 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:48 crc kubenswrapper[4735]: I1008 12:56:48.587686 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:48 crc kubenswrapper[4735]: I1008 12:56:48.587698 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:48Z","lastTransitionTime":"2025-10-08T12:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:48 crc kubenswrapper[4735]: I1008 12:56:48.691267 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:48 crc kubenswrapper[4735]: I1008 12:56:48.691333 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:48 crc kubenswrapper[4735]: I1008 12:56:48.691350 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:48 crc kubenswrapper[4735]: I1008 12:56:48.691377 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:48 crc kubenswrapper[4735]: I1008 12:56:48.691392 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:48Z","lastTransitionTime":"2025-10-08T12:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:48 crc kubenswrapper[4735]: I1008 12:56:48.794577 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:48 crc kubenswrapper[4735]: I1008 12:56:48.794630 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:48 crc kubenswrapper[4735]: I1008 12:56:48.794649 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:48 crc kubenswrapper[4735]: I1008 12:56:48.794676 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:48 crc kubenswrapper[4735]: I1008 12:56:48.794696 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:48Z","lastTransitionTime":"2025-10-08T12:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:48 crc kubenswrapper[4735]: I1008 12:56:48.894615 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 12:56:48 crc kubenswrapper[4735]: E1008 12:56:48.894744 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 12:56:48 crc kubenswrapper[4735]: I1008 12:56:48.894617 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:56:48 crc kubenswrapper[4735]: I1008 12:56:48.894892 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:56:48 crc kubenswrapper[4735]: E1008 12:56:48.895168 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 12:56:48 crc kubenswrapper[4735]: E1008 12:56:48.895219 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 12:56:48 crc kubenswrapper[4735]: I1008 12:56:48.896914 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:48 crc kubenswrapper[4735]: I1008 12:56:48.896957 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:48 crc kubenswrapper[4735]: I1008 12:56:48.896968 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:48 crc kubenswrapper[4735]: I1008 12:56:48.897027 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:48 crc kubenswrapper[4735]: I1008 12:56:48.897042 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:48Z","lastTransitionTime":"2025-10-08T12:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:48 crc kubenswrapper[4735]: I1008 12:56:48.999770 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:48 crc kubenswrapper[4735]: I1008 12:56:48.999838 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:48 crc kubenswrapper[4735]: I1008 12:56:48.999870 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:48 crc kubenswrapper[4735]: I1008 12:56:48.999900 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:49 crc kubenswrapper[4735]: I1008 12:56:48.999922 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:48Z","lastTransitionTime":"2025-10-08T12:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:49 crc kubenswrapper[4735]: I1008 12:56:49.107533 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:49 crc kubenswrapper[4735]: I1008 12:56:49.107584 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:49 crc kubenswrapper[4735]: I1008 12:56:49.107609 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:49 crc kubenswrapper[4735]: I1008 12:56:49.107633 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:49 crc kubenswrapper[4735]: I1008 12:56:49.107651 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:49Z","lastTransitionTime":"2025-10-08T12:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:49 crc kubenswrapper[4735]: I1008 12:56:49.210669 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:49 crc kubenswrapper[4735]: I1008 12:56:49.210803 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:49 crc kubenswrapper[4735]: I1008 12:56:49.210827 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:49 crc kubenswrapper[4735]: I1008 12:56:49.210852 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:49 crc kubenswrapper[4735]: I1008 12:56:49.210870 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:49Z","lastTransitionTime":"2025-10-08T12:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:49 crc kubenswrapper[4735]: I1008 12:56:49.313310 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:49 crc kubenswrapper[4735]: I1008 12:56:49.313366 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:49 crc kubenswrapper[4735]: I1008 12:56:49.313378 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:49 crc kubenswrapper[4735]: I1008 12:56:49.313397 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:49 crc kubenswrapper[4735]: I1008 12:56:49.313411 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:49Z","lastTransitionTime":"2025-10-08T12:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:49 crc kubenswrapper[4735]: I1008 12:56:49.417396 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:49 crc kubenswrapper[4735]: I1008 12:56:49.417526 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:49 crc kubenswrapper[4735]: I1008 12:56:49.417558 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:49 crc kubenswrapper[4735]: I1008 12:56:49.417591 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:49 crc kubenswrapper[4735]: I1008 12:56:49.417614 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:49Z","lastTransitionTime":"2025-10-08T12:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:49 crc kubenswrapper[4735]: I1008 12:56:49.520009 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:49 crc kubenswrapper[4735]: I1008 12:56:49.520062 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:49 crc kubenswrapper[4735]: I1008 12:56:49.520075 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:49 crc kubenswrapper[4735]: I1008 12:56:49.520095 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:49 crc kubenswrapper[4735]: I1008 12:56:49.520108 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:49Z","lastTransitionTime":"2025-10-08T12:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:49 crc kubenswrapper[4735]: I1008 12:56:49.623421 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:49 crc kubenswrapper[4735]: I1008 12:56:49.623539 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:49 crc kubenswrapper[4735]: I1008 12:56:49.623565 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:49 crc kubenswrapper[4735]: I1008 12:56:49.623598 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:49 crc kubenswrapper[4735]: I1008 12:56:49.623622 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:49Z","lastTransitionTime":"2025-10-08T12:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:49 crc kubenswrapper[4735]: I1008 12:56:49.726240 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:49 crc kubenswrapper[4735]: I1008 12:56:49.726736 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:49 crc kubenswrapper[4735]: I1008 12:56:49.726963 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:49 crc kubenswrapper[4735]: I1008 12:56:49.727187 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:49 crc kubenswrapper[4735]: I1008 12:56:49.727393 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:49Z","lastTransitionTime":"2025-10-08T12:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:49 crc kubenswrapper[4735]: I1008 12:56:49.830101 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:49 crc kubenswrapper[4735]: I1008 12:56:49.830192 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:49 crc kubenswrapper[4735]: I1008 12:56:49.830213 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:49 crc kubenswrapper[4735]: I1008 12:56:49.830239 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:49 crc kubenswrapper[4735]: I1008 12:56:49.830257 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:49Z","lastTransitionTime":"2025-10-08T12:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:49 crc kubenswrapper[4735]: I1008 12:56:49.893995 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mldxj" Oct 08 12:56:49 crc kubenswrapper[4735]: E1008 12:56:49.894211 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mldxj" podUID="771d7ef1-3d58-44eb-89b5-43a5da3e8091" Oct 08 12:56:49 crc kubenswrapper[4735]: I1008 12:56:49.933688 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:49 crc kubenswrapper[4735]: I1008 12:56:49.933759 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:49 crc kubenswrapper[4735]: I1008 12:56:49.933778 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:49 crc kubenswrapper[4735]: I1008 12:56:49.933805 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:49 crc kubenswrapper[4735]: I1008 12:56:49.933826 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:49Z","lastTransitionTime":"2025-10-08T12:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:50 crc kubenswrapper[4735]: I1008 12:56:50.037036 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:50 crc kubenswrapper[4735]: I1008 12:56:50.037126 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:50 crc kubenswrapper[4735]: I1008 12:56:50.037150 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:50 crc kubenswrapper[4735]: I1008 12:56:50.037183 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:50 crc kubenswrapper[4735]: I1008 12:56:50.037208 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:50Z","lastTransitionTime":"2025-10-08T12:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:50 crc kubenswrapper[4735]: I1008 12:56:50.140256 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:50 crc kubenswrapper[4735]: I1008 12:56:50.140341 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:50 crc kubenswrapper[4735]: I1008 12:56:50.140362 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:50 crc kubenswrapper[4735]: I1008 12:56:50.140388 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:50 crc kubenswrapper[4735]: I1008 12:56:50.140409 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:50Z","lastTransitionTime":"2025-10-08T12:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:50 crc kubenswrapper[4735]: I1008 12:56:50.244566 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:50 crc kubenswrapper[4735]: I1008 12:56:50.244665 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:50 crc kubenswrapper[4735]: I1008 12:56:50.244683 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:50 crc kubenswrapper[4735]: I1008 12:56:50.244708 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:50 crc kubenswrapper[4735]: I1008 12:56:50.244727 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:50Z","lastTransitionTime":"2025-10-08T12:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:50 crc kubenswrapper[4735]: I1008 12:56:50.348229 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:50 crc kubenswrapper[4735]: I1008 12:56:50.348321 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:50 crc kubenswrapper[4735]: I1008 12:56:50.348347 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:50 crc kubenswrapper[4735]: I1008 12:56:50.348559 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:50 crc kubenswrapper[4735]: I1008 12:56:50.348620 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:50Z","lastTransitionTime":"2025-10-08T12:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:50 crc kubenswrapper[4735]: I1008 12:56:50.451969 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:50 crc kubenswrapper[4735]: I1008 12:56:50.452052 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:50 crc kubenswrapper[4735]: I1008 12:56:50.452064 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:50 crc kubenswrapper[4735]: I1008 12:56:50.452082 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:50 crc kubenswrapper[4735]: I1008 12:56:50.452094 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:50Z","lastTransitionTime":"2025-10-08T12:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:50 crc kubenswrapper[4735]: I1008 12:56:50.554819 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:50 crc kubenswrapper[4735]: I1008 12:56:50.554893 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:50 crc kubenswrapper[4735]: I1008 12:56:50.554911 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:50 crc kubenswrapper[4735]: I1008 12:56:50.554933 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:50 crc kubenswrapper[4735]: I1008 12:56:50.554952 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:50Z","lastTransitionTime":"2025-10-08T12:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:50 crc kubenswrapper[4735]: I1008 12:56:50.658922 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:50 crc kubenswrapper[4735]: I1008 12:56:50.658996 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:50 crc kubenswrapper[4735]: I1008 12:56:50.659034 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:50 crc kubenswrapper[4735]: I1008 12:56:50.659064 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:50 crc kubenswrapper[4735]: I1008 12:56:50.659086 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:50Z","lastTransitionTime":"2025-10-08T12:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:50 crc kubenswrapper[4735]: I1008 12:56:50.762184 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:50 crc kubenswrapper[4735]: I1008 12:56:50.762237 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:50 crc kubenswrapper[4735]: I1008 12:56:50.762249 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:50 crc kubenswrapper[4735]: I1008 12:56:50.762267 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:50 crc kubenswrapper[4735]: I1008 12:56:50.762280 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:50Z","lastTransitionTime":"2025-10-08T12:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:50 crc kubenswrapper[4735]: I1008 12:56:50.865137 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:50 crc kubenswrapper[4735]: I1008 12:56:50.865224 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:50 crc kubenswrapper[4735]: I1008 12:56:50.865250 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:50 crc kubenswrapper[4735]: I1008 12:56:50.865283 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:50 crc kubenswrapper[4735]: I1008 12:56:50.865307 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:50Z","lastTransitionTime":"2025-10-08T12:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:50 crc kubenswrapper[4735]: I1008 12:56:50.893687 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:56:50 crc kubenswrapper[4735]: I1008 12:56:50.893851 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 12:56:50 crc kubenswrapper[4735]: E1008 12:56:50.894103 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 12:56:50 crc kubenswrapper[4735]: I1008 12:56:50.894161 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:56:50 crc kubenswrapper[4735]: E1008 12:56:50.894366 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 12:56:50 crc kubenswrapper[4735]: E1008 12:56:50.894463 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 12:56:50 crc kubenswrapper[4735]: I1008 12:56:50.915903 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d289b739-3ef2-4ee7-bb9a-b5cd8ab2b999\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f9030479e031ff1768e372e6da2a0806dc393db70aed31961e3a9c89a9e789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-cer
ts\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://852a4dea1c1d8a40fc2ba89661ff17845ee58b62e45f4a7df9a8af505d361dd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e511b82acaea281ca468fbdce1eaa7ec3a5df59d91dc90fc500bebb08d51b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ce3b507093adcc6becc1a3b28e836174ee50ff4d3f583aa84badf4c1f3cb1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":
\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:50Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:50 crc kubenswrapper[4735]: I1008 12:56:50.932515 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b69f0d46aba2faee01906db32a5a6785e755b2eee94ccbaf1e438c63798f3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:50Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:50 crc kubenswrapper[4735]: I1008 12:56:50.950383 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:50Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:50 crc kubenswrapper[4735]: I1008 12:56:50.966645 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qdvjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28749736-d31e-40f2-84e8-a7839d8da376\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://680500d4224fd49593a4fa7294d1c49f059f330b2be33e3ef9ceccc6bd9b9a6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcv62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qdvjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:50Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:50 crc kubenswrapper[4735]: I1008 12:56:50.968535 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:50 crc kubenswrapper[4735]: I1008 12:56:50.968611 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:50 crc kubenswrapper[4735]: I1008 12:56:50.968627 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:50 crc kubenswrapper[4735]: I1008 12:56:50.968644 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:50 crc kubenswrapper[4735]: I1008 12:56:50.968678 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:50Z","lastTransitionTime":"2025-10-08T12:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:50 crc kubenswrapper[4735]: I1008 12:56:50.982887 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:50Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:51 crc kubenswrapper[4735]: I1008 12:56:51.006275 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28bb195cc66fff1263a0e92d27d57e97c81b63f873cb5e85d5271ac4fad40a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e88fb0202a08e60ff9cb6528688a4312ec5433491918326a5b46a5fb5e1de888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:51Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:51 crc kubenswrapper[4735]: I1008 12:56:51.023507 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tfdqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33cd5d891ac473da94cdc0e5c4bcfcbe2ef79a76d13f87ad077df153496ea515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qwbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tfdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:51Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:51 crc kubenswrapper[4735]: I1008 12:56:51.034857 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d8pqk" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d50458a5-b349-4384-9451-69dcb6f4e3d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29452c460590f3b977d7a9a86ec5bc26ed6d08054087cc55308bb8a0e6bcf966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kls7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ecd6dd31
1d753fc93739fd3f2da3af63485387305d10887aa1b5425b0c7dbd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kls7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d8pqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:51Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:51 crc kubenswrapper[4735]: I1008 12:56:51.048088 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mldxj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"771d7ef1-3d58-44eb-89b5-43a5da3e8091\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk2wl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk2wl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mldxj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:51Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:51 crc 
kubenswrapper[4735]: I1008 12:56:51.065334 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jnmqc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed550b4-694a-4a43-99fc-f869224d6cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0a97189f5acb11a4baa7838d67360d6606941daf7520c583dd596cfab265203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qjms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jnmqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:51Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:51 crc kubenswrapper[4735]: I1008 12:56:51.071407 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:51 crc kubenswrapper[4735]: I1008 12:56:51.071449 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:51 crc kubenswrapper[4735]: I1008 12:56:51.071458 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:51 crc kubenswrapper[4735]: I1008 12:56:51.071526 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:51 crc kubenswrapper[4735]: I1008 12:56:51.071544 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:51Z","lastTransitionTime":"2025-10-08T12:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:51 crc kubenswrapper[4735]: I1008 12:56:51.081294 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tsff9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0464374-65ce-46f6-9e7e-79018b92aff4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce56f02b4d3ab37c171c10113c79b74660cbdc570b55bd951baae65ef4276946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e01ac6683194ace06c37ecdbcda8584329034bc8fd0e6c532dcc6d7a036a72b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e01ac6683194ace06c37ecdbcda8584329034bc8fd0e6c532dcc6d7a036a72b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e086e1263cfb47daf4b8a395b1335f2e575f9d8c060983ba7090bbe27033d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://48e086e1263cfb47daf4b8a395b1335f2e575f9d8c060983ba7090bbe27033d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aceb87b6bfa4020ca777a104430a285d886fdd8fb37b710f754eac6ad1b4304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aceb87b6bfa4020ca777a104430a285d886fdd8fb37b710f754eac6ad1b4304d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895f5b597faf09af3dfd2688b3e42c2f1713b07e7d78c8d81e856dc3364727f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895f5b597faf09af3dfd2688b3e42c2f1713b07e7d78c8d81e856dc3364727f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://196d2d4913959ac507043de8431ea9294020c429fa0810c9985c77a23f52ad78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://196d2d4913959ac507043de8431ea9294020c429fa0810c9985c77a23f52ad78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b5ebc8f08568ab230481a7570dcdb0c6ed926f2216ef10de368d835811ef129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b5ebc8f08568ab230481a7570dcdb0c6ed926f2216ef10de368d835811ef129\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tsff9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:51Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:51 crc kubenswrapper[4735]: I1008 12:56:51.107436 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"844f0143-e7a8-48c9-ae8e-753c5c293f9d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e268579ace7ae7961414b2650b6b55471a9fd6574196c5fa2cefbf6979464b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e620e3a2b5b27ac9469d3ca045be90d19b04f969422d4c2a453650a7d06f0918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e7006d4b199ac7450c0e968adf9714db905f5378e279db304db244d0e5e566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72f60bc244cbde5bb8033c24051f07fd6a2a168d97d89d3abcfaac69e6d75878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa627e3bfab38d53beee2a3201f6856009919d567108614667503273d11d0f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:51Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:51 crc kubenswrapper[4735]: I1008 12:56:51.122988 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d7f6479-86cc-40be-8b44-cb1c10627aa2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc2f1c56c7be63150786ec037b77127e2df6071efa7d736d402bca8a011ea32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e2c83b125f3789eeff231ab5416af4461e2443e6abaf28dac1cd5a8d5356287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://302c54532d7a142b5a9483334e74f1d96b0f07920d9c3fcb589bf2b52d1c7124\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6272fc7157705e3f1cd86a22b80f722dc7d152785bcc8029e131cd03568a19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a92ee1544b61eda7e486e807f489c1f07cfc70870a2413575ac443203670542\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T12:56:10Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 12:56:04.373062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 12:56:04.379553 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3954726057/tls.crt::/tmp/serving-cert-3954726057/tls.key\\\\\\\"\\\\nI1008 12:56:10.064922 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 12:56:10.069814 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 12:56:10.069844 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 12:56:10.069873 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 12:56:10.069878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 12:56:10.082030 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 12:56:10.082115 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082144 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 12:56:10.082156 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 12:56:10.082166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 12:56:10.082177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1008 12:56:10.082039 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1008 12:56:10.085350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f517b378c42cea209cb39c077a6162a3569671ed2c8f91ae7b0b2c411af287a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87
ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:51Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:51 crc kubenswrapper[4735]: I1008 12:56:51.134865 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aed4e152-1619-49cb-8449-bc39785f556b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af1083a706e809f20ad64584fcbdfee479bb7014566eab93c89d94c67e013c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2326bc324b30300d3c85d19e5434de503b355b4e29fb72a13c8ab8d4884899c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9cf1a23dd55079fc635d7547125bc23b35d999c371c0bc3dd9e7e4a7c0c9fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e77c463c65fc07f8e0c2c8dd054b9dd96103da7426bbf1cd6e002436018231b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e77c463c65fc07f8e0c2c8dd054b9dd96103da7426bbf1cd6e002436018231b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:51Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:51 crc kubenswrapper[4735]: I1008 12:56:51.145491 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:51Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:51 crc kubenswrapper[4735]: I1008 12:56:51.154422 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc6814f3ae991f4c4576eaafb7f325021fe4ef3387b84948efb093150a7974d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T12:56:51Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:51 crc kubenswrapper[4735]: I1008 12:56:51.163365 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2dfd0474-f64a-4d3b-9990-df182d079b10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4321c24e125d489ad4c28ca906b5143f7017d14e0424cb188a1a031fd07de603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fefb46d9936f914f5667f539f69ddaaf85342cc341db62c3eda312aa07a78d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9qwgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:51Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:51 crc kubenswrapper[4735]: I1008 12:56:51.173578 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:51 crc kubenswrapper[4735]: I1008 
12:56:51.173613 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:51 crc kubenswrapper[4735]: I1008 12:56:51.173623 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:51 crc kubenswrapper[4735]: I1008 12:56:51.173640 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:51 crc kubenswrapper[4735]: I1008 12:56:51.173653 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:51Z","lastTransitionTime":"2025-10-08T12:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:51 crc kubenswrapper[4735]: I1008 12:56:51.183700 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653370-cd31-4487-8b63-50b145f55853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6da8f9945b0943287cecc7782253cd88835d3a5a823ae5e2107423b9bdc4e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fc9ee7ab9490b5462704528e6923ee420e0c9f3c0ed22e7041fedc8dfdfeb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://492da61345c2220eef06ac2e1979e5402ebb8deb0e3747bb3620ecccd4e83a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58cc5c76f1748baef586618dd49d3392757ef32e4f28a019979f84366e8879c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb1bc51dca9b3bf6c136fe146aa422fa40eb512385617607f682723053988f7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc88f8b567ef0f69f9df25a237c40b32bbd518680da75fe3cabd546713423bf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://591cd8e8896add07ee229791aae75932033be8070379a70d7a0c3f42176c6357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://591cd8e8896add07ee229791aae75932033be8070379a70d7a0c3f42176c6357\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T12:56:38Z\\\",\\\"message\\\":\\\"ere column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: 
Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1008 12:56:38.817327 6363 services_controller.go:454] Service openshift-authentication-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1008 12:56:38.818364 6363 ovnkube_controller.go:900] Cache entry expected pod with UID \\\\\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\\\\\" but failed to find it\\\\nI1008 12:56:38.818678 6363 ovnkube_controller.go:804] Add Logical Switch Port event expected pod with UID \\\\\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\\\\\" in cache\\\\nI1008 12:56:38.818681 6363 port_cache.go:96] port-cache(openshift-network-console_networking-console-plugin-85b44fc459-gdk6g): added port \\\\u0026{name:openshift-network-console_networking-console-plugin-85b44fc459-gdk6g uuid:c94130be-172c-477c-88c4-40cc7eba30fe logicalSwitch:crc ips:[0xc008a013e0] mac:[10 88 10 217 0 92] expires:{wall:0 ext:0 loc:\\\\u003cnil\\\\u003e}} with IP: [10.217.0.92/23] and MAC: 0a:58:0a:d9:00:5c\\\\nI1008 12:56:38.818690 6363 ovnkube_controller.go:900] Cache entry expected pod with UID \\\\\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\\\\\" but failed to find it\\\\nI1008 12:56:38.818694 6363 pods.go:252] [openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] addLogi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wckcs_openshift-ovn-kubernetes(c4653370-cd31-4487-8b63-50b145f55853)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1758c6a3e050fee720b9cae06332bc1d34000d2a61aa27470c483dc8757a5546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e39f9fb2f655437c3
aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wckcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:51Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:51 crc kubenswrapper[4735]: I1008 12:56:51.275973 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:51 crc kubenswrapper[4735]: I1008 12:56:51.276009 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:51 crc kubenswrapper[4735]: I1008 12:56:51.276017 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:51 crc kubenswrapper[4735]: I1008 12:56:51.276031 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:51 crc kubenswrapper[4735]: I1008 12:56:51.276041 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:51Z","lastTransitionTime":"2025-10-08T12:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:51 crc kubenswrapper[4735]: I1008 12:56:51.379188 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:51 crc kubenswrapper[4735]: I1008 12:56:51.379282 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:51 crc kubenswrapper[4735]: I1008 12:56:51.379305 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:51 crc kubenswrapper[4735]: I1008 12:56:51.379335 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:51 crc kubenswrapper[4735]: I1008 12:56:51.379358 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:51Z","lastTransitionTime":"2025-10-08T12:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:51 crc kubenswrapper[4735]: I1008 12:56:51.482638 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:51 crc kubenswrapper[4735]: I1008 12:56:51.482686 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:51 crc kubenswrapper[4735]: I1008 12:56:51.482706 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:51 crc kubenswrapper[4735]: I1008 12:56:51.482730 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:51 crc kubenswrapper[4735]: I1008 12:56:51.482747 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:51Z","lastTransitionTime":"2025-10-08T12:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:51 crc kubenswrapper[4735]: I1008 12:56:51.586272 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:51 crc kubenswrapper[4735]: I1008 12:56:51.586331 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:51 crc kubenswrapper[4735]: I1008 12:56:51.586351 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:51 crc kubenswrapper[4735]: I1008 12:56:51.586373 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:51 crc kubenswrapper[4735]: I1008 12:56:51.586386 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:51Z","lastTransitionTime":"2025-10-08T12:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:51 crc kubenswrapper[4735]: I1008 12:56:51.689299 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:51 crc kubenswrapper[4735]: I1008 12:56:51.689334 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:51 crc kubenswrapper[4735]: I1008 12:56:51.689347 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:51 crc kubenswrapper[4735]: I1008 12:56:51.689363 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:51 crc kubenswrapper[4735]: I1008 12:56:51.689373 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:51Z","lastTransitionTime":"2025-10-08T12:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:51 crc kubenswrapper[4735]: I1008 12:56:51.791969 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:51 crc kubenswrapper[4735]: I1008 12:56:51.792007 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:51 crc kubenswrapper[4735]: I1008 12:56:51.792017 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:51 crc kubenswrapper[4735]: I1008 12:56:51.792033 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:51 crc kubenswrapper[4735]: I1008 12:56:51.792044 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:51Z","lastTransitionTime":"2025-10-08T12:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:51 crc kubenswrapper[4735]: I1008 12:56:51.893868 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mldxj" Oct 08 12:56:51 crc kubenswrapper[4735]: E1008 12:56:51.894125 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mldxj" podUID="771d7ef1-3d58-44eb-89b5-43a5da3e8091" Oct 08 12:56:51 crc kubenswrapper[4735]: I1008 12:56:51.895227 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:51 crc kubenswrapper[4735]: I1008 12:56:51.895309 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:51 crc kubenswrapper[4735]: I1008 12:56:51.895339 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:51 crc kubenswrapper[4735]: I1008 12:56:51.895373 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:51 crc kubenswrapper[4735]: I1008 12:56:51.895398 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:51Z","lastTransitionTime":"2025-10-08T12:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:51 crc kubenswrapper[4735]: I1008 12:56:51.998273 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:51 crc kubenswrapper[4735]: I1008 12:56:51.998323 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:51 crc kubenswrapper[4735]: I1008 12:56:51.998335 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:51 crc kubenswrapper[4735]: I1008 12:56:51.998353 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:51 crc kubenswrapper[4735]: I1008 12:56:51.998366 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:51Z","lastTransitionTime":"2025-10-08T12:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:52 crc kubenswrapper[4735]: I1008 12:56:52.101623 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:52 crc kubenswrapper[4735]: I1008 12:56:52.101680 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:52 crc kubenswrapper[4735]: I1008 12:56:52.101699 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:52 crc kubenswrapper[4735]: I1008 12:56:52.101729 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:52 crc kubenswrapper[4735]: I1008 12:56:52.101747 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:52Z","lastTransitionTime":"2025-10-08T12:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:52 crc kubenswrapper[4735]: I1008 12:56:52.205116 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:52 crc kubenswrapper[4735]: I1008 12:56:52.205166 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:52 crc kubenswrapper[4735]: I1008 12:56:52.205183 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:52 crc kubenswrapper[4735]: I1008 12:56:52.205199 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:52 crc kubenswrapper[4735]: I1008 12:56:52.205212 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:52Z","lastTransitionTime":"2025-10-08T12:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:52 crc kubenswrapper[4735]: I1008 12:56:52.308237 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:52 crc kubenswrapper[4735]: I1008 12:56:52.308299 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:52 crc kubenswrapper[4735]: I1008 12:56:52.308319 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:52 crc kubenswrapper[4735]: I1008 12:56:52.308341 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:52 crc kubenswrapper[4735]: I1008 12:56:52.308359 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:52Z","lastTransitionTime":"2025-10-08T12:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:52 crc kubenswrapper[4735]: I1008 12:56:52.411462 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:52 crc kubenswrapper[4735]: I1008 12:56:52.411537 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:52 crc kubenswrapper[4735]: I1008 12:56:52.411553 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:52 crc kubenswrapper[4735]: I1008 12:56:52.411575 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:52 crc kubenswrapper[4735]: I1008 12:56:52.411594 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:52Z","lastTransitionTime":"2025-10-08T12:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:52 crc kubenswrapper[4735]: I1008 12:56:52.514511 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:52 crc kubenswrapper[4735]: I1008 12:56:52.514574 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:52 crc kubenswrapper[4735]: I1008 12:56:52.514590 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:52 crc kubenswrapper[4735]: I1008 12:56:52.514611 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:52 crc kubenswrapper[4735]: I1008 12:56:52.514625 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:52Z","lastTransitionTime":"2025-10-08T12:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:52 crc kubenswrapper[4735]: I1008 12:56:52.617859 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:52 crc kubenswrapper[4735]: I1008 12:56:52.617917 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:52 crc kubenswrapper[4735]: I1008 12:56:52.617938 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:52 crc kubenswrapper[4735]: I1008 12:56:52.617965 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:52 crc kubenswrapper[4735]: I1008 12:56:52.617983 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:52Z","lastTransitionTime":"2025-10-08T12:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:52 crc kubenswrapper[4735]: I1008 12:56:52.721465 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:52 crc kubenswrapper[4735]: I1008 12:56:52.721577 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:52 crc kubenswrapper[4735]: I1008 12:56:52.721595 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:52 crc kubenswrapper[4735]: I1008 12:56:52.721623 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:52 crc kubenswrapper[4735]: I1008 12:56:52.721640 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:52Z","lastTransitionTime":"2025-10-08T12:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:52 crc kubenswrapper[4735]: I1008 12:56:52.824036 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:52 crc kubenswrapper[4735]: I1008 12:56:52.824093 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:52 crc kubenswrapper[4735]: I1008 12:56:52.824105 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:52 crc kubenswrapper[4735]: I1008 12:56:52.824123 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:52 crc kubenswrapper[4735]: I1008 12:56:52.824142 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:52Z","lastTransitionTime":"2025-10-08T12:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:52 crc kubenswrapper[4735]: I1008 12:56:52.893830 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 12:56:52 crc kubenswrapper[4735]: I1008 12:56:52.893842 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:56:52 crc kubenswrapper[4735]: E1008 12:56:52.894013 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 12:56:52 crc kubenswrapper[4735]: I1008 12:56:52.894029 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:56:52 crc kubenswrapper[4735]: E1008 12:56:52.894205 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 12:56:52 crc kubenswrapper[4735]: E1008 12:56:52.894327 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 12:56:52 crc kubenswrapper[4735]: I1008 12:56:52.895382 4735 scope.go:117] "RemoveContainer" containerID="591cd8e8896add07ee229791aae75932033be8070379a70d7a0c3f42176c6357" Oct 08 12:56:52 crc kubenswrapper[4735]: E1008 12:56:52.895656 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-wckcs_openshift-ovn-kubernetes(c4653370-cd31-4487-8b63-50b145f55853)\"" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" podUID="c4653370-cd31-4487-8b63-50b145f55853" Oct 08 12:56:52 crc kubenswrapper[4735]: I1008 12:56:52.927212 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:52 crc kubenswrapper[4735]: I1008 12:56:52.927270 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:52 crc kubenswrapper[4735]: I1008 12:56:52.927293 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:52 crc kubenswrapper[4735]: I1008 12:56:52.927320 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:52 crc kubenswrapper[4735]: I1008 12:56:52.927343 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:52Z","lastTransitionTime":"2025-10-08T12:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:53 crc kubenswrapper[4735]: I1008 12:56:53.031423 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:53 crc kubenswrapper[4735]: I1008 12:56:53.031508 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:53 crc kubenswrapper[4735]: I1008 12:56:53.031526 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:53 crc kubenswrapper[4735]: I1008 12:56:53.031551 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:53 crc kubenswrapper[4735]: I1008 12:56:53.031571 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:53Z","lastTransitionTime":"2025-10-08T12:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:53 crc kubenswrapper[4735]: I1008 12:56:53.133992 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:53 crc kubenswrapper[4735]: I1008 12:56:53.134035 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:53 crc kubenswrapper[4735]: I1008 12:56:53.134044 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:53 crc kubenswrapper[4735]: I1008 12:56:53.134058 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:53 crc kubenswrapper[4735]: I1008 12:56:53.134069 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:53Z","lastTransitionTime":"2025-10-08T12:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:53 crc kubenswrapper[4735]: I1008 12:56:53.236320 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:53 crc kubenswrapper[4735]: I1008 12:56:53.236366 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:53 crc kubenswrapper[4735]: I1008 12:56:53.236376 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:53 crc kubenswrapper[4735]: I1008 12:56:53.236395 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:53 crc kubenswrapper[4735]: I1008 12:56:53.236408 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:53Z","lastTransitionTime":"2025-10-08T12:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:53 crc kubenswrapper[4735]: I1008 12:56:53.339440 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:53 crc kubenswrapper[4735]: I1008 12:56:53.339553 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:53 crc kubenswrapper[4735]: I1008 12:56:53.339579 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:53 crc kubenswrapper[4735]: I1008 12:56:53.339611 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:53 crc kubenswrapper[4735]: I1008 12:56:53.339634 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:53Z","lastTransitionTime":"2025-10-08T12:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:53 crc kubenswrapper[4735]: I1008 12:56:53.443201 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:53 crc kubenswrapper[4735]: I1008 12:56:53.443269 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:53 crc kubenswrapper[4735]: I1008 12:56:53.443289 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:53 crc kubenswrapper[4735]: I1008 12:56:53.443316 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:53 crc kubenswrapper[4735]: I1008 12:56:53.443339 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:53Z","lastTransitionTime":"2025-10-08T12:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:53 crc kubenswrapper[4735]: I1008 12:56:53.546720 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:53 crc kubenswrapper[4735]: I1008 12:56:53.546787 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:53 crc kubenswrapper[4735]: I1008 12:56:53.546800 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:53 crc kubenswrapper[4735]: I1008 12:56:53.546845 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:53 crc kubenswrapper[4735]: I1008 12:56:53.546861 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:53Z","lastTransitionTime":"2025-10-08T12:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:53 crc kubenswrapper[4735]: I1008 12:56:53.650079 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:53 crc kubenswrapper[4735]: I1008 12:56:53.650114 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:53 crc kubenswrapper[4735]: I1008 12:56:53.650123 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:53 crc kubenswrapper[4735]: I1008 12:56:53.650136 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:53 crc kubenswrapper[4735]: I1008 12:56:53.650146 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:53Z","lastTransitionTime":"2025-10-08T12:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:53 crc kubenswrapper[4735]: I1008 12:56:53.752174 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:53 crc kubenswrapper[4735]: I1008 12:56:53.752228 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:53 crc kubenswrapper[4735]: I1008 12:56:53.752237 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:53 crc kubenswrapper[4735]: I1008 12:56:53.752250 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:53 crc kubenswrapper[4735]: I1008 12:56:53.752260 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:53Z","lastTransitionTime":"2025-10-08T12:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:53 crc kubenswrapper[4735]: I1008 12:56:53.855753 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:53 crc kubenswrapper[4735]: I1008 12:56:53.855828 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:53 crc kubenswrapper[4735]: I1008 12:56:53.855852 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:53 crc kubenswrapper[4735]: I1008 12:56:53.855880 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:53 crc kubenswrapper[4735]: I1008 12:56:53.855902 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:53Z","lastTransitionTime":"2025-10-08T12:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:53 crc kubenswrapper[4735]: I1008 12:56:53.893632 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mldxj" Oct 08 12:56:53 crc kubenswrapper[4735]: E1008 12:56:53.893803 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mldxj" podUID="771d7ef1-3d58-44eb-89b5-43a5da3e8091" Oct 08 12:56:53 crc kubenswrapper[4735]: I1008 12:56:53.959353 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:53 crc kubenswrapper[4735]: I1008 12:56:53.959402 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:53 crc kubenswrapper[4735]: I1008 12:56:53.959413 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:53 crc kubenswrapper[4735]: I1008 12:56:53.959431 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:53 crc kubenswrapper[4735]: I1008 12:56:53.959445 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:53Z","lastTransitionTime":"2025-10-08T12:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:54 crc kubenswrapper[4735]: I1008 12:56:54.062301 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:54 crc kubenswrapper[4735]: I1008 12:56:54.062364 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:54 crc kubenswrapper[4735]: I1008 12:56:54.062382 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:54 crc kubenswrapper[4735]: I1008 12:56:54.062407 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:54 crc kubenswrapper[4735]: I1008 12:56:54.062427 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:54Z","lastTransitionTime":"2025-10-08T12:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:54 crc kubenswrapper[4735]: I1008 12:56:54.165614 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:54 crc kubenswrapper[4735]: I1008 12:56:54.165670 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:54 crc kubenswrapper[4735]: I1008 12:56:54.165688 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:54 crc kubenswrapper[4735]: I1008 12:56:54.165714 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:54 crc kubenswrapper[4735]: I1008 12:56:54.165733 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:54Z","lastTransitionTime":"2025-10-08T12:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:54 crc kubenswrapper[4735]: I1008 12:56:54.268341 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:54 crc kubenswrapper[4735]: I1008 12:56:54.268411 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:54 crc kubenswrapper[4735]: I1008 12:56:54.268436 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:54 crc kubenswrapper[4735]: I1008 12:56:54.268460 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:54 crc kubenswrapper[4735]: I1008 12:56:54.268516 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:54Z","lastTransitionTime":"2025-10-08T12:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:54 crc kubenswrapper[4735]: I1008 12:56:54.370731 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:54 crc kubenswrapper[4735]: I1008 12:56:54.370785 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:54 crc kubenswrapper[4735]: I1008 12:56:54.370800 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:54 crc kubenswrapper[4735]: I1008 12:56:54.370827 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:54 crc kubenswrapper[4735]: I1008 12:56:54.370840 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:54Z","lastTransitionTime":"2025-10-08T12:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:54 crc kubenswrapper[4735]: I1008 12:56:54.472858 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:54 crc kubenswrapper[4735]: I1008 12:56:54.472922 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:54 crc kubenswrapper[4735]: I1008 12:56:54.472934 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:54 crc kubenswrapper[4735]: I1008 12:56:54.472950 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:54 crc kubenswrapper[4735]: I1008 12:56:54.472963 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:54Z","lastTransitionTime":"2025-10-08T12:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:54 crc kubenswrapper[4735]: I1008 12:56:54.575046 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:54 crc kubenswrapper[4735]: I1008 12:56:54.575083 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:54 crc kubenswrapper[4735]: I1008 12:56:54.575094 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:54 crc kubenswrapper[4735]: I1008 12:56:54.575113 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:54 crc kubenswrapper[4735]: I1008 12:56:54.575124 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:54Z","lastTransitionTime":"2025-10-08T12:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:54 crc kubenswrapper[4735]: I1008 12:56:54.677426 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:54 crc kubenswrapper[4735]: I1008 12:56:54.677494 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:54 crc kubenswrapper[4735]: I1008 12:56:54.677507 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:54 crc kubenswrapper[4735]: I1008 12:56:54.677523 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:54 crc kubenswrapper[4735]: I1008 12:56:54.677534 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:54Z","lastTransitionTime":"2025-10-08T12:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:54 crc kubenswrapper[4735]: I1008 12:56:54.780502 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:54 crc kubenswrapper[4735]: I1008 12:56:54.780557 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:54 crc kubenswrapper[4735]: I1008 12:56:54.780568 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:54 crc kubenswrapper[4735]: I1008 12:56:54.780585 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:54 crc kubenswrapper[4735]: I1008 12:56:54.780600 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:54Z","lastTransitionTime":"2025-10-08T12:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:54 crc kubenswrapper[4735]: I1008 12:56:54.883400 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:54 crc kubenswrapper[4735]: I1008 12:56:54.883445 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:54 crc kubenswrapper[4735]: I1008 12:56:54.883460 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:54 crc kubenswrapper[4735]: I1008 12:56:54.883504 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:54 crc kubenswrapper[4735]: I1008 12:56:54.883523 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:54Z","lastTransitionTime":"2025-10-08T12:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:54 crc kubenswrapper[4735]: I1008 12:56:54.894337 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:56:54 crc kubenswrapper[4735]: E1008 12:56:54.894511 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 12:56:54 crc kubenswrapper[4735]: I1008 12:56:54.894750 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 12:56:54 crc kubenswrapper[4735]: I1008 12:56:54.894851 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:56:54 crc kubenswrapper[4735]: E1008 12:56:54.894990 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 12:56:54 crc kubenswrapper[4735]: E1008 12:56:54.895091 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 12:56:54 crc kubenswrapper[4735]: I1008 12:56:54.986008 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:54 crc kubenswrapper[4735]: I1008 12:56:54.986046 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:54 crc kubenswrapper[4735]: I1008 12:56:54.986055 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:54 crc kubenswrapper[4735]: I1008 12:56:54.986069 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:54 crc kubenswrapper[4735]: I1008 12:56:54.986081 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:54Z","lastTransitionTime":"2025-10-08T12:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:55 crc kubenswrapper[4735]: I1008 12:56:55.088223 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:55 crc kubenswrapper[4735]: I1008 12:56:55.088268 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:55 crc kubenswrapper[4735]: I1008 12:56:55.088280 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:55 crc kubenswrapper[4735]: I1008 12:56:55.088297 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:55 crc kubenswrapper[4735]: I1008 12:56:55.088309 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:55Z","lastTransitionTime":"2025-10-08T12:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:55 crc kubenswrapper[4735]: I1008 12:56:55.190683 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:55 crc kubenswrapper[4735]: I1008 12:56:55.190721 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:55 crc kubenswrapper[4735]: I1008 12:56:55.190729 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:55 crc kubenswrapper[4735]: I1008 12:56:55.190749 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:55 crc kubenswrapper[4735]: I1008 12:56:55.190761 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:55Z","lastTransitionTime":"2025-10-08T12:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:55 crc kubenswrapper[4735]: I1008 12:56:55.292583 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:55 crc kubenswrapper[4735]: I1008 12:56:55.292625 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:55 crc kubenswrapper[4735]: I1008 12:56:55.292648 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:55 crc kubenswrapper[4735]: I1008 12:56:55.292666 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:55 crc kubenswrapper[4735]: I1008 12:56:55.292675 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:55Z","lastTransitionTime":"2025-10-08T12:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:55 crc kubenswrapper[4735]: I1008 12:56:55.398456 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:55 crc kubenswrapper[4735]: I1008 12:56:55.398527 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:55 crc kubenswrapper[4735]: I1008 12:56:55.398542 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:55 crc kubenswrapper[4735]: I1008 12:56:55.398559 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:55 crc kubenswrapper[4735]: I1008 12:56:55.398571 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:55Z","lastTransitionTime":"2025-10-08T12:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:55 crc kubenswrapper[4735]: I1008 12:56:55.500779 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:55 crc kubenswrapper[4735]: I1008 12:56:55.500841 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:55 crc kubenswrapper[4735]: I1008 12:56:55.500858 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:55 crc kubenswrapper[4735]: I1008 12:56:55.500884 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:55 crc kubenswrapper[4735]: I1008 12:56:55.500904 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:55Z","lastTransitionTime":"2025-10-08T12:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:55 crc kubenswrapper[4735]: I1008 12:56:55.603809 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:55 crc kubenswrapper[4735]: I1008 12:56:55.603860 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:55 crc kubenswrapper[4735]: I1008 12:56:55.603869 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:55 crc kubenswrapper[4735]: I1008 12:56:55.603885 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:55 crc kubenswrapper[4735]: I1008 12:56:55.603896 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:55Z","lastTransitionTime":"2025-10-08T12:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:55 crc kubenswrapper[4735]: I1008 12:56:55.706277 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:55 crc kubenswrapper[4735]: I1008 12:56:55.706312 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:55 crc kubenswrapper[4735]: I1008 12:56:55.706330 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:55 crc kubenswrapper[4735]: I1008 12:56:55.706347 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:55 crc kubenswrapper[4735]: I1008 12:56:55.706356 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:55Z","lastTransitionTime":"2025-10-08T12:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:55 crc kubenswrapper[4735]: I1008 12:56:55.808950 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:55 crc kubenswrapper[4735]: I1008 12:56:55.808979 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:55 crc kubenswrapper[4735]: I1008 12:56:55.808988 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:55 crc kubenswrapper[4735]: I1008 12:56:55.809001 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:55 crc kubenswrapper[4735]: I1008 12:56:55.809022 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:55Z","lastTransitionTime":"2025-10-08T12:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:55 crc kubenswrapper[4735]: I1008 12:56:55.894311 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mldxj" Oct 08 12:56:55 crc kubenswrapper[4735]: E1008 12:56:55.894496 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mldxj" podUID="771d7ef1-3d58-44eb-89b5-43a5da3e8091" Oct 08 12:56:55 crc kubenswrapper[4735]: I1008 12:56:55.911971 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:55 crc kubenswrapper[4735]: I1008 12:56:55.912085 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:55 crc kubenswrapper[4735]: I1008 12:56:55.912094 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:55 crc kubenswrapper[4735]: I1008 12:56:55.912108 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:55 crc kubenswrapper[4735]: I1008 12:56:55.912116 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:55Z","lastTransitionTime":"2025-10-08T12:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:56 crc kubenswrapper[4735]: I1008 12:56:56.014070 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:56 crc kubenswrapper[4735]: I1008 12:56:56.014110 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:56 crc kubenswrapper[4735]: I1008 12:56:56.014119 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:56 crc kubenswrapper[4735]: I1008 12:56:56.014134 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:56 crc kubenswrapper[4735]: I1008 12:56:56.014144 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:56Z","lastTransitionTime":"2025-10-08T12:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:56 crc kubenswrapper[4735]: I1008 12:56:56.116370 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:56 crc kubenswrapper[4735]: I1008 12:56:56.116408 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:56 crc kubenswrapper[4735]: I1008 12:56:56.116418 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:56 crc kubenswrapper[4735]: I1008 12:56:56.116434 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:56 crc kubenswrapper[4735]: I1008 12:56:56.116445 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:56Z","lastTransitionTime":"2025-10-08T12:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:56 crc kubenswrapper[4735]: I1008 12:56:56.218926 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:56 crc kubenswrapper[4735]: I1008 12:56:56.218957 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:56 crc kubenswrapper[4735]: I1008 12:56:56.218965 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:56 crc kubenswrapper[4735]: I1008 12:56:56.218978 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:56 crc kubenswrapper[4735]: I1008 12:56:56.218987 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:56Z","lastTransitionTime":"2025-10-08T12:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:56 crc kubenswrapper[4735]: I1008 12:56:56.321421 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:56 crc kubenswrapper[4735]: I1008 12:56:56.321455 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:56 crc kubenswrapper[4735]: I1008 12:56:56.321464 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:56 crc kubenswrapper[4735]: I1008 12:56:56.321493 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:56 crc kubenswrapper[4735]: I1008 12:56:56.321507 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:56Z","lastTransitionTime":"2025-10-08T12:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:56 crc kubenswrapper[4735]: I1008 12:56:56.424677 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:56 crc kubenswrapper[4735]: I1008 12:56:56.425886 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:56 crc kubenswrapper[4735]: I1008 12:56:56.425958 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:56 crc kubenswrapper[4735]: I1008 12:56:56.426024 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:56 crc kubenswrapper[4735]: I1008 12:56:56.426100 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:56Z","lastTransitionTime":"2025-10-08T12:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:56 crc kubenswrapper[4735]: I1008 12:56:56.528824 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:56 crc kubenswrapper[4735]: I1008 12:56:56.528856 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:56 crc kubenswrapper[4735]: I1008 12:56:56.528866 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:56 crc kubenswrapper[4735]: I1008 12:56:56.528881 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:56 crc kubenswrapper[4735]: I1008 12:56:56.528914 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:56Z","lastTransitionTime":"2025-10-08T12:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:56 crc kubenswrapper[4735]: I1008 12:56:56.631163 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:56 crc kubenswrapper[4735]: I1008 12:56:56.631375 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:56 crc kubenswrapper[4735]: I1008 12:56:56.631535 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:56 crc kubenswrapper[4735]: I1008 12:56:56.631700 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:56 crc kubenswrapper[4735]: I1008 12:56:56.631850 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:56Z","lastTransitionTime":"2025-10-08T12:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:56 crc kubenswrapper[4735]: I1008 12:56:56.733872 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:56 crc kubenswrapper[4735]: I1008 12:56:56.733919 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:56 crc kubenswrapper[4735]: I1008 12:56:56.733928 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:56 crc kubenswrapper[4735]: I1008 12:56:56.733943 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:56 crc kubenswrapper[4735]: I1008 12:56:56.733953 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:56Z","lastTransitionTime":"2025-10-08T12:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:56 crc kubenswrapper[4735]: I1008 12:56:56.836497 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:56 crc kubenswrapper[4735]: I1008 12:56:56.836536 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:56 crc kubenswrapper[4735]: I1008 12:56:56.836547 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:56 crc kubenswrapper[4735]: I1008 12:56:56.836562 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:56 crc kubenswrapper[4735]: I1008 12:56:56.836573 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:56Z","lastTransitionTime":"2025-10-08T12:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:56 crc kubenswrapper[4735]: I1008 12:56:56.894451 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 12:56:56 crc kubenswrapper[4735]: I1008 12:56:56.894511 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:56:56 crc kubenswrapper[4735]: E1008 12:56:56.894587 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 12:56:56 crc kubenswrapper[4735]: E1008 12:56:56.894758 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 12:56:56 crc kubenswrapper[4735]: I1008 12:56:56.894896 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:56:56 crc kubenswrapper[4735]: E1008 12:56:56.895078 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 12:56:56 crc kubenswrapper[4735]: I1008 12:56:56.939077 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:56 crc kubenswrapper[4735]: I1008 12:56:56.939109 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:56 crc kubenswrapper[4735]: I1008 12:56:56.939118 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:56 crc kubenswrapper[4735]: I1008 12:56:56.939131 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:56 crc kubenswrapper[4735]: I1008 12:56:56.939142 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:56Z","lastTransitionTime":"2025-10-08T12:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:57 crc kubenswrapper[4735]: I1008 12:56:57.041700 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:57 crc kubenswrapper[4735]: I1008 12:56:57.041744 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:57 crc kubenswrapper[4735]: I1008 12:56:57.041755 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:57 crc kubenswrapper[4735]: I1008 12:56:57.041771 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:57 crc kubenswrapper[4735]: I1008 12:56:57.041784 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:57Z","lastTransitionTime":"2025-10-08T12:56:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:57 crc kubenswrapper[4735]: I1008 12:56:57.147590 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:57 crc kubenswrapper[4735]: I1008 12:56:57.147653 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:57 crc kubenswrapper[4735]: I1008 12:56:57.147675 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:57 crc kubenswrapper[4735]: I1008 12:56:57.147702 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:57 crc kubenswrapper[4735]: I1008 12:56:57.147723 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:57Z","lastTransitionTime":"2025-10-08T12:56:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:57 crc kubenswrapper[4735]: I1008 12:56:57.249886 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:57 crc kubenswrapper[4735]: I1008 12:56:57.249970 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:57 crc kubenswrapper[4735]: I1008 12:56:57.249994 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:57 crc kubenswrapper[4735]: I1008 12:56:57.250009 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:57 crc kubenswrapper[4735]: I1008 12:56:57.250020 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:57Z","lastTransitionTime":"2025-10-08T12:56:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:57 crc kubenswrapper[4735]: I1008 12:56:57.352376 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:57 crc kubenswrapper[4735]: I1008 12:56:57.352413 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:57 crc kubenswrapper[4735]: I1008 12:56:57.352425 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:57 crc kubenswrapper[4735]: I1008 12:56:57.352442 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:57 crc kubenswrapper[4735]: I1008 12:56:57.352453 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:57Z","lastTransitionTime":"2025-10-08T12:56:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:57 crc kubenswrapper[4735]: I1008 12:56:57.454690 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:57 crc kubenswrapper[4735]: I1008 12:56:57.454719 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:57 crc kubenswrapper[4735]: I1008 12:56:57.454728 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:57 crc kubenswrapper[4735]: I1008 12:56:57.454740 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:57 crc kubenswrapper[4735]: I1008 12:56:57.454749 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:57Z","lastTransitionTime":"2025-10-08T12:56:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:57 crc kubenswrapper[4735]: I1008 12:56:57.557729 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:57 crc kubenswrapper[4735]: I1008 12:56:57.557770 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:57 crc kubenswrapper[4735]: I1008 12:56:57.557783 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:57 crc kubenswrapper[4735]: I1008 12:56:57.557799 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:57 crc kubenswrapper[4735]: I1008 12:56:57.557812 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:57Z","lastTransitionTime":"2025-10-08T12:56:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:57 crc kubenswrapper[4735]: I1008 12:56:57.661009 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:57 crc kubenswrapper[4735]: I1008 12:56:57.661049 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:57 crc kubenswrapper[4735]: I1008 12:56:57.661057 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:57 crc kubenswrapper[4735]: I1008 12:56:57.661072 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:57 crc kubenswrapper[4735]: I1008 12:56:57.661083 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:57Z","lastTransitionTime":"2025-10-08T12:56:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:57 crc kubenswrapper[4735]: I1008 12:56:57.763823 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:57 crc kubenswrapper[4735]: I1008 12:56:57.763865 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:57 crc kubenswrapper[4735]: I1008 12:56:57.763877 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:57 crc kubenswrapper[4735]: I1008 12:56:57.763894 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:57 crc kubenswrapper[4735]: I1008 12:56:57.763908 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:57Z","lastTransitionTime":"2025-10-08T12:56:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:57 crc kubenswrapper[4735]: I1008 12:56:57.866875 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:57 crc kubenswrapper[4735]: I1008 12:56:57.866992 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:57 crc kubenswrapper[4735]: I1008 12:56:57.867007 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:57 crc kubenswrapper[4735]: I1008 12:56:57.867023 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:57 crc kubenswrapper[4735]: I1008 12:56:57.867070 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:57Z","lastTransitionTime":"2025-10-08T12:56:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:57 crc kubenswrapper[4735]: I1008 12:56:57.894546 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mldxj" Oct 08 12:56:57 crc kubenswrapper[4735]: E1008 12:56:57.894717 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mldxj" podUID="771d7ef1-3d58-44eb-89b5-43a5da3e8091" Oct 08 12:56:57 crc kubenswrapper[4735]: I1008 12:56:57.969795 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:57 crc kubenswrapper[4735]: I1008 12:56:57.969856 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:57 crc kubenswrapper[4735]: I1008 12:56:57.969868 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:57 crc kubenswrapper[4735]: I1008 12:56:57.969913 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:57 crc kubenswrapper[4735]: I1008 12:56:57.969931 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:57Z","lastTransitionTime":"2025-10-08T12:56:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.005885 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.005934 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.005952 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.005977 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.005995 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:58Z","lastTransitionTime":"2025-10-08T12:56:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:58 crc kubenswrapper[4735]: E1008 12:56:58.028463 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0d020dc1-103d-4051-9317-7ecb0eb982ec\\\",\\\"systemUUID\\\":\\\"14046600-b993-4087-9f7d-570e1efe09a8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:58Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.033950 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.033982 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.033991 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.034003 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.034012 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:58Z","lastTransitionTime":"2025-10-08T12:56:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:58 crc kubenswrapper[4735]: E1008 12:56:58.054273 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0d020dc1-103d-4051-9317-7ecb0eb982ec\\\",\\\"systemUUID\\\":\\\"14046600-b993-4087-9f7d-570e1efe09a8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:58Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.059116 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.059146 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.059155 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.059172 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.059181 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:58Z","lastTransitionTime":"2025-10-08T12:56:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:58 crc kubenswrapper[4735]: E1008 12:56:58.075936 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0d020dc1-103d-4051-9317-7ecb0eb982ec\\\",\\\"systemUUID\\\":\\\"14046600-b993-4087-9f7d-570e1efe09a8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:58Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.080024 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.080074 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.080087 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.080106 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.080119 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:58Z","lastTransitionTime":"2025-10-08T12:56:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:58 crc kubenswrapper[4735]: E1008 12:56:58.093804 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0d020dc1-103d-4051-9317-7ecb0eb982ec\\\",\\\"systemUUID\\\":\\\"14046600-b993-4087-9f7d-570e1efe09a8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:58Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.098030 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.098069 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.098078 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.098090 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.098099 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:58Z","lastTransitionTime":"2025-10-08T12:56:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:58 crc kubenswrapper[4735]: E1008 12:56:58.108636 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:56:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0d020dc1-103d-4051-9317-7ecb0eb982ec\\\",\\\"systemUUID\\\":\\\"14046600-b993-4087-9f7d-570e1efe09a8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:56:58Z is after 2025-08-24T17:21:41Z" Oct 08 12:56:58 crc kubenswrapper[4735]: E1008 12:56:58.108762 4735 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.111161 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.111214 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.111223 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.111245 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.111257 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:58Z","lastTransitionTime":"2025-10-08T12:56:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.213011 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/771d7ef1-3d58-44eb-89b5-43a5da3e8091-metrics-certs\") pod \"network-metrics-daemon-mldxj\" (UID: \"771d7ef1-3d58-44eb-89b5-43a5da3e8091\") " pod="openshift-multus/network-metrics-daemon-mldxj" Oct 08 12:56:58 crc kubenswrapper[4735]: E1008 12:56:58.213168 4735 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 12:56:58 crc kubenswrapper[4735]: E1008 12:56:58.213222 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/771d7ef1-3d58-44eb-89b5-43a5da3e8091-metrics-certs podName:771d7ef1-3d58-44eb-89b5-43a5da3e8091 nodeName:}" failed. No retries permitted until 2025-10-08 12:57:30.213209285 +0000 UTC m=+100.021513829 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/771d7ef1-3d58-44eb-89b5-43a5da3e8091-metrics-certs") pod "network-metrics-daemon-mldxj" (UID: "771d7ef1-3d58-44eb-89b5-43a5da3e8091") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.213744 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.213776 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.213788 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.213806 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.213818 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:58Z","lastTransitionTime":"2025-10-08T12:56:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.315317 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.315352 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.315359 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.315374 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.315386 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:58Z","lastTransitionTime":"2025-10-08T12:56:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.418011 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.418055 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.418065 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.418079 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.418089 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:58Z","lastTransitionTime":"2025-10-08T12:56:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.520610 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.520644 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.520654 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.520669 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.520681 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:58Z","lastTransitionTime":"2025-10-08T12:56:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.623373 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.623412 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.623420 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.623433 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.623441 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:58Z","lastTransitionTime":"2025-10-08T12:56:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.726379 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.726440 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.726451 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.726490 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.726506 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:58Z","lastTransitionTime":"2025-10-08T12:56:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.829868 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.829930 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.829949 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.829984 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.830268 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:58Z","lastTransitionTime":"2025-10-08T12:56:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.894654 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 12:56:58 crc kubenswrapper[4735]: E1008 12:56:58.894785 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.895028 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:56:58 crc kubenswrapper[4735]: E1008 12:56:58.895126 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.895425 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:56:58 crc kubenswrapper[4735]: E1008 12:56:58.895527 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.932353 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.932411 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.932423 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.932443 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:58 crc kubenswrapper[4735]: I1008 12:56:58.932456 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:58Z","lastTransitionTime":"2025-10-08T12:56:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:59 crc kubenswrapper[4735]: I1008 12:56:59.035124 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:59 crc kubenswrapper[4735]: I1008 12:56:59.035195 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:59 crc kubenswrapper[4735]: I1008 12:56:59.035213 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:59 crc kubenswrapper[4735]: I1008 12:56:59.035238 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:59 crc kubenswrapper[4735]: I1008 12:56:59.035258 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:59Z","lastTransitionTime":"2025-10-08T12:56:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:59 crc kubenswrapper[4735]: I1008 12:56:59.138082 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:59 crc kubenswrapper[4735]: I1008 12:56:59.138135 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:59 crc kubenswrapper[4735]: I1008 12:56:59.138149 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:59 crc kubenswrapper[4735]: I1008 12:56:59.138166 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:59 crc kubenswrapper[4735]: I1008 12:56:59.138177 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:59Z","lastTransitionTime":"2025-10-08T12:56:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:59 crc kubenswrapper[4735]: I1008 12:56:59.240671 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:59 crc kubenswrapper[4735]: I1008 12:56:59.240729 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:59 crc kubenswrapper[4735]: I1008 12:56:59.240743 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:59 crc kubenswrapper[4735]: I1008 12:56:59.240766 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:59 crc kubenswrapper[4735]: I1008 12:56:59.240783 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:59Z","lastTransitionTime":"2025-10-08T12:56:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:59 crc kubenswrapper[4735]: I1008 12:56:59.343327 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:59 crc kubenswrapper[4735]: I1008 12:56:59.343372 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:59 crc kubenswrapper[4735]: I1008 12:56:59.343386 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:59 crc kubenswrapper[4735]: I1008 12:56:59.343406 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:59 crc kubenswrapper[4735]: I1008 12:56:59.343422 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:59Z","lastTransitionTime":"2025-10-08T12:56:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:59 crc kubenswrapper[4735]: I1008 12:56:59.445420 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:59 crc kubenswrapper[4735]: I1008 12:56:59.445520 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:59 crc kubenswrapper[4735]: I1008 12:56:59.445541 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:59 crc kubenswrapper[4735]: I1008 12:56:59.445570 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:59 crc kubenswrapper[4735]: I1008 12:56:59.445608 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:59Z","lastTransitionTime":"2025-10-08T12:56:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:59 crc kubenswrapper[4735]: I1008 12:56:59.547955 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:59 crc kubenswrapper[4735]: I1008 12:56:59.548007 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:59 crc kubenswrapper[4735]: I1008 12:56:59.548025 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:59 crc kubenswrapper[4735]: I1008 12:56:59.548050 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:59 crc kubenswrapper[4735]: I1008 12:56:59.548071 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:59Z","lastTransitionTime":"2025-10-08T12:56:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:59 crc kubenswrapper[4735]: I1008 12:56:59.651088 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:59 crc kubenswrapper[4735]: I1008 12:56:59.651415 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:59 crc kubenswrapper[4735]: I1008 12:56:59.651426 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:59 crc kubenswrapper[4735]: I1008 12:56:59.651443 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:59 crc kubenswrapper[4735]: I1008 12:56:59.651455 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:59Z","lastTransitionTime":"2025-10-08T12:56:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:59 crc kubenswrapper[4735]: I1008 12:56:59.754009 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:59 crc kubenswrapper[4735]: I1008 12:56:59.754058 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:59 crc kubenswrapper[4735]: I1008 12:56:59.754069 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:59 crc kubenswrapper[4735]: I1008 12:56:59.754089 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:59 crc kubenswrapper[4735]: I1008 12:56:59.754100 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:59Z","lastTransitionTime":"2025-10-08T12:56:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:56:59 crc kubenswrapper[4735]: I1008 12:56:59.857343 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:59 crc kubenswrapper[4735]: I1008 12:56:59.857469 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:59 crc kubenswrapper[4735]: I1008 12:56:59.857512 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:59 crc kubenswrapper[4735]: I1008 12:56:59.857538 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:59 crc kubenswrapper[4735]: I1008 12:56:59.857555 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:59Z","lastTransitionTime":"2025-10-08T12:56:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:56:59 crc kubenswrapper[4735]: I1008 12:56:59.894144 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mldxj" Oct 08 12:56:59 crc kubenswrapper[4735]: E1008 12:56:59.894726 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mldxj" podUID="771d7ef1-3d58-44eb-89b5-43a5da3e8091" Oct 08 12:56:59 crc kubenswrapper[4735]: I1008 12:56:59.960317 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:56:59 crc kubenswrapper[4735]: I1008 12:56:59.960356 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:56:59 crc kubenswrapper[4735]: I1008 12:56:59.960366 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:56:59 crc kubenswrapper[4735]: I1008 12:56:59.960382 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:56:59 crc kubenswrapper[4735]: I1008 12:56:59.960392 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:56:59Z","lastTransitionTime":"2025-10-08T12:56:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.063432 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.063500 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.063517 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.063538 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.063551 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:00Z","lastTransitionTime":"2025-10-08T12:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.165956 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.166013 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.166025 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.166043 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.166054 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:00Z","lastTransitionTime":"2025-10-08T12:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.268459 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.268532 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.268547 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.268564 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.268577 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:00Z","lastTransitionTime":"2025-10-08T12:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.308455 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tfdqx_ef6f0aff-a0e5-477a-9392-71bc49bfa7aa/kube-multus/0.log" Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.308543 4735 generic.go:334] "Generic (PLEG): container finished" podID="ef6f0aff-a0e5-477a-9392-71bc49bfa7aa" containerID="33cd5d891ac473da94cdc0e5c4bcfcbe2ef79a76d13f87ad077df153496ea515" exitCode=1 Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.308581 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tfdqx" event={"ID":"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa","Type":"ContainerDied","Data":"33cd5d891ac473da94cdc0e5c4bcfcbe2ef79a76d13f87ad077df153496ea515"} Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.309054 4735 scope.go:117] "RemoveContainer" containerID="33cd5d891ac473da94cdc0e5c4bcfcbe2ef79a76d13f87ad077df153496ea515" Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.341546 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653370-cd31-4487-8b63-50b145f55853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6da8f9945b0943287cecc7782253cd88835d3a5a823ae5e2107423b9bdc4e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fc9ee7ab9490b5462704528e6923ee420e0c9f3c0ed22e7041fedc8dfdfeb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://492da61345c2220eef06ac2e1979e5402ebb8deb0e3747bb3620ecccd4e83a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58cc5c76f1748baef586618dd49d3392757ef32e4f28a019979f84366e8879c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb1bc51dca9b3bf6c136fe146aa422fa40eb512385617607f682723053988f7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc88f8b567ef0f69f9df25a237c40b32bbd518680da75fe3cabd546713423bf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://591cd8e8896add07ee229791aae75932033be8070379a70d7a0c3f42176c6357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://591cd8e8896add07ee229791aae75932033be8070379a70d7a0c3f42176c6357\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T12:56:38Z\\\",\\\"message\\\":\\\"ere column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: 
Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1008 12:56:38.817327 6363 services_controller.go:454] Service openshift-authentication-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1008 12:56:38.818364 6363 ovnkube_controller.go:900] Cache entry expected pod with UID \\\\\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\\\\\" but failed to find it\\\\nI1008 12:56:38.818678 6363 ovnkube_controller.go:804] Add Logical Switch Port event expected pod with UID \\\\\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\\\\\" in cache\\\\nI1008 12:56:38.818681 6363 port_cache.go:96] port-cache(openshift-network-console_networking-console-plugin-85b44fc459-gdk6g): added port \\\\u0026{name:openshift-network-console_networking-console-plugin-85b44fc459-gdk6g uuid:c94130be-172c-477c-88c4-40cc7eba30fe logicalSwitch:crc ips:[0xc008a013e0] mac:[10 88 10 217 0 92] expires:{wall:0 ext:0 loc:\\\\u003cnil\\\\u003e}} with IP: [10.217.0.92/23] and MAC: 0a:58:0a:d9:00:5c\\\\nI1008 12:56:38.818690 6363 ovnkube_controller.go:900] Cache entry expected pod with UID \\\\\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\\\\\" but failed to find it\\\\nI1008 12:56:38.818694 6363 pods.go:252] [openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] addLogi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wckcs_openshift-ovn-kubernetes(c4653370-cd31-4487-8b63-50b145f55853)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1758c6a3e050fee720b9cae06332bc1d34000d2a61aa27470c483dc8757a5546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e39f9fb2f655437c3
aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wckcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:00Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.354868 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tsff9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0464374-65ce-46f6-9e7e-79018b92aff4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce56f02b4d3ab37c171c10113c79b74660cbdc570b55bd951baae65ef4276946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e01ac6683194ace06c37ecdbcda8584329034bc8fd0e6c532dcc6d7a036a72b3\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e01ac6683194ace06c37ecdbcda8584329034bc8fd0e6c532dcc6d7a036a72b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e086e1263cfb47daf4b8a395b1335f2e575f9d8c060983ba7090bbe27033d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48e086e1263cfb47daf4b8a395b1335f2e575f9d8c060983ba7090bbe27033d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aceb87b6bfa4020ca777a104430a285d886fdd8fb37b710f754eac6ad1b4304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aceb87b6bfa4020ca777a104430a285d886fdd8fb37b710f754eac6ad1b4304d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895f5
b597faf09af3dfd2688b3e42c2f1713b07e7d78c8d81e856dc3364727f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895f5b597faf09af3dfd2688b3e42c2f1713b07e7d78c8d81e856dc3364727f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://196d2d4913959ac507043de8431ea9294020c429fa0810c9985c77a23f52ad78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://196d2d4913959ac507043de8431ea9294020c429fa0810c9985c77a23f52ad78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:17Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b5ebc8f08568ab230481a7570dcdb0c6ed926f2216ef10de368d835811ef129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b5ebc8f08568ab230481a7570dcdb0c6ed926f2216ef10de368d835811ef129\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tsff9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:00Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.371648 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.371706 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.371719 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.371742 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.371756 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:00Z","lastTransitionTime":"2025-10-08T12:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.376569 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"844f0143-e7a8-48c9-ae8e-753c5c293f9d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e268579ace7ae7961414b2650b6b55471a9fd6574196c5fa2cefbf6979464b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e620e3a2b5b27ac9469d3ca045be90d19b04f969422d4c2a453650a7d06f0918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e7006d4b199ac7450c0e968adf9714db905f5378e279db304db244d0e5e566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72f60bc244cbde5bb8033c24051f07fd6a2a168d97d89d3abcfaac69e6d75878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa627e3bfab38d53beee2a3201f6856009919d567108614667503273d11d0f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-08T12:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:00Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.398180 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d7f6479-86cc-40be-8b44-cb1c10627aa2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc2f1c56c7be63150786ec037b77127e2df6071efa7d736d402bca8a011ea32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e2c83b125f3789eeff231ab5416af4461e2443e6abaf28dac1cd5a8d5356287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://302c54532d7a142b5a9483334e74f1d96b0f07920d9c3fcb589bf2b52d1c7124\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6272fc7157705e3f1cd86a22b80f722dc7d152785bcc8029e131cd03568a19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a92ee1544b61eda7e486e807f489c1f07cfc70870a2413575ac443203670542\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T12:56:10Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 12:56:04.373062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 12:56:04.379553 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3954726057/tls.crt::/tmp/serving-cert-3954726057/tls.key\\\\\\\"\\\\nI1008 12:56:10.064922 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 12:56:10.069814 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 12:56:10.069844 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 12:56:10.069873 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 12:56:10.069878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 12:56:10.082030 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 12:56:10.082115 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082144 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 12:56:10.082156 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 12:56:10.082166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 12:56:10.082177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1008 12:56:10.082039 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1008 12:56:10.085350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f517b378c42cea209cb39c077a6162a3569671ed2c8f91ae7b0b2c411af287a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87
ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:00Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.412314 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aed4e152-1619-49cb-8449-bc39785f556b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af1083a706e809f20ad64584fcbdfee479bb7014566eab93c89d94c67e013c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2326bc324b30300d3c85d19e5434de503b355b4e29fb72a13c8ab8d4884899c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9cf1a23dd55079fc635d7547125bc23b35d999c371c0bc3dd9e7e4a7c0c9fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e77c463c65fc07f8e0c2c8dd054b9dd96103da7426bbf1cd6e002436018231b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e77c463c65fc07f8e0c2c8dd054b9dd96103da7426bbf1cd6e002436018231b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:00Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.426288 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:00Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.438889 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc6814f3ae991f4c4576eaafb7f325021fe4ef3387b84948efb093150a7974d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T12:57:00Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.453572 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2dfd0474-f64a-4d3b-9990-df182d079b10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4321c24e125d489ad4c28ca906b5143f7017d14e0424cb188a1a031fd07de603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fefb46d9936f914f5667f539f69ddaaf85342cc341db62c3eda312aa07a78d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9qwgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:00Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.467793 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d289b739-3ef2-4ee7-bb9a-b5cd8ab2b999\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f9030479e031ff1768e372e6da2a0806dc393db70aed31961e3a9c89a9e789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://852a4dea1c1d8a40fc2ba89661ff17845ee58b62e45f4a7df9a8af505d361dd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e511b82acaea281ca468fbdce1eaa7ec3a5df59d91dc90fc500bebb08d51b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ce3b507093adcc6becc1a3b28e836174ee50ff4d3f583aa84badf4c1f3cb1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:00Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.474316 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.474339 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.474348 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.474362 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.474370 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:00Z","lastTransitionTime":"2025-10-08T12:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.482458 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b69f0d46aba2faee01906db32a5a6785e755b2eee94ccbaf1e438c63798f3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:00Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.495263 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:00Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.507457 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qdvjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28749736-d31e-40f2-84e8-a7839d8da376\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://680500d4224fd49593a4fa7294d1c49f059f330b2be33e3ef9ceccc6bd9b9a6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcv62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qdvjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:00Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.521600 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:00Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.535297 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28bb195cc66fff1263a0e92d27d57e97c81b63f873cb5e85d5271ac4fad40a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e88fb0202a08e60ff9cb6528688a4312ec5433491918326a5b46a5fb5e1de888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:00Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.553866 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tfdqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:57:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:57:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33cd5d891ac473da94cdc0e5c4bcfcbe2ef79a76d13f87ad077df153496ea515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33cd5d891ac473da94cdc0e5c4bcfcbe2ef79a76d13f87ad077df153496ea515\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T12:56:59Z\\\",\\\"message\\\":\\\"2025-10-08T12:56:14+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_3ad7249b-2c19-4671-b951-a27dbb0f1daa\\\\n2025-10-08T12:56:14+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_3ad7249b-2c19-4671-b951-a27dbb0f1daa to /host/opt/cni/bin/\\\\n2025-10-08T12:56:14Z [verbose] multus-daemon started\\\\n2025-10-08T12:56:14Z [verbose] Readiness Indicator file check\\\\n2025-10-08T12:56:59Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qwbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tfdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:00Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.568428 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d8pqk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d50458a5-b349-4384-9451-69dcb6f4e3d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29452c460590f3b977d7a9a86ec5bc26ed6d08054087cc55308bb8a0e6bcf966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kls7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ecd6dd311d753fc93739fd3f2da3af634853
87305d10887aa1b5425b0c7dbd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kls7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d8pqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:00Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.576540 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.576590 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.576606 4735 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.576628 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.576642 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:00Z","lastTransitionTime":"2025-10-08T12:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.581628 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mldxj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"771d7ef1-3d58-44eb-89b5-43a5da3e8091\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk2wl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk2wl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mldxj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:00Z is after 2025-08-24T17:21:41Z" Oct 
08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.593803 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jnmqc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed550b4-694a-4a43-99fc-f869224d6cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0a97189f5acb11a4baa7838d67360d6606941daf7520c583dd596cfab265203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qjms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jnmqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:00Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.679176 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.679213 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.679221 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.679280 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.679289 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:00Z","lastTransitionTime":"2025-10-08T12:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.782005 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.782071 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.782084 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.782111 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.782126 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:00Z","lastTransitionTime":"2025-10-08T12:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.884389 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.884422 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.884433 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.884449 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.884460 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:00Z","lastTransitionTime":"2025-10-08T12:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.894258 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 12:57:00 crc kubenswrapper[4735]: E1008 12:57:00.894374 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.894604 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:57:00 crc kubenswrapper[4735]: E1008 12:57:00.894711 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.894727 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:57:00 crc kubenswrapper[4735]: E1008 12:57:00.894844 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.907214 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jnmqc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed550b4-694a-4a43-99fc-f869224d6cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0a97189f5acb11a4baa7838d67360d6606941daf7520c583dd596cfab265203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qjms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jnmqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:00Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.908762 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.936225 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653370-cd31-4487-8b63-50b145f55853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6da8f9945b0943287cecc7782253cd88835d3a5a823ae5e2107423b9bdc4e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fc9ee7ab9490b5462704528e6923ee420e0c9f3c0ed22e7041fedc8dfdfeb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://492da61345c2220eef06ac2e1979e5402ebb8deb0e3747bb3620ecccd4e83a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58cc5c76f1748baef586618dd49d3392757ef32e4f28a019979f84366e8879c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb1bc51dca9b3bf6c136fe146aa422fa40eb512385617607f682723053988f7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc88f8b567ef0f69f9df25a237c40b32bbd518680da75fe3cabd546713423bf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://591cd8e8896add07ee229791aae75932033be8070379a70d7a0c3f42176c6357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://591cd8e8896add07ee229791aae75932033be8070379a70d7a0c3f42176c6357\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T12:56:38Z\\\",\\\"message\\\":\\\"ere column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: 
Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1008 12:56:38.817327 6363 services_controller.go:454] Service openshift-authentication-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1008 12:56:38.818364 6363 ovnkube_controller.go:900] Cache entry expected pod with UID \\\\\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\\\\\" but failed to find it\\\\nI1008 12:56:38.818678 6363 ovnkube_controller.go:804] Add Logical Switch Port event expected pod with UID \\\\\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\\\\\" in cache\\\\nI1008 12:56:38.818681 6363 port_cache.go:96] port-cache(openshift-network-console_networking-console-plugin-85b44fc459-gdk6g): added port \\\\u0026{name:openshift-network-console_networking-console-plugin-85b44fc459-gdk6g uuid:c94130be-172c-477c-88c4-40cc7eba30fe logicalSwitch:crc ips:[0xc008a013e0] mac:[10 88 10 217 0 92] expires:{wall:0 ext:0 loc:\\\\u003cnil\\\\u003e}} with IP: [10.217.0.92/23] and MAC: 0a:58:0a:d9:00:5c\\\\nI1008 12:56:38.818690 6363 ovnkube_controller.go:900] Cache entry expected pod with UID \\\\\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\\\\\" but failed to find it\\\\nI1008 12:56:38.818694 6363 pods.go:252] [openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] addLogi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wckcs_openshift-ovn-kubernetes(c4653370-cd31-4487-8b63-50b145f55853)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1758c6a3e050fee720b9cae06332bc1d34000d2a61aa27470c483dc8757a5546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e39f9fb2f655437c3
aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wckcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:00Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.955268 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tsff9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0464374-65ce-46f6-9e7e-79018b92aff4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce56f02b4d3ab37c171c10113c79b74660cbdc570b55bd951baae65ef4276946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e01ac6683194ace06c37ecdbcda8584329034bc8fd0e6c532dcc6d7a036a72b3\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e01ac6683194ace06c37ecdbcda8584329034bc8fd0e6c532dcc6d7a036a72b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e086e1263cfb47daf4b8a395b1335f2e575f9d8c060983ba7090bbe27033d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48e086e1263cfb47daf4b8a395b1335f2e575f9d8c060983ba7090bbe27033d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aceb87b6bfa4020ca777a104430a285d886fdd8fb37b710f754eac6ad1b4304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aceb87b6bfa4020ca777a104430a285d886fdd8fb37b710f754eac6ad1b4304d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895f5
b597faf09af3dfd2688b3e42c2f1713b07e7d78c8d81e856dc3364727f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895f5b597faf09af3dfd2688b3e42c2f1713b07e7d78c8d81e856dc3364727f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://196d2d4913959ac507043de8431ea9294020c429fa0810c9985c77a23f52ad78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://196d2d4913959ac507043de8431ea9294020c429fa0810c9985c77a23f52ad78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:17Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b5ebc8f08568ab230481a7570dcdb0c6ed926f2216ef10de368d835811ef129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b5ebc8f08568ab230481a7570dcdb0c6ed926f2216ef10de368d835811ef129\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tsff9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:00Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.975347 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"844f0143-e7a8-48c9-ae8e-753c5c293f9d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e268579ace7ae7961414b2650b6b55471a9fd6574196c5fa2cefbf6979464b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e620e3a2b5b27ac9469d3ca045be90d19b04f969422d4c2a453650a7d06f0918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e7006d4b199ac7450c0e968adf9714db905f5378e279db304db244d0e5e566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72f60bc244cbde5bb
8033c24051f07fd6a2a168d97d89d3abcfaac69e6d75878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa627e3bfab38d53beee2a3201f6856009919d567108614667503273d11d0f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:00Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.996905 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.996941 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.996952 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.996967 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.996978 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:00Z","lastTransitionTime":"2025-10-08T12:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:57:00 crc kubenswrapper[4735]: I1008 12:57:00.998356 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d7f6479-86cc-40be-8b44-cb1c10627aa2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc2f1c56c7be63150786ec037b77127e2df6071efa7d736d402bca8a011ea32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e2c83b125f3789eeff231ab5416af4461e2443e6abaf28dac1cd5a8d5356287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://302c54532d7a142b5a9483334e74f1d96b0f07920d9c3fcb589bf2b52d1c7124\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1
a6272fc7157705e3f1cd86a22b80f722dc7d152785bcc8029e131cd03568a19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a92ee1544b61eda7e486e807f489c1f07cfc70870a2413575ac443203670542\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 12:56:04.373062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 12:56:04.379553 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3954726057/tls.crt::/tmp/serving-cert-3954726057/tls.key\\\\\\\"\\\\nI1008 12:56:10.064922 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 12:56:10.069814 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 12:56:10.069844 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 12:56:10.069873 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 12:56:10.069878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 12:56:10.082030 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 12:56:10.082115 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082129 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082144 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 12:56:10.082156 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 12:56:10.082166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 12:56:10.082177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1008 12:56:10.082039 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1008 12:56:10.085350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f517b378c42cea209cb39c077a6162a3569671ed2c8f91ae7b0b2c411af287a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{
\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:00Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.025499 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aed4e152-1619-49cb-8449-bc39785f556b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af1083a706e809f20ad64584fcbdfee479bb7014566eab93c89d94c67e013c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2326bc324b30300d3c85d19e5434de503b355b4e29fb72a13c8ab8d4884899c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9cf1a23dd55079fc635d7547125bc23b35d999c371c0bc3dd9e7e4a7c0c9fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e77c463c65fc07f8e0c2c8dd054b9dd96103da7426bbf1cd6e002436018231b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e77c463c65fc07f8e0c2c8dd054b9dd96103da7426bbf1cd6e002436018231b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:01Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.049346 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:01Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.065385 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc6814f3ae991f4c4576eaafb7f325021fe4ef3387b84948efb093150a7974d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T12:57:01Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.076965 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2dfd0474-f64a-4d3b-9990-df182d079b10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4321c24e125d489ad4c28ca906b5143f7017d14e0424cb188a1a031fd07de603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fefb46d9936f914f5667f539f69ddaaf85342cc341db62c3eda312aa07a78d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9qwgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:01Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.089468 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d289b739-3ef2-4ee7-bb9a-b5cd8ab2b999\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f9030479e031ff1768e372e6da2a0806dc393db70aed31961e3a9c89a9e789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://852a4dea1c1d8a40fc2ba89661ff17845ee58b62e45f4a7df9a8af505d361dd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e511b82acaea281ca468fbdce1eaa7ec3a5df59d91dc90fc500bebb08d51b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ce3b507093adcc6becc1a3b28e836174ee50ff4d3f583aa84badf4c1f3cb1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:01Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.100123 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.100188 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.100198 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.100211 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.100220 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:01Z","lastTransitionTime":"2025-10-08T12:57:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.105572 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b69f0d46aba2faee01906db32a5a6785e755b2eee94ccbaf1e438c63798f3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:01Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.118132 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:01Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.128267 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qdvjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28749736-d31e-40f2-84e8-a7839d8da376\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://680500d4224fd49593a4fa7294d1c49f059f330b2be33e3ef9ceccc6bd9b9a6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcv62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qdvjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:01Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.142060 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:01Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.154319 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28bb195cc66fff1263a0e92d27d57e97c81b63f873cb5e85d5271ac4fad40a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e88fb0202a08e60ff9cb6528688a4312ec5433491918326a5b46a5fb5e1de888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:01Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.170042 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tfdqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:57:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:57:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33cd5d891ac473da94cdc0e5c4bcfcbe2ef79a76d13f87ad077df153496ea515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33cd5d891ac473da94cdc0e5c4bcfcbe2ef79a76d13f87ad077df153496ea515\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T12:56:59Z\\\",\\\"message\\\":\\\"2025-10-08T12:56:14+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_3ad7249b-2c19-4671-b951-a27dbb0f1daa\\\\n2025-10-08T12:56:14+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_3ad7249b-2c19-4671-b951-a27dbb0f1daa to /host/opt/cni/bin/\\\\n2025-10-08T12:56:14Z [verbose] multus-daemon started\\\\n2025-10-08T12:56:14Z [verbose] Readiness Indicator file check\\\\n2025-10-08T12:56:59Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qwbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tfdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:01Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.182871 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d8pqk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d50458a5-b349-4384-9451-69dcb6f4e3d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29452c460590f3b977d7a9a86ec5bc26ed6d08054087cc55308bb8a0e6bcf966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kls7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ecd6dd311d753fc93739fd3f2da3af634853
87305d10887aa1b5425b0c7dbd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kls7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d8pqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:01Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.195296 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mldxj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"771d7ef1-3d58-44eb-89b5-43a5da3e8091\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk2wl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk2wl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mldxj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:01Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:01 crc 
kubenswrapper[4735]: I1008 12:57:01.203673 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.203717 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.203729 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.203747 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.203759 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:01Z","lastTransitionTime":"2025-10-08T12:57:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.306535 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.306606 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.306625 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.306650 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.306670 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:01Z","lastTransitionTime":"2025-10-08T12:57:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.314518 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tfdqx_ef6f0aff-a0e5-477a-9392-71bc49bfa7aa/kube-multus/0.log" Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.314685 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tfdqx" event={"ID":"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa","Type":"ContainerStarted","Data":"4a82f845f92e94e6df109ce1bf748570e0bcbe66989bc38894371604d541aa14"} Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.327635 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6287c74e-b672-43a2-adfc-1dc18d7bbd8b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa7259e4ff7dc49440079659b12b25117daf1733407e99f44156ec0c1165bd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef31
8bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d60ea228b3e0b96a8cdc6daa4a9e832d6b356c3e2b4413812cf1c03e9de76da2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d60ea228b3e0b96a8cdc6daa4a9e832d6b356c3e2b4413812cf1c03e9de76da2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:01Z is after 2025-08-24T17:21:41Z" 
Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.343633 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d289b739-3ef2-4ee7-bb9a-b5cd8ab2b999\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f9030479e031ff1768e372e6da2a0806dc393db70aed31961e3a9c89a9e789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://852a4dea1c1d8a40fc2ba89661ff17845ee58b62e45f4a7d
f9a8af505d361dd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e511b82acaea281ca468fbdce1eaa7ec3a5df59d91dc90fc500bebb08d51b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ce3b507093adcc6becc1a3b28e836174ee50ff4d3f583aa84badf4c1f3cb1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0be
d08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:01Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.361394 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b69f0d46aba2faee01906db32a5a6785e755b2eee94ccbaf1e438c63798f3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:01Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.376097 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:01Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.389104 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qdvjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28749736-d31e-40f2-84e8-a7839d8da376\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://680500d4224fd49593a4fa7294d1c49f059f330b2be33e3ef9ceccc6bd9b9a6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcv62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qdvjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:01Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.402224 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:01Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.410181 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.410239 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.410251 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.410271 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.410283 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:01Z","lastTransitionTime":"2025-10-08T12:57:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.416080 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28bb195cc66fff1263a0e92d27d57e97c81b63f873cb5e85d5271ac4fad40a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://e88fb0202a08e60ff9cb6528688a4312ec5433491918326a5b46a5fb5e1de888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:01Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.431513 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tfdqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a82f845f92e94e6df109ce1bf748570e0bcbe66989bc38894371604d541aa14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33cd5d891ac473da94cdc0e5c4bcfcbe2ef79a76d13f87ad077df153496ea515\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T12:56:59Z\\\",\\\"message\\\":\\\"2025-10-08T12:56:14+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_3ad7249b-2c19-4671-b951-a27dbb0f1daa\\\\n2025-10-08T12:56:14+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_3ad7249b-2c19-4671-b951-a27dbb0f1daa to /host/opt/cni/bin/\\\\n2025-10-08T12:56:14Z [verbose] multus-daemon started\\\\n2025-10-08T12:56:14Z [verbose] 
Readiness Indicator file check\\\\n2025-10-08T12:56:59Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qwbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tfdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:01Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.443537 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d8pqk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d50458a5-b349-4384-9451-69dcb6f4e3d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29452c460590f3b977d7a9a86ec5bc26ed6d08054087cc55308bb8a0e6bcf966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kls7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ecd6dd311d753fc93739fd3f2da3af634853
87305d10887aa1b5425b0c7dbd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kls7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d8pqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:01Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.457250 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mldxj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"771d7ef1-3d58-44eb-89b5-43a5da3e8091\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk2wl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk2wl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mldxj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:01Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:01 crc 
kubenswrapper[4735]: I1008 12:57:01.468897 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jnmqc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed550b4-694a-4a43-99fc-f869224d6cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0a97189f5acb11a4baa7838d67360d6606941daf7520c583dd596cfab265203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qjms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jnmqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:01Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.480637 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2dfd0474-f64a-4d3b-9990-df182d079b10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4321c24e125d489ad4c28ca906b5143f7017d14e0424cb188a1a031fd07de603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b
3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fefb46d9936f914f5667f539f69ddaaf85342cc341db62c3eda312aa07a78d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9qwgd\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:01Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.503896 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653370-cd31-4487-8b63-50b145f55853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6da8f9945b0943287cecc7782253cd88835d3a5a823ae5e2107423b9bdc4e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fc9ee7ab9490b5462704528e6923ee420e0c9f3c0ed22e7041fedc8dfdfeb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://492da61345c2220eef06ac2e1979e5402ebb8deb0e3747bb3620ecccd4e83a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58cc5c76f1748baef586618dd49d3392757ef32e4f28a019979f84366e8879c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb1bc51dca9b3bf6c136fe146aa422fa40eb512385617607f682723053988f7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc88f8b567ef0f69f9df25a237c40b32bbd518680da75fe3cabd546713423bf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://591cd8e8896add07ee229791aae75932033be8070379a70d7a0c3f42176c6357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://591cd8e8896add07ee229791aae75932033be8070379a70d7a0c3f42176c6357\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T12:56:38Z\\\",\\\"message\\\":\\\"ere column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1008 12:56:38.817327 6363 services_controller.go:454] Service openshift-authentication-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 
template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1008 12:56:38.818364 6363 ovnkube_controller.go:900] Cache entry expected pod with UID \\\\\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\\\\\" but failed to find it\\\\nI1008 12:56:38.818678 6363 ovnkube_controller.go:804] Add Logical Switch Port event expected pod with UID \\\\\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\\\\\" in cache\\\\nI1008 12:56:38.818681 6363 port_cache.go:96] port-cache(openshift-network-console_networking-console-plugin-85b44fc459-gdk6g): added port \\\\u0026{name:openshift-network-console_networking-console-plugin-85b44fc459-gdk6g uuid:c94130be-172c-477c-88c4-40cc7eba30fe logicalSwitch:crc ips:[0xc008a013e0] mac:[10 88 10 217 0 92] expires:{wall:0 ext:0 loc:\\\\u003cnil\\\\u003e}} with IP: [10.217.0.92/23] and MAC: 0a:58:0a:d9:00:5c\\\\nI1008 12:56:38.818690 6363 ovnkube_controller.go:900] Cache entry expected pod with UID \\\\\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\\\\\" but failed to find it\\\\nI1008 12:56:38.818694 6363 pods.go:252] [openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] addLogi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wckcs_openshift-ovn-kubernetes(c4653370-cd31-4487-8b63-50b145f55853)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1758c6a3e050fee720b9cae06332bc1d34000d2a61aa27470c483dc8757a5546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e39f9fb2f655437c3
aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wckcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:01Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.512461 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.512514 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.512524 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.512538 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.512547 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:01Z","lastTransitionTime":"2025-10-08T12:57:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.521109 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tsff9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0464374-65ce-46f6-9e7e-79018b92aff4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce56f02b4d3ab37c171c10113c79b74660cbdc570b55bd951baae65ef4276946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e01ac6683194ace06c37ecdbcda8584329034bc8fd0e6c532dcc6d7a036a72b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e01ac6683194ace06c37ecdbcda8584329034bc8fd0e6c532dcc6d7a036a72b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e086e1263cfb47daf4b8a395b1335f2e575f9d8c060983ba7090bbe27033d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48e086e1263cfb47daf4b8a395b1335f2e575f9d8c060983ba7090bbe27033d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aceb87b6bfa4020ca777a104430a285d886fdd8fb37b710f754eac6ad1b4304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aceb87b6bfa4020ca777a104430a285d886fdd8fb37b710f754eac6ad1b4304d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895f5b597faf09af3dfd2688b3e42c2f1713b07e7d78c8d81e856dc3364727f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895f5b597faf09af3dfd2688b3e42c2f1713b07e7d78c8d81e856dc3364727f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://196d2d4913959ac507043de8431ea9294020c429fa0810c9985c77a23f52ad78\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://196d2d4913959ac507043de8431ea9294020c429fa0810c9985c77a23f52ad78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b5ebc8f08568ab230481a7570dcdb0c6ed926f2216ef10de368d835811ef129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b5ebc8f08568ab230481a7570dcdb0c6ed926f2216ef10de368d835811ef129\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tsff9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:01Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.540308 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"844f0143-e7a8-48c9-ae8e-753c5c293f9d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e268579ace7ae7961414b2650b6b55471a9fd6574196c5fa2cefbf6979464b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e620e3a2b5b27ac9469d3ca045be90d19b04f969422d4c2a453650a7d06f0918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e7006d4b199ac7450c0e968adf9714db905f5378e279db304db244d0e5e566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72f60bc244cbde5bb8033c24051f07fd6a2a168d97d89d3abcfaac69e6d75878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa627e3bfab38d53beee2a3201f6856009919d567108614667503273d11d0f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:01Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.557918 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d7f6479-86cc-40be-8b44-cb1c10627aa2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc2f1c56c7be63150786ec037b77127e2df6071efa7d736d402bca8a011ea32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e2c83b125f3789eeff231ab5416af4461e2443e6abaf28dac1cd5a8d5356287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://302c54532d7a142b5a9483334e74f1d96b0f07920d9c3fcb589bf2b52d1c7124\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6272fc7157705e3f1cd86a22b80f722dc7d152785bcc8029e131cd03568a19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a92ee1544b61eda7e486e807f489c1f07cfc70870a2413575ac443203670542\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 12:56:04.373062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 12:56:04.379553 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3954726057/tls.crt::/tmp/serving-cert-3954726057/tls.key\\\\\\\"\\\\nI1008 12:56:10.064922 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 12:56:10.069814 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 12:56:10.069844 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 12:56:10.069873 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 12:56:10.069878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 12:56:10.082030 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 12:56:10.082115 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082144 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 12:56:10.082156 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 12:56:10.082166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 12:56:10.082177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1008 12:56:10.082039 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1008 12:56:10.085350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f517b378c42cea209cb39c077a6162a3569671ed2c8f91ae7b0b2c411af287a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:01Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.570096 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aed4e152-1619-49cb-8449-bc39785f556b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af1083a706e809f20ad64584fcbdfee479bb7014566eab93c89d94c67e013c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2326bc324b30300d3c85d19e5434de503b355b4e29fb72a13c8ab8d4884899c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9cf1a23dd55079fc635d7547125bc23b35d999c371c0bc3dd9e7e4a7c0c9fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e77c463c65fc07f8e0c2c8dd054b9dd96103da7426bbf1cd6e002436018231b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e77c463c65fc07f8e0c2c8dd054b9dd96103da7426bbf1cd6e002436018231b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:01Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.582454 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:01Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.597002 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc6814f3ae991f4c4576eaafb7f325021fe4ef3387b84948efb093150a7974d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T12:57:01Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.614561 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.614589 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.614602 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.614619 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.614629 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:01Z","lastTransitionTime":"2025-10-08T12:57:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.716826 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.716854 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.716863 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.716877 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.716885 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:01Z","lastTransitionTime":"2025-10-08T12:57:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.819496 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.819532 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.819546 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.819560 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.819569 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:01Z","lastTransitionTime":"2025-10-08T12:57:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.894285 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mldxj" Oct 08 12:57:01 crc kubenswrapper[4735]: E1008 12:57:01.894466 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mldxj" podUID="771d7ef1-3d58-44eb-89b5-43a5da3e8091" Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.922346 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.922372 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.922397 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.922409 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:01 crc kubenswrapper[4735]: I1008 12:57:01.922419 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:01Z","lastTransitionTime":"2025-10-08T12:57:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:02 crc kubenswrapper[4735]: I1008 12:57:02.024562 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:02 crc kubenswrapper[4735]: I1008 12:57:02.024596 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:02 crc kubenswrapper[4735]: I1008 12:57:02.024606 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:02 crc kubenswrapper[4735]: I1008 12:57:02.024621 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:02 crc kubenswrapper[4735]: I1008 12:57:02.024632 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:02Z","lastTransitionTime":"2025-10-08T12:57:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:02 crc kubenswrapper[4735]: I1008 12:57:02.127102 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:02 crc kubenswrapper[4735]: I1008 12:57:02.127134 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:02 crc kubenswrapper[4735]: I1008 12:57:02.127145 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:02 crc kubenswrapper[4735]: I1008 12:57:02.127162 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:02 crc kubenswrapper[4735]: I1008 12:57:02.127172 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:02Z","lastTransitionTime":"2025-10-08T12:57:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:02 crc kubenswrapper[4735]: I1008 12:57:02.229354 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:02 crc kubenswrapper[4735]: I1008 12:57:02.229387 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:02 crc kubenswrapper[4735]: I1008 12:57:02.229399 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:02 crc kubenswrapper[4735]: I1008 12:57:02.229415 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:02 crc kubenswrapper[4735]: I1008 12:57:02.229426 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:02Z","lastTransitionTime":"2025-10-08T12:57:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:02 crc kubenswrapper[4735]: I1008 12:57:02.332397 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:02 crc kubenswrapper[4735]: I1008 12:57:02.332434 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:02 crc kubenswrapper[4735]: I1008 12:57:02.332442 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:02 crc kubenswrapper[4735]: I1008 12:57:02.332455 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:02 crc kubenswrapper[4735]: I1008 12:57:02.332464 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:02Z","lastTransitionTime":"2025-10-08T12:57:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:02 crc kubenswrapper[4735]: I1008 12:57:02.435362 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:02 crc kubenswrapper[4735]: I1008 12:57:02.435400 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:02 crc kubenswrapper[4735]: I1008 12:57:02.435411 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:02 crc kubenswrapper[4735]: I1008 12:57:02.435425 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:02 crc kubenswrapper[4735]: I1008 12:57:02.435434 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:02Z","lastTransitionTime":"2025-10-08T12:57:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:02 crc kubenswrapper[4735]: I1008 12:57:02.540000 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:02 crc kubenswrapper[4735]: I1008 12:57:02.540187 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:02 crc kubenswrapper[4735]: I1008 12:57:02.540211 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:02 crc kubenswrapper[4735]: I1008 12:57:02.540369 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:02 crc kubenswrapper[4735]: I1008 12:57:02.540391 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:02Z","lastTransitionTime":"2025-10-08T12:57:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:02 crc kubenswrapper[4735]: I1008 12:57:02.645102 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:02 crc kubenswrapper[4735]: I1008 12:57:02.645169 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:02 crc kubenswrapper[4735]: I1008 12:57:02.645191 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:02 crc kubenswrapper[4735]: I1008 12:57:02.645220 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:02 crc kubenswrapper[4735]: I1008 12:57:02.645242 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:02Z","lastTransitionTime":"2025-10-08T12:57:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:02 crc kubenswrapper[4735]: I1008 12:57:02.747134 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:02 crc kubenswrapper[4735]: I1008 12:57:02.747190 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:02 crc kubenswrapper[4735]: I1008 12:57:02.747207 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:02 crc kubenswrapper[4735]: I1008 12:57:02.747231 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:02 crc kubenswrapper[4735]: I1008 12:57:02.747248 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:02Z","lastTransitionTime":"2025-10-08T12:57:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:02 crc kubenswrapper[4735]: I1008 12:57:02.849192 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:02 crc kubenswrapper[4735]: I1008 12:57:02.849231 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:02 crc kubenswrapper[4735]: I1008 12:57:02.849239 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:02 crc kubenswrapper[4735]: I1008 12:57:02.849254 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:02 crc kubenswrapper[4735]: I1008 12:57:02.849263 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:02Z","lastTransitionTime":"2025-10-08T12:57:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:57:02 crc kubenswrapper[4735]: I1008 12:57:02.894040 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:57:02 crc kubenswrapper[4735]: E1008 12:57:02.894215 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 12:57:02 crc kubenswrapper[4735]: I1008 12:57:02.894462 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 12:57:02 crc kubenswrapper[4735]: I1008 12:57:02.894563 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:57:02 crc kubenswrapper[4735]: E1008 12:57:02.894755 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 12:57:02 crc kubenswrapper[4735]: E1008 12:57:02.894652 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 12:57:02 crc kubenswrapper[4735]: I1008 12:57:02.952714 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:02 crc kubenswrapper[4735]: I1008 12:57:02.952791 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:02 crc kubenswrapper[4735]: I1008 12:57:02.952814 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:02 crc kubenswrapper[4735]: I1008 12:57:02.952844 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:02 crc kubenswrapper[4735]: I1008 12:57:02.952880 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:02Z","lastTransitionTime":"2025-10-08T12:57:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:03 crc kubenswrapper[4735]: I1008 12:57:03.055932 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:03 crc kubenswrapper[4735]: I1008 12:57:03.056152 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:03 crc kubenswrapper[4735]: I1008 12:57:03.056238 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:03 crc kubenswrapper[4735]: I1008 12:57:03.056300 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:03 crc kubenswrapper[4735]: I1008 12:57:03.056367 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:03Z","lastTransitionTime":"2025-10-08T12:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:03 crc kubenswrapper[4735]: I1008 12:57:03.158174 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:03 crc kubenswrapper[4735]: I1008 12:57:03.158228 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:03 crc kubenswrapper[4735]: I1008 12:57:03.158247 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:03 crc kubenswrapper[4735]: I1008 12:57:03.158270 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:03 crc kubenswrapper[4735]: I1008 12:57:03.158287 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:03Z","lastTransitionTime":"2025-10-08T12:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:03 crc kubenswrapper[4735]: I1008 12:57:03.260717 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:03 crc kubenswrapper[4735]: I1008 12:57:03.260753 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:03 crc kubenswrapper[4735]: I1008 12:57:03.260784 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:03 crc kubenswrapper[4735]: I1008 12:57:03.260803 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:03 crc kubenswrapper[4735]: I1008 12:57:03.260812 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:03Z","lastTransitionTime":"2025-10-08T12:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:03 crc kubenswrapper[4735]: I1008 12:57:03.363933 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:03 crc kubenswrapper[4735]: I1008 12:57:03.363980 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:03 crc kubenswrapper[4735]: I1008 12:57:03.363990 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:03 crc kubenswrapper[4735]: I1008 12:57:03.364005 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:03 crc kubenswrapper[4735]: I1008 12:57:03.364015 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:03Z","lastTransitionTime":"2025-10-08T12:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:03 crc kubenswrapper[4735]: I1008 12:57:03.466431 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:03 crc kubenswrapper[4735]: I1008 12:57:03.466468 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:03 crc kubenswrapper[4735]: I1008 12:57:03.466492 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:03 crc kubenswrapper[4735]: I1008 12:57:03.466507 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:03 crc kubenswrapper[4735]: I1008 12:57:03.466515 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:03Z","lastTransitionTime":"2025-10-08T12:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:03 crc kubenswrapper[4735]: I1008 12:57:03.568643 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:03 crc kubenswrapper[4735]: I1008 12:57:03.568689 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:03 crc kubenswrapper[4735]: I1008 12:57:03.568702 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:03 crc kubenswrapper[4735]: I1008 12:57:03.568721 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:03 crc kubenswrapper[4735]: I1008 12:57:03.568733 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:03Z","lastTransitionTime":"2025-10-08T12:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:03 crc kubenswrapper[4735]: I1008 12:57:03.671291 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:03 crc kubenswrapper[4735]: I1008 12:57:03.671333 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:03 crc kubenswrapper[4735]: I1008 12:57:03.671343 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:03 crc kubenswrapper[4735]: I1008 12:57:03.671358 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:03 crc kubenswrapper[4735]: I1008 12:57:03.671370 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:03Z","lastTransitionTime":"2025-10-08T12:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:03 crc kubenswrapper[4735]: I1008 12:57:03.774369 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:03 crc kubenswrapper[4735]: I1008 12:57:03.774425 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:03 crc kubenswrapper[4735]: I1008 12:57:03.774435 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:03 crc kubenswrapper[4735]: I1008 12:57:03.774449 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:03 crc kubenswrapper[4735]: I1008 12:57:03.774458 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:03Z","lastTransitionTime":"2025-10-08T12:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:03 crc kubenswrapper[4735]: I1008 12:57:03.876929 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:03 crc kubenswrapper[4735]: I1008 12:57:03.876960 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:03 crc kubenswrapper[4735]: I1008 12:57:03.876968 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:03 crc kubenswrapper[4735]: I1008 12:57:03.877006 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:03 crc kubenswrapper[4735]: I1008 12:57:03.877018 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:03Z","lastTransitionTime":"2025-10-08T12:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:57:03 crc kubenswrapper[4735]: I1008 12:57:03.893497 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mldxj" Oct 08 12:57:03 crc kubenswrapper[4735]: E1008 12:57:03.893608 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mldxj" podUID="771d7ef1-3d58-44eb-89b5-43a5da3e8091" Oct 08 12:57:03 crc kubenswrapper[4735]: I1008 12:57:03.980670 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:03 crc kubenswrapper[4735]: I1008 12:57:03.980733 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:03 crc kubenswrapper[4735]: I1008 12:57:03.980745 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:03 crc kubenswrapper[4735]: I1008 12:57:03.980757 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:03 crc kubenswrapper[4735]: I1008 12:57:03.980766 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:03Z","lastTransitionTime":"2025-10-08T12:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:04 crc kubenswrapper[4735]: I1008 12:57:04.083926 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:04 crc kubenswrapper[4735]: I1008 12:57:04.084228 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:04 crc kubenswrapper[4735]: I1008 12:57:04.084363 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:04 crc kubenswrapper[4735]: I1008 12:57:04.084430 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:04 crc kubenswrapper[4735]: I1008 12:57:04.084542 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:04Z","lastTransitionTime":"2025-10-08T12:57:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:04 crc kubenswrapper[4735]: I1008 12:57:04.188258 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:04 crc kubenswrapper[4735]: I1008 12:57:04.188646 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:04 crc kubenswrapper[4735]: I1008 12:57:04.188741 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:04 crc kubenswrapper[4735]: I1008 12:57:04.188816 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:04 crc kubenswrapper[4735]: I1008 12:57:04.188872 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:04Z","lastTransitionTime":"2025-10-08T12:57:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:04 crc kubenswrapper[4735]: I1008 12:57:04.291643 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:04 crc kubenswrapper[4735]: I1008 12:57:04.291674 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:04 crc kubenswrapper[4735]: I1008 12:57:04.291686 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:04 crc kubenswrapper[4735]: I1008 12:57:04.291699 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:04 crc kubenswrapper[4735]: I1008 12:57:04.291709 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:04Z","lastTransitionTime":"2025-10-08T12:57:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:04 crc kubenswrapper[4735]: I1008 12:57:04.394145 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:04 crc kubenswrapper[4735]: I1008 12:57:04.394187 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:04 crc kubenswrapper[4735]: I1008 12:57:04.394198 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:04 crc kubenswrapper[4735]: I1008 12:57:04.394214 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:04 crc kubenswrapper[4735]: I1008 12:57:04.394225 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:04Z","lastTransitionTime":"2025-10-08T12:57:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:04 crc kubenswrapper[4735]: I1008 12:57:04.496842 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:04 crc kubenswrapper[4735]: I1008 12:57:04.496878 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:04 crc kubenswrapper[4735]: I1008 12:57:04.496889 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:04 crc kubenswrapper[4735]: I1008 12:57:04.496904 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:04 crc kubenswrapper[4735]: I1008 12:57:04.496914 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:04Z","lastTransitionTime":"2025-10-08T12:57:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:04 crc kubenswrapper[4735]: I1008 12:57:04.599002 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:04 crc kubenswrapper[4735]: I1008 12:57:04.599282 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:04 crc kubenswrapper[4735]: I1008 12:57:04.599355 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:04 crc kubenswrapper[4735]: I1008 12:57:04.599432 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:04 crc kubenswrapper[4735]: I1008 12:57:04.599525 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:04Z","lastTransitionTime":"2025-10-08T12:57:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:04 crc kubenswrapper[4735]: I1008 12:57:04.702067 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:04 crc kubenswrapper[4735]: I1008 12:57:04.702126 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:04 crc kubenswrapper[4735]: I1008 12:57:04.702143 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:04 crc kubenswrapper[4735]: I1008 12:57:04.702168 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:04 crc kubenswrapper[4735]: I1008 12:57:04.702193 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:04Z","lastTransitionTime":"2025-10-08T12:57:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:04 crc kubenswrapper[4735]: I1008 12:57:04.807204 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:04 crc kubenswrapper[4735]: I1008 12:57:04.807247 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:04 crc kubenswrapper[4735]: I1008 12:57:04.807256 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:04 crc kubenswrapper[4735]: I1008 12:57:04.807272 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:04 crc kubenswrapper[4735]: I1008 12:57:04.807282 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:04Z","lastTransitionTime":"2025-10-08T12:57:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:57:04 crc kubenswrapper[4735]: I1008 12:57:04.894359 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:57:04 crc kubenswrapper[4735]: I1008 12:57:04.894420 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 12:57:04 crc kubenswrapper[4735]: E1008 12:57:04.894496 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 12:57:04 crc kubenswrapper[4735]: I1008 12:57:04.894380 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:57:04 crc kubenswrapper[4735]: E1008 12:57:04.894803 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 12:57:04 crc kubenswrapper[4735]: E1008 12:57:04.894986 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 12:57:04 crc kubenswrapper[4735]: I1008 12:57:04.895029 4735 scope.go:117] "RemoveContainer" containerID="591cd8e8896add07ee229791aae75932033be8070379a70d7a0c3f42176c6357" Oct 08 12:57:04 crc kubenswrapper[4735]: I1008 12:57:04.909372 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:04 crc kubenswrapper[4735]: I1008 12:57:04.909404 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:04 crc kubenswrapper[4735]: I1008 12:57:04.909415 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:04 crc kubenswrapper[4735]: I1008 12:57:04.909430 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:04 crc kubenswrapper[4735]: I1008 12:57:04.909441 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:04Z","lastTransitionTime":"2025-10-08T12:57:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.013117 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.013405 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.013415 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.013429 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.013438 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:05Z","lastTransitionTime":"2025-10-08T12:57:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.116175 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.116230 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.116238 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.116251 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.116259 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:05Z","lastTransitionTime":"2025-10-08T12:57:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.219793 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.219865 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.219878 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.219896 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.219933 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:05Z","lastTransitionTime":"2025-10-08T12:57:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.322762 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.322796 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.322804 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.322817 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.322826 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:05Z","lastTransitionTime":"2025-10-08T12:57:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.327102 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wckcs_c4653370-cd31-4487-8b63-50b145f55853/ovnkube-controller/2.log" Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.329317 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" event={"ID":"c4653370-cd31-4487-8b63-50b145f55853","Type":"ContainerStarted","Data":"173883d0ef0ca36b6f683421d0020cbbd0fcf4aaf08f17c53427387f9ed39d5f"} Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.330170 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.344743 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28bb195cc66fff1263a0e92d27d57e97c81b63f873cb5e85d5271ac4fad40a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd
47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e88fb0202a08e60ff9cb6528688a4312ec5433491918326a5b46a5fb5e1de888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-10-08T12:57:05Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.356811 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tfdqx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a82f845f92e94e6df109ce1bf748570e0bcbe66989bc38894371604d541aa14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33cd5d891ac473da94cdc0e5c4bcfcbe2ef79a76d13f87ad077df153496ea515\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T12:56:59Z\\\",\\\"message\\\":\\\"2025-10-08T12:56:14+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_3ad7249b-2c19-4671-b951-a27dbb0f1daa\\\\n2025-10-08T12:56:14+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_3ad7249b-2c19-4671-b951-a27dbb0f1daa to /host/opt/cni/bin/\\\\n2025-10-08T12:56:14Z [verbose] multus-daemon started\\\\n2025-10-08T12:56:14Z [verbose] Readiness Indicator file check\\\\n2025-10-08T12:56:59Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qwbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tfdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:05Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.368324 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d8pqk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d50458a5-b349-4384-9451-69dcb6f4e3d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29452c460590f3b977d7a9a86ec5bc26ed6d08054087cc55308bb8a0e6bcf966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kls7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ecd6dd311d753fc93739fd3f2da3af634853
87305d10887aa1b5425b0c7dbd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kls7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d8pqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:05Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.380703 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mldxj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"771d7ef1-3d58-44eb-89b5-43a5da3e8091\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk2wl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk2wl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mldxj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:05Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:05 crc 
kubenswrapper[4735]: I1008 12:57:05.394936 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:05Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.404714 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jnmqc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed550b4-694a-4a43-99fc-f869224d6cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0a97189f5acb11a4baa7838d67360d6606941daf7520c583dd596cfab265203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qjms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jnmqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:05Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.419168 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d7f6479-86cc-40be-8b44-cb1c10627aa2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc2f1c56c7be63150786ec037b77127e2df6071efa7d736d402bca8a011ea32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e2c83b125f3789eeff231ab5416af4461e2443e6abaf28dac1cd5a8d5356287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://302c54532d7a142b5a9483334e74f1d96b0f07920d9c3fcb589bf2b52d1c7124\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6272fc7157705e3f1cd86a22b80f722dc7d152785bcc8029e131cd03568a19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a92ee1544b61eda7e486e807f489c1f07cfc70870a2413575ac443203670542\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 12:56:04.373062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 12:56:04.379553 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3954726057/tls.crt::/tmp/serving-cert-3954726057/tls.key\\\\\\\"\\\\nI1008 12:56:10.064922 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 12:56:10.069814 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 12:56:10.069844 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 12:56:10.069873 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 12:56:10.069878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 12:56:10.082030 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 12:56:10.082115 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nW1008 12:56:10.082129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082144 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 12:56:10.082156 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 12:56:10.082166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 12:56:10.082177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1008 12:56:10.082039 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1008 12:56:10.085350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f517b378c42cea209cb39c077a6162a3569671ed2c8f91ae7b0b2c411af287a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\
"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:05Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.424619 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.424660 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.424673 4735 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.424689 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.424699 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:05Z","lastTransitionTime":"2025-10-08T12:57:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.478241 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aed4e152-1619-49cb-8449-bc39785f556b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af1083a706e809f20ad64584fcbdfee479bb7014566eab93c89d94c67e013c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d
34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2326bc324b30300d3c85d19e5434de503b355b4e29fb72a13c8ab8d4884899c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9cf1a23dd55079fc635d7547125bc23b35d999c371c0bc3dd9e7e4a7c0c9fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e77c463c65fc07f8e0c2c8dd054b9dd96103da7426bbf1cd6e002436018231b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e77c463c65fc07f8e0c2c8dd054b9dd96103da7426bbf1cd6e002436018231b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:05Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.490325 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:05Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.501422 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc6814f3ae991f4c4576eaafb7f325021fe4ef3387b84948efb093150a7974d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T12:57:05Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.514803 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2dfd0474-f64a-4d3b-9990-df182d079b10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4321c24e125d489ad4c28ca906b5143f7017d14e0424cb188a1a031fd07de603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fefb46d9936f914f5667f539f69ddaaf85342cc341db62c3eda312aa07a78d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9qwgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:05Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.527636 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 
12:57:05.527679 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.527692 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.527744 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.527759 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:05Z","lastTransitionTime":"2025-10-08T12:57:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.532198 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653370-cd31-4487-8b63-50b145f55853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6da8f9945b0943287cecc7782253cd88835d3a5a823ae5e2107423b9bdc4e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fc9ee7ab9490b5462704528e6923ee420e0c9f3c0ed22e7041fedc8dfdfeb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://492da61345c2220eef06ac2e1979e5402ebb8deb0e3747bb3620ecccd4e83a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58cc5c76f1748baef586618dd49d3392757ef32e4f28a019979f84366e8879c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb1bc51dca9b3bf6c136fe146aa422fa40eb512385617607f682723053988f7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc88f8b567ef0f69f9df25a237c40b32bbd518680da75fe3cabd546713423bf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://173883d0ef0ca36b6f683421d0020cbbd0fcf4aaf08f17c53427387f9ed39d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://591cd8e8896add07ee229791aae75932033be8070379a70d7a0c3f42176c6357\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T12:56:38Z\\\",\\\"message\\\":\\\"ere column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: 
Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1008 12:56:38.817327 6363 services_controller.go:454] Service openshift-authentication-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1008 12:56:38.818364 6363 ovnkube_controller.go:900] Cache entry expected pod with UID \\\\\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\\\\\" but failed to find it\\\\nI1008 12:56:38.818678 6363 ovnkube_controller.go:804] Add Logical Switch Port event expected pod with UID \\\\\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\\\\\" in cache\\\\nI1008 12:56:38.818681 6363 port_cache.go:96] port-cache(openshift-network-console_networking-console-plugin-85b44fc459-gdk6g): added port \\\\u0026{name:openshift-network-console_networking-console-plugin-85b44fc459-gdk6g uuid:c94130be-172c-477c-88c4-40cc7eba30fe logicalSwitch:crc ips:[0xc008a013e0] mac:[10 88 10 217 0 92] expires:{wall:0 ext:0 loc:\\\\u003cnil\\\\u003e}} with IP: [10.217.0.92/23] and MAC: 0a:58:0a:d9:00:5c\\\\nI1008 12:56:38.818690 6363 ovnkube_controller.go:900] Cache entry expected pod with UID \\\\\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\\\\\" but failed to find it\\\\nI1008 12:56:38.818694 6363 pods.go:252] [openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] 
addLogi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:57:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"na
me\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1758c6a3e050fee720b9cae06332bc1d34000d2a61aa27470c483dc8757a5546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wckcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:05Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.544982 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tsff9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0464374-65ce-46f6-9e7e-79018b92aff4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce56f02b4d3ab37c171c10113c79b74660cbdc570b55bd951baae65ef4276946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e01ac6683194ace06c37ecdbcda8584329034bc8fd0e6c532dcc6d7a036a72b3\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e01ac6683194ace06c37ecdbcda8584329034bc8fd0e6c532dcc6d7a036a72b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e086e1263cfb47daf4b8a395b1335f2e575f9d8c060983ba7090bbe27033d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48e086e1263cfb47daf4b8a395b1335f2e575f9d8c060983ba7090bbe27033d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aceb87b6bfa4020ca777a104430a285d886fdd8fb37b710f754eac6ad1b4304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aceb87b6bfa4020ca777a104430a285d886fdd8fb37b710f754eac6ad1b4304d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895f5
b597faf09af3dfd2688b3e42c2f1713b07e7d78c8d81e856dc3364727f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895f5b597faf09af3dfd2688b3e42c2f1713b07e7d78c8d81e856dc3364727f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://196d2d4913959ac507043de8431ea9294020c429fa0810c9985c77a23f52ad78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://196d2d4913959ac507043de8431ea9294020c429fa0810c9985c77a23f52ad78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:17Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b5ebc8f08568ab230481a7570dcdb0c6ed926f2216ef10de368d835811ef129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b5ebc8f08568ab230481a7570dcdb0c6ed926f2216ef10de368d835811ef129\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tsff9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:05Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.567757 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"844f0143-e7a8-48c9-ae8e-753c5c293f9d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e268579ace7ae7961414b2650b6b55471a9fd6574196c5fa2cefbf6979464b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e620e3a2b5b27ac9469d3ca045be90d19b04f969422d4c2a453650a7d06f0918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e7006d4b199ac7450c0e968adf9714db905f5378e279db304db244d0e5e566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72f60bc244cbde5bb
8033c24051f07fd6a2a168d97d89d3abcfaac69e6d75878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa627e3bfab38d53beee2a3201f6856009919d567108614667503273d11d0f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:05Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.586504 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d289b739-3ef2-4ee7-bb9a-b5cd8ab2b999\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f9030479e031ff1768e372e6da2a0806dc393db70aed31961e3a9c89a9e789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://852a4dea1c1d8a40fc2ba89661ff17845ee58b62e45f4a7df9a8af505d361dd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e511b82acaea281ca468fbdce1eaa7ec3a5df59d91dc90fc500bebb08d51b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ce3b507093adcc6becc1a3b28e836174ee50ff4d3f583aa84badf4c1f3cb1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:05Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.599767 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b69f0d46aba2faee01906db32a5a6785e755b2eee94ccbaf1e438c63798f3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:05Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.614926 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:05Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.625318 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qdvjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28749736-d31e-40f2-84e8-a7839d8da376\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://680500d4224fd49593a4fa7294d1c49f059f330b2be33e3ef9ceccc6bd9b9a6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcv62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qdvjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:05Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.629700 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.629735 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.629748 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.629763 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.629772 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:05Z","lastTransitionTime":"2025-10-08T12:57:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.635948 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6287c74e-b672-43a2-adfc-1dc18d7bbd8b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa7259e4ff7dc49440079659b12b25117daf1733407e99f44156ec0c1165bd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d60ea228b3e0b96a8cdc6daa4a9e832d6b356c3e2b4413812cf1c03e9de76da2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d60ea228b3e0b96a8cdc6daa4a9e832d6b356c3e2b4413812cf1c03e9de76da2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:05Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.732868 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.732920 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.732928 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 
08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.732947 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.732957 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:05Z","lastTransitionTime":"2025-10-08T12:57:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.835844 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.835920 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.835936 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.835957 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.835974 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:05Z","lastTransitionTime":"2025-10-08T12:57:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.893856 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mldxj" Oct 08 12:57:05 crc kubenswrapper[4735]: E1008 12:57:05.893969 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mldxj" podUID="771d7ef1-3d58-44eb-89b5-43a5da3e8091" Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.938576 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.938600 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.938608 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.938620 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:05 crc kubenswrapper[4735]: I1008 12:57:05.938629 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:05Z","lastTransitionTime":"2025-10-08T12:57:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.041532 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.041585 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.041600 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.041620 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.041633 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:06Z","lastTransitionTime":"2025-10-08T12:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.145392 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.145444 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.145453 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.145489 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.145506 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:06Z","lastTransitionTime":"2025-10-08T12:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.248691 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.248754 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.248773 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.248796 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.248815 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:06Z","lastTransitionTime":"2025-10-08T12:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.337099 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wckcs_c4653370-cd31-4487-8b63-50b145f55853/ovnkube-controller/3.log" Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.338638 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wckcs_c4653370-cd31-4487-8b63-50b145f55853/ovnkube-controller/2.log" Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.343569 4735 generic.go:334] "Generic (PLEG): container finished" podID="c4653370-cd31-4487-8b63-50b145f55853" containerID="173883d0ef0ca36b6f683421d0020cbbd0fcf4aaf08f17c53427387f9ed39d5f" exitCode=1 Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.343650 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" event={"ID":"c4653370-cd31-4487-8b63-50b145f55853","Type":"ContainerDied","Data":"173883d0ef0ca36b6f683421d0020cbbd0fcf4aaf08f17c53427387f9ed39d5f"} Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.343716 4735 scope.go:117] "RemoveContainer" containerID="591cd8e8896add07ee229791aae75932033be8070379a70d7a0c3f42176c6357" Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.345013 4735 scope.go:117] "RemoveContainer" containerID="173883d0ef0ca36b6f683421d0020cbbd0fcf4aaf08f17c53427387f9ed39d5f" Oct 08 12:57:06 crc kubenswrapper[4735]: E1008 12:57:06.345292 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-wckcs_openshift-ovn-kubernetes(c4653370-cd31-4487-8b63-50b145f55853)\"" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" podUID="c4653370-cd31-4487-8b63-50b145f55853" Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.356865 4735 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.356964 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.356981 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.357058 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.357084 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:06Z","lastTransitionTime":"2025-10-08T12:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.368189 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28bb195cc66fff1263a0e92d27d57e97c81b63f873cb5e85d5271ac4fad40a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e88fb0202a08e60ff9cb6528688a4312ec5433491918326a5b46a5fb5e1de888\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:06Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.389247 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tfdqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a82f845f92e94e6df109ce1bf748570e0bcbe66989bc38894371604d541aa14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33cd5d891ac473da94cdc0e5c4bcfcbe2ef79a76d13f87ad077df153496ea515\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T12:56:59Z\\\",\\\"message\\\":\\\"2025-10-08T12:56:14+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_3ad7249b-2c19-4671-b951-a27dbb0f1daa\\\\n2025-10-08T12:56:14+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_3ad7249b-2c19-4671-b951-a27dbb0f1daa to /host/opt/cni/bin/\\\\n2025-10-08T12:56:14Z [verbose] multus-daemon started\\\\n2025-10-08T12:56:14Z [verbose] 
Readiness Indicator file check\\\\n2025-10-08T12:56:59Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qwbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tfdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:06Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.408017 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d8pqk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d50458a5-b349-4384-9451-69dcb6f4e3d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29452c460590f3b977d7a9a86ec5bc26ed6d08054087cc55308bb8a0e6bcf966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kls7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ecd6dd311d753fc93739fd3f2da3af634853
87305d10887aa1b5425b0c7dbd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kls7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d8pqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:06Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.426036 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mldxj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"771d7ef1-3d58-44eb-89b5-43a5da3e8091\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk2wl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk2wl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mldxj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:06Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:06 crc 
kubenswrapper[4735]: I1008 12:57:06.446532 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:06Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.461407 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.461467 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.461520 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.461542 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.461556 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:06Z","lastTransitionTime":"2025-10-08T12:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.464030 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jnmqc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed550b4-694a-4a43-99fc-f869224d6cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0a97189f5acb11a4baa7838d67360d6606941daf7520c583dd596cfab265203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qjms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jnmqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:06Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.486048 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d7f6479-86cc-40be-8b44-cb1c10627aa2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc2f1c56c7be63150786ec037b77127e2df6071efa7d736d402bca8a011ea32\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e2c83b125f3789eeff231ab5416af4461e2443e6abaf28dac1cd5a8d5356287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://302c54532d7a142b5a9483334e74f1d96b0f07920d9c3fcb589bf2b52d1c7124\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-
syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6272fc7157705e3f1cd86a22b80f722dc7d152785bcc8029e131cd03568a19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a92ee1544b61eda7e486e807f489c1f07cfc70870a2413575ac443203670542\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 12:56:04.373062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 12:56:04.379553 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3954726057/tls.crt::/tmp/serving-cert-3954726057/tls.key\\\\\\\"\\\\nI1008 12:56:10.064922 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 12:56:10.069814 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 12:56:10.069844 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 12:56:10.069873 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI1008 12:56:10.069878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 12:56:10.082030 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 12:56:10.082115 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082144 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 12:56:10.082156 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 12:56:10.082166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 12:56:10.082177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1008 12:56:10.082039 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1008 12:56:10.085350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f517b378c42cea209cb39c077a6162a3569671ed2c8f91ae7b0b2c411af287a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:06Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.507250 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aed4e152-1619-49cb-8449-bc39785f556b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af1083a706e809f20ad64584fcbdfee479bb7014566eab93c89d94c67e013c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b
89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2326bc324b30300d3c85d19e5434de503b355b4e29fb72a13c8ab8d4884899c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9cf1a23dd55079fc635d7547125bc23b35d999c371c0bc3dd9e7e4a7c0c9fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-0
8T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e77c463c65fc07f8e0c2c8dd054b9dd96103da7426bbf1cd6e002436018231b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e77c463c65fc07f8e0c2c8dd054b9dd96103da7426bbf1cd6e002436018231b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:06Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.525136 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:06Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.540146 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc6814f3ae991f4c4576eaafb7f325021fe4ef3387b84948efb093150a7974d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T12:57:06Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.555162 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2dfd0474-f64a-4d3b-9990-df182d079b10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4321c24e125d489ad4c28ca906b5143f7017d14e0424cb188a1a031fd07de603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fefb46d9936f914f5667f539f69ddaaf85342cc341db62c3eda312aa07a78d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9qwgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:06Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.564167 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 
12:57:06.564213 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.564227 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.564278 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.564293 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:06Z","lastTransitionTime":"2025-10-08T12:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.606754 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653370-cd31-4487-8b63-50b145f55853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6da8f9945b0943287cecc7782253cd88835d3a5a823ae5e2107423b9bdc4e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fc9ee7ab9490b5462704528e6923ee420e0c9f3c0ed22e7041fedc8dfdfeb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://492da61345c2220eef06ac2e1979e5402ebb8deb0e3747bb3620ecccd4e83a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58cc5c76f1748baef586618dd49d3392757ef32e4f28a019979f84366e8879c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb1bc51dca9b3bf6c136fe146aa422fa40eb512385617607f682723053988f7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc88f8b567ef0f69f9df25a237c40b32bbd518680da75fe3cabd546713423bf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://173883d0ef0ca36b6f683421d0020cbbd0fcf4aaf08f17c53427387f9ed39d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://591cd8e8896add07ee229791aae75932033be8070379a70d7a0c3f42176c6357\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T12:56:38Z\\\",\\\"message\\\":\\\"ere column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: 
Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1008 12:56:38.817327 6363 services_controller.go:454] Service openshift-authentication-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1008 12:56:38.818364 6363 ovnkube_controller.go:900] Cache entry expected pod with UID \\\\\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\\\\\" but failed to find it\\\\nI1008 12:56:38.818678 6363 ovnkube_controller.go:804] Add Logical Switch Port event expected pod with UID \\\\\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\\\\\" in cache\\\\nI1008 12:56:38.818681 6363 port_cache.go:96] port-cache(openshift-network-console_networking-console-plugin-85b44fc459-gdk6g): added port \\\\u0026{name:openshift-network-console_networking-console-plugin-85b44fc459-gdk6g uuid:c94130be-172c-477c-88c4-40cc7eba30fe logicalSwitch:crc ips:[0xc008a013e0] mac:[10 88 10 217 0 92] expires:{wall:0 ext:0 loc:\\\\u003cnil\\\\u003e}} with IP: [10.217.0.92/23] and MAC: 0a:58:0a:d9:00:5c\\\\nI1008 12:56:38.818690 6363 ovnkube_controller.go:900] Cache entry expected pod with UID \\\\\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\\\\\" but failed to find it\\\\nI1008 12:56:38.818694 6363 pods.go:252] [openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] addLogi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://173883d0ef0ca36b6f683421d0020cbbd0fcf4aaf08f17c53427387f9ed39d5f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T12:57:06Z\\\",\\\"message\\\":\\\"o run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event 
Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:05Z is after 2025-08-24T17:21:41Z]\\\\nI1008 12:57:05.867649 6717 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_TCP_node_router+switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.10:53: 10.217.4.10:9154:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{be9dc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:57:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1758c6a3e050fee720b9cae06332bc1d34000d2a61aa27470c483dc8757a5546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d
7d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wckcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:06Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.632452 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tsff9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0464374-65ce-46f6-9e7e-79018b92aff4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce56f02b4d3ab37c171c10113c79b74660cbdc570b55bd951baae65ef4276946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e01ac6683194ace06c37ecdbcda8584329034bc8fd0e6c532dcc6d7a036a72b3\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e01ac6683194ace06c37ecdbcda8584329034bc8fd0e6c532dcc6d7a036a72b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e086e1263cfb47daf4b8a395b1335f2e575f9d8c060983ba7090bbe27033d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48e086e1263cfb47daf4b8a395b1335f2e575f9d8c060983ba7090bbe27033d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aceb87b6bfa4020ca777a104430a285d886fdd8fb37b710f754eac6ad1b4304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aceb87b6bfa4020ca777a104430a285d886fdd8fb37b710f754eac6ad1b4304d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895f5
b597faf09af3dfd2688b3e42c2f1713b07e7d78c8d81e856dc3364727f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895f5b597faf09af3dfd2688b3e42c2f1713b07e7d78c8d81e856dc3364727f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://196d2d4913959ac507043de8431ea9294020c429fa0810c9985c77a23f52ad78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://196d2d4913959ac507043de8431ea9294020c429fa0810c9985c77a23f52ad78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:17Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b5ebc8f08568ab230481a7570dcdb0c6ed926f2216ef10de368d835811ef129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b5ebc8f08568ab230481a7570dcdb0c6ed926f2216ef10de368d835811ef129\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tsff9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:06Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.657950 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"844f0143-e7a8-48c9-ae8e-753c5c293f9d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e268579ace7ae7961414b2650b6b55471a9fd6574196c5fa2cefbf6979464b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e620e3a2b5b27ac9469d3ca045be90d19b04f969422d4c2a453650a7d06f0918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e7006d4b199ac7450c0e968adf9714db905f5378e279db304db244d0e5e566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72f60bc244cbde5bb
8033c24051f07fd6a2a168d97d89d3abcfaac69e6d75878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa627e3bfab38d53beee2a3201f6856009919d567108614667503273d11d0f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:06Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.667524 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.667673 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.667686 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.667702 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.667724 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:06Z","lastTransitionTime":"2025-10-08T12:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.677127 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d289b739-3ef2-4ee7-bb9a-b5cd8ab2b999\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f9030479e031ff1768e372e6da2a0806dc393db70aed31961e3a9c89a9e789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://852a4dea1c1d8a40fc2ba89661ff17845ee58b62e45f4a7df9a8af505d361dd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e511b82acaea281ca468fbdce1eaa7ec3a5df59d91dc90fc500bebb08d51b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://9ce3b507093adcc6becc1a3b28e836174ee50ff4d3f583aa84badf4c1f3cb1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:06Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.699722 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b69f0d46aba2faee01906db32a5a6785e755b2eee94ccbaf1e438c63798f3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:06Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.713197 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:06Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.723335 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qdvjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28749736-d31e-40f2-84e8-a7839d8da376\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://680500d4224fd49593a4fa7294d1c49f059f330b2be33e3ef9ceccc6bd9b9a6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcv62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qdvjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:06Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.732667 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6287c74e-b672-43a2-adfc-1dc18d7bbd8b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa7259e4ff7dc49440079659b12b25117daf1733407e99f44156ec0c1165bd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d60ea228b3e0b96a8cdc6daa4a9e832d6b356c3e2b4413812cf1c03e9de76da2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d60ea228b3e0b96a8cdc6daa4a9e832d6b356c3e2b4413812cf1c03e9de76da2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:06Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:06 crc 
kubenswrapper[4735]: I1008 12:57:06.770192 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.770251 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.770268 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.770290 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.770306 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:06Z","lastTransitionTime":"2025-10-08T12:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.872703 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.872776 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.872811 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.872841 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.872865 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:06Z","lastTransitionTime":"2025-10-08T12:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.893898 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.893909 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:57:06 crc kubenswrapper[4735]: E1008 12:57:06.894153 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.894300 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:57:06 crc kubenswrapper[4735]: E1008 12:57:06.894021 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 12:57:06 crc kubenswrapper[4735]: E1008 12:57:06.894462 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.975418 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.975469 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.975512 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.975527 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:06 crc kubenswrapper[4735]: I1008 12:57:06.975537 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:06Z","lastTransitionTime":"2025-10-08T12:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:07 crc kubenswrapper[4735]: I1008 12:57:07.078676 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:07 crc kubenswrapper[4735]: I1008 12:57:07.078742 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:07 crc kubenswrapper[4735]: I1008 12:57:07.078766 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:07 crc kubenswrapper[4735]: I1008 12:57:07.078794 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:07 crc kubenswrapper[4735]: I1008 12:57:07.078816 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:07Z","lastTransitionTime":"2025-10-08T12:57:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:07 crc kubenswrapper[4735]: I1008 12:57:07.182234 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:07 crc kubenswrapper[4735]: I1008 12:57:07.182295 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:07 crc kubenswrapper[4735]: I1008 12:57:07.182317 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:07 crc kubenswrapper[4735]: I1008 12:57:07.182345 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:07 crc kubenswrapper[4735]: I1008 12:57:07.182367 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:07Z","lastTransitionTime":"2025-10-08T12:57:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:07 crc kubenswrapper[4735]: I1008 12:57:07.286291 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:07 crc kubenswrapper[4735]: I1008 12:57:07.286331 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:07 crc kubenswrapper[4735]: I1008 12:57:07.286341 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:07 crc kubenswrapper[4735]: I1008 12:57:07.286357 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:07 crc kubenswrapper[4735]: I1008 12:57:07.286367 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:07Z","lastTransitionTime":"2025-10-08T12:57:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:07 crc kubenswrapper[4735]: I1008 12:57:07.351207 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wckcs_c4653370-cd31-4487-8b63-50b145f55853/ovnkube-controller/3.log" Oct 08 12:57:07 crc kubenswrapper[4735]: I1008 12:57:07.356640 4735 scope.go:117] "RemoveContainer" containerID="173883d0ef0ca36b6f683421d0020cbbd0fcf4aaf08f17c53427387f9ed39d5f" Oct 08 12:57:07 crc kubenswrapper[4735]: E1008 12:57:07.357259 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-wckcs_openshift-ovn-kubernetes(c4653370-cd31-4487-8b63-50b145f55853)\"" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" podUID="c4653370-cd31-4487-8b63-50b145f55853" Oct 08 12:57:07 crc kubenswrapper[4735]: I1008 12:57:07.376883 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:07Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:07 crc kubenswrapper[4735]: I1008 12:57:07.389710 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:07 crc kubenswrapper[4735]: I1008 12:57:07.389747 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:07 crc kubenswrapper[4735]: I1008 12:57:07.389759 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:07 crc kubenswrapper[4735]: I1008 
12:57:07.389777 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:07 crc kubenswrapper[4735]: I1008 12:57:07.389792 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:07Z","lastTransitionTime":"2025-10-08T12:57:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:57:07 crc kubenswrapper[4735]: I1008 12:57:07.393241 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28bb195cc66fff1263a0e92d27d57e97c81b63f873cb5e85d5271ac4fad40a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e88fb0202a08e60ff9cb6528688a4312ec5433491918326a5b46a5fb5e1de888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:07Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:07 crc kubenswrapper[4735]: I1008 12:57:07.411616 4735 
status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tfdqx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a82f845f92e94e6df109ce1bf748570e0bcbe66989bc38894371604d541aa14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33cd5d891ac473da94cdc0e5c4bcfcbe2ef79a76d13f87ad077df153496ea515\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T12:56:59Z\\\",\\\"message\\\":\\\"2025-10-08T12:56:14+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_3ad7249b-2c19-4671-b951-a27dbb0f1daa\\\\n2025-10-08T12:56:14+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_3ad7249b-2c19-4671-b951-a27dbb0f1daa 
to /host/opt/cni/bin/\\\\n2025-10-08T12:56:14Z [verbose] multus-daemon started\\\\n2025-10-08T12:56:14Z [verbose] Readiness Indicator file check\\\\n2025-10-08T12:56:59Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-c
erts\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qwbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tfdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:07Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:07 crc kubenswrapper[4735]: I1008 12:57:07.427784 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d8pqk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d50458a5-b349-4384-9451-69dcb6f4e3d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29452c460590f3b977d7a9a86ec5bc26ed6d08054087cc55308bb8a0e6bcf966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kls7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ecd6dd311d753fc93739fd3f2da3af634853
87305d10887aa1b5425b0c7dbd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kls7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d8pqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:07Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:07 crc kubenswrapper[4735]: I1008 12:57:07.442465 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mldxj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"771d7ef1-3d58-44eb-89b5-43a5da3e8091\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk2wl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk2wl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mldxj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:07Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:07 crc 
kubenswrapper[4735]: I1008 12:57:07.457905 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jnmqc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed550b4-694a-4a43-99fc-f869224d6cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0a97189f5acb11a4baa7838d67360d6606941daf7520c583dd596cfab265203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qjms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jnmqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:07Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:07 crc kubenswrapper[4735]: I1008 12:57:07.481760 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tsff9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0464374-65ce-46f6-9e7e-79018b92aff4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce56f02b4d3ab37c171c10113c79b74660cbdc570b55bd951baae65ef4276946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808
b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e01ac6683194ace06c37ecdbcda8584329034bc8fd0e6c532dcc6d7a036a72b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e01ac6683194ace06c37ecdbcda8584329034bc8fd0e6c532dcc6d7a036a72b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e086e1263cfb47daf4b8a395b1335f2e575f9d8c060983ba7090bbe27033d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48e086e1263cfb47daf4b8a395b1335f2e575f9d8c060983ba7090bbe27033d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aceb87b6bfa4020ca777a104430a285d886fdd8fb37b710f754eac6ad1b4304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aceb87b6bfa4020ca777a104430a285d886fdd8fb37b710f754eac6ad1b4304d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895f5b597faf09af3dfd2688b3e42c2f1713b07e7d78c8d81e856dc3364727f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895f5b597faf09af3dfd2688b3e42c2f1713b07e7d78c8d81e856dc3364727f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://196d2d4913959ac507043de8431ea9294020c429fa0810c9985c77a23f52ad78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://196d2d4913959ac507043de8431ea9294020c429fa0810c9985c77a23f52ad78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b5ebc8f08568ab230481a7570dcdb0c6ed926f2216ef10de368d835811ef129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b5ebc8f08568
ab230481a7570dcdb0c6ed926f2216ef10de368d835811ef129\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tsff9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:07Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:07 crc kubenswrapper[4735]: I1008 12:57:07.492187 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:07 crc kubenswrapper[4735]: I1008 12:57:07.492509 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:07 crc kubenswrapper[4735]: I1008 12:57:07.492780 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:07 crc kubenswrapper[4735]: I1008 12:57:07.493007 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:07 crc kubenswrapper[4735]: I1008 12:57:07.493207 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:07Z","lastTransitionTime":"2025-10-08T12:57:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:57:07 crc kubenswrapper[4735]: I1008 12:57:07.516220 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"844f0143-e7a8-48c9-ae8e-753c5c293f9d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e268579ace7ae7961414b2650b6b55471a9fd6574196c5fa2cefbf6979464b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e620e3a2b5b27ac9469d3ca045be90d19b04f969422d4c2a453650a7d06f0918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e7006d4b199ac7450c0e968adf9714db905f5378e279db304db244d0e5e566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72f60bc244cbde5bb8033c24051f07fd6a2a168d97d89d3abcfaac69e6d75878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa627e3bfab38d53beee2a3201f6856009919d567108614667503273d11d0f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:07Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:07 crc kubenswrapper[4735]: I1008 12:57:07.533985 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d7f6479-86cc-40be-8b44-cb1c10627aa2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc2f1c56c7be63150786ec037b77127e2df6071efa7d736d402bca8a011ea32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e2c83b125f3789eeff231ab5416af4461e2443e6abaf28dac1cd5a8d5356287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://302c54532d7a142b5a9483334e74f1d96b0f07920d9c3fcb589bf2b52d1c7124\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6272fc7157705e3f1cd86a22b80f722dc7d152785bcc8029e131cd03568a19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a92ee1544b61eda7e486e807f489c1f07cfc70870a2413575ac443203670542\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T12:56:10Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 12:56:04.373062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 12:56:04.379553 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3954726057/tls.crt::/tmp/serving-cert-3954726057/tls.key\\\\\\\"\\\\nI1008 12:56:10.064922 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 12:56:10.069814 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 12:56:10.069844 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 12:56:10.069873 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 12:56:10.069878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 12:56:10.082030 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 12:56:10.082115 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082144 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 12:56:10.082156 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 12:56:10.082166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 12:56:10.082177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1008 12:56:10.082039 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1008 12:56:10.085350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f517b378c42cea209cb39c077a6162a3569671ed2c8f91ae7b0b2c411af287a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87
ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:07Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:07 crc kubenswrapper[4735]: I1008 12:57:07.547055 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aed4e152-1619-49cb-8449-bc39785f556b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af1083a706e809f20ad64584fcbdfee479bb7014566eab93c89d94c67e013c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2326bc324b30300d3c85d19e5434de503b355b4e29fb72a13c8ab8d4884899c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9cf1a23dd55079fc635d7547125bc23b35d999c371c0bc3dd9e7e4a7c0c9fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e77c463c65fc07f8e0c2c8dd054b9dd96103da7426bbf1cd6e002436018231b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e77c463c65fc07f8e0c2c8dd054b9dd96103da7426bbf1cd6e002436018231b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:07Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:07 crc kubenswrapper[4735]: I1008 12:57:07.559967 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:07Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:07 crc kubenswrapper[4735]: I1008 12:57:07.571750 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc6814f3ae991f4c4576eaafb7f325021fe4ef3387b84948efb093150a7974d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T12:57:07Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:07 crc kubenswrapper[4735]: I1008 12:57:07.583456 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2dfd0474-f64a-4d3b-9990-df182d079b10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4321c24e125d489ad4c28ca906b5143f7017d14e0424cb188a1a031fd07de603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fefb46d9936f914f5667f539f69ddaaf85342cc341db62c3eda312aa07a78d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9qwgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:07Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:07 crc kubenswrapper[4735]: I1008 12:57:07.596099 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:07 crc kubenswrapper[4735]: I1008 
12:57:07.596146 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:07 crc kubenswrapper[4735]: I1008 12:57:07.596163 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:07 crc kubenswrapper[4735]: I1008 12:57:07.596185 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:07 crc kubenswrapper[4735]: I1008 12:57:07.596200 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:07Z","lastTransitionTime":"2025-10-08T12:57:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:57:07 crc kubenswrapper[4735]: I1008 12:57:07.602423 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653370-cd31-4487-8b63-50b145f55853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6da8f9945b0943287cecc7782253cd88835d3a5a823ae5e2107423b9bdc4e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fc9ee7ab9490b5462704528e6923ee420e0c9f3c0ed22e7041fedc8dfdfeb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://492da61345c2220eef06ac2e1979e5402ebb8deb0e3747bb3620ecccd4e83a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58cc5c76f1748baef586618dd49d3392757ef32e4f28a019979f84366e8879c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb1bc51dca9b3bf6c136fe146aa422fa40eb512385617607f682723053988f7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc88f8b567ef0f69f9df25a237c40b32bbd518680da75fe3cabd546713423bf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://173883d0ef0ca36b6f683421d0020cbbd0fcf4aaf08f17c53427387f9ed39d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://173883d0ef0ca36b6f683421d0020cbbd0fcf4aaf08f17c53427387f9ed39d5f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T12:57:06Z\\\",\\\"message\\\":\\\"o run ovnkube: [failed to start network controller: failed to start default network 
controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:05Z is after 2025-08-24T17:21:41Z]\\\\nI1008 12:57:05.867649 6717 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_TCP_node_router+switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.10:53: 10.217.4.10:9154:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {be9dc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:57:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wckcs_openshift-ovn-kubernetes(c4653370-cd31-4487-8b63-50b145f55853)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1758c6a3e050fee720b9cae06332bc1d34000d2a61aa27470c483dc8757a5546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e39f9fb2f655437c3
aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wckcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:07Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:07 crc kubenswrapper[4735]: I1008 12:57:07.614990 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6287c74e-b672-43a2-adfc-1dc18d7bbd8b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa7259e4ff7dc49440079659b12b25117daf1733407e99f44156ec0c1165bd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d60ea228b3e0b96a8cdc6daa4a9e832d6b356c3e2b4413812cf1c03e9de76da2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d60ea228b3e0b96a8cdc6daa4a9e832d6b356c3e2b4413812cf1c03e9de76da2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:07Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:07 crc kubenswrapper[4735]: I1008 12:57:07.630061 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d289b739-3ef2-4ee7-bb9a-b5cd8ab2b999\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f9030479e031ff1768e372e6da2a0806dc393db70aed31961e3a9c89a9e789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://852a4dea1c1d8a40fc2ba89661ff17845ee58b62e45f4a7df9a8af505d361dd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e511b82acaea281ca468fbdce1eaa7ec3a5df59d91dc90fc500bebb08d51b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ce3b507093adcc6becc1a3b28e836174ee50ff4d3f583aa84badf4c1f3cb1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:07Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:07 crc kubenswrapper[4735]: I1008 12:57:07.643166 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b69f0d46aba2faee01906db32a5a6785e755b2eee94ccbaf1e438c63798f3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:07Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:07 crc kubenswrapper[4735]: I1008 12:57:07.654781 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:07Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:07 crc kubenswrapper[4735]: I1008 12:57:07.663736 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qdvjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28749736-d31e-40f2-84e8-a7839d8da376\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://680500d4224fd49593a4fa7294d1c49f059f330b2be33e3ef9ceccc6bd9b9a6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcv62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qdvjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:07Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:07 crc kubenswrapper[4735]: I1008 12:57:07.698722 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:07 crc kubenswrapper[4735]: I1008 12:57:07.698879 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:07 crc kubenswrapper[4735]: I1008 12:57:07.698954 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:07 crc kubenswrapper[4735]: I1008 12:57:07.699032 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:07 crc kubenswrapper[4735]: I1008 12:57:07.699101 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:07Z","lastTransitionTime":"2025-10-08T12:57:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:07 crc kubenswrapper[4735]: I1008 12:57:07.801665 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:07 crc kubenswrapper[4735]: I1008 12:57:07.802276 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:07 crc kubenswrapper[4735]: I1008 12:57:07.803053 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:07 crc kubenswrapper[4735]: I1008 12:57:07.803528 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:07 crc kubenswrapper[4735]: I1008 12:57:07.803906 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:07Z","lastTransitionTime":"2025-10-08T12:57:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:57:07 crc kubenswrapper[4735]: I1008 12:57:07.894278 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mldxj" Oct 08 12:57:07 crc kubenswrapper[4735]: E1008 12:57:07.894417 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mldxj" podUID="771d7ef1-3d58-44eb-89b5-43a5da3e8091" Oct 08 12:57:07 crc kubenswrapper[4735]: I1008 12:57:07.907662 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:07 crc kubenswrapper[4735]: I1008 12:57:07.907991 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:07 crc kubenswrapper[4735]: I1008 12:57:07.908382 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:07 crc kubenswrapper[4735]: I1008 12:57:07.908648 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:07 crc kubenswrapper[4735]: I1008 12:57:07.908823 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:07Z","lastTransitionTime":"2025-10-08T12:57:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.012748 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.012832 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.012874 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.012907 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.012928 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:08Z","lastTransitionTime":"2025-10-08T12:57:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.115394 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.115430 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.115441 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.115456 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.115468 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:08Z","lastTransitionTime":"2025-10-08T12:57:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.217948 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.217979 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.217988 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.218000 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.218009 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:08Z","lastTransitionTime":"2025-10-08T12:57:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.324426 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.324540 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.324564 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.324590 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.324614 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:08Z","lastTransitionTime":"2025-10-08T12:57:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.327232 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.327277 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.327293 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.327313 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.327328 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:08Z","lastTransitionTime":"2025-10-08T12:57:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:08 crc kubenswrapper[4735]: E1008 12:57:08.341685 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:57:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:57:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:57:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:57:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:57:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:57:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:57:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:57:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0d020dc1-103d-4051-9317-7ecb0eb982ec\\\",\\\"systemUUID\\\":\\\"14046600-b993-4087-9f7d-570e1efe09a8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:08Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.346673 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.346898 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.347170 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.347306 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.347427 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:08Z","lastTransitionTime":"2025-10-08T12:57:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:08 crc kubenswrapper[4735]: E1008 12:57:08.367852 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:57:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:57:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:57:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:57:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:57:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:57:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:57:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:57:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0d020dc1-103d-4051-9317-7ecb0eb982ec\\\",\\\"systemUUID\\\":\\\"14046600-b993-4087-9f7d-570e1efe09a8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:08Z is after 2025-08-24T17:21:41Z"
Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.371654 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.371688 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.371697 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.371712 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.371721 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:08Z","lastTransitionTime":"2025-10-08T12:57:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:08 crc kubenswrapper[4735]: E1008 12:57:08.383836 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:57:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:57:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:57:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:57:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:57:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:57:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:57:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:57:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0d020dc1-103d-4051-9317-7ecb0eb982ec\\\",\\\"systemUUID\\\":\\\"14046600-b993-4087-9f7d-570e1efe09a8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:08Z is after 2025-08-24T17:21:41Z"
Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.387231 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.387272 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.387285 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.387303 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.387314 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:08Z","lastTransitionTime":"2025-10-08T12:57:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:08 crc kubenswrapper[4735]: E1008 12:57:08.399562 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:57:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:57:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:57:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:57:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:57:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:57:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:57:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:57:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0d020dc1-103d-4051-9317-7ecb0eb982ec\\\",\\\"systemUUID\\\":\\\"14046600-b993-4087-9f7d-570e1efe09a8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:08Z is after 2025-08-24T17:21:41Z"
Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.402664 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.402723 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.402741 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.402766 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.402784 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:08Z","lastTransitionTime":"2025-10-08T12:57:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:08 crc kubenswrapper[4735]: E1008 12:57:08.416932 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:57:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:57:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:57:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:57:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:57:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:57:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:57:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:57:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0d020dc1-103d-4051-9317-7ecb0eb982ec\\\",\\\"systemUUID\\\":\\\"14046600-b993-4087-9f7d-570e1efe09a8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:08Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:08 crc kubenswrapper[4735]: E1008 12:57:08.417168 4735 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.427099 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.427326 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.427416 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.427544 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.427648 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:08Z","lastTransitionTime":"2025-10-08T12:57:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.531420 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.531678 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.531731 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.531767 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.531813 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:08Z","lastTransitionTime":"2025-10-08T12:57:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.634068 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.634115 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.634131 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.634153 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.634171 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:08Z","lastTransitionTime":"2025-10-08T12:57:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.736742 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.736811 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.736833 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.736850 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.736862 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:08Z","lastTransitionTime":"2025-10-08T12:57:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.839916 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.839962 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.839974 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.839994 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.840006 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:08Z","lastTransitionTime":"2025-10-08T12:57:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.894240 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.894297 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.894365 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:57:08 crc kubenswrapper[4735]: E1008 12:57:08.894437 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 12:57:08 crc kubenswrapper[4735]: E1008 12:57:08.894701 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 12:57:08 crc kubenswrapper[4735]: E1008 12:57:08.894868 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.943063 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.943122 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.943138 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.943164 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:08 crc kubenswrapper[4735]: I1008 12:57:08.943182 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:08Z","lastTransitionTime":"2025-10-08T12:57:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:09 crc kubenswrapper[4735]: I1008 12:57:09.045910 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:09 crc kubenswrapper[4735]: I1008 12:57:09.046201 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:09 crc kubenswrapper[4735]: I1008 12:57:09.046341 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:09 crc kubenswrapper[4735]: I1008 12:57:09.046516 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:09 crc kubenswrapper[4735]: I1008 12:57:09.046621 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:09Z","lastTransitionTime":"2025-10-08T12:57:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:09 crc kubenswrapper[4735]: I1008 12:57:09.148973 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:09 crc kubenswrapper[4735]: I1008 12:57:09.149015 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:09 crc kubenswrapper[4735]: I1008 12:57:09.149030 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:09 crc kubenswrapper[4735]: I1008 12:57:09.149049 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:09 crc kubenswrapper[4735]: I1008 12:57:09.149067 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:09Z","lastTransitionTime":"2025-10-08T12:57:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:09 crc kubenswrapper[4735]: I1008 12:57:09.252220 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:09 crc kubenswrapper[4735]: I1008 12:57:09.252316 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:09 crc kubenswrapper[4735]: I1008 12:57:09.252334 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:09 crc kubenswrapper[4735]: I1008 12:57:09.252361 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:09 crc kubenswrapper[4735]: I1008 12:57:09.252387 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:09Z","lastTransitionTime":"2025-10-08T12:57:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:09 crc kubenswrapper[4735]: I1008 12:57:09.355733 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:09 crc kubenswrapper[4735]: I1008 12:57:09.355788 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:09 crc kubenswrapper[4735]: I1008 12:57:09.355809 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:09 crc kubenswrapper[4735]: I1008 12:57:09.355838 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:09 crc kubenswrapper[4735]: I1008 12:57:09.355861 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:09Z","lastTransitionTime":"2025-10-08T12:57:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:09 crc kubenswrapper[4735]: I1008 12:57:09.458674 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:09 crc kubenswrapper[4735]: I1008 12:57:09.459648 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:09 crc kubenswrapper[4735]: I1008 12:57:09.459822 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:09 crc kubenswrapper[4735]: I1008 12:57:09.459995 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:09 crc kubenswrapper[4735]: I1008 12:57:09.460173 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:09Z","lastTransitionTime":"2025-10-08T12:57:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:09 crc kubenswrapper[4735]: I1008 12:57:09.563561 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:09 crc kubenswrapper[4735]: I1008 12:57:09.563638 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:09 crc kubenswrapper[4735]: I1008 12:57:09.563655 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:09 crc kubenswrapper[4735]: I1008 12:57:09.563682 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:09 crc kubenswrapper[4735]: I1008 12:57:09.563699 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:09Z","lastTransitionTime":"2025-10-08T12:57:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:09 crc kubenswrapper[4735]: I1008 12:57:09.666463 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:09 crc kubenswrapper[4735]: I1008 12:57:09.666750 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:09 crc kubenswrapper[4735]: I1008 12:57:09.666887 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:09 crc kubenswrapper[4735]: I1008 12:57:09.666979 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:09 crc kubenswrapper[4735]: I1008 12:57:09.667073 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:09Z","lastTransitionTime":"2025-10-08T12:57:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:09 crc kubenswrapper[4735]: I1008 12:57:09.770402 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:09 crc kubenswrapper[4735]: I1008 12:57:09.770513 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:09 crc kubenswrapper[4735]: I1008 12:57:09.770531 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:09 crc kubenswrapper[4735]: I1008 12:57:09.770584 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:09 crc kubenswrapper[4735]: I1008 12:57:09.770598 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:09Z","lastTransitionTime":"2025-10-08T12:57:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:09 crc kubenswrapper[4735]: I1008 12:57:09.873461 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:09 crc kubenswrapper[4735]: I1008 12:57:09.873528 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:09 crc kubenswrapper[4735]: I1008 12:57:09.873538 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:09 crc kubenswrapper[4735]: I1008 12:57:09.873553 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:09 crc kubenswrapper[4735]: I1008 12:57:09.873596 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:09Z","lastTransitionTime":"2025-10-08T12:57:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:57:09 crc kubenswrapper[4735]: I1008 12:57:09.894117 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mldxj" Oct 08 12:57:09 crc kubenswrapper[4735]: E1008 12:57:09.894272 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mldxj" podUID="771d7ef1-3d58-44eb-89b5-43a5da3e8091" Oct 08 12:57:09 crc kubenswrapper[4735]: I1008 12:57:09.981560 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:09 crc kubenswrapper[4735]: I1008 12:57:09.981719 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:09 crc kubenswrapper[4735]: I1008 12:57:09.981738 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:09 crc kubenswrapper[4735]: I1008 12:57:09.981760 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:09 crc kubenswrapper[4735]: I1008 12:57:09.981774 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:09Z","lastTransitionTime":"2025-10-08T12:57:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:10 crc kubenswrapper[4735]: I1008 12:57:10.085525 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:10 crc kubenswrapper[4735]: I1008 12:57:10.085578 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:10 crc kubenswrapper[4735]: I1008 12:57:10.085598 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:10 crc kubenswrapper[4735]: I1008 12:57:10.085620 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:10 crc kubenswrapper[4735]: I1008 12:57:10.085636 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:10Z","lastTransitionTime":"2025-10-08T12:57:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:10 crc kubenswrapper[4735]: I1008 12:57:10.189314 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:10 crc kubenswrapper[4735]: I1008 12:57:10.189369 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:10 crc kubenswrapper[4735]: I1008 12:57:10.189385 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:10 crc kubenswrapper[4735]: I1008 12:57:10.189408 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:10 crc kubenswrapper[4735]: I1008 12:57:10.189424 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:10Z","lastTransitionTime":"2025-10-08T12:57:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:10 crc kubenswrapper[4735]: I1008 12:57:10.292714 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:10 crc kubenswrapper[4735]: I1008 12:57:10.292980 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:10 crc kubenswrapper[4735]: I1008 12:57:10.293065 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:10 crc kubenswrapper[4735]: I1008 12:57:10.293194 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:10 crc kubenswrapper[4735]: I1008 12:57:10.293304 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:10Z","lastTransitionTime":"2025-10-08T12:57:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:10 crc kubenswrapper[4735]: I1008 12:57:10.396217 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:10 crc kubenswrapper[4735]: I1008 12:57:10.396255 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:10 crc kubenswrapper[4735]: I1008 12:57:10.396265 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:10 crc kubenswrapper[4735]: I1008 12:57:10.396280 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:10 crc kubenswrapper[4735]: I1008 12:57:10.396289 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:10Z","lastTransitionTime":"2025-10-08T12:57:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:10 crc kubenswrapper[4735]: I1008 12:57:10.499817 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:10 crc kubenswrapper[4735]: I1008 12:57:10.500870 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:10 crc kubenswrapper[4735]: I1008 12:57:10.501074 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:10 crc kubenswrapper[4735]: I1008 12:57:10.501260 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:10 crc kubenswrapper[4735]: I1008 12:57:10.501530 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:10Z","lastTransitionTime":"2025-10-08T12:57:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:10 crc kubenswrapper[4735]: I1008 12:57:10.606445 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:10 crc kubenswrapper[4735]: I1008 12:57:10.606514 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:10 crc kubenswrapper[4735]: I1008 12:57:10.606530 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:10 crc kubenswrapper[4735]: I1008 12:57:10.606549 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:10 crc kubenswrapper[4735]: I1008 12:57:10.606560 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:10Z","lastTransitionTime":"2025-10-08T12:57:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:10 crc kubenswrapper[4735]: I1008 12:57:10.711438 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:10 crc kubenswrapper[4735]: I1008 12:57:10.712011 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:10 crc kubenswrapper[4735]: I1008 12:57:10.712243 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:10 crc kubenswrapper[4735]: I1008 12:57:10.712459 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:10 crc kubenswrapper[4735]: I1008 12:57:10.712742 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:10Z","lastTransitionTime":"2025-10-08T12:57:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:10 crc kubenswrapper[4735]: I1008 12:57:10.814902 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:10 crc kubenswrapper[4735]: I1008 12:57:10.815136 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:10 crc kubenswrapper[4735]: I1008 12:57:10.815346 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:10 crc kubenswrapper[4735]: I1008 12:57:10.815461 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:10 crc kubenswrapper[4735]: I1008 12:57:10.815593 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:10Z","lastTransitionTime":"2025-10-08T12:57:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:57:10 crc kubenswrapper[4735]: I1008 12:57:10.894229 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:57:10 crc kubenswrapper[4735]: I1008 12:57:10.894362 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:57:10 crc kubenswrapper[4735]: I1008 12:57:10.894298 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 12:57:10 crc kubenswrapper[4735]: E1008 12:57:10.894792 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 12:57:10 crc kubenswrapper[4735]: E1008 12:57:10.894891 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 12:57:10 crc kubenswrapper[4735]: E1008 12:57:10.894663 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 12:57:10 crc kubenswrapper[4735]: I1008 12:57:10.909823 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jnmqc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed550b4-694a-4a43-99fc-f869224d6cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0a97189f5acb11a4baa7838d67360d6606941daf7520c583dd596cfab265203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qjms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jnmqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:10Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:10 crc kubenswrapper[4735]: I1008 12:57:10.918775 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:10 crc kubenswrapper[4735]: I1008 12:57:10.918814 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:10 crc kubenswrapper[4735]: I1008 12:57:10.918827 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:10 crc kubenswrapper[4735]: I1008 12:57:10.918846 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:10 crc kubenswrapper[4735]: I1008 12:57:10.918861 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:10Z","lastTransitionTime":"2025-10-08T12:57:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:10 crc kubenswrapper[4735]: I1008 12:57:10.925059 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2dfd0474-f64a-4d3b-9990-df182d079b10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4321c24e125d489ad4c28ca906b5143f7017d14e0424cb188a1a031fd07de603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fefb46d9936f914f5667f539f69ddaaf85342cc341db62c3eda312aa07a78d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9qwgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:10Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:10 crc kubenswrapper[4735]: I1008 12:57:10.945565 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653370-cd31-4487-8b63-50b145f55853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6da8f9945b0943287cecc7782253cd88835d3a5a823ae5e2107423b9bdc4e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fc9ee7ab9490b5462704528e6923ee420e0c9f3c0ed22e7041fedc8dfdfeb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://492da61345c2220eef06ac2e1979e5402ebb8deb0e3747bb3620ecccd4e83a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58cc5c76f1748baef586618dd49d3392757ef32e4f28a019979f84366e8879c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb1bc51dca9b3bf6c136fe146aa422fa40eb512385617607f682723053988f7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc88f8b567ef0f69f9df25a237c40b32bbd518680da75fe3cabd546713423bf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://173883d0ef0ca36b6f683421d0020cbbd0fcf4aaf08f17c53427387f9ed39d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://173883d0ef0ca36b6f683421d0020cbbd0fcf4aaf08f17c53427387f9ed39d5f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T12:57:06Z\\\",\\\"message\\\":\\\"o run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped 
already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:05Z is after 2025-08-24T17:21:41Z]\\\\nI1008 12:57:05.867649 6717 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_TCP_node_router+switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.10:53: 10.217.4.10:9154:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {be9dc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:57:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wckcs_openshift-ovn-kubernetes(c4653370-cd31-4487-8b63-50b145f55853)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1758c6a3e050fee720b9cae06332bc1d34000d2a61aa27470c483dc8757a5546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e39f9fb2f655437c3
aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wckcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:10Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:10 crc kubenswrapper[4735]: I1008 12:57:10.967451 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tsff9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0464374-65ce-46f6-9e7e-79018b92aff4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce56f02b4d3ab37c171c10113c79b74660cbdc570b55bd951baae65ef4276946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e01ac6683194ace06c37ecdbcda8584329034bc8fd0e6c532dcc6d7a036a72b3\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e01ac6683194ace06c37ecdbcda8584329034bc8fd0e6c532dcc6d7a036a72b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e086e1263cfb47daf4b8a395b1335f2e575f9d8c060983ba7090bbe27033d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48e086e1263cfb47daf4b8a395b1335f2e575f9d8c060983ba7090bbe27033d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aceb87b6bfa4020ca777a104430a285d886fdd8fb37b710f754eac6ad1b4304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aceb87b6bfa4020ca777a104430a285d886fdd8fb37b710f754eac6ad1b4304d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895f5
b597faf09af3dfd2688b3e42c2f1713b07e7d78c8d81e856dc3364727f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895f5b597faf09af3dfd2688b3e42c2f1713b07e7d78c8d81e856dc3364727f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://196d2d4913959ac507043de8431ea9294020c429fa0810c9985c77a23f52ad78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://196d2d4913959ac507043de8431ea9294020c429fa0810c9985c77a23f52ad78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:17Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b5ebc8f08568ab230481a7570dcdb0c6ed926f2216ef10de368d835811ef129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b5ebc8f08568ab230481a7570dcdb0c6ed926f2216ef10de368d835811ef129\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tsff9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:10Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:11 crc kubenswrapper[4735]: I1008 12:57:11.000164 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"844f0143-e7a8-48c9-ae8e-753c5c293f9d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e268579ace7ae7961414b2650b6b55471a9fd6574196c5fa2cefbf6979464b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e620e3a2b5b27ac9469d3ca045be90d19b04f969422d4c2a453650a7d06f0918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e7006d4b199ac7450c0e968adf9714db905f5378e279db304db244d0e5e566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72f60bc244cbde5bb
8033c24051f07fd6a2a168d97d89d3abcfaac69e6d75878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa627e3bfab38d53beee2a3201f6856009919d567108614667503273d11d0f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:10Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:11 crc kubenswrapper[4735]: I1008 12:57:11.021164 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:11 crc kubenswrapper[4735]: I1008 12:57:11.021102 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d7f6479-86cc-40be-8b44-cb1c10627aa2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc2f1c56c7be63150786ec037b77127e2df6071efa7d736d402bca8a011ea32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e2c83b125f3789eeff231ab5416af4461e2443e6abaf28dac1cd5a8d5356287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://302c54532d7a142b5a9483334e74f1d96b0f07920d9c3fcb589bf2b52d1c7124\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6272fc7157705e3f1cd86a22b80f722dc7d152785bcc8029e131cd03568a19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a92ee1544b61eda7e486e807f489c1f07cfc70870a2413575ac443203670542\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T12:56:10Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 12:56:04.373062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 12:56:04.379553 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3954726057/tls.crt::/tmp/serving-cert-3954726057/tls.key\\\\\\\"\\\\nI1008 12:56:10.064922 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 12:56:10.069814 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 12:56:10.069844 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 12:56:10.069873 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 12:56:10.069878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 12:56:10.082030 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 12:56:10.082115 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082144 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 12:56:10.082156 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 12:56:10.082166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 12:56:10.082177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1008 12:56:10.082039 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1008 12:56:10.085350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f517b378c42cea209cb39c077a6162a3569671ed2c8f91ae7b0b2c411af287a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87
ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:11Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:11 crc kubenswrapper[4735]: I1008 12:57:11.021202 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:11 crc kubenswrapper[4735]: I1008 12:57:11.021213 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:11 crc kubenswrapper[4735]: I1008 12:57:11.021229 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:11 crc kubenswrapper[4735]: I1008 12:57:11.021241 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:11Z","lastTransitionTime":"2025-10-08T12:57:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:11 crc kubenswrapper[4735]: I1008 12:57:11.035694 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aed4e152-1619-49cb-8449-bc39785f556b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af1083a706e809f20ad64584fcbdfee479bb7014566eab93c89d94c67e013c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2326bc324b30300d3c85d19e5434de
503b355b4e29fb72a13c8ab8d4884899c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9cf1a23dd55079fc635d7547125bc23b35d999c371c0bc3dd9e7e4a7c0c9fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e77c463c65fc07f8e0c2c8dd054b9dd96103da7426bbf1cd6e002436018231b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e77c463c65fc07f8e0c2c8dd054b9dd96103da7426bbf1cd6e002436018231b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:11Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:11 crc kubenswrapper[4735]: I1008 12:57:11.046738 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:11Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:11 crc kubenswrapper[4735]: I1008 12:57:11.057549 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc6814f3ae991f4c4576eaafb7f325021fe4ef3387b84948efb093150a7974d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T12:57:11Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:11 crc kubenswrapper[4735]: I1008 12:57:11.067438 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6287c74e-b672-43a2-adfc-1dc18d7bbd8b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa7259e4ff7dc49440079659b12b25117daf1733407e99f44156ec0c1165bd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.1
26.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d60ea228b3e0b96a8cdc6daa4a9e832d6b356c3e2b4413812cf1c03e9de76da2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d60ea228b3e0b96a8cdc6daa4a9e832d6b356c3e2b4413812cf1c03e9de76da2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:11Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:11 crc kubenswrapper[4735]: I1008 12:57:11.078400 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d289b739-3ef2-4ee7-bb9a-b5cd8ab2b999\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f9030479e031ff1768e372e6da2a0806dc393db70aed31961e3a9c89a9e789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://852a4dea1c1d8a40fc2ba89661ff17845ee58b62e45f4a7df9a8af505d361dd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e511b82acaea281ca468fbdce1eaa7ec3a5df59d91dc90fc500bebb08d51b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ce3b507093adcc6becc1a3b28e836174ee50ff4d3f583aa84badf4c1f3cb1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:11Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:11 crc kubenswrapper[4735]: I1008 12:57:11.091967 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b69f0d46aba2faee01906db32a5a6785e755b2eee94ccbaf1e438c63798f3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:11Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:11 crc kubenswrapper[4735]: I1008 12:57:11.103994 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:11Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:11 crc kubenswrapper[4735]: I1008 12:57:11.113355 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qdvjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28749736-d31e-40f2-84e8-a7839d8da376\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://680500d4224fd49593a4fa7294d1c49f059f330b2be33e3ef9ceccc6bd9b9a6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcv62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qdvjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:11Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:11 crc kubenswrapper[4735]: I1008 12:57:11.123779 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:11 crc kubenswrapper[4735]: I1008 12:57:11.123816 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:11 crc kubenswrapper[4735]: I1008 12:57:11.123825 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:11 crc kubenswrapper[4735]: I1008 12:57:11.123840 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:11 crc kubenswrapper[4735]: I1008 12:57:11.123850 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:11Z","lastTransitionTime":"2025-10-08T12:57:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:11 crc kubenswrapper[4735]: I1008 12:57:11.126028 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:11Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:11 crc kubenswrapper[4735]: I1008 12:57:11.136670 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28bb195cc66fff1263a0e92d27d57e97c81b63f873cb5e85d5271ac4fad40a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e88fb0202a08e60ff9cb6528688a4312ec5433491918326a5b46a5fb5e1de888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:11Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:11 crc kubenswrapper[4735]: I1008 12:57:11.147228 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tfdqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a82f845f92e94e6df109ce1bf748570e0bcbe66989bc38894371604d541aa14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33cd5d891ac473da94cdc0e5c4bcfcbe2ef79a76d13f87ad077df153496ea515\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T12:56:59Z\\\",\\\"message\\\":\\\"2025-10-08T12:56:14+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_3ad7249b-2c19-4671-b951-a27dbb0f1daa\\\\n2025-10-08T12:56:14+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_3ad7249b-2c19-4671-b951-a27dbb0f1daa to /host/opt/cni/bin/\\\\n2025-10-08T12:56:14Z [verbose] multus-daemon started\\\\n2025-10-08T12:56:14Z [verbose] 
Readiness Indicator file check\\\\n2025-10-08T12:56:59Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qwbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tfdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:11Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:11 crc kubenswrapper[4735]: I1008 12:57:11.156653 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d8pqk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d50458a5-b349-4384-9451-69dcb6f4e3d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29452c460590f3b977d7a9a86ec5bc26ed6d08054087cc55308bb8a0e6bcf966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kls7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ecd6dd311d753fc93739fd3f2da3af634853
87305d10887aa1b5425b0c7dbd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kls7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d8pqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:11Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:11 crc kubenswrapper[4735]: I1008 12:57:11.166844 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mldxj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"771d7ef1-3d58-44eb-89b5-43a5da3e8091\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk2wl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk2wl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mldxj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:11Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:11 crc 
kubenswrapper[4735]: I1008 12:57:11.227169 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:11 crc kubenswrapper[4735]: I1008 12:57:11.227232 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:11 crc kubenswrapper[4735]: I1008 12:57:11.227248 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:11 crc kubenswrapper[4735]: I1008 12:57:11.227272 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:11 crc kubenswrapper[4735]: I1008 12:57:11.227290 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:11Z","lastTransitionTime":"2025-10-08T12:57:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:11 crc kubenswrapper[4735]: I1008 12:57:11.329600 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:11 crc kubenswrapper[4735]: I1008 12:57:11.329649 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:11 crc kubenswrapper[4735]: I1008 12:57:11.329665 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:11 crc kubenswrapper[4735]: I1008 12:57:11.329689 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:11 crc kubenswrapper[4735]: I1008 12:57:11.329705 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:11Z","lastTransitionTime":"2025-10-08T12:57:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:11 crc kubenswrapper[4735]: I1008 12:57:11.432838 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:11 crc kubenswrapper[4735]: I1008 12:57:11.432872 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:11 crc kubenswrapper[4735]: I1008 12:57:11.432880 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:11 crc kubenswrapper[4735]: I1008 12:57:11.432903 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:11 crc kubenswrapper[4735]: I1008 12:57:11.432912 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:11Z","lastTransitionTime":"2025-10-08T12:57:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:11 crc kubenswrapper[4735]: I1008 12:57:11.535445 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:11 crc kubenswrapper[4735]: I1008 12:57:11.536010 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:11 crc kubenswrapper[4735]: I1008 12:57:11.536088 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:11 crc kubenswrapper[4735]: I1008 12:57:11.536273 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:11 crc kubenswrapper[4735]: I1008 12:57:11.536362 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:11Z","lastTransitionTime":"2025-10-08T12:57:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:11 crc kubenswrapper[4735]: I1008 12:57:11.639729 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:11 crc kubenswrapper[4735]: I1008 12:57:11.640142 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:11 crc kubenswrapper[4735]: I1008 12:57:11.640356 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:11 crc kubenswrapper[4735]: I1008 12:57:11.640623 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:11 crc kubenswrapper[4735]: I1008 12:57:11.640834 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:11Z","lastTransitionTime":"2025-10-08T12:57:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:11 crc kubenswrapper[4735]: I1008 12:57:11.744026 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:11 crc kubenswrapper[4735]: I1008 12:57:11.744508 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:11 crc kubenswrapper[4735]: I1008 12:57:11.744617 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:11 crc kubenswrapper[4735]: I1008 12:57:11.744723 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:11 crc kubenswrapper[4735]: I1008 12:57:11.744827 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:11Z","lastTransitionTime":"2025-10-08T12:57:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:11 crc kubenswrapper[4735]: I1008 12:57:11.849247 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:11 crc kubenswrapper[4735]: I1008 12:57:11.849314 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:11 crc kubenswrapper[4735]: I1008 12:57:11.849333 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:11 crc kubenswrapper[4735]: I1008 12:57:11.849359 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:11 crc kubenswrapper[4735]: I1008 12:57:11.849377 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:11Z","lastTransitionTime":"2025-10-08T12:57:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:57:11 crc kubenswrapper[4735]: I1008 12:57:11.894092 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mldxj" Oct 08 12:57:11 crc kubenswrapper[4735]: E1008 12:57:11.894656 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mldxj" podUID="771d7ef1-3d58-44eb-89b5-43a5da3e8091" Oct 08 12:57:11 crc kubenswrapper[4735]: I1008 12:57:11.952128 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:11 crc kubenswrapper[4735]: I1008 12:57:11.952197 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:11 crc kubenswrapper[4735]: I1008 12:57:11.952221 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:11 crc kubenswrapper[4735]: I1008 12:57:11.952252 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:11 crc kubenswrapper[4735]: I1008 12:57:11.952273 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:11Z","lastTransitionTime":"2025-10-08T12:57:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:12 crc kubenswrapper[4735]: I1008 12:57:12.055567 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:12 crc kubenswrapper[4735]: I1008 12:57:12.055680 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:12 crc kubenswrapper[4735]: I1008 12:57:12.055714 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:12 crc kubenswrapper[4735]: I1008 12:57:12.055729 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:12 crc kubenswrapper[4735]: I1008 12:57:12.055740 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:12Z","lastTransitionTime":"2025-10-08T12:57:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:12 crc kubenswrapper[4735]: I1008 12:57:12.158808 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:12 crc kubenswrapper[4735]: I1008 12:57:12.158863 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:12 crc kubenswrapper[4735]: I1008 12:57:12.158878 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:12 crc kubenswrapper[4735]: I1008 12:57:12.158892 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:12 crc kubenswrapper[4735]: I1008 12:57:12.158902 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:12Z","lastTransitionTime":"2025-10-08T12:57:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:12 crc kubenswrapper[4735]: I1008 12:57:12.261848 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:12 crc kubenswrapper[4735]: I1008 12:57:12.261887 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:12 crc kubenswrapper[4735]: I1008 12:57:12.261898 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:12 crc kubenswrapper[4735]: I1008 12:57:12.261911 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:12 crc kubenswrapper[4735]: I1008 12:57:12.261921 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:12Z","lastTransitionTime":"2025-10-08T12:57:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:12 crc kubenswrapper[4735]: I1008 12:57:12.364751 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:12 crc kubenswrapper[4735]: I1008 12:57:12.364812 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:12 crc kubenswrapper[4735]: I1008 12:57:12.364826 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:12 crc kubenswrapper[4735]: I1008 12:57:12.364853 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:12 crc kubenswrapper[4735]: I1008 12:57:12.364868 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:12Z","lastTransitionTime":"2025-10-08T12:57:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:12 crc kubenswrapper[4735]: I1008 12:57:12.467913 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:12 crc kubenswrapper[4735]: I1008 12:57:12.467971 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:12 crc kubenswrapper[4735]: I1008 12:57:12.467984 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:12 crc kubenswrapper[4735]: I1008 12:57:12.468004 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:12 crc kubenswrapper[4735]: I1008 12:57:12.468018 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:12Z","lastTransitionTime":"2025-10-08T12:57:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:12 crc kubenswrapper[4735]: I1008 12:57:12.571497 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:12 crc kubenswrapper[4735]: I1008 12:57:12.571543 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:12 crc kubenswrapper[4735]: I1008 12:57:12.571565 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:12 crc kubenswrapper[4735]: I1008 12:57:12.571593 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:12 crc kubenswrapper[4735]: I1008 12:57:12.571605 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:12Z","lastTransitionTime":"2025-10-08T12:57:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:12 crc kubenswrapper[4735]: I1008 12:57:12.675183 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:12 crc kubenswrapper[4735]: I1008 12:57:12.675251 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:12 crc kubenswrapper[4735]: I1008 12:57:12.675274 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:12 crc kubenswrapper[4735]: I1008 12:57:12.675305 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:12 crc kubenswrapper[4735]: I1008 12:57:12.675323 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:12Z","lastTransitionTime":"2025-10-08T12:57:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:12 crc kubenswrapper[4735]: I1008 12:57:12.778095 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:12 crc kubenswrapper[4735]: I1008 12:57:12.778153 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:12 crc kubenswrapper[4735]: I1008 12:57:12.778167 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:12 crc kubenswrapper[4735]: I1008 12:57:12.778185 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:12 crc kubenswrapper[4735]: I1008 12:57:12.778196 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:12Z","lastTransitionTime":"2025-10-08T12:57:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:12 crc kubenswrapper[4735]: I1008 12:57:12.882321 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:12 crc kubenswrapper[4735]: I1008 12:57:12.882382 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:12 crc kubenswrapper[4735]: I1008 12:57:12.882395 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:12 crc kubenswrapper[4735]: I1008 12:57:12.882421 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:12 crc kubenswrapper[4735]: I1008 12:57:12.882435 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:12Z","lastTransitionTime":"2025-10-08T12:57:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:57:12 crc kubenswrapper[4735]: I1008 12:57:12.894230 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:57:12 crc kubenswrapper[4735]: I1008 12:57:12.894297 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 12:57:12 crc kubenswrapper[4735]: E1008 12:57:12.894398 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 12:57:12 crc kubenswrapper[4735]: I1008 12:57:12.896701 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:57:12 crc kubenswrapper[4735]: E1008 12:57:12.897028 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 12:57:12 crc kubenswrapper[4735]: E1008 12:57:12.897247 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 12:57:12 crc kubenswrapper[4735]: I1008 12:57:12.986459 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:12 crc kubenswrapper[4735]: I1008 12:57:12.986548 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:12 crc kubenswrapper[4735]: I1008 12:57:12.986562 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:12 crc kubenswrapper[4735]: I1008 12:57:12.986584 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:12 crc kubenswrapper[4735]: I1008 12:57:12.986599 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:12Z","lastTransitionTime":"2025-10-08T12:57:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:13 crc kubenswrapper[4735]: I1008 12:57:13.088966 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:13 crc kubenswrapper[4735]: I1008 12:57:13.089450 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:13 crc kubenswrapper[4735]: I1008 12:57:13.089644 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:13 crc kubenswrapper[4735]: I1008 12:57:13.089849 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:13 crc kubenswrapper[4735]: I1008 12:57:13.089992 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:13Z","lastTransitionTime":"2025-10-08T12:57:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:13 crc kubenswrapper[4735]: I1008 12:57:13.194662 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:13 crc kubenswrapper[4735]: I1008 12:57:13.195181 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:13 crc kubenswrapper[4735]: I1008 12:57:13.195502 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:13 crc kubenswrapper[4735]: I1008 12:57:13.195718 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:13 crc kubenswrapper[4735]: I1008 12:57:13.195960 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:13Z","lastTransitionTime":"2025-10-08T12:57:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:13 crc kubenswrapper[4735]: I1008 12:57:13.300624 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:13 crc kubenswrapper[4735]: I1008 12:57:13.300711 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:13 crc kubenswrapper[4735]: I1008 12:57:13.300739 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:13 crc kubenswrapper[4735]: I1008 12:57:13.300774 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:13 crc kubenswrapper[4735]: I1008 12:57:13.300794 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:13Z","lastTransitionTime":"2025-10-08T12:57:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:13 crc kubenswrapper[4735]: I1008 12:57:13.404290 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:13 crc kubenswrapper[4735]: I1008 12:57:13.404354 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:13 crc kubenswrapper[4735]: I1008 12:57:13.404366 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:13 crc kubenswrapper[4735]: I1008 12:57:13.404388 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:13 crc kubenswrapper[4735]: I1008 12:57:13.404403 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:13Z","lastTransitionTime":"2025-10-08T12:57:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:13 crc kubenswrapper[4735]: I1008 12:57:13.506774 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:13 crc kubenswrapper[4735]: I1008 12:57:13.506831 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:13 crc kubenswrapper[4735]: I1008 12:57:13.506848 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:13 crc kubenswrapper[4735]: I1008 12:57:13.506872 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:13 crc kubenswrapper[4735]: I1008 12:57:13.506889 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:13Z","lastTransitionTime":"2025-10-08T12:57:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:13 crc kubenswrapper[4735]: I1008 12:57:13.609760 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:13 crc kubenswrapper[4735]: I1008 12:57:13.609808 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:13 crc kubenswrapper[4735]: I1008 12:57:13.609820 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:13 crc kubenswrapper[4735]: I1008 12:57:13.609835 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:13 crc kubenswrapper[4735]: I1008 12:57:13.609847 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:13Z","lastTransitionTime":"2025-10-08T12:57:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:13 crc kubenswrapper[4735]: I1008 12:57:13.712225 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:13 crc kubenswrapper[4735]: I1008 12:57:13.712268 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:13 crc kubenswrapper[4735]: I1008 12:57:13.712286 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:13 crc kubenswrapper[4735]: I1008 12:57:13.712308 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:13 crc kubenswrapper[4735]: I1008 12:57:13.712324 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:13Z","lastTransitionTime":"2025-10-08T12:57:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:13 crc kubenswrapper[4735]: I1008 12:57:13.815130 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:13 crc kubenswrapper[4735]: I1008 12:57:13.815454 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:13 crc kubenswrapper[4735]: I1008 12:57:13.815662 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:13 crc kubenswrapper[4735]: I1008 12:57:13.815776 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:13 crc kubenswrapper[4735]: I1008 12:57:13.815868 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:13Z","lastTransitionTime":"2025-10-08T12:57:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:57:13 crc kubenswrapper[4735]: I1008 12:57:13.893797 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mldxj" Oct 08 12:57:13 crc kubenswrapper[4735]: E1008 12:57:13.893986 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mldxj" podUID="771d7ef1-3d58-44eb-89b5-43a5da3e8091" Oct 08 12:57:13 crc kubenswrapper[4735]: I1008 12:57:13.918776 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:13 crc kubenswrapper[4735]: I1008 12:57:13.919081 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:13 crc kubenswrapper[4735]: I1008 12:57:13.919239 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:13 crc kubenswrapper[4735]: I1008 12:57:13.919364 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:13 crc kubenswrapper[4735]: I1008 12:57:13.919510 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:13Z","lastTransitionTime":"2025-10-08T12:57:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:14 crc kubenswrapper[4735]: I1008 12:57:14.022035 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:14 crc kubenswrapper[4735]: I1008 12:57:14.022090 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:14 crc kubenswrapper[4735]: I1008 12:57:14.022101 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:14 crc kubenswrapper[4735]: I1008 12:57:14.022128 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:14 crc kubenswrapper[4735]: I1008 12:57:14.022152 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:14Z","lastTransitionTime":"2025-10-08T12:57:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:14 crc kubenswrapper[4735]: I1008 12:57:14.123896 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:14 crc kubenswrapper[4735]: I1008 12:57:14.123939 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:14 crc kubenswrapper[4735]: I1008 12:57:14.123949 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:14 crc kubenswrapper[4735]: I1008 12:57:14.123964 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:14 crc kubenswrapper[4735]: I1008 12:57:14.123974 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:14Z","lastTransitionTime":"2025-10-08T12:57:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:14 crc kubenswrapper[4735]: I1008 12:57:14.226086 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:14 crc kubenswrapper[4735]: I1008 12:57:14.226127 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:14 crc kubenswrapper[4735]: I1008 12:57:14.226139 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:14 crc kubenswrapper[4735]: I1008 12:57:14.226155 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:14 crc kubenswrapper[4735]: I1008 12:57:14.226166 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:14Z","lastTransitionTime":"2025-10-08T12:57:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:14 crc kubenswrapper[4735]: I1008 12:57:14.328774 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:14 crc kubenswrapper[4735]: I1008 12:57:14.329032 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:14 crc kubenswrapper[4735]: I1008 12:57:14.329183 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:14 crc kubenswrapper[4735]: I1008 12:57:14.329329 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:14 crc kubenswrapper[4735]: I1008 12:57:14.329511 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:14Z","lastTransitionTime":"2025-10-08T12:57:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:14 crc kubenswrapper[4735]: I1008 12:57:14.432190 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:14 crc kubenswrapper[4735]: I1008 12:57:14.432225 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:14 crc kubenswrapper[4735]: I1008 12:57:14.432233 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:14 crc kubenswrapper[4735]: I1008 12:57:14.432247 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:14 crc kubenswrapper[4735]: I1008 12:57:14.432257 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:14Z","lastTransitionTime":"2025-10-08T12:57:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:14 crc kubenswrapper[4735]: I1008 12:57:14.542894 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:14 crc kubenswrapper[4735]: I1008 12:57:14.543150 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:14 crc kubenswrapper[4735]: I1008 12:57:14.543171 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:14 crc kubenswrapper[4735]: I1008 12:57:14.543192 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:14 crc kubenswrapper[4735]: I1008 12:57:14.543210 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:14Z","lastTransitionTime":"2025-10-08T12:57:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:14 crc kubenswrapper[4735]: I1008 12:57:14.646083 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:14 crc kubenswrapper[4735]: I1008 12:57:14.646155 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:14 crc kubenswrapper[4735]: I1008 12:57:14.646165 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:14 crc kubenswrapper[4735]: I1008 12:57:14.646186 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:14 crc kubenswrapper[4735]: I1008 12:57:14.646197 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:14Z","lastTransitionTime":"2025-10-08T12:57:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:14 crc kubenswrapper[4735]: I1008 12:57:14.674047 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 12:57:14 crc kubenswrapper[4735]: I1008 12:57:14.674245 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:57:14 crc kubenswrapper[4735]: E1008 12:57:14.674304 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 12:58:18.674262338 +0000 UTC m=+148.482567062 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:57:14 crc kubenswrapper[4735]: I1008 12:57:14.674376 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:57:14 crc kubenswrapper[4735]: E1008 12:57:14.674439 4735 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 12:57:14 crc kubenswrapper[4735]: I1008 12:57:14.674462 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:57:14 crc kubenswrapper[4735]: E1008 12:57:14.674547 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 12:58:18.674526265 +0000 UTC m=+148.482830809 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 12:57:14 crc kubenswrapper[4735]: E1008 12:57:14.674611 4735 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 12:57:14 crc kubenswrapper[4735]: E1008 12:57:14.674686 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 12:57:14 crc kubenswrapper[4735]: E1008 12:57:14.674744 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 12:57:14 crc kubenswrapper[4735]: E1008 12:57:14.674760 4735 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 12:57:14 crc kubenswrapper[4735]: E1008 12:57:14.674726 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 12:58:18.67471226 +0000 UTC m=+148.483017004 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 12:57:14 crc kubenswrapper[4735]: E1008 12:57:14.674850 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-08 12:58:18.674826602 +0000 UTC m=+148.483131336 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 12:57:14 crc kubenswrapper[4735]: I1008 12:57:14.748863 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:14 crc kubenswrapper[4735]: I1008 12:57:14.748913 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:14 crc kubenswrapper[4735]: I1008 12:57:14.748922 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:14 crc kubenswrapper[4735]: I1008 12:57:14.748942 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:14 crc kubenswrapper[4735]: I1008 12:57:14.748955 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:14Z","lastTransitionTime":"2025-10-08T12:57:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:57:14 crc kubenswrapper[4735]: I1008 12:57:14.775757 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 12:57:14 crc kubenswrapper[4735]: E1008 12:57:14.776031 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 12:57:14 crc kubenswrapper[4735]: E1008 12:57:14.776070 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 12:57:14 crc kubenswrapper[4735]: E1008 12:57:14.776087 4735 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 12:57:14 crc kubenswrapper[4735]: E1008 12:57:14.776163 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2025-10-08 12:58:18.776138416 +0000 UTC m=+148.584442960 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 12:57:14 crc kubenswrapper[4735]: I1008 12:57:14.851962 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:14 crc kubenswrapper[4735]: I1008 12:57:14.852012 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:14 crc kubenswrapper[4735]: I1008 12:57:14.852026 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:14 crc kubenswrapper[4735]: I1008 12:57:14.852046 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:14 crc kubenswrapper[4735]: I1008 12:57:14.852061 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:14Z","lastTransitionTime":"2025-10-08T12:57:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:57:14 crc kubenswrapper[4735]: I1008 12:57:14.894275 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 12:57:14 crc kubenswrapper[4735]: I1008 12:57:14.894275 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:57:14 crc kubenswrapper[4735]: I1008 12:57:14.894444 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:57:14 crc kubenswrapper[4735]: E1008 12:57:14.894572 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 12:57:14 crc kubenswrapper[4735]: E1008 12:57:14.894752 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 12:57:14 crc kubenswrapper[4735]: E1008 12:57:14.894814 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 12:57:14 crc kubenswrapper[4735]: I1008 12:57:14.955061 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:14 crc kubenswrapper[4735]: I1008 12:57:14.955413 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:14 crc kubenswrapper[4735]: I1008 12:57:14.955423 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:14 crc kubenswrapper[4735]: I1008 12:57:14.955439 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:14 crc kubenswrapper[4735]: I1008 12:57:14.955451 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:14Z","lastTransitionTime":"2025-10-08T12:57:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:15 crc kubenswrapper[4735]: I1008 12:57:15.058415 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:15 crc kubenswrapper[4735]: I1008 12:57:15.058470 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:15 crc kubenswrapper[4735]: I1008 12:57:15.058495 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:15 crc kubenswrapper[4735]: I1008 12:57:15.058512 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:15 crc kubenswrapper[4735]: I1008 12:57:15.058525 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:15Z","lastTransitionTime":"2025-10-08T12:57:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:15 crc kubenswrapper[4735]: I1008 12:57:15.161273 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:15 crc kubenswrapper[4735]: I1008 12:57:15.161339 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:15 crc kubenswrapper[4735]: I1008 12:57:15.161362 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:15 crc kubenswrapper[4735]: I1008 12:57:15.161392 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:15 crc kubenswrapper[4735]: I1008 12:57:15.161415 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:15Z","lastTransitionTime":"2025-10-08T12:57:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:15 crc kubenswrapper[4735]: I1008 12:57:15.264009 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:15 crc kubenswrapper[4735]: I1008 12:57:15.264053 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:15 crc kubenswrapper[4735]: I1008 12:57:15.264061 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:15 crc kubenswrapper[4735]: I1008 12:57:15.264077 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:15 crc kubenswrapper[4735]: I1008 12:57:15.264087 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:15Z","lastTransitionTime":"2025-10-08T12:57:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:15 crc kubenswrapper[4735]: I1008 12:57:15.367003 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:15 crc kubenswrapper[4735]: I1008 12:57:15.367042 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:15 crc kubenswrapper[4735]: I1008 12:57:15.367054 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:15 crc kubenswrapper[4735]: I1008 12:57:15.367069 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:15 crc kubenswrapper[4735]: I1008 12:57:15.367080 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:15Z","lastTransitionTime":"2025-10-08T12:57:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:15 crc kubenswrapper[4735]: I1008 12:57:15.469854 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:15 crc kubenswrapper[4735]: I1008 12:57:15.469914 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:15 crc kubenswrapper[4735]: I1008 12:57:15.469925 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:15 crc kubenswrapper[4735]: I1008 12:57:15.469942 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:15 crc kubenswrapper[4735]: I1008 12:57:15.469956 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:15Z","lastTransitionTime":"2025-10-08T12:57:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:15 crc kubenswrapper[4735]: I1008 12:57:15.572033 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:15 crc kubenswrapper[4735]: I1008 12:57:15.572079 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:15 crc kubenswrapper[4735]: I1008 12:57:15.572092 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:15 crc kubenswrapper[4735]: I1008 12:57:15.572111 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:15 crc kubenswrapper[4735]: I1008 12:57:15.572122 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:15Z","lastTransitionTime":"2025-10-08T12:57:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:15 crc kubenswrapper[4735]: I1008 12:57:15.675344 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:15 crc kubenswrapper[4735]: I1008 12:57:15.675404 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:15 crc kubenswrapper[4735]: I1008 12:57:15.675415 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:15 crc kubenswrapper[4735]: I1008 12:57:15.675436 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:15 crc kubenswrapper[4735]: I1008 12:57:15.675450 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:15Z","lastTransitionTime":"2025-10-08T12:57:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:15 crc kubenswrapper[4735]: I1008 12:57:15.778196 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:15 crc kubenswrapper[4735]: I1008 12:57:15.778252 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:15 crc kubenswrapper[4735]: I1008 12:57:15.778264 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:15 crc kubenswrapper[4735]: I1008 12:57:15.778280 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:15 crc kubenswrapper[4735]: I1008 12:57:15.778290 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:15Z","lastTransitionTime":"2025-10-08T12:57:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:15 crc kubenswrapper[4735]: I1008 12:57:15.880558 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:15 crc kubenswrapper[4735]: I1008 12:57:15.880596 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:15 crc kubenswrapper[4735]: I1008 12:57:15.880605 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:15 crc kubenswrapper[4735]: I1008 12:57:15.880619 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:15 crc kubenswrapper[4735]: I1008 12:57:15.880629 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:15Z","lastTransitionTime":"2025-10-08T12:57:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:57:15 crc kubenswrapper[4735]: I1008 12:57:15.894175 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mldxj" Oct 08 12:57:15 crc kubenswrapper[4735]: E1008 12:57:15.894298 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mldxj" podUID="771d7ef1-3d58-44eb-89b5-43a5da3e8091" Oct 08 12:57:15 crc kubenswrapper[4735]: I1008 12:57:15.983436 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:15 crc kubenswrapper[4735]: I1008 12:57:15.983468 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:15 crc kubenswrapper[4735]: I1008 12:57:15.983499 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:15 crc kubenswrapper[4735]: I1008 12:57:15.983513 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:15 crc kubenswrapper[4735]: I1008 12:57:15.983521 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:15Z","lastTransitionTime":"2025-10-08T12:57:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:16 crc kubenswrapper[4735]: I1008 12:57:16.086350 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:16 crc kubenswrapper[4735]: I1008 12:57:16.086432 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:16 crc kubenswrapper[4735]: I1008 12:57:16.086441 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:16 crc kubenswrapper[4735]: I1008 12:57:16.086453 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:16 crc kubenswrapper[4735]: I1008 12:57:16.086462 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:16Z","lastTransitionTime":"2025-10-08T12:57:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:16 crc kubenswrapper[4735]: I1008 12:57:16.189879 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:16 crc kubenswrapper[4735]: I1008 12:57:16.189934 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:16 crc kubenswrapper[4735]: I1008 12:57:16.189949 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:16 crc kubenswrapper[4735]: I1008 12:57:16.189967 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:16 crc kubenswrapper[4735]: I1008 12:57:16.189978 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:16Z","lastTransitionTime":"2025-10-08T12:57:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:16 crc kubenswrapper[4735]: I1008 12:57:16.292844 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:16 crc kubenswrapper[4735]: I1008 12:57:16.292878 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:16 crc kubenswrapper[4735]: I1008 12:57:16.292888 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:16 crc kubenswrapper[4735]: I1008 12:57:16.292903 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:16 crc kubenswrapper[4735]: I1008 12:57:16.292911 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:16Z","lastTransitionTime":"2025-10-08T12:57:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:16 crc kubenswrapper[4735]: I1008 12:57:16.395242 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:16 crc kubenswrapper[4735]: I1008 12:57:16.395501 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:16 crc kubenswrapper[4735]: I1008 12:57:16.395643 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:16 crc kubenswrapper[4735]: I1008 12:57:16.395716 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:16 crc kubenswrapper[4735]: I1008 12:57:16.395787 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:16Z","lastTransitionTime":"2025-10-08T12:57:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:16 crc kubenswrapper[4735]: I1008 12:57:16.498266 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:16 crc kubenswrapper[4735]: I1008 12:57:16.498312 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:16 crc kubenswrapper[4735]: I1008 12:57:16.498328 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:16 crc kubenswrapper[4735]: I1008 12:57:16.498349 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:16 crc kubenswrapper[4735]: I1008 12:57:16.498365 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:16Z","lastTransitionTime":"2025-10-08T12:57:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:16 crc kubenswrapper[4735]: I1008 12:57:16.601267 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:16 crc kubenswrapper[4735]: I1008 12:57:16.601360 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:16 crc kubenswrapper[4735]: I1008 12:57:16.601392 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:16 crc kubenswrapper[4735]: I1008 12:57:16.601416 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:16 crc kubenswrapper[4735]: I1008 12:57:16.601435 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:16Z","lastTransitionTime":"2025-10-08T12:57:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:16 crc kubenswrapper[4735]: I1008 12:57:16.704110 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:16 crc kubenswrapper[4735]: I1008 12:57:16.704152 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:16 crc kubenswrapper[4735]: I1008 12:57:16.704338 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:16 crc kubenswrapper[4735]: I1008 12:57:16.704356 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:16 crc kubenswrapper[4735]: I1008 12:57:16.704368 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:16Z","lastTransitionTime":"2025-10-08T12:57:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:16 crc kubenswrapper[4735]: I1008 12:57:16.807001 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:16 crc kubenswrapper[4735]: I1008 12:57:16.807039 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:16 crc kubenswrapper[4735]: I1008 12:57:16.807049 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:16 crc kubenswrapper[4735]: I1008 12:57:16.807064 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:16 crc kubenswrapper[4735]: I1008 12:57:16.807076 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:16Z","lastTransitionTime":"2025-10-08T12:57:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:57:16 crc kubenswrapper[4735]: I1008 12:57:16.894767 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 12:57:16 crc kubenswrapper[4735]: I1008 12:57:16.894821 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:57:16 crc kubenswrapper[4735]: I1008 12:57:16.894871 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:57:16 crc kubenswrapper[4735]: E1008 12:57:16.895360 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 12:57:16 crc kubenswrapper[4735]: E1008 12:57:16.895642 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 12:57:16 crc kubenswrapper[4735]: E1008 12:57:16.895707 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 12:57:16 crc kubenswrapper[4735]: I1008 12:57:16.909911 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:16 crc kubenswrapper[4735]: I1008 12:57:16.909956 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:16 crc kubenswrapper[4735]: I1008 12:57:16.909968 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:16 crc kubenswrapper[4735]: I1008 12:57:16.909986 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:16 crc kubenswrapper[4735]: I1008 12:57:16.910000 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:16Z","lastTransitionTime":"2025-10-08T12:57:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:17 crc kubenswrapper[4735]: I1008 12:57:17.013027 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:17 crc kubenswrapper[4735]: I1008 12:57:17.013069 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:17 crc kubenswrapper[4735]: I1008 12:57:17.013081 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:17 crc kubenswrapper[4735]: I1008 12:57:17.013099 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:17 crc kubenswrapper[4735]: I1008 12:57:17.013111 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:17Z","lastTransitionTime":"2025-10-08T12:57:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:17 crc kubenswrapper[4735]: I1008 12:57:17.116116 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:17 crc kubenswrapper[4735]: I1008 12:57:17.116176 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:17 crc kubenswrapper[4735]: I1008 12:57:17.116197 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:17 crc kubenswrapper[4735]: I1008 12:57:17.116223 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:17 crc kubenswrapper[4735]: I1008 12:57:17.116241 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:17Z","lastTransitionTime":"2025-10-08T12:57:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:17 crc kubenswrapper[4735]: I1008 12:57:17.219228 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:17 crc kubenswrapper[4735]: I1008 12:57:17.219290 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:17 crc kubenswrapper[4735]: I1008 12:57:17.219307 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:17 crc kubenswrapper[4735]: I1008 12:57:17.219330 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:17 crc kubenswrapper[4735]: I1008 12:57:17.219349 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:17Z","lastTransitionTime":"2025-10-08T12:57:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:17 crc kubenswrapper[4735]: I1008 12:57:17.322150 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:17 crc kubenswrapper[4735]: I1008 12:57:17.322195 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:17 crc kubenswrapper[4735]: I1008 12:57:17.322205 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:17 crc kubenswrapper[4735]: I1008 12:57:17.322220 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:17 crc kubenswrapper[4735]: I1008 12:57:17.322230 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:17Z","lastTransitionTime":"2025-10-08T12:57:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:17 crc kubenswrapper[4735]: I1008 12:57:17.425014 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:17 crc kubenswrapper[4735]: I1008 12:57:17.425051 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:17 crc kubenswrapper[4735]: I1008 12:57:17.425062 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:17 crc kubenswrapper[4735]: I1008 12:57:17.425080 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:17 crc kubenswrapper[4735]: I1008 12:57:17.425093 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:17Z","lastTransitionTime":"2025-10-08T12:57:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:17 crc kubenswrapper[4735]: I1008 12:57:17.527602 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:17 crc kubenswrapper[4735]: I1008 12:57:17.527647 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:17 crc kubenswrapper[4735]: I1008 12:57:17.527659 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:17 crc kubenswrapper[4735]: I1008 12:57:17.527678 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:17 crc kubenswrapper[4735]: I1008 12:57:17.527694 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:17Z","lastTransitionTime":"2025-10-08T12:57:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:17 crc kubenswrapper[4735]: I1008 12:57:17.630505 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:17 crc kubenswrapper[4735]: I1008 12:57:17.630744 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:17 crc kubenswrapper[4735]: I1008 12:57:17.630861 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:17 crc kubenswrapper[4735]: I1008 12:57:17.630961 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:17 crc kubenswrapper[4735]: I1008 12:57:17.631054 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:17Z","lastTransitionTime":"2025-10-08T12:57:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:17 crc kubenswrapper[4735]: I1008 12:57:17.734277 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:17 crc kubenswrapper[4735]: I1008 12:57:17.734340 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:17 crc kubenswrapper[4735]: I1008 12:57:17.734359 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:17 crc kubenswrapper[4735]: I1008 12:57:17.734382 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:17 crc kubenswrapper[4735]: I1008 12:57:17.734401 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:17Z","lastTransitionTime":"2025-10-08T12:57:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:17 crc kubenswrapper[4735]: I1008 12:57:17.837212 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:17 crc kubenswrapper[4735]: I1008 12:57:17.837241 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:17 crc kubenswrapper[4735]: I1008 12:57:17.837248 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:17 crc kubenswrapper[4735]: I1008 12:57:17.837260 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:17 crc kubenswrapper[4735]: I1008 12:57:17.837267 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:17Z","lastTransitionTime":"2025-10-08T12:57:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:57:17 crc kubenswrapper[4735]: I1008 12:57:17.894430 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mldxj" Oct 08 12:57:17 crc kubenswrapper[4735]: E1008 12:57:17.894599 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mldxj" podUID="771d7ef1-3d58-44eb-89b5-43a5da3e8091" Oct 08 12:57:17 crc kubenswrapper[4735]: I1008 12:57:17.939640 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:17 crc kubenswrapper[4735]: I1008 12:57:17.939675 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:17 crc kubenswrapper[4735]: I1008 12:57:17.939683 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:17 crc kubenswrapper[4735]: I1008 12:57:17.939696 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:17 crc kubenswrapper[4735]: I1008 12:57:17.939704 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:17Z","lastTransitionTime":"2025-10-08T12:57:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.042706 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.042743 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.042752 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.042766 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.042778 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:18Z","lastTransitionTime":"2025-10-08T12:57:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.145151 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.145203 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.145220 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.145242 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.145260 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:18Z","lastTransitionTime":"2025-10-08T12:57:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.247954 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.247996 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.248009 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.248025 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.248036 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:18Z","lastTransitionTime":"2025-10-08T12:57:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.351126 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.351190 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.351209 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.351235 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.351253 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:18Z","lastTransitionTime":"2025-10-08T12:57:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.454653 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.454724 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.454742 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.454767 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.454784 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:18Z","lastTransitionTime":"2025-10-08T12:57:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.557787 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.557865 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.557884 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.557910 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.557930 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:18Z","lastTransitionTime":"2025-10-08T12:57:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.614651 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.614717 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.614728 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.614753 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.614765 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:18Z","lastTransitionTime":"2025-10-08T12:57:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:18 crc kubenswrapper[4735]: E1008 12:57:18.629862 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:57:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:57:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:57:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:57:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:57:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:57:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:57:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:57:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0d020dc1-103d-4051-9317-7ecb0eb982ec\\\",\\\"systemUUID\\\":\\\"14046600-b993-4087-9f7d-570e1efe09a8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:18Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.633927 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.634089 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.634156 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.634222 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.634286 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:18Z","lastTransitionTime":"2025-10-08T12:57:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:18 crc kubenswrapper[4735]: E1008 12:57:18.649693 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:57:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:57:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:57:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:57:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:57:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:57:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:57:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:57:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0d020dc1-103d-4051-9317-7ecb0eb982ec\\\",\\\"systemUUID\\\":\\\"14046600-b993-4087-9f7d-570e1efe09a8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:18Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.653901 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.653930 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.653942 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.653961 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.653975 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:18Z","lastTransitionTime":"2025-10-08T12:57:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.676034 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.676075 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.676088 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.676106 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.676119 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:18Z","lastTransitionTime":"2025-10-08T12:57:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.692888 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.692938 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.692952 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.692970 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.692984 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:18Z","lastTransitionTime":"2025-10-08T12:57:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:18 crc kubenswrapper[4735]: E1008 12:57:18.706114 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:57:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:57:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:57:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:57:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:57:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:57:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T12:57:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T12:57:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0d020dc1-103d-4051-9317-7ecb0eb982ec\\\",\\\"systemUUID\\\":\\\"14046600-b993-4087-9f7d-570e1efe09a8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:18Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:18 crc kubenswrapper[4735]: E1008 12:57:18.706270 4735 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.707798 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.707826 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.707837 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.707855 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.707867 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:18Z","lastTransitionTime":"2025-10-08T12:57:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.810115 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.810647 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.810772 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.810859 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.810950 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:18Z","lastTransitionTime":"2025-10-08T12:57:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.894323 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.894362 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.894322 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 12:57:18 crc kubenswrapper[4735]: E1008 12:57:18.894453 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 12:57:18 crc kubenswrapper[4735]: E1008 12:57:18.894607 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 12:57:18 crc kubenswrapper[4735]: E1008 12:57:18.894671 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.913785 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.913986 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.914072 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.914133 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:18 crc kubenswrapper[4735]: I1008 12:57:18.914193 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:18Z","lastTransitionTime":"2025-10-08T12:57:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:19 crc kubenswrapper[4735]: I1008 12:57:19.017665 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:19 crc kubenswrapper[4735]: I1008 12:57:19.017762 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:19 crc kubenswrapper[4735]: I1008 12:57:19.017790 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:19 crc kubenswrapper[4735]: I1008 12:57:19.017820 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:19 crc kubenswrapper[4735]: I1008 12:57:19.017843 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:19Z","lastTransitionTime":"2025-10-08T12:57:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:19 crc kubenswrapper[4735]: I1008 12:57:19.121526 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:19 crc kubenswrapper[4735]: I1008 12:57:19.121780 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:19 crc kubenswrapper[4735]: I1008 12:57:19.122030 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:19 crc kubenswrapper[4735]: I1008 12:57:19.122270 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:19 crc kubenswrapper[4735]: I1008 12:57:19.122450 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:19Z","lastTransitionTime":"2025-10-08T12:57:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:19 crc kubenswrapper[4735]: I1008 12:57:19.225309 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:19 crc kubenswrapper[4735]: I1008 12:57:19.225609 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:19 crc kubenswrapper[4735]: I1008 12:57:19.225739 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:19 crc kubenswrapper[4735]: I1008 12:57:19.225847 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:19 crc kubenswrapper[4735]: I1008 12:57:19.225933 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:19Z","lastTransitionTime":"2025-10-08T12:57:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:19 crc kubenswrapper[4735]: I1008 12:57:19.328550 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:19 crc kubenswrapper[4735]: I1008 12:57:19.328600 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:19 crc kubenswrapper[4735]: I1008 12:57:19.328616 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:19 crc kubenswrapper[4735]: I1008 12:57:19.328635 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:19 crc kubenswrapper[4735]: I1008 12:57:19.328650 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:19Z","lastTransitionTime":"2025-10-08T12:57:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:19 crc kubenswrapper[4735]: I1008 12:57:19.443364 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:19 crc kubenswrapper[4735]: I1008 12:57:19.443421 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:19 crc kubenswrapper[4735]: I1008 12:57:19.443442 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:19 crc kubenswrapper[4735]: I1008 12:57:19.443467 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:19 crc kubenswrapper[4735]: I1008 12:57:19.443515 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:19Z","lastTransitionTime":"2025-10-08T12:57:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:19 crc kubenswrapper[4735]: I1008 12:57:19.545800 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:19 crc kubenswrapper[4735]: I1008 12:57:19.545890 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:19 crc kubenswrapper[4735]: I1008 12:57:19.545927 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:19 crc kubenswrapper[4735]: I1008 12:57:19.545952 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:19 crc kubenswrapper[4735]: I1008 12:57:19.545968 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:19Z","lastTransitionTime":"2025-10-08T12:57:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:19 crc kubenswrapper[4735]: I1008 12:57:19.649373 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:19 crc kubenswrapper[4735]: I1008 12:57:19.649431 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:19 crc kubenswrapper[4735]: I1008 12:57:19.649449 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:19 crc kubenswrapper[4735]: I1008 12:57:19.649471 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:19 crc kubenswrapper[4735]: I1008 12:57:19.649520 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:19Z","lastTransitionTime":"2025-10-08T12:57:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:19 crc kubenswrapper[4735]: I1008 12:57:19.752061 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:19 crc kubenswrapper[4735]: I1008 12:57:19.752094 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:19 crc kubenswrapper[4735]: I1008 12:57:19.752102 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:19 crc kubenswrapper[4735]: I1008 12:57:19.752116 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:19 crc kubenswrapper[4735]: I1008 12:57:19.752124 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:19Z","lastTransitionTime":"2025-10-08T12:57:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:19 crc kubenswrapper[4735]: I1008 12:57:19.854751 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:19 crc kubenswrapper[4735]: I1008 12:57:19.854883 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:19 crc kubenswrapper[4735]: I1008 12:57:19.854904 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:19 crc kubenswrapper[4735]: I1008 12:57:19.854929 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:19 crc kubenswrapper[4735]: I1008 12:57:19.854947 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:19Z","lastTransitionTime":"2025-10-08T12:57:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:57:19 crc kubenswrapper[4735]: I1008 12:57:19.894804 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mldxj" Oct 08 12:57:19 crc kubenswrapper[4735]: E1008 12:57:19.895011 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mldxj" podUID="771d7ef1-3d58-44eb-89b5-43a5da3e8091" Oct 08 12:57:19 crc kubenswrapper[4735]: I1008 12:57:19.958394 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:19 crc kubenswrapper[4735]: I1008 12:57:19.958455 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:19 crc kubenswrapper[4735]: I1008 12:57:19.958467 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:19 crc kubenswrapper[4735]: I1008 12:57:19.958503 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:19 crc kubenswrapper[4735]: I1008 12:57:19.958518 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:19Z","lastTransitionTime":"2025-10-08T12:57:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:20 crc kubenswrapper[4735]: I1008 12:57:20.061547 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:20 crc kubenswrapper[4735]: I1008 12:57:20.061601 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:20 crc kubenswrapper[4735]: I1008 12:57:20.061617 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:20 crc kubenswrapper[4735]: I1008 12:57:20.061640 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:20 crc kubenswrapper[4735]: I1008 12:57:20.061657 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:20Z","lastTransitionTime":"2025-10-08T12:57:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:20 crc kubenswrapper[4735]: I1008 12:57:20.164702 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:20 crc kubenswrapper[4735]: I1008 12:57:20.164768 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:20 crc kubenswrapper[4735]: I1008 12:57:20.164794 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:20 crc kubenswrapper[4735]: I1008 12:57:20.164826 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:20 crc kubenswrapper[4735]: I1008 12:57:20.164849 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:20Z","lastTransitionTime":"2025-10-08T12:57:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:20 crc kubenswrapper[4735]: I1008 12:57:20.267726 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:20 crc kubenswrapper[4735]: I1008 12:57:20.267789 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:20 crc kubenswrapper[4735]: I1008 12:57:20.267806 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:20 crc kubenswrapper[4735]: I1008 12:57:20.267831 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:20 crc kubenswrapper[4735]: I1008 12:57:20.267850 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:20Z","lastTransitionTime":"2025-10-08T12:57:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:20 crc kubenswrapper[4735]: I1008 12:57:20.370917 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:20 crc kubenswrapper[4735]: I1008 12:57:20.370963 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:20 crc kubenswrapper[4735]: I1008 12:57:20.370977 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:20 crc kubenswrapper[4735]: I1008 12:57:20.370997 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:20 crc kubenswrapper[4735]: I1008 12:57:20.371013 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:20Z","lastTransitionTime":"2025-10-08T12:57:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:20 crc kubenswrapper[4735]: I1008 12:57:20.474151 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:20 crc kubenswrapper[4735]: I1008 12:57:20.474210 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:20 crc kubenswrapper[4735]: I1008 12:57:20.474233 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:20 crc kubenswrapper[4735]: I1008 12:57:20.474265 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:20 crc kubenswrapper[4735]: I1008 12:57:20.474286 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:20Z","lastTransitionTime":"2025-10-08T12:57:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:20 crc kubenswrapper[4735]: I1008 12:57:20.577245 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:20 crc kubenswrapper[4735]: I1008 12:57:20.577315 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:20 crc kubenswrapper[4735]: I1008 12:57:20.577333 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:20 crc kubenswrapper[4735]: I1008 12:57:20.577357 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:20 crc kubenswrapper[4735]: I1008 12:57:20.577376 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:20Z","lastTransitionTime":"2025-10-08T12:57:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:20 crc kubenswrapper[4735]: I1008 12:57:20.680563 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:20 crc kubenswrapper[4735]: I1008 12:57:20.681020 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:20 crc kubenswrapper[4735]: I1008 12:57:20.681079 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:20 crc kubenswrapper[4735]: I1008 12:57:20.681104 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:20 crc kubenswrapper[4735]: I1008 12:57:20.681123 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:20Z","lastTransitionTime":"2025-10-08T12:57:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:20 crc kubenswrapper[4735]: I1008 12:57:20.784838 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:20 crc kubenswrapper[4735]: I1008 12:57:20.784904 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:20 crc kubenswrapper[4735]: I1008 12:57:20.784922 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:20 crc kubenswrapper[4735]: I1008 12:57:20.784948 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:20 crc kubenswrapper[4735]: I1008 12:57:20.784972 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:20Z","lastTransitionTime":"2025-10-08T12:57:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:20 crc kubenswrapper[4735]: I1008 12:57:20.889237 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:20 crc kubenswrapper[4735]: I1008 12:57:20.889317 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:20 crc kubenswrapper[4735]: I1008 12:57:20.889340 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:20 crc kubenswrapper[4735]: I1008 12:57:20.889370 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:20 crc kubenswrapper[4735]: I1008 12:57:20.889391 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:20Z","lastTransitionTime":"2025-10-08T12:57:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:57:20 crc kubenswrapper[4735]: I1008 12:57:20.893818 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 12:57:20 crc kubenswrapper[4735]: I1008 12:57:20.893899 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:57:20 crc kubenswrapper[4735]: I1008 12:57:20.893830 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:57:20 crc kubenswrapper[4735]: E1008 12:57:20.894087 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 12:57:20 crc kubenswrapper[4735]: E1008 12:57:20.894191 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 12:57:20 crc kubenswrapper[4735]: E1008 12:57:20.894338 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 12:57:20 crc kubenswrapper[4735]: I1008 12:57:20.928345 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653370-cd31-4487-8b63-50b145f55853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6da8f9945b0943287cecc7782253cd88835d3a5a823ae5e2107423b9bdc4e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fc9ee7ab9490b5462704528e6923ee420e0c9f3c0ed22e7041fedc8dfdfeb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://492da61345c2220eef06ac2e1979e5402ebb8deb0e3747bb3620ecccd4e83a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58cc5c76f1748baef586618dd49d3392757ef32e4f28a019979f84366e8879c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb1bc51dca9b3bf6c136fe146aa422fa40eb512385617607f682723053988f7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc88f8b567ef0f69f9df25a237c40b32bbd518680da75fe3cabd546713423bf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://173883d0ef0ca36b6f683421d0020cbbd0fcf4aaf08f17c53427387f9ed39d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://173883d0ef0ca36b6f683421d0020cbbd0fcf4aaf08f17c53427387f9ed39d5f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T12:57:06Z\\\",\\\"message\\\":\\\"o run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped 
already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:05Z is after 2025-08-24T17:21:41Z]\\\\nI1008 12:57:05.867649 6717 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_TCP_node_router+switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.10:53: 10.217.4.10:9154:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {be9dc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:57:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wckcs_openshift-ovn-kubernetes(c4653370-cd31-4487-8b63-50b145f55853)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1758c6a3e050fee720b9cae06332bc1d34000d2a61aa27470c483dc8757a5546\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e39f9fb2f655437c3
aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hmnq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wckcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:20Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:20 crc kubenswrapper[4735]: I1008 12:57:20.952303 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tsff9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0464374-65ce-46f6-9e7e-79018b92aff4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce56f02b4d3ab37c171c10113c79b74660cbdc570b55bd951baae65ef4276946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e01ac6683194ace06c37ecdbcda8584329034bc8fd0e6c532dcc6d7a036a72b3\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e01ac6683194ace06c37ecdbcda8584329034bc8fd0e6c532dcc6d7a036a72b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e086e1263cfb47daf4b8a395b1335f2e575f9d8c060983ba7090bbe27033d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48e086e1263cfb47daf4b8a395b1335f2e575f9d8c060983ba7090bbe27033d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aceb87b6bfa4020ca777a104430a285d886fdd8fb37b710f754eac6ad1b4304d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aceb87b6bfa4020ca777a104430a285d886fdd8fb37b710f754eac6ad1b4304d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895f5
b597faf09af3dfd2688b3e42c2f1713b07e7d78c8d81e856dc3364727f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895f5b597faf09af3dfd2688b3e42c2f1713b07e7d78c8d81e856dc3364727f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://196d2d4913959ac507043de8431ea9294020c429fa0810c9985c77a23f52ad78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://196d2d4913959ac507043de8431ea9294020c429fa0810c9985c77a23f52ad78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:17Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b5ebc8f08568ab230481a7570dcdb0c6ed926f2216ef10de368d835811ef129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b5ebc8f08568ab230481a7570dcdb0c6ed926f2216ef10de368d835811ef129\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:56:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzwrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tsff9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:20Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:20 crc kubenswrapper[4735]: I1008 12:57:20.984552 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"844f0143-e7a8-48c9-ae8e-753c5c293f9d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e268579ace7ae7961414b2650b6b55471a9fd6574196c5fa2cefbf6979464b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e620e3a2b5b27ac9469d3ca045be90d19b04f969422d4c2a453650a7d06f0918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e7006d4b199ac7450c0e968adf9714db905f5378e279db304db244d0e5e566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72f60bc244cbde5bb
8033c24051f07fd6a2a168d97d89d3abcfaac69e6d75878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa627e3bfab38d53beee2a3201f6856009919d567108614667503273d11d0f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2b0f6e8f6c2a0795b31d05d1f47097990994d9e63c42a7043d7871bd9ea45c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b17fb5d885822b263f99fe529984c131d2d42e0b822b4d8a70053f7219f45a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7c189373f7bc4c3644dc7ba1880bccdfdd3a910ad6e597160bd295be2db5a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:20Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:20 crc kubenswrapper[4735]: I1008 12:57:20.997703 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:20 crc kubenswrapper[4735]: I1008 12:57:20.997775 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:20 crc kubenswrapper[4735]: I1008 12:57:20.997796 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:20 crc kubenswrapper[4735]: I1008 12:57:20.997834 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:20 crc kubenswrapper[4735]: I1008 12:57:20.997854 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:20Z","lastTransitionTime":"2025-10-08T12:57:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:57:21 crc kubenswrapper[4735]: I1008 12:57:21.004320 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d7f6479-86cc-40be-8b44-cb1c10627aa2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc2f1c56c7be63150786ec037b77127e2df6071efa7d736d402bca8a011ea32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e2c83b125f3789eeff231ab5416af4461e2443e6abaf28dac1cd5a8d5356287\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://302c54532d7a142b5a9483334e74f1d96b0f07920d9c3fcb589bf2b52d1c7124\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1
a6272fc7157705e3f1cd86a22b80f722dc7d152785bcc8029e131cd03568a19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a92ee1544b61eda7e486e807f489c1f07cfc70870a2413575ac443203670542\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 12:56:04.373062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 12:56:04.379553 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3954726057/tls.crt::/tmp/serving-cert-3954726057/tls.key\\\\\\\"\\\\nI1008 12:56:10.064922 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 12:56:10.069814 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 12:56:10.069844 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 12:56:10.069873 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 12:56:10.069878 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 12:56:10.082030 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 12:56:10.082115 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082129 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 12:56:10.082144 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 12:56:10.082156 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 12:56:10.082166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 12:56:10.082177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1008 12:56:10.082039 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1008 12:56:10.085350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f517b378c42cea209cb39c077a6162a3569671ed2c8f91ae7b0b2c411af287a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{
\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e0a8673cf4b16176eb22bc463528e87ab8ee31b6f8ca3881376f9510b2087f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:21Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:21 crc kubenswrapper[4735]: I1008 12:57:21.017965 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aed4e152-1619-49cb-8449-bc39785f556b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af1083a706e809f20ad64584fcbdfee479bb7014566eab93c89d94c67e013c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2326bc324b30300d3c85d19e5434de503b355b4e29fb72a13c8ab8d4884899c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9cf1a23dd55079fc635d7547125bc23b35d999c371c0bc3dd9e7e4a7c0c9fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e77c463c65fc07f8e0c2c8dd054b9dd96103da7426bbf1cd6e002436018231b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e77c463c65fc07f8e0c2c8dd054b9dd96103da7426bbf1cd6e002436018231b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:21Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:21 crc kubenswrapper[4735]: I1008 12:57:21.038245 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:21Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:21 crc kubenswrapper[4735]: I1008 12:57:21.051326 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc6814f3ae991f4c4576eaafb7f325021fe4ef3387b84948efb093150a7974d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T12:57:21Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:21 crc kubenswrapper[4735]: I1008 12:57:21.067015 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2dfd0474-f64a-4d3b-9990-df182d079b10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4321c24e125d489ad4c28ca906b5143f7017d14e0424cb188a1a031fd07de603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fefb46d9936f914f5667f539f69ddaaf85342cc341db62c3eda312aa07a78d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl6np\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9qwgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:21Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:21 crc kubenswrapper[4735]: I1008 12:57:21.078726 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6287c74e-b672-43a2-adfc-1dc18d7bbd8b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa7259e4ff7dc49440079659b12b25117daf1733407e99f44156ec0c1165bd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d60ea228b3e0b96a8cdc6daa4a9e832d6b356c3e2b4413812cf1c03e9de76da2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d60ea228b3e0b96a8cdc6daa4a9e832d6b356c3e2b4413812cf1c03e9de76da2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:21Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:21 crc kubenswrapper[4735]: I1008 12:57:21.094298 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d289b739-3ef2-4ee7-bb9a-b5cd8ab2b999\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f9030479e031ff1768e372e6da2a0806dc393db70aed31961e3a9c89a9e789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://852a4dea1c1d8a40fc2ba89661ff17845ee58b62e45f4a7df9a8af505d361dd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e511b82acaea281ca468fbdce1eaa7ec3a5df59d91dc90fc500bebb08d51b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ce3b507093adcc6becc1a3b28e836174ee50ff4d3f583aa84badf4c1f3cb1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T12:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:55:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:21Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:21 crc kubenswrapper[4735]: I1008 12:57:21.100148 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:21 crc kubenswrapper[4735]: I1008 12:57:21.100351 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:21 crc kubenswrapper[4735]: I1008 12:57:21.100456 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:21 crc kubenswrapper[4735]: I1008 12:57:21.100575 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:21 crc kubenswrapper[4735]: I1008 12:57:21.100682 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:21Z","lastTransitionTime":"2025-10-08T12:57:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:57:21 crc kubenswrapper[4735]: I1008 12:57:21.109015 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b69f0d46aba2faee01906db32a5a6785e755b2eee94ccbaf1e438c63798f3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:21Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:21 crc kubenswrapper[4735]: I1008 12:57:21.123753 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:21Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:21 crc kubenswrapper[4735]: I1008 12:57:21.134146 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qdvjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28749736-d31e-40f2-84e8-a7839d8da376\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://680500d4224fd49593a4fa7294d1c49f059f330b2be33e3ef9ceccc6bd9b9a6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcv62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qdvjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:21Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:21 crc kubenswrapper[4735]: I1008 12:57:21.148790 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:21Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:21 crc kubenswrapper[4735]: I1008 12:57:21.162869 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28bb195cc66fff1263a0e92d27d57e97c81b63f873cb5e85d5271ac4fad40a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e88fb0202a08e60ff9cb6528688a4312ec5433491918326a5b46a5fb5e1de888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:21Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:21 crc kubenswrapper[4735]: I1008 12:57:21.175946 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tfdqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a82f845f92e94e6df109ce1bf748570e0bcbe66989bc38894371604d541aa14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33cd5d891ac473da94cdc0e5c4bcfcbe2ef79a76d13f87ad077df153496ea515\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T12:56:59Z\\\",\\\"message\\\":\\\"2025-10-08T12:56:14+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_3ad7249b-2c19-4671-b951-a27dbb0f1daa\\\\n2025-10-08T12:56:14+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_3ad7249b-2c19-4671-b951-a27dbb0f1daa to /host/opt/cni/bin/\\\\n2025-10-08T12:56:14Z [verbose] multus-daemon started\\\\n2025-10-08T12:56:14Z [verbose] 
Readiness Indicator file check\\\\n2025-10-08T12:56:59Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qwbk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tfdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:21Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:21 crc kubenswrapper[4735]: I1008 12:57:21.191087 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d8pqk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d50458a5-b349-4384-9451-69dcb6f4e3d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29452c460590f3b977d7a9a86ec5bc26ed6d08054087cc55308bb8a0e6bcf966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kls7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ecd6dd311d753fc93739fd3f2da3af634853
87305d10887aa1b5425b0c7dbd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kls7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d8pqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:21Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:21 crc kubenswrapper[4735]: I1008 12:57:21.202908 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:21 crc kubenswrapper[4735]: I1008 12:57:21.203204 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:21 crc kubenswrapper[4735]: I1008 12:57:21.203399 4735 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:21 crc kubenswrapper[4735]: I1008 12:57:21.203653 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:21 crc kubenswrapper[4735]: I1008 12:57:21.203779 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:21Z","lastTransitionTime":"2025-10-08T12:57:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:57:21 crc kubenswrapper[4735]: I1008 12:57:21.205427 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mldxj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"771d7ef1-3d58-44eb-89b5-43a5da3e8091\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk2wl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk2wl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mldxj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:21Z is after 2025-08-24T17:21:41Z" Oct 
08 12:57:21 crc kubenswrapper[4735]: I1008 12:57:21.217411 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jnmqc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed550b4-694a-4a43-99fc-f869224d6cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T12:56:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0a97189f5acb11a4baa7838d67360d6606941daf7520c583dd596cfab265203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T12:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qjms\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T12:56:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jnmqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T12:57:21Z is after 2025-08-24T17:21:41Z" Oct 08 12:57:21 crc kubenswrapper[4735]: I1008 12:57:21.307039 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:21 crc kubenswrapper[4735]: I1008 12:57:21.307463 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:21 crc kubenswrapper[4735]: I1008 12:57:21.307688 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:21 crc kubenswrapper[4735]: I1008 12:57:21.307907 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:21 crc kubenswrapper[4735]: I1008 12:57:21.308120 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:21Z","lastTransitionTime":"2025-10-08T12:57:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:21 crc kubenswrapper[4735]: I1008 12:57:21.410151 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:21 crc kubenswrapper[4735]: I1008 12:57:21.410420 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:21 crc kubenswrapper[4735]: I1008 12:57:21.410710 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:21 crc kubenswrapper[4735]: I1008 12:57:21.410884 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:21 crc kubenswrapper[4735]: I1008 12:57:21.410973 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:21Z","lastTransitionTime":"2025-10-08T12:57:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:21 crc kubenswrapper[4735]: I1008 12:57:21.516127 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:21 crc kubenswrapper[4735]: I1008 12:57:21.516246 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:21 crc kubenswrapper[4735]: I1008 12:57:21.516272 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:21 crc kubenswrapper[4735]: I1008 12:57:21.516303 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:21 crc kubenswrapper[4735]: I1008 12:57:21.516330 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:21Z","lastTransitionTime":"2025-10-08T12:57:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:21 crc kubenswrapper[4735]: I1008 12:57:21.620983 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:21 crc kubenswrapper[4735]: I1008 12:57:21.621035 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:21 crc kubenswrapper[4735]: I1008 12:57:21.621046 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:21 crc kubenswrapper[4735]: I1008 12:57:21.621063 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:21 crc kubenswrapper[4735]: I1008 12:57:21.621075 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:21Z","lastTransitionTime":"2025-10-08T12:57:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:21 crc kubenswrapper[4735]: I1008 12:57:21.724808 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:21 crc kubenswrapper[4735]: I1008 12:57:21.724903 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:21 crc kubenswrapper[4735]: I1008 12:57:21.724934 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:21 crc kubenswrapper[4735]: I1008 12:57:21.724963 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:21 crc kubenswrapper[4735]: I1008 12:57:21.725020 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:21Z","lastTransitionTime":"2025-10-08T12:57:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:21 crc kubenswrapper[4735]: I1008 12:57:21.828351 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:21 crc kubenswrapper[4735]: I1008 12:57:21.828415 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:21 crc kubenswrapper[4735]: I1008 12:57:21.828434 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:21 crc kubenswrapper[4735]: I1008 12:57:21.828457 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:21 crc kubenswrapper[4735]: I1008 12:57:21.828480 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:21Z","lastTransitionTime":"2025-10-08T12:57:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:57:21 crc kubenswrapper[4735]: I1008 12:57:21.894739 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mldxj" Oct 08 12:57:21 crc kubenswrapper[4735]: E1008 12:57:21.895294 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mldxj" podUID="771d7ef1-3d58-44eb-89b5-43a5da3e8091" Oct 08 12:57:21 crc kubenswrapper[4735]: I1008 12:57:21.895501 4735 scope.go:117] "RemoveContainer" containerID="173883d0ef0ca36b6f683421d0020cbbd0fcf4aaf08f17c53427387f9ed39d5f" Oct 08 12:57:21 crc kubenswrapper[4735]: E1008 12:57:21.895733 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-wckcs_openshift-ovn-kubernetes(c4653370-cd31-4487-8b63-50b145f55853)\"" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" podUID="c4653370-cd31-4487-8b63-50b145f55853" Oct 08 12:57:21 crc kubenswrapper[4735]: I1008 12:57:21.931091 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:21 crc kubenswrapper[4735]: I1008 12:57:21.931143 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:21 crc kubenswrapper[4735]: I1008 12:57:21.931164 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:21 crc kubenswrapper[4735]: I1008 12:57:21.931211 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:21 crc kubenswrapper[4735]: I1008 12:57:21.931229 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:21Z","lastTransitionTime":"2025-10-08T12:57:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:22 crc kubenswrapper[4735]: I1008 12:57:22.033906 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:22 crc kubenswrapper[4735]: I1008 12:57:22.033977 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:22 crc kubenswrapper[4735]: I1008 12:57:22.033993 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:22 crc kubenswrapper[4735]: I1008 12:57:22.034011 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:22 crc kubenswrapper[4735]: I1008 12:57:22.034024 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:22Z","lastTransitionTime":"2025-10-08T12:57:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:22 crc kubenswrapper[4735]: I1008 12:57:22.136904 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:22 crc kubenswrapper[4735]: I1008 12:57:22.137196 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:22 crc kubenswrapper[4735]: I1008 12:57:22.137314 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:22 crc kubenswrapper[4735]: I1008 12:57:22.137442 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:22 crc kubenswrapper[4735]: I1008 12:57:22.137627 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:22Z","lastTransitionTime":"2025-10-08T12:57:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:22 crc kubenswrapper[4735]: I1008 12:57:22.240951 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:22 crc kubenswrapper[4735]: I1008 12:57:22.241012 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:22 crc kubenswrapper[4735]: I1008 12:57:22.241024 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:22 crc kubenswrapper[4735]: I1008 12:57:22.241039 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:22 crc kubenswrapper[4735]: I1008 12:57:22.241048 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:22Z","lastTransitionTime":"2025-10-08T12:57:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:22 crc kubenswrapper[4735]: I1008 12:57:22.343415 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:22 crc kubenswrapper[4735]: I1008 12:57:22.343724 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:22 crc kubenswrapper[4735]: I1008 12:57:22.343810 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:22 crc kubenswrapper[4735]: I1008 12:57:22.343912 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:22 crc kubenswrapper[4735]: I1008 12:57:22.344006 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:22Z","lastTransitionTime":"2025-10-08T12:57:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:22 crc kubenswrapper[4735]: I1008 12:57:22.446445 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:22 crc kubenswrapper[4735]: I1008 12:57:22.446573 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:22 crc kubenswrapper[4735]: I1008 12:57:22.446601 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:22 crc kubenswrapper[4735]: I1008 12:57:22.446632 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:22 crc kubenswrapper[4735]: I1008 12:57:22.446654 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:22Z","lastTransitionTime":"2025-10-08T12:57:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:22 crc kubenswrapper[4735]: I1008 12:57:22.548974 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:22 crc kubenswrapper[4735]: I1008 12:57:22.549012 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:22 crc kubenswrapper[4735]: I1008 12:57:22.549024 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:22 crc kubenswrapper[4735]: I1008 12:57:22.549058 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:22 crc kubenswrapper[4735]: I1008 12:57:22.549069 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:22Z","lastTransitionTime":"2025-10-08T12:57:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:22 crc kubenswrapper[4735]: I1008 12:57:22.651378 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:22 crc kubenswrapper[4735]: I1008 12:57:22.651420 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:22 crc kubenswrapper[4735]: I1008 12:57:22.651431 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:22 crc kubenswrapper[4735]: I1008 12:57:22.651447 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:22 crc kubenswrapper[4735]: I1008 12:57:22.651457 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:22Z","lastTransitionTime":"2025-10-08T12:57:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:22 crc kubenswrapper[4735]: I1008 12:57:22.754877 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:22 crc kubenswrapper[4735]: I1008 12:57:22.754928 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:22 crc kubenswrapper[4735]: I1008 12:57:22.754939 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:22 crc kubenswrapper[4735]: I1008 12:57:22.754958 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:22 crc kubenswrapper[4735]: I1008 12:57:22.754968 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:22Z","lastTransitionTime":"2025-10-08T12:57:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:22 crc kubenswrapper[4735]: I1008 12:57:22.858326 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:22 crc kubenswrapper[4735]: I1008 12:57:22.858894 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:22 crc kubenswrapper[4735]: I1008 12:57:22.858974 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:22 crc kubenswrapper[4735]: I1008 12:57:22.859101 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:22 crc kubenswrapper[4735]: I1008 12:57:22.859192 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:22Z","lastTransitionTime":"2025-10-08T12:57:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:57:22 crc kubenswrapper[4735]: I1008 12:57:22.894446 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:57:22 crc kubenswrapper[4735]: I1008 12:57:22.894584 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:57:22 crc kubenswrapper[4735]: E1008 12:57:22.894681 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 12:57:22 crc kubenswrapper[4735]: I1008 12:57:22.894764 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 12:57:22 crc kubenswrapper[4735]: E1008 12:57:22.894918 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 12:57:22 crc kubenswrapper[4735]: E1008 12:57:22.895010 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 12:57:22 crc kubenswrapper[4735]: I1008 12:57:22.963105 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:22 crc kubenswrapper[4735]: I1008 12:57:22.963161 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:22 crc kubenswrapper[4735]: I1008 12:57:22.963174 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:22 crc kubenswrapper[4735]: I1008 12:57:22.963196 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:22 crc kubenswrapper[4735]: I1008 12:57:22.963210 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:22Z","lastTransitionTime":"2025-10-08T12:57:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:23 crc kubenswrapper[4735]: I1008 12:57:23.066580 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:23 crc kubenswrapper[4735]: I1008 12:57:23.066673 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:23 crc kubenswrapper[4735]: I1008 12:57:23.066685 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:23 crc kubenswrapper[4735]: I1008 12:57:23.066705 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:23 crc kubenswrapper[4735]: I1008 12:57:23.066717 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:23Z","lastTransitionTime":"2025-10-08T12:57:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:23 crc kubenswrapper[4735]: I1008 12:57:23.170480 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:23 crc kubenswrapper[4735]: I1008 12:57:23.170946 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:23 crc kubenswrapper[4735]: I1008 12:57:23.171026 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:23 crc kubenswrapper[4735]: I1008 12:57:23.171112 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:23 crc kubenswrapper[4735]: I1008 12:57:23.171212 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:23Z","lastTransitionTime":"2025-10-08T12:57:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:23 crc kubenswrapper[4735]: I1008 12:57:23.274991 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:23 crc kubenswrapper[4735]: I1008 12:57:23.275057 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:23 crc kubenswrapper[4735]: I1008 12:57:23.275072 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:23 crc kubenswrapper[4735]: I1008 12:57:23.275097 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:23 crc kubenswrapper[4735]: I1008 12:57:23.275109 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:23Z","lastTransitionTime":"2025-10-08T12:57:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:23 crc kubenswrapper[4735]: I1008 12:57:23.377759 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:23 crc kubenswrapper[4735]: I1008 12:57:23.377828 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:23 crc kubenswrapper[4735]: I1008 12:57:23.377846 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:23 crc kubenswrapper[4735]: I1008 12:57:23.377873 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:23 crc kubenswrapper[4735]: I1008 12:57:23.377892 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:23Z","lastTransitionTime":"2025-10-08T12:57:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:23 crc kubenswrapper[4735]: I1008 12:57:23.481072 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:23 crc kubenswrapper[4735]: I1008 12:57:23.481318 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:23 crc kubenswrapper[4735]: I1008 12:57:23.481388 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:23 crc kubenswrapper[4735]: I1008 12:57:23.481467 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:23 crc kubenswrapper[4735]: I1008 12:57:23.481577 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:23Z","lastTransitionTime":"2025-10-08T12:57:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:23 crc kubenswrapper[4735]: I1008 12:57:23.584908 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:23 crc kubenswrapper[4735]: I1008 12:57:23.584956 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:23 crc kubenswrapper[4735]: I1008 12:57:23.584974 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:23 crc kubenswrapper[4735]: I1008 12:57:23.584991 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:23 crc kubenswrapper[4735]: I1008 12:57:23.585003 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:23Z","lastTransitionTime":"2025-10-08T12:57:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:23 crc kubenswrapper[4735]: I1008 12:57:23.687570 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:23 crc kubenswrapper[4735]: I1008 12:57:23.687627 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:23 crc kubenswrapper[4735]: I1008 12:57:23.687639 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:23 crc kubenswrapper[4735]: I1008 12:57:23.687659 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:23 crc kubenswrapper[4735]: I1008 12:57:23.687673 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:23Z","lastTransitionTime":"2025-10-08T12:57:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:23 crc kubenswrapper[4735]: I1008 12:57:23.791025 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:23 crc kubenswrapper[4735]: I1008 12:57:23.791079 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:23 crc kubenswrapper[4735]: I1008 12:57:23.791090 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:23 crc kubenswrapper[4735]: I1008 12:57:23.791107 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:23 crc kubenswrapper[4735]: I1008 12:57:23.791118 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:23Z","lastTransitionTime":"2025-10-08T12:57:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:57:23 crc kubenswrapper[4735]: I1008 12:57:23.893627 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mldxj" Oct 08 12:57:23 crc kubenswrapper[4735]: I1008 12:57:23.893630 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:23 crc kubenswrapper[4735]: I1008 12:57:23.893698 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:23 crc kubenswrapper[4735]: I1008 12:57:23.893709 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:23 crc kubenswrapper[4735]: I1008 12:57:23.893724 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:23 crc kubenswrapper[4735]: I1008 12:57:23.893738 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:23Z","lastTransitionTime":"2025-10-08T12:57:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:57:23 crc kubenswrapper[4735]: E1008 12:57:23.894029 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mldxj" podUID="771d7ef1-3d58-44eb-89b5-43a5da3e8091" Oct 08 12:57:23 crc kubenswrapper[4735]: I1008 12:57:23.996190 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:23 crc kubenswrapper[4735]: I1008 12:57:23.996282 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:23 crc kubenswrapper[4735]: I1008 12:57:23.996337 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:23 crc kubenswrapper[4735]: I1008 12:57:23.996372 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:23 crc kubenswrapper[4735]: I1008 12:57:23.996393 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:23Z","lastTransitionTime":"2025-10-08T12:57:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:24 crc kubenswrapper[4735]: I1008 12:57:24.099322 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:24 crc kubenswrapper[4735]: I1008 12:57:24.099358 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:24 crc kubenswrapper[4735]: I1008 12:57:24.099369 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:24 crc kubenswrapper[4735]: I1008 12:57:24.099384 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:24 crc kubenswrapper[4735]: I1008 12:57:24.099396 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:24Z","lastTransitionTime":"2025-10-08T12:57:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:24 crc kubenswrapper[4735]: I1008 12:57:24.202045 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:24 crc kubenswrapper[4735]: I1008 12:57:24.202118 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:24 crc kubenswrapper[4735]: I1008 12:57:24.202129 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:24 crc kubenswrapper[4735]: I1008 12:57:24.202146 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:24 crc kubenswrapper[4735]: I1008 12:57:24.202156 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:24Z","lastTransitionTime":"2025-10-08T12:57:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:24 crc kubenswrapper[4735]: I1008 12:57:24.305235 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:24 crc kubenswrapper[4735]: I1008 12:57:24.305599 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:24 crc kubenswrapper[4735]: I1008 12:57:24.305822 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:24 crc kubenswrapper[4735]: I1008 12:57:24.306055 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:24 crc kubenswrapper[4735]: I1008 12:57:24.306206 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:24Z","lastTransitionTime":"2025-10-08T12:57:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:24 crc kubenswrapper[4735]: I1008 12:57:24.410127 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:24 crc kubenswrapper[4735]: I1008 12:57:24.410593 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:24 crc kubenswrapper[4735]: I1008 12:57:24.410769 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:24 crc kubenswrapper[4735]: I1008 12:57:24.410920 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:24 crc kubenswrapper[4735]: I1008 12:57:24.411062 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:24Z","lastTransitionTime":"2025-10-08T12:57:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:24 crc kubenswrapper[4735]: I1008 12:57:24.513880 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:24 crc kubenswrapper[4735]: I1008 12:57:24.514153 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:24 crc kubenswrapper[4735]: I1008 12:57:24.514362 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:24 crc kubenswrapper[4735]: I1008 12:57:24.514692 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:24 crc kubenswrapper[4735]: I1008 12:57:24.514802 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:24Z","lastTransitionTime":"2025-10-08T12:57:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:24 crc kubenswrapper[4735]: I1008 12:57:24.617365 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:24 crc kubenswrapper[4735]: I1008 12:57:24.617439 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:24 crc kubenswrapper[4735]: I1008 12:57:24.617456 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:24 crc kubenswrapper[4735]: I1008 12:57:24.617501 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:24 crc kubenswrapper[4735]: I1008 12:57:24.617520 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:24Z","lastTransitionTime":"2025-10-08T12:57:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:24 crc kubenswrapper[4735]: I1008 12:57:24.720990 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:24 crc kubenswrapper[4735]: I1008 12:57:24.721378 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:24 crc kubenswrapper[4735]: I1008 12:57:24.721544 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:24 crc kubenswrapper[4735]: I1008 12:57:24.721707 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:24 crc kubenswrapper[4735]: I1008 12:57:24.721829 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:24Z","lastTransitionTime":"2025-10-08T12:57:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:24 crc kubenswrapper[4735]: I1008 12:57:24.824672 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:24 crc kubenswrapper[4735]: I1008 12:57:24.824709 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:24 crc kubenswrapper[4735]: I1008 12:57:24.824719 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:24 crc kubenswrapper[4735]: I1008 12:57:24.824732 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:24 crc kubenswrapper[4735]: I1008 12:57:24.824741 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:24Z","lastTransitionTime":"2025-10-08T12:57:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:57:24 crc kubenswrapper[4735]: I1008 12:57:24.893989 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:57:24 crc kubenswrapper[4735]: I1008 12:57:24.894055 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:57:24 crc kubenswrapper[4735]: I1008 12:57:24.894026 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 12:57:24 crc kubenswrapper[4735]: E1008 12:57:24.894181 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 12:57:24 crc kubenswrapper[4735]: E1008 12:57:24.894291 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 12:57:24 crc kubenswrapper[4735]: E1008 12:57:24.894384 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 12:57:24 crc kubenswrapper[4735]: I1008 12:57:24.926860 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:24 crc kubenswrapper[4735]: I1008 12:57:24.926907 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:24 crc kubenswrapper[4735]: I1008 12:57:24.926921 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:24 crc kubenswrapper[4735]: I1008 12:57:24.926937 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:24 crc kubenswrapper[4735]: I1008 12:57:24.926949 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:24Z","lastTransitionTime":"2025-10-08T12:57:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:25 crc kubenswrapper[4735]: I1008 12:57:25.029692 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:25 crc kubenswrapper[4735]: I1008 12:57:25.030440 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:25 crc kubenswrapper[4735]: I1008 12:57:25.030483 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:25 crc kubenswrapper[4735]: I1008 12:57:25.030547 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:25 crc kubenswrapper[4735]: I1008 12:57:25.030569 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:25Z","lastTransitionTime":"2025-10-08T12:57:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:25 crc kubenswrapper[4735]: I1008 12:57:25.134004 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:25 crc kubenswrapper[4735]: I1008 12:57:25.134071 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:25 crc kubenswrapper[4735]: I1008 12:57:25.134094 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:25 crc kubenswrapper[4735]: I1008 12:57:25.134124 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:25 crc kubenswrapper[4735]: I1008 12:57:25.134149 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:25Z","lastTransitionTime":"2025-10-08T12:57:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:25 crc kubenswrapper[4735]: I1008 12:57:25.236936 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:25 crc kubenswrapper[4735]: I1008 12:57:25.236978 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:25 crc kubenswrapper[4735]: I1008 12:57:25.236991 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:25 crc kubenswrapper[4735]: I1008 12:57:25.237008 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:25 crc kubenswrapper[4735]: I1008 12:57:25.237020 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:25Z","lastTransitionTime":"2025-10-08T12:57:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:25 crc kubenswrapper[4735]: I1008 12:57:25.339253 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:25 crc kubenswrapper[4735]: I1008 12:57:25.339314 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:25 crc kubenswrapper[4735]: I1008 12:57:25.339330 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:25 crc kubenswrapper[4735]: I1008 12:57:25.339352 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:25 crc kubenswrapper[4735]: I1008 12:57:25.339369 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:25Z","lastTransitionTime":"2025-10-08T12:57:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:25 crc kubenswrapper[4735]: I1008 12:57:25.441765 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:25 crc kubenswrapper[4735]: I1008 12:57:25.442149 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:25 crc kubenswrapper[4735]: I1008 12:57:25.442306 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:25 crc kubenswrapper[4735]: I1008 12:57:25.442449 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:25 crc kubenswrapper[4735]: I1008 12:57:25.442708 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:25Z","lastTransitionTime":"2025-10-08T12:57:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:25 crc kubenswrapper[4735]: I1008 12:57:25.545872 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:25 crc kubenswrapper[4735]: I1008 12:57:25.545910 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:25 crc kubenswrapper[4735]: I1008 12:57:25.545925 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:25 crc kubenswrapper[4735]: I1008 12:57:25.545941 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:25 crc kubenswrapper[4735]: I1008 12:57:25.545952 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:25Z","lastTransitionTime":"2025-10-08T12:57:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:25 crc kubenswrapper[4735]: I1008 12:57:25.648773 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:25 crc kubenswrapper[4735]: I1008 12:57:25.648843 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:25 crc kubenswrapper[4735]: I1008 12:57:25.648854 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:25 crc kubenswrapper[4735]: I1008 12:57:25.648878 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:25 crc kubenswrapper[4735]: I1008 12:57:25.648891 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:25Z","lastTransitionTime":"2025-10-08T12:57:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:25 crc kubenswrapper[4735]: I1008 12:57:25.751688 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:25 crc kubenswrapper[4735]: I1008 12:57:25.751750 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:25 crc kubenswrapper[4735]: I1008 12:57:25.751761 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:25 crc kubenswrapper[4735]: I1008 12:57:25.751777 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:25 crc kubenswrapper[4735]: I1008 12:57:25.751789 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:25Z","lastTransitionTime":"2025-10-08T12:57:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:25 crc kubenswrapper[4735]: I1008 12:57:25.854653 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:25 crc kubenswrapper[4735]: I1008 12:57:25.854894 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:25 crc kubenswrapper[4735]: I1008 12:57:25.854989 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:25 crc kubenswrapper[4735]: I1008 12:57:25.855085 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:25 crc kubenswrapper[4735]: I1008 12:57:25.855184 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:25Z","lastTransitionTime":"2025-10-08T12:57:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:57:25 crc kubenswrapper[4735]: I1008 12:57:25.893684 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mldxj" Oct 08 12:57:25 crc kubenswrapper[4735]: E1008 12:57:25.894420 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mldxj" podUID="771d7ef1-3d58-44eb-89b5-43a5da3e8091" Oct 08 12:57:25 crc kubenswrapper[4735]: I1008 12:57:25.957919 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:25 crc kubenswrapper[4735]: I1008 12:57:25.957992 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:25 crc kubenswrapper[4735]: I1008 12:57:25.958004 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:25 crc kubenswrapper[4735]: I1008 12:57:25.958018 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:25 crc kubenswrapper[4735]: I1008 12:57:25.958028 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:25Z","lastTransitionTime":"2025-10-08T12:57:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:26 crc kubenswrapper[4735]: I1008 12:57:26.060470 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:26 crc kubenswrapper[4735]: I1008 12:57:26.060521 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:26 crc kubenswrapper[4735]: I1008 12:57:26.060532 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:26 crc kubenswrapper[4735]: I1008 12:57:26.060551 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:26 crc kubenswrapper[4735]: I1008 12:57:26.060562 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:26Z","lastTransitionTime":"2025-10-08T12:57:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:26 crc kubenswrapper[4735]: I1008 12:57:26.163363 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:26 crc kubenswrapper[4735]: I1008 12:57:26.163419 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:26 crc kubenswrapper[4735]: I1008 12:57:26.163432 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:26 crc kubenswrapper[4735]: I1008 12:57:26.163450 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:26 crc kubenswrapper[4735]: I1008 12:57:26.163461 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:26Z","lastTransitionTime":"2025-10-08T12:57:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:26 crc kubenswrapper[4735]: I1008 12:57:26.265634 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:26 crc kubenswrapper[4735]: I1008 12:57:26.265671 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:26 crc kubenswrapper[4735]: I1008 12:57:26.265679 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:26 crc kubenswrapper[4735]: I1008 12:57:26.265692 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:26 crc kubenswrapper[4735]: I1008 12:57:26.265703 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:26Z","lastTransitionTime":"2025-10-08T12:57:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:26 crc kubenswrapper[4735]: I1008 12:57:26.367772 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:26 crc kubenswrapper[4735]: I1008 12:57:26.368547 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:26 crc kubenswrapper[4735]: I1008 12:57:26.368560 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:26 crc kubenswrapper[4735]: I1008 12:57:26.368573 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:26 crc kubenswrapper[4735]: I1008 12:57:26.368590 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:26Z","lastTransitionTime":"2025-10-08T12:57:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:26 crc kubenswrapper[4735]: I1008 12:57:26.471539 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:26 crc kubenswrapper[4735]: I1008 12:57:26.471572 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:26 crc kubenswrapper[4735]: I1008 12:57:26.471582 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:26 crc kubenswrapper[4735]: I1008 12:57:26.471600 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:26 crc kubenswrapper[4735]: I1008 12:57:26.471616 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:26Z","lastTransitionTime":"2025-10-08T12:57:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:26 crc kubenswrapper[4735]: I1008 12:57:26.574558 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:26 crc kubenswrapper[4735]: I1008 12:57:26.574621 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:26 crc kubenswrapper[4735]: I1008 12:57:26.574639 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:26 crc kubenswrapper[4735]: I1008 12:57:26.574663 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:26 crc kubenswrapper[4735]: I1008 12:57:26.574680 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:26Z","lastTransitionTime":"2025-10-08T12:57:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:26 crc kubenswrapper[4735]: I1008 12:57:26.677850 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:26 crc kubenswrapper[4735]: I1008 12:57:26.677927 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:26 crc kubenswrapper[4735]: I1008 12:57:26.677952 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:26 crc kubenswrapper[4735]: I1008 12:57:26.677981 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:26 crc kubenswrapper[4735]: I1008 12:57:26.678006 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:26Z","lastTransitionTime":"2025-10-08T12:57:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:26 crc kubenswrapper[4735]: I1008 12:57:26.781324 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:26 crc kubenswrapper[4735]: I1008 12:57:26.781361 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:26 crc kubenswrapper[4735]: I1008 12:57:26.781372 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:26 crc kubenswrapper[4735]: I1008 12:57:26.781389 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:26 crc kubenswrapper[4735]: I1008 12:57:26.781399 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:26Z","lastTransitionTime":"2025-10-08T12:57:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:26 crc kubenswrapper[4735]: I1008 12:57:26.884840 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:26 crc kubenswrapper[4735]: I1008 12:57:26.884913 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:26 crc kubenswrapper[4735]: I1008 12:57:26.884938 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:26 crc kubenswrapper[4735]: I1008 12:57:26.884971 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:26 crc kubenswrapper[4735]: I1008 12:57:26.884995 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:26Z","lastTransitionTime":"2025-10-08T12:57:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:57:26 crc kubenswrapper[4735]: I1008 12:57:26.894387 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:57:26 crc kubenswrapper[4735]: E1008 12:57:26.894634 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 12:57:26 crc kubenswrapper[4735]: I1008 12:57:26.894784 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:57:26 crc kubenswrapper[4735]: E1008 12:57:26.894971 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 12:57:26 crc kubenswrapper[4735]: I1008 12:57:26.895016 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 12:57:26 crc kubenswrapper[4735]: E1008 12:57:26.895202 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 12:57:26 crc kubenswrapper[4735]: I1008 12:57:26.987699 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:26 crc kubenswrapper[4735]: I1008 12:57:26.988133 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:26 crc kubenswrapper[4735]: I1008 12:57:26.988334 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:26 crc kubenswrapper[4735]: I1008 12:57:26.988602 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:26 crc kubenswrapper[4735]: I1008 12:57:26.988807 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:26Z","lastTransitionTime":"2025-10-08T12:57:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:27 crc kubenswrapper[4735]: I1008 12:57:27.091416 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:27 crc kubenswrapper[4735]: I1008 12:57:27.091467 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:27 crc kubenswrapper[4735]: I1008 12:57:27.091509 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:27 crc kubenswrapper[4735]: I1008 12:57:27.091526 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:27 crc kubenswrapper[4735]: I1008 12:57:27.091537 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:27Z","lastTransitionTime":"2025-10-08T12:57:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:27 crc kubenswrapper[4735]: I1008 12:57:27.193975 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:27 crc kubenswrapper[4735]: I1008 12:57:27.194015 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:27 crc kubenswrapper[4735]: I1008 12:57:27.194026 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:27 crc kubenswrapper[4735]: I1008 12:57:27.194042 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:27 crc kubenswrapper[4735]: I1008 12:57:27.194061 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:27Z","lastTransitionTime":"2025-10-08T12:57:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:27 crc kubenswrapper[4735]: I1008 12:57:27.296124 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:27 crc kubenswrapper[4735]: I1008 12:57:27.296163 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:27 crc kubenswrapper[4735]: I1008 12:57:27.296174 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:27 crc kubenswrapper[4735]: I1008 12:57:27.296191 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:27 crc kubenswrapper[4735]: I1008 12:57:27.296202 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:27Z","lastTransitionTime":"2025-10-08T12:57:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:27 crc kubenswrapper[4735]: I1008 12:57:27.398952 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:27 crc kubenswrapper[4735]: I1008 12:57:27.399012 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:27 crc kubenswrapper[4735]: I1008 12:57:27.399026 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:27 crc kubenswrapper[4735]: I1008 12:57:27.399045 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:27 crc kubenswrapper[4735]: I1008 12:57:27.399061 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:27Z","lastTransitionTime":"2025-10-08T12:57:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:27 crc kubenswrapper[4735]: I1008 12:57:27.501860 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:27 crc kubenswrapper[4735]: I1008 12:57:27.501917 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:27 crc kubenswrapper[4735]: I1008 12:57:27.501934 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:27 crc kubenswrapper[4735]: I1008 12:57:27.501957 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:27 crc kubenswrapper[4735]: I1008 12:57:27.501979 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:27Z","lastTransitionTime":"2025-10-08T12:57:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:27 crc kubenswrapper[4735]: I1008 12:57:27.605201 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:27 crc kubenswrapper[4735]: I1008 12:57:27.605265 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:27 crc kubenswrapper[4735]: I1008 12:57:27.605281 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:27 crc kubenswrapper[4735]: I1008 12:57:27.605309 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:27 crc kubenswrapper[4735]: I1008 12:57:27.605327 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:27Z","lastTransitionTime":"2025-10-08T12:57:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:27 crc kubenswrapper[4735]: I1008 12:57:27.708720 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:27 crc kubenswrapper[4735]: I1008 12:57:27.708814 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:27 crc kubenswrapper[4735]: I1008 12:57:27.708837 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:27 crc kubenswrapper[4735]: I1008 12:57:27.708868 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:27 crc kubenswrapper[4735]: I1008 12:57:27.708889 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:27Z","lastTransitionTime":"2025-10-08T12:57:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:27 crc kubenswrapper[4735]: I1008 12:57:27.812217 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:27 crc kubenswrapper[4735]: I1008 12:57:27.812671 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:27 crc kubenswrapper[4735]: I1008 12:57:27.812829 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:27 crc kubenswrapper[4735]: I1008 12:57:27.812980 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:27 crc kubenswrapper[4735]: I1008 12:57:27.813124 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:27Z","lastTransitionTime":"2025-10-08T12:57:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:57:27 crc kubenswrapper[4735]: I1008 12:57:27.894623 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mldxj" Oct 08 12:57:27 crc kubenswrapper[4735]: E1008 12:57:27.894826 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mldxj" podUID="771d7ef1-3d58-44eb-89b5-43a5da3e8091" Oct 08 12:57:27 crc kubenswrapper[4735]: I1008 12:57:27.922211 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:27 crc kubenswrapper[4735]: I1008 12:57:27.922283 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:27 crc kubenswrapper[4735]: I1008 12:57:27.922306 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:27 crc kubenswrapper[4735]: I1008 12:57:27.922337 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:27 crc kubenswrapper[4735]: I1008 12:57:27.922358 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:27Z","lastTransitionTime":"2025-10-08T12:57:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:28 crc kubenswrapper[4735]: I1008 12:57:28.024922 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:28 crc kubenswrapper[4735]: I1008 12:57:28.024977 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:28 crc kubenswrapper[4735]: I1008 12:57:28.024993 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:28 crc kubenswrapper[4735]: I1008 12:57:28.025012 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:28 crc kubenswrapper[4735]: I1008 12:57:28.025026 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:28Z","lastTransitionTime":"2025-10-08T12:57:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:28 crc kubenswrapper[4735]: I1008 12:57:28.127546 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:28 crc kubenswrapper[4735]: I1008 12:57:28.127581 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:28 crc kubenswrapper[4735]: I1008 12:57:28.127589 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:28 crc kubenswrapper[4735]: I1008 12:57:28.127602 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:28 crc kubenswrapper[4735]: I1008 12:57:28.127610 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:28Z","lastTransitionTime":"2025-10-08T12:57:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:28 crc kubenswrapper[4735]: I1008 12:57:28.230586 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:28 crc kubenswrapper[4735]: I1008 12:57:28.230625 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:28 crc kubenswrapper[4735]: I1008 12:57:28.230636 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:28 crc kubenswrapper[4735]: I1008 12:57:28.230651 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:28 crc kubenswrapper[4735]: I1008 12:57:28.230660 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:28Z","lastTransitionTime":"2025-10-08T12:57:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:28 crc kubenswrapper[4735]: I1008 12:57:28.332831 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:28 crc kubenswrapper[4735]: I1008 12:57:28.332901 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:28 crc kubenswrapper[4735]: I1008 12:57:28.332913 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:28 crc kubenswrapper[4735]: I1008 12:57:28.332953 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:28 crc kubenswrapper[4735]: I1008 12:57:28.332967 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:28Z","lastTransitionTime":"2025-10-08T12:57:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:28 crc kubenswrapper[4735]: I1008 12:57:28.435733 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:28 crc kubenswrapper[4735]: I1008 12:57:28.435771 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:28 crc kubenswrapper[4735]: I1008 12:57:28.435780 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:28 crc kubenswrapper[4735]: I1008 12:57:28.435815 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:28 crc kubenswrapper[4735]: I1008 12:57:28.435826 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:28Z","lastTransitionTime":"2025-10-08T12:57:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:28 crc kubenswrapper[4735]: I1008 12:57:28.538766 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:28 crc kubenswrapper[4735]: I1008 12:57:28.538830 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:28 crc kubenswrapper[4735]: I1008 12:57:28.538848 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:28 crc kubenswrapper[4735]: I1008 12:57:28.538875 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:28 crc kubenswrapper[4735]: I1008 12:57:28.538894 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:28Z","lastTransitionTime":"2025-10-08T12:57:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:28 crc kubenswrapper[4735]: I1008 12:57:28.642089 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:28 crc kubenswrapper[4735]: I1008 12:57:28.642172 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:28 crc kubenswrapper[4735]: I1008 12:57:28.642196 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:28 crc kubenswrapper[4735]: I1008 12:57:28.642224 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:28 crc kubenswrapper[4735]: I1008 12:57:28.642242 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:28Z","lastTransitionTime":"2025-10-08T12:57:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:28 crc kubenswrapper[4735]: I1008 12:57:28.744371 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:28 crc kubenswrapper[4735]: I1008 12:57:28.744418 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:28 crc kubenswrapper[4735]: I1008 12:57:28.744430 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:28 crc kubenswrapper[4735]: I1008 12:57:28.744455 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:28 crc kubenswrapper[4735]: I1008 12:57:28.744467 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:28Z","lastTransitionTime":"2025-10-08T12:57:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 12:57:28 crc kubenswrapper[4735]: I1008 12:57:28.847147 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:28 crc kubenswrapper[4735]: I1008 12:57:28.847195 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:28 crc kubenswrapper[4735]: I1008 12:57:28.847239 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:28 crc kubenswrapper[4735]: I1008 12:57:28.847278 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:28 crc kubenswrapper[4735]: I1008 12:57:28.847293 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:28Z","lastTransitionTime":"2025-10-08T12:57:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:57:28 crc kubenswrapper[4735]: I1008 12:57:28.894456 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:57:28 crc kubenswrapper[4735]: I1008 12:57:28.894552 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 12:57:28 crc kubenswrapper[4735]: E1008 12:57:28.894653 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 12:57:28 crc kubenswrapper[4735]: I1008 12:57:28.894457 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:57:28 crc kubenswrapper[4735]: E1008 12:57:28.894896 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 12:57:28 crc kubenswrapper[4735]: E1008 12:57:28.894946 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 12:57:28 crc kubenswrapper[4735]: I1008 12:57:28.904956 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 12:57:28 crc kubenswrapper[4735]: I1008 12:57:28.905016 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 12:57:28 crc kubenswrapper[4735]: I1008 12:57:28.905040 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 12:57:28 crc kubenswrapper[4735]: I1008 12:57:28.905069 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 12:57:28 crc kubenswrapper[4735]: I1008 12:57:28.905093 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T12:57:28Z","lastTransitionTime":"2025-10-08T12:57:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 12:57:28 crc kubenswrapper[4735]: I1008 12:57:28.950612 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzl76"] Oct 08 12:57:28 crc kubenswrapper[4735]: I1008 12:57:28.951095 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzl76" Oct 08 12:57:28 crc kubenswrapper[4735]: I1008 12:57:28.952823 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Oct 08 12:57:28 crc kubenswrapper[4735]: I1008 12:57:28.953168 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Oct 08 12:57:28 crc kubenswrapper[4735]: I1008 12:57:28.953416 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Oct 08 12:57:28 crc kubenswrapper[4735]: I1008 12:57:28.954053 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Oct 08 12:57:28 crc kubenswrapper[4735]: I1008 12:57:28.981245 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=28.98122703 podStartE2EDuration="28.98122703s" podCreationTimestamp="2025-10-08 12:57:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 12:57:28.965276116 +0000 UTC m=+98.773580660" watchObservedRunningTime="2025-10-08 12:57:28.98122703 +0000 UTC m=+98.789531574" Oct 08 12:57:29 crc kubenswrapper[4735]: I1008 12:57:29.000633 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=78.000595149 podStartE2EDuration="1m18.000595149s" podCreationTimestamp="2025-10-08 12:56:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 12:57:28.982182654 +0000 UTC m=+98.790487198" watchObservedRunningTime="2025-10-08 12:57:29.000595149 
+0000 UTC m=+98.808899703" Oct 08 12:57:29 crc kubenswrapper[4735]: I1008 12:57:29.026665 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-qdvjc" podStartSLOduration=78.026650283 podStartE2EDuration="1m18.026650283s" podCreationTimestamp="2025-10-08 12:56:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 12:57:29.0265224 +0000 UTC m=+98.834826944" watchObservedRunningTime="2025-10-08 12:57:29.026650283 +0000 UTC m=+98.834954827" Oct 08 12:57:29 crc kubenswrapper[4735]: I1008 12:57:29.038065 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/73d274eb-c45d-47bd-8a68-ee31b92cd8b6-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-hzl76\" (UID: \"73d274eb-c45d-47bd-8a68-ee31b92cd8b6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzl76" Oct 08 12:57:29 crc kubenswrapper[4735]: I1008 12:57:29.038121 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/73d274eb-c45d-47bd-8a68-ee31b92cd8b6-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-hzl76\" (UID: \"73d274eb-c45d-47bd-8a68-ee31b92cd8b6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzl76" Oct 08 12:57:29 crc kubenswrapper[4735]: I1008 12:57:29.038170 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/73d274eb-c45d-47bd-8a68-ee31b92cd8b6-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-hzl76\" (UID: \"73d274eb-c45d-47bd-8a68-ee31b92cd8b6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzl76" Oct 08 12:57:29 crc 
kubenswrapper[4735]: I1008 12:57:29.038225 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/73d274eb-c45d-47bd-8a68-ee31b92cd8b6-service-ca\") pod \"cluster-version-operator-5c965bbfc6-hzl76\" (UID: \"73d274eb-c45d-47bd-8a68-ee31b92cd8b6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzl76" Oct 08 12:57:29 crc kubenswrapper[4735]: I1008 12:57:29.038252 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/73d274eb-c45d-47bd-8a68-ee31b92cd8b6-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-hzl76\" (UID: \"73d274eb-c45d-47bd-8a68-ee31b92cd8b6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzl76" Oct 08 12:57:29 crc kubenswrapper[4735]: I1008 12:57:29.085628 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-tfdqx" podStartSLOduration=78.08560897 podStartE2EDuration="1m18.08560897s" podCreationTimestamp="2025-10-08 12:56:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 12:57:29.085413785 +0000 UTC m=+98.893718349" watchObservedRunningTime="2025-10-08 12:57:29.08560897 +0000 UTC m=+98.893913524" Oct 08 12:57:29 crc kubenswrapper[4735]: I1008 12:57:29.110461 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d8pqk" podStartSLOduration=77.110446073 podStartE2EDuration="1m17.110446073s" podCreationTimestamp="2025-10-08 12:56:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 12:57:29.100432926 +0000 UTC m=+98.908737480" watchObservedRunningTime="2025-10-08 12:57:29.110446073 +0000 
UTC m=+98.918750607" Oct 08 12:57:29 crc kubenswrapper[4735]: I1008 12:57:29.120575 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-jnmqc" podStartSLOduration=78.120555433 podStartE2EDuration="1m18.120555433s" podCreationTimestamp="2025-10-08 12:56:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 12:57:29.120367169 +0000 UTC m=+98.928671723" watchObservedRunningTime="2025-10-08 12:57:29.120555433 +0000 UTC m=+98.928859977" Oct 08 12:57:29 crc kubenswrapper[4735]: I1008 12:57:29.132314 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podStartSLOduration=78.132291533 podStartE2EDuration="1m18.132291533s" podCreationTimestamp="2025-10-08 12:56:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 12:57:29.131228507 +0000 UTC m=+98.939533061" watchObservedRunningTime="2025-10-08 12:57:29.132291533 +0000 UTC m=+98.940596097" Oct 08 12:57:29 crc kubenswrapper[4735]: I1008 12:57:29.139304 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/73d274eb-c45d-47bd-8a68-ee31b92cd8b6-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-hzl76\" (UID: \"73d274eb-c45d-47bd-8a68-ee31b92cd8b6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzl76" Oct 08 12:57:29 crc kubenswrapper[4735]: I1008 12:57:29.139360 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/73d274eb-c45d-47bd-8a68-ee31b92cd8b6-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-hzl76\" (UID: \"73d274eb-c45d-47bd-8a68-ee31b92cd8b6\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzl76" Oct 08 12:57:29 crc kubenswrapper[4735]: I1008 12:57:29.139385 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/73d274eb-c45d-47bd-8a68-ee31b92cd8b6-service-ca\") pod \"cluster-version-operator-5c965bbfc6-hzl76\" (UID: \"73d274eb-c45d-47bd-8a68-ee31b92cd8b6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzl76" Oct 08 12:57:29 crc kubenswrapper[4735]: I1008 12:57:29.139410 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/73d274eb-c45d-47bd-8a68-ee31b92cd8b6-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-hzl76\" (UID: \"73d274eb-c45d-47bd-8a68-ee31b92cd8b6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzl76" Oct 08 12:57:29 crc kubenswrapper[4735]: I1008 12:57:29.139419 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/73d274eb-c45d-47bd-8a68-ee31b92cd8b6-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-hzl76\" (UID: \"73d274eb-c45d-47bd-8a68-ee31b92cd8b6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzl76" Oct 08 12:57:29 crc kubenswrapper[4735]: I1008 12:57:29.139421 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/73d274eb-c45d-47bd-8a68-ee31b92cd8b6-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-hzl76\" (UID: \"73d274eb-c45d-47bd-8a68-ee31b92cd8b6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzl76" Oct 08 12:57:29 crc kubenswrapper[4735]: I1008 12:57:29.139557 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/73d274eb-c45d-47bd-8a68-ee31b92cd8b6-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-hzl76\" (UID: \"73d274eb-c45d-47bd-8a68-ee31b92cd8b6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzl76" Oct 08 12:57:29 crc kubenswrapper[4735]: I1008 12:57:29.140185 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/73d274eb-c45d-47bd-8a68-ee31b92cd8b6-service-ca\") pod \"cluster-version-operator-5c965bbfc6-hzl76\" (UID: \"73d274eb-c45d-47bd-8a68-ee31b92cd8b6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzl76" Oct 08 12:57:29 crc kubenswrapper[4735]: I1008 12:57:29.145082 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/73d274eb-c45d-47bd-8a68-ee31b92cd8b6-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-hzl76\" (UID: \"73d274eb-c45d-47bd-8a68-ee31b92cd8b6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzl76" Oct 08 12:57:29 crc kubenswrapper[4735]: I1008 12:57:29.162689 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/73d274eb-c45d-47bd-8a68-ee31b92cd8b6-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-hzl76\" (UID: \"73d274eb-c45d-47bd-8a68-ee31b92cd8b6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzl76" Oct 08 12:57:29 crc kubenswrapper[4735]: I1008 12:57:29.183729 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-tsff9" podStartSLOduration=78.183711554 podStartE2EDuration="1m18.183711554s" podCreationTimestamp="2025-10-08 12:56:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 12:57:29.183411017 +0000 UTC 
m=+98.991715561" watchObservedRunningTime="2025-10-08 12:57:29.183711554 +0000 UTC m=+98.992016098" Oct 08 12:57:29 crc kubenswrapper[4735]: I1008 12:57:29.235833 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=78.235816892 podStartE2EDuration="1m18.235816892s" podCreationTimestamp="2025-10-08 12:56:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 12:57:29.234561101 +0000 UTC m=+99.042865655" watchObservedRunningTime="2025-10-08 12:57:29.235816892 +0000 UTC m=+99.044121436" Oct 08 12:57:29 crc kubenswrapper[4735]: I1008 12:57:29.250668 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=78.250650418 podStartE2EDuration="1m18.250650418s" podCreationTimestamp="2025-10-08 12:56:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 12:57:29.249732466 +0000 UTC m=+99.058037010" watchObservedRunningTime="2025-10-08 12:57:29.250650418 +0000 UTC m=+99.058954962" Oct 08 12:57:29 crc kubenswrapper[4735]: I1008 12:57:29.260480 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=50.260454961 podStartE2EDuration="50.260454961s" podCreationTimestamp="2025-10-08 12:56:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 12:57:29.260241325 +0000 UTC m=+99.068545869" watchObservedRunningTime="2025-10-08 12:57:29.260454961 +0000 UTC m=+99.068759505" Oct 08 12:57:29 crc kubenswrapper[4735]: I1008 12:57:29.263522 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzl76" Oct 08 12:57:29 crc kubenswrapper[4735]: I1008 12:57:29.438866 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzl76" event={"ID":"73d274eb-c45d-47bd-8a68-ee31b92cd8b6","Type":"ContainerStarted","Data":"c57bfb11637da6c987e62e882a8b8f9dd2dad04f50a26f9edbe185197057bdaa"} Oct 08 12:57:29 crc kubenswrapper[4735]: I1008 12:57:29.894099 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mldxj" Oct 08 12:57:29 crc kubenswrapper[4735]: E1008 12:57:29.894270 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mldxj" podUID="771d7ef1-3d58-44eb-89b5-43a5da3e8091" Oct 08 12:57:30 crc kubenswrapper[4735]: I1008 12:57:30.251193 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/771d7ef1-3d58-44eb-89b5-43a5da3e8091-metrics-certs\") pod \"network-metrics-daemon-mldxj\" (UID: \"771d7ef1-3d58-44eb-89b5-43a5da3e8091\") " pod="openshift-multus/network-metrics-daemon-mldxj" Oct 08 12:57:30 crc kubenswrapper[4735]: E1008 12:57:30.251375 4735 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 12:57:30 crc kubenswrapper[4735]: E1008 12:57:30.251456 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/771d7ef1-3d58-44eb-89b5-43a5da3e8091-metrics-certs podName:771d7ef1-3d58-44eb-89b5-43a5da3e8091 nodeName:}" failed. 
No retries permitted until 2025-10-08 12:58:34.25143082 +0000 UTC m=+164.059735374 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/771d7ef1-3d58-44eb-89b5-43a5da3e8091-metrics-certs") pod "network-metrics-daemon-mldxj" (UID: "771d7ef1-3d58-44eb-89b5-43a5da3e8091") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 12:57:30 crc kubenswrapper[4735]: I1008 12:57:30.445439 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzl76" event={"ID":"73d274eb-c45d-47bd-8a68-ee31b92cd8b6","Type":"ContainerStarted","Data":"cb1749ad468f4f3261689b1b13709662466d9e16b3ad86525ea4500e1df0fa80"} Oct 08 12:57:30 crc kubenswrapper[4735]: I1008 12:57:30.464913 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzl76" podStartSLOduration=79.464876104 podStartE2EDuration="1m19.464876104s" podCreationTimestamp="2025-10-08 12:56:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 12:57:30.464216498 +0000 UTC m=+100.272521102" watchObservedRunningTime="2025-10-08 12:57:30.464876104 +0000 UTC m=+100.273180658" Oct 08 12:57:30 crc kubenswrapper[4735]: I1008 12:57:30.894108 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:57:30 crc kubenswrapper[4735]: I1008 12:57:30.894155 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:57:30 crc kubenswrapper[4735]: I1008 12:57:30.894107 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 12:57:30 crc kubenswrapper[4735]: E1008 12:57:30.895243 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 12:57:30 crc kubenswrapper[4735]: E1008 12:57:30.895404 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 12:57:30 crc kubenswrapper[4735]: E1008 12:57:30.895469 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 12:57:31 crc kubenswrapper[4735]: I1008 12:57:31.894605 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mldxj" Oct 08 12:57:31 crc kubenswrapper[4735]: E1008 12:57:31.894800 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mldxj" podUID="771d7ef1-3d58-44eb-89b5-43a5da3e8091" Oct 08 12:57:32 crc kubenswrapper[4735]: I1008 12:57:32.894265 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 12:57:32 crc kubenswrapper[4735]: I1008 12:57:32.894297 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:57:32 crc kubenswrapper[4735]: I1008 12:57:32.894272 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:57:32 crc kubenswrapper[4735]: E1008 12:57:32.894420 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 12:57:32 crc kubenswrapper[4735]: E1008 12:57:32.894567 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 12:57:32 crc kubenswrapper[4735]: E1008 12:57:32.894648 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 12:57:33 crc kubenswrapper[4735]: I1008 12:57:33.894191 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mldxj" Oct 08 12:57:33 crc kubenswrapper[4735]: E1008 12:57:33.894306 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mldxj" podUID="771d7ef1-3d58-44eb-89b5-43a5da3e8091" Oct 08 12:57:34 crc kubenswrapper[4735]: I1008 12:57:34.893633 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 12:57:34 crc kubenswrapper[4735]: I1008 12:57:34.893929 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:57:34 crc kubenswrapper[4735]: I1008 12:57:34.893987 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:57:34 crc kubenswrapper[4735]: E1008 12:57:34.894137 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 12:57:34 crc kubenswrapper[4735]: E1008 12:57:34.894298 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 12:57:34 crc kubenswrapper[4735]: E1008 12:57:34.894391 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 12:57:35 crc kubenswrapper[4735]: I1008 12:57:35.893535 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mldxj" Oct 08 12:57:35 crc kubenswrapper[4735]: E1008 12:57:35.893663 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mldxj" podUID="771d7ef1-3d58-44eb-89b5-43a5da3e8091" Oct 08 12:57:36 crc kubenswrapper[4735]: I1008 12:57:36.894713 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:57:36 crc kubenswrapper[4735]: I1008 12:57:36.894897 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:57:36 crc kubenswrapper[4735]: I1008 12:57:36.895042 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 12:57:36 crc kubenswrapper[4735]: E1008 12:57:36.895036 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 12:57:36 crc kubenswrapper[4735]: E1008 12:57:36.895423 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 12:57:36 crc kubenswrapper[4735]: E1008 12:57:36.895800 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 12:57:36 crc kubenswrapper[4735]: I1008 12:57:36.896138 4735 scope.go:117] "RemoveContainer" containerID="173883d0ef0ca36b6f683421d0020cbbd0fcf4aaf08f17c53427387f9ed39d5f" Oct 08 12:57:36 crc kubenswrapper[4735]: E1008 12:57:36.896367 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-wckcs_openshift-ovn-kubernetes(c4653370-cd31-4487-8b63-50b145f55853)\"" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" podUID="c4653370-cd31-4487-8b63-50b145f55853" Oct 08 12:57:37 crc kubenswrapper[4735]: I1008 12:57:37.894721 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mldxj" Oct 08 12:57:37 crc kubenswrapper[4735]: E1008 12:57:37.895408 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mldxj" podUID="771d7ef1-3d58-44eb-89b5-43a5da3e8091" Oct 08 12:57:38 crc kubenswrapper[4735]: I1008 12:57:38.893714 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 12:57:38 crc kubenswrapper[4735]: E1008 12:57:38.893908 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 12:57:38 crc kubenswrapper[4735]: I1008 12:57:38.893936 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:57:38 crc kubenswrapper[4735]: E1008 12:57:38.894070 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 12:57:38 crc kubenswrapper[4735]: I1008 12:57:38.893732 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:57:38 crc kubenswrapper[4735]: E1008 12:57:38.894149 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 12:57:39 crc kubenswrapper[4735]: I1008 12:57:39.894690 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mldxj" Oct 08 12:57:39 crc kubenswrapper[4735]: E1008 12:57:39.895190 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mldxj" podUID="771d7ef1-3d58-44eb-89b5-43a5da3e8091" Oct 08 12:57:40 crc kubenswrapper[4735]: I1008 12:57:40.894711 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:57:40 crc kubenswrapper[4735]: E1008 12:57:40.895834 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 12:57:40 crc kubenswrapper[4735]: I1008 12:57:40.896141 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:57:40 crc kubenswrapper[4735]: I1008 12:57:40.896658 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 12:57:40 crc kubenswrapper[4735]: E1008 12:57:40.896753 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 12:57:40 crc kubenswrapper[4735]: E1008 12:57:40.896941 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 12:57:41 crc kubenswrapper[4735]: I1008 12:57:41.894123 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mldxj" Oct 08 12:57:41 crc kubenswrapper[4735]: E1008 12:57:41.894365 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mldxj" podUID="771d7ef1-3d58-44eb-89b5-43a5da3e8091" Oct 08 12:57:42 crc kubenswrapper[4735]: I1008 12:57:42.893814 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:57:42 crc kubenswrapper[4735]: E1008 12:57:42.894048 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 12:57:42 crc kubenswrapper[4735]: I1008 12:57:42.894341 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 12:57:42 crc kubenswrapper[4735]: E1008 12:57:42.894430 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 12:57:42 crc kubenswrapper[4735]: I1008 12:57:42.894680 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:57:42 crc kubenswrapper[4735]: E1008 12:57:42.894772 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 12:57:43 crc kubenswrapper[4735]: I1008 12:57:43.893805 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mldxj" Oct 08 12:57:43 crc kubenswrapper[4735]: E1008 12:57:43.894406 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mldxj" podUID="771d7ef1-3d58-44eb-89b5-43a5da3e8091" Oct 08 12:57:44 crc kubenswrapper[4735]: I1008 12:57:44.894574 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:57:44 crc kubenswrapper[4735]: I1008 12:57:44.894702 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 12:57:44 crc kubenswrapper[4735]: I1008 12:57:44.894762 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:57:44 crc kubenswrapper[4735]: E1008 12:57:44.894758 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 12:57:44 crc kubenswrapper[4735]: E1008 12:57:44.894900 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 12:57:44 crc kubenswrapper[4735]: E1008 12:57:44.895043 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 12:57:45 crc kubenswrapper[4735]: I1008 12:57:45.894567 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mldxj" Oct 08 12:57:45 crc kubenswrapper[4735]: E1008 12:57:45.894774 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mldxj" podUID="771d7ef1-3d58-44eb-89b5-43a5da3e8091" Oct 08 12:57:46 crc kubenswrapper[4735]: I1008 12:57:46.497592 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tfdqx_ef6f0aff-a0e5-477a-9392-71bc49bfa7aa/kube-multus/1.log" Oct 08 12:57:46 crc kubenswrapper[4735]: I1008 12:57:46.498115 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tfdqx_ef6f0aff-a0e5-477a-9392-71bc49bfa7aa/kube-multus/0.log" Oct 08 12:57:46 crc kubenswrapper[4735]: I1008 12:57:46.498161 4735 generic.go:334] "Generic (PLEG): container finished" podID="ef6f0aff-a0e5-477a-9392-71bc49bfa7aa" containerID="4a82f845f92e94e6df109ce1bf748570e0bcbe66989bc38894371604d541aa14" exitCode=1 Oct 08 12:57:46 crc kubenswrapper[4735]: I1008 12:57:46.498194 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tfdqx" event={"ID":"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa","Type":"ContainerDied","Data":"4a82f845f92e94e6df109ce1bf748570e0bcbe66989bc38894371604d541aa14"} Oct 08 12:57:46 crc kubenswrapper[4735]: I1008 12:57:46.498231 4735 scope.go:117] "RemoveContainer" containerID="33cd5d891ac473da94cdc0e5c4bcfcbe2ef79a76d13f87ad077df153496ea515" Oct 08 12:57:46 crc kubenswrapper[4735]: I1008 12:57:46.498858 4735 scope.go:117] "RemoveContainer" containerID="4a82f845f92e94e6df109ce1bf748570e0bcbe66989bc38894371604d541aa14" Oct 08 12:57:46 crc kubenswrapper[4735]: E1008 12:57:46.499105 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-tfdqx_openshift-multus(ef6f0aff-a0e5-477a-9392-71bc49bfa7aa)\"" pod="openshift-multus/multus-tfdqx" podUID="ef6f0aff-a0e5-477a-9392-71bc49bfa7aa" Oct 08 12:57:46 crc kubenswrapper[4735]: I1008 12:57:46.893897 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 12:57:46 crc kubenswrapper[4735]: I1008 12:57:46.894002 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:57:46 crc kubenswrapper[4735]: E1008 12:57:46.894068 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 12:57:46 crc kubenswrapper[4735]: E1008 12:57:46.894178 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 12:57:46 crc kubenswrapper[4735]: I1008 12:57:46.894591 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:57:46 crc kubenswrapper[4735]: E1008 12:57:46.894706 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 12:57:47 crc kubenswrapper[4735]: I1008 12:57:47.503916 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tfdqx_ef6f0aff-a0e5-477a-9392-71bc49bfa7aa/kube-multus/1.log" Oct 08 12:57:47 crc kubenswrapper[4735]: I1008 12:57:47.894670 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mldxj" Oct 08 12:57:47 crc kubenswrapper[4735]: E1008 12:57:47.895218 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mldxj" podUID="771d7ef1-3d58-44eb-89b5-43a5da3e8091" Oct 08 12:57:47 crc kubenswrapper[4735]: I1008 12:57:47.895739 4735 scope.go:117] "RemoveContainer" containerID="173883d0ef0ca36b6f683421d0020cbbd0fcf4aaf08f17c53427387f9ed39d5f" Oct 08 12:57:48 crc kubenswrapper[4735]: I1008 12:57:48.508683 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wckcs_c4653370-cd31-4487-8b63-50b145f55853/ovnkube-controller/3.log" Oct 08 12:57:48 crc kubenswrapper[4735]: I1008 12:57:48.511250 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" event={"ID":"c4653370-cd31-4487-8b63-50b145f55853","Type":"ContainerStarted","Data":"74742f2f31c02712067b32f11d523e62c5db7d107e1c74fc1df5c218f22d92f8"} Oct 08 12:57:48 crc kubenswrapper[4735]: I1008 12:57:48.511844 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:57:48 crc kubenswrapper[4735]: I1008 12:57:48.541063 4735 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" podStartSLOduration=97.541049263 podStartE2EDuration="1m37.541049263s" podCreationTimestamp="2025-10-08 12:56:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 12:57:48.54052258 +0000 UTC m=+118.348827114" watchObservedRunningTime="2025-10-08 12:57:48.541049263 +0000 UTC m=+118.349353807" Oct 08 12:57:48 crc kubenswrapper[4735]: I1008 12:57:48.812781 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-mldxj"] Oct 08 12:57:48 crc kubenswrapper[4735]: I1008 12:57:48.812936 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mldxj" Oct 08 12:57:48 crc kubenswrapper[4735]: E1008 12:57:48.813116 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mldxj" podUID="771d7ef1-3d58-44eb-89b5-43a5da3e8091" Oct 08 12:57:48 crc kubenswrapper[4735]: I1008 12:57:48.894010 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:57:48 crc kubenswrapper[4735]: I1008 12:57:48.894074 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 12:57:48 crc kubenswrapper[4735]: E1008 12:57:48.894142 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 12:57:48 crc kubenswrapper[4735]: E1008 12:57:48.894218 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 12:57:48 crc kubenswrapper[4735]: I1008 12:57:48.894293 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:57:48 crc kubenswrapper[4735]: E1008 12:57:48.894348 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 12:57:50 crc kubenswrapper[4735]: E1008 12:57:50.826750 4735 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Oct 08 12:57:50 crc kubenswrapper[4735]: I1008 12:57:50.893759 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:57:50 crc kubenswrapper[4735]: I1008 12:57:50.893780 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 12:57:50 crc kubenswrapper[4735]: I1008 12:57:50.893805 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:57:50 crc kubenswrapper[4735]: E1008 12:57:50.899594 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 12:57:50 crc kubenswrapper[4735]: I1008 12:57:50.900152 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mldxj" Oct 08 12:57:50 crc kubenswrapper[4735]: E1008 12:57:50.900371 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 12:57:50 crc kubenswrapper[4735]: E1008 12:57:50.900872 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mldxj" podUID="771d7ef1-3d58-44eb-89b5-43a5da3e8091" Oct 08 12:57:50 crc kubenswrapper[4735]: E1008 12:57:50.901267 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 12:57:51 crc kubenswrapper[4735]: E1008 12:57:51.013047 4735 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 08 12:57:52 crc kubenswrapper[4735]: I1008 12:57:52.894470 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mldxj" Oct 08 12:57:52 crc kubenswrapper[4735]: I1008 12:57:52.894523 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 12:57:52 crc kubenswrapper[4735]: I1008 12:57:52.894509 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:57:52 crc kubenswrapper[4735]: E1008 12:57:52.894637 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mldxj" podUID="771d7ef1-3d58-44eb-89b5-43a5da3e8091" Oct 08 12:57:52 crc kubenswrapper[4735]: I1008 12:57:52.894696 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:57:52 crc kubenswrapper[4735]: E1008 12:57:52.894817 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 12:57:52 crc kubenswrapper[4735]: E1008 12:57:52.894855 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 12:57:52 crc kubenswrapper[4735]: E1008 12:57:52.894917 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 12:57:54 crc kubenswrapper[4735]: I1008 12:57:54.894237 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:57:54 crc kubenswrapper[4735]: I1008 12:57:54.894354 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 12:57:54 crc kubenswrapper[4735]: I1008 12:57:54.894277 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mldxj" Oct 08 12:57:54 crc kubenswrapper[4735]: I1008 12:57:54.894456 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:57:54 crc kubenswrapper[4735]: E1008 12:57:54.894464 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 12:57:54 crc kubenswrapper[4735]: E1008 12:57:54.894637 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mldxj" podUID="771d7ef1-3d58-44eb-89b5-43a5da3e8091" Oct 08 12:57:54 crc kubenswrapper[4735]: E1008 12:57:54.894783 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 12:57:54 crc kubenswrapper[4735]: E1008 12:57:54.894932 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 12:57:56 crc kubenswrapper[4735]: E1008 12:57:56.014142 4735 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 08 12:57:56 crc kubenswrapper[4735]: I1008 12:57:56.893814 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:57:56 crc kubenswrapper[4735]: I1008 12:57:56.893945 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 12:57:56 crc kubenswrapper[4735]: I1008 12:57:56.894010 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mldxj" Oct 08 12:57:56 crc kubenswrapper[4735]: I1008 12:57:56.894089 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:57:56 crc kubenswrapper[4735]: E1008 12:57:56.894068 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 12:57:56 crc kubenswrapper[4735]: E1008 12:57:56.894238 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 12:57:56 crc kubenswrapper[4735]: E1008 12:57:56.894438 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 12:57:56 crc kubenswrapper[4735]: E1008 12:57:56.894678 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mldxj" podUID="771d7ef1-3d58-44eb-89b5-43a5da3e8091" Oct 08 12:57:58 crc kubenswrapper[4735]: I1008 12:57:58.893864 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:57:58 crc kubenswrapper[4735]: I1008 12:57:58.893992 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:57:58 crc kubenswrapper[4735]: I1008 12:57:58.893870 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 12:57:58 crc kubenswrapper[4735]: E1008 12:57:58.894092 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 12:57:58 crc kubenswrapper[4735]: I1008 12:57:58.894128 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mldxj" Oct 08 12:57:58 crc kubenswrapper[4735]: E1008 12:57:58.894409 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 12:57:58 crc kubenswrapper[4735]: E1008 12:57:58.894618 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 12:57:58 crc kubenswrapper[4735]: I1008 12:57:58.894638 4735 scope.go:117] "RemoveContainer" containerID="4a82f845f92e94e6df109ce1bf748570e0bcbe66989bc38894371604d541aa14" Oct 08 12:57:58 crc kubenswrapper[4735]: E1008 12:57:58.894714 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mldxj" podUID="771d7ef1-3d58-44eb-89b5-43a5da3e8091" Oct 08 12:57:59 crc kubenswrapper[4735]: I1008 12:57:59.552584 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tfdqx_ef6f0aff-a0e5-477a-9392-71bc49bfa7aa/kube-multus/1.log" Oct 08 12:57:59 crc kubenswrapper[4735]: I1008 12:57:59.552949 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tfdqx" event={"ID":"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa","Type":"ContainerStarted","Data":"a254857e1ec70adc6c42fdc42cf5c0c5398fcb735bcc4165db84ada832b4d6fc"} Oct 08 12:58:00 crc kubenswrapper[4735]: I1008 12:58:00.894634 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mldxj" Oct 08 12:58:00 crc kubenswrapper[4735]: I1008 12:58:00.894882 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 12:58:00 crc kubenswrapper[4735]: E1008 12:58:00.897223 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mldxj" podUID="771d7ef1-3d58-44eb-89b5-43a5da3e8091" Oct 08 12:58:00 crc kubenswrapper[4735]: I1008 12:58:00.897266 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:58:00 crc kubenswrapper[4735]: I1008 12:58:00.897369 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:58:00 crc kubenswrapper[4735]: E1008 12:58:00.897510 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 12:58:00 crc kubenswrapper[4735]: E1008 12:58:00.897735 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 12:58:00 crc kubenswrapper[4735]: E1008 12:58:00.897845 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 12:58:02 crc kubenswrapper[4735]: I1008 12:58:02.894004 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 12:58:02 crc kubenswrapper[4735]: I1008 12:58:02.894269 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:58:02 crc kubenswrapper[4735]: I1008 12:58:02.894320 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mldxj" Oct 08 12:58:02 crc kubenswrapper[4735]: I1008 12:58:02.894892 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:58:02 crc kubenswrapper[4735]: I1008 12:58:02.897022 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Oct 08 12:58:02 crc kubenswrapper[4735]: I1008 12:58:02.897220 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Oct 08 12:58:02 crc kubenswrapper[4735]: I1008 12:58:02.897251 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Oct 08 12:58:02 crc kubenswrapper[4735]: I1008 12:58:02.899160 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Oct 08 12:58:02 crc kubenswrapper[4735]: I1008 12:58:02.899475 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Oct 08 12:58:02 crc kubenswrapper[4735]: I1008 12:58:02.900003 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Oct 08 12:58:08 crc kubenswrapper[4735]: I1008 12:58:08.745424 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.827834 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Oct 08 12:58:09 crc kubenswrapper[4735]: 
I1008 12:58:09.880355 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-kkwtk"] Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.880995 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-kkwtk" Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.881453 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-wd7cr"] Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.882132 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rgtx4"] Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.882146 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wd7cr" Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.882745 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rgtx4" Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.883370 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-nnp5j"] Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.883723 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-xkrqh"] Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.884128 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-xkrqh" Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.884458 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-nnp5j" Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.884572 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-xqfrr"] Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.884962 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-xqfrr" Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.898951 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-8rqp7"] Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.899621 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.899631 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.899827 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-8rqp7" Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.901082 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.905714 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.905986 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.906328 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.907307 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.907616 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.907955 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.910242 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-gq4dv"] Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.910812 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gq4dv" Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.911239 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.911442 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.913171 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.913744 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.913883 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.914059 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Oct 08 12:58:09 crc kubenswrapper[4735]: W1008 12:58:09.914353 4735 reflector.go:561] object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2": failed to list *v1.Secret: secrets "route-controller-manager-sa-dockercfg-h2zr2" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-route-controller-manager": no relationship found between node 'crc' and this object Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.914437 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.914397 4735 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-machine-approver"/"machine-approver-config" Oct 08 12:58:09 crc kubenswrapper[4735]: E1008 12:58:09.914525 4735 reflector.go:158] "Unhandled Error" err="object-\"openshift-route-controller-manager\"/\"route-controller-manager-sa-dockercfg-h2zr2\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"route-controller-manager-sa-dockercfg-h2zr2\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-route-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.914608 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.914782 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.914781 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.914842 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.915124 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.915237 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.916656 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-z6f2f"] Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.917286 
4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-z6f2f" Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.918403 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.920538 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-t8cv2"] Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.921118 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t8cv2" Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.924553 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wcmct"] Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.925197 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wcmct" Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.925266 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.925358 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.925453 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.933734 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.941934 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.942167 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.942589 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.942928 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.944018 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.944265 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Oct 08 12:58:09 crc kubenswrapper[4735]: 
I1008 12:58:09.944465 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.944671 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.945691 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.946004 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.946120 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.946177 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.946251 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.946280 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.946422 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.946461 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.946538 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.946648 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.946770 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.946843 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.946652 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.947059 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.947004 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.947131 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.947200 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.946197 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.949379 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.949638 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.949727 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.949821 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.949887 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.949940 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.950007 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.950068 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.950121 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.950182 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.950287 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.950371 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.950421 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.950934 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.951046 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.951222 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.953220 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-jnr7v"]
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.953314 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.953531 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.954785 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jnr7v"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.954898 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-c4j7s"]
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.955579 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c4j7s"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.956561 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-gcn29"]
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.957208 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gcn29"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.960099 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zmvkr"]
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.960868 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-jgkq6"]
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.961437 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zmvkr"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.968681 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lrcx8"]
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.968769 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-jgkq6"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.969245 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7v5q4"]
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.969465 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lrcx8"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.969731 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qjb22"]
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.969942 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7v5q4"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.970146 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qjb22"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.970407 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sjbv7"]
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.970960 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sjbv7"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.975199 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6vdgs"]
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.976117 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xjxt2"]
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.976445 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-dc7kk"]
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.976983 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-dc7kk"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.977134 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6vdgs"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.977151 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xjxt2"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.980031 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e891ada-e4a4-4e15-8ade-48d0f40518da-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-xqfrr\" (UID: \"6e891ada-e4a4-4e15-8ade-48d0f40518da\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqfrr"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.980075 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4pwd\" (UniqueName: \"kubernetes.io/projected/79d2bd38-a2b9-4594-89bf-83c365f75f40-kube-api-access-k4pwd\") pod \"ingress-operator-5b745b69d9-t8cv2\" (UID: \"79d2bd38-a2b9-4594-89bf-83c365f75f40\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t8cv2"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.980101 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a85dc80e-3e52-4eb0-a5c9-9b3c4d6e84d5-config\") pod \"machine-approver-56656f9798-wd7cr\" (UID: \"a85dc80e-3e52-4eb0-a5c9-9b3c4d6e84d5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wd7cr"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.980122 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/99871c37-c1b7-48c4-bdec-d01234433109-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-xkrqh\" (UID: \"99871c37-c1b7-48c4-bdec-d01234433109\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xkrqh"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.980138 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ecbba0bb-2230-4624-8185-5d116632343f-console-oauth-config\") pod \"console-f9d7485db-nnp5j\" (UID: \"ecbba0bb-2230-4624-8185-5d116632343f\") " pod="openshift-console/console-f9d7485db-nnp5j"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.980154 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/25df5801-9a21-4396-8613-abddc02daf42-encryption-config\") pod \"apiserver-7bbb656c7d-gq4dv\" (UID: \"25df5801-9a21-4396-8613-abddc02daf42\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gq4dv"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.980169 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbqdw\" (UniqueName: \"kubernetes.io/projected/25df5801-9a21-4396-8613-abddc02daf42-kube-api-access-qbqdw\") pod \"apiserver-7bbb656c7d-gq4dv\" (UID: \"25df5801-9a21-4396-8613-abddc02daf42\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gq4dv"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.980222 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqqpn\" (UniqueName: \"kubernetes.io/projected/e2a1fa01-a8d6-421b-b66c-626360051888-kube-api-access-xqqpn\") pod \"olm-operator-6b444d44fb-wcmct\" (UID: \"e2a1fa01-a8d6-421b-b66c-626360051888\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wcmct"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.980238 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ecbba0bb-2230-4624-8185-5d116632343f-console-config\") pod \"console-f9d7485db-nnp5j\" (UID: \"ecbba0bb-2230-4624-8185-5d116632343f\") " pod="openshift-console/console-f9d7485db-nnp5j"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.980254 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6e891ada-e4a4-4e15-8ade-48d0f40518da-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-xqfrr\" (UID: \"6e891ada-e4a4-4e15-8ade-48d0f40518da\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqfrr"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.980289 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jswl6\" (UniqueName: \"kubernetes.io/projected/c4031192-8128-46a2-a50b-e6cb49e95b9b-kube-api-access-jswl6\") pod \"authentication-operator-69f744f599-kkwtk\" (UID: \"c4031192-8128-46a2-a50b-e6cb49e95b9b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kkwtk"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.980315 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6e891ada-e4a4-4e15-8ade-48d0f40518da-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-xqfrr\" (UID: \"6e891ada-e4a4-4e15-8ade-48d0f40518da\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqfrr"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.980334 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/25df5801-9a21-4396-8613-abddc02daf42-audit-policies\") pod \"apiserver-7bbb656c7d-gq4dv\" (UID: \"25df5801-9a21-4396-8613-abddc02daf42\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gq4dv"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.980354 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6e891ada-e4a4-4e15-8ade-48d0f40518da-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-xqfrr\" (UID: \"6e891ada-e4a4-4e15-8ade-48d0f40518da\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqfrr"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.980372 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15a08015-bcc0-4576-8f54-70e5693469a4-config\") pod \"controller-manager-879f6c89f-8rqp7\" (UID: \"15a08015-bcc0-4576-8f54-70e5693469a4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8rqp7"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.980389 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxhvp\" (UniqueName: \"kubernetes.io/projected/99871c37-c1b7-48c4-bdec-d01234433109-kube-api-access-bxhvp\") pod \"machine-api-operator-5694c8668f-xkrqh\" (UID: \"99871c37-c1b7-48c4-bdec-d01234433109\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xkrqh"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.980658 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.980896 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.981076 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.981390 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.981524 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.981643 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.981648 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.981957 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.982014 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.982168 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.982232 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.982273 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.982296 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.982350 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.982769 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bw544"]
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.983376 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.983832 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.984035 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.990899 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6e891ada-e4a4-4e15-8ade-48d0f40518da-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-xqfrr\" (UID: \"6e891ada-e4a4-4e15-8ade-48d0f40518da\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqfrr"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.990985 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a1a9703-af76-4f95-bb2a-aa4ac6ba6fd8-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-z6f2f\" (UID: \"0a1a9703-af76-4f95-bb2a-aa4ac6ba6fd8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-z6f2f"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.991028 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6e891ada-e4a4-4e15-8ade-48d0f40518da-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-xqfrr\" (UID: \"6e891ada-e4a4-4e15-8ade-48d0f40518da\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqfrr"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.991087 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ecbba0bb-2230-4624-8185-5d116632343f-oauth-serving-cert\") pod \"console-f9d7485db-nnp5j\" (UID: \"ecbba0bb-2230-4624-8185-5d116632343f\") " pod="openshift-console/console-f9d7485db-nnp5j"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.991120 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/79d2bd38-a2b9-4594-89bf-83c365f75f40-trusted-ca\") pod \"ingress-operator-5b745b69d9-t8cv2\" (UID: \"79d2bd38-a2b9-4594-89bf-83c365f75f40\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t8cv2"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.991146 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6e891ada-e4a4-4e15-8ade-48d0f40518da-audit-policies\") pod \"oauth-openshift-558db77b4-xqfrr\" (UID: \"6e891ada-e4a4-4e15-8ade-48d0f40518da\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqfrr"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.991171 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6e891ada-e4a4-4e15-8ade-48d0f40518da-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-xqfrr\" (UID: \"6e891ada-e4a4-4e15-8ade-48d0f40518da\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqfrr"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.991195 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49e6e79f-93b3-453f-83d1-2afc40a4637c-config\") pod \"route-controller-manager-6576b87f9c-rgtx4\" (UID: \"49e6e79f-93b3-453f-83d1-2afc40a4637c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rgtx4"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.991226 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-th4gr\" (UniqueName: \"kubernetes.io/projected/0a1a9703-af76-4f95-bb2a-aa4ac6ba6fd8-kube-api-access-th4gr\") pod \"openshift-apiserver-operator-796bbdcf4f-z6f2f\" (UID: \"0a1a9703-af76-4f95-bb2a-aa4ac6ba6fd8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-z6f2f"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.991254 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/15a08015-bcc0-4576-8f54-70e5693469a4-serving-cert\") pod \"controller-manager-879f6c89f-8rqp7\" (UID: \"15a08015-bcc0-4576-8f54-70e5693469a4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8rqp7"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.991295 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ecbba0bb-2230-4624-8185-5d116632343f-service-ca\") pod \"console-f9d7485db-nnp5j\" (UID: \"ecbba0bb-2230-4624-8185-5d116632343f\") " pod="openshift-console/console-f9d7485db-nnp5j"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.991324 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/15a08015-bcc0-4576-8f54-70e5693469a4-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-8rqp7\" (UID: \"15a08015-bcc0-4576-8f54-70e5693469a4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8rqp7"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.991350 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49e6e79f-93b3-453f-83d1-2afc40a4637c-serving-cert\") pod \"route-controller-manager-6576b87f9c-rgtx4\" (UID: \"49e6e79f-93b3-453f-83d1-2afc40a4637c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rgtx4"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.991379 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25df5801-9a21-4396-8613-abddc02daf42-serving-cert\") pod \"apiserver-7bbb656c7d-gq4dv\" (UID: \"25df5801-9a21-4396-8613-abddc02daf42\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gq4dv"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.991410 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4031192-8128-46a2-a50b-e6cb49e95b9b-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-kkwtk\" (UID: \"c4031192-8128-46a2-a50b-e6cb49e95b9b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kkwtk"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.991439 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a1a9703-af76-4f95-bb2a-aa4ac6ba6fd8-config\") pod \"openshift-apiserver-operator-796bbdcf4f-z6f2f\" (UID: \"0a1a9703-af76-4f95-bb2a-aa4ac6ba6fd8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-z6f2f"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.991466 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/49e6e79f-93b3-453f-83d1-2afc40a4637c-client-ca\") pod \"route-controller-manager-6576b87f9c-rgtx4\" (UID: \"49e6e79f-93b3-453f-83d1-2afc40a4637c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rgtx4"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.991520 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6e891ada-e4a4-4e15-8ade-48d0f40518da-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-xqfrr\" (UID: \"6e891ada-e4a4-4e15-8ade-48d0f40518da\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqfrr"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.991545 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ecbba0bb-2230-4624-8185-5d116632343f-console-serving-cert\") pod \"console-f9d7485db-nnp5j\" (UID: \"ecbba0bb-2230-4624-8185-5d116632343f\") " pod="openshift-console/console-f9d7485db-nnp5j"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.991575 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/25df5801-9a21-4396-8613-abddc02daf42-audit-dir\") pod \"apiserver-7bbb656c7d-gq4dv\" (UID: \"25df5801-9a21-4396-8613-abddc02daf42\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gq4dv"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.991609 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99871c37-c1b7-48c4-bdec-d01234433109-config\") pod \"machine-api-operator-5694c8668f-xkrqh\" (UID: \"99871c37-c1b7-48c4-bdec-d01234433109\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xkrqh"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.991638 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6e891ada-e4a4-4e15-8ade-48d0f40518da-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-xqfrr\" (UID: \"6e891ada-e4a4-4e15-8ade-48d0f40518da\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqfrr"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.991670 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/15a08015-bcc0-4576-8f54-70e5693469a4-client-ca\") pod \"controller-manager-879f6c89f-8rqp7\" (UID: \"15a08015-bcc0-4576-8f54-70e5693469a4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8rqp7"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.991696 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cf88x\" (UniqueName: \"kubernetes.io/projected/15a08015-bcc0-4576-8f54-70e5693469a4-kube-api-access-cf88x\") pod \"controller-manager-879f6c89f-8rqp7\" (UID: \"15a08015-bcc0-4576-8f54-70e5693469a4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8rqp7"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.991731 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a85dc80e-3e52-4eb0-a5c9-9b3c4d6e84d5-machine-approver-tls\") pod \"machine-approver-56656f9798-wd7cr\" (UID: \"a85dc80e-3e52-4eb0-a5c9-9b3c4d6e84d5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wd7cr"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.991759 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/99871c37-c1b7-48c4-bdec-d01234433109-images\") pod \"machine-api-operator-5694c8668f-xkrqh\" (UID: \"99871c37-c1b7-48c4-bdec-d01234433109\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xkrqh"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.991789 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6e891ada-e4a4-4e15-8ade-48d0f40518da-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-xqfrr\" (UID: \"6e891ada-e4a4-4e15-8ade-48d0f40518da\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqfrr"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.991815 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e2a1fa01-a8d6-421b-b66c-626360051888-profile-collector-cert\") pod \"olm-operator-6b444d44fb-wcmct\" (UID: \"e2a1fa01-a8d6-421b-b66c-626360051888\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wcmct"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.991844 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e2a1fa01-a8d6-421b-b66c-626360051888-srv-cert\") pod \"olm-operator-6b444d44fb-wcmct\" (UID: \"e2a1fa01-a8d6-421b-b66c-626360051888\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wcmct"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.991901 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4031192-8128-46a2-a50b-e6cb49e95b9b-serving-cert\") pod \"authentication-operator-69f744f599-kkwtk\" (UID: \"c4031192-8128-46a2-a50b-e6cb49e95b9b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kkwtk"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.991928 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ecbba0bb-2230-4624-8185-5d116632343f-trusted-ca-bundle\") pod \"console-f9d7485db-nnp5j\" (UID: \"ecbba0bb-2230-4624-8185-5d116632343f\") " pod="openshift-console/console-f9d7485db-nnp5j"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.991955 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/25df5801-9a21-4396-8613-abddc02daf42-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-gq4dv\" (UID: \"25df5801-9a21-4396-8613-abddc02daf42\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gq4dv"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.991984 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/25df5801-9a21-4396-8613-abddc02daf42-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-gq4dv\" (UID: \"25df5801-9a21-4396-8613-abddc02daf42\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gq4dv"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.992016 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mh76\" (UniqueName: \"kubernetes.io/projected/a85dc80e-3e52-4eb0-a5c9-9b3c4d6e84d5-kube-api-access-6mh76\") pod \"machine-approver-56656f9798-wd7cr\" (UID: \"a85dc80e-3e52-4eb0-a5c9-9b3c4d6e84d5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wd7cr"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.992040 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a85dc80e-3e52-4eb0-a5c9-9b3c4d6e84d5-auth-proxy-config\") pod \"machine-approver-56656f9798-wd7cr\" (UID: \"a85dc80e-3e52-4eb0-a5c9-9b3c4d6e84d5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wd7cr"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.992066 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6e891ada-e4a4-4e15-8ade-48d0f40518da-audit-dir\") pod \"oauth-openshift-558db77b4-xqfrr\" (UID: \"6e891ada-e4a4-4e15-8ade-48d0f40518da\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqfrr"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.992093 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6e891ada-e4a4-4e15-8ade-48d0f40518da-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-xqfrr\" (UID: \"6e891ada-e4a4-4e15-8ade-48d0f40518da\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqfrr"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.992121 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spz7k\" (UniqueName: \"kubernetes.io/projected/49e6e79f-93b3-453f-83d1-2afc40a4637c-kube-api-access-spz7k\") pod \"route-controller-manager-6576b87f9c-rgtx4\" (UID: \"49e6e79f-93b3-453f-83d1-2afc40a4637c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rgtx4"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.992150 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4031192-8128-46a2-a50b-e6cb49e95b9b-config\") pod \"authentication-operator-69f744f599-kkwtk\" (UID: \"c4031192-8128-46a2-a50b-e6cb49e95b9b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kkwtk"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.992178 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grpzl\" (UniqueName: \"kubernetes.io/projected/6e891ada-e4a4-4e15-8ade-48d0f40518da-kube-api-access-grpzl\") pod \"oauth-openshift-558db77b4-xqfrr\" (UID: \"6e891ada-e4a4-4e15-8ade-48d0f40518da\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqfrr"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.992204 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/25df5801-9a21-4396-8613-abddc02daf42-etcd-client\") pod \"apiserver-7bbb656c7d-gq4dv\" (UID: \"25df5801-9a21-4396-8613-abddc02daf42\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gq4dv"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.992232 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhj9q\" (UniqueName: \"kubernetes.io/projected/ecbba0bb-2230-4624-8185-5d116632343f-kube-api-access-dhj9q\") pod \"console-f9d7485db-nnp5j\" (UID: \"ecbba0bb-2230-4624-8185-5d116632343f\") " pod="openshift-console/console-f9d7485db-nnp5j"
Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.992264 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName:
\"kubernetes.io/secret/79d2bd38-a2b9-4594-89bf-83c365f75f40-metrics-tls\") pod \"ingress-operator-5b745b69d9-t8cv2\" (UID: \"79d2bd38-a2b9-4594-89bf-83c365f75f40\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t8cv2" Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.992287 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/79d2bd38-a2b9-4594-89bf-83c365f75f40-bound-sa-token\") pod \"ingress-operator-5b745b69d9-t8cv2\" (UID: \"79d2bd38-a2b9-4594-89bf-83c365f75f40\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t8cv2" Oct 08 12:58:09 crc kubenswrapper[4735]: I1008 12:58:09.992314 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4031192-8128-46a2-a50b-e6cb49e95b9b-service-ca-bundle\") pod \"authentication-operator-69f744f599-kkwtk\" (UID: \"c4031192-8128-46a2-a50b-e6cb49e95b9b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kkwtk" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.001291 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.002741 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfmds"] Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.003390 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-887mh"] Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.003904 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bw544" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.004028 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-bvv9p"] Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.004120 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfmds" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.004354 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-887mh" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.004767 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gp925"] Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.005913 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-fxkkf"] Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.007368 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332125-6dxj9"] Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.007519 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-bvv9p" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.008060 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gp925" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.008601 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-fxkkf" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.008674 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-zxtmj"] Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.010347 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332125-6dxj9" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.016263 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.016502 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.022444 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-bqm6j"] Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.022877 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-kkwtk"] Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.022976 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-bqm6j" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.023562 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-zxtmj" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.023958 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.024058 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.024368 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.024538 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.025625 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.025739 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.025874 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.028206 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-cnm5q"] Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.028833 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-p5f5l"] Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.029116 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-vgv9c"] Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.029443 
4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.029744 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-p5f5l" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.030005 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-cnm5q" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.030209 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.030881 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.032071 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-djbhr"] Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.032486 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.032695 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-djbhr" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.034305 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.034357 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bmrzp"] Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.034968 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.035097 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rgtx4"] Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.035164 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bmrzp" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.038716 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-xqfrr"] Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.040393 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-75k9v"] Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.040945 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-xkrqh"] Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.041056 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-75k9v" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.042990 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.043287 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-nnp5j"] Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.044149 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wcmct"] Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.045760 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sjbv7"] Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.048675 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-gq4dv"] Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.050643 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-gcn29"] Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.051669 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-c4j7s"] Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.053113 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-jnr7v"] Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.054089 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-t8cv2"] Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.055420 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-4drk9"] Oct 08 
12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.057194 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-csnmr"] Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.057987 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zmvkr"] Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.058083 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-csnmr" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.058368 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-4drk9" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.058489 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6vdgs"] Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.062472 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.062828 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-zxtmj"] Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.064242 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gp925"] Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.066739 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332125-6dxj9"] Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.068598 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-8rqp7"] Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.069047 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-z6f2f"] Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.073310 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bw544"] Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.079455 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-vgv9c"] Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.080771 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lrcx8"] Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.081897 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-bqm6j"] Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.083211 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.083388 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-dc7kk"] Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.084714 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qjb22"] Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.086178 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-lshv7"] Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.088093 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-7nmkc"] Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.088258 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-lshv7" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.089308 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xjxt2"] Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.089434 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-7nmkc" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.091123 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfmds"] Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.092546 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-887mh"] Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.093089 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhw57\" (UniqueName: \"kubernetes.io/projected/c4f3bfe2-c8ee-4ac0-915d-2be166ccbb6f-kube-api-access-dhw57\") pod \"router-default-5444994796-jgkq6\" (UID: \"c4f3bfe2-c8ee-4ac0-915d-2be166ccbb6f\") " pod="openshift-ingress/router-default-5444994796-jgkq6" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.093120 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhj9q\" (UniqueName: \"kubernetes.io/projected/ecbba0bb-2230-4624-8185-5d116632343f-kube-api-access-dhj9q\") pod \"console-f9d7485db-nnp5j\" (UID: \"ecbba0bb-2230-4624-8185-5d116632343f\") " pod="openshift-console/console-f9d7485db-nnp5j" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.093139 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/79d2bd38-a2b9-4594-89bf-83c365f75f40-metrics-tls\") pod \"ingress-operator-5b745b69d9-t8cv2\" (UID: 
\"79d2bd38-a2b9-4594-89bf-83c365f75f40\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t8cv2" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.093155 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/79d2bd38-a2b9-4594-89bf-83c365f75f40-bound-sa-token\") pod \"ingress-operator-5b745b69d9-t8cv2\" (UID: \"79d2bd38-a2b9-4594-89bf-83c365f75f40\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t8cv2" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.093205 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4031192-8128-46a2-a50b-e6cb49e95b9b-service-ca-bundle\") pod \"authentication-operator-69f744f599-kkwtk\" (UID: \"c4031192-8128-46a2-a50b-e6cb49e95b9b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kkwtk" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.093225 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c881e3aa-6bfe-476c-9e63-edfa5480745d-etcd-serving-ca\") pod \"apiserver-76f77b778f-zxtmj\" (UID: \"c881e3aa-6bfe-476c-9e63-edfa5480745d\") " pod="openshift-apiserver/apiserver-76f77b778f-zxtmj" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.093281 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/ec35d298-1364-4b8b-9a47-70fa26b2aa64-etcd-ca\") pod \"etcd-operator-b45778765-p5f5l\" (UID: \"ec35d298-1364-4b8b-9a47-70fa26b2aa64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p5f5l" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.093298 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgcjb\" (UniqueName: 
\"kubernetes.io/projected/ec35d298-1364-4b8b-9a47-70fa26b2aa64-kube-api-access-qgcjb\") pod \"etcd-operator-b45778765-p5f5l\" (UID: \"ec35d298-1364-4b8b-9a47-70fa26b2aa64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p5f5l" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.093335 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqv22\" (UniqueName: \"kubernetes.io/projected/97f188e2-ee83-4fb4-b334-33c3099df3f2-kube-api-access-dqv22\") pod \"migrator-59844c95c7-qjb22\" (UID: \"97f188e2-ee83-4fb4-b334-33c3099df3f2\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qjb22" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.093352 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f84042bd-cb46-444d-a3c9-ce1519dea75b-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-sjbv7\" (UID: \"f84042bd-cb46-444d-a3c9-ce1519dea75b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sjbv7" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.093510 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d95c4bbd-337f-454c-a57f-c47db8bc6ecb-apiservice-cert\") pod \"packageserver-d55dfcdfc-cfmds\" (UID: \"d95c4bbd-337f-454c-a57f-c47db8bc6ecb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfmds" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.093535 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e60e2365-ae6c-44a2-a7a3-a0c67f0d756a-auth-proxy-config\") pod \"machine-config-operator-74547568cd-c4j7s\" (UID: \"e60e2365-ae6c-44a2-a7a3-a0c67f0d756a\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c4j7s" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.093560 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e891ada-e4a4-4e15-8ade-48d0f40518da-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-xqfrr\" (UID: \"6e891ada-e4a4-4e15-8ade-48d0f40518da\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqfrr" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.093578 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8r6m\" (UniqueName: \"kubernetes.io/projected/c881e3aa-6bfe-476c-9e63-edfa5480745d-kube-api-access-q8r6m\") pod \"apiserver-76f77b778f-zxtmj\" (UID: \"c881e3aa-6bfe-476c-9e63-edfa5480745d\") " pod="openshift-apiserver/apiserver-76f77b778f-zxtmj" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.093595 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7db24\" (UniqueName: \"kubernetes.io/projected/d24dc2f4-b906-49c0-be1e-f2326ef12664-kube-api-access-7db24\") pod \"cluster-image-registry-operator-dc59b4c8b-75k9v\" (UID: \"d24dc2f4-b906-49c0-be1e-f2326ef12664\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-75k9v" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.093611 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4f3bfe2-c8ee-4ac0-915d-2be166ccbb6f-service-ca-bundle\") pod \"router-default-5444994796-jgkq6\" (UID: \"c4f3bfe2-c8ee-4ac0-915d-2be166ccbb6f\") " pod="openshift-ingress/router-default-5444994796-jgkq6" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.093634 4735 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/99871c37-c1b7-48c4-bdec-d01234433109-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-xkrqh\" (UID: \"99871c37-c1b7-48c4-bdec-d01234433109\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xkrqh" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.093656 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ecbba0bb-2230-4624-8185-5d116632343f-console-oauth-config\") pod \"console-f9d7485db-nnp5j\" (UID: \"ecbba0bb-2230-4624-8185-5d116632343f\") " pod="openshift-console/console-f9d7485db-nnp5j" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.093686 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/25df5801-9a21-4396-8613-abddc02daf42-encryption-config\") pod \"apiserver-7bbb656c7d-gq4dv\" (UID: \"25df5801-9a21-4396-8613-abddc02daf42\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gq4dv" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.093707 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbqdw\" (UniqueName: \"kubernetes.io/projected/25df5801-9a21-4396-8613-abddc02daf42-kube-api-access-qbqdw\") pod \"apiserver-7bbb656c7d-gq4dv\" (UID: \"25df5801-9a21-4396-8613-abddc02daf42\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gq4dv" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.093729 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c881e3aa-6bfe-476c-9e63-edfa5480745d-audit\") pod \"apiserver-76f77b778f-zxtmj\" (UID: \"c881e3aa-6bfe-476c-9e63-edfa5480745d\") " pod="openshift-apiserver/apiserver-76f77b778f-zxtmj" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.093746 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4pwd\" (UniqueName: \"kubernetes.io/projected/79d2bd38-a2b9-4594-89bf-83c365f75f40-kube-api-access-k4pwd\") pod \"ingress-operator-5b745b69d9-t8cv2\" (UID: \"79d2bd38-a2b9-4594-89bf-83c365f75f40\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t8cv2" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.093768 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a85dc80e-3e52-4eb0-a5c9-9b3c4d6e84d5-config\") pod \"machine-approver-56656f9798-wd7cr\" (UID: \"a85dc80e-3e52-4eb0-a5c9-9b3c4d6e84d5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wd7cr" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.093789 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ebf462d1-8994-47ec-b534-e6dc8696bbd0-signing-key\") pod \"service-ca-9c57cc56f-bvv9p\" (UID: \"ebf462d1-8994-47ec-b534-e6dc8696bbd0\") " pod="openshift-service-ca/service-ca-9c57cc56f-bvv9p" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.093808 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/58d4a0d7-f90f-497b-8744-dac2ddd08868-metrics-tls\") pod \"dns-operator-744455d44c-cnm5q\" (UID: \"58d4a0d7-f90f-497b-8744-dac2ddd08868\") " pod="openshift-dns-operator/dns-operator-744455d44c-cnm5q" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.093857 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqqpn\" (UniqueName: \"kubernetes.io/projected/e2a1fa01-a8d6-421b-b66c-626360051888-kube-api-access-xqqpn\") pod \"olm-operator-6b444d44fb-wcmct\" (UID: \"e2a1fa01-a8d6-421b-b66c-626360051888\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wcmct" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.093873 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ecbba0bb-2230-4624-8185-5d116632343f-console-config\") pod \"console-f9d7485db-nnp5j\" (UID: \"ecbba0bb-2230-4624-8185-5d116632343f\") " pod="openshift-console/console-f9d7485db-nnp5j" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.093898 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0759b5fe-0415-43f8-b773-99248d05630f-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-bmrzp\" (UID: \"0759b5fe-0415-43f8-b773-99248d05630f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bmrzp" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.093922 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6e891ada-e4a4-4e15-8ade-48d0f40518da-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-xqfrr\" (UID: \"6e891ada-e4a4-4e15-8ade-48d0f40518da\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqfrr" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.093953 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c881e3aa-6bfe-476c-9e63-edfa5480745d-audit-dir\") pod \"apiserver-76f77b778f-zxtmj\" (UID: \"c881e3aa-6bfe-476c-9e63-edfa5480745d\") " pod="openshift-apiserver/apiserver-76f77b778f-zxtmj" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.093969 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jswl6\" (UniqueName: 
\"kubernetes.io/projected/c4031192-8128-46a2-a50b-e6cb49e95b9b-kube-api-access-jswl6\") pod \"authentication-operator-69f744f599-kkwtk\" (UID: \"c4031192-8128-46a2-a50b-e6cb49e95b9b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kkwtk" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.093987 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6e891ada-e4a4-4e15-8ade-48d0f40518da-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-xqfrr\" (UID: \"6e891ada-e4a4-4e15-8ade-48d0f40518da\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqfrr" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.094005 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/25df5801-9a21-4396-8613-abddc02daf42-audit-policies\") pod \"apiserver-7bbb656c7d-gq4dv\" (UID: \"25df5801-9a21-4396-8613-abddc02daf42\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gq4dv" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.094020 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15a08015-bcc0-4576-8f54-70e5693469a4-config\") pod \"controller-manager-879f6c89f-8rqp7\" (UID: \"15a08015-bcc0-4576-8f54-70e5693469a4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8rqp7" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.094034 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/d95c4bbd-337f-454c-a57f-c47db8bc6ecb-tmpfs\") pod \"packageserver-d55dfcdfc-cfmds\" (UID: \"d95c4bbd-337f-454c-a57f-c47db8bc6ecb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfmds" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.094052 
4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ec35d298-1364-4b8b-9a47-70fa26b2aa64-etcd-client\") pod \"etcd-operator-b45778765-p5f5l\" (UID: \"ec35d298-1364-4b8b-9a47-70fa26b2aa64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p5f5l" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.094069 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6e891ada-e4a4-4e15-8ade-48d0f40518da-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-xqfrr\" (UID: \"6e891ada-e4a4-4e15-8ade-48d0f40518da\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqfrr" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.094085 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6e891ada-e4a4-4e15-8ade-48d0f40518da-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-xqfrr\" (UID: \"6e891ada-e4a4-4e15-8ade-48d0f40518da\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqfrr" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.094103 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2wkk\" (UniqueName: \"kubernetes.io/projected/0759b5fe-0415-43f8-b773-99248d05630f-kube-api-access-h2wkk\") pod \"openshift-controller-manager-operator-756b6f6bc6-bmrzp\" (UID: \"0759b5fe-0415-43f8-b773-99248d05630f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bmrzp" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.094117 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c881e3aa-6bfe-476c-9e63-edfa5480745d-trusted-ca-bundle\") pod \"apiserver-76f77b778f-zxtmj\" (UID: \"c881e3aa-6bfe-476c-9e63-edfa5480745d\") " pod="openshift-apiserver/apiserver-76f77b778f-zxtmj" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.094134 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-md9q9\" (UniqueName: \"kubernetes.io/projected/58d4a0d7-f90f-497b-8744-dac2ddd08868-kube-api-access-md9q9\") pod \"dns-operator-744455d44c-cnm5q\" (UID: \"58d4a0d7-f90f-497b-8744-dac2ddd08868\") " pod="openshift-dns-operator/dns-operator-744455d44c-cnm5q" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.094151 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxhvp\" (UniqueName: \"kubernetes.io/projected/99871c37-c1b7-48c4-bdec-d01234433109-kube-api-access-bxhvp\") pod \"machine-api-operator-5694c8668f-xkrqh\" (UID: \"99871c37-c1b7-48c4-bdec-d01234433109\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xkrqh" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.094166 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d24dc2f4-b906-49c0-be1e-f2326ef12664-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-75k9v\" (UID: \"d24dc2f4-b906-49c0-be1e-f2326ef12664\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-75k9v" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.094183 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a1a9703-af76-4f95-bb2a-aa4ac6ba6fd8-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-z6f2f\" (UID: \"0a1a9703-af76-4f95-bb2a-aa4ac6ba6fd8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-z6f2f" Oct 08 
12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.094197 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0759b5fe-0415-43f8-b773-99248d05630f-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-bmrzp\" (UID: \"0759b5fe-0415-43f8-b773-99248d05630f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bmrzp" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.094213 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6e891ada-e4a4-4e15-8ade-48d0f40518da-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-xqfrr\" (UID: \"6e891ada-e4a4-4e15-8ade-48d0f40518da\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqfrr" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.094228 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5td7v\" (UniqueName: \"kubernetes.io/projected/d169069b-e5fe-4018-974d-97cf1c1cad23-kube-api-access-5td7v\") pod \"marketplace-operator-79b997595-gp925\" (UID: \"d169069b-e5fe-4018-974d-97cf1c1cad23\") " pod="openshift-marketplace/marketplace-operator-79b997595-gp925" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.094253 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ecbba0bb-2230-4624-8185-5d116632343f-oauth-serving-cert\") pod \"console-f9d7485db-nnp5j\" (UID: \"ecbba0bb-2230-4624-8185-5d116632343f\") " pod="openshift-console/console-f9d7485db-nnp5j" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.094270 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/6e891ada-e4a4-4e15-8ade-48d0f40518da-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-xqfrr\" (UID: \"6e891ada-e4a4-4e15-8ade-48d0f40518da\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqfrr" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.094287 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49e6e79f-93b3-453f-83d1-2afc40a4637c-config\") pod \"route-controller-manager-6576b87f9c-rgtx4\" (UID: \"49e6e79f-93b3-453f-83d1-2afc40a4637c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rgtx4" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.094302 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec35d298-1364-4b8b-9a47-70fa26b2aa64-serving-cert\") pod \"etcd-operator-b45778765-p5f5l\" (UID: \"ec35d298-1364-4b8b-9a47-70fa26b2aa64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p5f5l" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.094317 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trcfb\" (UniqueName: \"kubernetes.io/projected/e60e2365-ae6c-44a2-a7a3-a0c67f0d756a-kube-api-access-trcfb\") pod \"machine-config-operator-74547568cd-c4j7s\" (UID: \"e60e2365-ae6c-44a2-a7a3-a0c67f0d756a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c4j7s" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.094336 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/79d2bd38-a2b9-4594-89bf-83c365f75f40-trusted-ca\") pod \"ingress-operator-5b745b69d9-t8cv2\" (UID: \"79d2bd38-a2b9-4594-89bf-83c365f75f40\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t8cv2" Oct 08 12:58:10 crc 
kubenswrapper[4735]: I1008 12:58:10.094364 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6e891ada-e4a4-4e15-8ade-48d0f40518da-audit-policies\") pod \"oauth-openshift-558db77b4-xqfrr\" (UID: \"6e891ada-e4a4-4e15-8ade-48d0f40518da\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqfrr" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.094379 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/15a08015-bcc0-4576-8f54-70e5693469a4-serving-cert\") pod \"controller-manager-879f6c89f-8rqp7\" (UID: \"15a08015-bcc0-4576-8f54-70e5693469a4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8rqp7" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.094394 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d95c4bbd-337f-454c-a57f-c47db8bc6ecb-webhook-cert\") pod \"packageserver-d55dfcdfc-cfmds\" (UID: \"d95c4bbd-337f-454c-a57f-c47db8bc6ecb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfmds" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.094410 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-th4gr\" (UniqueName: \"kubernetes.io/projected/0a1a9703-af76-4f95-bb2a-aa4ac6ba6fd8-kube-api-access-th4gr\") pod \"openshift-apiserver-operator-796bbdcf4f-z6f2f\" (UID: \"0a1a9703-af76-4f95-bb2a-aa4ac6ba6fd8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-z6f2f" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.094424 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ecbba0bb-2230-4624-8185-5d116632343f-service-ca\") pod \"console-f9d7485db-nnp5j\" (UID: 
\"ecbba0bb-2230-4624-8185-5d116632343f\") " pod="openshift-console/console-f9d7485db-nnp5j" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.094440 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/15a08015-bcc0-4576-8f54-70e5693469a4-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-8rqp7\" (UID: \"15a08015-bcc0-4576-8f54-70e5693469a4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8rqp7" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.094455 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49e6e79f-93b3-453f-83d1-2afc40a4637c-serving-cert\") pod \"route-controller-manager-6576b87f9c-rgtx4\" (UID: \"49e6e79f-93b3-453f-83d1-2afc40a4637c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rgtx4" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.094470 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25df5801-9a21-4396-8613-abddc02daf42-serving-cert\") pod \"apiserver-7bbb656c7d-gq4dv\" (UID: \"25df5801-9a21-4396-8613-abddc02daf42\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gq4dv" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.094507 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c881e3aa-6bfe-476c-9e63-edfa5480745d-etcd-client\") pod \"apiserver-76f77b778f-zxtmj\" (UID: \"c881e3aa-6bfe-476c-9e63-edfa5480745d\") " pod="openshift-apiserver/apiserver-76f77b778f-zxtmj" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.094542 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c4031192-8128-46a2-a50b-e6cb49e95b9b-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-kkwtk\" (UID: \"c4031192-8128-46a2-a50b-e6cb49e95b9b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kkwtk" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.094561 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a1a9703-af76-4f95-bb2a-aa4ac6ba6fd8-config\") pod \"openshift-apiserver-operator-796bbdcf4f-z6f2f\" (UID: \"0a1a9703-af76-4f95-bb2a-aa4ac6ba6fd8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-z6f2f" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.094578 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/49e6e79f-93b3-453f-83d1-2afc40a4637c-client-ca\") pod \"route-controller-manager-6576b87f9c-rgtx4\" (UID: \"49e6e79f-93b3-453f-83d1-2afc40a4637c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rgtx4" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.094594 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qswh\" (UniqueName: \"kubernetes.io/projected/ebf462d1-8994-47ec-b534-e6dc8696bbd0-kube-api-access-7qswh\") pod \"service-ca-9c57cc56f-bvv9p\" (UID: \"ebf462d1-8994-47ec-b534-e6dc8696bbd0\") " pod="openshift-service-ca/service-ca-9c57cc56f-bvv9p" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.094614 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6e891ada-e4a4-4e15-8ade-48d0f40518da-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-xqfrr\" (UID: \"6e891ada-e4a4-4e15-8ade-48d0f40518da\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-xqfrr" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.094630 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ecbba0bb-2230-4624-8185-5d116632343f-console-serving-cert\") pod \"console-f9d7485db-nnp5j\" (UID: \"ecbba0bb-2230-4624-8185-5d116632343f\") " pod="openshift-console/console-f9d7485db-nnp5j" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.094646 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/25df5801-9a21-4396-8613-abddc02daf42-audit-dir\") pod \"apiserver-7bbb656c7d-gq4dv\" (UID: \"25df5801-9a21-4396-8613-abddc02daf42\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gq4dv" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.094663 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d169069b-e5fe-4018-974d-97cf1c1cad23-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gp925\" (UID: \"d169069b-e5fe-4018-974d-97cf1c1cad23\") " pod="openshift-marketplace/marketplace-operator-79b997595-gp925" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.094681 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99871c37-c1b7-48c4-bdec-d01234433109-config\") pod \"machine-api-operator-5694c8668f-xkrqh\" (UID: \"99871c37-c1b7-48c4-bdec-d01234433109\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xkrqh" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.094700 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/6e891ada-e4a4-4e15-8ade-48d0f40518da-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-xqfrr\" (UID: \"6e891ada-e4a4-4e15-8ade-48d0f40518da\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqfrr" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.094715 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d24dc2f4-b906-49c0-be1e-f2326ef12664-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-75k9v\" (UID: \"d24dc2f4-b906-49c0-be1e-f2326ef12664\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-75k9v" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.094732 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c4f3bfe2-c8ee-4ac0-915d-2be166ccbb6f-metrics-certs\") pod \"router-default-5444994796-jgkq6\" (UID: \"c4f3bfe2-c8ee-4ac0-915d-2be166ccbb6f\") " pod="openshift-ingress/router-default-5444994796-jgkq6" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.094747 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ebf462d1-8994-47ec-b534-e6dc8696bbd0-signing-cabundle\") pod \"service-ca-9c57cc56f-bvv9p\" (UID: \"ebf462d1-8994-47ec-b534-e6dc8696bbd0\") " pod="openshift-service-ca/service-ca-9c57cc56f-bvv9p" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.094763 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e4c4252c-e8f3-411c-8d61-9ed06ae9a5de-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-lrcx8\" (UID: \"e4c4252c-e8f3-411c-8d61-9ed06ae9a5de\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lrcx8" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.094780 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f84042bd-cb46-444d-a3c9-ce1519dea75b-config\") pod \"kube-apiserver-operator-766d6c64bb-sjbv7\" (UID: \"f84042bd-cb46-444d-a3c9-ce1519dea75b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sjbv7" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.094795 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cf88x\" (UniqueName: \"kubernetes.io/projected/15a08015-bcc0-4576-8f54-70e5693469a4-kube-api-access-cf88x\") pod \"controller-manager-879f6c89f-8rqp7\" (UID: \"15a08015-bcc0-4576-8f54-70e5693469a4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8rqp7" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.094812 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c881e3aa-6bfe-476c-9e63-edfa5480745d-node-pullsecrets\") pod \"apiserver-76f77b778f-zxtmj\" (UID: \"c881e3aa-6bfe-476c-9e63-edfa5480745d\") " pod="openshift-apiserver/apiserver-76f77b778f-zxtmj" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.094828 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/15a08015-bcc0-4576-8f54-70e5693469a4-client-ca\") pod \"controller-manager-879f6c89f-8rqp7\" (UID: \"15a08015-bcc0-4576-8f54-70e5693469a4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8rqp7" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.094845 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/c881e3aa-6bfe-476c-9e63-edfa5480745d-serving-cert\") pod \"apiserver-76f77b778f-zxtmj\" (UID: \"c881e3aa-6bfe-476c-9e63-edfa5480745d\") " pod="openshift-apiserver/apiserver-76f77b778f-zxtmj" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.094861 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/ec35d298-1364-4b8b-9a47-70fa26b2aa64-etcd-service-ca\") pod \"etcd-operator-b45778765-p5f5l\" (UID: \"ec35d298-1364-4b8b-9a47-70fa26b2aa64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p5f5l" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.094877 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a85dc80e-3e52-4eb0-a5c9-9b3c4d6e84d5-machine-approver-tls\") pod \"machine-approver-56656f9798-wd7cr\" (UID: \"a85dc80e-3e52-4eb0-a5c9-9b3c4d6e84d5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wd7cr" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.094892 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/99871c37-c1b7-48c4-bdec-d01234433109-images\") pod \"machine-api-operator-5694c8668f-xkrqh\" (UID: \"99871c37-c1b7-48c4-bdec-d01234433109\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xkrqh" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.094907 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6e891ada-e4a4-4e15-8ade-48d0f40518da-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-xqfrr\" (UID: \"6e891ada-e4a4-4e15-8ade-48d0f40518da\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqfrr" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.094924 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e2a1fa01-a8d6-421b-b66c-626360051888-profile-collector-cert\") pod \"olm-operator-6b444d44fb-wcmct\" (UID: \"e2a1fa01-a8d6-421b-b66c-626360051888\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wcmct" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.094940 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c881e3aa-6bfe-476c-9e63-edfa5480745d-config\") pod \"apiserver-76f77b778f-zxtmj\" (UID: \"c881e3aa-6bfe-476c-9e63-edfa5480745d\") " pod="openshift-apiserver/apiserver-76f77b778f-zxtmj" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.094955 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c4f3bfe2-c8ee-4ac0-915d-2be166ccbb6f-stats-auth\") pod \"router-default-5444994796-jgkq6\" (UID: \"c4f3bfe2-c8ee-4ac0-915d-2be166ccbb6f\") " pod="openshift-ingress/router-default-5444994796-jgkq6" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.094971 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d169069b-e5fe-4018-974d-97cf1c1cad23-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gp925\" (UID: \"d169069b-e5fe-4018-974d-97cf1c1cad23\") " pod="openshift-marketplace/marketplace-operator-79b997595-gp925" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.094990 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4031192-8128-46a2-a50b-e6cb49e95b9b-serving-cert\") pod \"authentication-operator-69f744f599-kkwtk\" (UID: \"c4031192-8128-46a2-a50b-e6cb49e95b9b\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-kkwtk" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.095007 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e2a1fa01-a8d6-421b-b66c-626360051888-srv-cert\") pod \"olm-operator-6b444d44fb-wcmct\" (UID: \"e2a1fa01-a8d6-421b-b66c-626360051888\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wcmct" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.095029 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e60e2365-ae6c-44a2-a7a3-a0c67f0d756a-images\") pod \"machine-config-operator-74547568cd-c4j7s\" (UID: \"e60e2365-ae6c-44a2-a7a3-a0c67f0d756a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c4j7s" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.095044 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ecbba0bb-2230-4624-8185-5d116632343f-trusted-ca-bundle\") pod \"console-f9d7485db-nnp5j\" (UID: \"ecbba0bb-2230-4624-8185-5d116632343f\") " pod="openshift-console/console-f9d7485db-nnp5j" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.095059 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/25df5801-9a21-4396-8613-abddc02daf42-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-gq4dv\" (UID: \"25df5801-9a21-4396-8613-abddc02daf42\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gq4dv" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.095076 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/c4f3bfe2-c8ee-4ac0-915d-2be166ccbb6f-default-certificate\") pod \"router-default-5444994796-jgkq6\" (UID: \"c4f3bfe2-c8ee-4ac0-915d-2be166ccbb6f\") " pod="openshift-ingress/router-default-5444994796-jgkq6" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.095090 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec35d298-1364-4b8b-9a47-70fa26b2aa64-config\") pod \"etcd-operator-b45778765-p5f5l\" (UID: \"ec35d298-1364-4b8b-9a47-70fa26b2aa64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p5f5l" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.095114 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/25df5801-9a21-4396-8613-abddc02daf42-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-gq4dv\" (UID: \"25df5801-9a21-4396-8613-abddc02daf42\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gq4dv" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.095141 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c881e3aa-6bfe-476c-9e63-edfa5480745d-image-import-ca\") pod \"apiserver-76f77b778f-zxtmj\" (UID: \"c881e3aa-6bfe-476c-9e63-edfa5480745d\") " pod="openshift-apiserver/apiserver-76f77b778f-zxtmj" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.095165 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mh76\" (UniqueName: \"kubernetes.io/projected/a85dc80e-3e52-4eb0-a5c9-9b3c4d6e84d5-kube-api-access-6mh76\") pod \"machine-approver-56656f9798-wd7cr\" (UID: \"a85dc80e-3e52-4eb0-a5c9-9b3c4d6e84d5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wd7cr" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.095186 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c881e3aa-6bfe-476c-9e63-edfa5480745d-encryption-config\") pod \"apiserver-76f77b778f-zxtmj\" (UID: \"c881e3aa-6bfe-476c-9e63-edfa5480745d\") " pod="openshift-apiserver/apiserver-76f77b778f-zxtmj" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.095205 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6e891ada-e4a4-4e15-8ade-48d0f40518da-audit-dir\") pod \"oauth-openshift-558db77b4-xqfrr\" (UID: \"6e891ada-e4a4-4e15-8ade-48d0f40518da\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqfrr" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.095221 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6e891ada-e4a4-4e15-8ade-48d0f40518da-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-xqfrr\" (UID: \"6e891ada-e4a4-4e15-8ade-48d0f40518da\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqfrr" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.095239 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spz7k\" (UniqueName: \"kubernetes.io/projected/49e6e79f-93b3-453f-83d1-2afc40a4637c-kube-api-access-spz7k\") pod \"route-controller-manager-6576b87f9c-rgtx4\" (UID: \"49e6e79f-93b3-453f-83d1-2afc40a4637c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rgtx4" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.095257 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clzmm\" (UniqueName: \"kubernetes.io/projected/e4c4252c-e8f3-411c-8d61-9ed06ae9a5de-kube-api-access-clzmm\") pod \"package-server-manager-789f6589d5-lrcx8\" (UID: 
\"e4c4252c-e8f3-411c-8d61-9ed06ae9a5de\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lrcx8" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.095274 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a85dc80e-3e52-4eb0-a5c9-9b3c4d6e84d5-auth-proxy-config\") pod \"machine-approver-56656f9798-wd7cr\" (UID: \"a85dc80e-3e52-4eb0-a5c9-9b3c4d6e84d5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wd7cr" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.095292 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4031192-8128-46a2-a50b-e6cb49e95b9b-config\") pod \"authentication-operator-69f744f599-kkwtk\" (UID: \"c4031192-8128-46a2-a50b-e6cb49e95b9b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kkwtk" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.095311 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grpzl\" (UniqueName: \"kubernetes.io/projected/6e891ada-e4a4-4e15-8ade-48d0f40518da-kube-api-access-grpzl\") pod \"oauth-openshift-558db77b4-xqfrr\" (UID: \"6e891ada-e4a4-4e15-8ade-48d0f40518da\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqfrr" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.095328 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/25df5801-9a21-4396-8613-abddc02daf42-etcd-client\") pod \"apiserver-7bbb656c7d-gq4dv\" (UID: \"25df5801-9a21-4396-8613-abddc02daf42\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gq4dv" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.095345 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/f84042bd-cb46-444d-a3c9-ce1519dea75b-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-sjbv7\" (UID: \"f84042bd-cb46-444d-a3c9-ce1519dea75b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sjbv7" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.095365 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/d24dc2f4-b906-49c0-be1e-f2326ef12664-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-75k9v\" (UID: \"d24dc2f4-b906-49c0-be1e-f2326ef12664\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-75k9v" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.095381 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dv5r2\" (UniqueName: \"kubernetes.io/projected/d95c4bbd-337f-454c-a57f-c47db8bc6ecb-kube-api-access-dv5r2\") pod \"packageserver-d55dfcdfc-cfmds\" (UID: \"d95c4bbd-337f-454c-a57f-c47db8bc6ecb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfmds" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.095396 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e60e2365-ae6c-44a2-a7a3-a0c67f0d756a-proxy-tls\") pod \"machine-config-operator-74547568cd-c4j7s\" (UID: \"e60e2365-ae6c-44a2-a7a3-a0c67f0d756a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c4j7s" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.095548 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-csnmr"] Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.095579 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7v5q4"] 
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.096090 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-p5f5l"] Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.096311 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/99871c37-c1b7-48c4-bdec-d01234433109-images\") pod \"machine-api-operator-5694c8668f-xkrqh\" (UID: \"99871c37-c1b7-48c4-bdec-d01234433109\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xkrqh" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.097000 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ecbba0bb-2230-4624-8185-5d116632343f-service-ca\") pod \"console-f9d7485db-nnp5j\" (UID: \"ecbba0bb-2230-4624-8185-5d116632343f\") " pod="openshift-console/console-f9d7485db-nnp5j" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.098007 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/15a08015-bcc0-4576-8f54-70e5693469a4-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-8rqp7\" (UID: \"15a08015-bcc0-4576-8f54-70e5693469a4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8rqp7" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.099447 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-cnm5q"] Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.099581 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-fxkkf"] Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.100045 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4031192-8128-46a2-a50b-e6cb49e95b9b-service-ca-bundle\") pod 
\"authentication-operator-69f744f599-kkwtk\" (UID: \"c4031192-8128-46a2-a50b-e6cb49e95b9b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kkwtk" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.100751 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-lshv7"] Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.100769 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49e6e79f-93b3-453f-83d1-2afc40a4637c-config\") pod \"route-controller-manager-6576b87f9c-rgtx4\" (UID: \"49e6e79f-93b3-453f-83d1-2afc40a4637c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rgtx4" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.100942 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49e6e79f-93b3-453f-83d1-2afc40a4637c-serving-cert\") pod \"route-controller-manager-6576b87f9c-rgtx4\" (UID: \"49e6e79f-93b3-453f-83d1-2afc40a4637c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rgtx4" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.101038 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6e891ada-e4a4-4e15-8ade-48d0f40518da-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-xqfrr\" (UID: \"6e891ada-e4a4-4e15-8ade-48d0f40518da\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqfrr" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.101580 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6e891ada-e4a4-4e15-8ade-48d0f40518da-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-xqfrr\" (UID: \"6e891ada-e4a4-4e15-8ade-48d0f40518da\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-xqfrr" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.102642 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/79d2bd38-a2b9-4594-89bf-83c365f75f40-trusted-ca\") pod \"ingress-operator-5b745b69d9-t8cv2\" (UID: \"79d2bd38-a2b9-4594-89bf-83c365f75f40\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t8cv2" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.102860 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ecbba0bb-2230-4624-8185-5d116632343f-oauth-serving-cert\") pod \"console-f9d7485db-nnp5j\" (UID: \"ecbba0bb-2230-4624-8185-5d116632343f\") " pod="openshift-console/console-f9d7485db-nnp5j" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.102900 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6e891ada-e4a4-4e15-8ade-48d0f40518da-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-xqfrr\" (UID: \"6e891ada-e4a4-4e15-8ade-48d0f40518da\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqfrr" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.103066 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6e891ada-e4a4-4e15-8ade-48d0f40518da-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-xqfrr\" (UID: \"6e891ada-e4a4-4e15-8ade-48d0f40518da\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqfrr" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.103074 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6e891ada-e4a4-4e15-8ade-48d0f40518da-audit-dir\") pod \"oauth-openshift-558db77b4-xqfrr\" 
(UID: \"6e891ada-e4a4-4e15-8ade-48d0f40518da\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqfrr" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.103189 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6e891ada-e4a4-4e15-8ade-48d0f40518da-audit-policies\") pod \"oauth-openshift-558db77b4-xqfrr\" (UID: \"6e891ada-e4a4-4e15-8ade-48d0f40518da\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqfrr" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.103406 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a1a9703-af76-4f95-bb2a-aa4ac6ba6fd8-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-z6f2f\" (UID: \"0a1a9703-af76-4f95-bb2a-aa4ac6ba6fd8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-z6f2f" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.104809 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e891ada-e4a4-4e15-8ade-48d0f40518da-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-xqfrr\" (UID: \"6e891ada-e4a4-4e15-8ade-48d0f40518da\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqfrr" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.104856 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4031192-8128-46a2-a50b-e6cb49e95b9b-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-kkwtk\" (UID: \"c4031192-8128-46a2-a50b-e6cb49e95b9b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kkwtk" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.104886 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/ecbba0bb-2230-4624-8185-5d116632343f-trusted-ca-bundle\") pod \"console-f9d7485db-nnp5j\" (UID: \"ecbba0bb-2230-4624-8185-5d116632343f\") " pod="openshift-console/console-f9d7485db-nnp5j" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.105309 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/79d2bd38-a2b9-4594-89bf-83c365f75f40-metrics-tls\") pod \"ingress-operator-5b745b69d9-t8cv2\" (UID: \"79d2bd38-a2b9-4594-89bf-83c365f75f40\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t8cv2" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.105314 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bmrzp"] Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.105369 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-bvv9p"] Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.105383 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-djbhr"] Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.105575 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6e891ada-e4a4-4e15-8ade-48d0f40518da-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-xqfrr\" (UID: \"6e891ada-e4a4-4e15-8ade-48d0f40518da\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqfrr" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.105575 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a1a9703-af76-4f95-bb2a-aa4ac6ba6fd8-config\") pod \"openshift-apiserver-operator-796bbdcf4f-z6f2f\" (UID: \"0a1a9703-af76-4f95-bb2a-aa4ac6ba6fd8\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-z6f2f" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.105625 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/49e6e79f-93b3-453f-83d1-2afc40a4637c-client-ca\") pod \"route-controller-manager-6576b87f9c-rgtx4\" (UID: \"49e6e79f-93b3-453f-83d1-2afc40a4637c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rgtx4" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.105656 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.105840 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6e891ada-e4a4-4e15-8ade-48d0f40518da-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-xqfrr\" (UID: \"6e891ada-e4a4-4e15-8ade-48d0f40518da\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqfrr" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.106403 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6e891ada-e4a4-4e15-8ade-48d0f40518da-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-xqfrr\" (UID: \"6e891ada-e4a4-4e15-8ade-48d0f40518da\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqfrr" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.106588 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/25df5801-9a21-4396-8613-abddc02daf42-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-gq4dv\" (UID: \"25df5801-9a21-4396-8613-abddc02daf42\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gq4dv" Oct 08 12:58:10 crc kubenswrapper[4735]: 
I1008 12:58:10.106595 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a85dc80e-3e52-4eb0-a5c9-9b3c4d6e84d5-auth-proxy-config\") pod \"machine-approver-56656f9798-wd7cr\" (UID: \"a85dc80e-3e52-4eb0-a5c9-9b3c4d6e84d5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wd7cr" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.106906 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/25df5801-9a21-4396-8613-abddc02daf42-audit-policies\") pod \"apiserver-7bbb656c7d-gq4dv\" (UID: \"25df5801-9a21-4396-8613-abddc02daf42\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gq4dv" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.106956 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6e891ada-e4a4-4e15-8ade-48d0f40518da-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-xqfrr\" (UID: \"6e891ada-e4a4-4e15-8ade-48d0f40518da\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqfrr" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.107115 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/25df5801-9a21-4396-8613-abddc02daf42-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-gq4dv\" (UID: \"25df5801-9a21-4396-8613-abddc02daf42\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gq4dv" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.107162 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-75k9v"] Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.107240 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c4031192-8128-46a2-a50b-e6cb49e95b9b-config\") pod \"authentication-operator-69f744f599-kkwtk\" (UID: \"c4031192-8128-46a2-a50b-e6cb49e95b9b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kkwtk" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.107601 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a85dc80e-3e52-4eb0-a5c9-9b3c4d6e84d5-config\") pod \"machine-approver-56656f9798-wd7cr\" (UID: \"a85dc80e-3e52-4eb0-a5c9-9b3c4d6e84d5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wd7cr" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.108218 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15a08015-bcc0-4576-8f54-70e5693469a4-config\") pod \"controller-manager-879f6c89f-8rqp7\" (UID: \"15a08015-bcc0-4576-8f54-70e5693469a4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8rqp7" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.108294 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-4drk9"] Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.108325 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ecbba0bb-2230-4624-8185-5d116632343f-console-config\") pod \"console-f9d7485db-nnp5j\" (UID: \"ecbba0bb-2230-4624-8185-5d116632343f\") " pod="openshift-console/console-f9d7485db-nnp5j" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.108454 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/25df5801-9a21-4396-8613-abddc02daf42-audit-dir\") pod \"apiserver-7bbb656c7d-gq4dv\" (UID: \"25df5801-9a21-4396-8613-abddc02daf42\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gq4dv" Oct 08 12:58:10 
crc kubenswrapper[4735]: I1008 12:58:10.108913 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6e891ada-e4a4-4e15-8ade-48d0f40518da-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-xqfrr\" (UID: \"6e891ada-e4a4-4e15-8ade-48d0f40518da\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqfrr" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.109061 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/15a08015-bcc0-4576-8f54-70e5693469a4-client-ca\") pod \"controller-manager-879f6c89f-8rqp7\" (UID: \"15a08015-bcc0-4576-8f54-70e5693469a4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8rqp7" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.109409 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e2a1fa01-a8d6-421b-b66c-626360051888-profile-collector-cert\") pod \"olm-operator-6b444d44fb-wcmct\" (UID: \"e2a1fa01-a8d6-421b-b66c-626360051888\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wcmct" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.109803 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/15a08015-bcc0-4576-8f54-70e5693469a4-serving-cert\") pod \"controller-manager-879f6c89f-8rqp7\" (UID: \"15a08015-bcc0-4576-8f54-70e5693469a4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8rqp7" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.110117 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ecbba0bb-2230-4624-8185-5d116632343f-console-oauth-config\") pod \"console-f9d7485db-nnp5j\" (UID: \"ecbba0bb-2230-4624-8185-5d116632343f\") " 
pod="openshift-console/console-f9d7485db-nnp5j" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.110555 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/25df5801-9a21-4396-8613-abddc02daf42-encryption-config\") pod \"apiserver-7bbb656c7d-gq4dv\" (UID: \"25df5801-9a21-4396-8613-abddc02daf42\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gq4dv" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.111057 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4031192-8128-46a2-a50b-e6cb49e95b9b-serving-cert\") pod \"authentication-operator-69f744f599-kkwtk\" (UID: \"c4031192-8128-46a2-a50b-e6cb49e95b9b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kkwtk" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.112179 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ecbba0bb-2230-4624-8185-5d116632343f-console-serving-cert\") pod \"console-f9d7485db-nnp5j\" (UID: \"ecbba0bb-2230-4624-8185-5d116632343f\") " pod="openshift-console/console-f9d7485db-nnp5j" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.112744 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e2a1fa01-a8d6-421b-b66c-626360051888-srv-cert\") pod \"olm-operator-6b444d44fb-wcmct\" (UID: \"e2a1fa01-a8d6-421b-b66c-626360051888\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wcmct" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.112902 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/25df5801-9a21-4396-8613-abddc02daf42-etcd-client\") pod \"apiserver-7bbb656c7d-gq4dv\" (UID: \"25df5801-9a21-4396-8613-abddc02daf42\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gq4dv" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.114048 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a85dc80e-3e52-4eb0-a5c9-9b3c4d6e84d5-machine-approver-tls\") pod \"machine-approver-56656f9798-wd7cr\" (UID: \"a85dc80e-3e52-4eb0-a5c9-9b3c4d6e84d5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wd7cr" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.119947 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/99871c37-c1b7-48c4-bdec-d01234433109-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-xkrqh\" (UID: \"99871c37-c1b7-48c4-bdec-d01234433109\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xkrqh" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.120004 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99871c37-c1b7-48c4-bdec-d01234433109-config\") pod \"machine-api-operator-5694c8668f-xkrqh\" (UID: \"99871c37-c1b7-48c4-bdec-d01234433109\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xkrqh" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.121384 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25df5801-9a21-4396-8613-abddc02daf42-serving-cert\") pod \"apiserver-7bbb656c7d-gq4dv\" (UID: \"25df5801-9a21-4396-8613-abddc02daf42\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gq4dv" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.121614 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6e891ada-e4a4-4e15-8ade-48d0f40518da-v4-0-config-system-ocp-branding-template\") pod 
\"oauth-openshift-558db77b4-xqfrr\" (UID: \"6e891ada-e4a4-4e15-8ade-48d0f40518da\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqfrr" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.122760 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.142840 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.162622 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.182929 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.195778 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec35d298-1364-4b8b-9a47-70fa26b2aa64-serving-cert\") pod \"etcd-operator-b45778765-p5f5l\" (UID: \"ec35d298-1364-4b8b-9a47-70fa26b2aa64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p5f5l" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.195809 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trcfb\" (UniqueName: \"kubernetes.io/projected/e60e2365-ae6c-44a2-a7a3-a0c67f0d756a-kube-api-access-trcfb\") pod \"machine-config-operator-74547568cd-c4j7s\" (UID: \"e60e2365-ae6c-44a2-a7a3-a0c67f0d756a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c4j7s" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.195826 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/d95c4bbd-337f-454c-a57f-c47db8bc6ecb-webhook-cert\") pod \"packageserver-d55dfcdfc-cfmds\" (UID: \"d95c4bbd-337f-454c-a57f-c47db8bc6ecb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfmds" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.195848 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c881e3aa-6bfe-476c-9e63-edfa5480745d-etcd-client\") pod \"apiserver-76f77b778f-zxtmj\" (UID: \"c881e3aa-6bfe-476c-9e63-edfa5480745d\") " pod="openshift-apiserver/apiserver-76f77b778f-zxtmj" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.195872 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qswh\" (UniqueName: \"kubernetes.io/projected/ebf462d1-8994-47ec-b534-e6dc8696bbd0-kube-api-access-7qswh\") pod \"service-ca-9c57cc56f-bvv9p\" (UID: \"ebf462d1-8994-47ec-b534-e6dc8696bbd0\") " pod="openshift-service-ca/service-ca-9c57cc56f-bvv9p" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.195890 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d169069b-e5fe-4018-974d-97cf1c1cad23-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gp925\" (UID: \"d169069b-e5fe-4018-974d-97cf1c1cad23\") " pod="openshift-marketplace/marketplace-operator-79b997595-gp925" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.195906 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d24dc2f4-b906-49c0-be1e-f2326ef12664-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-75k9v\" (UID: \"d24dc2f4-b906-49c0-be1e-f2326ef12664\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-75k9v" Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.195920 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c4f3bfe2-c8ee-4ac0-915d-2be166ccbb6f-metrics-certs\") pod \"router-default-5444994796-jgkq6\" (UID: \"c4f3bfe2-c8ee-4ac0-915d-2be166ccbb6f\") " pod="openshift-ingress/router-default-5444994796-jgkq6"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.195934 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ebf462d1-8994-47ec-b534-e6dc8696bbd0-signing-cabundle\") pod \"service-ca-9c57cc56f-bvv9p\" (UID: \"ebf462d1-8994-47ec-b534-e6dc8696bbd0\") " pod="openshift-service-ca/service-ca-9c57cc56f-bvv9p"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.195948 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e4c4252c-e8f3-411c-8d61-9ed06ae9a5de-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-lrcx8\" (UID: \"e4c4252c-e8f3-411c-8d61-9ed06ae9a5de\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lrcx8"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.195968 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f84042bd-cb46-444d-a3c9-ce1519dea75b-config\") pod \"kube-apiserver-operator-766d6c64bb-sjbv7\" (UID: \"f84042bd-cb46-444d-a3c9-ce1519dea75b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sjbv7"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.195985 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c881e3aa-6bfe-476c-9e63-edfa5480745d-node-pullsecrets\") pod \"apiserver-76f77b778f-zxtmj\" (UID: \"c881e3aa-6bfe-476c-9e63-edfa5480745d\") " pod="openshift-apiserver/apiserver-76f77b778f-zxtmj"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.196001 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c881e3aa-6bfe-476c-9e63-edfa5480745d-serving-cert\") pod \"apiserver-76f77b778f-zxtmj\" (UID: \"c881e3aa-6bfe-476c-9e63-edfa5480745d\") " pod="openshift-apiserver/apiserver-76f77b778f-zxtmj"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.196015 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/ec35d298-1364-4b8b-9a47-70fa26b2aa64-etcd-service-ca\") pod \"etcd-operator-b45778765-p5f5l\" (UID: \"ec35d298-1364-4b8b-9a47-70fa26b2aa64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p5f5l"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.196030 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c881e3aa-6bfe-476c-9e63-edfa5480745d-config\") pod \"apiserver-76f77b778f-zxtmj\" (UID: \"c881e3aa-6bfe-476c-9e63-edfa5480745d\") " pod="openshift-apiserver/apiserver-76f77b778f-zxtmj"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.196043 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c4f3bfe2-c8ee-4ac0-915d-2be166ccbb6f-stats-auth\") pod \"router-default-5444994796-jgkq6\" (UID: \"c4f3bfe2-c8ee-4ac0-915d-2be166ccbb6f\") " pod="openshift-ingress/router-default-5444994796-jgkq6"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.196057 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d169069b-e5fe-4018-974d-97cf1c1cad23-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gp925\" (UID: \"d169069b-e5fe-4018-974d-97cf1c1cad23\") " pod="openshift-marketplace/marketplace-operator-79b997595-gp925"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.196079 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e60e2365-ae6c-44a2-a7a3-a0c67f0d756a-images\") pod \"machine-config-operator-74547568cd-c4j7s\" (UID: \"e60e2365-ae6c-44a2-a7a3-a0c67f0d756a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c4j7s"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.196096 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c4f3bfe2-c8ee-4ac0-915d-2be166ccbb6f-default-certificate\") pod \"router-default-5444994796-jgkq6\" (UID: \"c4f3bfe2-c8ee-4ac0-915d-2be166ccbb6f\") " pod="openshift-ingress/router-default-5444994796-jgkq6"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.196110 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec35d298-1364-4b8b-9a47-70fa26b2aa64-config\") pod \"etcd-operator-b45778765-p5f5l\" (UID: \"ec35d298-1364-4b8b-9a47-70fa26b2aa64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p5f5l"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.196123 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c881e3aa-6bfe-476c-9e63-edfa5480745d-image-import-ca\") pod \"apiserver-76f77b778f-zxtmj\" (UID: \"c881e3aa-6bfe-476c-9e63-edfa5480745d\") " pod="openshift-apiserver/apiserver-76f77b778f-zxtmj"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.196142 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c881e3aa-6bfe-476c-9e63-edfa5480745d-encryption-config\") pod \"apiserver-76f77b778f-zxtmj\" (UID: \"c881e3aa-6bfe-476c-9e63-edfa5480745d\") " pod="openshift-apiserver/apiserver-76f77b778f-zxtmj"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.196163 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clzmm\" (UniqueName: \"kubernetes.io/projected/e4c4252c-e8f3-411c-8d61-9ed06ae9a5de-kube-api-access-clzmm\") pod \"package-server-manager-789f6589d5-lrcx8\" (UID: \"e4c4252c-e8f3-411c-8d61-9ed06ae9a5de\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lrcx8"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.196185 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f84042bd-cb46-444d-a3c9-ce1519dea75b-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-sjbv7\" (UID: \"f84042bd-cb46-444d-a3c9-ce1519dea75b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sjbv7"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.196182 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c881e3aa-6bfe-476c-9e63-edfa5480745d-node-pullsecrets\") pod \"apiserver-76f77b778f-zxtmj\" (UID: \"c881e3aa-6bfe-476c-9e63-edfa5480745d\") " pod="openshift-apiserver/apiserver-76f77b778f-zxtmj"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.196210 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/d24dc2f4-b906-49c0-be1e-f2326ef12664-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-75k9v\" (UID: \"d24dc2f4-b906-49c0-be1e-f2326ef12664\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-75k9v"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.196225 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dv5r2\" (UniqueName: \"kubernetes.io/projected/d95c4bbd-337f-454c-a57f-c47db8bc6ecb-kube-api-access-dv5r2\") pod \"packageserver-d55dfcdfc-cfmds\" (UID: \"d95c4bbd-337f-454c-a57f-c47db8bc6ecb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfmds"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.196241 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e60e2365-ae6c-44a2-a7a3-a0c67f0d756a-proxy-tls\") pod \"machine-config-operator-74547568cd-c4j7s\" (UID: \"e60e2365-ae6c-44a2-a7a3-a0c67f0d756a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c4j7s"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.196257 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhw57\" (UniqueName: \"kubernetes.io/projected/c4f3bfe2-c8ee-4ac0-915d-2be166ccbb6f-kube-api-access-dhw57\") pod \"router-default-5444994796-jgkq6\" (UID: \"c4f3bfe2-c8ee-4ac0-915d-2be166ccbb6f\") " pod="openshift-ingress/router-default-5444994796-jgkq6"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.196283 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c881e3aa-6bfe-476c-9e63-edfa5480745d-etcd-serving-ca\") pod \"apiserver-76f77b778f-zxtmj\" (UID: \"c881e3aa-6bfe-476c-9e63-edfa5480745d\") " pod="openshift-apiserver/apiserver-76f77b778f-zxtmj"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.196298 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/ec35d298-1364-4b8b-9a47-70fa26b2aa64-etcd-ca\") pod \"etcd-operator-b45778765-p5f5l\" (UID: \"ec35d298-1364-4b8b-9a47-70fa26b2aa64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p5f5l"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.196313 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgcjb\" (UniqueName: \"kubernetes.io/projected/ec35d298-1364-4b8b-9a47-70fa26b2aa64-kube-api-access-qgcjb\") pod \"etcd-operator-b45778765-p5f5l\" (UID: \"ec35d298-1364-4b8b-9a47-70fa26b2aa64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p5f5l"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.196328 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqv22\" (UniqueName: \"kubernetes.io/projected/97f188e2-ee83-4fb4-b334-33c3099df3f2-kube-api-access-dqv22\") pod \"migrator-59844c95c7-qjb22\" (UID: \"97f188e2-ee83-4fb4-b334-33c3099df3f2\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qjb22"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.196342 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f84042bd-cb46-444d-a3c9-ce1519dea75b-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-sjbv7\" (UID: \"f84042bd-cb46-444d-a3c9-ce1519dea75b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sjbv7"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.196365 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d95c4bbd-337f-454c-a57f-c47db8bc6ecb-apiservice-cert\") pod \"packageserver-d55dfcdfc-cfmds\" (UID: \"d95c4bbd-337f-454c-a57f-c47db8bc6ecb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfmds"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.196380 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e60e2365-ae6c-44a2-a7a3-a0c67f0d756a-auth-proxy-config\") pod \"machine-config-operator-74547568cd-c4j7s\" (UID: \"e60e2365-ae6c-44a2-a7a3-a0c67f0d756a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c4j7s"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.196394 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8r6m\" (UniqueName: \"kubernetes.io/projected/c881e3aa-6bfe-476c-9e63-edfa5480745d-kube-api-access-q8r6m\") pod \"apiserver-76f77b778f-zxtmj\" (UID: \"c881e3aa-6bfe-476c-9e63-edfa5480745d\") " pod="openshift-apiserver/apiserver-76f77b778f-zxtmj"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.196409 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7db24\" (UniqueName: \"kubernetes.io/projected/d24dc2f4-b906-49c0-be1e-f2326ef12664-kube-api-access-7db24\") pod \"cluster-image-registry-operator-dc59b4c8b-75k9v\" (UID: \"d24dc2f4-b906-49c0-be1e-f2326ef12664\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-75k9v"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.196424 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4f3bfe2-c8ee-4ac0-915d-2be166ccbb6f-service-ca-bundle\") pod \"router-default-5444994796-jgkq6\" (UID: \"c4f3bfe2-c8ee-4ac0-915d-2be166ccbb6f\") " pod="openshift-ingress/router-default-5444994796-jgkq6"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.196442 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c881e3aa-6bfe-476c-9e63-edfa5480745d-audit\") pod \"apiserver-76f77b778f-zxtmj\" (UID: \"c881e3aa-6bfe-476c-9e63-edfa5480745d\") " pod="openshift-apiserver/apiserver-76f77b778f-zxtmj"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.196460 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ebf462d1-8994-47ec-b534-e6dc8696bbd0-signing-key\") pod \"service-ca-9c57cc56f-bvv9p\" (UID: \"ebf462d1-8994-47ec-b534-e6dc8696bbd0\") " pod="openshift-service-ca/service-ca-9c57cc56f-bvv9p"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.196495 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/58d4a0d7-f90f-497b-8744-dac2ddd08868-metrics-tls\") pod \"dns-operator-744455d44c-cnm5q\" (UID: \"58d4a0d7-f90f-497b-8744-dac2ddd08868\") " pod="openshift-dns-operator/dns-operator-744455d44c-cnm5q"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.196524 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0759b5fe-0415-43f8-b773-99248d05630f-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-bmrzp\" (UID: \"0759b5fe-0415-43f8-b773-99248d05630f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bmrzp"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.196559 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c881e3aa-6bfe-476c-9e63-edfa5480745d-audit-dir\") pod \"apiserver-76f77b778f-zxtmj\" (UID: \"c881e3aa-6bfe-476c-9e63-edfa5480745d\") " pod="openshift-apiserver/apiserver-76f77b778f-zxtmj"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.196578 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f84042bd-cb46-444d-a3c9-ce1519dea75b-config\") pod \"kube-apiserver-operator-766d6c64bb-sjbv7\" (UID: \"f84042bd-cb46-444d-a3c9-ce1519dea75b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sjbv7"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.196587 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/d95c4bbd-337f-454c-a57f-c47db8bc6ecb-tmpfs\") pod \"packageserver-d55dfcdfc-cfmds\" (UID: \"d95c4bbd-337f-454c-a57f-c47db8bc6ecb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfmds"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.196640 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ec35d298-1364-4b8b-9a47-70fa26b2aa64-etcd-client\") pod \"etcd-operator-b45778765-p5f5l\" (UID: \"ec35d298-1364-4b8b-9a47-70fa26b2aa64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p5f5l"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.196663 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2wkk\" (UniqueName: \"kubernetes.io/projected/0759b5fe-0415-43f8-b773-99248d05630f-kube-api-access-h2wkk\") pod \"openshift-controller-manager-operator-756b6f6bc6-bmrzp\" (UID: \"0759b5fe-0415-43f8-b773-99248d05630f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bmrzp"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.196683 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c881e3aa-6bfe-476c-9e63-edfa5480745d-trusted-ca-bundle\") pod \"apiserver-76f77b778f-zxtmj\" (UID: \"c881e3aa-6bfe-476c-9e63-edfa5480745d\") " pod="openshift-apiserver/apiserver-76f77b778f-zxtmj"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.196702 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-md9q9\" (UniqueName: \"kubernetes.io/projected/58d4a0d7-f90f-497b-8744-dac2ddd08868-kube-api-access-md9q9\") pod \"dns-operator-744455d44c-cnm5q\" (UID: \"58d4a0d7-f90f-497b-8744-dac2ddd08868\") " pod="openshift-dns-operator/dns-operator-744455d44c-cnm5q"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.196728 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d24dc2f4-b906-49c0-be1e-f2326ef12664-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-75k9v\" (UID: \"d24dc2f4-b906-49c0-be1e-f2326ef12664\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-75k9v"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.196747 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0759b5fe-0415-43f8-b773-99248d05630f-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-bmrzp\" (UID: \"0759b5fe-0415-43f8-b773-99248d05630f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bmrzp"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.196764 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5td7v\" (UniqueName: \"kubernetes.io/projected/d169069b-e5fe-4018-974d-97cf1c1cad23-kube-api-access-5td7v\") pod \"marketplace-operator-79b997595-gp925\" (UID: \"d169069b-e5fe-4018-974d-97cf1c1cad23\") " pod="openshift-marketplace/marketplace-operator-79b997595-gp925"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.197097 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e60e2365-ae6c-44a2-a7a3-a0c67f0d756a-images\") pod \"machine-config-operator-74547568cd-c4j7s\" (UID: \"e60e2365-ae6c-44a2-a7a3-a0c67f0d756a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c4j7s"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.197526 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/d95c4bbd-337f-454c-a57f-c47db8bc6ecb-tmpfs\") pod \"packageserver-d55dfcdfc-cfmds\" (UID: \"d95c4bbd-337f-454c-a57f-c47db8bc6ecb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfmds"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.198109 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e60e2365-ae6c-44a2-a7a3-a0c67f0d756a-auth-proxy-config\") pod \"machine-config-operator-74547568cd-c4j7s\" (UID: \"e60e2365-ae6c-44a2-a7a3-a0c67f0d756a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c4j7s"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.198407 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4f3bfe2-c8ee-4ac0-915d-2be166ccbb6f-service-ca-bundle\") pod \"router-default-5444994796-jgkq6\" (UID: \"c4f3bfe2-c8ee-4ac0-915d-2be166ccbb6f\") " pod="openshift-ingress/router-default-5444994796-jgkq6"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.198636 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c881e3aa-6bfe-476c-9e63-edfa5480745d-audit-dir\") pod \"apiserver-76f77b778f-zxtmj\" (UID: \"c881e3aa-6bfe-476c-9e63-edfa5480745d\") " pod="openshift-apiserver/apiserver-76f77b778f-zxtmj"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.198858 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c4f3bfe2-c8ee-4ac0-915d-2be166ccbb6f-metrics-certs\") pod \"router-default-5444994796-jgkq6\" (UID: \"c4f3bfe2-c8ee-4ac0-915d-2be166ccbb6f\") " pod="openshift-ingress/router-default-5444994796-jgkq6"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.200412 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c4f3bfe2-c8ee-4ac0-915d-2be166ccbb6f-stats-auth\") pod \"router-default-5444994796-jgkq6\" (UID: \"c4f3bfe2-c8ee-4ac0-915d-2be166ccbb6f\") " pod="openshift-ingress/router-default-5444994796-jgkq6"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.201074 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c4f3bfe2-c8ee-4ac0-915d-2be166ccbb6f-default-certificate\") pod \"router-default-5444994796-jgkq6\" (UID: \"c4f3bfe2-c8ee-4ac0-915d-2be166ccbb6f\") " pod="openshift-ingress/router-default-5444994796-jgkq6"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.201145 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f84042bd-cb46-444d-a3c9-ce1519dea75b-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-sjbv7\" (UID: \"f84042bd-cb46-444d-a3c9-ce1519dea75b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sjbv7"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.201196 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e60e2365-ae6c-44a2-a7a3-a0c67f0d756a-proxy-tls\") pod \"machine-config-operator-74547568cd-c4j7s\" (UID: \"e60e2365-ae6c-44a2-a7a3-a0c67f0d756a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c4j7s"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.202595 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.202821 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e4c4252c-e8f3-411c-8d61-9ed06ae9a5de-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-lrcx8\" (UID: \"e4c4252c-e8f3-411c-8d61-9ed06ae9a5de\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lrcx8"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.223921 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.243114 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.262657 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.285793 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.303912 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.323636 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.342948 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.363320 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.382919 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.389870 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d95c4bbd-337f-454c-a57f-c47db8bc6ecb-webhook-cert\") pod \"packageserver-d55dfcdfc-cfmds\" (UID: \"d95c4bbd-337f-454c-a57f-c47db8bc6ecb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfmds"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.392434 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d95c4bbd-337f-454c-a57f-c47db8bc6ecb-apiservice-cert\") pod \"packageserver-d55dfcdfc-cfmds\" (UID: \"d95c4bbd-337f-454c-a57f-c47db8bc6ecb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfmds"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.402520 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.423125 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.443424 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.463667 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.483409 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.488468 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ebf462d1-8994-47ec-b534-e6dc8696bbd0-signing-cabundle\") pod \"service-ca-9c57cc56f-bvv9p\" (UID: \"ebf462d1-8994-47ec-b534-e6dc8696bbd0\") " pod="openshift-service-ca/service-ca-9c57cc56f-bvv9p"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.503426 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.522975 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.534831 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ebf462d1-8994-47ec-b534-e6dc8696bbd0-signing-key\") pod \"service-ca-9c57cc56f-bvv9p\" (UID: \"ebf462d1-8994-47ec-b534-e6dc8696bbd0\") " pod="openshift-service-ca/service-ca-9c57cc56f-bvv9p"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.543735 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.563113 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.589241 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.600266 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d169069b-e5fe-4018-974d-97cf1c1cad23-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gp925\" (UID: \"d169069b-e5fe-4018-974d-97cf1c1cad23\") " pod="openshift-marketplace/marketplace-operator-79b997595-gp925"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.602859 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.609608 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d169069b-e5fe-4018-974d-97cf1c1cad23-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gp925\" (UID: \"d169069b-e5fe-4018-974d-97cf1c1cad23\") " pod="openshift-marketplace/marketplace-operator-79b997595-gp925"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.622443 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.644451 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.662776 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.683843 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.703224 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.724448 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.744668 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.764308 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.783386 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.789876 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c881e3aa-6bfe-476c-9e63-edfa5480745d-audit\") pod \"apiserver-76f77b778f-zxtmj\" (UID: \"c881e3aa-6bfe-476c-9e63-edfa5480745d\") " pod="openshift-apiserver/apiserver-76f77b778f-zxtmj"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.804108 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.824573 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.843188 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.872852 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.883765 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.903865 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.910114 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c881e3aa-6bfe-476c-9e63-edfa5480745d-etcd-serving-ca\") pod \"apiserver-76f77b778f-zxtmj\" (UID: \"c881e3aa-6bfe-476c-9e63-edfa5480745d\") " pod="openshift-apiserver/apiserver-76f77b778f-zxtmj"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.923423 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.929280 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c881e3aa-6bfe-476c-9e63-edfa5480745d-etcd-client\") pod \"apiserver-76f77b778f-zxtmj\" (UID: \"c881e3aa-6bfe-476c-9e63-edfa5480745d\") " pod="openshift-apiserver/apiserver-76f77b778f-zxtmj"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.943973 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.949860 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c881e3aa-6bfe-476c-9e63-edfa5480745d-serving-cert\") pod \"apiserver-76f77b778f-zxtmj\" (UID: \"c881e3aa-6bfe-476c-9e63-edfa5480745d\") " pod="openshift-apiserver/apiserver-76f77b778f-zxtmj"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.964402 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.969030 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c881e3aa-6bfe-476c-9e63-edfa5480745d-encryption-config\") pod \"apiserver-76f77b778f-zxtmj\" (UID: \"c881e3aa-6bfe-476c-9e63-edfa5480745d\") " pod="openshift-apiserver/apiserver-76f77b778f-zxtmj"
Oct 08 12:58:10 crc kubenswrapper[4735]: I1008 12:58:10.983976 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Oct 08 12:58:11 crc kubenswrapper[4735]: I1008 12:58:11.003264 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Oct 08 12:58:11 crc kubenswrapper[4735]: I1008 12:58:11.025989 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Oct 08 12:58:11 crc kubenswrapper[4735]: I1008 12:58:11.027370 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c881e3aa-6bfe-476c-9e63-edfa5480745d-image-import-ca\") pod \"apiserver-76f77b778f-zxtmj\" (UID: \"c881e3aa-6bfe-476c-9e63-edfa5480745d\") " pod="openshift-apiserver/apiserver-76f77b778f-zxtmj"
Oct 08 12:58:11 crc kubenswrapper[4735]: I1008 12:58:11.042101 4735 request.go:700] Waited for 1.01714774s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-apiserver/configmaps?fieldSelector=metadata.name%3Dtrusted-ca-bundle&limit=500&resourceVersion=0
Oct 08 12:58:11 crc kubenswrapper[4735]: I1008 12:58:11.057083 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Oct 08 12:58:11 crc kubenswrapper[4735]: I1008 12:58:11.059222 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c881e3aa-6bfe-476c-9e63-edfa5480745d-trusted-ca-bundle\") pod \"apiserver-76f77b778f-zxtmj\" (UID: \"c881e3aa-6bfe-476c-9e63-edfa5480745d\") " pod="openshift-apiserver/apiserver-76f77b778f-zxtmj"
Oct 08 12:58:11 crc kubenswrapper[4735]: I1008 12:58:11.062887 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Oct 08 12:58:11 crc kubenswrapper[4735]: I1008 12:58:11.066889 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c881e3aa-6bfe-476c-9e63-edfa5480745d-config\") pod \"apiserver-76f77b778f-zxtmj\" (UID: \"c881e3aa-6bfe-476c-9e63-edfa5480745d\") " pod="openshift-apiserver/apiserver-76f77b778f-zxtmj"
Oct 08 12:58:11 crc kubenswrapper[4735]: I1008 12:58:11.084897 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Oct 08 12:58:11 crc kubenswrapper[4735]: I1008 12:58:11.107968 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Oct 08 12:58:11 crc kubenswrapper[4735]: I1008 12:58:11.119214 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d24dc2f4-b906-49c0-be1e-f2326ef12664-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-75k9v\" (UID: \"d24dc2f4-b906-49c0-be1e-f2326ef12664\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-75k9v"
Oct 08 12:58:11 crc kubenswrapper[4735]: I1008 12:58:11.123853 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Oct 08 12:58:11 crc kubenswrapper[4735]: I1008 12:58:11.143348 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Oct 08 12:58:11 crc kubenswrapper[4735]: I1008 12:58:11.163207 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Oct 08 12:58:11 crc kubenswrapper[4735]: I1008 12:58:11.169607 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/ec35d298-1364-4b8b-9a47-70fa26b2aa64-etcd-ca\") pod \"etcd-operator-b45778765-p5f5l\" (UID: \"ec35d298-1364-4b8b-9a47-70fa26b2aa64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p5f5l"
Oct 08 12:58:11 crc kubenswrapper[4735]: I1008 12:58:11.185402 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Oct 08 12:58:11 crc kubenswrapper[4735]: I1008 12:58:11.190818 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ec35d298-1364-4b8b-9a47-70fa26b2aa64-etcd-client\") pod \"etcd-operator-b45778765-p5f5l\" (UID: \"ec35d298-1364-4b8b-9a47-70fa26b2aa64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p5f5l"
Oct 08 12:58:11 crc kubenswrapper[4735]: E1008 12:58:11.196196 4735 secret.go:188] Couldn't get secret openshift-etcd-operator/etcd-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition
Oct 08 12:58:11 crc kubenswrapper[4735]: E1008 12:58:11.196329 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec35d298-1364-4b8b-9a47-70fa26b2aa64-serving-cert podName:ec35d298-1364-4b8b-9a47-70fa26b2aa64 nodeName:}" failed. No retries permitted until 2025-10-08 12:58:11.696313306 +0000 UTC m=+141.504617850 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/ec35d298-1364-4b8b-9a47-70fa26b2aa64-serving-cert") pod "etcd-operator-b45778765-p5f5l" (UID: "ec35d298-1364-4b8b-9a47-70fa26b2aa64") : failed to sync secret cache: timed out waiting for the condition
Oct 08 12:58:11 crc kubenswrapper[4735]: E1008 12:58:11.196744 4735 secret.go:188] Couldn't get secret openshift-image-registry/image-registry-operator-tls: failed to sync secret cache: timed out waiting for the condition
Oct 08 12:58:11 crc kubenswrapper[4735]: E1008 12:58:11.196821 4735 configmap.go:193] Couldn't get configMap openshift-etcd-operator/etcd-service-ca-bundle: failed to sync configmap cache: timed out waiting for the condition
Oct 08 12:58:11 crc kubenswrapper[4735]: E1008 12:58:11.196841 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d24dc2f4-b906-49c0-be1e-f2326ef12664-image-registry-operator-tls podName:d24dc2f4-b906-49c0-be1e-f2326ef12664 nodeName:}" failed. No retries permitted until 2025-10-08 12:58:11.696816229 +0000 UTC m=+141.505120863 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/d24dc2f4-b906-49c0-be1e-f2326ef12664-image-registry-operator-tls") pod "cluster-image-registry-operator-dc59b4c8b-75k9v" (UID: "d24dc2f4-b906-49c0-be1e-f2326ef12664") : failed to sync secret cache: timed out waiting for the condition Oct 08 12:58:11 crc kubenswrapper[4735]: E1008 12:58:11.196980 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ec35d298-1364-4b8b-9a47-70fa26b2aa64-etcd-service-ca podName:ec35d298-1364-4b8b-9a47-70fa26b2aa64 nodeName:}" failed. No retries permitted until 2025-10-08 12:58:11.696953313 +0000 UTC m=+141.505257897 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etcd-service-ca" (UniqueName: "kubernetes.io/configmap/ec35d298-1364-4b8b-9a47-70fa26b2aa64-etcd-service-ca") pod "etcd-operator-b45778765-p5f5l" (UID: "ec35d298-1364-4b8b-9a47-70fa26b2aa64") : failed to sync configmap cache: timed out waiting for the condition Oct 08 12:58:11 crc kubenswrapper[4735]: E1008 12:58:11.198061 4735 configmap.go:193] Couldn't get configMap openshift-etcd-operator/etcd-operator-config: failed to sync configmap cache: timed out waiting for the condition Oct 08 12:58:11 crc kubenswrapper[4735]: E1008 12:58:11.198182 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ec35d298-1364-4b8b-9a47-70fa26b2aa64-config podName:ec35d298-1364-4b8b-9a47-70fa26b2aa64 nodeName:}" failed. No retries permitted until 2025-10-08 12:58:11.698172624 +0000 UTC m=+141.506477168 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/ec35d298-1364-4b8b-9a47-70fa26b2aa64-config") pod "etcd-operator-b45778765-p5f5l" (UID: "ec35d298-1364-4b8b-9a47-70fa26b2aa64") : failed to sync configmap cache: timed out waiting for the condition Oct 08 12:58:11 crc kubenswrapper[4735]: E1008 12:58:11.199218 4735 secret.go:188] Couldn't get secret openshift-controller-manager-operator/openshift-controller-manager-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Oct 08 12:58:11 crc kubenswrapper[4735]: E1008 12:58:11.199315 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0759b5fe-0415-43f8-b773-99248d05630f-serving-cert podName:0759b5fe-0415-43f8-b773-99248d05630f nodeName:}" failed. No retries permitted until 2025-10-08 12:58:11.699291813 +0000 UTC m=+141.507596457 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/0759b5fe-0415-43f8-b773-99248d05630f-serving-cert") pod "openshift-controller-manager-operator-756b6f6bc6-bmrzp" (UID: "0759b5fe-0415-43f8-b773-99248d05630f") : failed to sync secret cache: timed out waiting for the condition Oct 08 12:58:11 crc kubenswrapper[4735]: E1008 12:58:11.199536 4735 configmap.go:193] Couldn't get configMap openshift-controller-manager-operator/openshift-controller-manager-operator-config: failed to sync configmap cache: timed out waiting for the condition Oct 08 12:58:11 crc kubenswrapper[4735]: E1008 12:58:11.199649 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0759b5fe-0415-43f8-b773-99248d05630f-config podName:0759b5fe-0415-43f8-b773-99248d05630f nodeName:}" failed. No retries permitted until 2025-10-08 12:58:11.699620101 +0000 UTC m=+141.507924685 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/0759b5fe-0415-43f8-b773-99248d05630f-config") pod "openshift-controller-manager-operator-756b6f6bc6-bmrzp" (UID: "0759b5fe-0415-43f8-b773-99248d05630f") : failed to sync configmap cache: timed out waiting for the condition Oct 08 12:58:11 crc kubenswrapper[4735]: E1008 12:58:11.199801 4735 secret.go:188] Couldn't get secret openshift-dns-operator/metrics-tls: failed to sync secret cache: timed out waiting for the condition Oct 08 12:58:11 crc kubenswrapper[4735]: E1008 12:58:11.199980 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58d4a0d7-f90f-497b-8744-dac2ddd08868-metrics-tls podName:58d4a0d7-f90f-497b-8744-dac2ddd08868 nodeName:}" failed. No retries permitted until 2025-10-08 12:58:11.69995871 +0000 UTC m=+141.508263294 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/58d4a0d7-f90f-497b-8744-dac2ddd08868-metrics-tls") pod "dns-operator-744455d44c-cnm5q" (UID: "58d4a0d7-f90f-497b-8744-dac2ddd08868") : failed to sync secret cache: timed out waiting for the condition Oct 08 12:58:11 crc kubenswrapper[4735]: I1008 12:58:11.204946 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Oct 08 12:58:11 crc kubenswrapper[4735]: I1008 12:58:11.223304 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Oct 08 12:58:11 crc kubenswrapper[4735]: I1008 12:58:11.243461 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Oct 08 12:58:11 crc kubenswrapper[4735]: I1008 12:58:11.263446 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Oct 08 12:58:11 crc kubenswrapper[4735]: I1008 12:58:11.284851 4735 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Oct 08 12:58:11 crc kubenswrapper[4735]: I1008 12:58:11.304411 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Oct 08 12:58:11 crc kubenswrapper[4735]: I1008 12:58:11.324149 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Oct 08 12:58:11 crc kubenswrapper[4735]: I1008 12:58:11.344524 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Oct 08 12:58:11 crc kubenswrapper[4735]: I1008 12:58:11.363764 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Oct 08 12:58:11 crc kubenswrapper[4735]: I1008 12:58:11.383980 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Oct 08 12:58:11 crc kubenswrapper[4735]: I1008 12:58:11.422700 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Oct 08 12:58:11 crc kubenswrapper[4735]: I1008 12:58:11.442809 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Oct 08 12:58:11 crc kubenswrapper[4735]: I1008 12:58:11.463914 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Oct 08 12:58:11 crc kubenswrapper[4735]: I1008 12:58:11.485037 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Oct 08 12:58:11 crc kubenswrapper[4735]: I1008 12:58:11.503288 4735 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Oct 08 12:58:11 crc kubenswrapper[4735]: I1008 12:58:11.525126 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Oct 08 12:58:11 crc kubenswrapper[4735]: I1008 12:58:11.544585 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Oct 08 12:58:11 crc kubenswrapper[4735]: I1008 12:58:11.564190 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Oct 08 12:58:11 crc kubenswrapper[4735]: I1008 12:58:11.582990 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Oct 08 12:58:11 crc kubenswrapper[4735]: I1008 12:58:11.603722 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Oct 08 12:58:11 crc kubenswrapper[4735]: I1008 12:58:11.623585 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Oct 08 12:58:11 crc kubenswrapper[4735]: I1008 12:58:11.643515 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Oct 08 12:58:11 crc kubenswrapper[4735]: I1008 12:58:11.683096 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Oct 08 12:58:11 crc kubenswrapper[4735]: I1008 12:58:11.702954 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Oct 08 12:58:11 crc kubenswrapper[4735]: I1008 12:58:11.718857 4735 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/d24dc2f4-b906-49c0-be1e-f2326ef12664-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-75k9v\" (UID: \"d24dc2f4-b906-49c0-be1e-f2326ef12664\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-75k9v" Oct 08 12:58:11 crc kubenswrapper[4735]: I1008 12:58:11.719092 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/58d4a0d7-f90f-497b-8744-dac2ddd08868-metrics-tls\") pod \"dns-operator-744455d44c-cnm5q\" (UID: \"58d4a0d7-f90f-497b-8744-dac2ddd08868\") " pod="openshift-dns-operator/dns-operator-744455d44c-cnm5q" Oct 08 12:58:11 crc kubenswrapper[4735]: I1008 12:58:11.719199 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0759b5fe-0415-43f8-b773-99248d05630f-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-bmrzp\" (UID: \"0759b5fe-0415-43f8-b773-99248d05630f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bmrzp" Oct 08 12:58:11 crc kubenswrapper[4735]: I1008 12:58:11.719302 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0759b5fe-0415-43f8-b773-99248d05630f-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-bmrzp\" (UID: \"0759b5fe-0415-43f8-b773-99248d05630f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bmrzp" Oct 08 12:58:11 crc kubenswrapper[4735]: I1008 12:58:11.719410 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec35d298-1364-4b8b-9a47-70fa26b2aa64-serving-cert\") pod \"etcd-operator-b45778765-p5f5l\" (UID: \"ec35d298-1364-4b8b-9a47-70fa26b2aa64\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-p5f5l" Oct 08 12:58:11 crc kubenswrapper[4735]: I1008 12:58:11.719569 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/ec35d298-1364-4b8b-9a47-70fa26b2aa64-etcd-service-ca\") pod \"etcd-operator-b45778765-p5f5l\" (UID: \"ec35d298-1364-4b8b-9a47-70fa26b2aa64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p5f5l" Oct 08 12:58:11 crc kubenswrapper[4735]: I1008 12:58:11.720123 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec35d298-1364-4b8b-9a47-70fa26b2aa64-config\") pod \"etcd-operator-b45778765-p5f5l\" (UID: \"ec35d298-1364-4b8b-9a47-70fa26b2aa64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p5f5l" Oct 08 12:58:11 crc kubenswrapper[4735]: I1008 12:58:11.720069 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/ec35d298-1364-4b8b-9a47-70fa26b2aa64-etcd-service-ca\") pod \"etcd-operator-b45778765-p5f5l\" (UID: \"ec35d298-1364-4b8b-9a47-70fa26b2aa64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p5f5l" Oct 08 12:58:11 crc kubenswrapper[4735]: I1008 12:58:11.719901 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0759b5fe-0415-43f8-b773-99248d05630f-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-bmrzp\" (UID: \"0759b5fe-0415-43f8-b773-99248d05630f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bmrzp" Oct 08 12:58:11 crc kubenswrapper[4735]: I1008 12:58:11.720714 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec35d298-1364-4b8b-9a47-70fa26b2aa64-config\") pod \"etcd-operator-b45778765-p5f5l\" (UID: 
\"ec35d298-1364-4b8b-9a47-70fa26b2aa64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p5f5l" Oct 08 12:58:11 crc kubenswrapper[4735]: I1008 12:58:11.723301 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/58d4a0d7-f90f-497b-8744-dac2ddd08868-metrics-tls\") pod \"dns-operator-744455d44c-cnm5q\" (UID: \"58d4a0d7-f90f-497b-8744-dac2ddd08868\") " pod="openshift-dns-operator/dns-operator-744455d44c-cnm5q" Oct 08 12:58:11 crc kubenswrapper[4735]: I1008 12:58:11.723487 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0759b5fe-0415-43f8-b773-99248d05630f-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-bmrzp\" (UID: \"0759b5fe-0415-43f8-b773-99248d05630f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bmrzp" Oct 08 12:58:11 crc kubenswrapper[4735]: I1008 12:58:11.723537 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Oct 08 12:58:11 crc kubenswrapper[4735]: I1008 12:58:11.727234 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec35d298-1364-4b8b-9a47-70fa26b2aa64-serving-cert\") pod \"etcd-operator-b45778765-p5f5l\" (UID: \"ec35d298-1364-4b8b-9a47-70fa26b2aa64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p5f5l" Oct 08 12:58:11 crc kubenswrapper[4735]: I1008 12:58:11.727243 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/d24dc2f4-b906-49c0-be1e-f2326ef12664-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-75k9v\" (UID: \"d24dc2f4-b906-49c0-be1e-f2326ef12664\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-75k9v" Oct 08 12:58:11 crc 
kubenswrapper[4735]: I1008 12:58:11.744031 4735 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Oct 08 12:58:11 crc kubenswrapper[4735]: I1008 12:58:11.762909 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Oct 08 12:58:11 crc kubenswrapper[4735]: I1008 12:58:11.784611 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Oct 08 12:58:11 crc kubenswrapper[4735]: I1008 12:58:11.803697 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Oct 08 12:58:11 crc kubenswrapper[4735]: I1008 12:58:11.823738 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Oct 08 12:58:11 crc kubenswrapper[4735]: I1008 12:58:11.843431 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Oct 08 12:58:11 crc kubenswrapper[4735]: I1008 12:58:11.863257 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Oct 08 12:58:11 crc kubenswrapper[4735]: I1008 12:58:11.882586 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Oct 08 12:58:11 crc kubenswrapper[4735]: I1008 12:58:11.903782 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Oct 08 12:58:11 crc kubenswrapper[4735]: I1008 12:58:11.923358 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Oct 08 12:58:11 crc kubenswrapper[4735]: I1008 12:58:11.963832 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-dhj9q\" (UniqueName: \"kubernetes.io/projected/ecbba0bb-2230-4624-8185-5d116632343f-kube-api-access-dhj9q\") pod \"console-f9d7485db-nnp5j\" (UID: \"ecbba0bb-2230-4624-8185-5d116632343f\") " pod="openshift-console/console-f9d7485db-nnp5j" Oct 08 12:58:11 crc kubenswrapper[4735]: I1008 12:58:11.977984 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jswl6\" (UniqueName: \"kubernetes.io/projected/c4031192-8128-46a2-a50b-e6cb49e95b9b-kube-api-access-jswl6\") pod \"authentication-operator-69f744f599-kkwtk\" (UID: \"c4031192-8128-46a2-a50b-e6cb49e95b9b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kkwtk" Oct 08 12:58:11 crc kubenswrapper[4735]: I1008 12:58:11.996971 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-th4gr\" (UniqueName: \"kubernetes.io/projected/0a1a9703-af76-4f95-bb2a-aa4ac6ba6fd8-kube-api-access-th4gr\") pod \"openshift-apiserver-operator-796bbdcf4f-z6f2f\" (UID: \"0a1a9703-af76-4f95-bb2a-aa4ac6ba6fd8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-z6f2f" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.014250 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-kkwtk" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.020554 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/79d2bd38-a2b9-4594-89bf-83c365f75f40-bound-sa-token\") pod \"ingress-operator-5b745b69d9-t8cv2\" (UID: \"79d2bd38-a2b9-4594-89bf-83c365f75f40\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t8cv2" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.042283 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mh76\" (UniqueName: \"kubernetes.io/projected/a85dc80e-3e52-4eb0-a5c9-9b3c4d6e84d5-kube-api-access-6mh76\") pod \"machine-approver-56656f9798-wd7cr\" (UID: \"a85dc80e-3e52-4eb0-a5c9-9b3c4d6e84d5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wd7cr" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.061552 4735 request.go:700] Waited for 1.954416156s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-api/serviceaccounts/machine-api-operator/token Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.064416 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spz7k\" (UniqueName: \"kubernetes.io/projected/49e6e79f-93b3-453f-83d1-2afc40a4637c-kube-api-access-spz7k\") pod \"route-controller-manager-6576b87f9c-rgtx4\" (UID: \"49e6e79f-93b3-453f-83d1-2afc40a4637c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rgtx4" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.077347 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxhvp\" (UniqueName: \"kubernetes.io/projected/99871c37-c1b7-48c4-bdec-d01234433109-kube-api-access-bxhvp\") pod \"machine-api-operator-5694c8668f-xkrqh\" 
(UID: \"99871c37-c1b7-48c4-bdec-d01234433109\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xkrqh" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.082281 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-nnp5j" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.099217 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqqpn\" (UniqueName: \"kubernetes.io/projected/e2a1fa01-a8d6-421b-b66c-626360051888-kube-api-access-xqqpn\") pod \"olm-operator-6b444d44fb-wcmct\" (UID: \"e2a1fa01-a8d6-421b-b66c-626360051888\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wcmct" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.128748 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbqdw\" (UniqueName: \"kubernetes.io/projected/25df5801-9a21-4396-8613-abddc02daf42-kube-api-access-qbqdw\") pod \"apiserver-7bbb656c7d-gq4dv\" (UID: \"25df5801-9a21-4396-8613-abddc02daf42\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gq4dv" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.147653 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gq4dv" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.148431 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grpzl\" (UniqueName: \"kubernetes.io/projected/6e891ada-e4a4-4e15-8ade-48d0f40518da-kube-api-access-grpzl\") pod \"oauth-openshift-558db77b4-xqfrr\" (UID: \"6e891ada-e4a4-4e15-8ade-48d0f40518da\") " pod="openshift-authentication/oauth-openshift-558db77b4-xqfrr" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.159433 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-z6f2f" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.167320 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cf88x\" (UniqueName: \"kubernetes.io/projected/15a08015-bcc0-4576-8f54-70e5693469a4-kube-api-access-cf88x\") pod \"controller-manager-879f6c89f-8rqp7\" (UID: \"15a08015-bcc0-4576-8f54-70e5693469a4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8rqp7" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.183307 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wcmct" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.184340 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4pwd\" (UniqueName: \"kubernetes.io/projected/79d2bd38-a2b9-4594-89bf-83c365f75f40-kube-api-access-k4pwd\") pod \"ingress-operator-5b745b69d9-t8cv2\" (UID: \"79d2bd38-a2b9-4594-89bf-83c365f75f40\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t8cv2" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.203436 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qswh\" (UniqueName: \"kubernetes.io/projected/ebf462d1-8994-47ec-b534-e6dc8696bbd0-kube-api-access-7qswh\") pod \"service-ca-9c57cc56f-bvv9p\" (UID: \"ebf462d1-8994-47ec-b534-e6dc8696bbd0\") " pod="openshift-service-ca/service-ca-9c57cc56f-bvv9p" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.220712 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clzmm\" (UniqueName: \"kubernetes.io/projected/e4c4252c-e8f3-411c-8d61-9ed06ae9a5de-kube-api-access-clzmm\") pod \"package-server-manager-789f6589d5-lrcx8\" (UID: \"e4c4252c-e8f3-411c-8d61-9ed06ae9a5de\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lrcx8" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.245798 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f84042bd-cb46-444d-a3c9-ce1519dea75b-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-sjbv7\" (UID: \"f84042bd-cb46-444d-a3c9-ce1519dea75b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sjbv7" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.261606 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-kkwtk"] Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.269080 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trcfb\" (UniqueName: \"kubernetes.io/projected/e60e2365-ae6c-44a2-a7a3-a0c67f0d756a-kube-api-access-trcfb\") pod \"machine-config-operator-74547568cd-c4j7s\" (UID: \"e60e2365-ae6c-44a2-a7a3-a0c67f0d756a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c4j7s" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.274592 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lrcx8" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.287680 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-nnp5j"] Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.289597 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5td7v\" (UniqueName: \"kubernetes.io/projected/d169069b-e5fe-4018-974d-97cf1c1cad23-kube-api-access-5td7v\") pod \"marketplace-operator-79b997595-gp925\" (UID: \"d169069b-e5fe-4018-974d-97cf1c1cad23\") " pod="openshift-marketplace/marketplace-operator-79b997595-gp925" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.296394 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sjbv7" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.302250 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2wkk\" (UniqueName: \"kubernetes.io/projected/0759b5fe-0415-43f8-b773-99248d05630f-kube-api-access-h2wkk\") pod \"openshift-controller-manager-operator-756b6f6bc6-bmrzp\" (UID: \"0759b5fe-0415-43f8-b773-99248d05630f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bmrzp" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.318922 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dv5r2\" (UniqueName: \"kubernetes.io/projected/d95c4bbd-337f-454c-a57f-c47db8bc6ecb-kube-api-access-dv5r2\") pod \"packageserver-d55dfcdfc-cfmds\" (UID: \"d95c4bbd-337f-454c-a57f-c47db8bc6ecb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfmds" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.336123 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-gq4dv"] Oct 
08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.338893 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8r6m\" (UniqueName: \"kubernetes.io/projected/c881e3aa-6bfe-476c-9e63-edfa5480745d-kube-api-access-q8r6m\") pod \"apiserver-76f77b778f-zxtmj\" (UID: \"c881e3aa-6bfe-476c-9e63-edfa5480745d\") " pod="openshift-apiserver/apiserver-76f77b778f-zxtmj" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.340642 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wd7cr" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.355376 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfmds" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.360959 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7db24\" (UniqueName: \"kubernetes.io/projected/d24dc2f4-b906-49c0-be1e-f2326ef12664-kube-api-access-7db24\") pod \"cluster-image-registry-operator-dc59b4c8b-75k9v\" (UID: \"d24dc2f4-b906-49c0-be1e-f2326ef12664\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-75k9v" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.370602 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-bvv9p" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.374036 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-xkrqh" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.379372 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gp925" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.380259 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d24dc2f4-b906-49c0-be1e-f2326ef12664-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-75k9v\" (UID: \"d24dc2f4-b906-49c0-be1e-f2326ef12664\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-75k9v" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.402875 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-md9q9\" (UniqueName: \"kubernetes.io/projected/58d4a0d7-f90f-497b-8744-dac2ddd08868-kube-api-access-md9q9\") pod \"dns-operator-744455d44c-cnm5q\" (UID: \"58d4a0d7-f90f-497b-8744-dac2ddd08868\") " pod="openshift-dns-operator/dns-operator-744455d44c-cnm5q" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.410117 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wcmct"] Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.413211 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-zxtmj" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.419253 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-xqfrr" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.422848 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgcjb\" (UniqueName: \"kubernetes.io/projected/ec35d298-1364-4b8b-9a47-70fa26b2aa64-kube-api-access-qgcjb\") pod \"etcd-operator-b45778765-p5f5l\" (UID: \"ec35d298-1364-4b8b-9a47-70fa26b2aa64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p5f5l" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.430076 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-p5f5l" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.438470 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-8rqp7" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.439414 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-cnm5q" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.440549 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-z6f2f"] Oct 08 12:58:12 crc kubenswrapper[4735]: W1008 12:58:12.442261 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2a1fa01_a8d6_421b_b66c_626360051888.slice/crio-6311e508d86805b825e8315f323f2d4245cc0f0ee9561e7953927bdcc3e0a00b WatchSource:0}: Error finding container 6311e508d86805b825e8315f323f2d4245cc0f0ee9561e7953927bdcc3e0a00b: Status 404 returned error can't find the container with id 6311e508d86805b825e8315f323f2d4245cc0f0ee9561e7953927bdcc3e0a00b Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.442774 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhw57\" (UniqueName: \"kubernetes.io/projected/c4f3bfe2-c8ee-4ac0-915d-2be166ccbb6f-kube-api-access-dhw57\") pod \"router-default-5444994796-jgkq6\" (UID: \"c4f3bfe2-c8ee-4ac0-915d-2be166ccbb6f\") " pod="openshift-ingress/router-default-5444994796-jgkq6" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.457859 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bmrzp" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.461118 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqv22\" (UniqueName: \"kubernetes.io/projected/97f188e2-ee83-4fb4-b334-33c3099df3f2-kube-api-access-dqv22\") pod \"migrator-59844c95c7-qjb22\" (UID: \"97f188e2-ee83-4fb4-b334-33c3099df3f2\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qjb22" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.468153 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.468464 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rgtx4" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.468808 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-75k9v" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.480693 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t8cv2" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.533329 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lrcx8"] Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.547673 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c4j7s" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.551759 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73926bea-3da3-4e00-a78c-b0c33b4b1919-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-6vdgs\" (UID: \"73926bea-3da3-4e00-a78c-b0c33b4b1919\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6vdgs" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.551806 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rbts\" (UniqueName: \"kubernetes.io/projected/14f021da-25d5-4a75-82f3-b6c20c65dc7c-kube-api-access-2rbts\") pod \"downloads-7954f5f757-887mh\" (UID: \"14f021da-25d5-4a75-82f3-b6c20c65dc7c\") " pod="openshift-console/downloads-7954f5f757-887mh" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.551853 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/fdf78d81-ae87-48c8-b189-eff27030dd91-profile-collector-cert\") pod \"catalog-operator-68c6474976-bw544\" (UID: \"fdf78d81-ae87-48c8-b189-eff27030dd91\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bw544" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.551879 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/477a3236-05fd-494b-8eb3-fd178d3180be-config\") pod \"console-operator-58897d9998-bqm6j\" (UID: \"477a3236-05fd-494b-8eb3-fd178d3180be\") " pod="openshift-console-operator/console-operator-58897d9998-bqm6j" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.551900 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b266\" (UniqueName: \"kubernetes.io/projected/68f7f9bf-73a9-4109-a07c-e026b3d11513-kube-api-access-7b266\") pod \"service-ca-operator-777779d784-fxkkf\" (UID: \"68f7f9bf-73a9-4109-a07c-e026b3d11513\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fxkkf" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.554193 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b1f96523-d3ab-4768-8de6-c901e255ae74-ca-trust-extracted\") pod \"image-registry-697d97f7c8-vgv9c\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.554232 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwdgv\" (UniqueName: \"kubernetes.io/projected/61708c63-4a5a-4e83-8243-02025b05df64-kube-api-access-hwdgv\") pod \"openshift-config-operator-7777fb866f-jnr7v\" (UID: \"61708c63-4a5a-4e83-8243-02025b05df64\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jnr7v" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.554263 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b1f96523-d3ab-4768-8de6-c901e255ae74-installation-pull-secrets\") pod \"image-registry-697d97f7c8-vgv9c\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.554311 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e100c510-2957-4946-b21d-e9227e32d7b0-webhook-certs\") pod 
\"multus-admission-controller-857f4d67dd-dc7kk\" (UID: \"e100c510-2957-4946-b21d-e9227e32d7b0\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-dc7kk" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.554332 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b1f96523-d3ab-4768-8de6-c901e255ae74-registry-certificates\") pod \"image-registry-697d97f7c8-vgv9c\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.554366 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83bd6935-d7c7-4548-add9-647cdfcd15ee-config\") pod \"kube-controller-manager-operator-78b949d7b-xjxt2\" (UID: \"83bd6935-d7c7-4548-add9-647cdfcd15ee\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xjxt2" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.554383 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8hhf\" (UniqueName: \"kubernetes.io/projected/543b04b8-1dda-487e-9382-ba18d7be25e5-kube-api-access-k8hhf\") pod \"cluster-samples-operator-665b6dd947-7v5q4\" (UID: \"543b04b8-1dda-487e-9382-ba18d7be25e5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7v5q4" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.554415 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c787ebd-7a0e-40ae-b537-e09ae9f12578-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-djbhr\" (UID: \"9c787ebd-7a0e-40ae-b537-e09ae9f12578\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-djbhr" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.554469 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/93cce0d5-b158-4619-95f3-eaa50b8d9af4-secret-volume\") pod \"collect-profiles-29332125-6dxj9\" (UID: \"93cce0d5-b158-4619-95f3-eaa50b8d9af4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332125-6dxj9" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.554537 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxmjk\" (UniqueName: \"kubernetes.io/projected/b1f96523-d3ab-4768-8de6-c901e255ae74-kube-api-access-hxmjk\") pod \"image-registry-697d97f7c8-vgv9c\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.554606 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b1f96523-d3ab-4768-8de6-c901e255ae74-bound-sa-token\") pod \"image-registry-697d97f7c8-vgv9c\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.554626 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/93cce0d5-b158-4619-95f3-eaa50b8d9af4-config-volume\") pod \"collect-profiles-29332125-6dxj9\" (UID: \"93cce0d5-b158-4619-95f3-eaa50b8d9af4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332125-6dxj9" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.554806 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9c787ebd-7a0e-40ae-b537-e09ae9f12578-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-djbhr\" (UID: \"9c787ebd-7a0e-40ae-b537-e09ae9f12578\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-djbhr" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.554869 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwm6x\" (UniqueName: \"kubernetes.io/projected/d316a645-bd82-4695-a79d-d31896b704eb-kube-api-access-gwm6x\") pod \"machine-config-controller-84d6567774-gcn29\" (UID: \"d316a645-bd82-4695-a79d-d31896b704eb\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gcn29" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.554894 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dl48c\" (UniqueName: \"kubernetes.io/projected/e100c510-2957-4946-b21d-e9227e32d7b0-kube-api-access-dl48c\") pod \"multus-admission-controller-857f4d67dd-dc7kk\" (UID: \"e100c510-2957-4946-b21d-e9227e32d7b0\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-dc7kk" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.554939 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqwdb\" (UniqueName: \"kubernetes.io/projected/fdf78d81-ae87-48c8-b189-eff27030dd91-kube-api-access-fqwdb\") pod \"catalog-operator-68c6474976-bw544\" (UID: \"fdf78d81-ae87-48c8-b189-eff27030dd91\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bw544" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.554961 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: 
\"kubernetes.io/empty-dir/61708c63-4a5a-4e83-8243-02025b05df64-available-featuregates\") pod \"openshift-config-operator-7777fb866f-jnr7v\" (UID: \"61708c63-4a5a-4e83-8243-02025b05df64\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jnr7v" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.555025 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sjbv7"] Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.555025 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/73926bea-3da3-4e00-a78c-b0c33b4b1919-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-6vdgs\" (UID: \"73926bea-3da3-4e00-a78c-b0c33b4b1919\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6vdgs" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.555093 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/543b04b8-1dda-487e-9382-ba18d7be25e5-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-7v5q4\" (UID: \"543b04b8-1dda-487e-9382-ba18d7be25e5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7v5q4" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.555116 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfcbk\" (UniqueName: \"kubernetes.io/projected/bb722212-8f18-46b9-b5b5-484352c49bba-kube-api-access-lfcbk\") pod \"control-plane-machine-set-operator-78cbb6b69f-zmvkr\" (UID: \"bb722212-8f18-46b9-b5b5-484352c49bba\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zmvkr" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.555144 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/83bd6935-d7c7-4548-add9-647cdfcd15ee-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-xjxt2\" (UID: \"83bd6935-d7c7-4548-add9-647cdfcd15ee\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xjxt2" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.555166 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rvks\" (UniqueName: \"kubernetes.io/projected/477a3236-05fd-494b-8eb3-fd178d3180be-kube-api-access-7rvks\") pod \"console-operator-58897d9998-bqm6j\" (UID: \"477a3236-05fd-494b-8eb3-fd178d3180be\") " pod="openshift-console-operator/console-operator-58897d9998-bqm6j" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.555198 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/fdf78d81-ae87-48c8-b189-eff27030dd91-srv-cert\") pod \"catalog-operator-68c6474976-bw544\" (UID: \"fdf78d81-ae87-48c8-b189-eff27030dd91\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bw544" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.555289 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c787ebd-7a0e-40ae-b537-e09ae9f12578-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-djbhr\" (UID: \"9c787ebd-7a0e-40ae-b537-e09ae9f12578\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-djbhr" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.555331 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68f7f9bf-73a9-4109-a07c-e026b3d11513-serving-cert\") 
pod \"service-ca-operator-777779d784-fxkkf\" (UID: \"68f7f9bf-73a9-4109-a07c-e026b3d11513\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fxkkf" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.555367 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d316a645-bd82-4695-a79d-d31896b704eb-proxy-tls\") pod \"machine-config-controller-84d6567774-gcn29\" (UID: \"d316a645-bd82-4695-a79d-d31896b704eb\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gcn29" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.555402 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/477a3236-05fd-494b-8eb3-fd178d3180be-serving-cert\") pod \"console-operator-58897d9998-bqm6j\" (UID: \"477a3236-05fd-494b-8eb3-fd178d3180be\") " pod="openshift-console-operator/console-operator-58897d9998-bqm6j" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.555420 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/bb722212-8f18-46b9-b5b5-484352c49bba-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-zmvkr\" (UID: \"bb722212-8f18-46b9-b5b5-484352c49bba\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zmvkr" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.555457 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szl6r\" (UniqueName: \"kubernetes.io/projected/73926bea-3da3-4e00-a78c-b0c33b4b1919-kube-api-access-szl6r\") pod \"kube-storage-version-migrator-operator-b67b599dd-6vdgs\" (UID: \"73926bea-3da3-4e00-a78c-b0c33b4b1919\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6vdgs" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.555500 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b1f96523-d3ab-4768-8de6-c901e255ae74-registry-tls\") pod \"image-registry-697d97f7c8-vgv9c\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.555553 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68f7f9bf-73a9-4109-a07c-e026b3d11513-config\") pod \"service-ca-operator-777779d784-fxkkf\" (UID: \"68f7f9bf-73a9-4109-a07c-e026b3d11513\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fxkkf" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.555577 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/477a3236-05fd-494b-8eb3-fd178d3180be-trusted-ca\") pod \"console-operator-58897d9998-bqm6j\" (UID: \"477a3236-05fd-494b-8eb3-fd178d3180be\") " pod="openshift-console-operator/console-operator-58897d9998-bqm6j" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.555593 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61708c63-4a5a-4e83-8243-02025b05df64-serving-cert\") pod \"openshift-config-operator-7777fb866f-jnr7v\" (UID: \"61708c63-4a5a-4e83-8243-02025b05df64\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jnr7v" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.555610 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/83bd6935-d7c7-4548-add9-647cdfcd15ee-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-xjxt2\" (UID: \"83bd6935-d7c7-4548-add9-647cdfcd15ee\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xjxt2" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.555625 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d316a645-bd82-4695-a79d-d31896b704eb-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-gcn29\" (UID: \"d316a645-bd82-4695-a79d-d31896b704eb\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gcn29" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.555640 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b1f96523-d3ab-4768-8de6-c901e255ae74-trusted-ca\") pod \"image-registry-697d97f7c8-vgv9c\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.555661 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9rdp\" (UniqueName: \"kubernetes.io/projected/93cce0d5-b158-4619-95f3-eaa50b8d9af4-kube-api-access-r9rdp\") pod \"collect-profiles-29332125-6dxj9\" (UID: \"93cce0d5-b158-4619-95f3-eaa50b8d9af4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332125-6dxj9" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.555688 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgv9c\" 
(UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c" Oct 08 12:58:12 crc kubenswrapper[4735]: E1008 12:58:12.555957 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 12:58:13.055946789 +0000 UTC m=+142.864251333 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgv9c" (UID: "b1f96523-d3ab-4768-8de6-c901e255ae74") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.568697 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-jgkq6" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.588773 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qjb22" Oct 08 12:58:12 crc kubenswrapper[4735]: W1008 12:58:12.630646 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4c4252c_e8f3_411c_8d61_9ed06ae9a5de.slice/crio-5517defe693638f633fb8bd8a308e6d9d0045babf7be21e926ac98b6fd682881 WatchSource:0}: Error finding container 5517defe693638f633fb8bd8a308e6d9d0045babf7be21e926ac98b6fd682881: Status 404 returned error can't find the container with id 5517defe693638f633fb8bd8a308e6d9d0045babf7be21e926ac98b6fd682881 Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.656901 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.657073 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c787ebd-7a0e-40ae-b537-e09ae9f12578-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-djbhr\" (UID: \"9c787ebd-7a0e-40ae-b537-e09ae9f12578\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-djbhr" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.657118 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68f7f9bf-73a9-4109-a07c-e026b3d11513-serving-cert\") pod \"service-ca-operator-777779d784-fxkkf\" (UID: \"68f7f9bf-73a9-4109-a07c-e026b3d11513\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fxkkf" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.657135 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d316a645-bd82-4695-a79d-d31896b704eb-proxy-tls\") pod \"machine-config-controller-84d6567774-gcn29\" (UID: \"d316a645-bd82-4695-a79d-d31896b704eb\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gcn29"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.657175 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spc92\" (UniqueName: \"kubernetes.io/projected/87ee9fbe-46aa-4075-b289-ef083cea461b-kube-api-access-spc92\") pod \"csi-hostpathplugin-4drk9\" (UID: \"87ee9fbe-46aa-4075-b289-ef083cea461b\") " pod="hostpath-provisioner/csi-hostpathplugin-4drk9"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.657193 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpb8f\" (UniqueName: \"kubernetes.io/projected/2bf3f935-07ed-496e-8426-da64835d0c65-kube-api-access-gpb8f\") pod \"machine-config-server-7nmkc\" (UID: \"2bf3f935-07ed-496e-8426-da64835d0c65\") " pod="openshift-machine-config-operator/machine-config-server-7nmkc"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.657235 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/477a3236-05fd-494b-8eb3-fd178d3180be-serving-cert\") pod \"console-operator-58897d9998-bqm6j\" (UID: \"477a3236-05fd-494b-8eb3-fd178d3180be\") " pod="openshift-console-operator/console-operator-58897d9998-bqm6j"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.657266 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/bb722212-8f18-46b9-b5b5-484352c49bba-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-zmvkr\" (UID: \"bb722212-8f18-46b9-b5b5-484352c49bba\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zmvkr"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.657296 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szl6r\" (UniqueName: \"kubernetes.io/projected/73926bea-3da3-4e00-a78c-b0c33b4b1919-kube-api-access-szl6r\") pod \"kube-storage-version-migrator-operator-b67b599dd-6vdgs\" (UID: \"73926bea-3da3-4e00-a78c-b0c33b4b1919\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6vdgs"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.657320 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b1f96523-d3ab-4768-8de6-c901e255ae74-registry-tls\") pod \"image-registry-697d97f7c8-vgv9c\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.657345 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/87ee9fbe-46aa-4075-b289-ef083cea461b-plugins-dir\") pod \"csi-hostpathplugin-4drk9\" (UID: \"87ee9fbe-46aa-4075-b289-ef083cea461b\") " pod="hostpath-provisioner/csi-hostpathplugin-4drk9"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.657361 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68f7f9bf-73a9-4109-a07c-e026b3d11513-config\") pod \"service-ca-operator-777779d784-fxkkf\" (UID: \"68f7f9bf-73a9-4109-a07c-e026b3d11513\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fxkkf"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.657401 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/477a3236-05fd-494b-8eb3-fd178d3180be-trusted-ca\") pod \"console-operator-58897d9998-bqm6j\" (UID: \"477a3236-05fd-494b-8eb3-fd178d3180be\") " pod="openshift-console-operator/console-operator-58897d9998-bqm6j"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.657417 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61708c63-4a5a-4e83-8243-02025b05df64-serving-cert\") pod \"openshift-config-operator-7777fb866f-jnr7v\" (UID: \"61708c63-4a5a-4e83-8243-02025b05df64\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jnr7v"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.657432 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83bd6935-d7c7-4548-add9-647cdfcd15ee-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-xjxt2\" (UID: \"83bd6935-d7c7-4548-add9-647cdfcd15ee\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xjxt2"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.657446 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d316a645-bd82-4695-a79d-d31896b704eb-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-gcn29\" (UID: \"d316a645-bd82-4695-a79d-d31896b704eb\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gcn29"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.657461 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b1f96523-d3ab-4768-8de6-c901e255ae74-trusted-ca\") pod \"image-registry-697d97f7c8-vgv9c\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.657503 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/87ee9fbe-46aa-4075-b289-ef083cea461b-registration-dir\") pod \"csi-hostpathplugin-4drk9\" (UID: \"87ee9fbe-46aa-4075-b289-ef083cea461b\") " pod="hostpath-provisioner/csi-hostpathplugin-4drk9"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.657528 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0b7e3bce-3aca-4e94-8d98-11009fc0ab3e-metrics-tls\") pod \"dns-default-csnmr\" (UID: \"0b7e3bce-3aca-4e94-8d98-11009fc0ab3e\") " pod="openshift-dns/dns-default-csnmr"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.657564 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9rdp\" (UniqueName: \"kubernetes.io/projected/93cce0d5-b158-4619-95f3-eaa50b8d9af4-kube-api-access-r9rdp\") pod \"collect-profiles-29332125-6dxj9\" (UID: \"93cce0d5-b158-4619-95f3-eaa50b8d9af4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332125-6dxj9"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.657619 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73926bea-3da3-4e00-a78c-b0c33b4b1919-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-6vdgs\" (UID: \"73926bea-3da3-4e00-a78c-b0c33b4b1919\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6vdgs"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.657644 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rbts\" (UniqueName: \"kubernetes.io/projected/14f021da-25d5-4a75-82f3-b6c20c65dc7c-kube-api-access-2rbts\") pod \"downloads-7954f5f757-887mh\" (UID: \"14f021da-25d5-4a75-82f3-b6c20c65dc7c\") " pod="openshift-console/downloads-7954f5f757-887mh"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.657661 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/fdf78d81-ae87-48c8-b189-eff27030dd91-profile-collector-cert\") pod \"catalog-operator-68c6474976-bw544\" (UID: \"fdf78d81-ae87-48c8-b189-eff27030dd91\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bw544"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.657685 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/87ee9fbe-46aa-4075-b289-ef083cea461b-mountpoint-dir\") pod \"csi-hostpathplugin-4drk9\" (UID: \"87ee9fbe-46aa-4075-b289-ef083cea461b\") " pod="hostpath-provisioner/csi-hostpathplugin-4drk9"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.657755 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/477a3236-05fd-494b-8eb3-fd178d3180be-config\") pod \"console-operator-58897d9998-bqm6j\" (UID: \"477a3236-05fd-494b-8eb3-fd178d3180be\") " pod="openshift-console-operator/console-operator-58897d9998-bqm6j"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.657798 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7b266\" (UniqueName: \"kubernetes.io/projected/68f7f9bf-73a9-4109-a07c-e026b3d11513-kube-api-access-7b266\") pod \"service-ca-operator-777779d784-fxkkf\" (UID: \"68f7f9bf-73a9-4109-a07c-e026b3d11513\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fxkkf"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.657869 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b1f96523-d3ab-4768-8de6-c901e255ae74-ca-trust-extracted\") pod \"image-registry-697d97f7c8-vgv9c\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.657899 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwdgv\" (UniqueName: \"kubernetes.io/projected/61708c63-4a5a-4e83-8243-02025b05df64-kube-api-access-hwdgv\") pod \"openshift-config-operator-7777fb866f-jnr7v\" (UID: \"61708c63-4a5a-4e83-8243-02025b05df64\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jnr7v"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.657924 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b1f96523-d3ab-4768-8de6-c901e255ae74-installation-pull-secrets\") pod \"image-registry-697d97f7c8-vgv9c\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.657949 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e100c510-2957-4946-b21d-e9227e32d7b0-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-dc7kk\" (UID: \"e100c510-2957-4946-b21d-e9227e32d7b0\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-dc7kk"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.657977 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b7e3bce-3aca-4e94-8d98-11009fc0ab3e-config-volume\") pod \"dns-default-csnmr\" (UID: \"0b7e3bce-3aca-4e94-8d98-11009fc0ab3e\") " pod="openshift-dns/dns-default-csnmr"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.658028 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b1f96523-d3ab-4768-8de6-c901e255ae74-registry-certificates\") pod \"image-registry-697d97f7c8-vgv9c\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.658060 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83bd6935-d7c7-4548-add9-647cdfcd15ee-config\") pod \"kube-controller-manager-operator-78b949d7b-xjxt2\" (UID: \"83bd6935-d7c7-4548-add9-647cdfcd15ee\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xjxt2"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.658078 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8hhf\" (UniqueName: \"kubernetes.io/projected/543b04b8-1dda-487e-9382-ba18d7be25e5-kube-api-access-k8hhf\") pod \"cluster-samples-operator-665b6dd947-7v5q4\" (UID: \"543b04b8-1dda-487e-9382-ba18d7be25e5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7v5q4"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.658095 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/2bf3f935-07ed-496e-8426-da64835d0c65-certs\") pod \"machine-config-server-7nmkc\" (UID: \"2bf3f935-07ed-496e-8426-da64835d0c65\") " pod="openshift-machine-config-operator/machine-config-server-7nmkc"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.658159 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c787ebd-7a0e-40ae-b537-e09ae9f12578-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-djbhr\" (UID: \"9c787ebd-7a0e-40ae-b537-e09ae9f12578\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-djbhr"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.658176 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/93cce0d5-b158-4619-95f3-eaa50b8d9af4-secret-volume\") pod \"collect-profiles-29332125-6dxj9\" (UID: \"93cce0d5-b158-4619-95f3-eaa50b8d9af4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332125-6dxj9"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.658194 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxmjk\" (UniqueName: \"kubernetes.io/projected/b1f96523-d3ab-4768-8de6-c901e255ae74-kube-api-access-hxmjk\") pod \"image-registry-697d97f7c8-vgv9c\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.658241 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b1f96523-d3ab-4768-8de6-c901e255ae74-bound-sa-token\") pod \"image-registry-697d97f7c8-vgv9c\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.658292 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/93cce0d5-b158-4619-95f3-eaa50b8d9af4-config-volume\") pod \"collect-profiles-29332125-6dxj9\" (UID: \"93cce0d5-b158-4619-95f3-eaa50b8d9af4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332125-6dxj9"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.658358 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9c787ebd-7a0e-40ae-b537-e09ae9f12578-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-djbhr\" (UID: \"9c787ebd-7a0e-40ae-b537-e09ae9f12578\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-djbhr"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.658385 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/87ee9fbe-46aa-4075-b289-ef083cea461b-socket-dir\") pod \"csi-hostpathplugin-4drk9\" (UID: \"87ee9fbe-46aa-4075-b289-ef083cea461b\") " pod="hostpath-provisioner/csi-hostpathplugin-4drk9"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.658402 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/87ee9fbe-46aa-4075-b289-ef083cea461b-csi-data-dir\") pod \"csi-hostpathplugin-4drk9\" (UID: \"87ee9fbe-46aa-4075-b289-ef083cea461b\") " pod="hostpath-provisioner/csi-hostpathplugin-4drk9"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.658424 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwm6x\" (UniqueName: \"kubernetes.io/projected/d316a645-bd82-4695-a79d-d31896b704eb-kube-api-access-gwm6x\") pod \"machine-config-controller-84d6567774-gcn29\" (UID: \"d316a645-bd82-4695-a79d-d31896b704eb\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gcn29"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.658451 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dl48c\" (UniqueName: \"kubernetes.io/projected/e100c510-2957-4946-b21d-e9227e32d7b0-kube-api-access-dl48c\") pod \"multus-admission-controller-857f4d67dd-dc7kk\" (UID: \"e100c510-2957-4946-b21d-e9227e32d7b0\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-dc7kk"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.658526 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m25j7\" (UniqueName: \"kubernetes.io/projected/0b7e3bce-3aca-4e94-8d98-11009fc0ab3e-kube-api-access-m25j7\") pod \"dns-default-csnmr\" (UID: \"0b7e3bce-3aca-4e94-8d98-11009fc0ab3e\") " pod="openshift-dns/dns-default-csnmr"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.658592 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqwdb\" (UniqueName: \"kubernetes.io/projected/fdf78d81-ae87-48c8-b189-eff27030dd91-kube-api-access-fqwdb\") pod \"catalog-operator-68c6474976-bw544\" (UID: \"fdf78d81-ae87-48c8-b189-eff27030dd91\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bw544"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.658611 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/61708c63-4a5a-4e83-8243-02025b05df64-available-featuregates\") pod \"openshift-config-operator-7777fb866f-jnr7v\" (UID: \"61708c63-4a5a-4e83-8243-02025b05df64\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jnr7v"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.658634 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e492bc45-cd6d-4298-b279-0c5f82f91e8c-cert\") pod \"ingress-canary-lshv7\" (UID: \"e492bc45-cd6d-4298-b279-0c5f82f91e8c\") " pod="openshift-ingress-canary/ingress-canary-lshv7"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.658680 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/73926bea-3da3-4e00-a78c-b0c33b4b1919-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-6vdgs\" (UID: \"73926bea-3da3-4e00-a78c-b0c33b4b1919\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6vdgs"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.658697 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/2bf3f935-07ed-496e-8426-da64835d0c65-node-bootstrap-token\") pod \"machine-config-server-7nmkc\" (UID: \"2bf3f935-07ed-496e-8426-da64835d0c65\") " pod="openshift-machine-config-operator/machine-config-server-7nmkc"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.658737 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/543b04b8-1dda-487e-9382-ba18d7be25e5-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-7v5q4\" (UID: \"543b04b8-1dda-487e-9382-ba18d7be25e5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7v5q4"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.658753 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfcbk\" (UniqueName: \"kubernetes.io/projected/bb722212-8f18-46b9-b5b5-484352c49bba-kube-api-access-lfcbk\") pod \"control-plane-machine-set-operator-78cbb6b69f-zmvkr\" (UID: \"bb722212-8f18-46b9-b5b5-484352c49bba\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zmvkr"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.658773 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4w4lv\" (UniqueName: \"kubernetes.io/projected/e492bc45-cd6d-4298-b279-0c5f82f91e8c-kube-api-access-4w4lv\") pod \"ingress-canary-lshv7\" (UID: \"e492bc45-cd6d-4298-b279-0c5f82f91e8c\") " pod="openshift-ingress-canary/ingress-canary-lshv7"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.658794 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/83bd6935-d7c7-4548-add9-647cdfcd15ee-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-xjxt2\" (UID: \"83bd6935-d7c7-4548-add9-647cdfcd15ee\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xjxt2"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.658950 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rvks\" (UniqueName: \"kubernetes.io/projected/477a3236-05fd-494b-8eb3-fd178d3180be-kube-api-access-7rvks\") pod \"console-operator-58897d9998-bqm6j\" (UID: \"477a3236-05fd-494b-8eb3-fd178d3180be\") " pod="openshift-console-operator/console-operator-58897d9998-bqm6j"
Oct 08 12:58:12 crc kubenswrapper[4735]: E1008 12:58:12.658986 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 12:58:13.158956609 +0000 UTC m=+142.967261193 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.659064 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/fdf78d81-ae87-48c8-b189-eff27030dd91-srv-cert\") pod \"catalog-operator-68c6474976-bw544\" (UID: \"fdf78d81-ae87-48c8-b189-eff27030dd91\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bw544"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.662922 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b1f96523-d3ab-4768-8de6-c901e255ae74-ca-trust-extracted\") pod \"image-registry-697d97f7c8-vgv9c\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.663077 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/477a3236-05fd-494b-8eb3-fd178d3180be-config\") pod \"console-operator-58897d9998-bqm6j\" (UID: \"477a3236-05fd-494b-8eb3-fd178d3180be\") " pod="openshift-console-operator/console-operator-58897d9998-bqm6j"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.663395 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c787ebd-7a0e-40ae-b537-e09ae9f12578-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-djbhr\" (UID: \"9c787ebd-7a0e-40ae-b537-e09ae9f12578\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-djbhr"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.663597 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/93cce0d5-b158-4619-95f3-eaa50b8d9af4-config-volume\") pod \"collect-profiles-29332125-6dxj9\" (UID: \"93cce0d5-b158-4619-95f3-eaa50b8d9af4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332125-6dxj9"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.673028 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/61708c63-4a5a-4e83-8243-02025b05df64-available-featuregates\") pod \"openshift-config-operator-7777fb866f-jnr7v\" (UID: \"61708c63-4a5a-4e83-8243-02025b05df64\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jnr7v"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.680534 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b1f96523-d3ab-4768-8de6-c901e255ae74-installation-pull-secrets\") pod \"image-registry-697d97f7c8-vgv9c\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.680973 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/73926bea-3da3-4e00-a78c-b0c33b4b1919-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-6vdgs\" (UID: \"73926bea-3da3-4e00-a78c-b0c33b4b1919\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6vdgs"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.683963 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d316a645-bd82-4695-a79d-d31896b704eb-proxy-tls\") pod \"machine-config-controller-84d6567774-gcn29\" (UID: \"d316a645-bd82-4695-a79d-d31896b704eb\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gcn29"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.684589 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/477a3236-05fd-494b-8eb3-fd178d3180be-trusted-ca\") pod \"console-operator-58897d9998-bqm6j\" (UID: \"477a3236-05fd-494b-8eb3-fd178d3180be\") " pod="openshift-console-operator/console-operator-58897d9998-bqm6j"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.684758 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83bd6935-d7c7-4548-add9-647cdfcd15ee-config\") pod \"kube-controller-manager-operator-78b949d7b-xjxt2\" (UID: \"83bd6935-d7c7-4548-add9-647cdfcd15ee\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xjxt2"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.685129 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/93cce0d5-b158-4619-95f3-eaa50b8d9af4-secret-volume\") pod \"collect-profiles-29332125-6dxj9\" (UID: \"93cce0d5-b158-4619-95f3-eaa50b8d9af4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332125-6dxj9"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.690856 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68f7f9bf-73a9-4109-a07c-e026b3d11513-config\") pod \"service-ca-operator-777779d784-fxkkf\" (UID: \"68f7f9bf-73a9-4109-a07c-e026b3d11513\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fxkkf"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.691643 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/fdf78d81-ae87-48c8-b189-eff27030dd91-profile-collector-cert\") pod \"catalog-operator-68c6474976-bw544\" (UID: \"fdf78d81-ae87-48c8-b189-eff27030dd91\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bw544"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.691684 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b1f96523-d3ab-4768-8de6-c901e255ae74-registry-certificates\") pod \"image-registry-697d97f7c8-vgv9c\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.692231 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/477a3236-05fd-494b-8eb3-fd178d3180be-serving-cert\") pod \"console-operator-58897d9998-bqm6j\" (UID: \"477a3236-05fd-494b-8eb3-fd178d3180be\") " pod="openshift-console-operator/console-operator-58897d9998-bqm6j"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.692503 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e100c510-2957-4946-b21d-e9227e32d7b0-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-dc7kk\" (UID: \"e100c510-2957-4946-b21d-e9227e32d7b0\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-dc7kk"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.692904 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68f7f9bf-73a9-4109-a07c-e026b3d11513-serving-cert\") pod \"service-ca-operator-777779d784-fxkkf\" (UID: \"68f7f9bf-73a9-4109-a07c-e026b3d11513\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fxkkf"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.694110 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d316a645-bd82-4695-a79d-d31896b704eb-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-gcn29\" (UID: \"d316a645-bd82-4695-a79d-d31896b704eb\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gcn29"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.697233 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73926bea-3da3-4e00-a78c-b0c33b4b1919-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-6vdgs\" (UID: \"73926bea-3da3-4e00-a78c-b0c33b4b1919\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6vdgs"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.698115 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61708c63-4a5a-4e83-8243-02025b05df64-serving-cert\") pod \"openshift-config-operator-7777fb866f-jnr7v\" (UID: \"61708c63-4a5a-4e83-8243-02025b05df64\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jnr7v"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.709106 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wd7cr" event={"ID":"a85dc80e-3e52-4eb0-a5c9-9b3c4d6e84d5","Type":"ContainerStarted","Data":"3cdf028af8a6fddfb6fcd614dee4e3e7a7c7e74334fea9e6c3046da54538f67e"}
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.709634 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c787ebd-7a0e-40ae-b537-e09ae9f12578-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-djbhr\" (UID: \"9c787ebd-7a0e-40ae-b537-e09ae9f12578\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-djbhr"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.710922 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b1f96523-d3ab-4768-8de6-c901e255ae74-trusted-ca\") pod \"image-registry-697d97f7c8-vgv9c\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.710963 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83bd6935-d7c7-4548-add9-647cdfcd15ee-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-xjxt2\" (UID: \"83bd6935-d7c7-4548-add9-647cdfcd15ee\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xjxt2"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.711318 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/543b04b8-1dda-487e-9382-ba18d7be25e5-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-7v5q4\" (UID: \"543b04b8-1dda-487e-9382-ba18d7be25e5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7v5q4"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.711366 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b1f96523-d3ab-4768-8de6-c901e255ae74-registry-tls\") pod \"image-registry-697d97f7c8-vgv9c\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.712646 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-z6f2f" event={"ID":"0a1a9703-af76-4f95-bb2a-aa4ac6ba6fd8","Type":"ContainerStarted","Data":"8b55779b6b4012abd4db3d3531faa01ea5ff6ac226a4432d451f46480b99cfa9"}
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.713917 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/fdf78d81-ae87-48c8-b189-eff27030dd91-srv-cert\") pod \"catalog-operator-68c6474976-bw544\" (UID: \"fdf78d81-ae87-48c8-b189-eff27030dd91\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bw544"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.716415 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-bvv9p"]
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.716677 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/bb722212-8f18-46b9-b5b5-484352c49bba-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-zmvkr\" (UID: \"bb722212-8f18-46b9-b5b5-484352c49bba\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zmvkr"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.719074 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-nnp5j" event={"ID":"ecbba0bb-2230-4624-8185-5d116632343f","Type":"ContainerStarted","Data":"62b5627c183bc7d9fe412f98b9322cb565e9b1766cdb8bbbb368d29d09f30233"}
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.719119 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-nnp5j" event={"ID":"ecbba0bb-2230-4624-8185-5d116632343f","Type":"ContainerStarted","Data":"dd61ad4d430fee166999d1361e0bd2ee2bda1339e6bdde16ca238a95cc1d9548"}
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.721716 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gq4dv" event={"ID":"25df5801-9a21-4396-8613-abddc02daf42","Type":"ContainerStarted","Data":"f196f38c5f4f14c53ae67cb6a451d884c45fbe4041aa139d2f8478c4e72291e1"}
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.725426 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-kkwtk" event={"ID":"c4031192-8128-46a2-a50b-e6cb49e95b9b","Type":"ContainerStarted","Data":"74199ca3de5de003dfe28eed2f3460ea9cef259d8ebee62369b8921bf28dafac"}
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.725667 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-kkwtk" event={"ID":"c4031192-8128-46a2-a50b-e6cb49e95b9b","Type":"ContainerStarted","Data":"846f45fc3b204b014a2532f84e08abae311b222a5f274ca781af0ff660cd16d2"}
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.726951 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b266\" (UniqueName: \"kubernetes.io/projected/68f7f9bf-73a9-4109-a07c-e026b3d11513-kube-api-access-7b266\") pod \"service-ca-operator-777779d784-fxkkf\" (UID: \"68f7f9bf-73a9-4109-a07c-e026b3d11513\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fxkkf"
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.729417 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wcmct" event={"ID":"e2a1fa01-a8d6-421b-b66c-626360051888","Type":"ContainerStarted","Data":"6311e508d86805b825e8315f323f2d4245cc0f0ee9561e7953927bdcc3e0a00b"}
Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.740423 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rvks\" (UniqueName: \"kubernetes.io/projected/477a3236-05fd-494b-8eb3-fd178d3180be-kube-api-access-7rvks\")
pod \"console-operator-58897d9998-bqm6j\" (UID: \"477a3236-05fd-494b-8eb3-fd178d3180be\") " pod="openshift-console-operator/console-operator-58897d9998-bqm6j" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.743949 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwm6x\" (UniqueName: \"kubernetes.io/projected/d316a645-bd82-4695-a79d-d31896b704eb-kube-api-access-gwm6x\") pod \"machine-config-controller-84d6567774-gcn29\" (UID: \"d316a645-bd82-4695-a79d-d31896b704eb\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gcn29" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.761973 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/87ee9fbe-46aa-4075-b289-ef083cea461b-plugins-dir\") pod \"csi-hostpathplugin-4drk9\" (UID: \"87ee9fbe-46aa-4075-b289-ef083cea461b\") " pod="hostpath-provisioner/csi-hostpathplugin-4drk9" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.762037 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/87ee9fbe-46aa-4075-b289-ef083cea461b-registration-dir\") pod \"csi-hostpathplugin-4drk9\" (UID: \"87ee9fbe-46aa-4075-b289-ef083cea461b\") " pod="hostpath-provisioner/csi-hostpathplugin-4drk9" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.762065 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0b7e3bce-3aca-4e94-8d98-11009fc0ab3e-metrics-tls\") pod \"dns-default-csnmr\" (UID: \"0b7e3bce-3aca-4e94-8d98-11009fc0ab3e\") " pod="openshift-dns/dns-default-csnmr" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.762103 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgv9c\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.762148 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/87ee9fbe-46aa-4075-b289-ef083cea461b-mountpoint-dir\") pod \"csi-hostpathplugin-4drk9\" (UID: \"87ee9fbe-46aa-4075-b289-ef083cea461b\") " pod="hostpath-provisioner/csi-hostpathplugin-4drk9" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.762179 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b7e3bce-3aca-4e94-8d98-11009fc0ab3e-config-volume\") pod \"dns-default-csnmr\" (UID: \"0b7e3bce-3aca-4e94-8d98-11009fc0ab3e\") " pod="openshift-dns/dns-default-csnmr" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.762191 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/87ee9fbe-46aa-4075-b289-ef083cea461b-plugins-dir\") pod \"csi-hostpathplugin-4drk9\" (UID: \"87ee9fbe-46aa-4075-b289-ef083cea461b\") " pod="hostpath-provisioner/csi-hostpathplugin-4drk9" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.762207 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/2bf3f935-07ed-496e-8426-da64835d0c65-certs\") pod \"machine-config-server-7nmkc\" (UID: \"2bf3f935-07ed-496e-8426-da64835d0c65\") " pod="openshift-machine-config-operator/machine-config-server-7nmkc" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.762266 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/87ee9fbe-46aa-4075-b289-ef083cea461b-socket-dir\") pod \"csi-hostpathplugin-4drk9\" (UID: \"87ee9fbe-46aa-4075-b289-ef083cea461b\") " pod="hostpath-provisioner/csi-hostpathplugin-4drk9" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.762281 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/87ee9fbe-46aa-4075-b289-ef083cea461b-csi-data-dir\") pod \"csi-hostpathplugin-4drk9\" (UID: \"87ee9fbe-46aa-4075-b289-ef083cea461b\") " pod="hostpath-provisioner/csi-hostpathplugin-4drk9" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.762297 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/87ee9fbe-46aa-4075-b289-ef083cea461b-registration-dir\") pod \"csi-hostpathplugin-4drk9\" (UID: \"87ee9fbe-46aa-4075-b289-ef083cea461b\") " pod="hostpath-provisioner/csi-hostpathplugin-4drk9" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.762308 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m25j7\" (UniqueName: \"kubernetes.io/projected/0b7e3bce-3aca-4e94-8d98-11009fc0ab3e-kube-api-access-m25j7\") pod \"dns-default-csnmr\" (UID: \"0b7e3bce-3aca-4e94-8d98-11009fc0ab3e\") " pod="openshift-dns/dns-default-csnmr" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.762381 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e492bc45-cd6d-4298-b279-0c5f82f91e8c-cert\") pod \"ingress-canary-lshv7\" (UID: \"e492bc45-cd6d-4298-b279-0c5f82f91e8c\") " pod="openshift-ingress-canary/ingress-canary-lshv7" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.762419 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: 
\"kubernetes.io/secret/2bf3f935-07ed-496e-8426-da64835d0c65-node-bootstrap-token\") pod \"machine-config-server-7nmkc\" (UID: \"2bf3f935-07ed-496e-8426-da64835d0c65\") " pod="openshift-machine-config-operator/machine-config-server-7nmkc" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.762448 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4w4lv\" (UniqueName: \"kubernetes.io/projected/e492bc45-cd6d-4298-b279-0c5f82f91e8c-kube-api-access-4w4lv\") pod \"ingress-canary-lshv7\" (UID: \"e492bc45-cd6d-4298-b279-0c5f82f91e8c\") " pod="openshift-ingress-canary/ingress-canary-lshv7" Oct 08 12:58:12 crc kubenswrapper[4735]: E1008 12:58:12.762505 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 12:58:13.262470913 +0000 UTC m=+143.070775537 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgv9c" (UID: "b1f96523-d3ab-4768-8de6-c901e255ae74") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.762549 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/87ee9fbe-46aa-4075-b289-ef083cea461b-mountpoint-dir\") pod \"csi-hostpathplugin-4drk9\" (UID: \"87ee9fbe-46aa-4075-b289-ef083cea461b\") " pod="hostpath-provisioner/csi-hostpathplugin-4drk9" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.762656 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-spc92\" (UniqueName: \"kubernetes.io/projected/87ee9fbe-46aa-4075-b289-ef083cea461b-kube-api-access-spc92\") pod \"csi-hostpathplugin-4drk9\" (UID: \"87ee9fbe-46aa-4075-b289-ef083cea461b\") " pod="hostpath-provisioner/csi-hostpathplugin-4drk9" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.762674 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpb8f\" (UniqueName: \"kubernetes.io/projected/2bf3f935-07ed-496e-8426-da64835d0c65-kube-api-access-gpb8f\") pod \"machine-config-server-7nmkc\" (UID: \"2bf3f935-07ed-496e-8426-da64835d0c65\") " pod="openshift-machine-config-operator/machine-config-server-7nmkc" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.763499 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b7e3bce-3aca-4e94-8d98-11009fc0ab3e-config-volume\") pod \"dns-default-csnmr\" (UID: \"0b7e3bce-3aca-4e94-8d98-11009fc0ab3e\") " pod="openshift-dns/dns-default-csnmr" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.764921 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/87ee9fbe-46aa-4075-b289-ef083cea461b-socket-dir\") pod \"csi-hostpathplugin-4drk9\" (UID: \"87ee9fbe-46aa-4075-b289-ef083cea461b\") " pod="hostpath-provisioner/csi-hostpathplugin-4drk9" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.764956 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/87ee9fbe-46aa-4075-b289-ef083cea461b-csi-data-dir\") pod \"csi-hostpathplugin-4drk9\" (UID: \"87ee9fbe-46aa-4075-b289-ef083cea461b\") " pod="hostpath-provisioner/csi-hostpathplugin-4drk9" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.766568 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: 
\"kubernetes.io/secret/2bf3f935-07ed-496e-8426-da64835d0c65-node-bootstrap-token\") pod \"machine-config-server-7nmkc\" (UID: \"2bf3f935-07ed-496e-8426-da64835d0c65\") " pod="openshift-machine-config-operator/machine-config-server-7nmkc" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.768869 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b1f96523-d3ab-4768-8de6-c901e255ae74-bound-sa-token\") pod \"image-registry-697d97f7c8-vgv9c\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.778218 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0b7e3bce-3aca-4e94-8d98-11009fc0ab3e-metrics-tls\") pod \"dns-default-csnmr\" (UID: \"0b7e3bce-3aca-4e94-8d98-11009fc0ab3e\") " pod="openshift-dns/dns-default-csnmr" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.778412 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e492bc45-cd6d-4298-b279-0c5f82f91e8c-cert\") pod \"ingress-canary-lshv7\" (UID: \"e492bc45-cd6d-4298-b279-0c5f82f91e8c\") " pod="openshift-ingress-canary/ingress-canary-lshv7" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.778860 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/2bf3f935-07ed-496e-8426-da64835d0c65-certs\") pod \"machine-config-server-7nmkc\" (UID: \"2bf3f935-07ed-496e-8426-da64835d0c65\") " pod="openshift-machine-config-operator/machine-config-server-7nmkc" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.781510 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dl48c\" (UniqueName: \"kubernetes.io/projected/e100c510-2957-4946-b21d-e9227e32d7b0-kube-api-access-dl48c\") pod 
\"multus-admission-controller-857f4d67dd-dc7kk\" (UID: \"e100c510-2957-4946-b21d-e9227e32d7b0\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-dc7kk" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.797733 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqwdb\" (UniqueName: \"kubernetes.io/projected/fdf78d81-ae87-48c8-b189-eff27030dd91-kube-api-access-fqwdb\") pod \"catalog-operator-68c6474976-bw544\" (UID: \"fdf78d81-ae87-48c8-b189-eff27030dd91\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bw544" Oct 08 12:58:12 crc kubenswrapper[4735]: W1008 12:58:12.801013 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebf462d1_8994_47ec_b534_e6dc8696bbd0.slice/crio-91d27a063fea1ecbeb3aeb49790eeb923f514198d0d84c3aeea8eaa418ddbfed WatchSource:0}: Error finding container 91d27a063fea1ecbeb3aeb49790eeb923f514198d0d84c3aeea8eaa418ddbfed: Status 404 returned error can't find the container with id 91d27a063fea1ecbeb3aeb49790eeb923f514198d0d84c3aeea8eaa418ddbfed Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.820562 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8hhf\" (UniqueName: \"kubernetes.io/projected/543b04b8-1dda-487e-9382-ba18d7be25e5-kube-api-access-k8hhf\") pod \"cluster-samples-operator-665b6dd947-7v5q4\" (UID: \"543b04b8-1dda-487e-9382-ba18d7be25e5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7v5q4" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.841303 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfcbk\" (UniqueName: \"kubernetes.io/projected/bb722212-8f18-46b9-b5b5-484352c49bba-kube-api-access-lfcbk\") pod \"control-plane-machine-set-operator-78cbb6b69f-zmvkr\" (UID: \"bb722212-8f18-46b9-b5b5-484352c49bba\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zmvkr" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.854130 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gcn29" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.860122 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/83bd6935-d7c7-4548-add9-647cdfcd15ee-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-xjxt2\" (UID: \"83bd6935-d7c7-4548-add9-647cdfcd15ee\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xjxt2" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.862384 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zmvkr" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.863005 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 12:58:12 crc kubenswrapper[4735]: E1008 12:58:12.863998 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 12:58:13.363984375 +0000 UTC m=+143.172288919 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.885422 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7v5q4" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.889213 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9c787ebd-7a0e-40ae-b537-e09ae9f12578-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-djbhr\" (UID: \"9c787ebd-7a0e-40ae-b537-e09ae9f12578\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-djbhr" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.915404 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szl6r\" (UniqueName: \"kubernetes.io/projected/73926bea-3da3-4e00-a78c-b0c33b4b1919-kube-api-access-szl6r\") pod \"kube-storage-version-migrator-operator-b67b599dd-6vdgs\" (UID: \"73926bea-3da3-4e00-a78c-b0c33b4b1919\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6vdgs" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.923561 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6vdgs" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.925624 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwdgv\" (UniqueName: \"kubernetes.io/projected/61708c63-4a5a-4e83-8243-02025b05df64-kube-api-access-hwdgv\") pod \"openshift-config-operator-7777fb866f-jnr7v\" (UID: \"61708c63-4a5a-4e83-8243-02025b05df64\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jnr7v" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.933083 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xjxt2" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.936545 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-dc7kk" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.939533 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxmjk\" (UniqueName: \"kubernetes.io/projected/b1f96523-d3ab-4768-8de6-c901e255ae74-kube-api-access-hxmjk\") pod \"image-registry-697d97f7c8-vgv9c\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.956512 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bw544" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.964157 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgv9c\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c" Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.964242 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rbts\" (UniqueName: \"kubernetes.io/projected/14f021da-25d5-4a75-82f3-b6c20c65dc7c-kube-api-access-2rbts\") pod \"downloads-7954f5f757-887mh\" (UID: \"14f021da-25d5-4a75-82f3-b6c20c65dc7c\") " pod="openshift-console/downloads-7954f5f757-887mh" Oct 08 12:58:12 crc kubenswrapper[4735]: E1008 12:58:12.964466 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 12:58:13.46445212 +0000 UTC m=+143.272756714 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgv9c" (UID: "b1f96523-d3ab-4768-8de6-c901e255ae74") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.980317 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-xkrqh"] Oct 08 12:58:12 crc kubenswrapper[4735]: I1008 12:58:12.989869 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-fxkkf" Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.006053 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9rdp\" (UniqueName: \"kubernetes.io/projected/93cce0d5-b158-4619-95f3-eaa50b8d9af4-kube-api-access-r9rdp\") pod \"collect-profiles-29332125-6dxj9\" (UID: \"93cce0d5-b158-4619-95f3-eaa50b8d9af4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332125-6dxj9" Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.006548 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-bqm6j" Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.020224 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m25j7\" (UniqueName: \"kubernetes.io/projected/0b7e3bce-3aca-4e94-8d98-11009fc0ab3e-kube-api-access-m25j7\") pod \"dns-default-csnmr\" (UID: \"0b7e3bce-3aca-4e94-8d98-11009fc0ab3e\") " pod="openshift-dns/dns-default-csnmr" Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.038396 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpb8f\" (UniqueName: \"kubernetes.io/projected/2bf3f935-07ed-496e-8426-da64835d0c65-kube-api-access-gpb8f\") pod \"machine-config-server-7nmkc\" (UID: \"2bf3f935-07ed-496e-8426-da64835d0c65\") " pod="openshift-machine-config-operator/machine-config-server-7nmkc" Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.049933 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-djbhr" Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.065003 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 12:58:13 crc kubenswrapper[4735]: E1008 12:58:13.065324 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 12:58:13.565308105 +0000 UTC m=+143.373612649 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.067114 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4w4lv\" (UniqueName: \"kubernetes.io/projected/e492bc45-cd6d-4298-b279-0c5f82f91e8c-kube-api-access-4w4lv\") pod \"ingress-canary-lshv7\" (UID: \"e492bc45-cd6d-4298-b279-0c5f82f91e8c\") " pod="openshift-ingress-canary/ingress-canary-lshv7" Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.080660 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-csnmr" Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.083596 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spc92\" (UniqueName: \"kubernetes.io/projected/87ee9fbe-46aa-4075-b289-ef083cea461b-kube-api-access-spc92\") pod \"csi-hostpathplugin-4drk9\" (UID: \"87ee9fbe-46aa-4075-b289-ef083cea461b\") " pod="hostpath-provisioner/csi-hostpathplugin-4drk9" Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.091839 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-4drk9" Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.099788 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jnr7v" Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.108172 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-lshv7" Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.114507 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-7nmkc" Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.146696 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gp925"] Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.152155 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-8rqp7"] Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.164965 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-p5f5l"] Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.166305 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgv9c\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c" Oct 08 12:58:13 crc kubenswrapper[4735]: E1008 12:58:13.166666 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 12:58:13.666654902 +0000 UTC m=+143.474959446 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgv9c" (UID: "b1f96523-d3ab-4768-8de6-c901e255ae74") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:13 crc kubenswrapper[4735]: W1008 12:58:13.189147 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd169069b_e5fe_4018_974d_97cf1c1cad23.slice/crio-daf0b66a40ab41c82388a090a16eb5de0ff0011b36534ed88f5d3b119e67464b WatchSource:0}: Error finding container daf0b66a40ab41c82388a090a16eb5de0ff0011b36534ed88f5d3b119e67464b: Status 404 returned error can't find the container with id daf0b66a40ab41c82388a090a16eb5de0ff0011b36534ed88f5d3b119e67464b Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.197189 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfmds"] Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.264243 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-887mh" Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.268081 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 12:58:13 crc kubenswrapper[4735]: E1008 12:58:13.268464 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 12:58:13.768449341 +0000 UTC m=+143.576753875 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:13 crc kubenswrapper[4735]: W1008 12:58:13.272978 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec35d298_1364_4b8b_9a47_70fa26b2aa64.slice/crio-88c70020ebe39b94e4a1b2693cb602e90ba3be443258ed55ff27d8216adc9993 WatchSource:0}: Error finding container 88c70020ebe39b94e4a1b2693cb602e90ba3be443258ed55ff27d8216adc9993: Status 404 returned error can't find the container with id 88c70020ebe39b94e4a1b2693cb602e90ba3be443258ed55ff27d8216adc9993 Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.298168 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332125-6dxj9" Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.318207 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-zxtmj"] Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.341199 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-xqfrr"] Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.371194 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgv9c\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c" Oct 08 12:58:13 crc kubenswrapper[4735]: E1008 12:58:13.371498 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 12:58:13.871473592 +0000 UTC m=+143.679778136 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgv9c" (UID: "b1f96523-d3ab-4768-8de6-c901e255ae74") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.426447 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-cnm5q"] Oct 08 12:58:13 crc kubenswrapper[4735]: W1008 12:58:13.469644 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc881e3aa_6bfe_476c_9e63_edfa5480745d.slice/crio-27981e0f9b6ee8589dd5c41cd569e9cb96adb9b1b194c14529ea6d1a71e81f3f WatchSource:0}: Error finding container 27981e0f9b6ee8589dd5c41cd569e9cb96adb9b1b194c14529ea6d1a71e81f3f: Status 404 returned error can't find the container with id 27981e0f9b6ee8589dd5c41cd569e9cb96adb9b1b194c14529ea6d1a71e81f3f Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.474832 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 12:58:13 crc kubenswrapper[4735]: E1008 12:58:13.475880 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 12:58:13.975853807 +0000 UTC m=+143.784158351 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.556597 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-c4j7s"] Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.573322 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-gcn29"] Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.575868 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qjb22"] Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.576774 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgv9c\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c" Oct 08 12:58:13 crc kubenswrapper[4735]: E1008 12:58:13.585884 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 12:58:14.085856717 +0000 UTC m=+143.894161261 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgv9c" (UID: "b1f96523-d3ab-4768-8de6-c901e255ae74") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.602584 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-75k9v"] Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.606548 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-t8cv2"] Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.608774 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rgtx4"] Oct 08 12:58:13 crc kubenswrapper[4735]: W1008 12:58:13.667372 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode60e2365_ae6c_44a2_a7a3_a0c67f0d756a.slice/crio-d392edab53f9406478486105322686898ec207f9d067dd113167dab4c5ff42ef WatchSource:0}: Error finding container d392edab53f9406478486105322686898ec207f9d067dd113167dab4c5ff42ef: Status 404 returned error can't find the container with id d392edab53f9406478486105322686898ec207f9d067dd113167dab4c5ff42ef Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.682243 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 12:58:13 crc 
kubenswrapper[4735]: E1008 12:58:13.685651 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 12:58:14.185615454 +0000 UTC m=+143.993919998 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:13 crc kubenswrapper[4735]: W1008 12:58:13.697051 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd24dc2f4_b906_49c0_be1e_f2326ef12664.slice/crio-7efddad665a72afb2dbf9d6f6aeac3311ae4e7af81e92938e4241faf7b39248d WatchSource:0}: Error finding container 7efddad665a72afb2dbf9d6f6aeac3311ae4e7af81e92938e4241faf7b39248d: Status 404 returned error can't find the container with id 7efddad665a72afb2dbf9d6f6aeac3311ae4e7af81e92938e4241faf7b39248d Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.784315 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgv9c\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c" Oct 08 12:58:13 crc kubenswrapper[4735]: E1008 12:58:13.784666 4735 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 12:58:14.284653582 +0000 UTC m=+144.092958126 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgv9c" (UID: "b1f96523-d3ab-4768-8de6-c901e255ae74") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.785603 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wd7cr" event={"ID":"a85dc80e-3e52-4eb0-a5c9-9b3c4d6e84d5","Type":"ContainerStarted","Data":"9c7a8854009eb1b7779ce0893e7a45f0cfc5777132ce8d8924d5c8fbe2f2fa8c"} Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.795842 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-z6f2f" event={"ID":"0a1a9703-af76-4f95-bb2a-aa4ac6ba6fd8","Type":"ContainerStarted","Data":"7d99208bd9e35c30074e950afe379f700a5cab426e1714e868eab09874df5f21"} Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.822044 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qjb22" event={"ID":"97f188e2-ee83-4fb4-b334-33c3099df3f2","Type":"ContainerStarted","Data":"96a0a326a8cde269d30ca984e3b7215f3aaac2670d49ebd4a7aef72b9390e8a7"} Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.826394 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lrcx8" 
event={"ID":"e4c4252c-e8f3-411c-8d61-9ed06ae9a5de","Type":"ContainerStarted","Data":"de0f4bf36895e86fb432532a179659cdaff419948dbe2c45c21a84003182bf12"} Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.826441 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lrcx8" event={"ID":"e4c4252c-e8f3-411c-8d61-9ed06ae9a5de","Type":"ContainerStarted","Data":"5517defe693638f633fb8bd8a308e6d9d0045babf7be21e926ac98b6fd682881"} Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.828001 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-zxtmj" event={"ID":"c881e3aa-6bfe-476c-9e63-edfa5480745d","Type":"ContainerStarted","Data":"27981e0f9b6ee8589dd5c41cd569e9cb96adb9b1b194c14529ea6d1a71e81f3f"} Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.831061 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-7nmkc" event={"ID":"2bf3f935-07ed-496e-8426-da64835d0c65","Type":"ContainerStarted","Data":"96ca5a09784e7964eb268a15512d8488a70c14b5154cc0aaf08c7a247a93d4a5"} Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.832789 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gp925" event={"ID":"d169069b-e5fe-4018-974d-97cf1c1cad23","Type":"ContainerStarted","Data":"daf0b66a40ab41c82388a090a16eb5de0ff0011b36534ed88f5d3b119e67464b"} Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.835788 4735 generic.go:334] "Generic (PLEG): container finished" podID="25df5801-9a21-4396-8613-abddc02daf42" containerID="771b00a5bb54bc597f5c91b3d95c1f0459bdcb849a86e94c9a1cb3ea9fdc8645" exitCode=0 Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.836339 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gq4dv" 
event={"ID":"25df5801-9a21-4396-8613-abddc02daf42","Type":"ContainerDied","Data":"771b00a5bb54bc597f5c91b3d95c1f0459bdcb849a86e94c9a1cb3ea9fdc8645"} Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.844933 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wcmct" event={"ID":"e2a1fa01-a8d6-421b-b66c-626360051888","Type":"ContainerStarted","Data":"0875e803f68fd0d642c01d9428d593570428ea451a3a04e2d3a91d31f710b4c5"} Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.845408 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wcmct" Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.846748 4735 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-wcmct container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.846783 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wcmct" podUID="e2a1fa01-a8d6-421b-b66c-626360051888" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.847190 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-jgkq6" event={"ID":"c4f3bfe2-c8ee-4ac0-915d-2be166ccbb6f","Type":"ContainerStarted","Data":"a7c4545c3cb3e94b818fc95bc380f857ce54fed0949849e8ed683d77d6ed3580"} Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.847214 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-jgkq6" 
event={"ID":"c4f3bfe2-c8ee-4ac0-915d-2be166ccbb6f","Type":"ContainerStarted","Data":"f48a4028fae1dc0243c1032c27e904324a20705951a9876f941b7214fe8226a2"} Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.853233 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-xqfrr" event={"ID":"6e891ada-e4a4-4e15-8ade-48d0f40518da","Type":"ContainerStarted","Data":"f3c83d66d26fcbff5e4faeb9c0129a6df409b19c18df7cca4d2d62b4b0e47fde"} Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.857638 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-p5f5l" event={"ID":"ec35d298-1364-4b8b-9a47-70fa26b2aa64","Type":"ContainerStarted","Data":"88c70020ebe39b94e4a1b2693cb602e90ba3be443258ed55ff27d8216adc9993"} Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.858703 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-75k9v" event={"ID":"d24dc2f4-b906-49c0-be1e-f2326ef12664","Type":"ContainerStarted","Data":"7efddad665a72afb2dbf9d6f6aeac3311ae4e7af81e92938e4241faf7b39248d"} Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.859576 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t8cv2" event={"ID":"79d2bd38-a2b9-4594-89bf-83c365f75f40","Type":"ContainerStarted","Data":"d40933ce52041334f203623b68588b55ec7c28e00547d7d10077e8ee6c65b36b"} Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.860306 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rgtx4" event={"ID":"49e6e79f-93b3-453f-83d1-2afc40a4637c","Type":"ContainerStarted","Data":"c416ddbd68459bea0c1a5952177059ec088806ee43e6f1eefa8bd94d8e1c04f3"} Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.861696 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfmds" event={"ID":"d95c4bbd-337f-454c-a57f-c47db8bc6ecb","Type":"ContainerStarted","Data":"7997d42133144d9367ad09a804f8e3a3dd43b6ed4fa4f3acee3666ada0e80863"} Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.862948 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-cnm5q" event={"ID":"58d4a0d7-f90f-497b-8744-dac2ddd08868","Type":"ContainerStarted","Data":"31944be8b501998959b997199cac08ef1d7f1c06568f6e93f5b872bbc7c19367"} Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.867341 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-bvv9p" event={"ID":"ebf462d1-8994-47ec-b534-e6dc8696bbd0","Type":"ContainerStarted","Data":"5912aa4a9e0305e7ed0bd94689f70032beb1f1c6b8951fed129114437852dcf0"} Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.867367 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-bvv9p" event={"ID":"ebf462d1-8994-47ec-b534-e6dc8696bbd0","Type":"ContainerStarted","Data":"91d27a063fea1ecbeb3aeb49790eeb923f514198d0d84c3aeea8eaa418ddbfed"} Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.870833 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c4j7s" event={"ID":"e60e2365-ae6c-44a2-a7a3-a0c67f0d756a","Type":"ContainerStarted","Data":"d392edab53f9406478486105322686898ec207f9d067dd113167dab4c5ff42ef"} Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.874834 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-xkrqh" event={"ID":"99871c37-c1b7-48c4-bdec-d01234433109","Type":"ContainerStarted","Data":"b4dcc48c48d93e91509c3480e16891dcca93f27252d232c0b27d6f8c241f321d"} Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.876320 4735 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-8rqp7" event={"ID":"15a08015-bcc0-4576-8f54-70e5693469a4","Type":"ContainerStarted","Data":"23d4121b6bd6638ba3fe161ed79d7545652bbec07e722d15a16b1b8fb4f83c37"} Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.885891 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 12:58:13 crc kubenswrapper[4735]: E1008 12:58:13.886267 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 12:58:14.386243256 +0000 UTC m=+144.194547800 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.894040 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sjbv7" event={"ID":"f84042bd-cb46-444d-a3c9-ce1519dea75b","Type":"ContainerStarted","Data":"caaa0cf3be52f0f8e46f63dfc1cdbdfe5fb9b60d69ae11cefae28ea9b90bc68f"} Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.894080 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sjbv7" event={"ID":"f84042bd-cb46-444d-a3c9-ce1519dea75b","Type":"ContainerStarted","Data":"12c4eda222c72e08efefb9a3f5bf7504f6a376e66d8687364761915374a1a24c"} Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.974156 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bmrzp"] Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.977273 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-fxkkf"] Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.980274 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-bqm6j"] Oct 08 12:58:13 crc kubenswrapper[4735]: I1008 12:58:13.990368 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgv9c\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c" Oct 08 12:58:13 crc kubenswrapper[4735]: E1008 12:58:13.994179 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 12:58:14.494160003 +0000 UTC m=+144.302464547 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgv9c" (UID: "b1f96523-d3ab-4768-8de6-c901e255ae74") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:14 crc kubenswrapper[4735]: W1008 12:58:14.000425 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68f7f9bf_73a9_4109_a07c_e026b3d11513.slice/crio-86ceb7a725816f4beb908544dec8cc0c5cbe42340a1c93eec5b291a65a637a97 WatchSource:0}: Error finding container 86ceb7a725816f4beb908544dec8cc0c5cbe42340a1c93eec5b291a65a637a97: Status 404 returned error can't find the container with id 86ceb7a725816f4beb908544dec8cc0c5cbe42340a1c93eec5b291a65a637a97 Oct 08 12:58:14 crc kubenswrapper[4735]: I1008 12:58:14.065688 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6vdgs"] Oct 08 12:58:14 crc kubenswrapper[4735]: I1008 12:58:14.095387 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 12:58:14 crc kubenswrapper[4735]: E1008 12:58:14.095654 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 12:58:14.595632343 +0000 UTC m=+144.403936887 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:14 crc kubenswrapper[4735]: I1008 12:58:14.095805 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgv9c\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c" Oct 08 12:58:14 crc kubenswrapper[4735]: E1008 12:58:14.096075 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 12:58:14.596062284 +0000 UTC m=+144.404366828 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgv9c" (UID: "b1f96523-d3ab-4768-8de6-c901e255ae74") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:14 crc kubenswrapper[4735]: I1008 12:58:14.150959 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-lshv7"] Oct 08 12:58:14 crc kubenswrapper[4735]: I1008 12:58:14.169900 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zmvkr"] Oct 08 12:58:14 crc kubenswrapper[4735]: I1008 12:58:14.189632 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xjxt2"] Oct 08 12:58:14 crc kubenswrapper[4735]: I1008 12:58:14.189702 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-jnr7v"] Oct 08 12:58:14 crc kubenswrapper[4735]: W1008 12:58:14.199412 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod477a3236_05fd_494b_8eb3_fd178d3180be.slice/crio-aad4f93dcaeac93404012eee115f06e393c368a4d27f92481cd57224fd0ba5dd WatchSource:0}: Error finding container aad4f93dcaeac93404012eee115f06e393c368a4d27f92481cd57224fd0ba5dd: Status 404 returned error can't find the container with id aad4f93dcaeac93404012eee115f06e393c368a4d27f92481cd57224fd0ba5dd Oct 08 12:58:14 crc kubenswrapper[4735]: I1008 12:58:14.199456 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-djbhr"] Oct 08 
12:58:14 crc kubenswrapper[4735]: I1008 12:58:14.199946 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 12:58:14 crc kubenswrapper[4735]: E1008 12:58:14.200213 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 12:58:14.700201184 +0000 UTC m=+144.508505718 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:14 crc kubenswrapper[4735]: I1008 12:58:14.223546 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bw544"] Oct 08 12:58:14 crc kubenswrapper[4735]: W1008 12:58:14.250717 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73926bea_3da3_4e00_a78c_b0c33b4b1919.slice/crio-f3b83b75f02cb9a2b0f9759cf83c98a225b44ea0f7192a7d64023fb3bcad8f7e WatchSource:0}: Error finding container f3b83b75f02cb9a2b0f9759cf83c98a225b44ea0f7192a7d64023fb3bcad8f7e: Status 404 returned error can't find the container with id f3b83b75f02cb9a2b0f9759cf83c98a225b44ea0f7192a7d64023fb3bcad8f7e Oct 08 12:58:14 crc 
kubenswrapper[4735]: I1008 12:58:14.278075 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-4drk9"]
Oct 08 12:58:14 crc kubenswrapper[4735]: I1008 12:58:14.280409 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-887mh"]
Oct 08 12:58:14 crc kubenswrapper[4735]: I1008 12:58:14.282257 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7v5q4"]
Oct 08 12:58:14 crc kubenswrapper[4735]: I1008 12:58:14.285648 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-dc7kk"]
Oct 08 12:58:14 crc kubenswrapper[4735]: I1008 12:58:14.289171 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-csnmr"]
Oct 08 12:58:14 crc kubenswrapper[4735]: W1008 12:58:14.295892 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfdf78d81_ae87_48c8_b189_eff27030dd91.slice/crio-47c21c80848762dbd03a8932502cbf0c44afc1cffee6ae64c32728593f0c13e9 WatchSource:0}: Error finding container 47c21c80848762dbd03a8932502cbf0c44afc1cffee6ae64c32728593f0c13e9: Status 404 returned error can't find the container with id 47c21c80848762dbd03a8932502cbf0c44afc1cffee6ae64c32728593f0c13e9
Oct 08 12:58:14 crc kubenswrapper[4735]: W1008 12:58:14.298104 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c787ebd_7a0e_40ae_b537_e09ae9f12578.slice/crio-5b1631b64e06a0b739150b05512242908596867705fad418204d22cda7717b01 WatchSource:0}: Error finding container 5b1631b64e06a0b739150b05512242908596867705fad418204d22cda7717b01: Status 404 returned error can't find the container with id 5b1631b64e06a0b739150b05512242908596867705fad418204d22cda7717b01
Oct 08 12:58:14 crc kubenswrapper[4735]: I1008 12:58:14.305991 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgv9c\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c"
Oct 08 12:58:14 crc kubenswrapper[4735]: E1008 12:58:14.306298 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 12:58:14.806278843 +0000 UTC m=+144.614583377 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgv9c" (UID: "b1f96523-d3ab-4768-8de6-c901e255ae74") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 12:58:14 crc kubenswrapper[4735]: I1008 12:58:14.312964 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332125-6dxj9"]
Oct 08 12:58:14 crc kubenswrapper[4735]: I1008 12:58:14.337921 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-nnp5j" podStartSLOduration=123.337905617 podStartE2EDuration="2m3.337905617s" podCreationTimestamp="2025-10-08 12:56:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 12:58:14.337399774 +0000 UTC m=+144.145704338" watchObservedRunningTime="2025-10-08 12:58:14.337905617 +0000 UTC m=+144.146210161"
Oct 08 12:58:14 crc kubenswrapper[4735]: I1008 12:58:14.374061 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sjbv7" podStartSLOduration=122.374047087 podStartE2EDuration="2m2.374047087s" podCreationTimestamp="2025-10-08 12:56:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 12:58:14.372830376 +0000 UTC m=+144.181134920" watchObservedRunningTime="2025-10-08 12:58:14.374047087 +0000 UTC m=+144.182351631"
Oct 08 12:58:14 crc kubenswrapper[4735]: I1008 12:58:14.406854 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 08 12:58:14 crc kubenswrapper[4735]: E1008 12:58:14.406989 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 12:58:14.906971264 +0000 UTC m=+144.715275808 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 12:58:14 crc kubenswrapper[4735]: I1008 12:58:14.407132 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgv9c\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c"
Oct 08 12:58:14 crc kubenswrapper[4735]: E1008 12:58:14.407387 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 12:58:14.907378574 +0000 UTC m=+144.715683118 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgv9c" (UID: "b1f96523-d3ab-4768-8de6-c901e255ae74") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 12:58:14 crc kubenswrapper[4735]: I1008 12:58:14.413238 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wcmct" podStartSLOduration=122.413209595 podStartE2EDuration="2m2.413209595s" podCreationTimestamp="2025-10-08 12:56:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 12:58:14.412450495 +0000 UTC m=+144.220755039" watchObservedRunningTime="2025-10-08 12:58:14.413209595 +0000 UTC m=+144.221514139"
Oct 08 12:58:14 crc kubenswrapper[4735]: I1008 12:58:14.507862 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 08 12:58:14 crc kubenswrapper[4735]: E1008 12:58:14.507985 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 12:58:15.007960962 +0000 UTC m=+144.816265506 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 12:58:14 crc kubenswrapper[4735]: I1008 12:58:14.508179 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgv9c\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c"
Oct 08 12:58:14 crc kubenswrapper[4735]: E1008 12:58:14.508441 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 12:58:15.008433305 +0000 UTC m=+144.816737849 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgv9c" (UID: "b1f96523-d3ab-4768-8de6-c901e255ae74") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 12:58:14 crc kubenswrapper[4735]: W1008 12:58:14.527673 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode492bc45_cd6d_4298_b279_0c5f82f91e8c.slice/crio-7e4365616bef66e7edca447a0175622d85d4811845b00ca79f5fb84f4d931803 WatchSource:0}: Error finding container 7e4365616bef66e7edca447a0175622d85d4811845b00ca79f5fb84f4d931803: Status 404 returned error can't find the container with id 7e4365616bef66e7edca447a0175622d85d4811845b00ca79f5fb84f4d931803
Oct 08 12:58:14 crc kubenswrapper[4735]: W1008 12:58:14.555181 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93cce0d5_b158_4619_95f3_eaa50b8d9af4.slice/crio-20cc7e9239d8b1dd8498e5eb59fc81456f5e70be7f7e1bc377442d076aedb434 WatchSource:0}: Error finding container 20cc7e9239d8b1dd8498e5eb59fc81456f5e70be7f7e1bc377442d076aedb434: Status 404 returned error can't find the container with id 20cc7e9239d8b1dd8498e5eb59fc81456f5e70be7f7e1bc377442d076aedb434
Oct 08 12:58:14 crc kubenswrapper[4735]: I1008 12:58:14.570012 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-jgkq6"
Oct 08 12:58:14 crc kubenswrapper[4735]: I1008 12:58:14.589186 4735 patch_prober.go:28] interesting pod/router-default-5444994796-jgkq6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 08 12:58:14 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld
Oct 08 12:58:14 crc kubenswrapper[4735]: [+]process-running ok
Oct 08 12:58:14 crc kubenswrapper[4735]: healthz check failed
Oct 08 12:58:14 crc kubenswrapper[4735]: I1008 12:58:14.589265 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jgkq6" podUID="c4f3bfe2-c8ee-4ac0-915d-2be166ccbb6f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 08 12:58:14 crc kubenswrapper[4735]: I1008 12:58:14.608974 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 08 12:58:14 crc kubenswrapper[4735]: E1008 12:58:14.609530 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 12:58:15.109470004 +0000 UTC m=+144.917774548 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 12:58:14 crc kubenswrapper[4735]: I1008 12:58:14.673445 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-kkwtk" podStartSLOduration=123.6734272 podStartE2EDuration="2m3.6734272s" podCreationTimestamp="2025-10-08 12:56:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 12:58:14.670892575 +0000 UTC m=+144.479197119" watchObservedRunningTime="2025-10-08 12:58:14.6734272 +0000 UTC m=+144.481731744"
Oct 08 12:58:14 crc kubenswrapper[4735]: I1008 12:58:14.726217 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgv9c\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c"
Oct 08 12:58:14 crc kubenswrapper[4735]: E1008 12:58:14.726843 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 12:58:15.226824554 +0000 UTC m=+145.035129098 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgv9c" (UID: "b1f96523-d3ab-4768-8de6-c901e255ae74") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 12:58:14 crc kubenswrapper[4735]: I1008 12:58:14.737131 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-bvv9p" podStartSLOduration=122.737111808 podStartE2EDuration="2m2.737111808s" podCreationTimestamp="2025-10-08 12:56:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 12:58:14.736421761 +0000 UTC m=+144.544726315" watchObservedRunningTime="2025-10-08 12:58:14.737111808 +0000 UTC m=+144.545416352"
Oct 08 12:58:14 crc kubenswrapper[4735]: I1008 12:58:14.745181 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-jgkq6" podStartSLOduration=122.745156115 podStartE2EDuration="2m2.745156115s" podCreationTimestamp="2025-10-08 12:56:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 12:58:14.698931286 +0000 UTC m=+144.507235840" watchObservedRunningTime="2025-10-08 12:58:14.745156115 +0000 UTC m=+144.553460669"
Oct 08 12:58:14 crc kubenswrapper[4735]: I1008 12:58:14.826760 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 08 12:58:14 crc kubenswrapper[4735]: E1008 12:58:14.827055 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 12:58:15.327042432 +0000 UTC m=+145.135346976 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 12:58:14 crc kubenswrapper[4735]: I1008 12:58:14.863111 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-z6f2f" podStartSLOduration=123.86309027 podStartE2EDuration="2m3.86309027s" podCreationTimestamp="2025-10-08 12:56:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 12:58:14.823927782 +0000 UTC m=+144.632232336" watchObservedRunningTime="2025-10-08 12:58:14.86309027 +0000 UTC m=+144.671394814"
Oct 08 12:58:14 crc kubenswrapper[4735]: I1008 12:58:14.923736 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-djbhr" event={"ID":"9c787ebd-7a0e-40ae-b537-e09ae9f12578","Type":"ContainerStarted","Data":"5b1631b64e06a0b739150b05512242908596867705fad418204d22cda7717b01"}
Oct 08 12:58:14 crc kubenswrapper[4735]: I1008 12:58:14.928636 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgv9c\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c"
Oct 08 12:58:14 crc kubenswrapper[4735]: E1008 12:58:14.929077 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 12:58:15.429062347 +0000 UTC m=+145.237366881 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgv9c" (UID: "b1f96523-d3ab-4768-8de6-c901e255ae74") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 12:58:14 crc kubenswrapper[4735]: I1008 12:58:14.935804 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6vdgs" event={"ID":"73926bea-3da3-4e00-a78c-b0c33b4b1919","Type":"ContainerStarted","Data":"f3b83b75f02cb9a2b0f9759cf83c98a225b44ea0f7192a7d64023fb3bcad8f7e"}
Oct 08 12:58:14 crc kubenswrapper[4735]: I1008 12:58:14.944730 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xjxt2" event={"ID":"83bd6935-d7c7-4548-add9-647cdfcd15ee","Type":"ContainerStarted","Data":"7a046a084f124461231d20c0407b4950e223e789e34bb61ec3e594bac4e5e1eb"}
Oct 08 12:58:14 crc kubenswrapper[4735]: I1008 12:58:14.946421 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfmds" event={"ID":"d95c4bbd-337f-454c-a57f-c47db8bc6ecb","Type":"ContainerStarted","Data":"646f7f0bf86b05a1fa0028b5469278323e27ffc5e0fc01847ed8cb51b934a7d8"}
Oct 08 12:58:14 crc kubenswrapper[4735]: I1008 12:58:14.949946 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfmds"
Oct 08 12:58:14 crc kubenswrapper[4735]: I1008 12:58:14.950965 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-csnmr" event={"ID":"0b7e3bce-3aca-4e94-8d98-11009fc0ab3e","Type":"ContainerStarted","Data":"53b088284112139709a31c6bdc949ccdb9aeed1c6fa6335b0ec8f9388905c693"}
Oct 08 12:58:14 crc kubenswrapper[4735]: I1008 12:58:14.951669 4735 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-cfmds container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" start-of-body=
Oct 08 12:58:14 crc kubenswrapper[4735]: I1008 12:58:14.951711 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfmds" podUID="d95c4bbd-337f-454c-a57f-c47db8bc6ecb" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused"
Oct 08 12:58:14 crc kubenswrapper[4735]: I1008 12:58:14.953420 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zmvkr" event={"ID":"bb722212-8f18-46b9-b5b5-484352c49bba","Type":"ContainerStarted","Data":"2e123eb464831a51eb90e8fdf01e956318ccebb82aa6adbdb5b51da4a5b6af4c"}
Oct 08 12:58:14 crc kubenswrapper[4735]: I1008 12:58:14.965420 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-lshv7" event={"ID":"e492bc45-cd6d-4298-b279-0c5f82f91e8c","Type":"ContainerStarted","Data":"7e4365616bef66e7edca447a0175622d85d4811845b00ca79f5fb84f4d931803"}
Oct 08 12:58:14 crc kubenswrapper[4735]: I1008 12:58:14.975489 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-xkrqh" event={"ID":"99871c37-c1b7-48c4-bdec-d01234433109","Type":"ContainerStarted","Data":"c9e1dfec3278194cf1f36047fdfabdcfa50081247e2e6f5a5c79606485b25988"}
Oct 08 12:58:14 crc kubenswrapper[4735]: I1008 12:58:14.977364 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332125-6dxj9" event={"ID":"93cce0d5-b158-4619-95f3-eaa50b8d9af4","Type":"ContainerStarted","Data":"20cc7e9239d8b1dd8498e5eb59fc81456f5e70be7f7e1bc377442d076aedb434"}
Oct 08 12:58:14 crc kubenswrapper[4735]: I1008 12:58:14.991760 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-p5f5l" event={"ID":"ec35d298-1364-4b8b-9a47-70fa26b2aa64","Type":"ContainerStarted","Data":"98920d961229bf23ec6c008519c72b7db0cdd2c54c041c4511964df125e2a26c"}
Oct 08 12:58:14 crc kubenswrapper[4735]: I1008 12:58:14.997879 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wd7cr" event={"ID":"a85dc80e-3e52-4eb0-a5c9-9b3c4d6e84d5","Type":"ContainerStarted","Data":"026092bc6cc6cc13c12f13c077e2cc20b1fbdeeb340704a45ec1f8747775ab4d"}
Oct 08 12:58:15 crc kubenswrapper[4735]: I1008 12:58:15.000102 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4drk9" event={"ID":"87ee9fbe-46aa-4075-b289-ef083cea461b","Type":"ContainerStarted","Data":"2543036d315638e845dc1048af985c71d01e4f2a0649b788a960f638aea84b80"}
Oct 08 12:58:15 crc kubenswrapper[4735]: I1008 12:58:15.003547 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bmrzp" event={"ID":"0759b5fe-0415-43f8-b773-99248d05630f","Type":"ContainerStarted","Data":"1ea1dce85571e28ce8cd20192d8e46c3bbad2e27cefcd60504634b2e39c9c0b2"}
Oct 08 12:58:15 crc kubenswrapper[4735]: I1008 12:58:15.003589 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bmrzp" event={"ID":"0759b5fe-0415-43f8-b773-99248d05630f","Type":"ContainerStarted","Data":"cc49a5fef3301ae364980043c6927cf39c05f639a11abf7e3c5cf30a35946ca4"}
Oct 08 12:58:15 crc kubenswrapper[4735]: I1008 12:58:15.008397 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jnr7v" event={"ID":"61708c63-4a5a-4e83-8243-02025b05df64","Type":"ContainerStarted","Data":"365d530fe312303cc6f12178156f621a6272992d103147c27973d3549ef06143"}
Oct 08 12:58:15 crc kubenswrapper[4735]: I1008 12:58:15.027530 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-7nmkc" event={"ID":"2bf3f935-07ed-496e-8426-da64835d0c65","Type":"ContainerStarted","Data":"12c33ac380a0ef7eb36383c6f8e07cfe4b568b28f8591d000919623b8ca9a469"}
Oct 08 12:58:15 crc kubenswrapper[4735]: I1008 12:58:15.030275 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 08 12:58:15 crc kubenswrapper[4735]: E1008 12:58:15.031347 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 12:58:15.531320118 +0000 UTC m=+145.339624722 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 12:58:15 crc kubenswrapper[4735]: I1008 12:58:15.033224 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-cnm5q" event={"ID":"58d4a0d7-f90f-497b-8744-dac2ddd08868","Type":"ContainerStarted","Data":"9f09a756c691da722711b3ccb1274750784c7be9b327f7f9c56b01de64366781"}
Oct 08 12:58:15 crc kubenswrapper[4735]: I1008 12:58:15.035563 4735 generic.go:334] "Generic (PLEG): container finished" podID="c881e3aa-6bfe-476c-9e63-edfa5480745d" containerID="9d677201775a35e087b5f5b55ab98128343b9c1ce60ec95332f1e85564066c31" exitCode=0
Oct 08 12:58:15 crc kubenswrapper[4735]: I1008 12:58:15.035620 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-zxtmj" event={"ID":"c881e3aa-6bfe-476c-9e63-edfa5480745d","Type":"ContainerDied","Data":"9d677201775a35e087b5f5b55ab98128343b9c1ce60ec95332f1e85564066c31"}
Oct 08 12:58:15 crc kubenswrapper[4735]: I1008 12:58:15.047519 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-fxkkf" event={"ID":"68f7f9bf-73a9-4109-a07c-e026b3d11513","Type":"ContainerStarted","Data":"86ceb7a725816f4beb908544dec8cc0c5cbe42340a1c93eec5b291a65a637a97"}
Oct 08 12:58:15 crc kubenswrapper[4735]: I1008 12:58:15.049006 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7v5q4" event={"ID":"543b04b8-1dda-487e-9382-ba18d7be25e5","Type":"ContainerStarted","Data":"578270bdc8e265a342040ac0983b4f59e6e3f81c76b7e39017a6fa4c1cc0745f"}
Oct 08 12:58:15 crc kubenswrapper[4735]: I1008 12:58:15.051369 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c4j7s" event={"ID":"e60e2365-ae6c-44a2-a7a3-a0c67f0d756a","Type":"ContainerStarted","Data":"35e90935d00d7b2855b115d9ec7ff4aed509f6aea04e485d721b5e55c86a0490"}
Oct 08 12:58:15 crc kubenswrapper[4735]: I1008 12:58:15.052752 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gp925" event={"ID":"d169069b-e5fe-4018-974d-97cf1c1cad23","Type":"ContainerStarted","Data":"9018ee780dd3c53eb196a1d585b4c20c1e88249f5e4ecd0008800fa05035cbdd"}
Oct 08 12:58:15 crc kubenswrapper[4735]: I1008 12:58:15.053438 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-gp925"
Oct 08 12:58:15 crc kubenswrapper[4735]: I1008 12:58:15.054642 4735 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-gp925 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/healthz\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body=
Oct 08 12:58:15 crc kubenswrapper[4735]: I1008 12:58:15.054684 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-gp925" podUID="d169069b-e5fe-4018-974d-97cf1c1cad23" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.22:8080/healthz\": dial tcp 10.217.0.22:8080: connect: connection refused"
Oct 08 12:58:15 crc kubenswrapper[4735]: I1008 12:58:15.065285 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gcn29" event={"ID":"d316a645-bd82-4695-a79d-d31896b704eb","Type":"ContainerStarted","Data":"461b44df153f7b3e9890bf8d71071146ac91963b4c4d6827628a55a0f521c465"}
Oct 08 12:58:15 crc kubenswrapper[4735]: I1008 12:58:15.065344 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gcn29" event={"ID":"d316a645-bd82-4695-a79d-d31896b704eb","Type":"ContainerStarted","Data":"a34e95864d1b43b0836a4bdb11118efa85bf69fd184386f4092262d3ce131649"}
Oct 08 12:58:15 crc kubenswrapper[4735]: I1008 12:58:15.070409 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-75k9v" event={"ID":"d24dc2f4-b906-49c0-be1e-f2326ef12664","Type":"ContainerStarted","Data":"cff232d3171dab2f619e14df9496a6dbd6a834ea4ffb8a197aea8560893d4a25"}
Oct 08 12:58:15 crc kubenswrapper[4735]: I1008 12:58:15.074127 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-8rqp7" event={"ID":"15a08015-bcc0-4576-8f54-70e5693469a4","Type":"ContainerStarted","Data":"f05ac887033a6c69e63fce087e28341a3d6f244d07560798ac75f2a609a63eff"}
Oct 08 12:58:15 crc kubenswrapper[4735]: I1008 12:58:15.074523 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-8rqp7"
Oct 08 12:58:15 crc kubenswrapper[4735]: I1008 12:58:15.075901 4735 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-8rqp7 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body=
Oct 08 12:58:15 crc kubenswrapper[4735]: I1008 12:58:15.075937 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-8rqp7" podUID="15a08015-bcc0-4576-8f54-70e5693469a4" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused"
Oct 08 12:58:15 crc kubenswrapper[4735]: I1008 12:58:15.076256 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bw544" event={"ID":"fdf78d81-ae87-48c8-b189-eff27030dd91","Type":"ContainerStarted","Data":"47c21c80848762dbd03a8932502cbf0c44afc1cffee6ae64c32728593f0c13e9"}
Oct 08 12:58:15 crc kubenswrapper[4735]: I1008 12:58:15.097766 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lrcx8" event={"ID":"e4c4252c-e8f3-411c-8d61-9ed06ae9a5de","Type":"ContainerStarted","Data":"ebc96b1c0e50730f403e08c283c9da218106b2f2e35cc7cfda30a88c7e4f0fed"}
Oct 08 12:58:15 crc kubenswrapper[4735]: I1008 12:58:15.103174 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lrcx8"
Oct 08 12:58:15 crc kubenswrapper[4735]: I1008 12:58:15.111620 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rgtx4" event={"ID":"49e6e79f-93b3-453f-83d1-2afc40a4637c","Type":"ContainerStarted","Data":"f7568c93b35a26c2547d77c4962c3ad18071544a35b177809be0f73830ea8077"}
Oct 08 12:58:15 crc kubenswrapper[4735]: I1008 12:58:15.112624 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rgtx4"
Oct 08 12:58:15 crc kubenswrapper[4735]: I1008 12:58:15.121781 4735 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-rgtx4 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body=
Oct 08 12:58:15 crc kubenswrapper[4735]: I1008 12:58:15.121831 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rgtx4" podUID="49e6e79f-93b3-453f-83d1-2afc40a4637c" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused"
Oct 08 12:58:15 crc kubenswrapper[4735]: I1008 12:58:15.124201 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qjb22" event={"ID":"97f188e2-ee83-4fb4-b334-33c3099df3f2","Type":"ContainerStarted","Data":"85c57ba4378e3e2c7b116bce352a05029a1ddcfea54660d2454678b06f32d903"}
Oct 08 12:58:15 crc kubenswrapper[4735]: I1008 12:58:15.139570 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t8cv2" event={"ID":"79d2bd38-a2b9-4594-89bf-83c365f75f40","Type":"ContainerStarted","Data":"04dd04f0b42bb069ecc77501f10ecc0e2dfda27e9ce83b6485e9a8e29d58e486"}
Oct 08 12:58:15 crc kubenswrapper[4735]: I1008 12:58:15.139855 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgv9c\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c"
Oct 08 12:58:15 crc kubenswrapper[4735]: E1008 12:58:15.159880 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 12:58:15.659845835 +0000 UTC m=+145.468150549 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgv9c" (UID: "b1f96523-d3ab-4768-8de6-c901e255ae74") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 12:58:15 crc kubenswrapper[4735]: I1008 12:58:15.176517 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-dc7kk" event={"ID":"e100c510-2957-4946-b21d-e9227e32d7b0","Type":"ContainerStarted","Data":"09cd12cf46a7a3a76f66a45a444b08e440f7e7ef21bbfdbe7b39c27e8bf832de"}
Oct 08 12:58:15 crc kubenswrapper[4735]: I1008 12:58:15.185697 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-xqfrr" event={"ID":"6e891ada-e4a4-4e15-8ade-48d0f40518da","Type":"ContainerStarted","Data":"48e4126877d7ae7f9725482116913fa932c6bde84cdb8aa891bbf02b105c8fda"}
Oct 08 12:58:15 crc kubenswrapper[4735]: I1008 12:58:15.186090 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-xqfrr"
Oct 08 12:58:15 crc kubenswrapper[4735]: I1008 12:58:15.188679 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-887mh" event={"ID":"14f021da-25d5-4a75-82f3-b6c20c65dc7c","Type":"ContainerStarted","Data":"b6a7d0bb09a252e94db79774242273b55416fc9b063a0ec1ecbc5d1d5d6e1ab4"}
Oct 08 12:58:15 crc kubenswrapper[4735]: I1008 12:58:15.191606 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-bqm6j" event={"ID":"477a3236-05fd-494b-8eb3-fd178d3180be","Type":"ContainerStarted","Data":"aad4f93dcaeac93404012eee115f06e393c368a4d27f92481cd57224fd0ba5dd"}
Oct 08 12:58:15 crc kubenswrapper[4735]: I1008 12:58:15.201869 4735 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-xqfrr container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused" start-of-body=
Oct 08 12:58:15 crc kubenswrapper[4735]: I1008 12:58:15.201920 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-xqfrr" podUID="6e891ada-e4a4-4e15-8ade-48d0f40518da" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused"
Oct 08 12:58:15 crc kubenswrapper[4735]: I1008 12:58:15.203593 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wcmct"
Oct 08 12:58:15 crc kubenswrapper[4735]: I1008 12:58:15.259296 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 08 12:58:15 crc kubenswrapper[4735]: E1008 12:58:15.262725 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 12:58:15.762691151 +0000 UTC m=+145.570995765 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:15 crc kubenswrapper[4735]: I1008 12:58:15.361375 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgv9c\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c" Oct 08 12:58:15 crc kubenswrapper[4735]: E1008 12:58:15.363618 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 12:58:15.863606038 +0000 UTC m=+145.671910582 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgv9c" (UID: "b1f96523-d3ab-4768-8de6-c901e255ae74") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:15 crc kubenswrapper[4735]: I1008 12:58:15.458161 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-7nmkc" podStartSLOduration=6.4581416 podStartE2EDuration="6.4581416s" podCreationTimestamp="2025-10-08 12:58:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 12:58:15.457322529 +0000 UTC m=+145.265627073" watchObservedRunningTime="2025-10-08 12:58:15.4581416 +0000 UTC m=+145.266446144" Oct 08 12:58:15 crc kubenswrapper[4735]: I1008 12:58:15.465118 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 12:58:15 crc kubenswrapper[4735]: E1008 12:58:15.465232 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 12:58:15.965213012 +0000 UTC m=+145.773517556 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:15 crc kubenswrapper[4735]: I1008 12:58:15.465579 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgv9c\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c" Oct 08 12:58:15 crc kubenswrapper[4735]: E1008 12:58:15.465867 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 12:58:15.965858239 +0000 UTC m=+145.774162783 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgv9c" (UID: "b1f96523-d3ab-4768-8de6-c901e255ae74") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:15 crc kubenswrapper[4735]: I1008 12:58:15.503732 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wd7cr" podStartSLOduration=124.503712643 podStartE2EDuration="2m4.503712643s" podCreationTimestamp="2025-10-08 12:56:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 12:58:15.502906732 +0000 UTC m=+145.311211286" watchObservedRunningTime="2025-10-08 12:58:15.503712643 +0000 UTC m=+145.312017187" Oct 08 12:58:15 crc kubenswrapper[4735]: I1008 12:58:15.547723 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfmds" podStartSLOduration=123.547703705 podStartE2EDuration="2m3.547703705s" podCreationTimestamp="2025-10-08 12:56:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 12:58:15.546687959 +0000 UTC m=+145.354992503" watchObservedRunningTime="2025-10-08 12:58:15.547703705 +0000 UTC m=+145.356008239" Oct 08 12:58:15 crc kubenswrapper[4735]: I1008 12:58:15.567192 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 12:58:15 crc kubenswrapper[4735]: E1008 12:58:15.567705 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 12:58:16.067686049 +0000 UTC m=+145.875990593 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:15 crc kubenswrapper[4735]: I1008 12:58:15.581125 4735 patch_prober.go:28] interesting pod/router-default-5444994796-jgkq6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 08 12:58:15 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld Oct 08 12:58:15 crc kubenswrapper[4735]: [+]process-running ok Oct 08 12:58:15 crc kubenswrapper[4735]: healthz check failed Oct 08 12:58:15 crc kubenswrapper[4735]: I1008 12:58:15.581178 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jgkq6" podUID="c4f3bfe2-c8ee-4ac0-915d-2be166ccbb6f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 08 12:58:15 crc kubenswrapper[4735]: I1008 12:58:15.597118 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-8rqp7" 
podStartSLOduration=124.597098126 podStartE2EDuration="2m4.597098126s" podCreationTimestamp="2025-10-08 12:56:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 12:58:15.59532871 +0000 UTC m=+145.403633264" watchObservedRunningTime="2025-10-08 12:58:15.597098126 +0000 UTC m=+145.405402680" Oct 08 12:58:15 crc kubenswrapper[4735]: I1008 12:58:15.637233 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-gp925" podStartSLOduration=123.637218128 podStartE2EDuration="2m3.637218128s" podCreationTimestamp="2025-10-08 12:56:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 12:58:15.636405107 +0000 UTC m=+145.444709671" watchObservedRunningTime="2025-10-08 12:58:15.637218128 +0000 UTC m=+145.445522672" Oct 08 12:58:15 crc kubenswrapper[4735]: I1008 12:58:15.669342 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgv9c\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c" Oct 08 12:58:15 crc kubenswrapper[4735]: E1008 12:58:15.669782 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 12:58:16.169766735 +0000 UTC m=+145.978071289 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgv9c" (UID: "b1f96523-d3ab-4768-8de6-c901e255ae74") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:15 crc kubenswrapper[4735]: I1008 12:58:15.703135 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-xqfrr" podStartSLOduration=124.703121664 podStartE2EDuration="2m4.703121664s" podCreationTimestamp="2025-10-08 12:56:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 12:58:15.702117818 +0000 UTC m=+145.510422362" watchObservedRunningTime="2025-10-08 12:58:15.703121664 +0000 UTC m=+145.511426198" Oct 08 12:58:15 crc kubenswrapper[4735]: I1008 12:58:15.770419 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 12:58:15 crc kubenswrapper[4735]: E1008 12:58:15.770621 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 12:58:16.27059149 +0000 UTC m=+146.078896034 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:15 crc kubenswrapper[4735]: I1008 12:58:15.771303 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgv9c\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c" Oct 08 12:58:15 crc kubenswrapper[4735]: E1008 12:58:15.771806 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 12:58:16.27178959 +0000 UTC m=+146.080094144 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgv9c" (UID: "b1f96523-d3ab-4768-8de6-c901e255ae74") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:15 crc kubenswrapper[4735]: I1008 12:58:15.797267 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lrcx8" podStartSLOduration=123.797243495 podStartE2EDuration="2m3.797243495s" podCreationTimestamp="2025-10-08 12:56:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 12:58:15.784917498 +0000 UTC m=+145.593222062" watchObservedRunningTime="2025-10-08 12:58:15.797243495 +0000 UTC m=+145.605548039" Oct 08 12:58:15 crc kubenswrapper[4735]: I1008 12:58:15.878233 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 12:58:15 crc kubenswrapper[4735]: E1008 12:58:15.879290 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 12:58:16.379264366 +0000 UTC m=+146.187568910 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:15 crc kubenswrapper[4735]: I1008 12:58:15.913016 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-p5f5l" podStartSLOduration=124.912989043 podStartE2EDuration="2m4.912989043s" podCreationTimestamp="2025-10-08 12:56:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 12:58:15.858242575 +0000 UTC m=+145.666547119" watchObservedRunningTime="2025-10-08 12:58:15.912989043 +0000 UTC m=+145.721293587" Oct 08 12:58:15 crc kubenswrapper[4735]: I1008 12:58:15.913903 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bmrzp" podStartSLOduration=124.913898297 podStartE2EDuration="2m4.913898297s" podCreationTimestamp="2025-10-08 12:56:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 12:58:15.888115733 +0000 UTC m=+145.696420277" watchObservedRunningTime="2025-10-08 12:58:15.913898297 +0000 UTC m=+145.722202841" Oct 08 12:58:15 crc kubenswrapper[4735]: I1008 12:58:15.954869 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rgtx4" podStartSLOduration=123.95484785 podStartE2EDuration="2m3.95484785s" podCreationTimestamp="2025-10-08 12:56:12 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 12:58:15.944068603 +0000 UTC m=+145.752373147" watchObservedRunningTime="2025-10-08 12:58:15.95484785 +0000 UTC m=+145.763152394" Oct 08 12:58:15 crc kubenswrapper[4735]: I1008 12:58:15.986755 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgv9c\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c" Oct 08 12:58:15 crc kubenswrapper[4735]: E1008 12:58:15.987090 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 12:58:16.4870782 +0000 UTC m=+146.295382744 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgv9c" (UID: "b1f96523-d3ab-4768-8de6-c901e255ae74") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:16 crc kubenswrapper[4735]: I1008 12:58:16.011011 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-75k9v" podStartSLOduration=125.010994315 podStartE2EDuration="2m5.010994315s" podCreationTimestamp="2025-10-08 12:56:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 12:58:16.009808284 +0000 UTC m=+145.818112828" watchObservedRunningTime="2025-10-08 12:58:16.010994315 +0000 UTC m=+145.819298859" Oct 08 12:58:16 crc kubenswrapper[4735]: I1008 12:58:16.087613 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 12:58:16 crc kubenswrapper[4735]: E1008 12:58:16.087811 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 12:58:16.587777431 +0000 UTC m=+146.396081975 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:16 crc kubenswrapper[4735]: I1008 12:58:16.088152 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgv9c\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c" Oct 08 12:58:16 crc kubenswrapper[4735]: E1008 12:58:16.088466 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 12:58:16.588459088 +0000 UTC m=+146.396763632 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgv9c" (UID: "b1f96523-d3ab-4768-8de6-c901e255ae74") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:16 crc kubenswrapper[4735]: I1008 12:58:16.189857 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 12:58:16 crc kubenswrapper[4735]: E1008 12:58:16.189986 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 12:58:16.68996097 +0000 UTC m=+146.498265514 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:16 crc kubenswrapper[4735]: I1008 12:58:16.190166 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgv9c\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c" Oct 08 12:58:16 crc kubenswrapper[4735]: E1008 12:58:16.190517 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 12:58:16.690498504 +0000 UTC m=+146.498803048 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgv9c" (UID: "b1f96523-d3ab-4768-8de6-c901e255ae74") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:16 crc kubenswrapper[4735]: I1008 12:58:16.216976 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qjb22" event={"ID":"97f188e2-ee83-4fb4-b334-33c3099df3f2","Type":"ContainerStarted","Data":"6cf286db8cf9011d847cc3d58feeba07186d2a080cad4bd4c213f761df859860"} Oct 08 12:58:16 crc kubenswrapper[4735]: I1008 12:58:16.238182 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t8cv2" event={"ID":"79d2bd38-a2b9-4594-89bf-83c365f75f40","Type":"ContainerStarted","Data":"7a4f55e149f5aec9cea1dca2900f56b09f53b2194983e3a62cd84a42b9062515"} Oct 08 12:58:16 crc kubenswrapper[4735]: I1008 12:58:16.256618 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qjb22" podStartSLOduration=124.256599094 podStartE2EDuration="2m4.256599094s" podCreationTimestamp="2025-10-08 12:56:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 12:58:16.252109609 +0000 UTC m=+146.060414153" watchObservedRunningTime="2025-10-08 12:58:16.256599094 +0000 UTC m=+146.064903638" Oct 08 12:58:16 crc kubenswrapper[4735]: I1008 12:58:16.258148 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-fxkkf" 
event={"ID":"68f7f9bf-73a9-4109-a07c-e026b3d11513","Type":"ContainerStarted","Data":"a5a5ca51d85f4cfeacb6a122aa3bd25c2239f49d9a48a0d46dd51d9278f3d53c"} Oct 08 12:58:16 crc kubenswrapper[4735]: I1008 12:58:16.280324 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-csnmr" event={"ID":"0b7e3bce-3aca-4e94-8d98-11009fc0ab3e","Type":"ContainerStarted","Data":"edaae202fe7ecae4ec4ba26c275b61d213601f1c1e822d81e15e9e2015fbaa59"} Oct 08 12:58:16 crc kubenswrapper[4735]: I1008 12:58:16.291769 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 12:58:16 crc kubenswrapper[4735]: E1008 12:58:16.291976 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 12:58:16.791946054 +0000 UTC m=+146.600250598 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:16 crc kubenswrapper[4735]: I1008 12:58:16.292316 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgv9c\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c" Oct 08 12:58:16 crc kubenswrapper[4735]: E1008 12:58:16.292837 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 12:58:16.792826267 +0000 UTC m=+146.601130811 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgv9c" (UID: "b1f96523-d3ab-4768-8de6-c901e255ae74") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:16 crc kubenswrapper[4735]: I1008 12:58:16.300525 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332125-6dxj9" event={"ID":"93cce0d5-b158-4619-95f3-eaa50b8d9af4","Type":"ContainerStarted","Data":"5389c3006c64b00e2fd3278755a075217ee591a2311848de5bb48a47583d90b7"} Oct 08 12:58:16 crc kubenswrapper[4735]: I1008 12:58:16.301008 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t8cv2" podStartSLOduration=124.300998087 podStartE2EDuration="2m4.300998087s" podCreationTimestamp="2025-10-08 12:56:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 12:58:16.275532612 +0000 UTC m=+146.083837156" watchObservedRunningTime="2025-10-08 12:58:16.300998087 +0000 UTC m=+146.109302631" Oct 08 12:58:16 crc kubenswrapper[4735]: I1008 12:58:16.302958 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-fxkkf" podStartSLOduration=124.302952007 podStartE2EDuration="2m4.302952007s" podCreationTimestamp="2025-10-08 12:56:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 12:58:16.300508194 +0000 UTC m=+146.108812738" watchObservedRunningTime="2025-10-08 12:58:16.302952007 +0000 UTC m=+146.111256551" Oct 
08 12:58:16 crc kubenswrapper[4735]: I1008 12:58:16.334036 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-cnm5q" event={"ID":"58d4a0d7-f90f-497b-8744-dac2ddd08868","Type":"ContainerStarted","Data":"6dea9aeb47d27cda11083d9dfc61124a7a86fd8f1a30d94a708c09c12d5d5397"} Oct 08 12:58:16 crc kubenswrapper[4735]: I1008 12:58:16.362646 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-cnm5q" podStartSLOduration=125.362631853 podStartE2EDuration="2m5.362631853s" podCreationTimestamp="2025-10-08 12:56:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 12:58:16.361294728 +0000 UTC m=+146.169599292" watchObservedRunningTime="2025-10-08 12:58:16.362631853 +0000 UTC m=+146.170936397" Oct 08 12:58:16 crc kubenswrapper[4735]: I1008 12:58:16.362924 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29332125-6dxj9" podStartSLOduration=125.36291704 podStartE2EDuration="2m5.36291704s" podCreationTimestamp="2025-10-08 12:56:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 12:58:16.334364215 +0000 UTC m=+146.142668769" watchObservedRunningTime="2025-10-08 12:58:16.36291704 +0000 UTC m=+146.171221584" Oct 08 12:58:16 crc kubenswrapper[4735]: I1008 12:58:16.369771 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gcn29" event={"ID":"d316a645-bd82-4695-a79d-d31896b704eb","Type":"ContainerStarted","Data":"fb6387ebc5932e48db8ff9453b643e4dcc296f2416750319c75afcaaff8d12f6"} Oct 08 12:58:16 crc kubenswrapper[4735]: I1008 12:58:16.391693 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gcn29" podStartSLOduration=124.39167915 podStartE2EDuration="2m4.39167915s" podCreationTimestamp="2025-10-08 12:56:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 12:58:16.389545235 +0000 UTC m=+146.197849789" watchObservedRunningTime="2025-10-08 12:58:16.39167915 +0000 UTC m=+146.199983694" Oct 08 12:58:16 crc kubenswrapper[4735]: I1008 12:58:16.393943 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 12:58:16 crc kubenswrapper[4735]: E1008 12:58:16.395362 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 12:58:16.895340844 +0000 UTC m=+146.703645398 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:16 crc kubenswrapper[4735]: I1008 12:58:16.398881 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-dc7kk" event={"ID":"e100c510-2957-4946-b21d-e9227e32d7b0","Type":"ContainerStarted","Data":"b1be1e59a742dc25b089dac055d8b4ec6ce3a63900dc27ed1f5a425e00cb7904"} Oct 08 12:58:16 crc kubenswrapper[4735]: I1008 12:58:16.400108 4735 generic.go:334] "Generic (PLEG): container finished" podID="61708c63-4a5a-4e83-8243-02025b05df64" containerID="82a445f8d1b0a6f5d7dc54e44d1d8cc232d402a4aac469814a2d1601adf9a125" exitCode=0 Oct 08 12:58:16 crc kubenswrapper[4735]: I1008 12:58:16.400160 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jnr7v" event={"ID":"61708c63-4a5a-4e83-8243-02025b05df64","Type":"ContainerDied","Data":"82a445f8d1b0a6f5d7dc54e44d1d8cc232d402a4aac469814a2d1601adf9a125"} Oct 08 12:58:16 crc kubenswrapper[4735]: I1008 12:58:16.423231 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-xkrqh" event={"ID":"99871c37-c1b7-48c4-bdec-d01234433109","Type":"ContainerStarted","Data":"0300d1dc5c09c56794c746a2b24eb9866e29f2dae9b62d03ad761b6824fd6f99"} Oct 08 12:58:16 crc kubenswrapper[4735]: I1008 12:58:16.458750 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-djbhr" 
event={"ID":"9c787ebd-7a0e-40ae-b537-e09ae9f12578","Type":"ContainerStarted","Data":"e8d013b245e65e510b3977de0b6a9a634ee1bd4e7388141804cd7661991cd509"} Oct 08 12:58:16 crc kubenswrapper[4735]: I1008 12:58:16.497092 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgv9c\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c" Oct 08 12:58:16 crc kubenswrapper[4735]: E1008 12:58:16.498549 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 12:58:16.998525989 +0000 UTC m=+146.806830603 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgv9c" (UID: "b1f96523-d3ab-4768-8de6-c901e255ae74") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:16 crc kubenswrapper[4735]: I1008 12:58:16.510613 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-zxtmj" event={"ID":"c881e3aa-6bfe-476c-9e63-edfa5480745d","Type":"ContainerStarted","Data":"81675f78e3aa9f556813cf2a395da47d640c8fda217c46c1a6a022df33979beb"} Oct 08 12:58:16 crc kubenswrapper[4735]: I1008 12:58:16.527596 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-xkrqh" podStartSLOduration=124.527579227 
podStartE2EDuration="2m4.527579227s" podCreationTimestamp="2025-10-08 12:56:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 12:58:16.521636544 +0000 UTC m=+146.329941088" watchObservedRunningTime="2025-10-08 12:58:16.527579227 +0000 UTC m=+146.335883771" Oct 08 12:58:16 crc kubenswrapper[4735]: I1008 12:58:16.561950 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-lshv7" event={"ID":"e492bc45-cd6d-4298-b279-0c5f82f91e8c","Type":"ContainerStarted","Data":"9c276466f68c2eec8f295a7eb69ce06e2db780b33ad0d46bf744a5bbd29fd490"} Oct 08 12:58:16 crc kubenswrapper[4735]: I1008 12:58:16.580667 4735 patch_prober.go:28] interesting pod/router-default-5444994796-jgkq6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 08 12:58:16 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld Oct 08 12:58:16 crc kubenswrapper[4735]: [+]process-running ok Oct 08 12:58:16 crc kubenswrapper[4735]: healthz check failed Oct 08 12:58:16 crc kubenswrapper[4735]: I1008 12:58:16.580723 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jgkq6" podUID="c4f3bfe2-c8ee-4ac0-915d-2be166ccbb6f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 08 12:58:16 crc kubenswrapper[4735]: I1008 12:58:16.589647 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7v5q4" event={"ID":"543b04b8-1dda-487e-9382-ba18d7be25e5","Type":"ContainerStarted","Data":"900d51cdd28431246aa5c429a0c1bbf5f288d324010f3582d5e797cd34a96a20"} Oct 08 12:58:16 crc kubenswrapper[4735]: I1008 12:58:16.599428 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 12:58:16 crc kubenswrapper[4735]: E1008 12:58:16.600632 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 12:58:17.100606516 +0000 UTC m=+146.908911060 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:16 crc kubenswrapper[4735]: I1008 12:58:16.624449 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c4j7s" event={"ID":"e60e2365-ae6c-44a2-a7a3-a0c67f0d756a","Type":"ContainerStarted","Data":"302c4173d3326fd094bacaf20a4d61ebd79d601111a335d869cfeb1a873253f9"} Oct 08 12:58:16 crc kubenswrapper[4735]: I1008 12:58:16.652973 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-lshv7" podStartSLOduration=7.652958153 podStartE2EDuration="7.652958153s" podCreationTimestamp="2025-10-08 12:58:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 12:58:16.652773758 +0000 UTC m=+146.461078302" watchObservedRunningTime="2025-10-08 
12:58:16.652958153 +0000 UTC m=+146.461262697" Oct 08 12:58:16 crc kubenswrapper[4735]: I1008 12:58:16.654925 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-djbhr" podStartSLOduration=124.654918503 podStartE2EDuration="2m4.654918503s" podCreationTimestamp="2025-10-08 12:56:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 12:58:16.588975476 +0000 UTC m=+146.397280040" watchObservedRunningTime="2025-10-08 12:58:16.654918503 +0000 UTC m=+146.463223037" Oct 08 12:58:16 crc kubenswrapper[4735]: I1008 12:58:16.667650 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zmvkr" event={"ID":"bb722212-8f18-46b9-b5b5-484352c49bba","Type":"ContainerStarted","Data":"68631f335b773baad856df3f7a340cc311e665185b89c3c20321f32e37426608"} Oct 08 12:58:16 crc kubenswrapper[4735]: I1008 12:58:16.694711 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gq4dv" event={"ID":"25df5801-9a21-4396-8613-abddc02daf42","Type":"ContainerStarted","Data":"ff0b6454c1c7dabd6f2963b36cd0120ae412f98e8daac7f75bd593ab98839d8c"} Oct 08 12:58:16 crc kubenswrapper[4735]: I1008 12:58:16.709291 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgv9c\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c" Oct 08 12:58:16 crc kubenswrapper[4735]: E1008 12:58:16.711650 4735 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 12:58:17.211639072 +0000 UTC m=+147.019943616 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgv9c" (UID: "b1f96523-d3ab-4768-8de6-c901e255ae74") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:16 crc kubenswrapper[4735]: I1008 12:58:16.716077 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7v5q4" podStartSLOduration=125.716060966 podStartE2EDuration="2m5.716060966s" podCreationTimestamp="2025-10-08 12:56:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 12:58:16.713781788 +0000 UTC m=+146.522086332" watchObservedRunningTime="2025-10-08 12:58:16.716060966 +0000 UTC m=+146.524365510" Oct 08 12:58:16 crc kubenswrapper[4735]: I1008 12:58:16.718822 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6vdgs" event={"ID":"73926bea-3da3-4e00-a78c-b0c33b4b1919","Type":"ContainerStarted","Data":"1a74e3e176bdff6b124d949ae88defe96190486e4a358dfc0a6ec969fcfce01b"} Oct 08 12:58:16 crc kubenswrapper[4735]: I1008 12:58:16.725736 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-887mh" 
event={"ID":"14f021da-25d5-4a75-82f3-b6c20c65dc7c","Type":"ContainerStarted","Data":"f6081098108b7d60aa8325308d2f4e6c89e9ce10d33f45caa8fea97badc49430"} Oct 08 12:58:16 crc kubenswrapper[4735]: I1008 12:58:16.726662 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-887mh" Oct 08 12:58:16 crc kubenswrapper[4735]: I1008 12:58:16.730615 4735 patch_prober.go:28] interesting pod/downloads-7954f5f757-887mh container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body= Oct 08 12:58:16 crc kubenswrapper[4735]: I1008 12:58:16.730692 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-887mh" podUID="14f021da-25d5-4a75-82f3-b6c20c65dc7c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" Oct 08 12:58:16 crc kubenswrapper[4735]: I1008 12:58:16.742518 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-bqm6j" event={"ID":"477a3236-05fd-494b-8eb3-fd178d3180be","Type":"ContainerStarted","Data":"5a3c829f3ce27a32b933b14ba7c763daf925e7204a31d6faf795521aac6fa8ed"} Oct 08 12:58:16 crc kubenswrapper[4735]: I1008 12:58:16.743610 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-bqm6j" Oct 08 12:58:16 crc kubenswrapper[4735]: I1008 12:58:16.746730 4735 patch_prober.go:28] interesting pod/console-operator-58897d9998-bqm6j container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.27:8443/readyz\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body= Oct 08 12:58:16 crc kubenswrapper[4735]: I1008 12:58:16.746787 4735 
prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-bqm6j" podUID="477a3236-05fd-494b-8eb3-fd178d3180be" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.27:8443/readyz\": dial tcp 10.217.0.27:8443: connect: connection refused" Oct 08 12:58:16 crc kubenswrapper[4735]: I1008 12:58:16.754653 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xjxt2" event={"ID":"83bd6935-d7c7-4548-add9-647cdfcd15ee","Type":"ContainerStarted","Data":"be098fa36c710e9abd5fa0df87415b039ff7ab50885d0e9578ce547b08990e15"} Oct 08 12:58:16 crc kubenswrapper[4735]: I1008 12:58:16.773738 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bw544" event={"ID":"fdf78d81-ae87-48c8-b189-eff27030dd91","Type":"ContainerStarted","Data":"55d15f00693b76777e3ef158c143f1fd31d6d34d9075d2068bb5a85a930d3d78"} Oct 08 12:58:16 crc kubenswrapper[4735]: I1008 12:58:16.773782 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bw544" Oct 08 12:58:16 crc kubenswrapper[4735]: I1008 12:58:16.776467 4735 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-gp925 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/healthz\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Oct 08 12:58:16 crc kubenswrapper[4735]: I1008 12:58:16.776543 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-gp925" podUID="d169069b-e5fe-4018-974d-97cf1c1cad23" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.22:8080/healthz\": dial tcp 10.217.0.22:8080: connect: connection refused" 
Oct 08 12:58:16 crc kubenswrapper[4735]: I1008 12:58:16.778222 4735 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-8rqp7 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body= Oct 08 12:58:16 crc kubenswrapper[4735]: I1008 12:58:16.778309 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-8rqp7" podUID="15a08015-bcc0-4576-8f54-70e5693469a4" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" Oct 08 12:58:16 crc kubenswrapper[4735]: I1008 12:58:16.778385 4735 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-bw544 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.21:8443/healthz\": dial tcp 10.217.0.21:8443: connect: connection refused" start-of-body= Oct 08 12:58:16 crc kubenswrapper[4735]: I1008 12:58:16.778427 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bw544" podUID="fdf78d81-ae87-48c8-b189-eff27030dd91" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.21:8443/healthz\": dial tcp 10.217.0.21:8443: connect: connection refused" Oct 08 12:58:16 crc kubenswrapper[4735]: I1008 12:58:16.809977 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 12:58:16 crc kubenswrapper[4735]: E1008 12:58:16.811388 4735 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 12:58:17.311375029 +0000 UTC m=+147.119679573 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:16 crc kubenswrapper[4735]: I1008 12:58:16.826508 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c4j7s" podStartSLOduration=124.826487457 podStartE2EDuration="2m4.826487457s" podCreationTimestamp="2025-10-08 12:56:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 12:58:16.811680776 +0000 UTC m=+146.619985320" watchObservedRunningTime="2025-10-08 12:58:16.826487457 +0000 UTC m=+146.634792001" Oct 08 12:58:16 crc kubenswrapper[4735]: I1008 12:58:16.890393 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gq4dv" podStartSLOduration=124.890364251 podStartE2EDuration="2m4.890364251s" podCreationTimestamp="2025-10-08 12:56:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 12:58:16.880843706 +0000 UTC m=+146.689148260" watchObservedRunningTime="2025-10-08 12:58:16.890364251 +0000 UTC m=+146.698668805" Oct 08 12:58:16 crc 
kubenswrapper[4735]: I1008 12:58:16.913763 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgv9c\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c" Oct 08 12:58:16 crc kubenswrapper[4735]: E1008 12:58:16.914062 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 12:58:17.41405008 +0000 UTC m=+147.222354624 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgv9c" (UID: "b1f96523-d3ab-4768-8de6-c901e255ae74") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:16 crc kubenswrapper[4735]: I1008 12:58:16.974798 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-887mh" podStartSLOduration=125.974764111 podStartE2EDuration="2m5.974764111s" podCreationTimestamp="2025-10-08 12:56:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 12:58:16.973757436 +0000 UTC m=+146.782061980" watchObservedRunningTime="2025-10-08 12:58:16.974764111 +0000 UTC m=+146.783068655" Oct 08 12:58:16 crc kubenswrapper[4735]: I1008 12:58:16.975863 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xjxt2" podStartSLOduration=124.97585686 podStartE2EDuration="2m4.97585686s" podCreationTimestamp="2025-10-08 12:56:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 12:58:16.941272531 +0000 UTC m=+146.749577075" watchObservedRunningTime="2025-10-08 12:58:16.97585686 +0000 UTC m=+146.784161404" Oct 08 12:58:17 crc kubenswrapper[4735]: I1008 12:58:17.024143 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 12:58:17 crc kubenswrapper[4735]: E1008 12:58:17.024275 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 12:58:17.524247995 +0000 UTC m=+147.332552539 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:17 crc kubenswrapper[4735]: I1008 12:58:17.024906 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgv9c\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c" Oct 08 12:58:17 crc kubenswrapper[4735]: E1008 12:58:17.025264 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 12:58:17.52525135 +0000 UTC m=+147.333555894 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgv9c" (UID: "b1f96523-d3ab-4768-8de6-c901e255ae74") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:17 crc kubenswrapper[4735]: I1008 12:58:17.039578 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zmvkr" podStartSLOduration=125.039542638 podStartE2EDuration="2m5.039542638s" podCreationTimestamp="2025-10-08 12:56:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 12:58:17.009881725 +0000 UTC m=+146.818186269" watchObservedRunningTime="2025-10-08 12:58:17.039542638 +0000 UTC m=+146.847847182" Oct 08 12:58:17 crc kubenswrapper[4735]: I1008 12:58:17.086126 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6vdgs" podStartSLOduration=125.086092756 podStartE2EDuration="2m5.086092756s" podCreationTimestamp="2025-10-08 12:56:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 12:58:17.040360149 +0000 UTC m=+146.848664693" watchObservedRunningTime="2025-10-08 12:58:17.086092756 +0000 UTC m=+146.894397300" Oct 08 12:58:17 crc kubenswrapper[4735]: I1008 12:58:17.122136 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bw544" podStartSLOduration=125.122119813 podStartE2EDuration="2m5.122119813s" 
podCreationTimestamp="2025-10-08 12:56:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 12:58:17.087892562 +0000 UTC m=+146.896197106" watchObservedRunningTime="2025-10-08 12:58:17.122119813 +0000 UTC m=+146.930424357" Oct 08 12:58:17 crc kubenswrapper[4735]: I1008 12:58:17.126736 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 12:58:17 crc kubenswrapper[4735]: E1008 12:58:17.127122 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 12:58:17.627104331 +0000 UTC m=+147.435408875 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:17 crc kubenswrapper[4735]: I1008 12:58:17.150535 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gq4dv" Oct 08 12:58:17 crc kubenswrapper[4735]: I1008 12:58:17.150599 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gq4dv" Oct 08 12:58:17 crc kubenswrapper[4735]: I1008 12:58:17.228522 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgv9c\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c" Oct 08 12:58:17 crc kubenswrapper[4735]: E1008 12:58:17.228847 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 12:58:17.728835879 +0000 UTC m=+147.537140423 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgv9c" (UID: "b1f96523-d3ab-4768-8de6-c901e255ae74") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:17 crc kubenswrapper[4735]: I1008 12:58:17.329300 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 12:58:17 crc kubenswrapper[4735]: E1008 12:58:17.329468 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 12:58:17.829444167 +0000 UTC m=+147.637748701 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:17 crc kubenswrapper[4735]: I1008 12:58:17.329611 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgv9c\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c" Oct 08 12:58:17 crc kubenswrapper[4735]: E1008 12:58:17.329884 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 12:58:17.829872838 +0000 UTC m=+147.638177382 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgv9c" (UID: "b1f96523-d3ab-4768-8de6-c901e255ae74") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:17 crc kubenswrapper[4735]: I1008 12:58:17.412001 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rgtx4" Oct 08 12:58:17 crc kubenswrapper[4735]: I1008 12:58:17.430672 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 12:58:17 crc kubenswrapper[4735]: E1008 12:58:17.431061 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 12:58:17.931040031 +0000 UTC m=+147.739344575 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:17 crc kubenswrapper[4735]: I1008 12:58:17.431186 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgv9c\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c" Oct 08 12:58:17 crc kubenswrapper[4735]: E1008 12:58:17.431577 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 12:58:17.931565775 +0000 UTC m=+147.739870319 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgv9c" (UID: "b1f96523-d3ab-4768-8de6-c901e255ae74") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:17 crc kubenswrapper[4735]: I1008 12:58:17.440517 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-bqm6j" podStartSLOduration=126.440501535 podStartE2EDuration="2m6.440501535s" podCreationTimestamp="2025-10-08 12:56:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 12:58:17.124694759 +0000 UTC m=+146.932999303" watchObservedRunningTime="2025-10-08 12:58:17.440501535 +0000 UTC m=+147.248806079" Oct 08 12:58:17 crc kubenswrapper[4735]: I1008 12:58:17.529653 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-xqfrr" Oct 08 12:58:17 crc kubenswrapper[4735]: I1008 12:58:17.532645 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 12:58:17 crc kubenswrapper[4735]: E1008 12:58:17.533054 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-08 12:58:18.033035346 +0000 UTC m=+147.841339890 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:17 crc kubenswrapper[4735]: I1008 12:58:17.575053 4735 patch_prober.go:28] interesting pod/router-default-5444994796-jgkq6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 08 12:58:17 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld Oct 08 12:58:17 crc kubenswrapper[4735]: [+]process-running ok Oct 08 12:58:17 crc kubenswrapper[4735]: healthz check failed Oct 08 12:58:17 crc kubenswrapper[4735]: I1008 12:58:17.575116 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jgkq6" podUID="c4f3bfe2-c8ee-4ac0-915d-2be166ccbb6f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 08 12:58:17 crc kubenswrapper[4735]: I1008 12:58:17.634065 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgv9c\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c" Oct 08 12:58:17 crc kubenswrapper[4735]: E1008 12:58:17.634515 4735 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 12:58:18.134497496 +0000 UTC m=+147.942802040 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgv9c" (UID: "b1f96523-d3ab-4768-8de6-c901e255ae74") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:17 crc kubenswrapper[4735]: I1008 12:58:17.734845 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 12:58:17 crc kubenswrapper[4735]: E1008 12:58:17.735276 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 12:58:18.235254879 +0000 UTC m=+148.043559433 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:17 crc kubenswrapper[4735]: I1008 12:58:17.779743 4735 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-cfmds container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 08 12:58:17 crc kubenswrapper[4735]: I1008 12:58:17.779825 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfmds" podUID="d95c4bbd-337f-454c-a57f-c47db8bc6ecb" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 08 12:58:17 crc kubenswrapper[4735]: I1008 12:58:17.783513 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-dc7kk" event={"ID":"e100c510-2957-4946-b21d-e9227e32d7b0","Type":"ContainerStarted","Data":"ef0a349793deec287df9df6c76485490d42cf284145775e9557957047f177cd4"} Oct 08 12:58:17 crc kubenswrapper[4735]: I1008 12:58:17.786215 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jnr7v" event={"ID":"61708c63-4a5a-4e83-8243-02025b05df64","Type":"ContainerStarted","Data":"537285fed9e491f7cdea6bc3e860a7a84e3176228409d77331147eb8ad7ed71b"} Oct 08 
12:58:17 crc kubenswrapper[4735]: I1008 12:58:17.786726 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jnr7v" Oct 08 12:58:17 crc kubenswrapper[4735]: I1008 12:58:17.788614 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-zxtmj" event={"ID":"c881e3aa-6bfe-476c-9e63-edfa5480745d","Type":"ContainerStarted","Data":"330a1c3a5ab25b757dc4907022f0523634b68e2d030445f05e310e5413c1b5e9"} Oct 08 12:58:17 crc kubenswrapper[4735]: I1008 12:58:17.791231 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-csnmr" event={"ID":"0b7e3bce-3aca-4e94-8d98-11009fc0ab3e","Type":"ContainerStarted","Data":"dc4d9b44675c9b1573cc2f8385abb3c8b38da2687053f3cbe4566a1b156aad66"} Oct 08 12:58:17 crc kubenswrapper[4735]: I1008 12:58:17.791256 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-csnmr" Oct 08 12:58:17 crc kubenswrapper[4735]: I1008 12:58:17.793140 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7v5q4" event={"ID":"543b04b8-1dda-487e-9382-ba18d7be25e5","Type":"ContainerStarted","Data":"5e939179c7884b6322e59173870be2a961d80a92441c697cab4fb9df62d278c0"} Oct 08 12:58:17 crc kubenswrapper[4735]: I1008 12:58:17.797908 4735 generic.go:334] "Generic (PLEG): container finished" podID="93cce0d5-b158-4619-95f3-eaa50b8d9af4" containerID="5389c3006c64b00e2fd3278755a075217ee591a2311848de5bb48a47583d90b7" exitCode=0 Oct 08 12:58:17 crc kubenswrapper[4735]: I1008 12:58:17.798066 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332125-6dxj9" event={"ID":"93cce0d5-b158-4619-95f3-eaa50b8d9af4","Type":"ContainerDied","Data":"5389c3006c64b00e2fd3278755a075217ee591a2311848de5bb48a47583d90b7"} Oct 08 12:58:17 crc kubenswrapper[4735]: I1008 
12:58:17.800140 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4drk9" event={"ID":"87ee9fbe-46aa-4075-b289-ef083cea461b","Type":"ContainerStarted","Data":"957087d02697d8e578c3dddd4d448ad1c6dad7cc167058c0c83ea64d95142a71"} Oct 08 12:58:17 crc kubenswrapper[4735]: I1008 12:58:17.805211 4735 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-gp925 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/healthz\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Oct 08 12:58:17 crc kubenswrapper[4735]: I1008 12:58:17.805257 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-gp925" podUID="d169069b-e5fe-4018-974d-97cf1c1cad23" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.22:8080/healthz\": dial tcp 10.217.0.22:8080: connect: connection refused" Oct 08 12:58:17 crc kubenswrapper[4735]: I1008 12:58:17.806677 4735 patch_prober.go:28] interesting pod/console-operator-58897d9998-bqm6j container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.27:8443/readyz\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body= Oct 08 12:58:17 crc kubenswrapper[4735]: I1008 12:58:17.806709 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-bqm6j" podUID="477a3236-05fd-494b-8eb3-fd178d3180be" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.27:8443/readyz\": dial tcp 10.217.0.27:8443: connect: connection refused" Oct 08 12:58:17 crc kubenswrapper[4735]: I1008 12:58:17.807919 4735 patch_prober.go:28] interesting pod/downloads-7954f5f757-887mh container/download-server namespace/openshift-console: Readiness probe status=failure 
output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body= Oct 08 12:58:17 crc kubenswrapper[4735]: I1008 12:58:17.807952 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-887mh" podUID="14f021da-25d5-4a75-82f3-b6c20c65dc7c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" Oct 08 12:58:17 crc kubenswrapper[4735]: I1008 12:58:17.822174 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-dc7kk" podStartSLOduration=125.822156975 podStartE2EDuration="2m5.822156975s" podCreationTimestamp="2025-10-08 12:56:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 12:58:17.818025158 +0000 UTC m=+147.626329702" watchObservedRunningTime="2025-10-08 12:58:17.822156975 +0000 UTC m=+147.630461519" Oct 08 12:58:17 crc kubenswrapper[4735]: I1008 12:58:17.828157 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bw544" Oct 08 12:58:17 crc kubenswrapper[4735]: I1008 12:58:17.836198 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgv9c\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c" Oct 08 12:58:17 crc kubenswrapper[4735]: E1008 12:58:17.836575 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-08 12:58:18.336565195 +0000 UTC m=+148.144869739 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgv9c" (UID: "b1f96523-d3ab-4768-8de6-c901e255ae74") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:17 crc kubenswrapper[4735]: I1008 12:58:17.938142 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 12:58:17 crc kubenswrapper[4735]: E1008 12:58:17.942312 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 12:58:18.442288486 +0000 UTC m=+148.250593030 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:17 crc kubenswrapper[4735]: I1008 12:58:17.949024 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgv9c\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c" Oct 08 12:58:17 crc kubenswrapper[4735]: E1008 12:58:17.953468 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 12:58:18.453443433 +0000 UTC m=+148.261747977 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgv9c" (UID: "b1f96523-d3ab-4768-8de6-c901e255ae74") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:17 crc kubenswrapper[4735]: I1008 12:58:17.988744 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-csnmr" podStartSLOduration=8.98872256 podStartE2EDuration="8.98872256s" podCreationTimestamp="2025-10-08 12:58:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 12:58:17.98714256 +0000 UTC m=+147.795447114" watchObservedRunningTime="2025-10-08 12:58:17.98872256 +0000 UTC m=+147.797027104" Oct 08 12:58:18 crc kubenswrapper[4735]: I1008 12:58:18.030890 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jnr7v" podStartSLOduration=127.030867825 podStartE2EDuration="2m7.030867825s" podCreationTimestamp="2025-10-08 12:56:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 12:58:18.029879669 +0000 UTC m=+147.838184213" watchObservedRunningTime="2025-10-08 12:58:18.030867825 +0000 UTC m=+147.839172369" Oct 08 12:58:18 crc kubenswrapper[4735]: I1008 12:58:18.061359 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 12:58:18 crc kubenswrapper[4735]: E1008 12:58:18.061863 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 12:58:18.561840542 +0000 UTC m=+148.370145086 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:18 crc kubenswrapper[4735]: I1008 12:58:18.119986 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gq4dv" Oct 08 12:58:18 crc kubenswrapper[4735]: I1008 12:58:18.124795 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-zxtmj" podStartSLOduration=127.124775511 podStartE2EDuration="2m7.124775511s" podCreationTimestamp="2025-10-08 12:56:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 12:58:18.089175505 +0000 UTC m=+147.897480049" watchObservedRunningTime="2025-10-08 12:58:18.124775511 +0000 UTC m=+147.933080055" Oct 08 12:58:18 crc kubenswrapper[4735]: I1008 12:58:18.162847 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-vgv9c\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c" Oct 08 12:58:18 crc kubenswrapper[4735]: E1008 12:58:18.163199 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 12:58:18.663184939 +0000 UTC m=+148.471489483 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgv9c" (UID: "b1f96523-d3ab-4768-8de6-c901e255ae74") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:18 crc kubenswrapper[4735]: I1008 12:58:18.263997 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 12:58:18 crc kubenswrapper[4735]: E1008 12:58:18.264173 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 12:58:18.764146447 +0000 UTC m=+148.572450991 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:18 crc kubenswrapper[4735]: I1008 12:58:18.264526 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgv9c\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c" Oct 08 12:58:18 crc kubenswrapper[4735]: E1008 12:58:18.264911 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 12:58:18.764900756 +0000 UTC m=+148.573205370 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgv9c" (UID: "b1f96523-d3ab-4768-8de6-c901e255ae74") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:18 crc kubenswrapper[4735]: I1008 12:58:18.366165 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 12:58:18 crc kubenswrapper[4735]: E1008 12:58:18.366403 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 12:58:18.866364047 +0000 UTC m=+148.674668591 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:18 crc kubenswrapper[4735]: I1008 12:58:18.366850 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgv9c\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c" Oct 08 12:58:18 crc kubenswrapper[4735]: E1008 12:58:18.367309 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 12:58:18.867292671 +0000 UTC m=+148.675597215 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgv9c" (UID: "b1f96523-d3ab-4768-8de6-c901e255ae74") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:18 crc kubenswrapper[4735]: I1008 12:58:18.467774 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 12:58:18 crc kubenswrapper[4735]: E1008 12:58:18.467874 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 12:58:18.967859718 +0000 UTC m=+148.776164262 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:18 crc kubenswrapper[4735]: I1008 12:58:18.468064 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgv9c\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c" Oct 08 12:58:18 crc kubenswrapper[4735]: E1008 12:58:18.468360 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 12:58:18.968351711 +0000 UTC m=+148.776656255 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgv9c" (UID: "b1f96523-d3ab-4768-8de6-c901e255ae74") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:18 crc kubenswrapper[4735]: I1008 12:58:18.522737 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfmds" Oct 08 12:58:18 crc kubenswrapper[4735]: I1008 12:58:18.569207 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 12:58:18 crc kubenswrapper[4735]: E1008 12:58:18.569592 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 12:58:19.069577056 +0000 UTC m=+148.877881600 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:18 crc kubenswrapper[4735]: I1008 12:58:18.573622 4735 patch_prober.go:28] interesting pod/router-default-5444994796-jgkq6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 08 12:58:18 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld Oct 08 12:58:18 crc kubenswrapper[4735]: [+]process-running ok Oct 08 12:58:18 crc kubenswrapper[4735]: healthz check failed Oct 08 12:58:18 crc kubenswrapper[4735]: I1008 12:58:18.573676 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jgkq6" podUID="c4f3bfe2-c8ee-4ac0-915d-2be166ccbb6f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 08 12:58:18 crc kubenswrapper[4735]: I1008 12:58:18.670531 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgv9c\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c" Oct 08 12:58:18 crc kubenswrapper[4735]: E1008 12:58:18.670857 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-08 12:58:19.170846401 +0000 UTC m=+148.979150945 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgv9c" (UID: "b1f96523-d3ab-4768-8de6-c901e255ae74") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:18 crc kubenswrapper[4735]: I1008 12:58:18.771234 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 12:58:18 crc kubenswrapper[4735]: I1008 12:58:18.771512 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:58:18 crc kubenswrapper[4735]: I1008 12:58:18.771603 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:58:18 crc kubenswrapper[4735]: I1008 12:58:18.771646 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" 
(UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:58:18 crc kubenswrapper[4735]: E1008 12:58:18.772834 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 12:58:19.272791034 +0000 UTC m=+149.081095578 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:18 crc kubenswrapper[4735]: I1008 12:58:18.776633 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:58:18 crc kubenswrapper[4735]: I1008 12:58:18.778695 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" 
Oct 08 12:58:18 crc kubenswrapper[4735]: I1008 12:58:18.778711 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:58:18 crc kubenswrapper[4735]: I1008 12:58:18.807220 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4drk9" event={"ID":"87ee9fbe-46aa-4075-b289-ef083cea461b","Type":"ContainerStarted","Data":"8ce3fd995a43996d3b3f1b1be62234e3069931a75b5e0d1249f6cd51cea47cc6"} Oct 08 12:58:18 crc kubenswrapper[4735]: I1008 12:58:18.807651 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4drk9" event={"ID":"87ee9fbe-46aa-4075-b289-ef083cea461b","Type":"ContainerStarted","Data":"4ef1461ac9cdc75acd921cd9685aff51f53e57cf4dd12ccc1d32425d32cff7ec"} Oct 08 12:58:18 crc kubenswrapper[4735]: I1008 12:58:18.808152 4735 patch_prober.go:28] interesting pod/downloads-7954f5f757-887mh container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body= Oct 08 12:58:18 crc kubenswrapper[4735]: I1008 12:58:18.808202 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-887mh" podUID="14f021da-25d5-4a75-82f3-b6c20c65dc7c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" Oct 08 12:58:18 crc kubenswrapper[4735]: I1008 12:58:18.819408 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:58:18 crc kubenswrapper[4735]: I1008 12:58:18.825904 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gq4dv" Oct 08 12:58:18 crc kubenswrapper[4735]: I1008 12:58:18.833759 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 12:58:18 crc kubenswrapper[4735]: I1008 12:58:18.873162 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgv9c\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c" Oct 08 12:58:18 crc kubenswrapper[4735]: I1008 12:58:18.873245 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 12:58:18 crc kubenswrapper[4735]: E1008 12:58:18.873822 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 12:58:19.373803203 +0000 UTC m=+149.182107817 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgv9c" (UID: "b1f96523-d3ab-4768-8de6-c901e255ae74") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:18 crc kubenswrapper[4735]: I1008 12:58:18.902903 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 12:58:18 crc kubenswrapper[4735]: I1008 12:58:18.974145 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 12:58:18 crc kubenswrapper[4735]: E1008 12:58:18.976152 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 12:58:19.476132646 +0000 UTC m=+149.284437190 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.078694 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgv9c\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c" Oct 08 12:58:19 crc kubenswrapper[4735]: E1008 12:58:19.079165 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 12:58:19.579149157 +0000 UTC m=+149.387453691 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgv9c" (UID: "b1f96523-d3ab-4768-8de6-c901e255ae74") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.114063 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.180014 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 12:58:19 crc kubenswrapper[4735]: E1008 12:58:19.180436 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 12:58:19.680416472 +0000 UTC m=+149.488721016 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.307466 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgv9c\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c" Oct 08 12:58:19 crc kubenswrapper[4735]: E1008 12:58:19.308330 4735 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 12:58:19.808298982 +0000 UTC m=+149.616603526 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgv9c" (UID: "b1f96523-d3ab-4768-8de6-c901e255ae74") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.383769 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-d5hv5"] Oct 08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.385056 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d5hv5" Oct 08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.397463 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.412903 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.413385 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33b36b08-22c3-49b3-805e-bd38f2235a1e-utilities\") pod \"certified-operators-d5hv5\" (UID: 
\"33b36b08-22c3-49b3-805e-bd38f2235a1e\") " pod="openshift-marketplace/certified-operators-d5hv5" Oct 08 12:58:19 crc kubenswrapper[4735]: E1008 12:58:19.413459 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 12:58:19.913417327 +0000 UTC m=+149.721721871 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.413572 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33b36b08-22c3-49b3-805e-bd38f2235a1e-catalog-content\") pod \"certified-operators-d5hv5\" (UID: \"33b36b08-22c3-49b3-805e-bd38f2235a1e\") " pod="openshift-marketplace/certified-operators-d5hv5" Oct 08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.413652 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xtd2\" (UniqueName: \"kubernetes.io/projected/33b36b08-22c3-49b3-805e-bd38f2235a1e-kube-api-access-7xtd2\") pod \"certified-operators-d5hv5\" (UID: \"33b36b08-22c3-49b3-805e-bd38f2235a1e\") " pod="openshift-marketplace/certified-operators-d5hv5" Oct 08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.413695 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgv9c\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c" Oct 08 12:58:19 crc kubenswrapper[4735]: E1008 12:58:19.414139 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 12:58:19.914127125 +0000 UTC m=+149.722431669 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgv9c" (UID: "b1f96523-d3ab-4768-8de6-c901e255ae74") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.464079 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d5hv5"] Oct 08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.501997 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332125-6dxj9" Oct 08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.515646 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9rdp\" (UniqueName: \"kubernetes.io/projected/93cce0d5-b158-4619-95f3-eaa50b8d9af4-kube-api-access-r9rdp\") pod \"93cce0d5-b158-4619-95f3-eaa50b8d9af4\" (UID: \"93cce0d5-b158-4619-95f3-eaa50b8d9af4\") " Oct 08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.515767 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/93cce0d5-b158-4619-95f3-eaa50b8d9af4-config-volume\") pod \"93cce0d5-b158-4619-95f3-eaa50b8d9af4\" (UID: \"93cce0d5-b158-4619-95f3-eaa50b8d9af4\") " Oct 08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.515802 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/93cce0d5-b158-4619-95f3-eaa50b8d9af4-secret-volume\") pod \"93cce0d5-b158-4619-95f3-eaa50b8d9af4\" (UID: \"93cce0d5-b158-4619-95f3-eaa50b8d9af4\") " Oct 08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.516005 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.516231 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33b36b08-22c3-49b3-805e-bd38f2235a1e-catalog-content\") pod \"certified-operators-d5hv5\" (UID: \"33b36b08-22c3-49b3-805e-bd38f2235a1e\") " pod="openshift-marketplace/certified-operators-d5hv5" Oct 08 12:58:19 crc 
kubenswrapper[4735]: I1008 12:58:19.516259 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xtd2\" (UniqueName: \"kubernetes.io/projected/33b36b08-22c3-49b3-805e-bd38f2235a1e-kube-api-access-7xtd2\") pod \"certified-operators-d5hv5\" (UID: \"33b36b08-22c3-49b3-805e-bd38f2235a1e\") " pod="openshift-marketplace/certified-operators-d5hv5" Oct 08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.516309 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33b36b08-22c3-49b3-805e-bd38f2235a1e-utilities\") pod \"certified-operators-d5hv5\" (UID: \"33b36b08-22c3-49b3-805e-bd38f2235a1e\") " pod="openshift-marketplace/certified-operators-d5hv5" Oct 08 12:58:19 crc kubenswrapper[4735]: E1008 12:58:19.518557 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 12:58:20.018509951 +0000 UTC m=+149.826814495 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.518927 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33b36b08-22c3-49b3-805e-bd38f2235a1e-utilities\") pod \"certified-operators-d5hv5\" (UID: \"33b36b08-22c3-49b3-805e-bd38f2235a1e\") " pod="openshift-marketplace/certified-operators-d5hv5" Oct 08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.519072 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33b36b08-22c3-49b3-805e-bd38f2235a1e-catalog-content\") pod \"certified-operators-d5hv5\" (UID: \"33b36b08-22c3-49b3-805e-bd38f2235a1e\") " pod="openshift-marketplace/certified-operators-d5hv5" Oct 08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.519267 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93cce0d5-b158-4619-95f3-eaa50b8d9af4-config-volume" (OuterVolumeSpecName: "config-volume") pod "93cce0d5-b158-4619-95f3-eaa50b8d9af4" (UID: "93cce0d5-b158-4619-95f3-eaa50b8d9af4"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.551362 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93cce0d5-b158-4619-95f3-eaa50b8d9af4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "93cce0d5-b158-4619-95f3-eaa50b8d9af4" (UID: "93cce0d5-b158-4619-95f3-eaa50b8d9af4"). 
InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.554749 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93cce0d5-b158-4619-95f3-eaa50b8d9af4-kube-api-access-r9rdp" (OuterVolumeSpecName: "kube-api-access-r9rdp") pod "93cce0d5-b158-4619-95f3-eaa50b8d9af4" (UID: "93cce0d5-b158-4619-95f3-eaa50b8d9af4"). InnerVolumeSpecName "kube-api-access-r9rdp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.563663 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xtd2\" (UniqueName: \"kubernetes.io/projected/33b36b08-22c3-49b3-805e-bd38f2235a1e-kube-api-access-7xtd2\") pod \"certified-operators-d5hv5\" (UID: \"33b36b08-22c3-49b3-805e-bd38f2235a1e\") " pod="openshift-marketplace/certified-operators-d5hv5" Oct 08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.568791 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-bqm6j" Oct 08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.586805 4735 patch_prober.go:28] interesting pod/router-default-5444994796-jgkq6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 08 12:58:19 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld Oct 08 12:58:19 crc kubenswrapper[4735]: [+]process-running ok Oct 08 12:58:19 crc kubenswrapper[4735]: healthz check failed Oct 08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.586887 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jgkq6" podUID="c4f3bfe2-c8ee-4ac0-915d-2be166ccbb6f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 08 
12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.593520 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tct8h"] Oct 08 12:58:19 crc kubenswrapper[4735]: E1008 12:58:19.593803 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93cce0d5-b158-4619-95f3-eaa50b8d9af4" containerName="collect-profiles" Oct 08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.593819 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="93cce0d5-b158-4619-95f3-eaa50b8d9af4" containerName="collect-profiles" Oct 08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.593959 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="93cce0d5-b158-4619-95f3-eaa50b8d9af4" containerName="collect-profiles" Oct 08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.601287 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tct8h" Oct 08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.639934 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.641425 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r79sh\" (UniqueName: \"kubernetes.io/projected/6a47c541-ef48-4097-843e-609ef561cadc-kube-api-access-r79sh\") pod \"community-operators-tct8h\" (UID: \"6a47c541-ef48-4097-843e-609ef561cadc\") " pod="openshift-marketplace/community-operators-tct8h" Oct 08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.641542 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a47c541-ef48-4097-843e-609ef561cadc-catalog-content\") pod \"community-operators-tct8h\" (UID: \"6a47c541-ef48-4097-843e-609ef561cadc\") " pod="openshift-marketplace/community-operators-tct8h" Oct 
08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.641603 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgv9c\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c" Oct 08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.641812 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a47c541-ef48-4097-843e-609ef561cadc-utilities\") pod \"community-operators-tct8h\" (UID: \"6a47c541-ef48-4097-843e-609ef561cadc\") " pod="openshift-marketplace/community-operators-tct8h" Oct 08 12:58:19 crc kubenswrapper[4735]: E1008 12:58:19.642391 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 12:58:20.142373778 +0000 UTC m=+149.950678322 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgv9c" (UID: "b1f96523-d3ab-4768-8de6-c901e255ae74") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.685012 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9rdp\" (UniqueName: \"kubernetes.io/projected/93cce0d5-b158-4619-95f3-eaa50b8d9af4-kube-api-access-r9rdp\") on node \"crc\" DevicePath \"\"" Oct 08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.685159 4735 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/93cce0d5-b158-4619-95f3-eaa50b8d9af4-config-volume\") on node \"crc\" DevicePath \"\"" Oct 08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.685186 4735 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/93cce0d5-b158-4619-95f3-eaa50b8d9af4-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.715666 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tct8h"] Oct 08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.721126 4735 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Oct 08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.736594 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-d5hv5" Oct 08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.763971 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mgkfd"] Oct 08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.784763 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mgkfd" Oct 08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.786199 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.786413 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76nn5\" (UniqueName: \"kubernetes.io/projected/cbe9af65-d05a-4224-a554-361d0d7e9e11-kube-api-access-76nn5\") pod \"certified-operators-mgkfd\" (UID: \"cbe9af65-d05a-4224-a554-361d0d7e9e11\") " pod="openshift-marketplace/certified-operators-mgkfd" Oct 08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.786458 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r79sh\" (UniqueName: \"kubernetes.io/projected/6a47c541-ef48-4097-843e-609ef561cadc-kube-api-access-r79sh\") pod \"community-operators-tct8h\" (UID: \"6a47c541-ef48-4097-843e-609ef561cadc\") " pod="openshift-marketplace/community-operators-tct8h" Oct 08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.786499 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a47c541-ef48-4097-843e-609ef561cadc-catalog-content\") pod \"community-operators-tct8h\" (UID: 
\"6a47c541-ef48-4097-843e-609ef561cadc\") " pod="openshift-marketplace/community-operators-tct8h" Oct 08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.786539 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbe9af65-d05a-4224-a554-361d0d7e9e11-utilities\") pod \"certified-operators-mgkfd\" (UID: \"cbe9af65-d05a-4224-a554-361d0d7e9e11\") " pod="openshift-marketplace/certified-operators-mgkfd" Oct 08 12:58:19 crc kubenswrapper[4735]: E1008 12:58:19.786563 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 12:58:20.286537127 +0000 UTC m=+150.094841671 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.786612 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbe9af65-d05a-4224-a554-361d0d7e9e11-catalog-content\") pod \"certified-operators-mgkfd\" (UID: \"cbe9af65-d05a-4224-a554-361d0d7e9e11\") " pod="openshift-marketplace/certified-operators-mgkfd" Oct 08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.786657 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/6a47c541-ef48-4097-843e-609ef561cadc-utilities\") pod \"community-operators-tct8h\" (UID: \"6a47c541-ef48-4097-843e-609ef561cadc\") " pod="openshift-marketplace/community-operators-tct8h" Oct 08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.787217 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a47c541-ef48-4097-843e-609ef561cadc-utilities\") pod \"community-operators-tct8h\" (UID: \"6a47c541-ef48-4097-843e-609ef561cadc\") " pod="openshift-marketplace/community-operators-tct8h" Oct 08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.795420 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a47c541-ef48-4097-843e-609ef561cadc-catalog-content\") pod \"community-operators-tct8h\" (UID: \"6a47c541-ef48-4097-843e-609ef561cadc\") " pod="openshift-marketplace/community-operators-tct8h" Oct 08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.806417 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jnr7v" Oct 08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.844611 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r79sh\" (UniqueName: \"kubernetes.io/projected/6a47c541-ef48-4097-843e-609ef561cadc-kube-api-access-r79sh\") pod \"community-operators-tct8h\" (UID: \"6a47c541-ef48-4097-843e-609ef561cadc\") " pod="openshift-marketplace/community-operators-tct8h" Oct 08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.851191 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mgkfd"] Oct 08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.866143 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332125-6dxj9" 
event={"ID":"93cce0d5-b158-4619-95f3-eaa50b8d9af4","Type":"ContainerDied","Data":"20cc7e9239d8b1dd8498e5eb59fc81456f5e70be7f7e1bc377442d076aedb434"} Oct 08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.866185 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20cc7e9239d8b1dd8498e5eb59fc81456f5e70be7f7e1bc377442d076aedb434" Oct 08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.866300 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332125-6dxj9" Oct 08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.898303 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76nn5\" (UniqueName: \"kubernetes.io/projected/cbe9af65-d05a-4224-a554-361d0d7e9e11-kube-api-access-76nn5\") pod \"certified-operators-mgkfd\" (UID: \"cbe9af65-d05a-4224-a554-361d0d7e9e11\") " pod="openshift-marketplace/certified-operators-mgkfd" Oct 08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.898404 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgv9c\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c" Oct 08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.898440 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbe9af65-d05a-4224-a554-361d0d7e9e11-utilities\") pod \"certified-operators-mgkfd\" (UID: \"cbe9af65-d05a-4224-a554-361d0d7e9e11\") " pod="openshift-marketplace/certified-operators-mgkfd" Oct 08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.898458 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/cbe9af65-d05a-4224-a554-361d0d7e9e11-catalog-content\") pod \"certified-operators-mgkfd\" (UID: \"cbe9af65-d05a-4224-a554-361d0d7e9e11\") " pod="openshift-marketplace/certified-operators-mgkfd" Oct 08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.898866 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbe9af65-d05a-4224-a554-361d0d7e9e11-catalog-content\") pod \"certified-operators-mgkfd\" (UID: \"cbe9af65-d05a-4224-a554-361d0d7e9e11\") " pod="openshift-marketplace/certified-operators-mgkfd" Oct 08 12:58:19 crc kubenswrapper[4735]: E1008 12:58:19.899352 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 12:58:20.39934063 +0000 UTC m=+150.207645174 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgv9c" (UID: "b1f96523-d3ab-4768-8de6-c901e255ae74") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.899619 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbe9af65-d05a-4224-a554-361d0d7e9e11-utilities\") pod \"certified-operators-mgkfd\" (UID: \"cbe9af65-d05a-4224-a554-361d0d7e9e11\") " pod="openshift-marketplace/certified-operators-mgkfd" Oct 08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.914447 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rzs56"] Oct 08 
12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.916200 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rzs56" Oct 08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.950608 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rzs56"] Oct 08 12:58:19 crc kubenswrapper[4735]: I1008 12:58:19.964459 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tct8h" Oct 08 12:58:20 crc kubenswrapper[4735]: I1008 12:58:20.002605 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 12:58:20 crc kubenswrapper[4735]: I1008 12:58:20.002868 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb3785a1-c82f-48bd-8032-dee94a15b75a-utilities\") pod \"community-operators-rzs56\" (UID: \"fb3785a1-c82f-48bd-8032-dee94a15b75a\") " pod="openshift-marketplace/community-operators-rzs56" Oct 08 12:58:20 crc kubenswrapper[4735]: I1008 12:58:20.002965 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb3785a1-c82f-48bd-8032-dee94a15b75a-catalog-content\") pod \"community-operators-rzs56\" (UID: \"fb3785a1-c82f-48bd-8032-dee94a15b75a\") " pod="openshift-marketplace/community-operators-rzs56" Oct 08 12:58:20 crc kubenswrapper[4735]: I1008 12:58:20.002990 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5blsh\" (UniqueName: 
\"kubernetes.io/projected/fb3785a1-c82f-48bd-8032-dee94a15b75a-kube-api-access-5blsh\") pod \"community-operators-rzs56\" (UID: \"fb3785a1-c82f-48bd-8032-dee94a15b75a\") " pod="openshift-marketplace/community-operators-rzs56" Oct 08 12:58:20 crc kubenswrapper[4735]: E1008 12:58:20.004189 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 12:58:20.504167217 +0000 UTC m=+150.312471761 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:20 crc kubenswrapper[4735]: I1008 12:58:20.007230 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76nn5\" (UniqueName: \"kubernetes.io/projected/cbe9af65-d05a-4224-a554-361d0d7e9e11-kube-api-access-76nn5\") pod \"certified-operators-mgkfd\" (UID: \"cbe9af65-d05a-4224-a554-361d0d7e9e11\") " pod="openshift-marketplace/certified-operators-mgkfd" Oct 08 12:58:20 crc kubenswrapper[4735]: I1008 12:58:20.069041 4735 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-10-08T12:58:19.721148755Z","Handler":null,"Name":""} Oct 08 12:58:20 crc kubenswrapper[4735]: I1008 12:58:20.104582 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/fb3785a1-c82f-48bd-8032-dee94a15b75a-utilities\") pod \"community-operators-rzs56\" (UID: \"fb3785a1-c82f-48bd-8032-dee94a15b75a\") " pod="openshift-marketplace/community-operators-rzs56" Oct 08 12:58:20 crc kubenswrapper[4735]: I1008 12:58:20.107493 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb3785a1-c82f-48bd-8032-dee94a15b75a-catalog-content\") pod \"community-operators-rzs56\" (UID: \"fb3785a1-c82f-48bd-8032-dee94a15b75a\") " pod="openshift-marketplace/community-operators-rzs56" Oct 08 12:58:20 crc kubenswrapper[4735]: I1008 12:58:20.107528 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5blsh\" (UniqueName: \"kubernetes.io/projected/fb3785a1-c82f-48bd-8032-dee94a15b75a-kube-api-access-5blsh\") pod \"community-operators-rzs56\" (UID: \"fb3785a1-c82f-48bd-8032-dee94a15b75a\") " pod="openshift-marketplace/community-operators-rzs56" Oct 08 12:58:20 crc kubenswrapper[4735]: I1008 12:58:20.107643 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgv9c\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c" Oct 08 12:58:20 crc kubenswrapper[4735]: I1008 12:58:20.109435 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb3785a1-c82f-48bd-8032-dee94a15b75a-catalog-content\") pod \"community-operators-rzs56\" (UID: \"fb3785a1-c82f-48bd-8032-dee94a15b75a\") " pod="openshift-marketplace/community-operators-rzs56" Oct 08 12:58:20 crc kubenswrapper[4735]: I1008 12:58:20.109840 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb3785a1-c82f-48bd-8032-dee94a15b75a-utilities\") pod \"community-operators-rzs56\" (UID: \"fb3785a1-c82f-48bd-8032-dee94a15b75a\") " pod="openshift-marketplace/community-operators-rzs56" Oct 08 12:58:20 crc kubenswrapper[4735]: E1008 12:58:20.123181 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 12:58:20.623159979 +0000 UTC m=+150.431464513 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vgv9c" (UID: "b1f96523-d3ab-4768-8de6-c901e255ae74") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 12:58:20 crc kubenswrapper[4735]: I1008 12:58:20.134141 4735 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Oct 08 12:58:20 crc kubenswrapper[4735]: I1008 12:58:20.134183 4735 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Oct 08 12:58:20 crc kubenswrapper[4735]: I1008 12:58:20.142085 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mgkfd" Oct 08 12:58:20 crc kubenswrapper[4735]: I1008 12:58:20.154579 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5blsh\" (UniqueName: \"kubernetes.io/projected/fb3785a1-c82f-48bd-8032-dee94a15b75a-kube-api-access-5blsh\") pod \"community-operators-rzs56\" (UID: \"fb3785a1-c82f-48bd-8032-dee94a15b75a\") " pod="openshift-marketplace/community-operators-rzs56" Oct 08 12:58:20 crc kubenswrapper[4735]: I1008 12:58:20.216135 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 12:58:20 crc kubenswrapper[4735]: I1008 12:58:20.237750 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 08 12:58:20 crc kubenswrapper[4735]: I1008 12:58:20.268838 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rzs56" Oct 08 12:58:20 crc kubenswrapper[4735]: I1008 12:58:20.318360 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgv9c\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c" Oct 08 12:58:20 crc kubenswrapper[4735]: I1008 12:58:20.340343 4735 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 08 12:58:20 crc kubenswrapper[4735]: I1008 12:58:20.340382 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgv9c\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c" Oct 08 12:58:20 crc kubenswrapper[4735]: I1008 12:58:20.354256 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d5hv5"] Oct 08 12:58:20 crc kubenswrapper[4735]: W1008 12:58:20.458028 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33b36b08_22c3_49b3_805e_bd38f2235a1e.slice/crio-00895780c53dbbd3c1799ec7c22c867c482d8d4c31de9833428314f3d37115d3 WatchSource:0}: Error finding container 00895780c53dbbd3c1799ec7c22c867c482d8d4c31de9833428314f3d37115d3: Status 404 returned error can't find the container with id 
00895780c53dbbd3c1799ec7c22c867c482d8d4c31de9833428314f3d37115d3 Oct 08 12:58:20 crc kubenswrapper[4735]: I1008 12:58:20.491382 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vgv9c\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c" Oct 08 12:58:20 crc kubenswrapper[4735]: I1008 12:58:20.516555 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tct8h"] Oct 08 12:58:20 crc kubenswrapper[4735]: I1008 12:58:20.522882 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c" Oct 08 12:58:20 crc kubenswrapper[4735]: I1008 12:58:20.629767 4735 patch_prober.go:28] interesting pod/router-default-5444994796-jgkq6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 08 12:58:20 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld Oct 08 12:58:20 crc kubenswrapper[4735]: [+]process-running ok Oct 08 12:58:20 crc kubenswrapper[4735]: healthz check failed Oct 08 12:58:20 crc kubenswrapper[4735]: I1008 12:58:20.629832 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jgkq6" podUID="c4f3bfe2-c8ee-4ac0-915d-2be166ccbb6f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 08 12:58:20 crc kubenswrapper[4735]: W1008 12:58:20.653661 4735 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a47c541_ef48_4097_843e_609ef561cadc.slice/crio-6ccc82be60b1d17671654a0ed99093d32d68112607ce2a6e306710e2f4f852fe WatchSource:0}: Error finding container 6ccc82be60b1d17671654a0ed99093d32d68112607ce2a6e306710e2f4f852fe: Status 404 returned error can't find the container with id 6ccc82be60b1d17671654a0ed99093d32d68112607ce2a6e306710e2f4f852fe Oct 08 12:58:20 crc kubenswrapper[4735]: I1008 12:58:20.785901 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mgkfd"] Oct 08 12:58:20 crc kubenswrapper[4735]: I1008 12:58:20.884822 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"b0692315f2127ff05773f38b19d1ddbf560f9972f7b9f17f41fe2c2848887c5b"} Oct 08 12:58:20 crc kubenswrapper[4735]: I1008 12:58:20.885141 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"58d088ef0bca767e55bf9eedfeda84e38e40d6b86dce4603a0effe21c3eba744"} Oct 08 12:58:20 crc kubenswrapper[4735]: I1008 12:58:20.886069 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:58:20 crc kubenswrapper[4735]: I1008 12:58:20.914035 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Oct 08 12:58:20 crc kubenswrapper[4735]: I1008 12:58:20.935239 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d5hv5" 
event={"ID":"33b36b08-22c3-49b3-805e-bd38f2235a1e","Type":"ContainerStarted","Data":"00895780c53dbbd3c1799ec7c22c867c482d8d4c31de9833428314f3d37115d3"} Oct 08 12:58:20 crc kubenswrapper[4735]: I1008 12:58:20.935274 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"507d744698cd795fa8a63cf0fdb524dde04e18b06a4d982dae194f0c10f5fc20"} Oct 08 12:58:20 crc kubenswrapper[4735]: I1008 12:58:20.935292 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-vgv9c"] Oct 08 12:58:20 crc kubenswrapper[4735]: I1008 12:58:20.935321 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tct8h" event={"ID":"6a47c541-ef48-4097-843e-609ef561cadc","Type":"ContainerStarted","Data":"6ccc82be60b1d17671654a0ed99093d32d68112607ce2a6e306710e2f4f852fe"} Oct 08 12:58:20 crc kubenswrapper[4735]: I1008 12:58:20.935334 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mgkfd" event={"ID":"cbe9af65-d05a-4224-a554-361d0d7e9e11","Type":"ContainerStarted","Data":"36b8f24841d52c56bafb13573dfd90c1c40a034d86377aa7110f65b4def0c7a3"} Oct 08 12:58:20 crc kubenswrapper[4735]: I1008 12:58:20.939078 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"9261ae8067ada92260695c2ac94d83488e03cfc13e7c6453aeb365f8d26db717"} Oct 08 12:58:20 crc kubenswrapper[4735]: I1008 12:58:20.957413 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4drk9" event={"ID":"87ee9fbe-46aa-4075-b289-ef083cea461b","Type":"ContainerStarted","Data":"e738a8a841ff5fde3d24b74415edfa7c3a4b07f3516f2ec64e3a406086c85df9"} Oct 08 
12:58:21 crc kubenswrapper[4735]: I1008 12:58:21.098704 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rzs56"]
Oct 08 12:58:21 crc kubenswrapper[4735]: I1008 12:58:21.291872 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-4drk9" podStartSLOduration=12.291849917 podStartE2EDuration="12.291849917s" podCreationTimestamp="2025-10-08 12:58:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 12:58:21.157051379 +0000 UTC m=+150.965355923" watchObservedRunningTime="2025-10-08 12:58:21.291849917 +0000 UTC m=+151.100154461"
Oct 08 12:58:21 crc kubenswrapper[4735]: I1008 12:58:21.294022 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qm4jg"]
Oct 08 12:58:21 crc kubenswrapper[4735]: I1008 12:58:21.295282 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qm4jg"
Oct 08 12:58:21 crc kubenswrapper[4735]: I1008 12:58:21.297143 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Oct 08 12:58:21 crc kubenswrapper[4735]: I1008 12:58:21.354576 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qm4jg"]
Oct 08 12:58:21 crc kubenswrapper[4735]: I1008 12:58:21.460628 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ed19bd4-2ac1-495b-ac83-76fd9393dcdb-catalog-content\") pod \"redhat-marketplace-qm4jg\" (UID: \"2ed19bd4-2ac1-495b-ac83-76fd9393dcdb\") " pod="openshift-marketplace/redhat-marketplace-qm4jg"
Oct 08 12:58:21 crc kubenswrapper[4735]: I1008 12:58:21.460677 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ed19bd4-2ac1-495b-ac83-76fd9393dcdb-utilities\") pod \"redhat-marketplace-qm4jg\" (UID: \"2ed19bd4-2ac1-495b-ac83-76fd9393dcdb\") " pod="openshift-marketplace/redhat-marketplace-qm4jg"
Oct 08 12:58:21 crc kubenswrapper[4735]: I1008 12:58:21.460696 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtm22\" (UniqueName: \"kubernetes.io/projected/2ed19bd4-2ac1-495b-ac83-76fd9393dcdb-kube-api-access-qtm22\") pod \"redhat-marketplace-qm4jg\" (UID: \"2ed19bd4-2ac1-495b-ac83-76fd9393dcdb\") " pod="openshift-marketplace/redhat-marketplace-qm4jg"
Oct 08 12:58:21 crc kubenswrapper[4735]: I1008 12:58:21.562750 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ed19bd4-2ac1-495b-ac83-76fd9393dcdb-catalog-content\") pod \"redhat-marketplace-qm4jg\" (UID: \"2ed19bd4-2ac1-495b-ac83-76fd9393dcdb\") " pod="openshift-marketplace/redhat-marketplace-qm4jg"
Oct 08 12:58:21 crc kubenswrapper[4735]: I1008 12:58:21.562821 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ed19bd4-2ac1-495b-ac83-76fd9393dcdb-utilities\") pod \"redhat-marketplace-qm4jg\" (UID: \"2ed19bd4-2ac1-495b-ac83-76fd9393dcdb\") " pod="openshift-marketplace/redhat-marketplace-qm4jg"
Oct 08 12:58:21 crc kubenswrapper[4735]: I1008 12:58:21.562852 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtm22\" (UniqueName: \"kubernetes.io/projected/2ed19bd4-2ac1-495b-ac83-76fd9393dcdb-kube-api-access-qtm22\") pod \"redhat-marketplace-qm4jg\" (UID: \"2ed19bd4-2ac1-495b-ac83-76fd9393dcdb\") " pod="openshift-marketplace/redhat-marketplace-qm4jg"
Oct 08 12:58:21 crc kubenswrapper[4735]: I1008 12:58:21.563690 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ed19bd4-2ac1-495b-ac83-76fd9393dcdb-catalog-content\") pod \"redhat-marketplace-qm4jg\" (UID: \"2ed19bd4-2ac1-495b-ac83-76fd9393dcdb\") " pod="openshift-marketplace/redhat-marketplace-qm4jg"
Oct 08 12:58:21 crc kubenswrapper[4735]: I1008 12:58:21.563910 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ed19bd4-2ac1-495b-ac83-76fd9393dcdb-utilities\") pod \"redhat-marketplace-qm4jg\" (UID: \"2ed19bd4-2ac1-495b-ac83-76fd9393dcdb\") " pod="openshift-marketplace/redhat-marketplace-qm4jg"
Oct 08 12:58:21 crc kubenswrapper[4735]: I1008 12:58:21.575083 4735 patch_prober.go:28] interesting pod/router-default-5444994796-jgkq6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 08 12:58:21 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld
Oct 08 12:58:21 crc kubenswrapper[4735]: [+]process-running ok
Oct 08 12:58:21 crc kubenswrapper[4735]: healthz check failed
Oct 08 12:58:21 crc kubenswrapper[4735]: I1008 12:58:21.575153 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jgkq6" podUID="c4f3bfe2-c8ee-4ac0-915d-2be166ccbb6f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 08 12:58:21 crc kubenswrapper[4735]: I1008 12:58:21.600668 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtm22\" (UniqueName: \"kubernetes.io/projected/2ed19bd4-2ac1-495b-ac83-76fd9393dcdb-kube-api-access-qtm22\") pod \"redhat-marketplace-qm4jg\" (UID: \"2ed19bd4-2ac1-495b-ac83-76fd9393dcdb\") " pod="openshift-marketplace/redhat-marketplace-qm4jg"
Oct 08 12:58:21 crc kubenswrapper[4735]: I1008 12:58:21.625321 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qm4jg"
Oct 08 12:58:21 crc kubenswrapper[4735]: I1008 12:58:21.682725 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xsk8d"]
Oct 08 12:58:21 crc kubenswrapper[4735]: I1008 12:58:21.683749 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xsk8d"
Oct 08 12:58:21 crc kubenswrapper[4735]: I1008 12:58:21.735308 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xsk8d"]
Oct 08 12:58:21 crc kubenswrapper[4735]: I1008 12:58:21.834620 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Oct 08 12:58:21 crc kubenswrapper[4735]: I1008 12:58:21.835438 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Oct 08 12:58:21 crc kubenswrapper[4735]: I1008 12:58:21.841256 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt"
Oct 08 12:58:21 crc kubenswrapper[4735]: I1008 12:58:21.842337 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n"
Oct 08 12:58:21 crc kubenswrapper[4735]: I1008 12:58:21.847281 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Oct 08 12:58:21 crc kubenswrapper[4735]: I1008 12:58:21.866831 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f7c6d58-4dc2-4981-9722-0dd6cac9dea3-utilities\") pod \"redhat-marketplace-xsk8d\" (UID: \"2f7c6d58-4dc2-4981-9722-0dd6cac9dea3\") " pod="openshift-marketplace/redhat-marketplace-xsk8d"
Oct 08 12:58:21 crc kubenswrapper[4735]: I1008 12:58:21.866992 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f7c6d58-4dc2-4981-9722-0dd6cac9dea3-catalog-content\") pod \"redhat-marketplace-xsk8d\" (UID: \"2f7c6d58-4dc2-4981-9722-0dd6cac9dea3\") " pod="openshift-marketplace/redhat-marketplace-xsk8d"
Oct 08 12:58:21 crc kubenswrapper[4735]: I1008 12:58:21.867069 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vz4s9\" (UniqueName: \"kubernetes.io/projected/2f7c6d58-4dc2-4981-9722-0dd6cac9dea3-kube-api-access-vz4s9\") pod \"redhat-marketplace-xsk8d\" (UID: \"2f7c6d58-4dc2-4981-9722-0dd6cac9dea3\") " pod="openshift-marketplace/redhat-marketplace-xsk8d"
Oct 08 12:58:21 crc kubenswrapper[4735]: I1008 12:58:21.899187 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qm4jg"]
Oct 08 12:58:21 crc kubenswrapper[4735]: I1008 12:58:21.967150 4735 generic.go:334] "Generic (PLEG): container finished" podID="fb3785a1-c82f-48bd-8032-dee94a15b75a" containerID="45fa7a9fe93065d8f201525f927ab55f715d7da08c6e6c1f59c4b7b0ca993ad7" exitCode=0
Oct 08 12:58:21 crc kubenswrapper[4735]: I1008 12:58:21.967204 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rzs56" event={"ID":"fb3785a1-c82f-48bd-8032-dee94a15b75a","Type":"ContainerDied","Data":"45fa7a9fe93065d8f201525f927ab55f715d7da08c6e6c1f59c4b7b0ca993ad7"}
Oct 08 12:58:21 crc kubenswrapper[4735]: I1008 12:58:21.967267 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rzs56" event={"ID":"fb3785a1-c82f-48bd-8032-dee94a15b75a","Type":"ContainerStarted","Data":"d742028ce2fc21c37aaeb6b09b7388f590dcb46b964134d2927c64c8c6b44224"}
Oct 08 12:58:21 crc kubenswrapper[4735]: I1008 12:58:21.968083 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vz4s9\" (UniqueName: \"kubernetes.io/projected/2f7c6d58-4dc2-4981-9722-0dd6cac9dea3-kube-api-access-vz4s9\") pod \"redhat-marketplace-xsk8d\" (UID: \"2f7c6d58-4dc2-4981-9722-0dd6cac9dea3\") " pod="openshift-marketplace/redhat-marketplace-xsk8d"
Oct 08 12:58:21 crc kubenswrapper[4735]: I1008 12:58:21.968124 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e4c2c24e-ef81-4e1b-9503-212aa8e99730-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e4c2c24e-ef81-4e1b-9503-212aa8e99730\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Oct 08 12:58:21 crc kubenswrapper[4735]: I1008 12:58:21.968156 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e4c2c24e-ef81-4e1b-9503-212aa8e99730-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e4c2c24e-ef81-4e1b-9503-212aa8e99730\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Oct 08 12:58:21 crc kubenswrapper[4735]: I1008 12:58:21.968210 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f7c6d58-4dc2-4981-9722-0dd6cac9dea3-utilities\") pod \"redhat-marketplace-xsk8d\" (UID: \"2f7c6d58-4dc2-4981-9722-0dd6cac9dea3\") " pod="openshift-marketplace/redhat-marketplace-xsk8d"
Oct 08 12:58:21 crc kubenswrapper[4735]: I1008 12:58:21.968314 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f7c6d58-4dc2-4981-9722-0dd6cac9dea3-catalog-content\") pod \"redhat-marketplace-xsk8d\" (UID: \"2f7c6d58-4dc2-4981-9722-0dd6cac9dea3\") " pod="openshift-marketplace/redhat-marketplace-xsk8d"
Oct 08 12:58:21 crc kubenswrapper[4735]: I1008 12:58:21.968848 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f7c6d58-4dc2-4981-9722-0dd6cac9dea3-catalog-content\") pod \"redhat-marketplace-xsk8d\" (UID: \"2f7c6d58-4dc2-4981-9722-0dd6cac9dea3\") " pod="openshift-marketplace/redhat-marketplace-xsk8d"
Oct 08 12:58:21 crc kubenswrapper[4735]: I1008 12:58:21.969034 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f7c6d58-4dc2-4981-9722-0dd6cac9dea3-utilities\") pod \"redhat-marketplace-xsk8d\" (UID: \"2f7c6d58-4dc2-4981-9722-0dd6cac9dea3\") " pod="openshift-marketplace/redhat-marketplace-xsk8d"
Oct 08 12:58:21 crc kubenswrapper[4735]: I1008 12:58:21.969365 4735 generic.go:334] "Generic (PLEG): container finished" podID="33b36b08-22c3-49b3-805e-bd38f2235a1e" containerID="0d938b1fd656921d0733ca9fc07ae7f3361240d220dd196823e88153a8617035" exitCode=0
Oct 08 12:58:21 crc kubenswrapper[4735]: I1008 12:58:21.969420 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d5hv5" event={"ID":"33b36b08-22c3-49b3-805e-bd38f2235a1e","Type":"ContainerDied","Data":"0d938b1fd656921d0733ca9fc07ae7f3361240d220dd196823e88153a8617035"}
Oct 08 12:58:21 crc kubenswrapper[4735]: I1008 12:58:21.971752 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 08 12:58:21 crc kubenswrapper[4735]: I1008 12:58:21.975158 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"c94b3de72219a1f3050fb6e21738d7ccb05baff4454540b9e0fec5801a9b68ca"}
Oct 08 12:58:22 crc kubenswrapper[4735]: I1008 12:58:21.998332 4735 generic.go:334] "Generic (PLEG): container finished" podID="6a47c541-ef48-4097-843e-609ef561cadc" containerID="4d3c37da2dfc60a6cbcbe8085edc93d10fcfb27a51424029a29a1f04f8ace2f2" exitCode=0
Oct 08 12:58:22 crc kubenswrapper[4735]: I1008 12:58:21.998404 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tct8h" event={"ID":"6a47c541-ef48-4097-843e-609ef561cadc","Type":"ContainerDied","Data":"4d3c37da2dfc60a6cbcbe8085edc93d10fcfb27a51424029a29a1f04f8ace2f2"}
Oct 08 12:58:22 crc kubenswrapper[4735]: I1008 12:58:22.003824 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vz4s9\" (UniqueName: \"kubernetes.io/projected/2f7c6d58-4dc2-4981-9722-0dd6cac9dea3-kube-api-access-vz4s9\") pod \"redhat-marketplace-xsk8d\" (UID: \"2f7c6d58-4dc2-4981-9722-0dd6cac9dea3\") " pod="openshift-marketplace/redhat-marketplace-xsk8d"
Oct 08 12:58:22 crc kubenswrapper[4735]: I1008 12:58:22.042901 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"b5d8ac7d4ed5375b73cb7952d1601167e035e93c352b81379959fcaed6454c3c"}
Oct 08 12:58:22 crc kubenswrapper[4735]: I1008 12:58:22.060189 4735 generic.go:334] "Generic (PLEG): container finished" podID="cbe9af65-d05a-4224-a554-361d0d7e9e11" containerID="10121c60e1076212ba6ff364c40fd942364f0bc7ea176684bb52099662bbf174" exitCode=0
Oct 08 12:58:22 crc kubenswrapper[4735]: I1008 12:58:22.060712 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mgkfd" event={"ID":"cbe9af65-d05a-4224-a554-361d0d7e9e11","Type":"ContainerDied","Data":"10121c60e1076212ba6ff364c40fd942364f0bc7ea176684bb52099662bbf174"}
Oct 08 12:58:22 crc kubenswrapper[4735]: I1008 12:58:22.069799 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e4c2c24e-ef81-4e1b-9503-212aa8e99730-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e4c2c24e-ef81-4e1b-9503-212aa8e99730\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Oct 08 12:58:22 crc kubenswrapper[4735]: I1008 12:58:22.069852 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e4c2c24e-ef81-4e1b-9503-212aa8e99730-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e4c2c24e-ef81-4e1b-9503-212aa8e99730\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Oct 08 12:58:22 crc kubenswrapper[4735]: I1008 12:58:22.069945 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e4c2c24e-ef81-4e1b-9503-212aa8e99730-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e4c2c24e-ef81-4e1b-9503-212aa8e99730\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Oct 08 12:58:22 crc kubenswrapper[4735]: I1008 12:58:22.076300 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c" event={"ID":"b1f96523-d3ab-4768-8de6-c901e255ae74","Type":"ContainerStarted","Data":"6e161dbcde96b1affcccd97f99c8979e26548f67f68d6f4c0192fcd02d660bf1"}
Oct 08 12:58:22 crc kubenswrapper[4735]: I1008 12:58:22.076359 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c"
Oct 08 12:58:22 crc kubenswrapper[4735]: I1008 12:58:22.076374 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c" event={"ID":"b1f96523-d3ab-4768-8de6-c901e255ae74","Type":"ContainerStarted","Data":"247ba771553601d28bf07ceade29021d89b60f3a0f95a6c7fad63d09f5ef1c65"}
Oct 08 12:58:22 crc kubenswrapper[4735]: I1008 12:58:22.082826 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-nnp5j"
Oct 08 12:58:22 crc kubenswrapper[4735]: I1008 12:58:22.082886 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-nnp5j"
Oct 08 12:58:22 crc kubenswrapper[4735]: I1008 12:58:22.083142 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Oct 08 12:58:22 crc kubenswrapper[4735]: I1008 12:58:22.085893 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Oct 08 12:58:22 crc kubenswrapper[4735]: I1008 12:58:22.086423 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Oct 08 12:58:22 crc kubenswrapper[4735]: I1008 12:58:22.090434 4735 patch_prober.go:28] interesting pod/console-f9d7485db-nnp5j container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.7:8443/health\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body=
Oct 08 12:58:22 crc kubenswrapper[4735]: I1008 12:58:22.090506 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-nnp5j" podUID="ecbba0bb-2230-4624-8185-5d116632343f" containerName="console" probeResult="failure" output="Get \"https://10.217.0.7:8443/health\": dial tcp 10.217.0.7:8443: connect: connection refused"
Oct 08 12:58:22 crc kubenswrapper[4735]: I1008 12:58:22.096357 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qm4jg" event={"ID":"2ed19bd4-2ac1-495b-ac83-76fd9393dcdb","Type":"ContainerStarted","Data":"ced7da65ee89803635ef5e1268253bcf1d7dce9bba2c70df3bb067c6759198b8"}
Oct 08 12:58:22 crc kubenswrapper[4735]: I1008 12:58:22.116032 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Oct 08 12:58:22 crc kubenswrapper[4735]: I1008 12:58:22.121025 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e4c2c24e-ef81-4e1b-9503-212aa8e99730-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e4c2c24e-ef81-4e1b-9503-212aa8e99730\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Oct 08 12:58:22 crc kubenswrapper[4735]: I1008 12:58:22.128852 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Oct 08 12:58:22 crc kubenswrapper[4735]: I1008 12:58:22.174527 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xsk8d"
Oct 08 12:58:22 crc kubenswrapper[4735]: I1008 12:58:22.184784 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Oct 08 12:58:22 crc kubenswrapper[4735]: I1008 12:58:22.274140 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eaffa46b-d533-4d5a-affa-67b276f93582-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"eaffa46b-d533-4d5a-affa-67b276f93582\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Oct 08 12:58:22 crc kubenswrapper[4735]: I1008 12:58:22.274926 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eaffa46b-d533-4d5a-affa-67b276f93582-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"eaffa46b-d533-4d5a-affa-67b276f93582\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Oct 08 12:58:22 crc kubenswrapper[4735]: I1008 12:58:22.284901 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c" podStartSLOduration=131.284853567 podStartE2EDuration="2m11.284853567s" podCreationTimestamp="2025-10-08 12:56:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 12:58:22.281956382 +0000 UTC m=+152.090260926" watchObservedRunningTime="2025-10-08 12:58:22.284853567 +0000 UTC m=+152.093158121"
Oct 08 12:58:22 crc kubenswrapper[4735]: I1008 12:58:22.375796 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eaffa46b-d533-4d5a-affa-67b276f93582-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"eaffa46b-d533-4d5a-affa-67b276f93582\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Oct 08 12:58:22 crc kubenswrapper[4735]: I1008 12:58:22.375867 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eaffa46b-d533-4d5a-affa-67b276f93582-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"eaffa46b-d533-4d5a-affa-67b276f93582\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Oct 08 12:58:22 crc kubenswrapper[4735]: I1008 12:58:22.375989 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eaffa46b-d533-4d5a-affa-67b276f93582-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"eaffa46b-d533-4d5a-affa-67b276f93582\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Oct 08 12:58:22 crc kubenswrapper[4735]: I1008 12:58:22.388943 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-gp925"
Oct 08 12:58:22 crc kubenswrapper[4735]: I1008 12:58:22.409785 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eaffa46b-d533-4d5a-affa-67b276f93582-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"eaffa46b-d533-4d5a-affa-67b276f93582\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Oct 08 12:58:22 crc kubenswrapper[4735]: I1008 12:58:22.415956 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-zxtmj"
Oct 08 12:58:22 crc kubenswrapper[4735]: I1008 12:58:22.416466 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-zxtmj"
Oct 08 12:58:22 crc kubenswrapper[4735]: I1008 12:58:22.437527 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-zxtmj"
Oct 08 12:58:22 crc kubenswrapper[4735]: I1008 12:58:22.448200 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-8rqp7"
Oct 08 12:58:22 crc kubenswrapper[4735]: I1008 12:58:22.501888 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mmrz8"]
Oct 08 12:58:22 crc kubenswrapper[4735]: I1008 12:58:22.504951 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mmrz8"
Oct 08 12:58:22 crc kubenswrapper[4735]: I1008 12:58:22.515889 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Oct 08 12:58:22 crc kubenswrapper[4735]: I1008 12:58:22.531873 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mmrz8"]
Oct 08 12:58:22 crc kubenswrapper[4735]: I1008 12:58:22.539219 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xsk8d"]
Oct 08 12:58:22 crc kubenswrapper[4735]: I1008 12:58:22.569633 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-jgkq6"
Oct 08 12:58:22 crc kubenswrapper[4735]: I1008 12:58:22.579740 4735 patch_prober.go:28] interesting pod/router-default-5444994796-jgkq6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 08 12:58:22 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld
Oct 08 12:58:22 crc kubenswrapper[4735]: [+]process-running ok
Oct 08 12:58:22 crc kubenswrapper[4735]: healthz check failed
Oct 08 12:58:22 crc kubenswrapper[4735]: I1008 12:58:22.579795 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jgkq6" podUID="c4f3bfe2-c8ee-4ac0-915d-2be166ccbb6f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 08 12:58:22 crc kubenswrapper[4735]: I1008 12:58:22.583463 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9255e2a8-81b4-4743-816c-611181271899-catalog-content\") pod \"redhat-operators-mmrz8\" (UID: \"9255e2a8-81b4-4743-816c-611181271899\") " pod="openshift-marketplace/redhat-operators-mmrz8"
Oct 08 12:58:22 crc kubenswrapper[4735]: I1008 12:58:22.585628 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vqp2\" (UniqueName: \"kubernetes.io/projected/9255e2a8-81b4-4743-816c-611181271899-kube-api-access-4vqp2\") pod \"redhat-operators-mmrz8\" (UID: \"9255e2a8-81b4-4743-816c-611181271899\") " pod="openshift-marketplace/redhat-operators-mmrz8"
Oct 08 12:58:22 crc kubenswrapper[4735]: I1008 12:58:22.585834 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9255e2a8-81b4-4743-816c-611181271899-utilities\") pod \"redhat-operators-mmrz8\" (UID: \"9255e2a8-81b4-4743-816c-611181271899\") " pod="openshift-marketplace/redhat-operators-mmrz8"
Oct 08 12:58:22 crc kubenswrapper[4735]: I1008 12:58:22.631227 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Oct 08 12:58:22 crc kubenswrapper[4735]: I1008 12:58:22.687073 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vqp2\" (UniqueName: \"kubernetes.io/projected/9255e2a8-81b4-4743-816c-611181271899-kube-api-access-4vqp2\") pod \"redhat-operators-mmrz8\" (UID: \"9255e2a8-81b4-4743-816c-611181271899\") " pod="openshift-marketplace/redhat-operators-mmrz8"
Oct 08 12:58:22 crc kubenswrapper[4735]: I1008 12:58:22.687232 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9255e2a8-81b4-4743-816c-611181271899-utilities\") pod \"redhat-operators-mmrz8\" (UID: \"9255e2a8-81b4-4743-816c-611181271899\") " pod="openshift-marketplace/redhat-operators-mmrz8"
Oct 08 12:58:22 crc kubenswrapper[4735]: I1008 12:58:22.687289 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9255e2a8-81b4-4743-816c-611181271899-catalog-content\") pod \"redhat-operators-mmrz8\" (UID: \"9255e2a8-81b4-4743-816c-611181271899\") " pod="openshift-marketplace/redhat-operators-mmrz8"
Oct 08 12:58:22 crc kubenswrapper[4735]: I1008 12:58:22.688156 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9255e2a8-81b4-4743-816c-611181271899-utilities\") pod \"redhat-operators-mmrz8\" (UID: \"9255e2a8-81b4-4743-816c-611181271899\") " pod="openshift-marketplace/redhat-operators-mmrz8"
Oct 08 12:58:22 crc kubenswrapper[4735]: I1008 12:58:22.688763 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9255e2a8-81b4-4743-816c-611181271899-catalog-content\") pod \"redhat-operators-mmrz8\" (UID: \"9255e2a8-81b4-4743-816c-611181271899\") " pod="openshift-marketplace/redhat-operators-mmrz8"
Oct 08 12:58:22 crc kubenswrapper[4735]: I1008 12:58:22.708150 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Oct 08 12:58:22 crc kubenswrapper[4735]: I1008 12:58:22.712564 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vqp2\" (UniqueName: \"kubernetes.io/projected/9255e2a8-81b4-4743-816c-611181271899-kube-api-access-4vqp2\") pod \"redhat-operators-mmrz8\" (UID: \"9255e2a8-81b4-4743-816c-611181271899\") " pod="openshift-marketplace/redhat-operators-mmrz8"
Oct 08 12:58:22 crc kubenswrapper[4735]: I1008 12:58:22.851752 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mmrz8"
Oct 08 12:58:22 crc kubenswrapper[4735]: I1008 12:58:22.920567 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bzg9k"]
Oct 08 12:58:22 crc kubenswrapper[4735]: I1008 12:58:22.923012 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bzg9k"
Oct 08 12:58:22 crc kubenswrapper[4735]: I1008 12:58:22.923550 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bzg9k"]
Oct 08 12:58:22 crc kubenswrapper[4735]: I1008 12:58:22.993428 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgfd4\" (UniqueName: \"kubernetes.io/projected/107a4721-dfb9-4d45-abb5-5d2857e0985f-kube-api-access-lgfd4\") pod \"redhat-operators-bzg9k\" (UID: \"107a4721-dfb9-4d45-abb5-5d2857e0985f\") " pod="openshift-marketplace/redhat-operators-bzg9k"
Oct 08 12:58:22 crc kubenswrapper[4735]: I1008 12:58:22.993510 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/107a4721-dfb9-4d45-abb5-5d2857e0985f-catalog-content\") pod \"redhat-operators-bzg9k\" (UID: \"107a4721-dfb9-4d45-abb5-5d2857e0985f\") " pod="openshift-marketplace/redhat-operators-bzg9k"
Oct 08 12:58:22 crc kubenswrapper[4735]: I1008 12:58:22.993600 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/107a4721-dfb9-4d45-abb5-5d2857e0985f-utilities\") pod \"redhat-operators-bzg9k\" (UID: \"107a4721-dfb9-4d45-abb5-5d2857e0985f\") " pod="openshift-marketplace/redhat-operators-bzg9k"
Oct 08 12:58:23 crc kubenswrapper[4735]: I1008 12:58:23.001921 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Oct 08 12:58:23 crc kubenswrapper[4735]: W1008 12:58:23.036613 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podeaffa46b_d533_4d5a_affa_67b276f93582.slice/crio-2c8c69aa42d2befe9e98041da457f7fafb2f91f498eb06e70cdefd4311dc1ea9 WatchSource:0}: Error finding container 2c8c69aa42d2befe9e98041da457f7fafb2f91f498eb06e70cdefd4311dc1ea9: Status 404 returned error can't find the container with id 2c8c69aa42d2befe9e98041da457f7fafb2f91f498eb06e70cdefd4311dc1ea9
Oct 08 12:58:23 crc kubenswrapper[4735]: I1008 12:58:23.095247 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/107a4721-dfb9-4d45-abb5-5d2857e0985f-catalog-content\") pod \"redhat-operators-bzg9k\" (UID: \"107a4721-dfb9-4d45-abb5-5d2857e0985f\") " pod="openshift-marketplace/redhat-operators-bzg9k"
Oct 08 12:58:23 crc kubenswrapper[4735]: I1008 12:58:23.095372 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/107a4721-dfb9-4d45-abb5-5d2857e0985f-utilities\") pod \"redhat-operators-bzg9k\" (UID: \"107a4721-dfb9-4d45-abb5-5d2857e0985f\") " pod="openshift-marketplace/redhat-operators-bzg9k"
Oct 08 12:58:23 crc kubenswrapper[4735]: I1008 12:58:23.095408 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgfd4\" (UniqueName: \"kubernetes.io/projected/107a4721-dfb9-4d45-abb5-5d2857e0985f-kube-api-access-lgfd4\") pod \"redhat-operators-bzg9k\" (UID: \"107a4721-dfb9-4d45-abb5-5d2857e0985f\") " pod="openshift-marketplace/redhat-operators-bzg9k"
Oct 08 12:58:23 crc kubenswrapper[4735]: I1008 12:58:23.096199 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/107a4721-dfb9-4d45-abb5-5d2857e0985f-catalog-content\") pod \"redhat-operators-bzg9k\" (UID: \"107a4721-dfb9-4d45-abb5-5d2857e0985f\") " pod="openshift-marketplace/redhat-operators-bzg9k"
Oct 08 12:58:23 crc kubenswrapper[4735]: I1008 12:58:23.096422 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/107a4721-dfb9-4d45-abb5-5d2857e0985f-utilities\") pod \"redhat-operators-bzg9k\" (UID: \"107a4721-dfb9-4d45-abb5-5d2857e0985f\") " pod="openshift-marketplace/redhat-operators-bzg9k"
Oct 08 12:58:23 crc kubenswrapper[4735]: I1008 12:58:23.109028 4735 generic.go:334] "Generic (PLEG): container finished" podID="2ed19bd4-2ac1-495b-ac83-76fd9393dcdb" containerID="328fbac8e57991fa66b96757d45f54c5abff2c5cd3aeaa2516646763056131fa" exitCode=0
Oct 08 12:58:23 crc kubenswrapper[4735]: I1008 12:58:23.109123 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qm4jg" event={"ID":"2ed19bd4-2ac1-495b-ac83-76fd9393dcdb","Type":"ContainerDied","Data":"328fbac8e57991fa66b96757d45f54c5abff2c5cd3aeaa2516646763056131fa"}
Oct 08 12:58:23 crc kubenswrapper[4735]: I1008 12:58:23.113984 4735 generic.go:334] "Generic (PLEG): container finished" podID="2f7c6d58-4dc2-4981-9722-0dd6cac9dea3" containerID="dbc7f4702201b962505aa2f2de838bca6bf182373d7c7e657ca1ff322f4ad283" exitCode=0
Oct 08 12:58:23 crc kubenswrapper[4735]: I1008 12:58:23.114080 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xsk8d" event={"ID":"2f7c6d58-4dc2-4981-9722-0dd6cac9dea3","Type":"ContainerDied","Data":"dbc7f4702201b962505aa2f2de838bca6bf182373d7c7e657ca1ff322f4ad283"}
Oct 08 12:58:23 crc kubenswrapper[4735]: I1008 12:58:23.114119 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xsk8d" event={"ID":"2f7c6d58-4dc2-4981-9722-0dd6cac9dea3","Type":"ContainerStarted","Data":"276e982bb04288df9752696c01e1a8e87d7a19fcf811e79272242e9b4d10ae73"}
Oct 08 12:58:23 crc kubenswrapper[4735]: I1008 12:58:23.117064 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgfd4\" (UniqueName: \"kubernetes.io/projected/107a4721-dfb9-4d45-abb5-5d2857e0985f-kube-api-access-lgfd4\") pod \"redhat-operators-bzg9k\" (UID: \"107a4721-dfb9-4d45-abb5-5d2857e0985f\") " pod="openshift-marketplace/redhat-operators-bzg9k"
Oct 08 12:58:23 crc kubenswrapper[4735]: I1008 12:58:23.120908 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"eaffa46b-d533-4d5a-affa-67b276f93582","Type":"ContainerStarted","Data":"2c8c69aa42d2befe9e98041da457f7fafb2f91f498eb06e70cdefd4311dc1ea9"}
Oct 08 12:58:23 crc kubenswrapper[4735]: I1008 12:58:23.127784 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"e4c2c24e-ef81-4e1b-9503-212aa8e99730","Type":"ContainerStarted","Data":"b82d3386e05d33159a37afd4ea1aecf55146b912174fcf1e67efe995ac252dd1"}
Oct 08 12:58:23 crc kubenswrapper[4735]: I1008 12:58:23.128326 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"e4c2c24e-ef81-4e1b-9503-212aa8e99730","Type":"ContainerStarted","Data":"8de4e2c7cffa31b690f8c5848eba0e72b7f13b9624cd4d9c96e0fbeae4528125"}
Oct 08 12:58:23 crc kubenswrapper[4735]: I1008 12:58:23.137590 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-zxtmj"
Oct 08 12:58:23 crc kubenswrapper[4735]: I1008 12:58:23.151698 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mmrz8"]
Oct 08 12:58:23 crc kubenswrapper[4735]: W1008 12:58:23.164560 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9255e2a8_81b4_4743_816c_611181271899.slice/crio-2be6eda20f4da36003f3420fe8be620239eda00fb91a81824eb70c314d1de9ff WatchSource:0}: Error finding container 2be6eda20f4da36003f3420fe8be620239eda00fb91a81824eb70c314d1de9ff: Status 404 returned error can't find the container with id 2be6eda20f4da36003f3420fe8be620239eda00fb91a81824eb70c314d1de9ff
Oct 08 12:58:23 crc kubenswrapper[4735]: I1008 12:58:23.168170 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.168153104 podStartE2EDuration="2.168153104s" podCreationTimestamp="2025-10-08 12:58:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 12:58:23.16720954 +0000 UTC m=+152.975514084" watchObservedRunningTime="2025-10-08 12:58:23.168153104 +0000 UTC m=+152.976457648"
Oct 08 12:58:23 crc kubenswrapper[4735]: I1008 12:58:23.256896 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bzg9k" Oct 08 12:58:23 crc kubenswrapper[4735]: I1008 12:58:23.265236 4735 patch_prober.go:28] interesting pod/downloads-7954f5f757-887mh container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body= Oct 08 12:58:23 crc kubenswrapper[4735]: I1008 12:58:23.265310 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-887mh" podUID="14f021da-25d5-4a75-82f3-b6c20c65dc7c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" Oct 08 12:58:23 crc kubenswrapper[4735]: I1008 12:58:23.268153 4735 patch_prober.go:28] interesting pod/downloads-7954f5f757-887mh container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body= Oct 08 12:58:23 crc kubenswrapper[4735]: I1008 12:58:23.268220 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-887mh" podUID="14f021da-25d5-4a75-82f3-b6c20c65dc7c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" Oct 08 12:58:23 crc kubenswrapper[4735]: I1008 12:58:23.573506 4735 patch_prober.go:28] interesting pod/router-default-5444994796-jgkq6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 08 12:58:23 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld Oct 08 12:58:23 crc kubenswrapper[4735]: [+]process-running ok Oct 08 12:58:23 crc kubenswrapper[4735]: healthz check failed Oct 08 
12:58:23 crc kubenswrapper[4735]: I1008 12:58:23.574141 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jgkq6" podUID="c4f3bfe2-c8ee-4ac0-915d-2be166ccbb6f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 08 12:58:23 crc kubenswrapper[4735]: I1008 12:58:23.637106 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bzg9k"] Oct 08 12:58:24 crc kubenswrapper[4735]: I1008 12:58:24.193749 4735 generic.go:334] "Generic (PLEG): container finished" podID="107a4721-dfb9-4d45-abb5-5d2857e0985f" containerID="e7df6263ac15a0ff55d9f5dd3a56a4251d6b68e2f3835aeb40a638b4b1252e8c" exitCode=0 Oct 08 12:58:24 crc kubenswrapper[4735]: I1008 12:58:24.193978 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bzg9k" event={"ID":"107a4721-dfb9-4d45-abb5-5d2857e0985f","Type":"ContainerDied","Data":"e7df6263ac15a0ff55d9f5dd3a56a4251d6b68e2f3835aeb40a638b4b1252e8c"} Oct 08 12:58:24 crc kubenswrapper[4735]: I1008 12:58:24.194103 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bzg9k" event={"ID":"107a4721-dfb9-4d45-abb5-5d2857e0985f","Type":"ContainerStarted","Data":"9b5b3ebb843235bbaf7051bdf1f1967e2b30570e681199061f58f136b528ce6c"} Oct 08 12:58:24 crc kubenswrapper[4735]: I1008 12:58:24.196343 4735 generic.go:334] "Generic (PLEG): container finished" podID="9255e2a8-81b4-4743-816c-611181271899" containerID="41cc2caf22bee8b85d9166b2bd3ae9273f476887ae6724ab52260af38eedf89b" exitCode=0 Oct 08 12:58:24 crc kubenswrapper[4735]: I1008 12:58:24.196388 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mmrz8" event={"ID":"9255e2a8-81b4-4743-816c-611181271899","Type":"ContainerDied","Data":"41cc2caf22bee8b85d9166b2bd3ae9273f476887ae6724ab52260af38eedf89b"} Oct 08 12:58:24 crc kubenswrapper[4735]: I1008 
12:58:24.196404 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mmrz8" event={"ID":"9255e2a8-81b4-4743-816c-611181271899","Type":"ContainerStarted","Data":"2be6eda20f4da36003f3420fe8be620239eda00fb91a81824eb70c314d1de9ff"} Oct 08 12:58:24 crc kubenswrapper[4735]: I1008 12:58:24.200803 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"eaffa46b-d533-4d5a-affa-67b276f93582","Type":"ContainerStarted","Data":"33ddadcc982ab7c6afe059e5cd1a57b635d6b842af2a3a3b07198abf5cfaee18"} Oct 08 12:58:24 crc kubenswrapper[4735]: I1008 12:58:24.204561 4735 generic.go:334] "Generic (PLEG): container finished" podID="e4c2c24e-ef81-4e1b-9503-212aa8e99730" containerID="b82d3386e05d33159a37afd4ea1aecf55146b912174fcf1e67efe995ac252dd1" exitCode=0 Oct 08 12:58:24 crc kubenswrapper[4735]: I1008 12:58:24.205352 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"e4c2c24e-ef81-4e1b-9503-212aa8e99730","Type":"ContainerDied","Data":"b82d3386e05d33159a37afd4ea1aecf55146b912174fcf1e67efe995ac252dd1"} Oct 08 12:58:24 crc kubenswrapper[4735]: I1008 12:58:24.232376 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.232359875 podStartE2EDuration="2.232359875s" podCreationTimestamp="2025-10-08 12:58:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 12:58:24.22983324 +0000 UTC m=+154.038137784" watchObservedRunningTime="2025-10-08 12:58:24.232359875 +0000 UTC m=+154.040664419" Oct 08 12:58:24 crc kubenswrapper[4735]: I1008 12:58:24.572343 4735 patch_prober.go:28] interesting pod/router-default-5444994796-jgkq6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with 
statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 08 12:58:24 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld Oct 08 12:58:24 crc kubenswrapper[4735]: [+]process-running ok Oct 08 12:58:24 crc kubenswrapper[4735]: healthz check failed Oct 08 12:58:24 crc kubenswrapper[4735]: I1008 12:58:24.572399 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jgkq6" podUID="c4f3bfe2-c8ee-4ac0-915d-2be166ccbb6f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 08 12:58:24 crc kubenswrapper[4735]: I1008 12:58:24.879497 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 12:58:24 crc kubenswrapper[4735]: I1008 12:58:24.879579 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 12:58:25 crc kubenswrapper[4735]: I1008 12:58:25.090503 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-csnmr" Oct 08 12:58:25 crc kubenswrapper[4735]: I1008 12:58:25.225184 4735 generic.go:334] "Generic (PLEG): container finished" podID="eaffa46b-d533-4d5a-affa-67b276f93582" containerID="33ddadcc982ab7c6afe059e5cd1a57b635d6b842af2a3a3b07198abf5cfaee18" exitCode=0 Oct 08 12:58:25 crc kubenswrapper[4735]: I1008 12:58:25.225509 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" 
event={"ID":"eaffa46b-d533-4d5a-affa-67b276f93582","Type":"ContainerDied","Data":"33ddadcc982ab7c6afe059e5cd1a57b635d6b842af2a3a3b07198abf5cfaee18"} Oct 08 12:58:25 crc kubenswrapper[4735]: I1008 12:58:25.545023 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 08 12:58:25 crc kubenswrapper[4735]: I1008 12:58:25.571899 4735 patch_prober.go:28] interesting pod/router-default-5444994796-jgkq6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 08 12:58:25 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld Oct 08 12:58:25 crc kubenswrapper[4735]: [+]process-running ok Oct 08 12:58:25 crc kubenswrapper[4735]: healthz check failed Oct 08 12:58:25 crc kubenswrapper[4735]: I1008 12:58:25.571980 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jgkq6" podUID="c4f3bfe2-c8ee-4ac0-915d-2be166ccbb6f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 08 12:58:25 crc kubenswrapper[4735]: I1008 12:58:25.672000 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e4c2c24e-ef81-4e1b-9503-212aa8e99730-kubelet-dir\") pod \"e4c2c24e-ef81-4e1b-9503-212aa8e99730\" (UID: \"e4c2c24e-ef81-4e1b-9503-212aa8e99730\") " Oct 08 12:58:25 crc kubenswrapper[4735]: I1008 12:58:25.672089 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e4c2c24e-ef81-4e1b-9503-212aa8e99730-kube-api-access\") pod \"e4c2c24e-ef81-4e1b-9503-212aa8e99730\" (UID: \"e4c2c24e-ef81-4e1b-9503-212aa8e99730\") " Oct 08 12:58:25 crc kubenswrapper[4735]: I1008 12:58:25.674323 4735 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e4c2c24e-ef81-4e1b-9503-212aa8e99730-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e4c2c24e-ef81-4e1b-9503-212aa8e99730" (UID: "e4c2c24e-ef81-4e1b-9503-212aa8e99730"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 12:58:25 crc kubenswrapper[4735]: I1008 12:58:25.679706 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4c2c24e-ef81-4e1b-9503-212aa8e99730-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e4c2c24e-ef81-4e1b-9503-212aa8e99730" (UID: "e4c2c24e-ef81-4e1b-9503-212aa8e99730"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 12:58:25 crc kubenswrapper[4735]: I1008 12:58:25.774246 4735 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e4c2c24e-ef81-4e1b-9503-212aa8e99730-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 08 12:58:25 crc kubenswrapper[4735]: I1008 12:58:25.774278 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e4c2c24e-ef81-4e1b-9503-212aa8e99730-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 08 12:58:26 crc kubenswrapper[4735]: I1008 12:58:26.265003 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"e4c2c24e-ef81-4e1b-9503-212aa8e99730","Type":"ContainerDied","Data":"8de4e2c7cffa31b690f8c5848eba0e72b7f13b9624cd4d9c96e0fbeae4528125"} Oct 08 12:58:26 crc kubenswrapper[4735]: I1008 12:58:26.265070 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8de4e2c7cffa31b690f8c5848eba0e72b7f13b9624cd4d9c96e0fbeae4528125" Oct 08 12:58:26 crc kubenswrapper[4735]: I1008 12:58:26.265313 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 08 12:58:26 crc kubenswrapper[4735]: I1008 12:58:26.612427 4735 patch_prober.go:28] interesting pod/router-default-5444994796-jgkq6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 08 12:58:26 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld Oct 08 12:58:26 crc kubenswrapper[4735]: [+]process-running ok Oct 08 12:58:26 crc kubenswrapper[4735]: healthz check failed Oct 08 12:58:26 crc kubenswrapper[4735]: I1008 12:58:26.613674 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jgkq6" podUID="c4f3bfe2-c8ee-4ac0-915d-2be166ccbb6f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 08 12:58:26 crc kubenswrapper[4735]: I1008 12:58:26.769961 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 08 12:58:26 crc kubenswrapper[4735]: I1008 12:58:26.891298 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eaffa46b-d533-4d5a-affa-67b276f93582-kube-api-access\") pod \"eaffa46b-d533-4d5a-affa-67b276f93582\" (UID: \"eaffa46b-d533-4d5a-affa-67b276f93582\") " Oct 08 12:58:26 crc kubenswrapper[4735]: I1008 12:58:26.891403 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eaffa46b-d533-4d5a-affa-67b276f93582-kubelet-dir\") pod \"eaffa46b-d533-4d5a-affa-67b276f93582\" (UID: \"eaffa46b-d533-4d5a-affa-67b276f93582\") " Oct 08 12:58:26 crc kubenswrapper[4735]: I1008 12:58:26.891693 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eaffa46b-d533-4d5a-affa-67b276f93582-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "eaffa46b-d533-4d5a-affa-67b276f93582" (UID: "eaffa46b-d533-4d5a-affa-67b276f93582"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 12:58:26 crc kubenswrapper[4735]: I1008 12:58:26.892033 4735 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eaffa46b-d533-4d5a-affa-67b276f93582-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 08 12:58:26 crc kubenswrapper[4735]: I1008 12:58:26.909792 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eaffa46b-d533-4d5a-affa-67b276f93582-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "eaffa46b-d533-4d5a-affa-67b276f93582" (UID: "eaffa46b-d533-4d5a-affa-67b276f93582"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 12:58:26 crc kubenswrapper[4735]: I1008 12:58:26.997425 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eaffa46b-d533-4d5a-affa-67b276f93582-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 08 12:58:27 crc kubenswrapper[4735]: I1008 12:58:27.288996 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 08 12:58:27 crc kubenswrapper[4735]: I1008 12:58:27.293425 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"eaffa46b-d533-4d5a-affa-67b276f93582","Type":"ContainerDied","Data":"2c8c69aa42d2befe9e98041da457f7fafb2f91f498eb06e70cdefd4311dc1ea9"} Oct 08 12:58:27 crc kubenswrapper[4735]: I1008 12:58:27.293514 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c8c69aa42d2befe9e98041da457f7fafb2f91f498eb06e70cdefd4311dc1ea9" Oct 08 12:58:27 crc kubenswrapper[4735]: I1008 12:58:27.570899 4735 patch_prober.go:28] interesting pod/router-default-5444994796-jgkq6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 08 12:58:27 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld Oct 08 12:58:27 crc kubenswrapper[4735]: [+]process-running ok Oct 08 12:58:27 crc kubenswrapper[4735]: healthz check failed Oct 08 12:58:27 crc kubenswrapper[4735]: I1008 12:58:27.570973 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jgkq6" podUID="c4f3bfe2-c8ee-4ac0-915d-2be166ccbb6f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 08 12:58:28 crc kubenswrapper[4735]: I1008 12:58:28.572349 4735 patch_prober.go:28] interesting 
pod/router-default-5444994796-jgkq6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 08 12:58:28 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld Oct 08 12:58:28 crc kubenswrapper[4735]: [+]process-running ok Oct 08 12:58:28 crc kubenswrapper[4735]: healthz check failed Oct 08 12:58:28 crc kubenswrapper[4735]: I1008 12:58:28.572739 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jgkq6" podUID="c4f3bfe2-c8ee-4ac0-915d-2be166ccbb6f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 08 12:58:29 crc kubenswrapper[4735]: I1008 12:58:29.571705 4735 patch_prober.go:28] interesting pod/router-default-5444994796-jgkq6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 08 12:58:29 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld Oct 08 12:58:29 crc kubenswrapper[4735]: [+]process-running ok Oct 08 12:58:29 crc kubenswrapper[4735]: healthz check failed Oct 08 12:58:29 crc kubenswrapper[4735]: I1008 12:58:29.571781 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jgkq6" podUID="c4f3bfe2-c8ee-4ac0-915d-2be166ccbb6f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 08 12:58:30 crc kubenswrapper[4735]: I1008 12:58:30.571471 4735 patch_prober.go:28] interesting pod/router-default-5444994796-jgkq6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 08 12:58:30 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld Oct 08 12:58:30 crc kubenswrapper[4735]: 
[+]process-running ok Oct 08 12:58:30 crc kubenswrapper[4735]: healthz check failed Oct 08 12:58:30 crc kubenswrapper[4735]: I1008 12:58:30.571544 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jgkq6" podUID="c4f3bfe2-c8ee-4ac0-915d-2be166ccbb6f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 08 12:58:31 crc kubenswrapper[4735]: I1008 12:58:31.571768 4735 patch_prober.go:28] interesting pod/router-default-5444994796-jgkq6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 08 12:58:31 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld Oct 08 12:58:31 crc kubenswrapper[4735]: [+]process-running ok Oct 08 12:58:31 crc kubenswrapper[4735]: healthz check failed Oct 08 12:58:31 crc kubenswrapper[4735]: I1008 12:58:31.572046 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jgkq6" podUID="c4f3bfe2-c8ee-4ac0-915d-2be166ccbb6f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 08 12:58:32 crc kubenswrapper[4735]: I1008 12:58:32.083859 4735 patch_prober.go:28] interesting pod/console-f9d7485db-nnp5j container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.7:8443/health\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Oct 08 12:58:32 crc kubenswrapper[4735]: I1008 12:58:32.084005 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-nnp5j" podUID="ecbba0bb-2230-4624-8185-5d116632343f" containerName="console" probeResult="failure" output="Get \"https://10.217.0.7:8443/health\": dial tcp 10.217.0.7:8443: connect: connection refused" Oct 08 12:58:32 crc kubenswrapper[4735]: I1008 12:58:32.571724 4735 patch_prober.go:28] 
interesting pod/router-default-5444994796-jgkq6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 08 12:58:32 crc kubenswrapper[4735]: [+]has-synced ok Oct 08 12:58:32 crc kubenswrapper[4735]: [+]process-running ok Oct 08 12:58:32 crc kubenswrapper[4735]: healthz check failed Oct 08 12:58:32 crc kubenswrapper[4735]: I1008 12:58:32.571776 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jgkq6" podUID="c4f3bfe2-c8ee-4ac0-915d-2be166ccbb6f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 08 12:58:33 crc kubenswrapper[4735]: I1008 12:58:33.265078 4735 patch_prober.go:28] interesting pod/downloads-7954f5f757-887mh container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body= Oct 08 12:58:33 crc kubenswrapper[4735]: I1008 12:58:33.265552 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-887mh" podUID="14f021da-25d5-4a75-82f3-b6c20c65dc7c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" Oct 08 12:58:33 crc kubenswrapper[4735]: I1008 12:58:33.265129 4735 patch_prober.go:28] interesting pod/downloads-7954f5f757-887mh container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body= Oct 08 12:58:33 crc kubenswrapper[4735]: I1008 12:58:33.265658 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-887mh" podUID="14f021da-25d5-4a75-82f3-b6c20c65dc7c" containerName="download-server" 
probeResult="failure" output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" Oct 08 12:58:33 crc kubenswrapper[4735]: I1008 12:58:33.573157 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-jgkq6" Oct 08 12:58:33 crc kubenswrapper[4735]: I1008 12:58:33.579817 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-jgkq6" Oct 08 12:58:34 crc kubenswrapper[4735]: I1008 12:58:34.317974 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/771d7ef1-3d58-44eb-89b5-43a5da3e8091-metrics-certs\") pod \"network-metrics-daemon-mldxj\" (UID: \"771d7ef1-3d58-44eb-89b5-43a5da3e8091\") " pod="openshift-multus/network-metrics-daemon-mldxj" Oct 08 12:58:34 crc kubenswrapper[4735]: I1008 12:58:34.324113 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/771d7ef1-3d58-44eb-89b5-43a5da3e8091-metrics-certs\") pod \"network-metrics-daemon-mldxj\" (UID: \"771d7ef1-3d58-44eb-89b5-43a5da3e8091\") " pod="openshift-multus/network-metrics-daemon-mldxj" Oct 08 12:58:34 crc kubenswrapper[4735]: I1008 12:58:34.426498 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mldxj" Oct 08 12:58:40 crc kubenswrapper[4735]: I1008 12:58:40.530204 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c" Oct 08 12:58:42 crc kubenswrapper[4735]: I1008 12:58:42.094315 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-nnp5j" Oct 08 12:58:42 crc kubenswrapper[4735]: I1008 12:58:42.097639 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-nnp5j" Oct 08 12:58:43 crc kubenswrapper[4735]: I1008 12:58:43.282673 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-887mh" Oct 08 12:58:52 crc kubenswrapper[4735]: I1008 12:58:52.279469 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lrcx8" Oct 08 12:58:54 crc kubenswrapper[4735]: I1008 12:58:54.878989 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 12:58:54 crc kubenswrapper[4735]: I1008 12:58:54.879113 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 12:58:56 crc kubenswrapper[4735]: E1008 12:58:56.082756 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: 
copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 08 12:58:56 crc kubenswrapper[4735]: E1008 12:58:56.083094 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vz4s9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-xsk8d_openshift-marketplace(2f7c6d58-4dc2-4981-9722-0dd6cac9dea3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" 
logger="UnhandledError" Oct 08 12:58:56 crc kubenswrapper[4735]: E1008 12:58:56.084462 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-xsk8d" podUID="2f7c6d58-4dc2-4981-9722-0dd6cac9dea3" Oct 08 12:58:59 crc kubenswrapper[4735]: I1008 12:58:59.260267 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 12:58:59 crc kubenswrapper[4735]: E1008 12:58:59.525769 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 08 12:58:59 crc kubenswrapper[4735]: E1008 12:58:59.525941 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qtm22,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-qm4jg_openshift-marketplace(2ed19bd4-2ac1-495b-ac83-76fd9393dcdb): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 08 12:58:59 crc kubenswrapper[4735]: E1008 12:58:59.528687 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-qm4jg" podUID="2ed19bd4-2ac1-495b-ac83-76fd9393dcdb" Oct 08 12:59:00 crc 
kubenswrapper[4735]: E1008 12:59:00.774109 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-qm4jg" podUID="2ed19bd4-2ac1-495b-ac83-76fd9393dcdb" Oct 08 12:59:00 crc kubenswrapper[4735]: E1008 12:59:00.774682 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-xsk8d" podUID="2f7c6d58-4dc2-4981-9722-0dd6cac9dea3" Oct 08 12:59:04 crc kubenswrapper[4735]: E1008 12:59:04.401896 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 08 12:59:04 crc kubenswrapper[4735]: E1008 12:59:04.402408 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7xtd2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-d5hv5_openshift-marketplace(33b36b08-22c3-49b3-805e-bd38f2235a1e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 08 12:59:04 crc kubenswrapper[4735]: E1008 12:59:04.403681 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-d5hv5" podUID="33b36b08-22c3-49b3-805e-bd38f2235a1e" Oct 08 12:59:08 crc 
kubenswrapper[4735]: E1008 12:59:08.446752 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-d5hv5" podUID="33b36b08-22c3-49b3-805e-bd38f2235a1e" Oct 08 12:59:08 crc kubenswrapper[4735]: E1008 12:59:08.716380 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 08 12:59:08 crc kubenswrapper[4735]: E1008 12:59:08.716578 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4vqp2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-mmrz8_openshift-marketplace(9255e2a8-81b4-4743-816c-611181271899): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 08 12:59:08 crc kubenswrapper[4735]: E1008 12:59:08.719236 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-mmrz8" podUID="9255e2a8-81b4-4743-816c-611181271899" Oct 08 12:59:20 crc 
kubenswrapper[4735]: I1008 12:59:20.132938 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-mldxj"] Oct 08 12:59:20 crc kubenswrapper[4735]: I1008 12:59:20.630581 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mldxj" event={"ID":"771d7ef1-3d58-44eb-89b5-43a5da3e8091","Type":"ContainerStarted","Data":"f74df04759b7be0d9afc3695f89b1ac15e8452d495e478da701f8b56e86d1cbf"} Oct 08 12:59:20 crc kubenswrapper[4735]: I1008 12:59:20.631179 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mldxj" event={"ID":"771d7ef1-3d58-44eb-89b5-43a5da3e8091","Type":"ContainerStarted","Data":"edf83c36b43123ab4809bb95a7521394d3512aa5ef27c13b315124d97c42b3ff"} Oct 08 12:59:22 crc kubenswrapper[4735]: E1008 12:59:22.089048 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 08 12:59:22 crc kubenswrapper[4735]: E1008 12:59:22.089575 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r79sh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-tct8h_openshift-marketplace(6a47c541-ef48-4097-843e-609ef561cadc): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 08 12:59:22 crc kubenswrapper[4735]: E1008 12:59:22.090683 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-tct8h" podUID="6a47c541-ef48-4097-843e-609ef561cadc" Oct 08 12:59:24 crc 
kubenswrapper[4735]: E1008 12:59:24.798526 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 08 12:59:24 crc kubenswrapper[4735]: E1008 12:59:24.798757 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5blsh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-rzs56_openshift-marketplace(fb3785a1-c82f-48bd-8032-dee94a15b75a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 08 12:59:24 crc kubenswrapper[4735]: E1008 12:59:24.799976 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-rzs56" podUID="fb3785a1-c82f-48bd-8032-dee94a15b75a" Oct 08 12:59:24 crc kubenswrapper[4735]: E1008 12:59:24.832970 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 08 12:59:24 crc kubenswrapper[4735]: E1008 12:59:24.833189 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-76nn5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-mgkfd_openshift-marketplace(cbe9af65-d05a-4224-a554-361d0d7e9e11): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 08 12:59:24 crc kubenswrapper[4735]: E1008 12:59:24.834787 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-mgkfd" podUID="cbe9af65-d05a-4224-a554-361d0d7e9e11" Oct 08 12:59:24 crc 
kubenswrapper[4735]: I1008 12:59:24.879064 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 12:59:24 crc kubenswrapper[4735]: I1008 12:59:24.879134 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 12:59:24 crc kubenswrapper[4735]: I1008 12:59:24.879187 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" Oct 08 12:59:24 crc kubenswrapper[4735]: I1008 12:59:24.879769 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7fefb46d9936f914f5667f539f69ddaaf85342cc341db62c3eda312aa07a78d9"} pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 12:59:24 crc kubenswrapper[4735]: I1008 12:59:24.879865 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" containerID="cri-o://7fefb46d9936f914f5667f539f69ddaaf85342cc341db62c3eda312aa07a78d9" gracePeriod=600 Oct 08 12:59:25 crc kubenswrapper[4735]: E1008 12:59:25.435593 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" 
image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 08 12:59:25 crc kubenswrapper[4735]: E1008 12:59:25.436141 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lgfd4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-bzg9k_openshift-marketplace(107a4721-dfb9-4d45-abb5-5d2857e0985f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 08 12:59:25 crc 
kubenswrapper[4735]: E1008 12:59:25.437366 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-bzg9k" podUID="107a4721-dfb9-4d45-abb5-5d2857e0985f" Oct 08 12:59:25 crc kubenswrapper[4735]: E1008 12:59:25.495872 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-tct8h" podUID="6a47c541-ef48-4097-843e-609ef561cadc" Oct 08 12:59:25 crc kubenswrapper[4735]: I1008 12:59:25.660637 4735 generic.go:334] "Generic (PLEG): container finished" podID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerID="7fefb46d9936f914f5667f539f69ddaaf85342cc341db62c3eda312aa07a78d9" exitCode=0 Oct 08 12:59:25 crc kubenswrapper[4735]: I1008 12:59:25.660758 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" event={"ID":"2dfd0474-f64a-4d3b-9990-df182d079b10","Type":"ContainerDied","Data":"7fefb46d9936f914f5667f539f69ddaaf85342cc341db62c3eda312aa07a78d9"} Oct 08 12:59:26 crc kubenswrapper[4735]: E1008 12:59:26.865355 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-mgkfd" podUID="cbe9af65-d05a-4224-a554-361d0d7e9e11" Oct 08 12:59:26 crc kubenswrapper[4735]: E1008 12:59:26.865366 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-rzs56" podUID="fb3785a1-c82f-48bd-8032-dee94a15b75a" Oct 08 12:59:26 crc kubenswrapper[4735]: E1008 12:59:26.865453 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-bzg9k" podUID="107a4721-dfb9-4d45-abb5-5d2857e0985f" Oct 08 12:59:27 crc kubenswrapper[4735]: I1008 12:59:27.688789 4735 generic.go:334] "Generic (PLEG): container finished" podID="2ed19bd4-2ac1-495b-ac83-76fd9393dcdb" containerID="ed69eee8597b1d73cc583a6fbf4ccafb480b939eb37e18fd97b6fa53686837ce" exitCode=0 Oct 08 12:59:27 crc kubenswrapper[4735]: I1008 12:59:27.688866 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qm4jg" event={"ID":"2ed19bd4-2ac1-495b-ac83-76fd9393dcdb","Type":"ContainerDied","Data":"ed69eee8597b1d73cc583a6fbf4ccafb480b939eb37e18fd97b6fa53686837ce"} Oct 08 12:59:27 crc kubenswrapper[4735]: I1008 12:59:27.691658 4735 generic.go:334] "Generic (PLEG): container finished" podID="33b36b08-22c3-49b3-805e-bd38f2235a1e" containerID="f045b9691d7f76eae0a179ffdfc91a1fd085367220400ea36af4fdf11c1bde16" exitCode=0 Oct 08 12:59:27 crc kubenswrapper[4735]: I1008 12:59:27.691751 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d5hv5" event={"ID":"33b36b08-22c3-49b3-805e-bd38f2235a1e","Type":"ContainerDied","Data":"f045b9691d7f76eae0a179ffdfc91a1fd085367220400ea36af4fdf11c1bde16"} Oct 08 12:59:27 crc kubenswrapper[4735]: I1008 12:59:27.697638 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" 
event={"ID":"2dfd0474-f64a-4d3b-9990-df182d079b10","Type":"ContainerStarted","Data":"2a7f255e777881316b10062a37c3969614ae340b33847b5020f89127e16dd753"} Oct 08 12:59:27 crc kubenswrapper[4735]: I1008 12:59:27.700207 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mmrz8" event={"ID":"9255e2a8-81b4-4743-816c-611181271899","Type":"ContainerStarted","Data":"407db0863f505e5a24f5e37f721b5c4d4f06cd6160e109833b90971c41320632"} Oct 08 12:59:27 crc kubenswrapper[4735]: I1008 12:59:27.704347 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mldxj" event={"ID":"771d7ef1-3d58-44eb-89b5-43a5da3e8091","Type":"ContainerStarted","Data":"9ac012f4ee177e5d5112d3ff904c553da6452548b0de20f6bb1f831e26d7657d"} Oct 08 12:59:27 crc kubenswrapper[4735]: I1008 12:59:27.708614 4735 generic.go:334] "Generic (PLEG): container finished" podID="2f7c6d58-4dc2-4981-9722-0dd6cac9dea3" containerID="ef2aa9d5a89b417bb8bb400cf791e08f791b5242a0cbe9d7532b3a5b1a7c631e" exitCode=0 Oct 08 12:59:27 crc kubenswrapper[4735]: I1008 12:59:27.708721 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xsk8d" event={"ID":"2f7c6d58-4dc2-4981-9722-0dd6cac9dea3","Type":"ContainerDied","Data":"ef2aa9d5a89b417bb8bb400cf791e08f791b5242a0cbe9d7532b3a5b1a7c631e"} Oct 08 12:59:27 crc kubenswrapper[4735]: I1008 12:59:27.730898 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-mldxj" podStartSLOduration=196.730875256 podStartE2EDuration="3m16.730875256s" podCreationTimestamp="2025-10-08 12:56:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 12:59:27.72595459 +0000 UTC m=+217.534259134" watchObservedRunningTime="2025-10-08 12:59:27.730875256 +0000 UTC m=+217.539179800" Oct 08 12:59:28 crc kubenswrapper[4735]: I1008 
12:59:28.715416 4735 generic.go:334] "Generic (PLEG): container finished" podID="9255e2a8-81b4-4743-816c-611181271899" containerID="407db0863f505e5a24f5e37f721b5c4d4f06cd6160e109833b90971c41320632" exitCode=0 Oct 08 12:59:28 crc kubenswrapper[4735]: I1008 12:59:28.715502 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mmrz8" event={"ID":"9255e2a8-81b4-4743-816c-611181271899","Type":"ContainerDied","Data":"407db0863f505e5a24f5e37f721b5c4d4f06cd6160e109833b90971c41320632"} Oct 08 12:59:28 crc kubenswrapper[4735]: I1008 12:59:28.716834 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mmrz8" event={"ID":"9255e2a8-81b4-4743-816c-611181271899","Type":"ContainerStarted","Data":"157c010bed4355c919a7f9eebc6f960cefc30931655ba08c9a234382bc2aa20d"} Oct 08 12:59:28 crc kubenswrapper[4735]: I1008 12:59:28.718669 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xsk8d" event={"ID":"2f7c6d58-4dc2-4981-9722-0dd6cac9dea3","Type":"ContainerStarted","Data":"10cd7accf262c787a40bf20cc0ab1b7d848a32c8852ec653fe843203fd3b90db"} Oct 08 12:59:28 crc kubenswrapper[4735]: I1008 12:59:28.724722 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qm4jg" event={"ID":"2ed19bd4-2ac1-495b-ac83-76fd9393dcdb","Type":"ContainerStarted","Data":"0edb383fd38e0f56110d978d0e4ca7dc4a14ac3fb46fc1c8374a4c9866bff897"} Oct 08 12:59:28 crc kubenswrapper[4735]: I1008 12:59:28.726969 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d5hv5" event={"ID":"33b36b08-22c3-49b3-805e-bd38f2235a1e","Type":"ContainerStarted","Data":"b6112f431d3cb9b36575be1a98bc788a313a852de4b7ab646aa5430750f510d0"} Oct 08 12:59:28 crc kubenswrapper[4735]: I1008 12:59:28.740216 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-operators-mmrz8" podStartSLOduration=2.7510266789999998 podStartE2EDuration="1m6.740195504s" podCreationTimestamp="2025-10-08 12:58:22 +0000 UTC" firstStartedPulling="2025-10-08 12:58:24.203675667 +0000 UTC m=+154.011980211" lastFinishedPulling="2025-10-08 12:59:28.192844492 +0000 UTC m=+218.001149036" observedRunningTime="2025-10-08 12:59:28.739075265 +0000 UTC m=+218.547379809" watchObservedRunningTime="2025-10-08 12:59:28.740195504 +0000 UTC m=+218.548500048"
Oct 08 12:59:28 crc kubenswrapper[4735]: I1008 12:59:28.757279 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qm4jg" podStartSLOduration=2.711281827 podStartE2EDuration="1m7.757249203s" podCreationTimestamp="2025-10-08 12:58:21 +0000 UTC" firstStartedPulling="2025-10-08 12:58:23.120223541 +0000 UTC m=+152.928528085" lastFinishedPulling="2025-10-08 12:59:28.166190917 +0000 UTC m=+217.974495461" observedRunningTime="2025-10-08 12:59:28.752551172 +0000 UTC m=+218.560855716" watchObservedRunningTime="2025-10-08 12:59:28.757249203 +0000 UTC m=+218.565553747"
Oct 08 12:59:28 crc kubenswrapper[4735]: I1008 12:59:28.774243 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xsk8d" podStartSLOduration=2.672751626 podStartE2EDuration="1m7.77421332s" podCreationTimestamp="2025-10-08 12:58:21 +0000 UTC" firstStartedPulling="2025-10-08 12:58:23.119755609 +0000 UTC m=+152.928060153" lastFinishedPulling="2025-10-08 12:59:28.221217303 +0000 UTC m=+218.029521847" observedRunningTime="2025-10-08 12:59:28.770510184 +0000 UTC m=+218.578814738" watchObservedRunningTime="2025-10-08 12:59:28.77421332 +0000 UTC m=+218.582517864"
Oct 08 12:59:28 crc kubenswrapper[4735]: I1008 12:59:28.803447 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-d5hv5" podStartSLOduration=3.661834035 podStartE2EDuration="1m9.803430401s" podCreationTimestamp="2025-10-08 12:58:19 +0000 UTC" firstStartedPulling="2025-10-08 12:58:21.971507875 +0000 UTC m=+151.779812419" lastFinishedPulling="2025-10-08 12:59:28.113104241 +0000 UTC m=+217.921408785" observedRunningTime="2025-10-08 12:59:28.800565568 +0000 UTC m=+218.608870132" watchObservedRunningTime="2025-10-08 12:59:28.803430401 +0000 UTC m=+218.611734945"
Oct 08 12:59:29 crc kubenswrapper[4735]: I1008 12:59:29.737931 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-d5hv5"
Oct 08 12:59:29 crc kubenswrapper[4735]: I1008 12:59:29.737976 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-d5hv5"
Oct 08 12:59:30 crc kubenswrapper[4735]: I1008 12:59:30.895065 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-d5hv5" podUID="33b36b08-22c3-49b3-805e-bd38f2235a1e" containerName="registry-server" probeResult="failure" output=<
Oct 08 12:59:30 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s
Oct 08 12:59:30 crc kubenswrapper[4735]: >
Oct 08 12:59:31 crc kubenswrapper[4735]: I1008 12:59:31.626075 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qm4jg"
Oct 08 12:59:31 crc kubenswrapper[4735]: I1008 12:59:31.626405 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qm4jg"
Oct 08 12:59:31 crc kubenswrapper[4735]: I1008 12:59:31.680134 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qm4jg"
Oct 08 12:59:32 crc kubenswrapper[4735]: I1008 12:59:32.175761 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xsk8d"
Oct 08 12:59:32 crc kubenswrapper[4735]: I1008 12:59:32.175823 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xsk8d"
Oct 08 12:59:32 crc kubenswrapper[4735]: I1008 12:59:32.214720 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xsk8d"
Oct 08 12:59:32 crc kubenswrapper[4735]: I1008 12:59:32.851982 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mmrz8"
Oct 08 12:59:32 crc kubenswrapper[4735]: I1008 12:59:32.852405 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mmrz8"
Oct 08 12:59:33 crc kubenswrapper[4735]: I1008 12:59:33.889098 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mmrz8" podUID="9255e2a8-81b4-4743-816c-611181271899" containerName="registry-server" probeResult="failure" output=<
Oct 08 12:59:33 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s
Oct 08 12:59:33 crc kubenswrapper[4735]: >
Oct 08 12:59:39 crc kubenswrapper[4735]: I1008 12:59:39.795354 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-d5hv5"
Oct 08 12:59:39 crc kubenswrapper[4735]: I1008 12:59:39.861421 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-d5hv5"
Oct 08 12:59:41 crc kubenswrapper[4735]: I1008 12:59:41.684811 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qm4jg"
Oct 08 12:59:42 crc kubenswrapper[4735]: I1008 12:59:42.225547 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xsk8d"
Oct 08 12:59:42 crc kubenswrapper[4735]: I1008 12:59:42.890047 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mmrz8"
Oct 08 12:59:42 crc kubenswrapper[4735]: I1008 12:59:42.934953 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mmrz8"
Oct 08 12:59:44 crc kubenswrapper[4735]: I1008 12:59:44.153209 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xsk8d"]
Oct 08 12:59:44 crc kubenswrapper[4735]: I1008 12:59:44.153470 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xsk8d" podUID="2f7c6d58-4dc2-4981-9722-0dd6cac9dea3" containerName="registry-server" containerID="cri-o://10cd7accf262c787a40bf20cc0ab1b7d848a32c8852ec653fe843203fd3b90db" gracePeriod=2
Oct 08 12:59:44 crc kubenswrapper[4735]: I1008 12:59:44.830984 4735 generic.go:334] "Generic (PLEG): container finished" podID="2f7c6d58-4dc2-4981-9722-0dd6cac9dea3" containerID="10cd7accf262c787a40bf20cc0ab1b7d848a32c8852ec653fe843203fd3b90db" exitCode=0
Oct 08 12:59:44 crc kubenswrapper[4735]: I1008 12:59:44.831038 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xsk8d" event={"ID":"2f7c6d58-4dc2-4981-9722-0dd6cac9dea3","Type":"ContainerDied","Data":"10cd7accf262c787a40bf20cc0ab1b7d848a32c8852ec653fe843203fd3b90db"}
Oct 08 12:59:46 crc kubenswrapper[4735]: I1008 12:59:46.946590 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xsk8d"
Oct 08 12:59:46 crc kubenswrapper[4735]: I1008 12:59:46.955012 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f7c6d58-4dc2-4981-9722-0dd6cac9dea3-catalog-content\") pod \"2f7c6d58-4dc2-4981-9722-0dd6cac9dea3\" (UID: \"2f7c6d58-4dc2-4981-9722-0dd6cac9dea3\") "
Oct 08 12:59:46 crc kubenswrapper[4735]: I1008 12:59:46.955368 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f7c6d58-4dc2-4981-9722-0dd6cac9dea3-utilities\") pod \"2f7c6d58-4dc2-4981-9722-0dd6cac9dea3\" (UID: \"2f7c6d58-4dc2-4981-9722-0dd6cac9dea3\") "
Oct 08 12:59:46 crc kubenswrapper[4735]: I1008 12:59:46.956360 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vz4s9\" (UniqueName: \"kubernetes.io/projected/2f7c6d58-4dc2-4981-9722-0dd6cac9dea3-kube-api-access-vz4s9\") pod \"2f7c6d58-4dc2-4981-9722-0dd6cac9dea3\" (UID: \"2f7c6d58-4dc2-4981-9722-0dd6cac9dea3\") "
Oct 08 12:59:46 crc kubenswrapper[4735]: I1008 12:59:46.956443 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f7c6d58-4dc2-4981-9722-0dd6cac9dea3-utilities" (OuterVolumeSpecName: "utilities") pod "2f7c6d58-4dc2-4981-9722-0dd6cac9dea3" (UID: "2f7c6d58-4dc2-4981-9722-0dd6cac9dea3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 08 12:59:46 crc kubenswrapper[4735]: I1008 12:59:46.956978 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f7c6d58-4dc2-4981-9722-0dd6cac9dea3-utilities\") on node \"crc\" DevicePath \"\""
Oct 08 12:59:46 crc kubenswrapper[4735]: I1008 12:59:46.970141 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f7c6d58-4dc2-4981-9722-0dd6cac9dea3-kube-api-access-vz4s9" (OuterVolumeSpecName: "kube-api-access-vz4s9") pod "2f7c6d58-4dc2-4981-9722-0dd6cac9dea3" (UID: "2f7c6d58-4dc2-4981-9722-0dd6cac9dea3"). InnerVolumeSpecName "kube-api-access-vz4s9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 12:59:46 crc kubenswrapper[4735]: I1008 12:59:46.978229 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f7c6d58-4dc2-4981-9722-0dd6cac9dea3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2f7c6d58-4dc2-4981-9722-0dd6cac9dea3" (UID: "2f7c6d58-4dc2-4981-9722-0dd6cac9dea3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 08 12:59:47 crc kubenswrapper[4735]: I1008 12:59:47.057713 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vz4s9\" (UniqueName: \"kubernetes.io/projected/2f7c6d58-4dc2-4981-9722-0dd6cac9dea3-kube-api-access-vz4s9\") on node \"crc\" DevicePath \"\""
Oct 08 12:59:47 crc kubenswrapper[4735]: I1008 12:59:47.057748 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f7c6d58-4dc2-4981-9722-0dd6cac9dea3-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 08 12:59:47 crc kubenswrapper[4735]: I1008 12:59:47.851792 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xsk8d" event={"ID":"2f7c6d58-4dc2-4981-9722-0dd6cac9dea3","Type":"ContainerDied","Data":"276e982bb04288df9752696c01e1a8e87d7a19fcf811e79272242e9b4d10ae73"}
Oct 08 12:59:47 crc kubenswrapper[4735]: I1008 12:59:47.851902 4735 scope.go:117] "RemoveContainer" containerID="10cd7accf262c787a40bf20cc0ab1b7d848a32c8852ec653fe843203fd3b90db"
Oct 08 12:59:47 crc kubenswrapper[4735]: I1008 12:59:47.852128 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xsk8d"
Oct 08 12:59:47 crc kubenswrapper[4735]: I1008 12:59:47.892515 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xsk8d"]
Oct 08 12:59:47 crc kubenswrapper[4735]: I1008 12:59:47.894673 4735 scope.go:117] "RemoveContainer" containerID="ef2aa9d5a89b417bb8bb400cf791e08f791b5242a0cbe9d7532b3a5b1a7c631e"
Oct 08 12:59:47 crc kubenswrapper[4735]: I1008 12:59:47.902974 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xsk8d"]
Oct 08 12:59:47 crc kubenswrapper[4735]: I1008 12:59:47.918125 4735 scope.go:117] "RemoveContainer" containerID="dbc7f4702201b962505aa2f2de838bca6bf182373d7c7e657ca1ff322f4ad283"
Oct 08 12:59:48 crc kubenswrapper[4735]: I1008 12:59:48.862667 4735 generic.go:334] "Generic (PLEG): container finished" podID="107a4721-dfb9-4d45-abb5-5d2857e0985f" containerID="15bf0e8401118308fa2a1f54f676c1ce10f82819033b0b4fcdc11e33d6166d11" exitCode=0
Oct 08 12:59:48 crc kubenswrapper[4735]: I1008 12:59:48.862712 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bzg9k" event={"ID":"107a4721-dfb9-4d45-abb5-5d2857e0985f","Type":"ContainerDied","Data":"15bf0e8401118308fa2a1f54f676c1ce10f82819033b0b4fcdc11e33d6166d11"}
Oct 08 12:59:48 crc kubenswrapper[4735]: I1008 12:59:48.866281 4735 generic.go:334] "Generic (PLEG): container finished" podID="fb3785a1-c82f-48bd-8032-dee94a15b75a" containerID="87426dfca7dae6f97a304693bc6096d5cc1de2a185356ee3f073d37e7235bc08" exitCode=0
Oct 08 12:59:48 crc kubenswrapper[4735]: I1008 12:59:48.866327 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rzs56" event={"ID":"fb3785a1-c82f-48bd-8032-dee94a15b75a","Type":"ContainerDied","Data":"87426dfca7dae6f97a304693bc6096d5cc1de2a185356ee3f073d37e7235bc08"}
Oct 08 12:59:48 crc kubenswrapper[4735]: I1008 12:59:48.870291 4735 generic.go:334] "Generic (PLEG): container finished" podID="6a47c541-ef48-4097-843e-609ef561cadc" containerID="f43ff6881a40cd3e8f39f3345b1018d4500543c789bcd26dcf5632ba267ec148" exitCode=0
Oct 08 12:59:48 crc kubenswrapper[4735]: I1008 12:59:48.870340 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tct8h" event={"ID":"6a47c541-ef48-4097-843e-609ef561cadc","Type":"ContainerDied","Data":"f43ff6881a40cd3e8f39f3345b1018d4500543c789bcd26dcf5632ba267ec148"}
Oct 08 12:59:48 crc kubenswrapper[4735]: I1008 12:59:48.874961 4735 generic.go:334] "Generic (PLEG): container finished" podID="cbe9af65-d05a-4224-a554-361d0d7e9e11" containerID="a2112ac60fe10b2c45efba2c3ebbf767bc662ced2f5e90cc48b17554cea685e1" exitCode=0
Oct 08 12:59:48 crc kubenswrapper[4735]: I1008 12:59:48.875016 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mgkfd" event={"ID":"cbe9af65-d05a-4224-a554-361d0d7e9e11","Type":"ContainerDied","Data":"a2112ac60fe10b2c45efba2c3ebbf767bc662ced2f5e90cc48b17554cea685e1"}
Oct 08 12:59:48 crc kubenswrapper[4735]: I1008 12:59:48.909939 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f7c6d58-4dc2-4981-9722-0dd6cac9dea3" path="/var/lib/kubelet/pods/2f7c6d58-4dc2-4981-9722-0dd6cac9dea3/volumes"
Oct 08 12:59:50 crc kubenswrapper[4735]: I1008 12:59:50.900385 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rzs56" event={"ID":"fb3785a1-c82f-48bd-8032-dee94a15b75a","Type":"ContainerStarted","Data":"cb26abb9473fc8422f48ea395f50814bf9bb994847e9f18baea3aec7a5fbabd4"}
Oct 08 12:59:50 crc kubenswrapper[4735]: I1008 12:59:50.904335 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bzg9k" event={"ID":"107a4721-dfb9-4d45-abb5-5d2857e0985f","Type":"ContainerStarted","Data":"cb32dc83c4898d5fcca838de953fbf914a587a3e6e71f0340def4e16790ca69d"}
Oct 08 12:59:50 crc kubenswrapper[4735]: I1008 12:59:50.906600 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tct8h" event={"ID":"6a47c541-ef48-4097-843e-609ef561cadc","Type":"ContainerStarted","Data":"1b337b9edf91b6e186e4a9f41d26448d67f23700647fe46366c435b0620b5ed4"}
Oct 08 12:59:50 crc kubenswrapper[4735]: I1008 12:59:50.910610 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mgkfd" event={"ID":"cbe9af65-d05a-4224-a554-361d0d7e9e11","Type":"ContainerStarted","Data":"39acd95ac2ddb1562b2229356b6058943b28dc1b6f5e1eacc49c0f312ebf9c9d"}
Oct 08 12:59:50 crc kubenswrapper[4735]: I1008 12:59:50.926323 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rzs56" podStartSLOduration=4.033722704 podStartE2EDuration="1m31.926307288s" podCreationTimestamp="2025-10-08 12:58:19 +0000 UTC" firstStartedPulling="2025-10-08 12:58:21.971913155 +0000 UTC m=+151.780217709" lastFinishedPulling="2025-10-08 12:59:49.864497729 +0000 UTC m=+239.672802293" observedRunningTime="2025-10-08 12:59:50.92329604 +0000 UTC m=+240.731600594" watchObservedRunningTime="2025-10-08 12:59:50.926307288 +0000 UTC m=+240.734611832"
Oct 08 12:59:50 crc kubenswrapper[4735]: I1008 12:59:50.942298 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tct8h" podStartSLOduration=3.862708055 podStartE2EDuration="1m31.942273898s" podCreationTimestamp="2025-10-08 12:58:19 +0000 UTC" firstStartedPulling="2025-10-08 12:58:22.000391478 +0000 UTC m=+151.808696032" lastFinishedPulling="2025-10-08 12:59:50.079957341 +0000 UTC m=+239.888261875" observedRunningTime="2025-10-08 12:59:50.939589029 +0000 UTC m=+240.747893583" watchObservedRunningTime="2025-10-08 12:59:50.942273898 +0000 UTC m=+240.750578442"
Oct 08 12:59:50 crc kubenswrapper[4735]: I1008 12:59:50.958249 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bzg9k" podStartSLOduration=3.163193803 podStartE2EDuration="1m28.958230819s" podCreationTimestamp="2025-10-08 12:58:22 +0000 UTC" firstStartedPulling="2025-10-08 12:58:24.199305404 +0000 UTC m=+154.007609948" lastFinishedPulling="2025-10-08 12:59:49.99434238 +0000 UTC m=+239.802646964" observedRunningTime="2025-10-08 12:59:50.957648664 +0000 UTC m=+240.765953228" watchObservedRunningTime="2025-10-08 12:59:50.958230819 +0000 UTC m=+240.766535363"
Oct 08 12:59:50 crc kubenswrapper[4735]: I1008 12:59:50.977079 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mgkfd" podStartSLOduration=3.763232166 podStartE2EDuration="1m31.977061604s" podCreationTimestamp="2025-10-08 12:58:19 +0000 UTC" firstStartedPulling="2025-10-08 12:58:22.064950049 +0000 UTC m=+151.873254593" lastFinishedPulling="2025-10-08 12:59:50.278779487 +0000 UTC m=+240.087084031" observedRunningTime="2025-10-08 12:59:50.974273452 +0000 UTC m=+240.782577996" watchObservedRunningTime="2025-10-08 12:59:50.977061604 +0000 UTC m=+240.785366148"
Oct 08 12:59:53 crc kubenswrapper[4735]: I1008 12:59:53.258010 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bzg9k"
Oct 08 12:59:53 crc kubenswrapper[4735]: I1008 12:59:53.258540 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bzg9k"
Oct 08 12:59:54 crc kubenswrapper[4735]: I1008 12:59:54.303959 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bzg9k" podUID="107a4721-dfb9-4d45-abb5-5d2857e0985f" containerName="registry-server" probeResult="failure" output=<
Oct 08 12:59:54 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s
Oct 08 12:59:54 crc kubenswrapper[4735]: >
Oct 08 12:59:59 crc kubenswrapper[4735]: I1008 12:59:59.964981 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tct8h"
Oct 08 12:59:59 crc kubenswrapper[4735]: I1008 12:59:59.965284 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tct8h"
Oct 08 13:00:00 crc kubenswrapper[4735]: I1008 13:00:00.011910 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tct8h"
Oct 08 13:00:00 crc kubenswrapper[4735]: I1008 13:00:00.134590 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332140-nbw48"]
Oct 08 13:00:00 crc kubenswrapper[4735]: E1008 13:00:00.135143 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f7c6d58-4dc2-4981-9722-0dd6cac9dea3" containerName="extract-content"
Oct 08 13:00:00 crc kubenswrapper[4735]: I1008 13:00:00.135230 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f7c6d58-4dc2-4981-9722-0dd6cac9dea3" containerName="extract-content"
Oct 08 13:00:00 crc kubenswrapper[4735]: E1008 13:00:00.135302 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f7c6d58-4dc2-4981-9722-0dd6cac9dea3" containerName="extract-utilities"
Oct 08 13:00:00 crc kubenswrapper[4735]: I1008 13:00:00.135361 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f7c6d58-4dc2-4981-9722-0dd6cac9dea3" containerName="extract-utilities"
Oct 08 13:00:00 crc kubenswrapper[4735]: E1008 13:00:00.135418 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaffa46b-d533-4d5a-affa-67b276f93582" containerName="pruner"
Oct 08 13:00:00 crc kubenswrapper[4735]: I1008 13:00:00.135485 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaffa46b-d533-4d5a-affa-67b276f93582" containerName="pruner"
Oct 08 13:00:00 crc kubenswrapper[4735]: E1008 13:00:00.135554 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4c2c24e-ef81-4e1b-9503-212aa8e99730" containerName="pruner"
Oct 08 13:00:00 crc kubenswrapper[4735]: I1008 13:00:00.135618 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4c2c24e-ef81-4e1b-9503-212aa8e99730" containerName="pruner"
Oct 08 13:00:00 crc kubenswrapper[4735]: E1008 13:00:00.135675 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f7c6d58-4dc2-4981-9722-0dd6cac9dea3" containerName="registry-server"
Oct 08 13:00:00 crc kubenswrapper[4735]: I1008 13:00:00.135732 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f7c6d58-4dc2-4981-9722-0dd6cac9dea3" containerName="registry-server"
Oct 08 13:00:00 crc kubenswrapper[4735]: I1008 13:00:00.135919 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaffa46b-d533-4d5a-affa-67b276f93582" containerName="pruner"
Oct 08 13:00:00 crc kubenswrapper[4735]: I1008 13:00:00.135984 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4c2c24e-ef81-4e1b-9503-212aa8e99730" containerName="pruner"
Oct 08 13:00:00 crc kubenswrapper[4735]: I1008 13:00:00.136046 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f7c6d58-4dc2-4981-9722-0dd6cac9dea3" containerName="registry-server"
Oct 08 13:00:00 crc kubenswrapper[4735]: I1008 13:00:00.136495 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332140-nbw48"
Oct 08 13:00:00 crc kubenswrapper[4735]: I1008 13:00:00.142326 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332140-nbw48"]
Oct 08 13:00:00 crc kubenswrapper[4735]: I1008 13:00:00.143486 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mgkfd"
Oct 08 13:00:00 crc kubenswrapper[4735]: I1008 13:00:00.143523 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mgkfd"
Oct 08 13:00:00 crc kubenswrapper[4735]: I1008 13:00:00.162254 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 08 13:00:00 crc kubenswrapper[4735]: I1008 13:00:00.162461 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 08 13:00:00 crc kubenswrapper[4735]: I1008 13:00:00.256911 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mgkfd"
Oct 08 13:00:00 crc kubenswrapper[4735]: I1008 13:00:00.270629 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rzs56"
Oct 08 13:00:00 crc kubenswrapper[4735]: I1008 13:00:00.271104 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rzs56"
Oct 08 13:00:00 crc kubenswrapper[4735]: I1008 13:00:00.306520 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rzs56"
Oct 08 13:00:00 crc kubenswrapper[4735]: I1008 13:00:00.319396 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxtn6\" (UniqueName: \"kubernetes.io/projected/30287a9c-9d15-434b-95a4-549d6cbe491b-kube-api-access-nxtn6\") pod \"collect-profiles-29332140-nbw48\" (UID: \"30287a9c-9d15-434b-95a4-549d6cbe491b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332140-nbw48"
Oct 08 13:00:00 crc kubenswrapper[4735]: I1008 13:00:00.320525 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/30287a9c-9d15-434b-95a4-549d6cbe491b-secret-volume\") pod \"collect-profiles-29332140-nbw48\" (UID: \"30287a9c-9d15-434b-95a4-549d6cbe491b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332140-nbw48"
Oct 08 13:00:00 crc kubenswrapper[4735]: I1008 13:00:00.320755 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/30287a9c-9d15-434b-95a4-549d6cbe491b-config-volume\") pod \"collect-profiles-29332140-nbw48\" (UID: \"30287a9c-9d15-434b-95a4-549d6cbe491b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332140-nbw48"
Oct 08 13:00:00 crc kubenswrapper[4735]: I1008 13:00:00.423932 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/30287a9c-9d15-434b-95a4-549d6cbe491b-secret-volume\") pod \"collect-profiles-29332140-nbw48\" (UID: \"30287a9c-9d15-434b-95a4-549d6cbe491b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332140-nbw48"
Oct 08 13:00:00 crc kubenswrapper[4735]: I1008 13:00:00.424011 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/30287a9c-9d15-434b-95a4-549d6cbe491b-config-volume\") pod \"collect-profiles-29332140-nbw48\" (UID: \"30287a9c-9d15-434b-95a4-549d6cbe491b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332140-nbw48"
Oct 08 13:00:00 crc kubenswrapper[4735]: I1008 13:00:00.424099 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxtn6\" (UniqueName: \"kubernetes.io/projected/30287a9c-9d15-434b-95a4-549d6cbe491b-kube-api-access-nxtn6\") pod \"collect-profiles-29332140-nbw48\" (UID: \"30287a9c-9d15-434b-95a4-549d6cbe491b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332140-nbw48"
Oct 08 13:00:00 crc kubenswrapper[4735]: I1008 13:00:00.425369 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/30287a9c-9d15-434b-95a4-549d6cbe491b-config-volume\") pod \"collect-profiles-29332140-nbw48\" (UID: \"30287a9c-9d15-434b-95a4-549d6cbe491b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332140-nbw48"
Oct 08 13:00:00 crc kubenswrapper[4735]: I1008 13:00:00.430211 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/30287a9c-9d15-434b-95a4-549d6cbe491b-secret-volume\") pod \"collect-profiles-29332140-nbw48\" (UID: \"30287a9c-9d15-434b-95a4-549d6cbe491b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332140-nbw48"
Oct 08 13:00:00 crc kubenswrapper[4735]: I1008 13:00:00.442619 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxtn6\" (UniqueName: \"kubernetes.io/projected/30287a9c-9d15-434b-95a4-549d6cbe491b-kube-api-access-nxtn6\") pod \"collect-profiles-29332140-nbw48\" (UID: \"30287a9c-9d15-434b-95a4-549d6cbe491b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332140-nbw48"
Oct 08 13:00:00 crc kubenswrapper[4735]: I1008 13:00:00.534463 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332140-nbw48"
Oct 08 13:00:00 crc kubenswrapper[4735]: I1008 13:00:00.914794 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332140-nbw48"]
Oct 08 13:00:00 crc kubenswrapper[4735]: I1008 13:00:00.972410 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332140-nbw48" event={"ID":"30287a9c-9d15-434b-95a4-549d6cbe491b","Type":"ContainerStarted","Data":"03c184bc76c76134e8c17a6218f53c23f755eaf1cff9fe0cf9ab46f634c605aa"}
Oct 08 13:00:01 crc kubenswrapper[4735]: I1008 13:00:01.009823 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rzs56"
Oct 08 13:00:01 crc kubenswrapper[4735]: I1008 13:00:01.010682 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tct8h"
Oct 08 13:00:01 crc kubenswrapper[4735]: I1008 13:00:01.016560 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mgkfd"
Oct 08 13:00:01 crc kubenswrapper[4735]: I1008 13:00:01.238409 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rzs56"]
Oct 08 13:00:02 crc kubenswrapper[4735]: I1008 13:00:02.987284 4735 generic.go:334] "Generic (PLEG): container finished" podID="30287a9c-9d15-434b-95a4-549d6cbe491b" containerID="13f11ef6a7c873775b11fefc2a1cc60906dbaf3cb69cbea566f5676133120596" exitCode=0
Oct 08 13:00:02 crc kubenswrapper[4735]: I1008 13:00:02.987342 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332140-nbw48" event={"ID":"30287a9c-9d15-434b-95a4-549d6cbe491b","Type":"ContainerDied","Data":"13f11ef6a7c873775b11fefc2a1cc60906dbaf3cb69cbea566f5676133120596"}
Oct 08 13:00:02 crc kubenswrapper[4735]: I1008 13:00:02.987715 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rzs56" podUID="fb3785a1-c82f-48bd-8032-dee94a15b75a" containerName="registry-server" containerID="cri-o://cb26abb9473fc8422f48ea395f50814bf9bb994847e9f18baea3aec7a5fbabd4" gracePeriod=2
Oct 08 13:00:03 crc kubenswrapper[4735]: E1008 13:00:03.111265 4735 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb3785a1_c82f_48bd_8032_dee94a15b75a.slice/crio-cb26abb9473fc8422f48ea395f50814bf9bb994847e9f18baea3aec7a5fbabd4.scope\": RecentStats: unable to find data in memory cache]"
Oct 08 13:00:03 crc kubenswrapper[4735]: I1008 13:00:03.294829 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bzg9k"
Oct 08 13:00:03 crc kubenswrapper[4735]: I1008 13:00:03.336666 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bzg9k"
Oct 08 13:00:03 crc kubenswrapper[4735]: I1008 13:00:03.440146 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mgkfd"]
Oct 08 13:00:03 crc kubenswrapper[4735]: I1008 13:00:03.440771 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mgkfd" podUID="cbe9af65-d05a-4224-a554-361d0d7e9e11" containerName="registry-server" containerID="cri-o://39acd95ac2ddb1562b2229356b6058943b28dc1b6f5e1eacc49c0f312ebf9c9d" gracePeriod=2
Oct 08 13:00:03 crc kubenswrapper[4735]: I1008 13:00:03.886818 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rzs56"
Oct 08 13:00:03 crc kubenswrapper[4735]: I1008 13:00:03.994766 4735 generic.go:334] "Generic (PLEG): container finished" podID="fb3785a1-c82f-48bd-8032-dee94a15b75a" containerID="cb26abb9473fc8422f48ea395f50814bf9bb994847e9f18baea3aec7a5fbabd4" exitCode=0
Oct 08 13:00:03 crc kubenswrapper[4735]: I1008 13:00:03.994823 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rzs56"
Oct 08 13:00:03 crc kubenswrapper[4735]: I1008 13:00:03.994844 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rzs56" event={"ID":"fb3785a1-c82f-48bd-8032-dee94a15b75a","Type":"ContainerDied","Data":"cb26abb9473fc8422f48ea395f50814bf9bb994847e9f18baea3aec7a5fbabd4"}
Oct 08 13:00:03 crc kubenswrapper[4735]: I1008 13:00:03.994900 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rzs56" event={"ID":"fb3785a1-c82f-48bd-8032-dee94a15b75a","Type":"ContainerDied","Data":"d742028ce2fc21c37aaeb6b09b7388f590dcb46b964134d2927c64c8c6b44224"}
Oct 08 13:00:03 crc kubenswrapper[4735]: I1008 13:00:03.994919 4735 scope.go:117] "RemoveContainer" containerID="cb26abb9473fc8422f48ea395f50814bf9bb994847e9f18baea3aec7a5fbabd4"
Oct 08 13:00:03 crc kubenswrapper[4735]: I1008 13:00:03.997802 4735 generic.go:334] "Generic (PLEG): container finished" podID="cbe9af65-d05a-4224-a554-361d0d7e9e11" containerID="39acd95ac2ddb1562b2229356b6058943b28dc1b6f5e1eacc49c0f312ebf9c9d" exitCode=0
Oct 08 13:00:03 crc kubenswrapper[4735]: I1008 13:00:03.998020 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mgkfd" event={"ID":"cbe9af65-d05a-4224-a554-361d0d7e9e11","Type":"ContainerDied","Data":"39acd95ac2ddb1562b2229356b6058943b28dc1b6f5e1eacc49c0f312ebf9c9d"}
Oct 08 13:00:04 crc kubenswrapper[4735]: I1008 13:00:04.009133 4735 scope.go:117] "RemoveContainer" containerID="87426dfca7dae6f97a304693bc6096d5cc1de2a185356ee3f073d37e7235bc08"
Oct 08 13:00:04 crc kubenswrapper[4735]: I1008 13:00:04.037452 4735 scope.go:117] "RemoveContainer" containerID="45fa7a9fe93065d8f201525f927ab55f715d7da08c6e6c1f59c4b7b0ca993ad7"
Oct 08 13:00:04 crc kubenswrapper[4735]: I1008 13:00:04.050385 4735 scope.go:117] "RemoveContainer" containerID="cb26abb9473fc8422f48ea395f50814bf9bb994847e9f18baea3aec7a5fbabd4"
Oct 08 13:00:04 crc kubenswrapper[4735]: E1008 13:00:04.051159 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb26abb9473fc8422f48ea395f50814bf9bb994847e9f18baea3aec7a5fbabd4\": container with ID starting with cb26abb9473fc8422f48ea395f50814bf9bb994847e9f18baea3aec7a5fbabd4 not found: ID does not exist" containerID="cb26abb9473fc8422f48ea395f50814bf9bb994847e9f18baea3aec7a5fbabd4"
Oct 08 13:00:04 crc kubenswrapper[4735]: I1008 13:00:04.051280 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb26abb9473fc8422f48ea395f50814bf9bb994847e9f18baea3aec7a5fbabd4"} err="failed to get container status \"cb26abb9473fc8422f48ea395f50814bf9bb994847e9f18baea3aec7a5fbabd4\": rpc error: code = NotFound desc = could not find container \"cb26abb9473fc8422f48ea395f50814bf9bb994847e9f18baea3aec7a5fbabd4\": container with ID starting with cb26abb9473fc8422f48ea395f50814bf9bb994847e9f18baea3aec7a5fbabd4 not found: ID does not exist"
Oct 08 13:00:04 crc kubenswrapper[4735]: I1008 13:00:04.051314 4735 scope.go:117] "RemoveContainer" containerID="87426dfca7dae6f97a304693bc6096d5cc1de2a185356ee3f073d37e7235bc08"
Oct 08 13:00:04 crc kubenswrapper[4735]: E1008 13:00:04.051754 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87426dfca7dae6f97a304693bc6096d5cc1de2a185356ee3f073d37e7235bc08\": container with ID starting with 87426dfca7dae6f97a304693bc6096d5cc1de2a185356ee3f073d37e7235bc08 not found: ID does not exist" containerID="87426dfca7dae6f97a304693bc6096d5cc1de2a185356ee3f073d37e7235bc08"
Oct 08 13:00:04 crc kubenswrapper[4735]: I1008 13:00:04.051783 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87426dfca7dae6f97a304693bc6096d5cc1de2a185356ee3f073d37e7235bc08"} err="failed to get container status \"87426dfca7dae6f97a304693bc6096d5cc1de2a185356ee3f073d37e7235bc08\": rpc error: code = NotFound desc = could not find container \"87426dfca7dae6f97a304693bc6096d5cc1de2a185356ee3f073d37e7235bc08\": container with ID starting with 87426dfca7dae6f97a304693bc6096d5cc1de2a185356ee3f073d37e7235bc08 not found: ID does not exist"
Oct 08 13:00:04 crc kubenswrapper[4735]: I1008 13:00:04.051802 4735 scope.go:117] "RemoveContainer" containerID="45fa7a9fe93065d8f201525f927ab55f715d7da08c6e6c1f59c4b7b0ca993ad7"
Oct 08 13:00:04 crc kubenswrapper[4735]: E1008 13:00:04.052150 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45fa7a9fe93065d8f201525f927ab55f715d7da08c6e6c1f59c4b7b0ca993ad7\": container with ID starting with 45fa7a9fe93065d8f201525f927ab55f715d7da08c6e6c1f59c4b7b0ca993ad7 not found: ID does not exist" containerID="45fa7a9fe93065d8f201525f927ab55f715d7da08c6e6c1f59c4b7b0ca993ad7"
Oct 08 13:00:04 crc kubenswrapper[4735]: I1008 13:00:04.052174 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45fa7a9fe93065d8f201525f927ab55f715d7da08c6e6c1f59c4b7b0ca993ad7"} err="failed to get container status \"45fa7a9fe93065d8f201525f927ab55f715d7da08c6e6c1f59c4b7b0ca993ad7\": rpc error: code = NotFound desc = could not find container \"45fa7a9fe93065d8f201525f927ab55f715d7da08c6e6c1f59c4b7b0ca993ad7\": container with ID starting with 45fa7a9fe93065d8f201525f927ab55f715d7da08c6e6c1f59c4b7b0ca993ad7 not found: ID does not exist"
Oct 08 13:00:04 crc kubenswrapper[4735]: I1008 13:00:04.074312 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5blsh\" (UniqueName: \"kubernetes.io/projected/fb3785a1-c82f-48bd-8032-dee94a15b75a-kube-api-access-5blsh\") pod \"fb3785a1-c82f-48bd-8032-dee94a15b75a\" (UID: \"fb3785a1-c82f-48bd-8032-dee94a15b75a\") "
Oct 08 13:00:04 crc kubenswrapper[4735]: I1008 13:00:04.074440 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb3785a1-c82f-48bd-8032-dee94a15b75a-utilities\") pod \"fb3785a1-c82f-48bd-8032-dee94a15b75a\" (UID: \"fb3785a1-c82f-48bd-8032-dee94a15b75a\") "
Oct 08 13:00:04 crc kubenswrapper[4735]: I1008 13:00:04.074545 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb3785a1-c82f-48bd-8032-dee94a15b75a-catalog-content\") pod \"fb3785a1-c82f-48bd-8032-dee94a15b75a\" (UID: \"fb3785a1-c82f-48bd-8032-dee94a15b75a\") "
Oct 08 13:00:04 crc kubenswrapper[4735]: I1008 13:00:04.076308 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb3785a1-c82f-48bd-8032-dee94a15b75a-utilities" (OuterVolumeSpecName: "utilities") pod "fb3785a1-c82f-48bd-8032-dee94a15b75a" (UID: "fb3785a1-c82f-48bd-8032-dee94a15b75a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 08 13:00:04 crc kubenswrapper[4735]: I1008 13:00:04.081968 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb3785a1-c82f-48bd-8032-dee94a15b75a-kube-api-access-5blsh" (OuterVolumeSpecName: "kube-api-access-5blsh") pod "fb3785a1-c82f-48bd-8032-dee94a15b75a" (UID: "fb3785a1-c82f-48bd-8032-dee94a15b75a"). InnerVolumeSpecName "kube-api-access-5blsh".
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:00:04 crc kubenswrapper[4735]: I1008 13:00:04.138526 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb3785a1-c82f-48bd-8032-dee94a15b75a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fb3785a1-c82f-48bd-8032-dee94a15b75a" (UID: "fb3785a1-c82f-48bd-8032-dee94a15b75a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:00:04 crc kubenswrapper[4735]: I1008 13:00:04.176447 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb3785a1-c82f-48bd-8032-dee94a15b75a-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 13:00:04 crc kubenswrapper[4735]: I1008 13:00:04.176513 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb3785a1-c82f-48bd-8032-dee94a15b75a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 13:00:04 crc kubenswrapper[4735]: I1008 13:00:04.176525 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5blsh\" (UniqueName: \"kubernetes.io/projected/fb3785a1-c82f-48bd-8032-dee94a15b75a-kube-api-access-5blsh\") on node \"crc\" DevicePath \"\"" Oct 08 13:00:04 crc kubenswrapper[4735]: I1008 13:00:04.236047 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332140-nbw48" Oct 08 13:00:04 crc kubenswrapper[4735]: I1008 13:00:04.317374 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mgkfd" Oct 08 13:00:04 crc kubenswrapper[4735]: I1008 13:00:04.331946 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rzs56"] Oct 08 13:00:04 crc kubenswrapper[4735]: I1008 13:00:04.334628 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rzs56"] Oct 08 13:00:04 crc kubenswrapper[4735]: I1008 13:00:04.378370 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/30287a9c-9d15-434b-95a4-549d6cbe491b-config-volume\") pod \"30287a9c-9d15-434b-95a4-549d6cbe491b\" (UID: \"30287a9c-9d15-434b-95a4-549d6cbe491b\") " Oct 08 13:00:04 crc kubenswrapper[4735]: I1008 13:00:04.378424 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxtn6\" (UniqueName: \"kubernetes.io/projected/30287a9c-9d15-434b-95a4-549d6cbe491b-kube-api-access-nxtn6\") pod \"30287a9c-9d15-434b-95a4-549d6cbe491b\" (UID: \"30287a9c-9d15-434b-95a4-549d6cbe491b\") " Oct 08 13:00:04 crc kubenswrapper[4735]: I1008 13:00:04.378444 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/30287a9c-9d15-434b-95a4-549d6cbe491b-secret-volume\") pod \"30287a9c-9d15-434b-95a4-549d6cbe491b\" (UID: \"30287a9c-9d15-434b-95a4-549d6cbe491b\") " Oct 08 13:00:04 crc kubenswrapper[4735]: I1008 13:00:04.379153 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30287a9c-9d15-434b-95a4-549d6cbe491b-config-volume" (OuterVolumeSpecName: "config-volume") pod "30287a9c-9d15-434b-95a4-549d6cbe491b" (UID: "30287a9c-9d15-434b-95a4-549d6cbe491b"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:00:04 crc kubenswrapper[4735]: I1008 13:00:04.382383 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30287a9c-9d15-434b-95a4-549d6cbe491b-kube-api-access-nxtn6" (OuterVolumeSpecName: "kube-api-access-nxtn6") pod "30287a9c-9d15-434b-95a4-549d6cbe491b" (UID: "30287a9c-9d15-434b-95a4-549d6cbe491b"). InnerVolumeSpecName "kube-api-access-nxtn6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:00:04 crc kubenswrapper[4735]: I1008 13:00:04.382804 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30287a9c-9d15-434b-95a4-549d6cbe491b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "30287a9c-9d15-434b-95a4-549d6cbe491b" (UID: "30287a9c-9d15-434b-95a4-549d6cbe491b"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:00:04 crc kubenswrapper[4735]: I1008 13:00:04.479282 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbe9af65-d05a-4224-a554-361d0d7e9e11-catalog-content\") pod \"cbe9af65-d05a-4224-a554-361d0d7e9e11\" (UID: \"cbe9af65-d05a-4224-a554-361d0d7e9e11\") " Oct 08 13:00:04 crc kubenswrapper[4735]: I1008 13:00:04.479359 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbe9af65-d05a-4224-a554-361d0d7e9e11-utilities\") pod \"cbe9af65-d05a-4224-a554-361d0d7e9e11\" (UID: \"cbe9af65-d05a-4224-a554-361d0d7e9e11\") " Oct 08 13:00:04 crc kubenswrapper[4735]: I1008 13:00:04.479497 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76nn5\" (UniqueName: \"kubernetes.io/projected/cbe9af65-d05a-4224-a554-361d0d7e9e11-kube-api-access-76nn5\") pod \"cbe9af65-d05a-4224-a554-361d0d7e9e11\" (UID: 
\"cbe9af65-d05a-4224-a554-361d0d7e9e11\") " Oct 08 13:00:04 crc kubenswrapper[4735]: I1008 13:00:04.479727 4735 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/30287a9c-9d15-434b-95a4-549d6cbe491b-config-volume\") on node \"crc\" DevicePath \"\"" Oct 08 13:00:04 crc kubenswrapper[4735]: I1008 13:00:04.479745 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxtn6\" (UniqueName: \"kubernetes.io/projected/30287a9c-9d15-434b-95a4-549d6cbe491b-kube-api-access-nxtn6\") on node \"crc\" DevicePath \"\"" Oct 08 13:00:04 crc kubenswrapper[4735]: I1008 13:00:04.479761 4735 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/30287a9c-9d15-434b-95a4-549d6cbe491b-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 08 13:00:04 crc kubenswrapper[4735]: I1008 13:00:04.480320 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbe9af65-d05a-4224-a554-361d0d7e9e11-utilities" (OuterVolumeSpecName: "utilities") pod "cbe9af65-d05a-4224-a554-361d0d7e9e11" (UID: "cbe9af65-d05a-4224-a554-361d0d7e9e11"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:00:04 crc kubenswrapper[4735]: I1008 13:00:04.482226 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbe9af65-d05a-4224-a554-361d0d7e9e11-kube-api-access-76nn5" (OuterVolumeSpecName: "kube-api-access-76nn5") pod "cbe9af65-d05a-4224-a554-361d0d7e9e11" (UID: "cbe9af65-d05a-4224-a554-361d0d7e9e11"). InnerVolumeSpecName "kube-api-access-76nn5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:00:04 crc kubenswrapper[4735]: I1008 13:00:04.522466 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbe9af65-d05a-4224-a554-361d0d7e9e11-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cbe9af65-d05a-4224-a554-361d0d7e9e11" (UID: "cbe9af65-d05a-4224-a554-361d0d7e9e11"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:00:04 crc kubenswrapper[4735]: I1008 13:00:04.580693 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76nn5\" (UniqueName: \"kubernetes.io/projected/cbe9af65-d05a-4224-a554-361d0d7e9e11-kube-api-access-76nn5\") on node \"crc\" DevicePath \"\"" Oct 08 13:00:04 crc kubenswrapper[4735]: I1008 13:00:04.580737 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbe9af65-d05a-4224-a554-361d0d7e9e11-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 13:00:04 crc kubenswrapper[4735]: I1008 13:00:04.580750 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbe9af65-d05a-4224-a554-361d0d7e9e11-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 13:00:04 crc kubenswrapper[4735]: I1008 13:00:04.903088 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb3785a1-c82f-48bd-8032-dee94a15b75a" path="/var/lib/kubelet/pods/fb3785a1-c82f-48bd-8032-dee94a15b75a/volumes" Oct 08 13:00:05 crc kubenswrapper[4735]: I1008 13:00:05.008278 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332140-nbw48" event={"ID":"30287a9c-9d15-434b-95a4-549d6cbe491b","Type":"ContainerDied","Data":"03c184bc76c76134e8c17a6218f53c23f755eaf1cff9fe0cf9ab46f634c605aa"} Oct 08 13:00:05 crc kubenswrapper[4735]: I1008 13:00:05.008347 4735 pod_container_deletor.go:80] 
"Container not found in pod's containers" containerID="03c184bc76c76134e8c17a6218f53c23f755eaf1cff9fe0cf9ab46f634c605aa" Oct 08 13:00:05 crc kubenswrapper[4735]: I1008 13:00:05.008300 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332140-nbw48" Oct 08 13:00:05 crc kubenswrapper[4735]: I1008 13:00:05.012795 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mgkfd" event={"ID":"cbe9af65-d05a-4224-a554-361d0d7e9e11","Type":"ContainerDied","Data":"36b8f24841d52c56bafb13573dfd90c1c40a034d86377aa7110f65b4def0c7a3"} Oct 08 13:00:05 crc kubenswrapper[4735]: I1008 13:00:05.012849 4735 scope.go:117] "RemoveContainer" containerID="39acd95ac2ddb1562b2229356b6058943b28dc1b6f5e1eacc49c0f312ebf9c9d" Oct 08 13:00:05 crc kubenswrapper[4735]: I1008 13:00:05.012945 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mgkfd" Oct 08 13:00:05 crc kubenswrapper[4735]: I1008 13:00:05.035455 4735 scope.go:117] "RemoveContainer" containerID="a2112ac60fe10b2c45efba2c3ebbf767bc662ced2f5e90cc48b17554cea685e1" Oct 08 13:00:05 crc kubenswrapper[4735]: I1008 13:00:05.037952 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mgkfd"] Oct 08 13:00:05 crc kubenswrapper[4735]: I1008 13:00:05.041812 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mgkfd"] Oct 08 13:00:05 crc kubenswrapper[4735]: I1008 13:00:05.053052 4735 scope.go:117] "RemoveContainer" containerID="10121c60e1076212ba6ff364c40fd942364f0bc7ea176684bb52099662bbf174" Oct 08 13:00:05 crc kubenswrapper[4735]: I1008 13:00:05.844595 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bzg9k"] Oct 08 13:00:05 crc kubenswrapper[4735]: I1008 13:00:05.845132 4735 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bzg9k" podUID="107a4721-dfb9-4d45-abb5-5d2857e0985f" containerName="registry-server" containerID="cri-o://cb32dc83c4898d5fcca838de953fbf914a587a3e6e71f0340def4e16790ca69d" gracePeriod=2 Oct 08 13:00:06 crc kubenswrapper[4735]: I1008 13:00:06.792667 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bzg9k" Oct 08 13:00:06 crc kubenswrapper[4735]: I1008 13:00:06.900563 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbe9af65-d05a-4224-a554-361d0d7e9e11" path="/var/lib/kubelet/pods/cbe9af65-d05a-4224-a554-361d0d7e9e11/volumes" Oct 08 13:00:06 crc kubenswrapper[4735]: I1008 13:00:06.912340 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/107a4721-dfb9-4d45-abb5-5d2857e0985f-utilities\") pod \"107a4721-dfb9-4d45-abb5-5d2857e0985f\" (UID: \"107a4721-dfb9-4d45-abb5-5d2857e0985f\") " Oct 08 13:00:06 crc kubenswrapper[4735]: I1008 13:00:06.912386 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/107a4721-dfb9-4d45-abb5-5d2857e0985f-catalog-content\") pod \"107a4721-dfb9-4d45-abb5-5d2857e0985f\" (UID: \"107a4721-dfb9-4d45-abb5-5d2857e0985f\") " Oct 08 13:00:06 crc kubenswrapper[4735]: I1008 13:00:06.912491 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgfd4\" (UniqueName: \"kubernetes.io/projected/107a4721-dfb9-4d45-abb5-5d2857e0985f-kube-api-access-lgfd4\") pod \"107a4721-dfb9-4d45-abb5-5d2857e0985f\" (UID: \"107a4721-dfb9-4d45-abb5-5d2857e0985f\") " Oct 08 13:00:06 crc kubenswrapper[4735]: I1008 13:00:06.913216 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/107a4721-dfb9-4d45-abb5-5d2857e0985f-utilities" (OuterVolumeSpecName: "utilities") pod "107a4721-dfb9-4d45-abb5-5d2857e0985f" (UID: "107a4721-dfb9-4d45-abb5-5d2857e0985f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:00:06 crc kubenswrapper[4735]: I1008 13:00:06.918276 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/107a4721-dfb9-4d45-abb5-5d2857e0985f-kube-api-access-lgfd4" (OuterVolumeSpecName: "kube-api-access-lgfd4") pod "107a4721-dfb9-4d45-abb5-5d2857e0985f" (UID: "107a4721-dfb9-4d45-abb5-5d2857e0985f"). InnerVolumeSpecName "kube-api-access-lgfd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:00:06 crc kubenswrapper[4735]: I1008 13:00:06.995714 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/107a4721-dfb9-4d45-abb5-5d2857e0985f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "107a4721-dfb9-4d45-abb5-5d2857e0985f" (UID: "107a4721-dfb9-4d45-abb5-5d2857e0985f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:00:07 crc kubenswrapper[4735]: I1008 13:00:07.014080 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/107a4721-dfb9-4d45-abb5-5d2857e0985f-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 13:00:07 crc kubenswrapper[4735]: I1008 13:00:07.014123 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/107a4721-dfb9-4d45-abb5-5d2857e0985f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 13:00:07 crc kubenswrapper[4735]: I1008 13:00:07.014135 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lgfd4\" (UniqueName: \"kubernetes.io/projected/107a4721-dfb9-4d45-abb5-5d2857e0985f-kube-api-access-lgfd4\") on node \"crc\" DevicePath \"\"" Oct 08 13:00:07 crc kubenswrapper[4735]: I1008 13:00:07.032037 4735 generic.go:334] "Generic (PLEG): container finished" podID="107a4721-dfb9-4d45-abb5-5d2857e0985f" containerID="cb32dc83c4898d5fcca838de953fbf914a587a3e6e71f0340def4e16790ca69d" exitCode=0 Oct 08 13:00:07 crc kubenswrapper[4735]: I1008 13:00:07.032133 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bzg9k" event={"ID":"107a4721-dfb9-4d45-abb5-5d2857e0985f","Type":"ContainerDied","Data":"cb32dc83c4898d5fcca838de953fbf914a587a3e6e71f0340def4e16790ca69d"} Oct 08 13:00:07 crc kubenswrapper[4735]: I1008 13:00:07.032222 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bzg9k" event={"ID":"107a4721-dfb9-4d45-abb5-5d2857e0985f","Type":"ContainerDied","Data":"9b5b3ebb843235bbaf7051bdf1f1967e2b30570e681199061f58f136b528ce6c"} Oct 08 13:00:07 crc kubenswrapper[4735]: I1008 13:00:07.032246 4735 scope.go:117] "RemoveContainer" containerID="cb32dc83c4898d5fcca838de953fbf914a587a3e6e71f0340def4e16790ca69d" Oct 08 13:00:07 crc kubenswrapper[4735]: I1008 13:00:07.032374 
4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bzg9k" Oct 08 13:00:07 crc kubenswrapper[4735]: I1008 13:00:07.047799 4735 scope.go:117] "RemoveContainer" containerID="15bf0e8401118308fa2a1f54f676c1ce10f82819033b0b4fcdc11e33d6166d11" Oct 08 13:00:07 crc kubenswrapper[4735]: I1008 13:00:07.057210 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bzg9k"] Oct 08 13:00:07 crc kubenswrapper[4735]: I1008 13:00:07.059632 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bzg9k"] Oct 08 13:00:07 crc kubenswrapper[4735]: I1008 13:00:07.076852 4735 scope.go:117] "RemoveContainer" containerID="e7df6263ac15a0ff55d9f5dd3a56a4251d6b68e2f3835aeb40a638b4b1252e8c" Oct 08 13:00:07 crc kubenswrapper[4735]: I1008 13:00:07.090654 4735 scope.go:117] "RemoveContainer" containerID="cb32dc83c4898d5fcca838de953fbf914a587a3e6e71f0340def4e16790ca69d" Oct 08 13:00:07 crc kubenswrapper[4735]: E1008 13:00:07.091359 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb32dc83c4898d5fcca838de953fbf914a587a3e6e71f0340def4e16790ca69d\": container with ID starting with cb32dc83c4898d5fcca838de953fbf914a587a3e6e71f0340def4e16790ca69d not found: ID does not exist" containerID="cb32dc83c4898d5fcca838de953fbf914a587a3e6e71f0340def4e16790ca69d" Oct 08 13:00:07 crc kubenswrapper[4735]: I1008 13:00:07.091395 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb32dc83c4898d5fcca838de953fbf914a587a3e6e71f0340def4e16790ca69d"} err="failed to get container status \"cb32dc83c4898d5fcca838de953fbf914a587a3e6e71f0340def4e16790ca69d\": rpc error: code = NotFound desc = could not find container \"cb32dc83c4898d5fcca838de953fbf914a587a3e6e71f0340def4e16790ca69d\": container with ID starting with 
cb32dc83c4898d5fcca838de953fbf914a587a3e6e71f0340def4e16790ca69d not found: ID does not exist" Oct 08 13:00:07 crc kubenswrapper[4735]: I1008 13:00:07.091418 4735 scope.go:117] "RemoveContainer" containerID="15bf0e8401118308fa2a1f54f676c1ce10f82819033b0b4fcdc11e33d6166d11" Oct 08 13:00:07 crc kubenswrapper[4735]: E1008 13:00:07.092176 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15bf0e8401118308fa2a1f54f676c1ce10f82819033b0b4fcdc11e33d6166d11\": container with ID starting with 15bf0e8401118308fa2a1f54f676c1ce10f82819033b0b4fcdc11e33d6166d11 not found: ID does not exist" containerID="15bf0e8401118308fa2a1f54f676c1ce10f82819033b0b4fcdc11e33d6166d11" Oct 08 13:00:07 crc kubenswrapper[4735]: I1008 13:00:07.092225 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15bf0e8401118308fa2a1f54f676c1ce10f82819033b0b4fcdc11e33d6166d11"} err="failed to get container status \"15bf0e8401118308fa2a1f54f676c1ce10f82819033b0b4fcdc11e33d6166d11\": rpc error: code = NotFound desc = could not find container \"15bf0e8401118308fa2a1f54f676c1ce10f82819033b0b4fcdc11e33d6166d11\": container with ID starting with 15bf0e8401118308fa2a1f54f676c1ce10f82819033b0b4fcdc11e33d6166d11 not found: ID does not exist" Oct 08 13:00:07 crc kubenswrapper[4735]: I1008 13:00:07.092254 4735 scope.go:117] "RemoveContainer" containerID="e7df6263ac15a0ff55d9f5dd3a56a4251d6b68e2f3835aeb40a638b4b1252e8c" Oct 08 13:00:07 crc kubenswrapper[4735]: E1008 13:00:07.093581 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7df6263ac15a0ff55d9f5dd3a56a4251d6b68e2f3835aeb40a638b4b1252e8c\": container with ID starting with e7df6263ac15a0ff55d9f5dd3a56a4251d6b68e2f3835aeb40a638b4b1252e8c not found: ID does not exist" containerID="e7df6263ac15a0ff55d9f5dd3a56a4251d6b68e2f3835aeb40a638b4b1252e8c" Oct 08 13:00:07 crc 
kubenswrapper[4735]: I1008 13:00:07.093822 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7df6263ac15a0ff55d9f5dd3a56a4251d6b68e2f3835aeb40a638b4b1252e8c"} err="failed to get container status \"e7df6263ac15a0ff55d9f5dd3a56a4251d6b68e2f3835aeb40a638b4b1252e8c\": rpc error: code = NotFound desc = could not find container \"e7df6263ac15a0ff55d9f5dd3a56a4251d6b68e2f3835aeb40a638b4b1252e8c\": container with ID starting with e7df6263ac15a0ff55d9f5dd3a56a4251d6b68e2f3835aeb40a638b4b1252e8c not found: ID does not exist" Oct 08 13:00:08 crc kubenswrapper[4735]: I1008 13:00:08.899712 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="107a4721-dfb9-4d45-abb5-5d2857e0985f" path="/var/lib/kubelet/pods/107a4721-dfb9-4d45-abb5-5d2857e0985f/volumes" Oct 08 13:00:43 crc kubenswrapper[4735]: I1008 13:00:43.776695 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-xqfrr"] Oct 08 13:01:08 crc kubenswrapper[4735]: I1008 13:01:08.804547 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-xqfrr" podUID="6e891ada-e4a4-4e15-8ade-48d0f40518da" containerName="oauth-openshift" containerID="cri-o://48e4126877d7ae7f9725482116913fa932c6bde84cdb8aa891bbf02b105c8fda" gracePeriod=15 Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.168151 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-xqfrr" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.206972 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-75d76c97c8-sw8wc"] Oct 08 13:01:09 crc kubenswrapper[4735]: E1008 13:01:09.207244 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30287a9c-9d15-434b-95a4-549d6cbe491b" containerName="collect-profiles" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.207263 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="30287a9c-9d15-434b-95a4-549d6cbe491b" containerName="collect-profiles" Oct 08 13:01:09 crc kubenswrapper[4735]: E1008 13:01:09.207273 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbe9af65-d05a-4224-a554-361d0d7e9e11" containerName="extract-utilities" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.207282 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbe9af65-d05a-4224-a554-361d0d7e9e11" containerName="extract-utilities" Oct 08 13:01:09 crc kubenswrapper[4735]: E1008 13:01:09.207297 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb3785a1-c82f-48bd-8032-dee94a15b75a" containerName="extract-utilities" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.207305 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb3785a1-c82f-48bd-8032-dee94a15b75a" containerName="extract-utilities" Oct 08 13:01:09 crc kubenswrapper[4735]: E1008 13:01:09.207315 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="107a4721-dfb9-4d45-abb5-5d2857e0985f" containerName="registry-server" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.207324 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="107a4721-dfb9-4d45-abb5-5d2857e0985f" containerName="registry-server" Oct 08 13:01:09 crc kubenswrapper[4735]: E1008 13:01:09.207336 4735 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="fb3785a1-c82f-48bd-8032-dee94a15b75a" containerName="registry-server" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.207343 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb3785a1-c82f-48bd-8032-dee94a15b75a" containerName="registry-server" Oct 08 13:01:09 crc kubenswrapper[4735]: E1008 13:01:09.207352 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbe9af65-d05a-4224-a554-361d0d7e9e11" containerName="registry-server" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.207361 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbe9af65-d05a-4224-a554-361d0d7e9e11" containerName="registry-server" Oct 08 13:01:09 crc kubenswrapper[4735]: E1008 13:01:09.207373 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e891ada-e4a4-4e15-8ade-48d0f40518da" containerName="oauth-openshift" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.207381 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e891ada-e4a4-4e15-8ade-48d0f40518da" containerName="oauth-openshift" Oct 08 13:01:09 crc kubenswrapper[4735]: E1008 13:01:09.207390 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="107a4721-dfb9-4d45-abb5-5d2857e0985f" containerName="extract-content" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.207397 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="107a4721-dfb9-4d45-abb5-5d2857e0985f" containerName="extract-content" Oct 08 13:01:09 crc kubenswrapper[4735]: E1008 13:01:09.207407 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="107a4721-dfb9-4d45-abb5-5d2857e0985f" containerName="extract-utilities" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.207414 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="107a4721-dfb9-4d45-abb5-5d2857e0985f" containerName="extract-utilities" Oct 08 13:01:09 crc kubenswrapper[4735]: E1008 13:01:09.207428 4735 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="fb3785a1-c82f-48bd-8032-dee94a15b75a" containerName="extract-content" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.207434 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb3785a1-c82f-48bd-8032-dee94a15b75a" containerName="extract-content" Oct 08 13:01:09 crc kubenswrapper[4735]: E1008 13:01:09.207445 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbe9af65-d05a-4224-a554-361d0d7e9e11" containerName="extract-content" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.207452 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbe9af65-d05a-4224-a554-361d0d7e9e11" containerName="extract-content" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.207615 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e891ada-e4a4-4e15-8ade-48d0f40518da" containerName="oauth-openshift" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.207630 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb3785a1-c82f-48bd-8032-dee94a15b75a" containerName="registry-server" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.207641 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbe9af65-d05a-4224-a554-361d0d7e9e11" containerName="registry-server" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.207653 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="30287a9c-9d15-434b-95a4-549d6cbe491b" containerName="collect-profiles" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.207667 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="107a4721-dfb9-4d45-abb5-5d2857e0985f" containerName="registry-server" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.208041 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-75d76c97c8-sw8wc" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.218420 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-75d76c97c8-sw8wc"] Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.310089 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6e891ada-e4a4-4e15-8ade-48d0f40518da-v4-0-config-system-service-ca\") pod \"6e891ada-e4a4-4e15-8ade-48d0f40518da\" (UID: \"6e891ada-e4a4-4e15-8ade-48d0f40518da\") " Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.310372 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6e891ada-e4a4-4e15-8ade-48d0f40518da-v4-0-config-user-template-error\") pod \"6e891ada-e4a4-4e15-8ade-48d0f40518da\" (UID: \"6e891ada-e4a4-4e15-8ade-48d0f40518da\") " Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.310468 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6e891ada-e4a4-4e15-8ade-48d0f40518da-v4-0-config-system-serving-cert\") pod \"6e891ada-e4a4-4e15-8ade-48d0f40518da\" (UID: \"6e891ada-e4a4-4e15-8ade-48d0f40518da\") " Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.310651 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6e891ada-e4a4-4e15-8ade-48d0f40518da-audit-policies\") pod \"6e891ada-e4a4-4e15-8ade-48d0f40518da\" (UID: \"6e891ada-e4a4-4e15-8ade-48d0f40518da\") " Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.310777 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/6e891ada-e4a4-4e15-8ade-48d0f40518da-v4-0-config-system-cliconfig\") pod \"6e891ada-e4a4-4e15-8ade-48d0f40518da\" (UID: \"6e891ada-e4a4-4e15-8ade-48d0f40518da\") " Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.310876 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e891ada-e4a4-4e15-8ade-48d0f40518da-v4-0-config-system-trusted-ca-bundle\") pod \"6e891ada-e4a4-4e15-8ade-48d0f40518da\" (UID: \"6e891ada-e4a4-4e15-8ade-48d0f40518da\") " Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.310967 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6e891ada-e4a4-4e15-8ade-48d0f40518da-v4-0-config-user-template-provider-selection\") pod \"6e891ada-e4a4-4e15-8ade-48d0f40518da\" (UID: \"6e891ada-e4a4-4e15-8ade-48d0f40518da\") " Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.311073 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6e891ada-e4a4-4e15-8ade-48d0f40518da-v4-0-config-system-router-certs\") pod \"6e891ada-e4a4-4e15-8ade-48d0f40518da\" (UID: \"6e891ada-e4a4-4e15-8ade-48d0f40518da\") " Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.311156 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6e891ada-e4a4-4e15-8ade-48d0f40518da-audit-dir\") pod \"6e891ada-e4a4-4e15-8ade-48d0f40518da\" (UID: \"6e891ada-e4a4-4e15-8ade-48d0f40518da\") " Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.311249 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/6e891ada-e4a4-4e15-8ade-48d0f40518da-v4-0-config-system-session\") pod \"6e891ada-e4a4-4e15-8ade-48d0f40518da\" (UID: \"6e891ada-e4a4-4e15-8ade-48d0f40518da\") " Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.311339 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6e891ada-e4a4-4e15-8ade-48d0f40518da-v4-0-config-system-ocp-branding-template\") pod \"6e891ada-e4a4-4e15-8ade-48d0f40518da\" (UID: \"6e891ada-e4a4-4e15-8ade-48d0f40518da\") " Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.311820 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6e891ada-e4a4-4e15-8ade-48d0f40518da-v4-0-config-user-template-login\") pod \"6e891ada-e4a4-4e15-8ade-48d0f40518da\" (UID: \"6e891ada-e4a4-4e15-8ade-48d0f40518da\") " Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.311914 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grpzl\" (UniqueName: \"kubernetes.io/projected/6e891ada-e4a4-4e15-8ade-48d0f40518da-kube-api-access-grpzl\") pod \"6e891ada-e4a4-4e15-8ade-48d0f40518da\" (UID: \"6e891ada-e4a4-4e15-8ade-48d0f40518da\") " Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.311988 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6e891ada-e4a4-4e15-8ade-48d0f40518da-v4-0-config-user-idp-0-file-data\") pod \"6e891ada-e4a4-4e15-8ade-48d0f40518da\" (UID: \"6e891ada-e4a4-4e15-8ade-48d0f40518da\") " Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.312137 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/bf0dbec1-a6d2-4759-89c4-d6cd117447c8-v4-0-config-system-service-ca\") pod \"oauth-openshift-75d76c97c8-sw8wc\" (UID: \"bf0dbec1-a6d2-4759-89c4-d6cd117447c8\") " pod="openshift-authentication/oauth-openshift-75d76c97c8-sw8wc" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.312331 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/bf0dbec1-a6d2-4759-89c4-d6cd117447c8-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-75d76c97c8-sw8wc\" (UID: \"bf0dbec1-a6d2-4759-89c4-d6cd117447c8\") " pod="openshift-authentication/oauth-openshift-75d76c97c8-sw8wc" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.312425 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/bf0dbec1-a6d2-4759-89c4-d6cd117447c8-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-75d76c97c8-sw8wc\" (UID: \"bf0dbec1-a6d2-4759-89c4-d6cd117447c8\") " pod="openshift-authentication/oauth-openshift-75d76c97c8-sw8wc" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.312534 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/bf0dbec1-a6d2-4759-89c4-d6cd117447c8-v4-0-config-user-template-login\") pod \"oauth-openshift-75d76c97c8-sw8wc\" (UID: \"bf0dbec1-a6d2-4759-89c4-d6cd117447c8\") " pod="openshift-authentication/oauth-openshift-75d76c97c8-sw8wc" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.312684 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/bf0dbec1-a6d2-4759-89c4-d6cd117447c8-v4-0-config-system-router-certs\") pod 
\"oauth-openshift-75d76c97c8-sw8wc\" (UID: \"bf0dbec1-a6d2-4759-89c4-d6cd117447c8\") " pod="openshift-authentication/oauth-openshift-75d76c97c8-sw8wc" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.312842 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bf0dbec1-a6d2-4759-89c4-d6cd117447c8-audit-policies\") pod \"oauth-openshift-75d76c97c8-sw8wc\" (UID: \"bf0dbec1-a6d2-4759-89c4-d6cd117447c8\") " pod="openshift-authentication/oauth-openshift-75d76c97c8-sw8wc" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.313052 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf0dbec1-a6d2-4759-89c4-d6cd117447c8-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-75d76c97c8-sw8wc\" (UID: \"bf0dbec1-a6d2-4759-89c4-d6cd117447c8\") " pod="openshift-authentication/oauth-openshift-75d76c97c8-sw8wc" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.312779 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6e891ada-e4a4-4e15-8ade-48d0f40518da-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "6e891ada-e4a4-4e15-8ade-48d0f40518da" (UID: "6e891ada-e4a4-4e15-8ade-48d0f40518da"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.312810 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e891ada-e4a4-4e15-8ade-48d0f40518da-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "6e891ada-e4a4-4e15-8ade-48d0f40518da" (UID: "6e891ada-e4a4-4e15-8ade-48d0f40518da"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.313194 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e891ada-e4a4-4e15-8ade-48d0f40518da-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "6e891ada-e4a4-4e15-8ade-48d0f40518da" (UID: "6e891ada-e4a4-4e15-8ade-48d0f40518da"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.313329 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e891ada-e4a4-4e15-8ade-48d0f40518da-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "6e891ada-e4a4-4e15-8ade-48d0f40518da" (UID: "6e891ada-e4a4-4e15-8ade-48d0f40518da"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.313352 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bf0dbec1-a6d2-4759-89c4-d6cd117447c8-audit-dir\") pod \"oauth-openshift-75d76c97c8-sw8wc\" (UID: \"bf0dbec1-a6d2-4759-89c4-d6cd117447c8\") " pod="openshift-authentication/oauth-openshift-75d76c97c8-sw8wc" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.313386 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/bf0dbec1-a6d2-4759-89c4-d6cd117447c8-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-75d76c97c8-sw8wc\" (UID: \"bf0dbec1-a6d2-4759-89c4-d6cd117447c8\") " pod="openshift-authentication/oauth-openshift-75d76c97c8-sw8wc" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.313420 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/bf0dbec1-a6d2-4759-89c4-d6cd117447c8-v4-0-config-system-cliconfig\") pod \"oauth-openshift-75d76c97c8-sw8wc\" (UID: \"bf0dbec1-a6d2-4759-89c4-d6cd117447c8\") " pod="openshift-authentication/oauth-openshift-75d76c97c8-sw8wc" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.313553 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9cml\" (UniqueName: \"kubernetes.io/projected/bf0dbec1-a6d2-4759-89c4-d6cd117447c8-kube-api-access-w9cml\") pod \"oauth-openshift-75d76c97c8-sw8wc\" (UID: \"bf0dbec1-a6d2-4759-89c4-d6cd117447c8\") " pod="openshift-authentication/oauth-openshift-75d76c97c8-sw8wc" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.313653 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/bf0dbec1-a6d2-4759-89c4-d6cd117447c8-v4-0-config-system-serving-cert\") pod \"oauth-openshift-75d76c97c8-sw8wc\" (UID: \"bf0dbec1-a6d2-4759-89c4-d6cd117447c8\") " pod="openshift-authentication/oauth-openshift-75d76c97c8-sw8wc" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.313649 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e891ada-e4a4-4e15-8ade-48d0f40518da-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "6e891ada-e4a4-4e15-8ade-48d0f40518da" (UID: "6e891ada-e4a4-4e15-8ade-48d0f40518da"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.313698 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/bf0dbec1-a6d2-4759-89c4-d6cd117447c8-v4-0-config-user-template-error\") pod \"oauth-openshift-75d76c97c8-sw8wc\" (UID: \"bf0dbec1-a6d2-4759-89c4-d6cd117447c8\") " pod="openshift-authentication/oauth-openshift-75d76c97c8-sw8wc" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.313840 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/bf0dbec1-a6d2-4759-89c4-d6cd117447c8-v4-0-config-system-session\") pod \"oauth-openshift-75d76c97c8-sw8wc\" (UID: \"bf0dbec1-a6d2-4759-89c4-d6cd117447c8\") " pod="openshift-authentication/oauth-openshift-75d76c97c8-sw8wc" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.314028 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6e891ada-e4a4-4e15-8ade-48d0f40518da-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.314062 4735 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6e891ada-e4a4-4e15-8ade-48d0f40518da-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.314087 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6e891ada-e4a4-4e15-8ade-48d0f40518da-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.314106 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/6e891ada-e4a4-4e15-8ade-48d0f40518da-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.314130 4735 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6e891ada-e4a4-4e15-8ade-48d0f40518da-audit-dir\") on node \"crc\" DevicePath \"\"" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.316703 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e891ada-e4a4-4e15-8ade-48d0f40518da-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "6e891ada-e4a4-4e15-8ade-48d0f40518da" (UID: "6e891ada-e4a4-4e15-8ade-48d0f40518da"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.317181 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e891ada-e4a4-4e15-8ade-48d0f40518da-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "6e891ada-e4a4-4e15-8ade-48d0f40518da" (UID: "6e891ada-e4a4-4e15-8ade-48d0f40518da"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.317301 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e891ada-e4a4-4e15-8ade-48d0f40518da-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "6e891ada-e4a4-4e15-8ade-48d0f40518da" (UID: "6e891ada-e4a4-4e15-8ade-48d0f40518da"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.317510 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e891ada-e4a4-4e15-8ade-48d0f40518da-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "6e891ada-e4a4-4e15-8ade-48d0f40518da" (UID: "6e891ada-e4a4-4e15-8ade-48d0f40518da"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.318065 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e891ada-e4a4-4e15-8ade-48d0f40518da-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "6e891ada-e4a4-4e15-8ade-48d0f40518da" (UID: "6e891ada-e4a4-4e15-8ade-48d0f40518da"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.318124 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e891ada-e4a4-4e15-8ade-48d0f40518da-kube-api-access-grpzl" (OuterVolumeSpecName: "kube-api-access-grpzl") pod "6e891ada-e4a4-4e15-8ade-48d0f40518da" (UID: "6e891ada-e4a4-4e15-8ade-48d0f40518da"). InnerVolumeSpecName "kube-api-access-grpzl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.318197 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e891ada-e4a4-4e15-8ade-48d0f40518da-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "6e891ada-e4a4-4e15-8ade-48d0f40518da" (UID: "6e891ada-e4a4-4e15-8ade-48d0f40518da"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.320083 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e891ada-e4a4-4e15-8ade-48d0f40518da-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "6e891ada-e4a4-4e15-8ade-48d0f40518da" (UID: "6e891ada-e4a4-4e15-8ade-48d0f40518da"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.321256 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e891ada-e4a4-4e15-8ade-48d0f40518da-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "6e891ada-e4a4-4e15-8ade-48d0f40518da" (UID: "6e891ada-e4a4-4e15-8ade-48d0f40518da"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.377868 4735 generic.go:334] "Generic (PLEG): container finished" podID="6e891ada-e4a4-4e15-8ade-48d0f40518da" containerID="48e4126877d7ae7f9725482116913fa932c6bde84cdb8aa891bbf02b105c8fda" exitCode=0 Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.377956 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-xqfrr" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.377945 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-xqfrr" event={"ID":"6e891ada-e4a4-4e15-8ade-48d0f40518da","Type":"ContainerDied","Data":"48e4126877d7ae7f9725482116913fa932c6bde84cdb8aa891bbf02b105c8fda"} Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.378151 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-xqfrr" event={"ID":"6e891ada-e4a4-4e15-8ade-48d0f40518da","Type":"ContainerDied","Data":"f3c83d66d26fcbff5e4faeb9c0129a6df409b19c18df7cca4d2d62b4b0e47fde"} Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.378182 4735 scope.go:117] "RemoveContainer" containerID="48e4126877d7ae7f9725482116913fa932c6bde84cdb8aa891bbf02b105c8fda" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.401488 4735 scope.go:117] "RemoveContainer" containerID="48e4126877d7ae7f9725482116913fa932c6bde84cdb8aa891bbf02b105c8fda" Oct 08 13:01:09 crc kubenswrapper[4735]: E1008 13:01:09.402003 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48e4126877d7ae7f9725482116913fa932c6bde84cdb8aa891bbf02b105c8fda\": container with ID starting with 48e4126877d7ae7f9725482116913fa932c6bde84cdb8aa891bbf02b105c8fda not found: ID does not exist" containerID="48e4126877d7ae7f9725482116913fa932c6bde84cdb8aa891bbf02b105c8fda" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.402042 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48e4126877d7ae7f9725482116913fa932c6bde84cdb8aa891bbf02b105c8fda"} err="failed to get container status \"48e4126877d7ae7f9725482116913fa932c6bde84cdb8aa891bbf02b105c8fda\": rpc error: code = NotFound desc = could not find container 
\"48e4126877d7ae7f9725482116913fa932c6bde84cdb8aa891bbf02b105c8fda\": container with ID starting with 48e4126877d7ae7f9725482116913fa932c6bde84cdb8aa891bbf02b105c8fda not found: ID does not exist" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.409250 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-xqfrr"] Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.413602 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-xqfrr"] Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.415850 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/bf0dbec1-a6d2-4759-89c4-d6cd117447c8-v4-0-config-system-router-certs\") pod \"oauth-openshift-75d76c97c8-sw8wc\" (UID: \"bf0dbec1-a6d2-4759-89c4-d6cd117447c8\") " pod="openshift-authentication/oauth-openshift-75d76c97c8-sw8wc" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.415897 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bf0dbec1-a6d2-4759-89c4-d6cd117447c8-audit-policies\") pod \"oauth-openshift-75d76c97c8-sw8wc\" (UID: \"bf0dbec1-a6d2-4759-89c4-d6cd117447c8\") " pod="openshift-authentication/oauth-openshift-75d76c97c8-sw8wc" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.415960 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf0dbec1-a6d2-4759-89c4-d6cd117447c8-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-75d76c97c8-sw8wc\" (UID: \"bf0dbec1-a6d2-4759-89c4-d6cd117447c8\") " pod="openshift-authentication/oauth-openshift-75d76c97c8-sw8wc" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.416017 4735 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bf0dbec1-a6d2-4759-89c4-d6cd117447c8-audit-dir\") pod \"oauth-openshift-75d76c97c8-sw8wc\" (UID: \"bf0dbec1-a6d2-4759-89c4-d6cd117447c8\") " pod="openshift-authentication/oauth-openshift-75d76c97c8-sw8wc" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.416039 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/bf0dbec1-a6d2-4759-89c4-d6cd117447c8-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-75d76c97c8-sw8wc\" (UID: \"bf0dbec1-a6d2-4759-89c4-d6cd117447c8\") " pod="openshift-authentication/oauth-openshift-75d76c97c8-sw8wc" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.416066 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/bf0dbec1-a6d2-4759-89c4-d6cd117447c8-v4-0-config-system-cliconfig\") pod \"oauth-openshift-75d76c97c8-sw8wc\" (UID: \"bf0dbec1-a6d2-4759-89c4-d6cd117447c8\") " pod="openshift-authentication/oauth-openshift-75d76c97c8-sw8wc" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.416092 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9cml\" (UniqueName: \"kubernetes.io/projected/bf0dbec1-a6d2-4759-89c4-d6cd117447c8-kube-api-access-w9cml\") pod \"oauth-openshift-75d76c97c8-sw8wc\" (UID: \"bf0dbec1-a6d2-4759-89c4-d6cd117447c8\") " pod="openshift-authentication/oauth-openshift-75d76c97c8-sw8wc" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.416116 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/bf0dbec1-a6d2-4759-89c4-d6cd117447c8-v4-0-config-system-serving-cert\") pod \"oauth-openshift-75d76c97c8-sw8wc\" (UID: \"bf0dbec1-a6d2-4759-89c4-d6cd117447c8\") " 
pod="openshift-authentication/oauth-openshift-75d76c97c8-sw8wc" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.416143 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/bf0dbec1-a6d2-4759-89c4-d6cd117447c8-v4-0-config-user-template-error\") pod \"oauth-openshift-75d76c97c8-sw8wc\" (UID: \"bf0dbec1-a6d2-4759-89c4-d6cd117447c8\") " pod="openshift-authentication/oauth-openshift-75d76c97c8-sw8wc" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.416167 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/bf0dbec1-a6d2-4759-89c4-d6cd117447c8-v4-0-config-system-session\") pod \"oauth-openshift-75d76c97c8-sw8wc\" (UID: \"bf0dbec1-a6d2-4759-89c4-d6cd117447c8\") " pod="openshift-authentication/oauth-openshift-75d76c97c8-sw8wc" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.416190 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/bf0dbec1-a6d2-4759-89c4-d6cd117447c8-v4-0-config-system-service-ca\") pod \"oauth-openshift-75d76c97c8-sw8wc\" (UID: \"bf0dbec1-a6d2-4759-89c4-d6cd117447c8\") " pod="openshift-authentication/oauth-openshift-75d76c97c8-sw8wc" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.416218 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/bf0dbec1-a6d2-4759-89c4-d6cd117447c8-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-75d76c97c8-sw8wc\" (UID: \"bf0dbec1-a6d2-4759-89c4-d6cd117447c8\") " pod="openshift-authentication/oauth-openshift-75d76c97c8-sw8wc" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.416241 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/bf0dbec1-a6d2-4759-89c4-d6cd117447c8-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-75d76c97c8-sw8wc\" (UID: \"bf0dbec1-a6d2-4759-89c4-d6cd117447c8\") " pod="openshift-authentication/oauth-openshift-75d76c97c8-sw8wc" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.416262 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/bf0dbec1-a6d2-4759-89c4-d6cd117447c8-v4-0-config-user-template-login\") pod \"oauth-openshift-75d76c97c8-sw8wc\" (UID: \"bf0dbec1-a6d2-4759-89c4-d6cd117447c8\") " pod="openshift-authentication/oauth-openshift-75d76c97c8-sw8wc" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.416313 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6e891ada-e4a4-4e15-8ade-48d0f40518da-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.416327 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6e891ada-e4a4-4e15-8ade-48d0f40518da-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.416341 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6e891ada-e4a4-4e15-8ade-48d0f40518da-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.416358 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6e891ada-e4a4-4e15-8ade-48d0f40518da-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 08 
13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.416369 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6e891ada-e4a4-4e15-8ade-48d0f40518da-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.416383 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grpzl\" (UniqueName: \"kubernetes.io/projected/6e891ada-e4a4-4e15-8ade-48d0f40518da-kube-api-access-grpzl\") on node \"crc\" DevicePath \"\"" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.416396 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6e891ada-e4a4-4e15-8ade-48d0f40518da-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.416409 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6e891ada-e4a4-4e15-8ade-48d0f40518da-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.416421 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6e891ada-e4a4-4e15-8ade-48d0f40518da-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.416136 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bf0dbec1-a6d2-4759-89c4-d6cd117447c8-audit-dir\") pod \"oauth-openshift-75d76c97c8-sw8wc\" (UID: \"bf0dbec1-a6d2-4759-89c4-d6cd117447c8\") " pod="openshift-authentication/oauth-openshift-75d76c97c8-sw8wc" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.416878 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bf0dbec1-a6d2-4759-89c4-d6cd117447c8-audit-policies\") pod \"oauth-openshift-75d76c97c8-sw8wc\" (UID: \"bf0dbec1-a6d2-4759-89c4-d6cd117447c8\") " pod="openshift-authentication/oauth-openshift-75d76c97c8-sw8wc" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.417097 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/bf0dbec1-a6d2-4759-89c4-d6cd117447c8-v4-0-config-system-cliconfig\") pod \"oauth-openshift-75d76c97c8-sw8wc\" (UID: \"bf0dbec1-a6d2-4759-89c4-d6cd117447c8\") " pod="openshift-authentication/oauth-openshift-75d76c97c8-sw8wc" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.417522 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/bf0dbec1-a6d2-4759-89c4-d6cd117447c8-v4-0-config-system-service-ca\") pod \"oauth-openshift-75d76c97c8-sw8wc\" (UID: \"bf0dbec1-a6d2-4759-89c4-d6cd117447c8\") " pod="openshift-authentication/oauth-openshift-75d76c97c8-sw8wc" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.418122 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf0dbec1-a6d2-4759-89c4-d6cd117447c8-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-75d76c97c8-sw8wc\" (UID: \"bf0dbec1-a6d2-4759-89c4-d6cd117447c8\") " pod="openshift-authentication/oauth-openshift-75d76c97c8-sw8wc" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.419085 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/bf0dbec1-a6d2-4759-89c4-d6cd117447c8-v4-0-config-system-router-certs\") pod \"oauth-openshift-75d76c97c8-sw8wc\" (UID: \"bf0dbec1-a6d2-4759-89c4-d6cd117447c8\") " 
pod="openshift-authentication/oauth-openshift-75d76c97c8-sw8wc" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.420147 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/bf0dbec1-a6d2-4759-89c4-d6cd117447c8-v4-0-config-user-template-login\") pod \"oauth-openshift-75d76c97c8-sw8wc\" (UID: \"bf0dbec1-a6d2-4759-89c4-d6cd117447c8\") " pod="openshift-authentication/oauth-openshift-75d76c97c8-sw8wc" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.420604 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/bf0dbec1-a6d2-4759-89c4-d6cd117447c8-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-75d76c97c8-sw8wc\" (UID: \"bf0dbec1-a6d2-4759-89c4-d6cd117447c8\") " pod="openshift-authentication/oauth-openshift-75d76c97c8-sw8wc" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.420986 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/bf0dbec1-a6d2-4759-89c4-d6cd117447c8-v4-0-config-user-template-error\") pod \"oauth-openshift-75d76c97c8-sw8wc\" (UID: \"bf0dbec1-a6d2-4759-89c4-d6cd117447c8\") " pod="openshift-authentication/oauth-openshift-75d76c97c8-sw8wc" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.421182 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/bf0dbec1-a6d2-4759-89c4-d6cd117447c8-v4-0-config-system-session\") pod \"oauth-openshift-75d76c97c8-sw8wc\" (UID: \"bf0dbec1-a6d2-4759-89c4-d6cd117447c8\") " pod="openshift-authentication/oauth-openshift-75d76c97c8-sw8wc" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.421409 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/bf0dbec1-a6d2-4759-89c4-d6cd117447c8-v4-0-config-system-serving-cert\") pod \"oauth-openshift-75d76c97c8-sw8wc\" (UID: \"bf0dbec1-a6d2-4759-89c4-d6cd117447c8\") " pod="openshift-authentication/oauth-openshift-75d76c97c8-sw8wc" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.422705 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/bf0dbec1-a6d2-4759-89c4-d6cd117447c8-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-75d76c97c8-sw8wc\" (UID: \"bf0dbec1-a6d2-4759-89c4-d6cd117447c8\") " pod="openshift-authentication/oauth-openshift-75d76c97c8-sw8wc" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.422978 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/bf0dbec1-a6d2-4759-89c4-d6cd117447c8-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-75d76c97c8-sw8wc\" (UID: \"bf0dbec1-a6d2-4759-89c4-d6cd117447c8\") " pod="openshift-authentication/oauth-openshift-75d76c97c8-sw8wc" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.431960 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9cml\" (UniqueName: \"kubernetes.io/projected/bf0dbec1-a6d2-4759-89c4-d6cd117447c8-kube-api-access-w9cml\") pod \"oauth-openshift-75d76c97c8-sw8wc\" (UID: \"bf0dbec1-a6d2-4759-89c4-d6cd117447c8\") " pod="openshift-authentication/oauth-openshift-75d76c97c8-sw8wc" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.529456 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-75d76c97c8-sw8wc" Oct 08 13:01:09 crc kubenswrapper[4735]: I1008 13:01:09.756566 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-75d76c97c8-sw8wc"] Oct 08 13:01:10 crc kubenswrapper[4735]: I1008 13:01:10.386684 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-75d76c97c8-sw8wc" event={"ID":"bf0dbec1-a6d2-4759-89c4-d6cd117447c8","Type":"ContainerStarted","Data":"c39ca7313c90e411629cc32700e0b5fa9ea6a69d7f6c7e5e4f6e59c8e212529f"} Oct 08 13:01:10 crc kubenswrapper[4735]: I1008 13:01:10.387003 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-75d76c97c8-sw8wc" event={"ID":"bf0dbec1-a6d2-4759-89c4-d6cd117447c8","Type":"ContainerStarted","Data":"e684e463afee9fdc7d30dbc994f51c4c186c10c0dcb84134caeab27d3b1dc92f"} Oct 08 13:01:10 crc kubenswrapper[4735]: I1008 13:01:10.387023 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-75d76c97c8-sw8wc" Oct 08 13:01:10 crc kubenswrapper[4735]: I1008 13:01:10.407268 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-75d76c97c8-sw8wc" podStartSLOduration=27.407246384 podStartE2EDuration="27.407246384s" podCreationTimestamp="2025-10-08 13:00:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 13:01:10.403734414 +0000 UTC m=+320.212038968" watchObservedRunningTime="2025-10-08 13:01:10.407246384 +0000 UTC m=+320.215550948" Oct 08 13:01:10 crc kubenswrapper[4735]: I1008 13:01:10.680121 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-75d76c97c8-sw8wc" Oct 08 13:01:10 crc kubenswrapper[4735]: I1008 13:01:10.903304 4735 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e891ada-e4a4-4e15-8ade-48d0f40518da" path="/var/lib/kubelet/pods/6e891ada-e4a4-4e15-8ade-48d0f40518da/volumes" Oct 08 13:01:27 crc kubenswrapper[4735]: I1008 13:01:27.536492 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d5hv5"] Oct 08 13:01:27 crc kubenswrapper[4735]: I1008 13:01:27.538719 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-d5hv5" podUID="33b36b08-22c3-49b3-805e-bd38f2235a1e" containerName="registry-server" containerID="cri-o://b6112f431d3cb9b36575be1a98bc788a313a852de4b7ab646aa5430750f510d0" gracePeriod=30 Oct 08 13:01:27 crc kubenswrapper[4735]: I1008 13:01:27.545628 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tct8h"] Oct 08 13:01:27 crc kubenswrapper[4735]: I1008 13:01:27.546011 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tct8h" podUID="6a47c541-ef48-4097-843e-609ef561cadc" containerName="registry-server" containerID="cri-o://1b337b9edf91b6e186e4a9f41d26448d67f23700647fe46366c435b0620b5ed4" gracePeriod=30 Oct 08 13:01:27 crc kubenswrapper[4735]: I1008 13:01:27.557142 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gp925"] Oct 08 13:01:27 crc kubenswrapper[4735]: I1008 13:01:27.557379 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-gp925" podUID="d169069b-e5fe-4018-974d-97cf1c1cad23" containerName="marketplace-operator" containerID="cri-o://9018ee780dd3c53eb196a1d585b4c20c1e88249f5e4ecd0008800fa05035cbdd" gracePeriod=30 Oct 08 13:01:27 crc kubenswrapper[4735]: I1008 13:01:27.572278 4735 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/marketplace-operator-79b997595-cvnfn"] Oct 08 13:01:27 crc kubenswrapper[4735]: I1008 13:01:27.573229 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-cvnfn" Oct 08 13:01:27 crc kubenswrapper[4735]: I1008 13:01:27.578503 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qm4jg"] Oct 08 13:01:27 crc kubenswrapper[4735]: I1008 13:01:27.579063 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qm4jg" podUID="2ed19bd4-2ac1-495b-ac83-76fd9393dcdb" containerName="registry-server" containerID="cri-o://0edb383fd38e0f56110d978d0e4ca7dc4a14ac3fb46fc1c8374a4c9866bff897" gracePeriod=30 Oct 08 13:01:27 crc kubenswrapper[4735]: I1008 13:01:27.593400 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-cvnfn"] Oct 08 13:01:27 crc kubenswrapper[4735]: I1008 13:01:27.597425 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mmrz8"] Oct 08 13:01:27 crc kubenswrapper[4735]: I1008 13:01:27.598117 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mmrz8" podUID="9255e2a8-81b4-4743-816c-611181271899" containerName="registry-server" containerID="cri-o://157c010bed4355c919a7f9eebc6f960cefc30931655ba08c9a234382bc2aa20d" gracePeriod=30 Oct 08 13:01:27 crc kubenswrapper[4735]: I1008 13:01:27.658468 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7932dd2c-9c74-4c9b-ab33-728a0169a580-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-cvnfn\" (UID: \"7932dd2c-9c74-4c9b-ab33-728a0169a580\") " pod="openshift-marketplace/marketplace-operator-79b997595-cvnfn" 
Oct 08 13:01:27 crc kubenswrapper[4735]: I1008 13:01:27.658645 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7932dd2c-9c74-4c9b-ab33-728a0169a580-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-cvnfn\" (UID: \"7932dd2c-9c74-4c9b-ab33-728a0169a580\") " pod="openshift-marketplace/marketplace-operator-79b997595-cvnfn" Oct 08 13:01:27 crc kubenswrapper[4735]: I1008 13:01:27.658775 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbs8j\" (UniqueName: \"kubernetes.io/projected/7932dd2c-9c74-4c9b-ab33-728a0169a580-kube-api-access-tbs8j\") pod \"marketplace-operator-79b997595-cvnfn\" (UID: \"7932dd2c-9c74-4c9b-ab33-728a0169a580\") " pod="openshift-marketplace/marketplace-operator-79b997595-cvnfn" Oct 08 13:01:27 crc kubenswrapper[4735]: I1008 13:01:27.760296 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7932dd2c-9c74-4c9b-ab33-728a0169a580-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-cvnfn\" (UID: \"7932dd2c-9c74-4c9b-ab33-728a0169a580\") " pod="openshift-marketplace/marketplace-operator-79b997595-cvnfn" Oct 08 13:01:27 crc kubenswrapper[4735]: I1008 13:01:27.760364 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7932dd2c-9c74-4c9b-ab33-728a0169a580-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-cvnfn\" (UID: \"7932dd2c-9c74-4c9b-ab33-728a0169a580\") " pod="openshift-marketplace/marketplace-operator-79b997595-cvnfn" Oct 08 13:01:27 crc kubenswrapper[4735]: I1008 13:01:27.760387 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbs8j\" (UniqueName: 
\"kubernetes.io/projected/7932dd2c-9c74-4c9b-ab33-728a0169a580-kube-api-access-tbs8j\") pod \"marketplace-operator-79b997595-cvnfn\" (UID: \"7932dd2c-9c74-4c9b-ab33-728a0169a580\") " pod="openshift-marketplace/marketplace-operator-79b997595-cvnfn" Oct 08 13:01:27 crc kubenswrapper[4735]: I1008 13:01:27.761675 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7932dd2c-9c74-4c9b-ab33-728a0169a580-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-cvnfn\" (UID: \"7932dd2c-9c74-4c9b-ab33-728a0169a580\") " pod="openshift-marketplace/marketplace-operator-79b997595-cvnfn" Oct 08 13:01:27 crc kubenswrapper[4735]: I1008 13:01:27.767239 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7932dd2c-9c74-4c9b-ab33-728a0169a580-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-cvnfn\" (UID: \"7932dd2c-9c74-4c9b-ab33-728a0169a580\") " pod="openshift-marketplace/marketplace-operator-79b997595-cvnfn" Oct 08 13:01:27 crc kubenswrapper[4735]: I1008 13:01:27.786755 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbs8j\" (UniqueName: \"kubernetes.io/projected/7932dd2c-9c74-4c9b-ab33-728a0169a580-kube-api-access-tbs8j\") pod \"marketplace-operator-79b997595-cvnfn\" (UID: \"7932dd2c-9c74-4c9b-ab33-728a0169a580\") " pod="openshift-marketplace/marketplace-operator-79b997595-cvnfn" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.058414 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-cvnfn" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.079410 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-d5hv5" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.080960 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tct8h" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.083571 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gp925" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.089493 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mmrz8" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.093299 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qm4jg" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.267440 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r79sh\" (UniqueName: \"kubernetes.io/projected/6a47c541-ef48-4097-843e-609ef561cadc-kube-api-access-r79sh\") pod \"6a47c541-ef48-4097-843e-609ef561cadc\" (UID: \"6a47c541-ef48-4097-843e-609ef561cadc\") " Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.267501 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33b36b08-22c3-49b3-805e-bd38f2235a1e-utilities\") pod \"33b36b08-22c3-49b3-805e-bd38f2235a1e\" (UID: \"33b36b08-22c3-49b3-805e-bd38f2235a1e\") " Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.267522 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d169069b-e5fe-4018-974d-97cf1c1cad23-marketplace-operator-metrics\") pod \"d169069b-e5fe-4018-974d-97cf1c1cad23\" (UID: \"d169069b-e5fe-4018-974d-97cf1c1cad23\") " Oct 08 13:01:28 crc 
kubenswrapper[4735]: I1008 13:01:28.267550 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5td7v\" (UniqueName: \"kubernetes.io/projected/d169069b-e5fe-4018-974d-97cf1c1cad23-kube-api-access-5td7v\") pod \"d169069b-e5fe-4018-974d-97cf1c1cad23\" (UID: \"d169069b-e5fe-4018-974d-97cf1c1cad23\") " Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.267590 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d169069b-e5fe-4018-974d-97cf1c1cad23-marketplace-trusted-ca\") pod \"d169069b-e5fe-4018-974d-97cf1c1cad23\" (UID: \"d169069b-e5fe-4018-974d-97cf1c1cad23\") " Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.267616 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9255e2a8-81b4-4743-816c-611181271899-catalog-content\") pod \"9255e2a8-81b4-4743-816c-611181271899\" (UID: \"9255e2a8-81b4-4743-816c-611181271899\") " Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.268382 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33b36b08-22c3-49b3-805e-bd38f2235a1e-utilities" (OuterVolumeSpecName: "utilities") pod "33b36b08-22c3-49b3-805e-bd38f2235a1e" (UID: "33b36b08-22c3-49b3-805e-bd38f2235a1e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.269943 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d169069b-e5fe-4018-974d-97cf1c1cad23-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "d169069b-e5fe-4018-974d-97cf1c1cad23" (UID: "d169069b-e5fe-4018-974d-97cf1c1cad23"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.270365 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33b36b08-22c3-49b3-805e-bd38f2235a1e-catalog-content\") pod \"33b36b08-22c3-49b3-805e-bd38f2235a1e\" (UID: \"33b36b08-22c3-49b3-805e-bd38f2235a1e\") " Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.270388 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xtd2\" (UniqueName: \"kubernetes.io/projected/33b36b08-22c3-49b3-805e-bd38f2235a1e-kube-api-access-7xtd2\") pod \"33b36b08-22c3-49b3-805e-bd38f2235a1e\" (UID: \"33b36b08-22c3-49b3-805e-bd38f2235a1e\") " Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.270415 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a47c541-ef48-4097-843e-609ef561cadc-utilities\") pod \"6a47c541-ef48-4097-843e-609ef561cadc\" (UID: \"6a47c541-ef48-4097-843e-609ef561cadc\") " Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.270431 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a47c541-ef48-4097-843e-609ef561cadc-catalog-content\") pod \"6a47c541-ef48-4097-843e-609ef561cadc\" (UID: \"6a47c541-ef48-4097-843e-609ef561cadc\") " Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.270446 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9255e2a8-81b4-4743-816c-611181271899-utilities\") pod \"9255e2a8-81b4-4743-816c-611181271899\" (UID: \"9255e2a8-81b4-4743-816c-611181271899\") " Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.270468 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/2ed19bd4-2ac1-495b-ac83-76fd9393dcdb-catalog-content\") pod \"2ed19bd4-2ac1-495b-ac83-76fd9393dcdb\" (UID: \"2ed19bd4-2ac1-495b-ac83-76fd9393dcdb\") " Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.270501 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtm22\" (UniqueName: \"kubernetes.io/projected/2ed19bd4-2ac1-495b-ac83-76fd9393dcdb-kube-api-access-qtm22\") pod \"2ed19bd4-2ac1-495b-ac83-76fd9393dcdb\" (UID: \"2ed19bd4-2ac1-495b-ac83-76fd9393dcdb\") " Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.270521 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ed19bd4-2ac1-495b-ac83-76fd9393dcdb-utilities\") pod \"2ed19bd4-2ac1-495b-ac83-76fd9393dcdb\" (UID: \"2ed19bd4-2ac1-495b-ac83-76fd9393dcdb\") " Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.270559 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vqp2\" (UniqueName: \"kubernetes.io/projected/9255e2a8-81b4-4743-816c-611181271899-kube-api-access-4vqp2\") pod \"9255e2a8-81b4-4743-816c-611181271899\" (UID: \"9255e2a8-81b4-4743-816c-611181271899\") " Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.270711 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33b36b08-22c3-49b3-805e-bd38f2235a1e-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.270728 4735 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d169069b-e5fe-4018-974d-97cf1c1cad23-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.272517 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/9255e2a8-81b4-4743-816c-611181271899-utilities" (OuterVolumeSpecName: "utilities") pod "9255e2a8-81b4-4743-816c-611181271899" (UID: "9255e2a8-81b4-4743-816c-611181271899"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.273955 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a47c541-ef48-4097-843e-609ef561cadc-utilities" (OuterVolumeSpecName: "utilities") pod "6a47c541-ef48-4097-843e-609ef561cadc" (UID: "6a47c541-ef48-4097-843e-609ef561cadc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.274136 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a47c541-ef48-4097-843e-609ef561cadc-kube-api-access-r79sh" (OuterVolumeSpecName: "kube-api-access-r79sh") pod "6a47c541-ef48-4097-843e-609ef561cadc" (UID: "6a47c541-ef48-4097-843e-609ef561cadc"). InnerVolumeSpecName "kube-api-access-r79sh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.274217 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d169069b-e5fe-4018-974d-97cf1c1cad23-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "d169069b-e5fe-4018-974d-97cf1c1cad23" (UID: "d169069b-e5fe-4018-974d-97cf1c1cad23"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.274776 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d169069b-e5fe-4018-974d-97cf1c1cad23-kube-api-access-5td7v" (OuterVolumeSpecName: "kube-api-access-5td7v") pod "d169069b-e5fe-4018-974d-97cf1c1cad23" (UID: "d169069b-e5fe-4018-974d-97cf1c1cad23"). InnerVolumeSpecName "kube-api-access-5td7v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.274817 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ed19bd4-2ac1-495b-ac83-76fd9393dcdb-kube-api-access-qtm22" (OuterVolumeSpecName: "kube-api-access-qtm22") pod "2ed19bd4-2ac1-495b-ac83-76fd9393dcdb" (UID: "2ed19bd4-2ac1-495b-ac83-76fd9393dcdb"). InnerVolumeSpecName "kube-api-access-qtm22". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.275724 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33b36b08-22c3-49b3-805e-bd38f2235a1e-kube-api-access-7xtd2" (OuterVolumeSpecName: "kube-api-access-7xtd2") pod "33b36b08-22c3-49b3-805e-bd38f2235a1e" (UID: "33b36b08-22c3-49b3-805e-bd38f2235a1e"). InnerVolumeSpecName "kube-api-access-7xtd2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.277971 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ed19bd4-2ac1-495b-ac83-76fd9393dcdb-utilities" (OuterVolumeSpecName: "utilities") pod "2ed19bd4-2ac1-495b-ac83-76fd9393dcdb" (UID: "2ed19bd4-2ac1-495b-ac83-76fd9393dcdb"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.286820 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9255e2a8-81b4-4743-816c-611181271899-kube-api-access-4vqp2" (OuterVolumeSpecName: "kube-api-access-4vqp2") pod "9255e2a8-81b4-4743-816c-611181271899" (UID: "9255e2a8-81b4-4743-816c-611181271899"). InnerVolumeSpecName "kube-api-access-4vqp2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.301320 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ed19bd4-2ac1-495b-ac83-76fd9393dcdb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2ed19bd4-2ac1-495b-ac83-76fd9393dcdb" (UID: "2ed19bd4-2ac1-495b-ac83-76fd9393dcdb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.303841 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-cvnfn"] Oct 08 13:01:28 crc kubenswrapper[4735]: W1008 13:01:28.314117 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7932dd2c_9c74_4c9b_ab33_728a0169a580.slice/crio-11235343df28c5072058bc945eaa1cd229524d1ef25950bf5677c57a3bc0026a WatchSource:0}: Error finding container 11235343df28c5072058bc945eaa1cd229524d1ef25950bf5677c57a3bc0026a: Status 404 returned error can't find the container with id 11235343df28c5072058bc945eaa1cd229524d1ef25950bf5677c57a3bc0026a Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.335824 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a47c541-ef48-4097-843e-609ef561cadc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6a47c541-ef48-4097-843e-609ef561cadc" (UID: 
"6a47c541-ef48-4097-843e-609ef561cadc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.336059 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33b36b08-22c3-49b3-805e-bd38f2235a1e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "33b36b08-22c3-49b3-805e-bd38f2235a1e" (UID: "33b36b08-22c3-49b3-805e-bd38f2235a1e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.371595 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33b36b08-22c3-49b3-805e-bd38f2235a1e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.371630 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xtd2\" (UniqueName: \"kubernetes.io/projected/33b36b08-22c3-49b3-805e-bd38f2235a1e-kube-api-access-7xtd2\") on node \"crc\" DevicePath \"\"" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.371644 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a47c541-ef48-4097-843e-609ef561cadc-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.371655 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a47c541-ef48-4097-843e-609ef561cadc-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.371666 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9255e2a8-81b4-4743-816c-611181271899-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.371677 4735 
reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ed19bd4-2ac1-495b-ac83-76fd9393dcdb-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.371691 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ed19bd4-2ac1-495b-ac83-76fd9393dcdb-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.371701 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtm22\" (UniqueName: \"kubernetes.io/projected/2ed19bd4-2ac1-495b-ac83-76fd9393dcdb-kube-api-access-qtm22\") on node \"crc\" DevicePath \"\"" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.371712 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vqp2\" (UniqueName: \"kubernetes.io/projected/9255e2a8-81b4-4743-816c-611181271899-kube-api-access-4vqp2\") on node \"crc\" DevicePath \"\"" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.371723 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r79sh\" (UniqueName: \"kubernetes.io/projected/6a47c541-ef48-4097-843e-609ef561cadc-kube-api-access-r79sh\") on node \"crc\" DevicePath \"\"" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.371735 4735 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d169069b-e5fe-4018-974d-97cf1c1cad23-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.371747 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5td7v\" (UniqueName: \"kubernetes.io/projected/d169069b-e5fe-4018-974d-97cf1c1cad23-kube-api-access-5td7v\") on node \"crc\" DevicePath \"\"" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.387572 4735 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/9255e2a8-81b4-4743-816c-611181271899-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9255e2a8-81b4-4743-816c-611181271899" (UID: "9255e2a8-81b4-4743-816c-611181271899"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.472504 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9255e2a8-81b4-4743-816c-611181271899-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.484149 4735 generic.go:334] "Generic (PLEG): container finished" podID="33b36b08-22c3-49b3-805e-bd38f2235a1e" containerID="b6112f431d3cb9b36575be1a98bc788a313a852de4b7ab646aa5430750f510d0" exitCode=0 Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.484223 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d5hv5" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.484245 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d5hv5" event={"ID":"33b36b08-22c3-49b3-805e-bd38f2235a1e","Type":"ContainerDied","Data":"b6112f431d3cb9b36575be1a98bc788a313a852de4b7ab646aa5430750f510d0"} Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.484532 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d5hv5" event={"ID":"33b36b08-22c3-49b3-805e-bd38f2235a1e","Type":"ContainerDied","Data":"00895780c53dbbd3c1799ec7c22c867c482d8d4c31de9833428314f3d37115d3"} Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.484560 4735 scope.go:117] "RemoveContainer" containerID="b6112f431d3cb9b36575be1a98bc788a313a852de4b7ab646aa5430750f510d0" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.489288 4735 generic.go:334] "Generic (PLEG): container finished" 
podID="6a47c541-ef48-4097-843e-609ef561cadc" containerID="1b337b9edf91b6e186e4a9f41d26448d67f23700647fe46366c435b0620b5ed4" exitCode=0 Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.489356 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tct8h" event={"ID":"6a47c541-ef48-4097-843e-609ef561cadc","Type":"ContainerDied","Data":"1b337b9edf91b6e186e4a9f41d26448d67f23700647fe46366c435b0620b5ed4"} Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.489392 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tct8h" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.489397 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tct8h" event={"ID":"6a47c541-ef48-4097-843e-609ef561cadc","Type":"ContainerDied","Data":"6ccc82be60b1d17671654a0ed99093d32d68112607ce2a6e306710e2f4f852fe"} Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.491826 4735 generic.go:334] "Generic (PLEG): container finished" podID="9255e2a8-81b4-4743-816c-611181271899" containerID="157c010bed4355c919a7f9eebc6f960cefc30931655ba08c9a234382bc2aa20d" exitCode=0 Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.491885 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mmrz8" event={"ID":"9255e2a8-81b4-4743-816c-611181271899","Type":"ContainerDied","Data":"157c010bed4355c919a7f9eebc6f960cefc30931655ba08c9a234382bc2aa20d"} Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.491900 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mmrz8" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.491910 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mmrz8" event={"ID":"9255e2a8-81b4-4743-816c-611181271899","Type":"ContainerDied","Data":"2be6eda20f4da36003f3420fe8be620239eda00fb91a81824eb70c314d1de9ff"} Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.494533 4735 generic.go:334] "Generic (PLEG): container finished" podID="2ed19bd4-2ac1-495b-ac83-76fd9393dcdb" containerID="0edb383fd38e0f56110d978d0e4ca7dc4a14ac3fb46fc1c8374a4c9866bff897" exitCode=0 Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.494593 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qm4jg" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.494626 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qm4jg" event={"ID":"2ed19bd4-2ac1-495b-ac83-76fd9393dcdb","Type":"ContainerDied","Data":"0edb383fd38e0f56110d978d0e4ca7dc4a14ac3fb46fc1c8374a4c9866bff897"} Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.494672 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qm4jg" event={"ID":"2ed19bd4-2ac1-495b-ac83-76fd9393dcdb","Type":"ContainerDied","Data":"ced7da65ee89803635ef5e1268253bcf1d7dce9bba2c70df3bb067c6759198b8"} Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.503293 4735 scope.go:117] "RemoveContainer" containerID="f045b9691d7f76eae0a179ffdfc91a1fd085367220400ea36af4fdf11c1bde16" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.506158 4735 generic.go:334] "Generic (PLEG): container finished" podID="d169069b-e5fe-4018-974d-97cf1c1cad23" containerID="9018ee780dd3c53eb196a1d585b4c20c1e88249f5e4ecd0008800fa05035cbdd" exitCode=0 Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.506216 4735 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gp925" event={"ID":"d169069b-e5fe-4018-974d-97cf1c1cad23","Type":"ContainerDied","Data":"9018ee780dd3c53eb196a1d585b4c20c1e88249f5e4ecd0008800fa05035cbdd"} Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.506212 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gp925" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.506242 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gp925" event={"ID":"d169069b-e5fe-4018-974d-97cf1c1cad23","Type":"ContainerDied","Data":"daf0b66a40ab41c82388a090a16eb5de0ff0011b36534ed88f5d3b119e67464b"} Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.507717 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-cvnfn" event={"ID":"7932dd2c-9c74-4c9b-ab33-728a0169a580","Type":"ContainerStarted","Data":"e040f5ad4082af566a709111a1e78f8cd48950bef2074bb6d928a5d8b7c020be"} Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.507744 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-cvnfn" event={"ID":"7932dd2c-9c74-4c9b-ab33-728a0169a580","Type":"ContainerStarted","Data":"11235343df28c5072058bc945eaa1cd229524d1ef25950bf5677c57a3bc0026a"} Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.508694 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-cvnfn" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.511090 4735 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-cvnfn container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.56:8080/healthz\": dial tcp 10.217.0.56:8080: connect: 
connection refused" start-of-body= Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.511135 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-cvnfn" podUID="7932dd2c-9c74-4c9b-ab33-728a0169a580" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.56:8080/healthz\": dial tcp 10.217.0.56:8080: connect: connection refused" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.532976 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-cvnfn" podStartSLOduration=1.532956539 podStartE2EDuration="1.532956539s" podCreationTimestamp="2025-10-08 13:01:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 13:01:28.524577427 +0000 UTC m=+338.332881971" watchObservedRunningTime="2025-10-08 13:01:28.532956539 +0000 UTC m=+338.341261083" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.547762 4735 scope.go:117] "RemoveContainer" containerID="0d938b1fd656921d0733ca9fc07ae7f3361240d220dd196823e88153a8617035" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.548657 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d5hv5"] Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.552034 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-d5hv5"] Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.555681 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tct8h"] Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.562350 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tct8h"] Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.566769 4735 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gp925"] Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.571893 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gp925"] Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.575943 4735 scope.go:117] "RemoveContainer" containerID="b6112f431d3cb9b36575be1a98bc788a313a852de4b7ab646aa5430750f510d0" Oct 08 13:01:28 crc kubenswrapper[4735]: E1008 13:01:28.576577 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6112f431d3cb9b36575be1a98bc788a313a852de4b7ab646aa5430750f510d0\": container with ID starting with b6112f431d3cb9b36575be1a98bc788a313a852de4b7ab646aa5430750f510d0 not found: ID does not exist" containerID="b6112f431d3cb9b36575be1a98bc788a313a852de4b7ab646aa5430750f510d0" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.576614 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6112f431d3cb9b36575be1a98bc788a313a852de4b7ab646aa5430750f510d0"} err="failed to get container status \"b6112f431d3cb9b36575be1a98bc788a313a852de4b7ab646aa5430750f510d0\": rpc error: code = NotFound desc = could not find container \"b6112f431d3cb9b36575be1a98bc788a313a852de4b7ab646aa5430750f510d0\": container with ID starting with b6112f431d3cb9b36575be1a98bc788a313a852de4b7ab646aa5430750f510d0 not found: ID does not exist" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.576640 4735 scope.go:117] "RemoveContainer" containerID="f045b9691d7f76eae0a179ffdfc91a1fd085367220400ea36af4fdf11c1bde16" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.577193 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qm4jg"] Oct 08 13:01:28 crc kubenswrapper[4735]: E1008 13:01:28.577192 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"f045b9691d7f76eae0a179ffdfc91a1fd085367220400ea36af4fdf11c1bde16\": container with ID starting with f045b9691d7f76eae0a179ffdfc91a1fd085367220400ea36af4fdf11c1bde16 not found: ID does not exist" containerID="f045b9691d7f76eae0a179ffdfc91a1fd085367220400ea36af4fdf11c1bde16" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.577257 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f045b9691d7f76eae0a179ffdfc91a1fd085367220400ea36af4fdf11c1bde16"} err="failed to get container status \"f045b9691d7f76eae0a179ffdfc91a1fd085367220400ea36af4fdf11c1bde16\": rpc error: code = NotFound desc = could not find container \"f045b9691d7f76eae0a179ffdfc91a1fd085367220400ea36af4fdf11c1bde16\": container with ID starting with f045b9691d7f76eae0a179ffdfc91a1fd085367220400ea36af4fdf11c1bde16 not found: ID does not exist" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.577286 4735 scope.go:117] "RemoveContainer" containerID="0d938b1fd656921d0733ca9fc07ae7f3361240d220dd196823e88153a8617035" Oct 08 13:01:28 crc kubenswrapper[4735]: E1008 13:01:28.577663 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d938b1fd656921d0733ca9fc07ae7f3361240d220dd196823e88153a8617035\": container with ID starting with 0d938b1fd656921d0733ca9fc07ae7f3361240d220dd196823e88153a8617035 not found: ID does not exist" containerID="0d938b1fd656921d0733ca9fc07ae7f3361240d220dd196823e88153a8617035" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.577690 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d938b1fd656921d0733ca9fc07ae7f3361240d220dd196823e88153a8617035"} err="failed to get container status \"0d938b1fd656921d0733ca9fc07ae7f3361240d220dd196823e88153a8617035\": rpc error: code = NotFound desc = could not find container 
\"0d938b1fd656921d0733ca9fc07ae7f3361240d220dd196823e88153a8617035\": container with ID starting with 0d938b1fd656921d0733ca9fc07ae7f3361240d220dd196823e88153a8617035 not found: ID does not exist" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.577709 4735 scope.go:117] "RemoveContainer" containerID="1b337b9edf91b6e186e4a9f41d26448d67f23700647fe46366c435b0620b5ed4" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.580031 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qm4jg"] Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.589104 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mmrz8"] Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.596230 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mmrz8"] Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.597013 4735 scope.go:117] "RemoveContainer" containerID="f43ff6881a40cd3e8f39f3345b1018d4500543c789bcd26dcf5632ba267ec148" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.616353 4735 scope.go:117] "RemoveContainer" containerID="4d3c37da2dfc60a6cbcbe8085edc93d10fcfb27a51424029a29a1f04f8ace2f2" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.628699 4735 scope.go:117] "RemoveContainer" containerID="1b337b9edf91b6e186e4a9f41d26448d67f23700647fe46366c435b0620b5ed4" Oct 08 13:01:28 crc kubenswrapper[4735]: E1008 13:01:28.629281 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b337b9edf91b6e186e4a9f41d26448d67f23700647fe46366c435b0620b5ed4\": container with ID starting with 1b337b9edf91b6e186e4a9f41d26448d67f23700647fe46366c435b0620b5ed4 not found: ID does not exist" containerID="1b337b9edf91b6e186e4a9f41d26448d67f23700647fe46366c435b0620b5ed4" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.629315 4735 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b337b9edf91b6e186e4a9f41d26448d67f23700647fe46366c435b0620b5ed4"} err="failed to get container status \"1b337b9edf91b6e186e4a9f41d26448d67f23700647fe46366c435b0620b5ed4\": rpc error: code = NotFound desc = could not find container \"1b337b9edf91b6e186e4a9f41d26448d67f23700647fe46366c435b0620b5ed4\": container with ID starting with 1b337b9edf91b6e186e4a9f41d26448d67f23700647fe46366c435b0620b5ed4 not found: ID does not exist" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.629376 4735 scope.go:117] "RemoveContainer" containerID="f43ff6881a40cd3e8f39f3345b1018d4500543c789bcd26dcf5632ba267ec148" Oct 08 13:01:28 crc kubenswrapper[4735]: E1008 13:01:28.629937 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f43ff6881a40cd3e8f39f3345b1018d4500543c789bcd26dcf5632ba267ec148\": container with ID starting with f43ff6881a40cd3e8f39f3345b1018d4500543c789bcd26dcf5632ba267ec148 not found: ID does not exist" containerID="f43ff6881a40cd3e8f39f3345b1018d4500543c789bcd26dcf5632ba267ec148" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.629988 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f43ff6881a40cd3e8f39f3345b1018d4500543c789bcd26dcf5632ba267ec148"} err="failed to get container status \"f43ff6881a40cd3e8f39f3345b1018d4500543c789bcd26dcf5632ba267ec148\": rpc error: code = NotFound desc = could not find container \"f43ff6881a40cd3e8f39f3345b1018d4500543c789bcd26dcf5632ba267ec148\": container with ID starting with f43ff6881a40cd3e8f39f3345b1018d4500543c789bcd26dcf5632ba267ec148 not found: ID does not exist" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.630021 4735 scope.go:117] "RemoveContainer" containerID="4d3c37da2dfc60a6cbcbe8085edc93d10fcfb27a51424029a29a1f04f8ace2f2" Oct 08 13:01:28 crc kubenswrapper[4735]: E1008 13:01:28.630580 4735 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d3c37da2dfc60a6cbcbe8085edc93d10fcfb27a51424029a29a1f04f8ace2f2\": container with ID starting with 4d3c37da2dfc60a6cbcbe8085edc93d10fcfb27a51424029a29a1f04f8ace2f2 not found: ID does not exist" containerID="4d3c37da2dfc60a6cbcbe8085edc93d10fcfb27a51424029a29a1f04f8ace2f2" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.630612 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d3c37da2dfc60a6cbcbe8085edc93d10fcfb27a51424029a29a1f04f8ace2f2"} err="failed to get container status \"4d3c37da2dfc60a6cbcbe8085edc93d10fcfb27a51424029a29a1f04f8ace2f2\": rpc error: code = NotFound desc = could not find container \"4d3c37da2dfc60a6cbcbe8085edc93d10fcfb27a51424029a29a1f04f8ace2f2\": container with ID starting with 4d3c37da2dfc60a6cbcbe8085edc93d10fcfb27a51424029a29a1f04f8ace2f2 not found: ID does not exist" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.630633 4735 scope.go:117] "RemoveContainer" containerID="157c010bed4355c919a7f9eebc6f960cefc30931655ba08c9a234382bc2aa20d" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.643489 4735 scope.go:117] "RemoveContainer" containerID="407db0863f505e5a24f5e37f721b5c4d4f06cd6160e109833b90971c41320632" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.660593 4735 scope.go:117] "RemoveContainer" containerID="41cc2caf22bee8b85d9166b2bd3ae9273f476887ae6724ab52260af38eedf89b" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.672628 4735 scope.go:117] "RemoveContainer" containerID="157c010bed4355c919a7f9eebc6f960cefc30931655ba08c9a234382bc2aa20d" Oct 08 13:01:28 crc kubenswrapper[4735]: E1008 13:01:28.673267 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"157c010bed4355c919a7f9eebc6f960cefc30931655ba08c9a234382bc2aa20d\": container with ID starting with 
157c010bed4355c919a7f9eebc6f960cefc30931655ba08c9a234382bc2aa20d not found: ID does not exist" containerID="157c010bed4355c919a7f9eebc6f960cefc30931655ba08c9a234382bc2aa20d" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.673294 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"157c010bed4355c919a7f9eebc6f960cefc30931655ba08c9a234382bc2aa20d"} err="failed to get container status \"157c010bed4355c919a7f9eebc6f960cefc30931655ba08c9a234382bc2aa20d\": rpc error: code = NotFound desc = could not find container \"157c010bed4355c919a7f9eebc6f960cefc30931655ba08c9a234382bc2aa20d\": container with ID starting with 157c010bed4355c919a7f9eebc6f960cefc30931655ba08c9a234382bc2aa20d not found: ID does not exist" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.673316 4735 scope.go:117] "RemoveContainer" containerID="407db0863f505e5a24f5e37f721b5c4d4f06cd6160e109833b90971c41320632" Oct 08 13:01:28 crc kubenswrapper[4735]: E1008 13:01:28.673623 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"407db0863f505e5a24f5e37f721b5c4d4f06cd6160e109833b90971c41320632\": container with ID starting with 407db0863f505e5a24f5e37f721b5c4d4f06cd6160e109833b90971c41320632 not found: ID does not exist" containerID="407db0863f505e5a24f5e37f721b5c4d4f06cd6160e109833b90971c41320632" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.673642 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"407db0863f505e5a24f5e37f721b5c4d4f06cd6160e109833b90971c41320632"} err="failed to get container status \"407db0863f505e5a24f5e37f721b5c4d4f06cd6160e109833b90971c41320632\": rpc error: code = NotFound desc = could not find container \"407db0863f505e5a24f5e37f721b5c4d4f06cd6160e109833b90971c41320632\": container with ID starting with 407db0863f505e5a24f5e37f721b5c4d4f06cd6160e109833b90971c41320632 not found: ID does not 
exist" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.673656 4735 scope.go:117] "RemoveContainer" containerID="41cc2caf22bee8b85d9166b2bd3ae9273f476887ae6724ab52260af38eedf89b" Oct 08 13:01:28 crc kubenswrapper[4735]: E1008 13:01:28.675567 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41cc2caf22bee8b85d9166b2bd3ae9273f476887ae6724ab52260af38eedf89b\": container with ID starting with 41cc2caf22bee8b85d9166b2bd3ae9273f476887ae6724ab52260af38eedf89b not found: ID does not exist" containerID="41cc2caf22bee8b85d9166b2bd3ae9273f476887ae6724ab52260af38eedf89b" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.675600 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41cc2caf22bee8b85d9166b2bd3ae9273f476887ae6724ab52260af38eedf89b"} err="failed to get container status \"41cc2caf22bee8b85d9166b2bd3ae9273f476887ae6724ab52260af38eedf89b\": rpc error: code = NotFound desc = could not find container \"41cc2caf22bee8b85d9166b2bd3ae9273f476887ae6724ab52260af38eedf89b\": container with ID starting with 41cc2caf22bee8b85d9166b2bd3ae9273f476887ae6724ab52260af38eedf89b not found: ID does not exist" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.675624 4735 scope.go:117] "RemoveContainer" containerID="0edb383fd38e0f56110d978d0e4ca7dc4a14ac3fb46fc1c8374a4c9866bff897" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.697208 4735 scope.go:117] "RemoveContainer" containerID="ed69eee8597b1d73cc583a6fbf4ccafb480b939eb37e18fd97b6fa53686837ce" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.711671 4735 scope.go:117] "RemoveContainer" containerID="328fbac8e57991fa66b96757d45f54c5abff2c5cd3aeaa2516646763056131fa" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.723906 4735 scope.go:117] "RemoveContainer" containerID="0edb383fd38e0f56110d978d0e4ca7dc4a14ac3fb46fc1c8374a4c9866bff897" Oct 08 13:01:28 crc 
kubenswrapper[4735]: E1008 13:01:28.724253 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0edb383fd38e0f56110d978d0e4ca7dc4a14ac3fb46fc1c8374a4c9866bff897\": container with ID starting with 0edb383fd38e0f56110d978d0e4ca7dc4a14ac3fb46fc1c8374a4c9866bff897 not found: ID does not exist" containerID="0edb383fd38e0f56110d978d0e4ca7dc4a14ac3fb46fc1c8374a4c9866bff897" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.724288 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0edb383fd38e0f56110d978d0e4ca7dc4a14ac3fb46fc1c8374a4c9866bff897"} err="failed to get container status \"0edb383fd38e0f56110d978d0e4ca7dc4a14ac3fb46fc1c8374a4c9866bff897\": rpc error: code = NotFound desc = could not find container \"0edb383fd38e0f56110d978d0e4ca7dc4a14ac3fb46fc1c8374a4c9866bff897\": container with ID starting with 0edb383fd38e0f56110d978d0e4ca7dc4a14ac3fb46fc1c8374a4c9866bff897 not found: ID does not exist" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.724315 4735 scope.go:117] "RemoveContainer" containerID="ed69eee8597b1d73cc583a6fbf4ccafb480b939eb37e18fd97b6fa53686837ce" Oct 08 13:01:28 crc kubenswrapper[4735]: E1008 13:01:28.724672 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed69eee8597b1d73cc583a6fbf4ccafb480b939eb37e18fd97b6fa53686837ce\": container with ID starting with ed69eee8597b1d73cc583a6fbf4ccafb480b939eb37e18fd97b6fa53686837ce not found: ID does not exist" containerID="ed69eee8597b1d73cc583a6fbf4ccafb480b939eb37e18fd97b6fa53686837ce" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.724693 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed69eee8597b1d73cc583a6fbf4ccafb480b939eb37e18fd97b6fa53686837ce"} err="failed to get container status 
\"ed69eee8597b1d73cc583a6fbf4ccafb480b939eb37e18fd97b6fa53686837ce\": rpc error: code = NotFound desc = could not find container \"ed69eee8597b1d73cc583a6fbf4ccafb480b939eb37e18fd97b6fa53686837ce\": container with ID starting with ed69eee8597b1d73cc583a6fbf4ccafb480b939eb37e18fd97b6fa53686837ce not found: ID does not exist" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.724705 4735 scope.go:117] "RemoveContainer" containerID="328fbac8e57991fa66b96757d45f54c5abff2c5cd3aeaa2516646763056131fa" Oct 08 13:01:28 crc kubenswrapper[4735]: E1008 13:01:28.724947 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"328fbac8e57991fa66b96757d45f54c5abff2c5cd3aeaa2516646763056131fa\": container with ID starting with 328fbac8e57991fa66b96757d45f54c5abff2c5cd3aeaa2516646763056131fa not found: ID does not exist" containerID="328fbac8e57991fa66b96757d45f54c5abff2c5cd3aeaa2516646763056131fa" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.724968 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"328fbac8e57991fa66b96757d45f54c5abff2c5cd3aeaa2516646763056131fa"} err="failed to get container status \"328fbac8e57991fa66b96757d45f54c5abff2c5cd3aeaa2516646763056131fa\": rpc error: code = NotFound desc = could not find container \"328fbac8e57991fa66b96757d45f54c5abff2c5cd3aeaa2516646763056131fa\": container with ID starting with 328fbac8e57991fa66b96757d45f54c5abff2c5cd3aeaa2516646763056131fa not found: ID does not exist" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.724982 4735 scope.go:117] "RemoveContainer" containerID="9018ee780dd3c53eb196a1d585b4c20c1e88249f5e4ecd0008800fa05035cbdd" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.738662 4735 scope.go:117] "RemoveContainer" containerID="9018ee780dd3c53eb196a1d585b4c20c1e88249f5e4ecd0008800fa05035cbdd" Oct 08 13:01:28 crc kubenswrapper[4735]: E1008 13:01:28.739004 4735 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9018ee780dd3c53eb196a1d585b4c20c1e88249f5e4ecd0008800fa05035cbdd\": container with ID starting with 9018ee780dd3c53eb196a1d585b4c20c1e88249f5e4ecd0008800fa05035cbdd not found: ID does not exist" containerID="9018ee780dd3c53eb196a1d585b4c20c1e88249f5e4ecd0008800fa05035cbdd" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.739034 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9018ee780dd3c53eb196a1d585b4c20c1e88249f5e4ecd0008800fa05035cbdd"} err="failed to get container status \"9018ee780dd3c53eb196a1d585b4c20c1e88249f5e4ecd0008800fa05035cbdd\": rpc error: code = NotFound desc = could not find container \"9018ee780dd3c53eb196a1d585b4c20c1e88249f5e4ecd0008800fa05035cbdd\": container with ID starting with 9018ee780dd3c53eb196a1d585b4c20c1e88249f5e4ecd0008800fa05035cbdd not found: ID does not exist" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.902066 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ed19bd4-2ac1-495b-ac83-76fd9393dcdb" path="/var/lib/kubelet/pods/2ed19bd4-2ac1-495b-ac83-76fd9393dcdb/volumes" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.902783 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33b36b08-22c3-49b3-805e-bd38f2235a1e" path="/var/lib/kubelet/pods/33b36b08-22c3-49b3-805e-bd38f2235a1e/volumes" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.903445 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a47c541-ef48-4097-843e-609ef561cadc" path="/var/lib/kubelet/pods/6a47c541-ef48-4097-843e-609ef561cadc/volumes" Oct 08 13:01:28 crc kubenswrapper[4735]: I1008 13:01:28.904505 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9255e2a8-81b4-4743-816c-611181271899" path="/var/lib/kubelet/pods/9255e2a8-81b4-4743-816c-611181271899/volumes" Oct 08 13:01:28 crc 
kubenswrapper[4735]: I1008 13:01:28.905440 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d169069b-e5fe-4018-974d-97cf1c1cad23" path="/var/lib/kubelet/pods/d169069b-e5fe-4018-974d-97cf1c1cad23/volumes" Oct 08 13:01:29 crc kubenswrapper[4735]: I1008 13:01:29.518394 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-cvnfn" Oct 08 13:01:29 crc kubenswrapper[4735]: I1008 13:01:29.747917 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-t9ktd"] Oct 08 13:01:29 crc kubenswrapper[4735]: E1008 13:01:29.748147 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d169069b-e5fe-4018-974d-97cf1c1cad23" containerName="marketplace-operator" Oct 08 13:01:29 crc kubenswrapper[4735]: I1008 13:01:29.748162 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="d169069b-e5fe-4018-974d-97cf1c1cad23" containerName="marketplace-operator" Oct 08 13:01:29 crc kubenswrapper[4735]: E1008 13:01:29.748175 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33b36b08-22c3-49b3-805e-bd38f2235a1e" containerName="registry-server" Oct 08 13:01:29 crc kubenswrapper[4735]: I1008 13:01:29.748184 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="33b36b08-22c3-49b3-805e-bd38f2235a1e" containerName="registry-server" Oct 08 13:01:29 crc kubenswrapper[4735]: E1008 13:01:29.748196 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a47c541-ef48-4097-843e-609ef561cadc" containerName="extract-utilities" Oct 08 13:01:29 crc kubenswrapper[4735]: I1008 13:01:29.748204 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a47c541-ef48-4097-843e-609ef561cadc" containerName="extract-utilities" Oct 08 13:01:29 crc kubenswrapper[4735]: E1008 13:01:29.748214 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ed19bd4-2ac1-495b-ac83-76fd9393dcdb" 
containerName="extract-content" Oct 08 13:01:29 crc kubenswrapper[4735]: I1008 13:01:29.748223 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ed19bd4-2ac1-495b-ac83-76fd9393dcdb" containerName="extract-content" Oct 08 13:01:29 crc kubenswrapper[4735]: E1008 13:01:29.748231 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9255e2a8-81b4-4743-816c-611181271899" containerName="extract-utilities" Oct 08 13:01:29 crc kubenswrapper[4735]: I1008 13:01:29.748238 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="9255e2a8-81b4-4743-816c-611181271899" containerName="extract-utilities" Oct 08 13:01:29 crc kubenswrapper[4735]: E1008 13:01:29.748250 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ed19bd4-2ac1-495b-ac83-76fd9393dcdb" containerName="extract-utilities" Oct 08 13:01:29 crc kubenswrapper[4735]: I1008 13:01:29.748258 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ed19bd4-2ac1-495b-ac83-76fd9393dcdb" containerName="extract-utilities" Oct 08 13:01:29 crc kubenswrapper[4735]: E1008 13:01:29.748270 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ed19bd4-2ac1-495b-ac83-76fd9393dcdb" containerName="registry-server" Oct 08 13:01:29 crc kubenswrapper[4735]: I1008 13:01:29.748278 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ed19bd4-2ac1-495b-ac83-76fd9393dcdb" containerName="registry-server" Oct 08 13:01:29 crc kubenswrapper[4735]: E1008 13:01:29.748291 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33b36b08-22c3-49b3-805e-bd38f2235a1e" containerName="extract-utilities" Oct 08 13:01:29 crc kubenswrapper[4735]: I1008 13:01:29.748298 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="33b36b08-22c3-49b3-805e-bd38f2235a1e" containerName="extract-utilities" Oct 08 13:01:29 crc kubenswrapper[4735]: E1008 13:01:29.748310 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9255e2a8-81b4-4743-816c-611181271899" 
containerName="registry-server" Oct 08 13:01:29 crc kubenswrapper[4735]: I1008 13:01:29.748319 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="9255e2a8-81b4-4743-816c-611181271899" containerName="registry-server" Oct 08 13:01:29 crc kubenswrapper[4735]: E1008 13:01:29.748331 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a47c541-ef48-4097-843e-609ef561cadc" containerName="extract-content" Oct 08 13:01:29 crc kubenswrapper[4735]: I1008 13:01:29.748340 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a47c541-ef48-4097-843e-609ef561cadc" containerName="extract-content" Oct 08 13:01:29 crc kubenswrapper[4735]: E1008 13:01:29.748349 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a47c541-ef48-4097-843e-609ef561cadc" containerName="registry-server" Oct 08 13:01:29 crc kubenswrapper[4735]: I1008 13:01:29.748358 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a47c541-ef48-4097-843e-609ef561cadc" containerName="registry-server" Oct 08 13:01:29 crc kubenswrapper[4735]: E1008 13:01:29.748369 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9255e2a8-81b4-4743-816c-611181271899" containerName="extract-content" Oct 08 13:01:29 crc kubenswrapper[4735]: I1008 13:01:29.748377 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="9255e2a8-81b4-4743-816c-611181271899" containerName="extract-content" Oct 08 13:01:29 crc kubenswrapper[4735]: E1008 13:01:29.748385 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33b36b08-22c3-49b3-805e-bd38f2235a1e" containerName="extract-content" Oct 08 13:01:29 crc kubenswrapper[4735]: I1008 13:01:29.748392 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="33b36b08-22c3-49b3-805e-bd38f2235a1e" containerName="extract-content" Oct 08 13:01:29 crc kubenswrapper[4735]: I1008 13:01:29.748508 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="d169069b-e5fe-4018-974d-97cf1c1cad23" 
containerName="marketplace-operator" Oct 08 13:01:29 crc kubenswrapper[4735]: I1008 13:01:29.748523 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="9255e2a8-81b4-4743-816c-611181271899" containerName="registry-server" Oct 08 13:01:29 crc kubenswrapper[4735]: I1008 13:01:29.748532 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="33b36b08-22c3-49b3-805e-bd38f2235a1e" containerName="registry-server" Oct 08 13:01:29 crc kubenswrapper[4735]: I1008 13:01:29.748543 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a47c541-ef48-4097-843e-609ef561cadc" containerName="registry-server" Oct 08 13:01:29 crc kubenswrapper[4735]: I1008 13:01:29.748554 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ed19bd4-2ac1-495b-ac83-76fd9393dcdb" containerName="registry-server" Oct 08 13:01:29 crc kubenswrapper[4735]: I1008 13:01:29.749373 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t9ktd" Oct 08 13:01:29 crc kubenswrapper[4735]: I1008 13:01:29.751646 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 08 13:01:29 crc kubenswrapper[4735]: I1008 13:01:29.774081 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t9ktd"] Oct 08 13:01:29 crc kubenswrapper[4735]: I1008 13:01:29.787598 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7jjc\" (UniqueName: \"kubernetes.io/projected/8a2c0c5f-5bfa-40ba-87da-87d1e498abda-kube-api-access-r7jjc\") pod \"certified-operators-t9ktd\" (UID: \"8a2c0c5f-5bfa-40ba-87da-87d1e498abda\") " pod="openshift-marketplace/certified-operators-t9ktd" Oct 08 13:01:29 crc kubenswrapper[4735]: I1008 13:01:29.787644 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a2c0c5f-5bfa-40ba-87da-87d1e498abda-utilities\") pod \"certified-operators-t9ktd\" (UID: \"8a2c0c5f-5bfa-40ba-87da-87d1e498abda\") " pod="openshift-marketplace/certified-operators-t9ktd" Oct 08 13:01:29 crc kubenswrapper[4735]: I1008 13:01:29.787696 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a2c0c5f-5bfa-40ba-87da-87d1e498abda-catalog-content\") pod \"certified-operators-t9ktd\" (UID: \"8a2c0c5f-5bfa-40ba-87da-87d1e498abda\") " pod="openshift-marketplace/certified-operators-t9ktd" Oct 08 13:01:29 crc kubenswrapper[4735]: I1008 13:01:29.888972 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7jjc\" (UniqueName: \"kubernetes.io/projected/8a2c0c5f-5bfa-40ba-87da-87d1e498abda-kube-api-access-r7jjc\") pod \"certified-operators-t9ktd\" (UID: \"8a2c0c5f-5bfa-40ba-87da-87d1e498abda\") " pod="openshift-marketplace/certified-operators-t9ktd" Oct 08 13:01:29 crc kubenswrapper[4735]: I1008 13:01:29.889045 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a2c0c5f-5bfa-40ba-87da-87d1e498abda-utilities\") pod \"certified-operators-t9ktd\" (UID: \"8a2c0c5f-5bfa-40ba-87da-87d1e498abda\") " pod="openshift-marketplace/certified-operators-t9ktd" Oct 08 13:01:29 crc kubenswrapper[4735]: I1008 13:01:29.889112 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a2c0c5f-5bfa-40ba-87da-87d1e498abda-catalog-content\") pod \"certified-operators-t9ktd\" (UID: \"8a2c0c5f-5bfa-40ba-87da-87d1e498abda\") " pod="openshift-marketplace/certified-operators-t9ktd" Oct 08 13:01:29 crc kubenswrapper[4735]: I1008 13:01:29.889632 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a2c0c5f-5bfa-40ba-87da-87d1e498abda-catalog-content\") pod \"certified-operators-t9ktd\" (UID: \"8a2c0c5f-5bfa-40ba-87da-87d1e498abda\") " pod="openshift-marketplace/certified-operators-t9ktd" Oct 08 13:01:29 crc kubenswrapper[4735]: I1008 13:01:29.889673 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a2c0c5f-5bfa-40ba-87da-87d1e498abda-utilities\") pod \"certified-operators-t9ktd\" (UID: \"8a2c0c5f-5bfa-40ba-87da-87d1e498abda\") " pod="openshift-marketplace/certified-operators-t9ktd" Oct 08 13:01:29 crc kubenswrapper[4735]: I1008 13:01:29.907067 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7jjc\" (UniqueName: \"kubernetes.io/projected/8a2c0c5f-5bfa-40ba-87da-87d1e498abda-kube-api-access-r7jjc\") pod \"certified-operators-t9ktd\" (UID: \"8a2c0c5f-5bfa-40ba-87da-87d1e498abda\") " pod="openshift-marketplace/certified-operators-t9ktd" Oct 08 13:01:29 crc kubenswrapper[4735]: I1008 13:01:29.949382 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rrx2j"] Oct 08 13:01:29 crc kubenswrapper[4735]: I1008 13:01:29.950555 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rrx2j" Oct 08 13:01:29 crc kubenswrapper[4735]: I1008 13:01:29.952301 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 08 13:01:29 crc kubenswrapper[4735]: I1008 13:01:29.956616 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rrx2j"] Oct 08 13:01:30 crc kubenswrapper[4735]: I1008 13:01:30.083339 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-t9ktd" Oct 08 13:01:30 crc kubenswrapper[4735]: I1008 13:01:30.091921 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb750a1e-612a-4b33-b368-33c4a44c12e5-utilities\") pod \"community-operators-rrx2j\" (UID: \"fb750a1e-612a-4b33-b368-33c4a44c12e5\") " pod="openshift-marketplace/community-operators-rrx2j" Oct 08 13:01:30 crc kubenswrapper[4735]: I1008 13:01:30.092031 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlrds\" (UniqueName: \"kubernetes.io/projected/fb750a1e-612a-4b33-b368-33c4a44c12e5-kube-api-access-dlrds\") pod \"community-operators-rrx2j\" (UID: \"fb750a1e-612a-4b33-b368-33c4a44c12e5\") " pod="openshift-marketplace/community-operators-rrx2j" Oct 08 13:01:30 crc kubenswrapper[4735]: I1008 13:01:30.092565 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb750a1e-612a-4b33-b368-33c4a44c12e5-catalog-content\") pod \"community-operators-rrx2j\" (UID: \"fb750a1e-612a-4b33-b368-33c4a44c12e5\") " pod="openshift-marketplace/community-operators-rrx2j" Oct 08 13:01:30 crc kubenswrapper[4735]: I1008 13:01:30.193719 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb750a1e-612a-4b33-b368-33c4a44c12e5-utilities\") pod \"community-operators-rrx2j\" (UID: \"fb750a1e-612a-4b33-b368-33c4a44c12e5\") " pod="openshift-marketplace/community-operators-rrx2j" Oct 08 13:01:30 crc kubenswrapper[4735]: I1008 13:01:30.194015 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlrds\" (UniqueName: \"kubernetes.io/projected/fb750a1e-612a-4b33-b368-33c4a44c12e5-kube-api-access-dlrds\") pod 
\"community-operators-rrx2j\" (UID: \"fb750a1e-612a-4b33-b368-33c4a44c12e5\") " pod="openshift-marketplace/community-operators-rrx2j" Oct 08 13:01:30 crc kubenswrapper[4735]: I1008 13:01:30.194048 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb750a1e-612a-4b33-b368-33c4a44c12e5-catalog-content\") pod \"community-operators-rrx2j\" (UID: \"fb750a1e-612a-4b33-b368-33c4a44c12e5\") " pod="openshift-marketplace/community-operators-rrx2j" Oct 08 13:01:30 crc kubenswrapper[4735]: I1008 13:01:30.194121 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb750a1e-612a-4b33-b368-33c4a44c12e5-utilities\") pod \"community-operators-rrx2j\" (UID: \"fb750a1e-612a-4b33-b368-33c4a44c12e5\") " pod="openshift-marketplace/community-operators-rrx2j" Oct 08 13:01:30 crc kubenswrapper[4735]: I1008 13:01:30.194367 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb750a1e-612a-4b33-b368-33c4a44c12e5-catalog-content\") pod \"community-operators-rrx2j\" (UID: \"fb750a1e-612a-4b33-b368-33c4a44c12e5\") " pod="openshift-marketplace/community-operators-rrx2j" Oct 08 13:01:30 crc kubenswrapper[4735]: I1008 13:01:30.214361 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlrds\" (UniqueName: \"kubernetes.io/projected/fb750a1e-612a-4b33-b368-33c4a44c12e5-kube-api-access-dlrds\") pod \"community-operators-rrx2j\" (UID: \"fb750a1e-612a-4b33-b368-33c4a44c12e5\") " pod="openshift-marketplace/community-operators-rrx2j" Oct 08 13:01:30 crc kubenswrapper[4735]: I1008 13:01:30.279232 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t9ktd"] Oct 08 13:01:30 crc kubenswrapper[4735]: W1008 13:01:30.285825 4735 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a2c0c5f_5bfa_40ba_87da_87d1e498abda.slice/crio-c3d0f029ac12a74b3c615071b746d7ae44fa57d327c3e7829027cfe5695319b3 WatchSource:0}: Error finding container c3d0f029ac12a74b3c615071b746d7ae44fa57d327c3e7829027cfe5695319b3: Status 404 returned error can't find the container with id c3d0f029ac12a74b3c615071b746d7ae44fa57d327c3e7829027cfe5695319b3 Oct 08 13:01:30 crc kubenswrapper[4735]: I1008 13:01:30.301171 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rrx2j" Oct 08 13:01:30 crc kubenswrapper[4735]: I1008 13:01:30.479312 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rrx2j"] Oct 08 13:01:30 crc kubenswrapper[4735]: W1008 13:01:30.503560 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb750a1e_612a_4b33_b368_33c4a44c12e5.slice/crio-074044f3d826d642504bafce589168e01b7622aa9aea2c2b8f11cd29a4bf60d8 WatchSource:0}: Error finding container 074044f3d826d642504bafce589168e01b7622aa9aea2c2b8f11cd29a4bf60d8: Status 404 returned error can't find the container with id 074044f3d826d642504bafce589168e01b7622aa9aea2c2b8f11cd29a4bf60d8 Oct 08 13:01:30 crc kubenswrapper[4735]: I1008 13:01:30.530555 4735 generic.go:334] "Generic (PLEG): container finished" podID="8a2c0c5f-5bfa-40ba-87da-87d1e498abda" containerID="60ed4ab1066b008b245ddaa2c00335651f7a9c3ff7c6ba0597e2daa1e41a6ea3" exitCode=0 Oct 08 13:01:30 crc kubenswrapper[4735]: I1008 13:01:30.530626 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t9ktd" event={"ID":"8a2c0c5f-5bfa-40ba-87da-87d1e498abda","Type":"ContainerDied","Data":"60ed4ab1066b008b245ddaa2c00335651f7a9c3ff7c6ba0597e2daa1e41a6ea3"} Oct 08 13:01:30 crc kubenswrapper[4735]: I1008 13:01:30.530653 4735 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-t9ktd" event={"ID":"8a2c0c5f-5bfa-40ba-87da-87d1e498abda","Type":"ContainerStarted","Data":"c3d0f029ac12a74b3c615071b746d7ae44fa57d327c3e7829027cfe5695319b3"} Oct 08 13:01:30 crc kubenswrapper[4735]: I1008 13:01:30.532037 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rrx2j" event={"ID":"fb750a1e-612a-4b33-b368-33c4a44c12e5","Type":"ContainerStarted","Data":"074044f3d826d642504bafce589168e01b7622aa9aea2c2b8f11cd29a4bf60d8"} Oct 08 13:01:31 crc kubenswrapper[4735]: I1008 13:01:31.540220 4735 generic.go:334] "Generic (PLEG): container finished" podID="fb750a1e-612a-4b33-b368-33c4a44c12e5" containerID="d76ab523a40392ebe3940e91adeb77077edd68ad3e6af833fe77f9a807d5c0d5" exitCode=0 Oct 08 13:01:31 crc kubenswrapper[4735]: I1008 13:01:31.540265 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rrx2j" event={"ID":"fb750a1e-612a-4b33-b368-33c4a44c12e5","Type":"ContainerDied","Data":"d76ab523a40392ebe3940e91adeb77077edd68ad3e6af833fe77f9a807d5c0d5"} Oct 08 13:01:32 crc kubenswrapper[4735]: I1008 13:01:32.152178 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kpvmg"] Oct 08 13:01:32 crc kubenswrapper[4735]: I1008 13:01:32.153941 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kpvmg" Oct 08 13:01:32 crc kubenswrapper[4735]: I1008 13:01:32.155867 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 08 13:01:32 crc kubenswrapper[4735]: I1008 13:01:32.177305 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kpvmg"] Oct 08 13:01:32 crc kubenswrapper[4735]: I1008 13:01:32.321212 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eaa99f83-97af-4aa0-b05a-a65a86a75cb6-catalog-content\") pod \"redhat-marketplace-kpvmg\" (UID: \"eaa99f83-97af-4aa0-b05a-a65a86a75cb6\") " pod="openshift-marketplace/redhat-marketplace-kpvmg" Oct 08 13:01:32 crc kubenswrapper[4735]: I1008 13:01:32.321592 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eaa99f83-97af-4aa0-b05a-a65a86a75cb6-utilities\") pod \"redhat-marketplace-kpvmg\" (UID: \"eaa99f83-97af-4aa0-b05a-a65a86a75cb6\") " pod="openshift-marketplace/redhat-marketplace-kpvmg" Oct 08 13:01:32 crc kubenswrapper[4735]: I1008 13:01:32.321654 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qr7rz\" (UniqueName: \"kubernetes.io/projected/eaa99f83-97af-4aa0-b05a-a65a86a75cb6-kube-api-access-qr7rz\") pod \"redhat-marketplace-kpvmg\" (UID: \"eaa99f83-97af-4aa0-b05a-a65a86a75cb6\") " pod="openshift-marketplace/redhat-marketplace-kpvmg" Oct 08 13:01:32 crc kubenswrapper[4735]: I1008 13:01:32.352053 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-md7br"] Oct 08 13:01:32 crc kubenswrapper[4735]: I1008 13:01:32.353561 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-md7br" Oct 08 13:01:32 crc kubenswrapper[4735]: I1008 13:01:32.358392 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 08 13:01:32 crc kubenswrapper[4735]: I1008 13:01:32.362814 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-md7br"] Oct 08 13:01:32 crc kubenswrapper[4735]: I1008 13:01:32.422511 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eaa99f83-97af-4aa0-b05a-a65a86a75cb6-catalog-content\") pod \"redhat-marketplace-kpvmg\" (UID: \"eaa99f83-97af-4aa0-b05a-a65a86a75cb6\") " pod="openshift-marketplace/redhat-marketplace-kpvmg" Oct 08 13:01:32 crc kubenswrapper[4735]: I1008 13:01:32.422554 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ptdc\" (UniqueName: \"kubernetes.io/projected/0f52b3f8-152e-440b-bfe6-67c2f10a1506-kube-api-access-8ptdc\") pod \"redhat-operators-md7br\" (UID: \"0f52b3f8-152e-440b-bfe6-67c2f10a1506\") " pod="openshift-marketplace/redhat-operators-md7br" Oct 08 13:01:32 crc kubenswrapper[4735]: I1008 13:01:32.422576 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f52b3f8-152e-440b-bfe6-67c2f10a1506-utilities\") pod \"redhat-operators-md7br\" (UID: \"0f52b3f8-152e-440b-bfe6-67c2f10a1506\") " pod="openshift-marketplace/redhat-operators-md7br" Oct 08 13:01:32 crc kubenswrapper[4735]: I1008 13:01:32.422594 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eaa99f83-97af-4aa0-b05a-a65a86a75cb6-utilities\") pod \"redhat-marketplace-kpvmg\" (UID: \"eaa99f83-97af-4aa0-b05a-a65a86a75cb6\") " 
pod="openshift-marketplace/redhat-marketplace-kpvmg" Oct 08 13:01:32 crc kubenswrapper[4735]: I1008 13:01:32.422618 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f52b3f8-152e-440b-bfe6-67c2f10a1506-catalog-content\") pod \"redhat-operators-md7br\" (UID: \"0f52b3f8-152e-440b-bfe6-67c2f10a1506\") " pod="openshift-marketplace/redhat-operators-md7br" Oct 08 13:01:32 crc kubenswrapper[4735]: I1008 13:01:32.422636 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qr7rz\" (UniqueName: \"kubernetes.io/projected/eaa99f83-97af-4aa0-b05a-a65a86a75cb6-kube-api-access-qr7rz\") pod \"redhat-marketplace-kpvmg\" (UID: \"eaa99f83-97af-4aa0-b05a-a65a86a75cb6\") " pod="openshift-marketplace/redhat-marketplace-kpvmg" Oct 08 13:01:32 crc kubenswrapper[4735]: I1008 13:01:32.423100 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eaa99f83-97af-4aa0-b05a-a65a86a75cb6-utilities\") pod \"redhat-marketplace-kpvmg\" (UID: \"eaa99f83-97af-4aa0-b05a-a65a86a75cb6\") " pod="openshift-marketplace/redhat-marketplace-kpvmg" Oct 08 13:01:32 crc kubenswrapper[4735]: I1008 13:01:32.424329 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eaa99f83-97af-4aa0-b05a-a65a86a75cb6-catalog-content\") pod \"redhat-marketplace-kpvmg\" (UID: \"eaa99f83-97af-4aa0-b05a-a65a86a75cb6\") " pod="openshift-marketplace/redhat-marketplace-kpvmg" Oct 08 13:01:32 crc kubenswrapper[4735]: I1008 13:01:32.445679 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qr7rz\" (UniqueName: \"kubernetes.io/projected/eaa99f83-97af-4aa0-b05a-a65a86a75cb6-kube-api-access-qr7rz\") pod \"redhat-marketplace-kpvmg\" (UID: \"eaa99f83-97af-4aa0-b05a-a65a86a75cb6\") " 
pod="openshift-marketplace/redhat-marketplace-kpvmg" Oct 08 13:01:32 crc kubenswrapper[4735]: I1008 13:01:32.484402 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kpvmg" Oct 08 13:01:32 crc kubenswrapper[4735]: I1008 13:01:32.523784 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ptdc\" (UniqueName: \"kubernetes.io/projected/0f52b3f8-152e-440b-bfe6-67c2f10a1506-kube-api-access-8ptdc\") pod \"redhat-operators-md7br\" (UID: \"0f52b3f8-152e-440b-bfe6-67c2f10a1506\") " pod="openshift-marketplace/redhat-operators-md7br" Oct 08 13:01:32 crc kubenswrapper[4735]: I1008 13:01:32.523834 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f52b3f8-152e-440b-bfe6-67c2f10a1506-utilities\") pod \"redhat-operators-md7br\" (UID: \"0f52b3f8-152e-440b-bfe6-67c2f10a1506\") " pod="openshift-marketplace/redhat-operators-md7br" Oct 08 13:01:32 crc kubenswrapper[4735]: I1008 13:01:32.523866 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f52b3f8-152e-440b-bfe6-67c2f10a1506-catalog-content\") pod \"redhat-operators-md7br\" (UID: \"0f52b3f8-152e-440b-bfe6-67c2f10a1506\") " pod="openshift-marketplace/redhat-operators-md7br" Oct 08 13:01:32 crc kubenswrapper[4735]: I1008 13:01:32.524322 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f52b3f8-152e-440b-bfe6-67c2f10a1506-catalog-content\") pod \"redhat-operators-md7br\" (UID: \"0f52b3f8-152e-440b-bfe6-67c2f10a1506\") " pod="openshift-marketplace/redhat-operators-md7br" Oct 08 13:01:32 crc kubenswrapper[4735]: I1008 13:01:32.524431 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/0f52b3f8-152e-440b-bfe6-67c2f10a1506-utilities\") pod \"redhat-operators-md7br\" (UID: \"0f52b3f8-152e-440b-bfe6-67c2f10a1506\") " pod="openshift-marketplace/redhat-operators-md7br" Oct 08 13:01:32 crc kubenswrapper[4735]: I1008 13:01:32.540323 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ptdc\" (UniqueName: \"kubernetes.io/projected/0f52b3f8-152e-440b-bfe6-67c2f10a1506-kube-api-access-8ptdc\") pod \"redhat-operators-md7br\" (UID: \"0f52b3f8-152e-440b-bfe6-67c2f10a1506\") " pod="openshift-marketplace/redhat-operators-md7br" Oct 08 13:01:32 crc kubenswrapper[4735]: I1008 13:01:32.546952 4735 generic.go:334] "Generic (PLEG): container finished" podID="8a2c0c5f-5bfa-40ba-87da-87d1e498abda" containerID="b4251c754819e270814867e78454c31f92c7e27895f6e8a14e22ea30600a6b80" exitCode=0 Oct 08 13:01:32 crc kubenswrapper[4735]: I1008 13:01:32.547008 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t9ktd" event={"ID":"8a2c0c5f-5bfa-40ba-87da-87d1e498abda","Type":"ContainerDied","Data":"b4251c754819e270814867e78454c31f92c7e27895f6e8a14e22ea30600a6b80"} Oct 08 13:01:32 crc kubenswrapper[4735]: I1008 13:01:32.552498 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rrx2j" event={"ID":"fb750a1e-612a-4b33-b368-33c4a44c12e5","Type":"ContainerStarted","Data":"97b5c64a903821bb9ae7de34922fb4e39272e02286eac821e267e9743151f044"} Oct 08 13:01:32 crc kubenswrapper[4735]: I1008 13:01:32.677894 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-md7br" Oct 08 13:01:32 crc kubenswrapper[4735]: I1008 13:01:32.773526 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kpvmg"] Oct 08 13:01:32 crc kubenswrapper[4735]: W1008 13:01:32.790464 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeaa99f83_97af_4aa0_b05a_a65a86a75cb6.slice/crio-3abe254a6f5909b7ba91221c8c755e61e0eb2aea1897c2f347e541c1f4948ad3 WatchSource:0}: Error finding container 3abe254a6f5909b7ba91221c8c755e61e0eb2aea1897c2f347e541c1f4948ad3: Status 404 returned error can't find the container with id 3abe254a6f5909b7ba91221c8c755e61e0eb2aea1897c2f347e541c1f4948ad3 Oct 08 13:01:32 crc kubenswrapper[4735]: I1008 13:01:32.853319 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-md7br"] Oct 08 13:01:32 crc kubenswrapper[4735]: W1008 13:01:32.868546 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f52b3f8_152e_440b_bfe6_67c2f10a1506.slice/crio-0a8b163b9e1d53e860adef1a2dce9ace9204513d065b2611d83147baeb6e7a45 WatchSource:0}: Error finding container 0a8b163b9e1d53e860adef1a2dce9ace9204513d065b2611d83147baeb6e7a45: Status 404 returned error can't find the container with id 0a8b163b9e1d53e860adef1a2dce9ace9204513d065b2611d83147baeb6e7a45 Oct 08 13:01:33 crc kubenswrapper[4735]: I1008 13:01:33.562556 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t9ktd" event={"ID":"8a2c0c5f-5bfa-40ba-87da-87d1e498abda","Type":"ContainerStarted","Data":"592974a881a920f93fc2ae822a78e91f2ed80d26102fc3161c898578ca8b1904"} Oct 08 13:01:33 crc kubenswrapper[4735]: I1008 13:01:33.565467 4735 generic.go:334] "Generic (PLEG): container finished" podID="fb750a1e-612a-4b33-b368-33c4a44c12e5" 
containerID="97b5c64a903821bb9ae7de34922fb4e39272e02286eac821e267e9743151f044" exitCode=0 Oct 08 13:01:33 crc kubenswrapper[4735]: I1008 13:01:33.565563 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rrx2j" event={"ID":"fb750a1e-612a-4b33-b368-33c4a44c12e5","Type":"ContainerDied","Data":"97b5c64a903821bb9ae7de34922fb4e39272e02286eac821e267e9743151f044"} Oct 08 13:01:33 crc kubenswrapper[4735]: I1008 13:01:33.565594 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rrx2j" event={"ID":"fb750a1e-612a-4b33-b368-33c4a44c12e5","Type":"ContainerStarted","Data":"bc4faccfe92611b294566b1d5feee879bcda1040b7b719e3cbd1bda61e9bd8b4"} Oct 08 13:01:33 crc kubenswrapper[4735]: I1008 13:01:33.567430 4735 generic.go:334] "Generic (PLEG): container finished" podID="0f52b3f8-152e-440b-bfe6-67c2f10a1506" containerID="97fa8b57e24cc577de12aae9f3ef016e00336c7f5a40d9e6c3ca859ad4107a98" exitCode=0 Oct 08 13:01:33 crc kubenswrapper[4735]: I1008 13:01:33.567508 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-md7br" event={"ID":"0f52b3f8-152e-440b-bfe6-67c2f10a1506","Type":"ContainerDied","Data":"97fa8b57e24cc577de12aae9f3ef016e00336c7f5a40d9e6c3ca859ad4107a98"} Oct 08 13:01:33 crc kubenswrapper[4735]: I1008 13:01:33.567541 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-md7br" event={"ID":"0f52b3f8-152e-440b-bfe6-67c2f10a1506","Type":"ContainerStarted","Data":"0a8b163b9e1d53e860adef1a2dce9ace9204513d065b2611d83147baeb6e7a45"} Oct 08 13:01:33 crc kubenswrapper[4735]: I1008 13:01:33.570684 4735 generic.go:334] "Generic (PLEG): container finished" podID="eaa99f83-97af-4aa0-b05a-a65a86a75cb6" containerID="17feeea0bbf25d194d05525d83fc979eba3f3ee6783254a0b8e379c8a3a8ce68" exitCode=0 Oct 08 13:01:33 crc kubenswrapper[4735]: I1008 13:01:33.570732 4735 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-kpvmg" event={"ID":"eaa99f83-97af-4aa0-b05a-a65a86a75cb6","Type":"ContainerDied","Data":"17feeea0bbf25d194d05525d83fc979eba3f3ee6783254a0b8e379c8a3a8ce68"} Oct 08 13:01:33 crc kubenswrapper[4735]: I1008 13:01:33.570758 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kpvmg" event={"ID":"eaa99f83-97af-4aa0-b05a-a65a86a75cb6","Type":"ContainerStarted","Data":"3abe254a6f5909b7ba91221c8c755e61e0eb2aea1897c2f347e541c1f4948ad3"} Oct 08 13:01:33 crc kubenswrapper[4735]: I1008 13:01:33.595642 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-t9ktd" podStartSLOduration=2.114994596 podStartE2EDuration="4.595602975s" podCreationTimestamp="2025-10-08 13:01:29 +0000 UTC" firstStartedPulling="2025-10-08 13:01:30.53176302 +0000 UTC m=+340.340067564" lastFinishedPulling="2025-10-08 13:01:33.012371399 +0000 UTC m=+342.820675943" observedRunningTime="2025-10-08 13:01:33.585207477 +0000 UTC m=+343.393512011" watchObservedRunningTime="2025-10-08 13:01:33.595602975 +0000 UTC m=+343.403907519" Oct 08 13:01:33 crc kubenswrapper[4735]: I1008 13:01:33.606553 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rrx2j" podStartSLOduration=2.7617316880000002 podStartE2EDuration="4.606536817s" podCreationTimestamp="2025-10-08 13:01:29 +0000 UTC" firstStartedPulling="2025-10-08 13:01:31.543499637 +0000 UTC m=+341.351804181" lastFinishedPulling="2025-10-08 13:01:33.388304766 +0000 UTC m=+343.196609310" observedRunningTime="2025-10-08 13:01:33.602731582 +0000 UTC m=+343.411036146" watchObservedRunningTime="2025-10-08 13:01:33.606536817 +0000 UTC m=+343.414841361" Oct 08 13:01:34 crc kubenswrapper[4735]: I1008 13:01:34.580084 4735 generic.go:334] "Generic (PLEG): container finished" podID="eaa99f83-97af-4aa0-b05a-a65a86a75cb6" 
containerID="7328d82c9189c40ec4a08cd1c2fd336518d856b133b45f305146374f062a2296" exitCode=0 Oct 08 13:01:34 crc kubenswrapper[4735]: I1008 13:01:34.580157 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kpvmg" event={"ID":"eaa99f83-97af-4aa0-b05a-a65a86a75cb6","Type":"ContainerDied","Data":"7328d82c9189c40ec4a08cd1c2fd336518d856b133b45f305146374f062a2296"} Oct 08 13:01:36 crc kubenswrapper[4735]: I1008 13:01:36.593235 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kpvmg" event={"ID":"eaa99f83-97af-4aa0-b05a-a65a86a75cb6","Type":"ContainerStarted","Data":"f4188ef1f1c3530a77c786af30867cf3cec05429a2f2bb71ec06345dc439033f"} Oct 08 13:01:36 crc kubenswrapper[4735]: I1008 13:01:36.595100 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-md7br" event={"ID":"0f52b3f8-152e-440b-bfe6-67c2f10a1506","Type":"ContainerStarted","Data":"99c085b66f55042f247abd23932e9898647ac3b0a43a5ac3b4e9a1bde66e0c85"} Oct 08 13:01:36 crc kubenswrapper[4735]: I1008 13:01:36.618631 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kpvmg" podStartSLOduration=2.424795218 podStartE2EDuration="4.618610308s" podCreationTimestamp="2025-10-08 13:01:32 +0000 UTC" firstStartedPulling="2025-10-08 13:01:33.572104634 +0000 UTC m=+343.380409188" lastFinishedPulling="2025-10-08 13:01:35.765919734 +0000 UTC m=+345.574224278" observedRunningTime="2025-10-08 13:01:36.614558846 +0000 UTC m=+346.422863390" watchObservedRunningTime="2025-10-08 13:01:36.618610308 +0000 UTC m=+346.426914852" Oct 08 13:01:37 crc kubenswrapper[4735]: I1008 13:01:37.600751 4735 generic.go:334] "Generic (PLEG): container finished" podID="0f52b3f8-152e-440b-bfe6-67c2f10a1506" containerID="99c085b66f55042f247abd23932e9898647ac3b0a43a5ac3b4e9a1bde66e0c85" exitCode=0 Oct 08 13:01:37 crc kubenswrapper[4735]: I1008 
13:01:37.600854 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-md7br" event={"ID":"0f52b3f8-152e-440b-bfe6-67c2f10a1506","Type":"ContainerDied","Data":"99c085b66f55042f247abd23932e9898647ac3b0a43a5ac3b4e9a1bde66e0c85"} Oct 08 13:01:38 crc kubenswrapper[4735]: I1008 13:01:38.607574 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-md7br" event={"ID":"0f52b3f8-152e-440b-bfe6-67c2f10a1506","Type":"ContainerStarted","Data":"d95556624a84aae8efce8dc8a476bf99a1084c91855e2d5d97d0ed525faa6fcf"} Oct 08 13:01:38 crc kubenswrapper[4735]: I1008 13:01:38.630000 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-md7br" podStartSLOduration=2.180348512 podStartE2EDuration="6.629981157s" podCreationTimestamp="2025-10-08 13:01:32 +0000 UTC" firstStartedPulling="2025-10-08 13:01:33.569822051 +0000 UTC m=+343.378126595" lastFinishedPulling="2025-10-08 13:01:38.019454696 +0000 UTC m=+347.827759240" observedRunningTime="2025-10-08 13:01:38.628781964 +0000 UTC m=+348.437086508" watchObservedRunningTime="2025-10-08 13:01:38.629981157 +0000 UTC m=+348.438285701" Oct 08 13:01:40 crc kubenswrapper[4735]: I1008 13:01:40.083582 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-t9ktd" Oct 08 13:01:40 crc kubenswrapper[4735]: I1008 13:01:40.083883 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-t9ktd" Oct 08 13:01:40 crc kubenswrapper[4735]: I1008 13:01:40.122344 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-t9ktd" Oct 08 13:01:40 crc kubenswrapper[4735]: I1008 13:01:40.302250 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rrx2j" Oct 08 13:01:40 crc 
kubenswrapper[4735]: I1008 13:01:40.302458 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rrx2j" Oct 08 13:01:40 crc kubenswrapper[4735]: I1008 13:01:40.340581 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rrx2j" Oct 08 13:01:40 crc kubenswrapper[4735]: I1008 13:01:40.659599 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-t9ktd" Oct 08 13:01:40 crc kubenswrapper[4735]: I1008 13:01:40.659663 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rrx2j" Oct 08 13:01:42 crc kubenswrapper[4735]: I1008 13:01:42.485026 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kpvmg" Oct 08 13:01:42 crc kubenswrapper[4735]: I1008 13:01:42.485257 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kpvmg" Oct 08 13:01:42 crc kubenswrapper[4735]: I1008 13:01:42.524318 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kpvmg" Oct 08 13:01:42 crc kubenswrapper[4735]: I1008 13:01:42.664590 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kpvmg" Oct 08 13:01:42 crc kubenswrapper[4735]: I1008 13:01:42.678251 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-md7br" Oct 08 13:01:42 crc kubenswrapper[4735]: I1008 13:01:42.678309 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-md7br" Oct 08 13:01:42 crc kubenswrapper[4735]: I1008 13:01:42.758752 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-marketplace/redhat-operators-md7br" Oct 08 13:01:43 crc kubenswrapper[4735]: I1008 13:01:43.673792 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-md7br" Oct 08 13:01:54 crc kubenswrapper[4735]: I1008 13:01:54.878774 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 13:01:54 crc kubenswrapper[4735]: I1008 13:01:54.879302 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 13:02:24 crc kubenswrapper[4735]: I1008 13:02:24.878601 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 13:02:24 crc kubenswrapper[4735]: I1008 13:02:24.879133 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 13:02:54 crc kubenswrapper[4735]: I1008 13:02:54.878128 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 13:02:54 crc kubenswrapper[4735]: I1008 13:02:54.878806 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 13:02:54 crc kubenswrapper[4735]: I1008 13:02:54.878873 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" Oct 08 13:02:54 crc kubenswrapper[4735]: I1008 13:02:54.879761 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2a7f255e777881316b10062a37c3969614ae340b33847b5020f89127e16dd753"} pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 13:02:54 crc kubenswrapper[4735]: I1008 13:02:54.879845 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" containerID="cri-o://2a7f255e777881316b10062a37c3969614ae340b33847b5020f89127e16dd753" gracePeriod=600 Oct 08 13:02:55 crc kubenswrapper[4735]: I1008 13:02:55.035246 4735 generic.go:334] "Generic (PLEG): container finished" podID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerID="2a7f255e777881316b10062a37c3969614ae340b33847b5020f89127e16dd753" exitCode=0 Oct 08 13:02:55 crc kubenswrapper[4735]: I1008 13:02:55.035300 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" 
event={"ID":"2dfd0474-f64a-4d3b-9990-df182d079b10","Type":"ContainerDied","Data":"2a7f255e777881316b10062a37c3969614ae340b33847b5020f89127e16dd753"} Oct 08 13:02:55 crc kubenswrapper[4735]: I1008 13:02:55.035336 4735 scope.go:117] "RemoveContainer" containerID="7fefb46d9936f914f5667f539f69ddaaf85342cc341db62c3eda312aa07a78d9" Oct 08 13:02:56 crc kubenswrapper[4735]: I1008 13:02:56.044731 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" event={"ID":"2dfd0474-f64a-4d3b-9990-df182d079b10","Type":"ContainerStarted","Data":"173789e0d046afd36eed4db36bb52d30421c87957b91d798bdb18e553a36d54d"} Oct 08 13:04:03 crc kubenswrapper[4735]: I1008 13:04:03.760215 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-b7c66"] Oct 08 13:04:03 crc kubenswrapper[4735]: I1008 13:04:03.761353 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-b7c66" Oct 08 13:04:03 crc kubenswrapper[4735]: I1008 13:04:03.770276 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-b7c66"] Oct 08 13:04:03 crc kubenswrapper[4735]: I1008 13:04:03.928133 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f1c0dd64-8014-4527-b5ad-9f3dcb39c3e9-registry-tls\") pod \"image-registry-66df7c8f76-b7c66\" (UID: \"f1c0dd64-8014-4527-b5ad-9f3dcb39c3e9\") " pod="openshift-image-registry/image-registry-66df7c8f76-b7c66" Oct 08 13:04:03 crc kubenswrapper[4735]: I1008 13:04:03.928189 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5628\" (UniqueName: \"kubernetes.io/projected/f1c0dd64-8014-4527-b5ad-9f3dcb39c3e9-kube-api-access-c5628\") pod \"image-registry-66df7c8f76-b7c66\" (UID: 
\"f1c0dd64-8014-4527-b5ad-9f3dcb39c3e9\") " pod="openshift-image-registry/image-registry-66df7c8f76-b7c66" Oct 08 13:04:03 crc kubenswrapper[4735]: I1008 13:04:03.928221 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f1c0dd64-8014-4527-b5ad-9f3dcb39c3e9-ca-trust-extracted\") pod \"image-registry-66df7c8f76-b7c66\" (UID: \"f1c0dd64-8014-4527-b5ad-9f3dcb39c3e9\") " pod="openshift-image-registry/image-registry-66df7c8f76-b7c66" Oct 08 13:04:03 crc kubenswrapper[4735]: I1008 13:04:03.928278 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-b7c66\" (UID: \"f1c0dd64-8014-4527-b5ad-9f3dcb39c3e9\") " pod="openshift-image-registry/image-registry-66df7c8f76-b7c66" Oct 08 13:04:03 crc kubenswrapper[4735]: I1008 13:04:03.928335 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f1c0dd64-8014-4527-b5ad-9f3dcb39c3e9-registry-certificates\") pod \"image-registry-66df7c8f76-b7c66\" (UID: \"f1c0dd64-8014-4527-b5ad-9f3dcb39c3e9\") " pod="openshift-image-registry/image-registry-66df7c8f76-b7c66" Oct 08 13:04:03 crc kubenswrapper[4735]: I1008 13:04:03.928365 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f1c0dd64-8014-4527-b5ad-9f3dcb39c3e9-installation-pull-secrets\") pod \"image-registry-66df7c8f76-b7c66\" (UID: \"f1c0dd64-8014-4527-b5ad-9f3dcb39c3e9\") " pod="openshift-image-registry/image-registry-66df7c8f76-b7c66" Oct 08 13:04:03 crc kubenswrapper[4735]: I1008 13:04:03.928414 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f1c0dd64-8014-4527-b5ad-9f3dcb39c3e9-bound-sa-token\") pod \"image-registry-66df7c8f76-b7c66\" (UID: \"f1c0dd64-8014-4527-b5ad-9f3dcb39c3e9\") " pod="openshift-image-registry/image-registry-66df7c8f76-b7c66" Oct 08 13:04:03 crc kubenswrapper[4735]: I1008 13:04:03.928441 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f1c0dd64-8014-4527-b5ad-9f3dcb39c3e9-trusted-ca\") pod \"image-registry-66df7c8f76-b7c66\" (UID: \"f1c0dd64-8014-4527-b5ad-9f3dcb39c3e9\") " pod="openshift-image-registry/image-registry-66df7c8f76-b7c66" Oct 08 13:04:03 crc kubenswrapper[4735]: I1008 13:04:03.960701 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-b7c66\" (UID: \"f1c0dd64-8014-4527-b5ad-9f3dcb39c3e9\") " pod="openshift-image-registry/image-registry-66df7c8f76-b7c66" Oct 08 13:04:04 crc kubenswrapper[4735]: I1008 13:04:04.030044 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f1c0dd64-8014-4527-b5ad-9f3dcb39c3e9-registry-certificates\") pod \"image-registry-66df7c8f76-b7c66\" (UID: \"f1c0dd64-8014-4527-b5ad-9f3dcb39c3e9\") " pod="openshift-image-registry/image-registry-66df7c8f76-b7c66" Oct 08 13:04:04 crc kubenswrapper[4735]: I1008 13:04:04.030269 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f1c0dd64-8014-4527-b5ad-9f3dcb39c3e9-installation-pull-secrets\") pod \"image-registry-66df7c8f76-b7c66\" (UID: \"f1c0dd64-8014-4527-b5ad-9f3dcb39c3e9\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-b7c66" Oct 08 13:04:04 crc kubenswrapper[4735]: I1008 13:04:04.030456 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f1c0dd64-8014-4527-b5ad-9f3dcb39c3e9-bound-sa-token\") pod \"image-registry-66df7c8f76-b7c66\" (UID: \"f1c0dd64-8014-4527-b5ad-9f3dcb39c3e9\") " pod="openshift-image-registry/image-registry-66df7c8f76-b7c66" Oct 08 13:04:04 crc kubenswrapper[4735]: I1008 13:04:04.030585 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f1c0dd64-8014-4527-b5ad-9f3dcb39c3e9-trusted-ca\") pod \"image-registry-66df7c8f76-b7c66\" (UID: \"f1c0dd64-8014-4527-b5ad-9f3dcb39c3e9\") " pod="openshift-image-registry/image-registry-66df7c8f76-b7c66" Oct 08 13:04:04 crc kubenswrapper[4735]: I1008 13:04:04.030733 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f1c0dd64-8014-4527-b5ad-9f3dcb39c3e9-registry-tls\") pod \"image-registry-66df7c8f76-b7c66\" (UID: \"f1c0dd64-8014-4527-b5ad-9f3dcb39c3e9\") " pod="openshift-image-registry/image-registry-66df7c8f76-b7c66" Oct 08 13:04:04 crc kubenswrapper[4735]: I1008 13:04:04.030879 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5628\" (UniqueName: \"kubernetes.io/projected/f1c0dd64-8014-4527-b5ad-9f3dcb39c3e9-kube-api-access-c5628\") pod \"image-registry-66df7c8f76-b7c66\" (UID: \"f1c0dd64-8014-4527-b5ad-9f3dcb39c3e9\") " pod="openshift-image-registry/image-registry-66df7c8f76-b7c66" Oct 08 13:04:04 crc kubenswrapper[4735]: I1008 13:04:04.031056 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f1c0dd64-8014-4527-b5ad-9f3dcb39c3e9-ca-trust-extracted\") pod 
\"image-registry-66df7c8f76-b7c66\" (UID: \"f1c0dd64-8014-4527-b5ad-9f3dcb39c3e9\") " pod="openshift-image-registry/image-registry-66df7c8f76-b7c66" Oct 08 13:04:04 crc kubenswrapper[4735]: I1008 13:04:04.031405 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f1c0dd64-8014-4527-b5ad-9f3dcb39c3e9-ca-trust-extracted\") pod \"image-registry-66df7c8f76-b7c66\" (UID: \"f1c0dd64-8014-4527-b5ad-9f3dcb39c3e9\") " pod="openshift-image-registry/image-registry-66df7c8f76-b7c66" Oct 08 13:04:04 crc kubenswrapper[4735]: I1008 13:04:04.031494 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f1c0dd64-8014-4527-b5ad-9f3dcb39c3e9-registry-certificates\") pod \"image-registry-66df7c8f76-b7c66\" (UID: \"f1c0dd64-8014-4527-b5ad-9f3dcb39c3e9\") " pod="openshift-image-registry/image-registry-66df7c8f76-b7c66" Oct 08 13:04:04 crc kubenswrapper[4735]: I1008 13:04:04.032277 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f1c0dd64-8014-4527-b5ad-9f3dcb39c3e9-trusted-ca\") pod \"image-registry-66df7c8f76-b7c66\" (UID: \"f1c0dd64-8014-4527-b5ad-9f3dcb39c3e9\") " pod="openshift-image-registry/image-registry-66df7c8f76-b7c66" Oct 08 13:04:04 crc kubenswrapper[4735]: I1008 13:04:04.042284 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f1c0dd64-8014-4527-b5ad-9f3dcb39c3e9-registry-tls\") pod \"image-registry-66df7c8f76-b7c66\" (UID: \"f1c0dd64-8014-4527-b5ad-9f3dcb39c3e9\") " pod="openshift-image-registry/image-registry-66df7c8f76-b7c66" Oct 08 13:04:04 crc kubenswrapper[4735]: I1008 13:04:04.042359 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/f1c0dd64-8014-4527-b5ad-9f3dcb39c3e9-installation-pull-secrets\") pod \"image-registry-66df7c8f76-b7c66\" (UID: \"f1c0dd64-8014-4527-b5ad-9f3dcb39c3e9\") " pod="openshift-image-registry/image-registry-66df7c8f76-b7c66" Oct 08 13:04:04 crc kubenswrapper[4735]: I1008 13:04:04.050227 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5628\" (UniqueName: \"kubernetes.io/projected/f1c0dd64-8014-4527-b5ad-9f3dcb39c3e9-kube-api-access-c5628\") pod \"image-registry-66df7c8f76-b7c66\" (UID: \"f1c0dd64-8014-4527-b5ad-9f3dcb39c3e9\") " pod="openshift-image-registry/image-registry-66df7c8f76-b7c66" Oct 08 13:04:04 crc kubenswrapper[4735]: I1008 13:04:04.050299 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f1c0dd64-8014-4527-b5ad-9f3dcb39c3e9-bound-sa-token\") pod \"image-registry-66df7c8f76-b7c66\" (UID: \"f1c0dd64-8014-4527-b5ad-9f3dcb39c3e9\") " pod="openshift-image-registry/image-registry-66df7c8f76-b7c66" Oct 08 13:04:04 crc kubenswrapper[4735]: I1008 13:04:04.077273 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-b7c66" Oct 08 13:04:04 crc kubenswrapper[4735]: I1008 13:04:04.258535 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-b7c66"] Oct 08 13:04:04 crc kubenswrapper[4735]: I1008 13:04:04.441949 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-b7c66" event={"ID":"f1c0dd64-8014-4527-b5ad-9f3dcb39c3e9","Type":"ContainerStarted","Data":"5c2fd75dbf531105fccd0c5b991d6b8a1b921af6ff8658641ba472a0fefaac30"} Oct 08 13:04:04 crc kubenswrapper[4735]: I1008 13:04:04.442257 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-b7c66" Oct 08 13:04:04 crc kubenswrapper[4735]: I1008 13:04:04.442271 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-b7c66" event={"ID":"f1c0dd64-8014-4527-b5ad-9f3dcb39c3e9","Type":"ContainerStarted","Data":"189c42ec780d8c3bd9c3ba56d353c297868fff01082f05c9f9b7675ab2ae87e2"} Oct 08 13:04:04 crc kubenswrapper[4735]: I1008 13:04:04.461953 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-b7c66" podStartSLOduration=1.461935436 podStartE2EDuration="1.461935436s" podCreationTimestamp="2025-10-08 13:04:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 13:04:04.460586611 +0000 UTC m=+494.268891185" watchObservedRunningTime="2025-10-08 13:04:04.461935436 +0000 UTC m=+494.270239980" Oct 08 13:04:24 crc kubenswrapper[4735]: I1008 13:04:24.082645 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-b7c66" Oct 08 13:04:24 crc kubenswrapper[4735]: I1008 13:04:24.136116 4735 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-vgv9c"] Oct 08 13:04:49 crc kubenswrapper[4735]: I1008 13:04:49.176518 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c" podUID="b1f96523-d3ab-4768-8de6-c901e255ae74" containerName="registry" containerID="cri-o://6e161dbcde96b1affcccd97f99c8979e26548f67f68d6f4c0192fcd02d660bf1" gracePeriod=30 Oct 08 13:04:49 crc kubenswrapper[4735]: I1008 13:04:49.580053 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c" Oct 08 13:04:49 crc kubenswrapper[4735]: I1008 13:04:49.592697 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b1f96523-d3ab-4768-8de6-c901e255ae74-trusted-ca\") pod \"b1f96523-d3ab-4768-8de6-c901e255ae74\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " Oct 08 13:04:49 crc kubenswrapper[4735]: I1008 13:04:49.592737 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b1f96523-d3ab-4768-8de6-c901e255ae74-ca-trust-extracted\") pod \"b1f96523-d3ab-4768-8de6-c901e255ae74\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " Oct 08 13:04:49 crc kubenswrapper[4735]: I1008 13:04:49.592790 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxmjk\" (UniqueName: \"kubernetes.io/projected/b1f96523-d3ab-4768-8de6-c901e255ae74-kube-api-access-hxmjk\") pod \"b1f96523-d3ab-4768-8de6-c901e255ae74\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " Oct 08 13:04:49 crc kubenswrapper[4735]: I1008 13:04:49.592825 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/b1f96523-d3ab-4768-8de6-c901e255ae74-installation-pull-secrets\") pod \"b1f96523-d3ab-4768-8de6-c901e255ae74\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " Oct 08 13:04:49 crc kubenswrapper[4735]: I1008 13:04:49.592854 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b1f96523-d3ab-4768-8de6-c901e255ae74-registry-tls\") pod \"b1f96523-d3ab-4768-8de6-c901e255ae74\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " Oct 08 13:04:49 crc kubenswrapper[4735]: I1008 13:04:49.592882 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b1f96523-d3ab-4768-8de6-c901e255ae74-registry-certificates\") pod \"b1f96523-d3ab-4768-8de6-c901e255ae74\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " Oct 08 13:04:49 crc kubenswrapper[4735]: I1008 13:04:49.592908 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b1f96523-d3ab-4768-8de6-c901e255ae74-bound-sa-token\") pod \"b1f96523-d3ab-4768-8de6-c901e255ae74\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " Oct 08 13:04:49 crc kubenswrapper[4735]: I1008 13:04:49.593659 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"b1f96523-d3ab-4768-8de6-c901e255ae74\" (UID: \"b1f96523-d3ab-4768-8de6-c901e255ae74\") " Oct 08 13:04:49 crc kubenswrapper[4735]: I1008 13:04:49.593665 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1f96523-d3ab-4768-8de6-c901e255ae74-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "b1f96523-d3ab-4768-8de6-c901e255ae74" (UID: "b1f96523-d3ab-4768-8de6-c901e255ae74"). 
InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:04:49 crc kubenswrapper[4735]: I1008 13:04:49.594201 4735 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b1f96523-d3ab-4768-8de6-c901e255ae74-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 08 13:04:49 crc kubenswrapper[4735]: I1008 13:04:49.594291 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1f96523-d3ab-4768-8de6-c901e255ae74-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "b1f96523-d3ab-4768-8de6-c901e255ae74" (UID: "b1f96523-d3ab-4768-8de6-c901e255ae74"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:04:49 crc kubenswrapper[4735]: I1008 13:04:49.602042 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1f96523-d3ab-4768-8de6-c901e255ae74-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "b1f96523-d3ab-4768-8de6-c901e255ae74" (UID: "b1f96523-d3ab-4768-8de6-c901e255ae74"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:04:49 crc kubenswrapper[4735]: I1008 13:04:49.603970 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1f96523-d3ab-4768-8de6-c901e255ae74-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "b1f96523-d3ab-4768-8de6-c901e255ae74" (UID: "b1f96523-d3ab-4768-8de6-c901e255ae74"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:04:49 crc kubenswrapper[4735]: I1008 13:04:49.607933 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1f96523-d3ab-4768-8de6-c901e255ae74-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "b1f96523-d3ab-4768-8de6-c901e255ae74" (UID: "b1f96523-d3ab-4768-8de6-c901e255ae74"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:04:49 crc kubenswrapper[4735]: I1008 13:04:49.608876 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "b1f96523-d3ab-4768-8de6-c901e255ae74" (UID: "b1f96523-d3ab-4768-8de6-c901e255ae74"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 08 13:04:49 crc kubenswrapper[4735]: I1008 13:04:49.609176 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1f96523-d3ab-4768-8de6-c901e255ae74-kube-api-access-hxmjk" (OuterVolumeSpecName: "kube-api-access-hxmjk") pod "b1f96523-d3ab-4768-8de6-c901e255ae74" (UID: "b1f96523-d3ab-4768-8de6-c901e255ae74"). InnerVolumeSpecName "kube-api-access-hxmjk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:04:49 crc kubenswrapper[4735]: I1008 13:04:49.619493 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1f96523-d3ab-4768-8de6-c901e255ae74-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "b1f96523-d3ab-4768-8de6-c901e255ae74" (UID: "b1f96523-d3ab-4768-8de6-c901e255ae74"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:04:49 crc kubenswrapper[4735]: I1008 13:04:49.695208 4735 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b1f96523-d3ab-4768-8de6-c901e255ae74-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 08 13:04:49 crc kubenswrapper[4735]: I1008 13:04:49.695244 4735 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b1f96523-d3ab-4768-8de6-c901e255ae74-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 08 13:04:49 crc kubenswrapper[4735]: I1008 13:04:49.695255 4735 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b1f96523-d3ab-4768-8de6-c901e255ae74-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 08 13:04:49 crc kubenswrapper[4735]: I1008 13:04:49.695263 4735 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b1f96523-d3ab-4768-8de6-c901e255ae74-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 08 13:04:49 crc kubenswrapper[4735]: I1008 13:04:49.695272 4735 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b1f96523-d3ab-4768-8de6-c901e255ae74-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 08 13:04:49 crc kubenswrapper[4735]: I1008 13:04:49.695281 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxmjk\" (UniqueName: \"kubernetes.io/projected/b1f96523-d3ab-4768-8de6-c901e255ae74-kube-api-access-hxmjk\") on node \"crc\" DevicePath \"\"" Oct 08 13:04:49 crc kubenswrapper[4735]: I1008 13:04:49.722689 4735 generic.go:334] "Generic (PLEG): container finished" podID="b1f96523-d3ab-4768-8de6-c901e255ae74" containerID="6e161dbcde96b1affcccd97f99c8979e26548f67f68d6f4c0192fcd02d660bf1" exitCode=0 Oct 08 13:04:49 crc 
kubenswrapper[4735]: I1008 13:04:49.722730 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c" event={"ID":"b1f96523-d3ab-4768-8de6-c901e255ae74","Type":"ContainerDied","Data":"6e161dbcde96b1affcccd97f99c8979e26548f67f68d6f4c0192fcd02d660bf1"}
Oct 08 13:04:49 crc kubenswrapper[4735]: I1008 13:04:49.722743 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c"
Oct 08 13:04:49 crc kubenswrapper[4735]: I1008 13:04:49.722761 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-vgv9c" event={"ID":"b1f96523-d3ab-4768-8de6-c901e255ae74","Type":"ContainerDied","Data":"247ba771553601d28bf07ceade29021d89b60f3a0f95a6c7fad63d09f5ef1c65"}
Oct 08 13:04:49 crc kubenswrapper[4735]: I1008 13:04:49.722781 4735 scope.go:117] "RemoveContainer" containerID="6e161dbcde96b1affcccd97f99c8979e26548f67f68d6f4c0192fcd02d660bf1"
Oct 08 13:04:49 crc kubenswrapper[4735]: I1008 13:04:49.740468 4735 scope.go:117] "RemoveContainer" containerID="6e161dbcde96b1affcccd97f99c8979e26548f67f68d6f4c0192fcd02d660bf1"
Oct 08 13:04:49 crc kubenswrapper[4735]: E1008 13:04:49.740970 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e161dbcde96b1affcccd97f99c8979e26548f67f68d6f4c0192fcd02d660bf1\": container with ID starting with 6e161dbcde96b1affcccd97f99c8979e26548f67f68d6f4c0192fcd02d660bf1 not found: ID does not exist" containerID="6e161dbcde96b1affcccd97f99c8979e26548f67f68d6f4c0192fcd02d660bf1"
Oct 08 13:04:49 crc kubenswrapper[4735]: I1008 13:04:49.741013 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e161dbcde96b1affcccd97f99c8979e26548f67f68d6f4c0192fcd02d660bf1"} err="failed to get container status \"6e161dbcde96b1affcccd97f99c8979e26548f67f68d6f4c0192fcd02d660bf1\": rpc error: code = NotFound desc = could not find container \"6e161dbcde96b1affcccd97f99c8979e26548f67f68d6f4c0192fcd02d660bf1\": container with ID starting with 6e161dbcde96b1affcccd97f99c8979e26548f67f68d6f4c0192fcd02d660bf1 not found: ID does not exist"
Oct 08 13:04:49 crc kubenswrapper[4735]: I1008 13:04:49.751555 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-vgv9c"]
Oct 08 13:04:49 crc kubenswrapper[4735]: I1008 13:04:49.752883 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-vgv9c"]
Oct 08 13:04:50 crc kubenswrapper[4735]: I1008 13:04:50.905095 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1f96523-d3ab-4768-8de6-c901e255ae74" path="/var/lib/kubelet/pods/b1f96523-d3ab-4768-8de6-c901e255ae74/volumes"
Oct 08 13:05:24 crc kubenswrapper[4735]: I1008 13:05:24.879262 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 08 13:05:24 crc kubenswrapper[4735]: I1008 13:05:24.880035 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 08 13:05:54 crc kubenswrapper[4735]: I1008 13:05:54.878300 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 08 13:05:54 crc kubenswrapper[4735]: I1008 13:05:54.879162 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 08 13:06:24 crc kubenswrapper[4735]: I1008 13:06:24.878266 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 08 13:06:24 crc kubenswrapper[4735]: I1008 13:06:24.878790 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 08 13:06:24 crc kubenswrapper[4735]: I1008 13:06:24.878836 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd"
Oct 08 13:06:24 crc kubenswrapper[4735]: I1008 13:06:24.879258 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"173789e0d046afd36eed4db36bb52d30421c87957b91d798bdb18e553a36d54d"} pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 08 13:06:24 crc kubenswrapper[4735]: I1008 13:06:24.879306 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" containerID="cri-o://173789e0d046afd36eed4db36bb52d30421c87957b91d798bdb18e553a36d54d" gracePeriod=600
Oct 08 13:06:25 crc kubenswrapper[4735]: I1008 13:06:25.306974 4735 generic.go:334] "Generic (PLEG): container finished" podID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerID="173789e0d046afd36eed4db36bb52d30421c87957b91d798bdb18e553a36d54d" exitCode=0
Oct 08 13:06:25 crc kubenswrapper[4735]: I1008 13:06:25.307066 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" event={"ID":"2dfd0474-f64a-4d3b-9990-df182d079b10","Type":"ContainerDied","Data":"173789e0d046afd36eed4db36bb52d30421c87957b91d798bdb18e553a36d54d"}
Oct 08 13:06:25 crc kubenswrapper[4735]: I1008 13:06:25.307324 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" event={"ID":"2dfd0474-f64a-4d3b-9990-df182d079b10","Type":"ContainerStarted","Data":"33eef7770cb538234e6d07e0c65f9624f3726c56e48ea2376d339214ceee54ee"}
Oct 08 13:06:25 crc kubenswrapper[4735]: I1008 13:06:25.307347 4735 scope.go:117] "RemoveContainer" containerID="2a7f255e777881316b10062a37c3969614ae340b33847b5020f89127e16dd753"
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.446847 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-wckcs"]
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.447876 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" podUID="c4653370-cd31-4487-8b63-50b145f55853" containerName="ovn-controller" containerID="cri-o://cc88f8b567ef0f69f9df25a237c40b32bbd518680da75fe3cabd546713423bf2" gracePeriod=30
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.448101 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" podUID="c4653370-cd31-4487-8b63-50b145f55853" containerName="northd" containerID="cri-o://58cc5c76f1748baef586618dd49d3392757ef32e4f28a019979f84366e8879c4" gracePeriod=30
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.448253 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" podUID="c4653370-cd31-4487-8b63-50b145f55853" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://f1fc9ee7ab9490b5462704528e6923ee420e0c9f3c0ed22e7041fedc8dfdfeb3" gracePeriod=30
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.448306 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" podUID="c4653370-cd31-4487-8b63-50b145f55853" containerName="kube-rbac-proxy-node" containerID="cri-o://b6da8f9945b0943287cecc7782253cd88835d3a5a823ae5e2107423b9bdc4e42" gracePeriod=30
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.448349 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" podUID="c4653370-cd31-4487-8b63-50b145f55853" containerName="ovn-acl-logging" containerID="cri-o://cb1bc51dca9b3bf6c136fe146aa422fa40eb512385617607f682723053988f7b" gracePeriod=30
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.448370 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" podUID="c4653370-cd31-4487-8b63-50b145f55853" containerName="sbdb" containerID="cri-o://1758c6a3e050fee720b9cae06332bc1d34000d2a61aa27470c483dc8757a5546" gracePeriod=30
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.448434 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" podUID="c4653370-cd31-4487-8b63-50b145f55853" containerName="nbdb" containerID="cri-o://492da61345c2220eef06ac2e1979e5402ebb8deb0e3747bb3620ecccd4e83a3b" gracePeriod=30
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.481388 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" podUID="c4653370-cd31-4487-8b63-50b145f55853" containerName="ovnkube-controller" containerID="cri-o://74742f2f31c02712067b32f11d523e62c5db7d107e1c74fc1df5c218f22d92f8" gracePeriod=30
Oct 08 13:08:08 crc kubenswrapper[4735]: E1008 13:08:08.724996 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 74742f2f31c02712067b32f11d523e62c5db7d107e1c74fc1df5c218f22d92f8 is running failed: container process not found" containerID="74742f2f31c02712067b32f11d523e62c5db7d107e1c74fc1df5c218f22d92f8" cmd=["/bin/bash","-c","#!/bin/bash\ntest -f /etc/cni/net.d/10-ovn-kubernetes.conf\n"]
Oct 08 13:08:08 crc kubenswrapper[4735]: E1008 13:08:08.725700 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 74742f2f31c02712067b32f11d523e62c5db7d107e1c74fc1df5c218f22d92f8 is running failed: container process not found" containerID="74742f2f31c02712067b32f11d523e62c5db7d107e1c74fc1df5c218f22d92f8" cmd=["/bin/bash","-c","#!/bin/bash\ntest -f /etc/cni/net.d/10-ovn-kubernetes.conf\n"]
Oct 08 13:08:08 crc kubenswrapper[4735]: E1008 13:08:08.726014 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 74742f2f31c02712067b32f11d523e62c5db7d107e1c74fc1df5c218f22d92f8 is running failed: container process not found" containerID="74742f2f31c02712067b32f11d523e62c5db7d107e1c74fc1df5c218f22d92f8" cmd=["/bin/bash","-c","#!/bin/bash\ntest -f /etc/cni/net.d/10-ovn-kubernetes.conf\n"]
Oct 08 13:08:08 crc kubenswrapper[4735]: E1008 13:08:08.726096 4735 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 74742f2f31c02712067b32f11d523e62c5db7d107e1c74fc1df5c218f22d92f8 is running failed: container process not found" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" podUID="c4653370-cd31-4487-8b63-50b145f55853" containerName="ovnkube-controller"
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.764749 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wckcs_c4653370-cd31-4487-8b63-50b145f55853/ovnkube-controller/3.log"
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.766617 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wckcs_c4653370-cd31-4487-8b63-50b145f55853/ovn-acl-logging/0.log"
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.767099 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wckcs_c4653370-cd31-4487-8b63-50b145f55853/ovn-controller/0.log"
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.768183 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs"
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.817849 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-m42qm"]
Oct 08 13:08:08 crc kubenswrapper[4735]: E1008 13:08:08.818268 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4653370-cd31-4487-8b63-50b145f55853" containerName="ovnkube-controller"
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.818351 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4653370-cd31-4487-8b63-50b145f55853" containerName="ovnkube-controller"
Oct 08 13:08:08 crc kubenswrapper[4735]: E1008 13:08:08.818423 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4653370-cd31-4487-8b63-50b145f55853" containerName="nbdb"
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.818505 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4653370-cd31-4487-8b63-50b145f55853" containerName="nbdb"
Oct 08 13:08:08 crc kubenswrapper[4735]: E1008 13:08:08.818578 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4653370-cd31-4487-8b63-50b145f55853" containerName="sbdb"
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.818649 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4653370-cd31-4487-8b63-50b145f55853" containerName="sbdb"
Oct 08 13:08:08 crc kubenswrapper[4735]: E1008 13:08:08.818716 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4653370-cd31-4487-8b63-50b145f55853" containerName="ovnkube-controller"
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.818775 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4653370-cd31-4487-8b63-50b145f55853" containerName="ovnkube-controller"
Oct 08 13:08:08 crc kubenswrapper[4735]: E1008 13:08:08.818857 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4653370-cd31-4487-8b63-50b145f55853" containerName="ovnkube-controller"
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.818922 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4653370-cd31-4487-8b63-50b145f55853" containerName="ovnkube-controller"
Oct 08 13:08:08 crc kubenswrapper[4735]: E1008 13:08:08.818989 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4653370-cd31-4487-8b63-50b145f55853" containerName="ovn-controller"
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.819047 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4653370-cd31-4487-8b63-50b145f55853" containerName="ovn-controller"
Oct 08 13:08:08 crc kubenswrapper[4735]: E1008 13:08:08.819109 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4653370-cd31-4487-8b63-50b145f55853" containerName="kube-rbac-proxy-ovn-metrics"
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.819171 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4653370-cd31-4487-8b63-50b145f55853" containerName="kube-rbac-proxy-ovn-metrics"
Oct 08 13:08:08 crc kubenswrapper[4735]: E1008 13:08:08.819316 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1f96523-d3ab-4768-8de6-c901e255ae74" containerName="registry"
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.819384 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1f96523-d3ab-4768-8de6-c901e255ae74" containerName="registry"
Oct 08 13:08:08 crc kubenswrapper[4735]: E1008 13:08:08.819447 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4653370-cd31-4487-8b63-50b145f55853" containerName="kubecfg-setup"
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.819544 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4653370-cd31-4487-8b63-50b145f55853" containerName="kubecfg-setup"
Oct 08 13:08:08 crc kubenswrapper[4735]: E1008 13:08:08.819614 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4653370-cd31-4487-8b63-50b145f55853" containerName="ovn-acl-logging"
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.819679 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4653370-cd31-4487-8b63-50b145f55853" containerName="ovn-acl-logging"
Oct 08 13:08:08 crc kubenswrapper[4735]: E1008 13:08:08.819742 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4653370-cd31-4487-8b63-50b145f55853" containerName="northd"
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.819802 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4653370-cd31-4487-8b63-50b145f55853" containerName="northd"
Oct 08 13:08:08 crc kubenswrapper[4735]: E1008 13:08:08.819862 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4653370-cd31-4487-8b63-50b145f55853" containerName="kube-rbac-proxy-node"
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.819929 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4653370-cd31-4487-8b63-50b145f55853" containerName="kube-rbac-proxy-node"
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.820092 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4653370-cd31-4487-8b63-50b145f55853" containerName="ovnkube-controller"
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.820162 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1f96523-d3ab-4768-8de6-c901e255ae74" containerName="registry"
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.820224 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4653370-cd31-4487-8b63-50b145f55853" containerName="northd"
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.820293 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4653370-cd31-4487-8b63-50b145f55853" containerName="sbdb"
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.820353 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4653370-cd31-4487-8b63-50b145f55853" containerName="ovnkube-controller"
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.820423 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4653370-cd31-4487-8b63-50b145f55853" containerName="nbdb"
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.820513 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4653370-cd31-4487-8b63-50b145f55853" containerName="ovnkube-controller"
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.820589 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4653370-cd31-4487-8b63-50b145f55853" containerName="kube-rbac-proxy-node"
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.820652 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4653370-cd31-4487-8b63-50b145f55853" containerName="ovn-controller"
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.820721 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4653370-cd31-4487-8b63-50b145f55853" containerName="ovn-acl-logging"
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.820783 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4653370-cd31-4487-8b63-50b145f55853" containerName="kube-rbac-proxy-ovn-metrics"
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.820852 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4653370-cd31-4487-8b63-50b145f55853" containerName="ovnkube-controller"
Oct 08 13:08:08 crc kubenswrapper[4735]: E1008 13:08:08.821014 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4653370-cd31-4487-8b63-50b145f55853" containerName="ovnkube-controller"
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.821074 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4653370-cd31-4487-8b63-50b145f55853" containerName="ovnkube-controller"
Oct 08 13:08:08 crc kubenswrapper[4735]: E1008 13:08:08.821137 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4653370-cd31-4487-8b63-50b145f55853" containerName="ovnkube-controller"
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.821231 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4653370-cd31-4487-8b63-50b145f55853" containerName="ovnkube-controller"
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.821397 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4653370-cd31-4487-8b63-50b145f55853" containerName="ovnkube-controller"
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.823220 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-m42qm"
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.896772 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c4653370-cd31-4487-8b63-50b145f55853-env-overrides\") pod \"c4653370-cd31-4487-8b63-50b145f55853\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") "
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.896825 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-host-var-lib-cni-networks-ovn-kubernetes\") pod \"c4653370-cd31-4487-8b63-50b145f55853\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") "
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.896846 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-host-kubelet\") pod \"c4653370-cd31-4487-8b63-50b145f55853\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") "
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.896867 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-var-lib-openvswitch\") pod \"c4653370-cd31-4487-8b63-50b145f55853\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") "
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.896893 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-host-run-netns\") pod \"c4653370-cd31-4487-8b63-50b145f55853\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") "
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.896930 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-node-log\") pod \"c4653370-cd31-4487-8b63-50b145f55853\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") "
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.896937 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "c4653370-cd31-4487-8b63-50b145f55853" (UID: "c4653370-cd31-4487-8b63-50b145f55853"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.896956 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-run-openvswitch\") pod \"c4653370-cd31-4487-8b63-50b145f55853\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") "
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.896980 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-host-run-ovn-kubernetes\") pod \"c4653370-cd31-4487-8b63-50b145f55853\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") "
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.896998 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-run-ovn\") pod \"c4653370-cd31-4487-8b63-50b145f55853\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") "
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.897017 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-host-cni-netd\") pod \"c4653370-cd31-4487-8b63-50b145f55853\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") "
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.896979 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "c4653370-cd31-4487-8b63-50b145f55853" (UID: "c4653370-cd31-4487-8b63-50b145f55853"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.896997 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "c4653370-cd31-4487-8b63-50b145f55853" (UID: "c4653370-cd31-4487-8b63-50b145f55853"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.897038 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-systemd-units\") pod \"c4653370-cd31-4487-8b63-50b145f55853\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") "
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.897055 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-etc-openvswitch\") pod \"c4653370-cd31-4487-8b63-50b145f55853\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") "
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.897073 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-host-slash\") pod \"c4653370-cd31-4487-8b63-50b145f55853\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") "
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.897020 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-node-log" (OuterVolumeSpecName: "node-log") pod "c4653370-cd31-4487-8b63-50b145f55853" (UID: "c4653370-cd31-4487-8b63-50b145f55853"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.897073 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "c4653370-cd31-4487-8b63-50b145f55853" (UID: "c4653370-cd31-4487-8b63-50b145f55853"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.897144 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "c4653370-cd31-4487-8b63-50b145f55853" (UID: "c4653370-cd31-4487-8b63-50b145f55853"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.897153 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "c4653370-cd31-4487-8b63-50b145f55853" (UID: "c4653370-cd31-4487-8b63-50b145f55853"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.897056 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "c4653370-cd31-4487-8b63-50b145f55853" (UID: "c4653370-cd31-4487-8b63-50b145f55853"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.897101 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hmnq\" (UniqueName: \"kubernetes.io/projected/c4653370-cd31-4487-8b63-50b145f55853-kube-api-access-9hmnq\") pod \"c4653370-cd31-4487-8b63-50b145f55853\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") "
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.897233 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-host-cni-bin\") pod \"c4653370-cd31-4487-8b63-50b145f55853\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") "
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.897251 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-run-systemd\") pod \"c4653370-cd31-4487-8b63-50b145f55853\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") "
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.897269 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-log-socket\") pod \"c4653370-cd31-4487-8b63-50b145f55853\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") "
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.897174 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-host-slash" (OuterVolumeSpecName: "host-slash") pod "c4653370-cd31-4487-8b63-50b145f55853" (UID: "c4653370-cd31-4487-8b63-50b145f55853"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.897079 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "c4653370-cd31-4487-8b63-50b145f55853" (UID: "c4653370-cd31-4487-8b63-50b145f55853"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.897071 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "c4653370-cd31-4487-8b63-50b145f55853" (UID: "c4653370-cd31-4487-8b63-50b145f55853"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.897337 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c4653370-cd31-4487-8b63-50b145f55853-ovnkube-config\") pod \"c4653370-cd31-4487-8b63-50b145f55853\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") "
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.897095 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "c4653370-cd31-4487-8b63-50b145f55853" (UID: "c4653370-cd31-4487-8b63-50b145f55853"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.897274 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "c4653370-cd31-4487-8b63-50b145f55853" (UID: "c4653370-cd31-4487-8b63-50b145f55853"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.897279 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4653370-cd31-4487-8b63-50b145f55853-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "c4653370-cd31-4487-8b63-50b145f55853" (UID: "c4653370-cd31-4487-8b63-50b145f55853"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.897407 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c4653370-cd31-4487-8b63-50b145f55853-ovnkube-script-lib\") pod \"c4653370-cd31-4487-8b63-50b145f55853\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") "
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.897431 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c4653370-cd31-4487-8b63-50b145f55853-ovn-node-metrics-cert\") pod \"c4653370-cd31-4487-8b63-50b145f55853\" (UID: \"c4653370-cd31-4487-8b63-50b145f55853\") "
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.897399 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-log-socket" (OuterVolumeSpecName: "log-socket") pod "c4653370-cd31-4487-8b63-50b145f55853" (UID: "c4653370-cd31-4487-8b63-50b145f55853"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.897846 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4653370-cd31-4487-8b63-50b145f55853-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "c4653370-cd31-4487-8b63-50b145f55853" (UID: "c4653370-cd31-4487-8b63-50b145f55853"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.898018 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4653370-cd31-4487-8b63-50b145f55853-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "c4653370-cd31-4487-8b63-50b145f55853" (UID: "c4653370-cd31-4487-8b63-50b145f55853"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.898184 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d9c633e5-ed09-4ded-91e4-496da6dd2b5e-host-slash\") pod \"ovnkube-node-m42qm\" (UID: \"d9c633e5-ed09-4ded-91e4-496da6dd2b5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-m42qm"
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.898213 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d9c633e5-ed09-4ded-91e4-496da6dd2b5e-host-kubelet\") pod \"ovnkube-node-m42qm\" (UID: \"d9c633e5-ed09-4ded-91e4-496da6dd2b5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-m42qm"
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.898244 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d9c633e5-ed09-4ded-91e4-496da6dd2b5e-systemd-units\") pod \"ovnkube-node-m42qm\" (UID: \"d9c633e5-ed09-4ded-91e4-496da6dd2b5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-m42qm"
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.898432 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d9c633e5-ed09-4ded-91e4-496da6dd2b5e-host-run-netns\") pod \"ovnkube-node-m42qm\" (UID: \"d9c633e5-ed09-4ded-91e4-496da6dd2b5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-m42qm"
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.898523 4735 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c4653370-cd31-4487-8b63-50b145f55853-env-overrides\") on node \"crc\" DevicePath \"\""
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.898543 4735 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.898561 4735 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-host-kubelet\") on node \"crc\" DevicePath \"\""
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.898575 4735 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-var-lib-openvswitch\") on node \"crc\" DevicePath \"\""
Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.898590 4735 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-host-run-netns\") on node \"crc\" DevicePath 
\"\"" Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.898604 4735 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-node-log\") on node \"crc\" DevicePath \"\"" Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.898621 4735 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-run-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.898636 4735 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.898652 4735 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.898667 4735 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-host-cni-netd\") on node \"crc\" DevicePath \"\"" Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.898682 4735 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-systemd-units\") on node \"crc\" DevicePath \"\"" Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.898702 4735 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.898716 4735 reconciler_common.go:293] "Volume detached for volume 
\"host-slash\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-host-slash\") on node \"crc\" DevicePath \"\"" Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.898730 4735 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-host-cni-bin\") on node \"crc\" DevicePath \"\"" Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.898752 4735 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-log-socket\") on node \"crc\" DevicePath \"\"" Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.898783 4735 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c4653370-cd31-4487-8b63-50b145f55853-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.898904 4735 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c4653370-cd31-4487-8b63-50b145f55853-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.902473 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4653370-cd31-4487-8b63-50b145f55853-kube-api-access-9hmnq" (OuterVolumeSpecName: "kube-api-access-9hmnq") pod "c4653370-cd31-4487-8b63-50b145f55853" (UID: "c4653370-cd31-4487-8b63-50b145f55853"). InnerVolumeSpecName "kube-api-access-9hmnq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.903220 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4653370-cd31-4487-8b63-50b145f55853-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "c4653370-cd31-4487-8b63-50b145f55853" (UID: "c4653370-cd31-4487-8b63-50b145f55853"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.919137 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "c4653370-cd31-4487-8b63-50b145f55853" (UID: "c4653370-cd31-4487-8b63-50b145f55853"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.928236 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wckcs_c4653370-cd31-4487-8b63-50b145f55853/ovnkube-controller/3.log" Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.930726 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wckcs_c4653370-cd31-4487-8b63-50b145f55853/ovn-acl-logging/0.log" Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.931635 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wckcs_c4653370-cd31-4487-8b63-50b145f55853/ovn-controller/0.log" Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.932106 4735 generic.go:334] "Generic (PLEG): container finished" podID="c4653370-cd31-4487-8b63-50b145f55853" containerID="74742f2f31c02712067b32f11d523e62c5db7d107e1c74fc1df5c218f22d92f8" exitCode=0 Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.932217 4735 generic.go:334] "Generic (PLEG): 
container finished" podID="c4653370-cd31-4487-8b63-50b145f55853" containerID="1758c6a3e050fee720b9cae06332bc1d34000d2a61aa27470c483dc8757a5546" exitCode=0 Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.932282 4735 generic.go:334] "Generic (PLEG): container finished" podID="c4653370-cd31-4487-8b63-50b145f55853" containerID="492da61345c2220eef06ac2e1979e5402ebb8deb0e3747bb3620ecccd4e83a3b" exitCode=0 Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.932370 4735 generic.go:334] "Generic (PLEG): container finished" podID="c4653370-cd31-4487-8b63-50b145f55853" containerID="58cc5c76f1748baef586618dd49d3392757ef32e4f28a019979f84366e8879c4" exitCode=0 Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.932458 4735 generic.go:334] "Generic (PLEG): container finished" podID="c4653370-cd31-4487-8b63-50b145f55853" containerID="f1fc9ee7ab9490b5462704528e6923ee420e0c9f3c0ed22e7041fedc8dfdfeb3" exitCode=0 Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.932569 4735 generic.go:334] "Generic (PLEG): container finished" podID="c4653370-cd31-4487-8b63-50b145f55853" containerID="b6da8f9945b0943287cecc7782253cd88835d3a5a823ae5e2107423b9bdc4e42" exitCode=0 Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.932658 4735 generic.go:334] "Generic (PLEG): container finished" podID="c4653370-cd31-4487-8b63-50b145f55853" containerID="cb1bc51dca9b3bf6c136fe146aa422fa40eb512385617607f682723053988f7b" exitCode=143 Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.932763 4735 generic.go:334] "Generic (PLEG): container finished" podID="c4653370-cd31-4487-8b63-50b145f55853" containerID="cc88f8b567ef0f69f9df25a237c40b32bbd518680da75fe3cabd546713423bf2" exitCode=143 Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.932195 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.932159 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" event={"ID":"c4653370-cd31-4487-8b63-50b145f55853","Type":"ContainerDied","Data":"74742f2f31c02712067b32f11d523e62c5db7d107e1c74fc1df5c218f22d92f8"} Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.933130 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" event={"ID":"c4653370-cd31-4487-8b63-50b145f55853","Type":"ContainerDied","Data":"1758c6a3e050fee720b9cae06332bc1d34000d2a61aa27470c483dc8757a5546"} Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.933218 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" event={"ID":"c4653370-cd31-4487-8b63-50b145f55853","Type":"ContainerDied","Data":"492da61345c2220eef06ac2e1979e5402ebb8deb0e3747bb3620ecccd4e83a3b"} Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.933198 4735 scope.go:117] "RemoveContainer" containerID="74742f2f31c02712067b32f11d523e62c5db7d107e1c74fc1df5c218f22d92f8" Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.933284 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" event={"ID":"c4653370-cd31-4487-8b63-50b145f55853","Type":"ContainerDied","Data":"58cc5c76f1748baef586618dd49d3392757ef32e4f28a019979f84366e8879c4"} Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.933398 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" event={"ID":"c4653370-cd31-4487-8b63-50b145f55853","Type":"ContainerDied","Data":"f1fc9ee7ab9490b5462704528e6923ee420e0c9f3c0ed22e7041fedc8dfdfeb3"} Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.933464 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" event={"ID":"c4653370-cd31-4487-8b63-50b145f55853","Type":"ContainerDied","Data":"b6da8f9945b0943287cecc7782253cd88835d3a5a823ae5e2107423b9bdc4e42"} Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.933554 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"173883d0ef0ca36b6f683421d0020cbbd0fcf4aaf08f17c53427387f9ed39d5f"} Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.933619 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1758c6a3e050fee720b9cae06332bc1d34000d2a61aa27470c483dc8757a5546"} Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.933672 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"492da61345c2220eef06ac2e1979e5402ebb8deb0e3747bb3620ecccd4e83a3b"} Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.933720 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"58cc5c76f1748baef586618dd49d3392757ef32e4f28a019979f84366e8879c4"} Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.933767 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f1fc9ee7ab9490b5462704528e6923ee420e0c9f3c0ed22e7041fedc8dfdfeb3"} Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.933817 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b6da8f9945b0943287cecc7782253cd88835d3a5a823ae5e2107423b9bdc4e42"} Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.933862 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cb1bc51dca9b3bf6c136fe146aa422fa40eb512385617607f682723053988f7b"} Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 
13:08:08.933917 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cc88f8b567ef0f69f9df25a237c40b32bbd518680da75fe3cabd546713423bf2"} Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.933969 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2"} Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.934024 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" event={"ID":"c4653370-cd31-4487-8b63-50b145f55853","Type":"ContainerDied","Data":"cb1bc51dca9b3bf6c136fe146aa422fa40eb512385617607f682723053988f7b"} Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.934083 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"74742f2f31c02712067b32f11d523e62c5db7d107e1c74fc1df5c218f22d92f8"} Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.934136 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"173883d0ef0ca36b6f683421d0020cbbd0fcf4aaf08f17c53427387f9ed39d5f"} Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.934192 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1758c6a3e050fee720b9cae06332bc1d34000d2a61aa27470c483dc8757a5546"} Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.934243 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"492da61345c2220eef06ac2e1979e5402ebb8deb0e3747bb3620ecccd4e83a3b"} Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.934290 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"58cc5c76f1748baef586618dd49d3392757ef32e4f28a019979f84366e8879c4"} Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.934342 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f1fc9ee7ab9490b5462704528e6923ee420e0c9f3c0ed22e7041fedc8dfdfeb3"} Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.934391 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b6da8f9945b0943287cecc7782253cd88835d3a5a823ae5e2107423b9bdc4e42"} Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.934442 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cb1bc51dca9b3bf6c136fe146aa422fa40eb512385617607f682723053988f7b"} Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.934526 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cc88f8b567ef0f69f9df25a237c40b32bbd518680da75fe3cabd546713423bf2"} Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.934586 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2"} Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.934640 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" event={"ID":"c4653370-cd31-4487-8b63-50b145f55853","Type":"ContainerDied","Data":"cc88f8b567ef0f69f9df25a237c40b32bbd518680da75fe3cabd546713423bf2"} Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.934694 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"74742f2f31c02712067b32f11d523e62c5db7d107e1c74fc1df5c218f22d92f8"} Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.934748 4735 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"173883d0ef0ca36b6f683421d0020cbbd0fcf4aaf08f17c53427387f9ed39d5f"} Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.934800 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1758c6a3e050fee720b9cae06332bc1d34000d2a61aa27470c483dc8757a5546"} Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.934853 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"492da61345c2220eef06ac2e1979e5402ebb8deb0e3747bb3620ecccd4e83a3b"} Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.934957 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"58cc5c76f1748baef586618dd49d3392757ef32e4f28a019979f84366e8879c4"} Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.935008 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f1fc9ee7ab9490b5462704528e6923ee420e0c9f3c0ed22e7041fedc8dfdfeb3"} Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.935070 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b6da8f9945b0943287cecc7782253cd88835d3a5a823ae5e2107423b9bdc4e42"} Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.935130 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cb1bc51dca9b3bf6c136fe146aa422fa40eb512385617607f682723053988f7b"} Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.935198 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cc88f8b567ef0f69f9df25a237c40b32bbd518680da75fe3cabd546713423bf2"} Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.935253 4735 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2"} Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.935378 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wckcs" event={"ID":"c4653370-cd31-4487-8b63-50b145f55853","Type":"ContainerDied","Data":"65557cc3370cfa21b560bb9b626c7859da246de7ce6ca40443e1098741e4ca9b"} Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.935435 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"74742f2f31c02712067b32f11d523e62c5db7d107e1c74fc1df5c218f22d92f8"} Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.935509 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"173883d0ef0ca36b6f683421d0020cbbd0fcf4aaf08f17c53427387f9ed39d5f"} Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.935570 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1758c6a3e050fee720b9cae06332bc1d34000d2a61aa27470c483dc8757a5546"} Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.935622 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"492da61345c2220eef06ac2e1979e5402ebb8deb0e3747bb3620ecccd4e83a3b"} Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.935667 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"58cc5c76f1748baef586618dd49d3392757ef32e4f28a019979f84366e8879c4"} Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.935709 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f1fc9ee7ab9490b5462704528e6923ee420e0c9f3c0ed22e7041fedc8dfdfeb3"} Oct 08 
13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.935757 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b6da8f9945b0943287cecc7782253cd88835d3a5a823ae5e2107423b9bdc4e42"} Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.935805 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cb1bc51dca9b3bf6c136fe146aa422fa40eb512385617607f682723053988f7b"} Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.935954 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cc88f8b567ef0f69f9df25a237c40b32bbd518680da75fe3cabd546713423bf2"} Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.936009 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2"} Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.936184 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tfdqx_ef6f0aff-a0e5-477a-9392-71bc49bfa7aa/kube-multus/2.log" Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.936918 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tfdqx_ef6f0aff-a0e5-477a-9392-71bc49bfa7aa/kube-multus/1.log" Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.937004 4735 generic.go:334] "Generic (PLEG): container finished" podID="ef6f0aff-a0e5-477a-9392-71bc49bfa7aa" containerID="a254857e1ec70adc6c42fdc42cf5c0c5398fcb735bcc4165db84ada832b4d6fc" exitCode=2 Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.937067 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tfdqx" event={"ID":"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa","Type":"ContainerDied","Data":"a254857e1ec70adc6c42fdc42cf5c0c5398fcb735bcc4165db84ada832b4d6fc"} Oct 08 13:08:08 crc 
kubenswrapper[4735]: I1008 13:08:08.937109 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4a82f845f92e94e6df109ce1bf748570e0bcbe66989bc38894371604d541aa14"} Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.937956 4735 scope.go:117] "RemoveContainer" containerID="a254857e1ec70adc6c42fdc42cf5c0c5398fcb735bcc4165db84ada832b4d6fc" Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.979063 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-wckcs"] Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.979253 4735 scope.go:117] "RemoveContainer" containerID="173883d0ef0ca36b6f683421d0020cbbd0fcf4aaf08f17c53427387f9ed39d5f" Oct 08 13:08:08 crc kubenswrapper[4735]: I1008 13:08:08.983416 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-wckcs"] Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.000643 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d9c633e5-ed09-4ded-91e4-496da6dd2b5e-etc-openvswitch\") pod \"ovnkube-node-m42qm\" (UID: \"d9c633e5-ed09-4ded-91e4-496da6dd2b5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.000714 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d9c633e5-ed09-4ded-91e4-496da6dd2b5e-host-slash\") pod \"ovnkube-node-m42qm\" (UID: \"d9c633e5-ed09-4ded-91e4-496da6dd2b5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.000759 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d9c633e5-ed09-4ded-91e4-496da6dd2b5e-host-slash\") pod \"ovnkube-node-m42qm\" (UID: 
\"d9c633e5-ed09-4ded-91e4-496da6dd2b5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.000789 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d9c633e5-ed09-4ded-91e4-496da6dd2b5e-host-kubelet\") pod \"ovnkube-node-m42qm\" (UID: \"d9c633e5-ed09-4ded-91e4-496da6dd2b5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.000840 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d9c633e5-ed09-4ded-91e4-496da6dd2b5e-node-log\") pod \"ovnkube-node-m42qm\" (UID: \"d9c633e5-ed09-4ded-91e4-496da6dd2b5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.000869 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d9c633e5-ed09-4ded-91e4-496da6dd2b5e-systemd-units\") pod \"ovnkube-node-m42qm\" (UID: \"d9c633e5-ed09-4ded-91e4-496da6dd2b5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.000897 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d9c633e5-ed09-4ded-91e4-496da6dd2b5e-ovn-node-metrics-cert\") pod \"ovnkube-node-m42qm\" (UID: \"d9c633e5-ed09-4ded-91e4-496da6dd2b5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.000928 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d9c633e5-ed09-4ded-91e4-496da6dd2b5e-env-overrides\") pod \"ovnkube-node-m42qm\" (UID: 
\"d9c633e5-ed09-4ded-91e4-496da6dd2b5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.000965 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d9c633e5-ed09-4ded-91e4-496da6dd2b5e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-m42qm\" (UID: \"d9c633e5-ed09-4ded-91e4-496da6dd2b5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.000996 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d9c633e5-ed09-4ded-91e4-496da6dd2b5e-log-socket\") pod \"ovnkube-node-m42qm\" (UID: \"d9c633e5-ed09-4ded-91e4-496da6dd2b5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.001021 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d9c633e5-ed09-4ded-91e4-496da6dd2b5e-host-cni-bin\") pod \"ovnkube-node-m42qm\" (UID: \"d9c633e5-ed09-4ded-91e4-496da6dd2b5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.001064 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dj999\" (UniqueName: \"kubernetes.io/projected/d9c633e5-ed09-4ded-91e4-496da6dd2b5e-kube-api-access-dj999\") pod \"ovnkube-node-m42qm\" (UID: \"d9c633e5-ed09-4ded-91e4-496da6dd2b5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.001112 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/d9c633e5-ed09-4ded-91e4-496da6dd2b5e-run-openvswitch\") pod \"ovnkube-node-m42qm\" (UID: \"d9c633e5-ed09-4ded-91e4-496da6dd2b5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.001136 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d9c633e5-ed09-4ded-91e4-496da6dd2b5e-var-lib-openvswitch\") pod \"ovnkube-node-m42qm\" (UID: \"d9c633e5-ed09-4ded-91e4-496da6dd2b5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.001179 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d9c633e5-ed09-4ded-91e4-496da6dd2b5e-host-run-netns\") pod \"ovnkube-node-m42qm\" (UID: \"d9c633e5-ed09-4ded-91e4-496da6dd2b5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.001209 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d9c633e5-ed09-4ded-91e4-496da6dd2b5e-run-systemd\") pod \"ovnkube-node-m42qm\" (UID: \"d9c633e5-ed09-4ded-91e4-496da6dd2b5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.001233 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d9c633e5-ed09-4ded-91e4-496da6dd2b5e-host-run-ovn-kubernetes\") pod \"ovnkube-node-m42qm\" (UID: \"d9c633e5-ed09-4ded-91e4-496da6dd2b5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.001266 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d9c633e5-ed09-4ded-91e4-496da6dd2b5e-ovnkube-script-lib\") pod \"ovnkube-node-m42qm\" (UID: \"d9c633e5-ed09-4ded-91e4-496da6dd2b5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.001292 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d9c633e5-ed09-4ded-91e4-496da6dd2b5e-host-cni-netd\") pod \"ovnkube-node-m42qm\" (UID: \"d9c633e5-ed09-4ded-91e4-496da6dd2b5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.001317 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d9c633e5-ed09-4ded-91e4-496da6dd2b5e-ovnkube-config\") pod \"ovnkube-node-m42qm\" (UID: \"d9c633e5-ed09-4ded-91e4-496da6dd2b5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.001355 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d9c633e5-ed09-4ded-91e4-496da6dd2b5e-run-ovn\") pod \"ovnkube-node-m42qm\" (UID: \"d9c633e5-ed09-4ded-91e4-496da6dd2b5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.001398 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hmnq\" (UniqueName: \"kubernetes.io/projected/c4653370-cd31-4487-8b63-50b145f55853-kube-api-access-9hmnq\") on node \"crc\" DevicePath \"\"" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.001414 4735 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c4653370-cd31-4487-8b63-50b145f55853-run-systemd\") on node \"crc\" DevicePath \"\"" Oct 08 
13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.001430 4735 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c4653370-cd31-4487-8b63-50b145f55853-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.001504 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d9c633e5-ed09-4ded-91e4-496da6dd2b5e-host-kubelet\") pod \"ovnkube-node-m42qm\" (UID: \"d9c633e5-ed09-4ded-91e4-496da6dd2b5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.001563 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d9c633e5-ed09-4ded-91e4-496da6dd2b5e-systemd-units\") pod \"ovnkube-node-m42qm\" (UID: \"d9c633e5-ed09-4ded-91e4-496da6dd2b5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.001676 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d9c633e5-ed09-4ded-91e4-496da6dd2b5e-host-run-netns\") pod \"ovnkube-node-m42qm\" (UID: \"d9c633e5-ed09-4ded-91e4-496da6dd2b5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.012822 4735 scope.go:117] "RemoveContainer" containerID="1758c6a3e050fee720b9cae06332bc1d34000d2a61aa27470c483dc8757a5546" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.025352 4735 scope.go:117] "RemoveContainer" containerID="492da61345c2220eef06ac2e1979e5402ebb8deb0e3747bb3620ecccd4e83a3b" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.040528 4735 scope.go:117] "RemoveContainer" containerID="58cc5c76f1748baef586618dd49d3392757ef32e4f28a019979f84366e8879c4" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.057932 4735 
scope.go:117] "RemoveContainer" containerID="f1fc9ee7ab9490b5462704528e6923ee420e0c9f3c0ed22e7041fedc8dfdfeb3" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.072185 4735 scope.go:117] "RemoveContainer" containerID="b6da8f9945b0943287cecc7782253cd88835d3a5a823ae5e2107423b9bdc4e42" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.087529 4735 scope.go:117] "RemoveContainer" containerID="cb1bc51dca9b3bf6c136fe146aa422fa40eb512385617607f682723053988f7b" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.099694 4735 scope.go:117] "RemoveContainer" containerID="cc88f8b567ef0f69f9df25a237c40b32bbd518680da75fe3cabd546713423bf2" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.102639 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d9c633e5-ed09-4ded-91e4-496da6dd2b5e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-m42qm\" (UID: \"d9c633e5-ed09-4ded-91e4-496da6dd2b5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.102690 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d9c633e5-ed09-4ded-91e4-496da6dd2b5e-log-socket\") pod \"ovnkube-node-m42qm\" (UID: \"d9c633e5-ed09-4ded-91e4-496da6dd2b5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.102719 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d9c633e5-ed09-4ded-91e4-496da6dd2b5e-host-cni-bin\") pod \"ovnkube-node-m42qm\" (UID: \"d9c633e5-ed09-4ded-91e4-496da6dd2b5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.102749 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-dj999\" (UniqueName: \"kubernetes.io/projected/d9c633e5-ed09-4ded-91e4-496da6dd2b5e-kube-api-access-dj999\") pod \"ovnkube-node-m42qm\" (UID: \"d9c633e5-ed09-4ded-91e4-496da6dd2b5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.102760 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d9c633e5-ed09-4ded-91e4-496da6dd2b5e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-m42qm\" (UID: \"d9c633e5-ed09-4ded-91e4-496da6dd2b5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.102784 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d9c633e5-ed09-4ded-91e4-496da6dd2b5e-var-lib-openvswitch\") pod \"ovnkube-node-m42qm\" (UID: \"d9c633e5-ed09-4ded-91e4-496da6dd2b5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.102827 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d9c633e5-ed09-4ded-91e4-496da6dd2b5e-var-lib-openvswitch\") pod \"ovnkube-node-m42qm\" (UID: \"d9c633e5-ed09-4ded-91e4-496da6dd2b5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.102841 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d9c633e5-ed09-4ded-91e4-496da6dd2b5e-run-openvswitch\") pod \"ovnkube-node-m42qm\" (UID: \"d9c633e5-ed09-4ded-91e4-496da6dd2b5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.102877 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" 
(UniqueName: \"kubernetes.io/host-path/d9c633e5-ed09-4ded-91e4-496da6dd2b5e-log-socket\") pod \"ovnkube-node-m42qm\" (UID: \"d9c633e5-ed09-4ded-91e4-496da6dd2b5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.102899 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d9c633e5-ed09-4ded-91e4-496da6dd2b5e-run-systemd\") pod \"ovnkube-node-m42qm\" (UID: \"d9c633e5-ed09-4ded-91e4-496da6dd2b5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.102907 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d9c633e5-ed09-4ded-91e4-496da6dd2b5e-host-cni-bin\") pod \"ovnkube-node-m42qm\" (UID: \"d9c633e5-ed09-4ded-91e4-496da6dd2b5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.102919 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d9c633e5-ed09-4ded-91e4-496da6dd2b5e-host-run-ovn-kubernetes\") pod \"ovnkube-node-m42qm\" (UID: \"d9c633e5-ed09-4ded-91e4-496da6dd2b5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.102974 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d9c633e5-ed09-4ded-91e4-496da6dd2b5e-host-cni-netd\") pod \"ovnkube-node-m42qm\" (UID: \"d9c633e5-ed09-4ded-91e4-496da6dd2b5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.102991 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/d9c633e5-ed09-4ded-91e4-496da6dd2b5e-ovnkube-script-lib\") pod \"ovnkube-node-m42qm\" (UID: \"d9c633e5-ed09-4ded-91e4-496da6dd2b5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.103034 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d9c633e5-ed09-4ded-91e4-496da6dd2b5e-ovnkube-config\") pod \"ovnkube-node-m42qm\" (UID: \"d9c633e5-ed09-4ded-91e4-496da6dd2b5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.103052 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d9c633e5-ed09-4ded-91e4-496da6dd2b5e-run-ovn\") pod \"ovnkube-node-m42qm\" (UID: \"d9c633e5-ed09-4ded-91e4-496da6dd2b5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.103077 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d9c633e5-ed09-4ded-91e4-496da6dd2b5e-etc-openvswitch\") pod \"ovnkube-node-m42qm\" (UID: \"d9c633e5-ed09-4ded-91e4-496da6dd2b5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.103100 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d9c633e5-ed09-4ded-91e4-496da6dd2b5e-node-log\") pod \"ovnkube-node-m42qm\" (UID: \"d9c633e5-ed09-4ded-91e4-496da6dd2b5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.103122 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d9c633e5-ed09-4ded-91e4-496da6dd2b5e-ovn-node-metrics-cert\") pod 
\"ovnkube-node-m42qm\" (UID: \"d9c633e5-ed09-4ded-91e4-496da6dd2b5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.103139 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d9c633e5-ed09-4ded-91e4-496da6dd2b5e-env-overrides\") pod \"ovnkube-node-m42qm\" (UID: \"d9c633e5-ed09-4ded-91e4-496da6dd2b5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.103300 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d9c633e5-ed09-4ded-91e4-496da6dd2b5e-run-openvswitch\") pod \"ovnkube-node-m42qm\" (UID: \"d9c633e5-ed09-4ded-91e4-496da6dd2b5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.103331 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d9c633e5-ed09-4ded-91e4-496da6dd2b5e-etc-openvswitch\") pod \"ovnkube-node-m42qm\" (UID: \"d9c633e5-ed09-4ded-91e4-496da6dd2b5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.103370 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d9c633e5-ed09-4ded-91e4-496da6dd2b5e-node-log\") pod \"ovnkube-node-m42qm\" (UID: \"d9c633e5-ed09-4ded-91e4-496da6dd2b5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.103433 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d9c633e5-ed09-4ded-91e4-496da6dd2b5e-host-cni-netd\") pod \"ovnkube-node-m42qm\" (UID: \"d9c633e5-ed09-4ded-91e4-496da6dd2b5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" 
Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.103437 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d9c633e5-ed09-4ded-91e4-496da6dd2b5e-host-run-ovn-kubernetes\") pod \"ovnkube-node-m42qm\" (UID: \"d9c633e5-ed09-4ded-91e4-496da6dd2b5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.103305 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d9c633e5-ed09-4ded-91e4-496da6dd2b5e-run-systemd\") pod \"ovnkube-node-m42qm\" (UID: \"d9c633e5-ed09-4ded-91e4-496da6dd2b5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.103591 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d9c633e5-ed09-4ded-91e4-496da6dd2b5e-run-ovn\") pod \"ovnkube-node-m42qm\" (UID: \"d9c633e5-ed09-4ded-91e4-496da6dd2b5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.103683 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d9c633e5-ed09-4ded-91e4-496da6dd2b5e-env-overrides\") pod \"ovnkube-node-m42qm\" (UID: \"d9c633e5-ed09-4ded-91e4-496da6dd2b5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.105555 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d9c633e5-ed09-4ded-91e4-496da6dd2b5e-ovnkube-config\") pod \"ovnkube-node-m42qm\" (UID: \"d9c633e5-ed09-4ded-91e4-496da6dd2b5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.106718 4735 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d9c633e5-ed09-4ded-91e4-496da6dd2b5e-ovnkube-script-lib\") pod \"ovnkube-node-m42qm\" (UID: \"d9c633e5-ed09-4ded-91e4-496da6dd2b5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.109355 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d9c633e5-ed09-4ded-91e4-496da6dd2b5e-ovn-node-metrics-cert\") pod \"ovnkube-node-m42qm\" (UID: \"d9c633e5-ed09-4ded-91e4-496da6dd2b5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.112646 4735 scope.go:117] "RemoveContainer" containerID="5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.124689 4735 scope.go:117] "RemoveContainer" containerID="74742f2f31c02712067b32f11d523e62c5db7d107e1c74fc1df5c218f22d92f8" Oct 08 13:08:09 crc kubenswrapper[4735]: E1008 13:08:09.125078 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74742f2f31c02712067b32f11d523e62c5db7d107e1c74fc1df5c218f22d92f8\": container with ID starting with 74742f2f31c02712067b32f11d523e62c5db7d107e1c74fc1df5c218f22d92f8 not found: ID does not exist" containerID="74742f2f31c02712067b32f11d523e62c5db7d107e1c74fc1df5c218f22d92f8" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.125174 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dj999\" (UniqueName: \"kubernetes.io/projected/d9c633e5-ed09-4ded-91e4-496da6dd2b5e-kube-api-access-dj999\") pod \"ovnkube-node-m42qm\" (UID: \"d9c633e5-ed09-4ded-91e4-496da6dd2b5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.125255 4735 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"74742f2f31c02712067b32f11d523e62c5db7d107e1c74fc1df5c218f22d92f8"} err="failed to get container status \"74742f2f31c02712067b32f11d523e62c5db7d107e1c74fc1df5c218f22d92f8\": rpc error: code = NotFound desc = could not find container \"74742f2f31c02712067b32f11d523e62c5db7d107e1c74fc1df5c218f22d92f8\": container with ID starting with 74742f2f31c02712067b32f11d523e62c5db7d107e1c74fc1df5c218f22d92f8 not found: ID does not exist" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.125334 4735 scope.go:117] "RemoveContainer" containerID="173883d0ef0ca36b6f683421d0020cbbd0fcf4aaf08f17c53427387f9ed39d5f" Oct 08 13:08:09 crc kubenswrapper[4735]: E1008 13:08:09.125718 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"173883d0ef0ca36b6f683421d0020cbbd0fcf4aaf08f17c53427387f9ed39d5f\": container with ID starting with 173883d0ef0ca36b6f683421d0020cbbd0fcf4aaf08f17c53427387f9ed39d5f not found: ID does not exist" containerID="173883d0ef0ca36b6f683421d0020cbbd0fcf4aaf08f17c53427387f9ed39d5f" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.125763 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"173883d0ef0ca36b6f683421d0020cbbd0fcf4aaf08f17c53427387f9ed39d5f"} err="failed to get container status \"173883d0ef0ca36b6f683421d0020cbbd0fcf4aaf08f17c53427387f9ed39d5f\": rpc error: code = NotFound desc = could not find container \"173883d0ef0ca36b6f683421d0020cbbd0fcf4aaf08f17c53427387f9ed39d5f\": container with ID starting with 173883d0ef0ca36b6f683421d0020cbbd0fcf4aaf08f17c53427387f9ed39d5f not found: ID does not exist" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.125789 4735 scope.go:117] "RemoveContainer" containerID="1758c6a3e050fee720b9cae06332bc1d34000d2a61aa27470c483dc8757a5546" Oct 08 13:08:09 crc kubenswrapper[4735]: E1008 13:08:09.126022 4735 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"1758c6a3e050fee720b9cae06332bc1d34000d2a61aa27470c483dc8757a5546\": container with ID starting with 1758c6a3e050fee720b9cae06332bc1d34000d2a61aa27470c483dc8757a5546 not found: ID does not exist" containerID="1758c6a3e050fee720b9cae06332bc1d34000d2a61aa27470c483dc8757a5546" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.126044 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1758c6a3e050fee720b9cae06332bc1d34000d2a61aa27470c483dc8757a5546"} err="failed to get container status \"1758c6a3e050fee720b9cae06332bc1d34000d2a61aa27470c483dc8757a5546\": rpc error: code = NotFound desc = could not find container \"1758c6a3e050fee720b9cae06332bc1d34000d2a61aa27470c483dc8757a5546\": container with ID starting with 1758c6a3e050fee720b9cae06332bc1d34000d2a61aa27470c483dc8757a5546 not found: ID does not exist" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.126058 4735 scope.go:117] "RemoveContainer" containerID="492da61345c2220eef06ac2e1979e5402ebb8deb0e3747bb3620ecccd4e83a3b" Oct 08 13:08:09 crc kubenswrapper[4735]: E1008 13:08:09.126232 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"492da61345c2220eef06ac2e1979e5402ebb8deb0e3747bb3620ecccd4e83a3b\": container with ID starting with 492da61345c2220eef06ac2e1979e5402ebb8deb0e3747bb3620ecccd4e83a3b not found: ID does not exist" containerID="492da61345c2220eef06ac2e1979e5402ebb8deb0e3747bb3620ecccd4e83a3b" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.126257 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"492da61345c2220eef06ac2e1979e5402ebb8deb0e3747bb3620ecccd4e83a3b"} err="failed to get container status \"492da61345c2220eef06ac2e1979e5402ebb8deb0e3747bb3620ecccd4e83a3b\": rpc error: code = NotFound desc = could not find container 
\"492da61345c2220eef06ac2e1979e5402ebb8deb0e3747bb3620ecccd4e83a3b\": container with ID starting with 492da61345c2220eef06ac2e1979e5402ebb8deb0e3747bb3620ecccd4e83a3b not found: ID does not exist" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.126276 4735 scope.go:117] "RemoveContainer" containerID="58cc5c76f1748baef586618dd49d3392757ef32e4f28a019979f84366e8879c4" Oct 08 13:08:09 crc kubenswrapper[4735]: E1008 13:08:09.126436 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58cc5c76f1748baef586618dd49d3392757ef32e4f28a019979f84366e8879c4\": container with ID starting with 58cc5c76f1748baef586618dd49d3392757ef32e4f28a019979f84366e8879c4 not found: ID does not exist" containerID="58cc5c76f1748baef586618dd49d3392757ef32e4f28a019979f84366e8879c4" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.126454 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58cc5c76f1748baef586618dd49d3392757ef32e4f28a019979f84366e8879c4"} err="failed to get container status \"58cc5c76f1748baef586618dd49d3392757ef32e4f28a019979f84366e8879c4\": rpc error: code = NotFound desc = could not find container \"58cc5c76f1748baef586618dd49d3392757ef32e4f28a019979f84366e8879c4\": container with ID starting with 58cc5c76f1748baef586618dd49d3392757ef32e4f28a019979f84366e8879c4 not found: ID does not exist" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.126464 4735 scope.go:117] "RemoveContainer" containerID="f1fc9ee7ab9490b5462704528e6923ee420e0c9f3c0ed22e7041fedc8dfdfeb3" Oct 08 13:08:09 crc kubenswrapper[4735]: E1008 13:08:09.126654 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1fc9ee7ab9490b5462704528e6923ee420e0c9f3c0ed22e7041fedc8dfdfeb3\": container with ID starting with f1fc9ee7ab9490b5462704528e6923ee420e0c9f3c0ed22e7041fedc8dfdfeb3 not found: ID does not exist" 
containerID="f1fc9ee7ab9490b5462704528e6923ee420e0c9f3c0ed22e7041fedc8dfdfeb3" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.126731 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1fc9ee7ab9490b5462704528e6923ee420e0c9f3c0ed22e7041fedc8dfdfeb3"} err="failed to get container status \"f1fc9ee7ab9490b5462704528e6923ee420e0c9f3c0ed22e7041fedc8dfdfeb3\": rpc error: code = NotFound desc = could not find container \"f1fc9ee7ab9490b5462704528e6923ee420e0c9f3c0ed22e7041fedc8dfdfeb3\": container with ID starting with f1fc9ee7ab9490b5462704528e6923ee420e0c9f3c0ed22e7041fedc8dfdfeb3 not found: ID does not exist" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.126795 4735 scope.go:117] "RemoveContainer" containerID="b6da8f9945b0943287cecc7782253cd88835d3a5a823ae5e2107423b9bdc4e42" Oct 08 13:08:09 crc kubenswrapper[4735]: E1008 13:08:09.127058 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6da8f9945b0943287cecc7782253cd88835d3a5a823ae5e2107423b9bdc4e42\": container with ID starting with b6da8f9945b0943287cecc7782253cd88835d3a5a823ae5e2107423b9bdc4e42 not found: ID does not exist" containerID="b6da8f9945b0943287cecc7782253cd88835d3a5a823ae5e2107423b9bdc4e42" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.127088 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6da8f9945b0943287cecc7782253cd88835d3a5a823ae5e2107423b9bdc4e42"} err="failed to get container status \"b6da8f9945b0943287cecc7782253cd88835d3a5a823ae5e2107423b9bdc4e42\": rpc error: code = NotFound desc = could not find container \"b6da8f9945b0943287cecc7782253cd88835d3a5a823ae5e2107423b9bdc4e42\": container with ID starting with b6da8f9945b0943287cecc7782253cd88835d3a5a823ae5e2107423b9bdc4e42 not found: ID does not exist" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.127109 4735 scope.go:117] 
"RemoveContainer" containerID="cb1bc51dca9b3bf6c136fe146aa422fa40eb512385617607f682723053988f7b" Oct 08 13:08:09 crc kubenswrapper[4735]: E1008 13:08:09.127282 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb1bc51dca9b3bf6c136fe146aa422fa40eb512385617607f682723053988f7b\": container with ID starting with cb1bc51dca9b3bf6c136fe146aa422fa40eb512385617607f682723053988f7b not found: ID does not exist" containerID="cb1bc51dca9b3bf6c136fe146aa422fa40eb512385617607f682723053988f7b" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.127302 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb1bc51dca9b3bf6c136fe146aa422fa40eb512385617607f682723053988f7b"} err="failed to get container status \"cb1bc51dca9b3bf6c136fe146aa422fa40eb512385617607f682723053988f7b\": rpc error: code = NotFound desc = could not find container \"cb1bc51dca9b3bf6c136fe146aa422fa40eb512385617607f682723053988f7b\": container with ID starting with cb1bc51dca9b3bf6c136fe146aa422fa40eb512385617607f682723053988f7b not found: ID does not exist" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.127315 4735 scope.go:117] "RemoveContainer" containerID="cc88f8b567ef0f69f9df25a237c40b32bbd518680da75fe3cabd546713423bf2" Oct 08 13:08:09 crc kubenswrapper[4735]: E1008 13:08:09.127476 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc88f8b567ef0f69f9df25a237c40b32bbd518680da75fe3cabd546713423bf2\": container with ID starting with cc88f8b567ef0f69f9df25a237c40b32bbd518680da75fe3cabd546713423bf2 not found: ID does not exist" containerID="cc88f8b567ef0f69f9df25a237c40b32bbd518680da75fe3cabd546713423bf2" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.127505 4735 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"cc88f8b567ef0f69f9df25a237c40b32bbd518680da75fe3cabd546713423bf2"} err="failed to get container status \"cc88f8b567ef0f69f9df25a237c40b32bbd518680da75fe3cabd546713423bf2\": rpc error: code = NotFound desc = could not find container \"cc88f8b567ef0f69f9df25a237c40b32bbd518680da75fe3cabd546713423bf2\": container with ID starting with cc88f8b567ef0f69f9df25a237c40b32bbd518680da75fe3cabd546713423bf2 not found: ID does not exist" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.127518 4735 scope.go:117] "RemoveContainer" containerID="5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2" Oct 08 13:08:09 crc kubenswrapper[4735]: E1008 13:08:09.127665 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\": container with ID starting with 5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2 not found: ID does not exist" containerID="5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.127684 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2"} err="failed to get container status \"5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\": rpc error: code = NotFound desc = could not find container \"5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\": container with ID starting with 5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2 not found: ID does not exist" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.127696 4735 scope.go:117] "RemoveContainer" containerID="74742f2f31c02712067b32f11d523e62c5db7d107e1c74fc1df5c218f22d92f8" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.127863 4735 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"74742f2f31c02712067b32f11d523e62c5db7d107e1c74fc1df5c218f22d92f8"} err="failed to get container status \"74742f2f31c02712067b32f11d523e62c5db7d107e1c74fc1df5c218f22d92f8\": rpc error: code = NotFound desc = could not find container \"74742f2f31c02712067b32f11d523e62c5db7d107e1c74fc1df5c218f22d92f8\": container with ID starting with 74742f2f31c02712067b32f11d523e62c5db7d107e1c74fc1df5c218f22d92f8 not found: ID does not exist" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.127878 4735 scope.go:117] "RemoveContainer" containerID="173883d0ef0ca36b6f683421d0020cbbd0fcf4aaf08f17c53427387f9ed39d5f" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.128049 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"173883d0ef0ca36b6f683421d0020cbbd0fcf4aaf08f17c53427387f9ed39d5f"} err="failed to get container status \"173883d0ef0ca36b6f683421d0020cbbd0fcf4aaf08f17c53427387f9ed39d5f\": rpc error: code = NotFound desc = could not find container \"173883d0ef0ca36b6f683421d0020cbbd0fcf4aaf08f17c53427387f9ed39d5f\": container with ID starting with 173883d0ef0ca36b6f683421d0020cbbd0fcf4aaf08f17c53427387f9ed39d5f not found: ID does not exist" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.128072 4735 scope.go:117] "RemoveContainer" containerID="1758c6a3e050fee720b9cae06332bc1d34000d2a61aa27470c483dc8757a5546" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.128243 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1758c6a3e050fee720b9cae06332bc1d34000d2a61aa27470c483dc8757a5546"} err="failed to get container status \"1758c6a3e050fee720b9cae06332bc1d34000d2a61aa27470c483dc8757a5546\": rpc error: code = NotFound desc = could not find container \"1758c6a3e050fee720b9cae06332bc1d34000d2a61aa27470c483dc8757a5546\": container with ID starting with 1758c6a3e050fee720b9cae06332bc1d34000d2a61aa27470c483dc8757a5546 not 
found: ID does not exist" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.128321 4735 scope.go:117] "RemoveContainer" containerID="492da61345c2220eef06ac2e1979e5402ebb8deb0e3747bb3620ecccd4e83a3b" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.128606 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"492da61345c2220eef06ac2e1979e5402ebb8deb0e3747bb3620ecccd4e83a3b"} err="failed to get container status \"492da61345c2220eef06ac2e1979e5402ebb8deb0e3747bb3620ecccd4e83a3b\": rpc error: code = NotFound desc = could not find container \"492da61345c2220eef06ac2e1979e5402ebb8deb0e3747bb3620ecccd4e83a3b\": container with ID starting with 492da61345c2220eef06ac2e1979e5402ebb8deb0e3747bb3620ecccd4e83a3b not found: ID does not exist" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.128627 4735 scope.go:117] "RemoveContainer" containerID="58cc5c76f1748baef586618dd49d3392757ef32e4f28a019979f84366e8879c4" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.128782 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58cc5c76f1748baef586618dd49d3392757ef32e4f28a019979f84366e8879c4"} err="failed to get container status \"58cc5c76f1748baef586618dd49d3392757ef32e4f28a019979f84366e8879c4\": rpc error: code = NotFound desc = could not find container \"58cc5c76f1748baef586618dd49d3392757ef32e4f28a019979f84366e8879c4\": container with ID starting with 58cc5c76f1748baef586618dd49d3392757ef32e4f28a019979f84366e8879c4 not found: ID does not exist" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.128798 4735 scope.go:117] "RemoveContainer" containerID="f1fc9ee7ab9490b5462704528e6923ee420e0c9f3c0ed22e7041fedc8dfdfeb3" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.128941 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1fc9ee7ab9490b5462704528e6923ee420e0c9f3c0ed22e7041fedc8dfdfeb3"} err="failed to get 
container status \"f1fc9ee7ab9490b5462704528e6923ee420e0c9f3c0ed22e7041fedc8dfdfeb3\": rpc error: code = NotFound desc = could not find container \"f1fc9ee7ab9490b5462704528e6923ee420e0c9f3c0ed22e7041fedc8dfdfeb3\": container with ID starting with f1fc9ee7ab9490b5462704528e6923ee420e0c9f3c0ed22e7041fedc8dfdfeb3 not found: ID does not exist" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.128957 4735 scope.go:117] "RemoveContainer" containerID="b6da8f9945b0943287cecc7782253cd88835d3a5a823ae5e2107423b9bdc4e42" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.129164 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6da8f9945b0943287cecc7782253cd88835d3a5a823ae5e2107423b9bdc4e42"} err="failed to get container status \"b6da8f9945b0943287cecc7782253cd88835d3a5a823ae5e2107423b9bdc4e42\": rpc error: code = NotFound desc = could not find container \"b6da8f9945b0943287cecc7782253cd88835d3a5a823ae5e2107423b9bdc4e42\": container with ID starting with b6da8f9945b0943287cecc7782253cd88835d3a5a823ae5e2107423b9bdc4e42 not found: ID does not exist" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.129192 4735 scope.go:117] "RemoveContainer" containerID="cb1bc51dca9b3bf6c136fe146aa422fa40eb512385617607f682723053988f7b" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.129397 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb1bc51dca9b3bf6c136fe146aa422fa40eb512385617607f682723053988f7b"} err="failed to get container status \"cb1bc51dca9b3bf6c136fe146aa422fa40eb512385617607f682723053988f7b\": rpc error: code = NotFound desc = could not find container \"cb1bc51dca9b3bf6c136fe146aa422fa40eb512385617607f682723053988f7b\": container with ID starting with cb1bc51dca9b3bf6c136fe146aa422fa40eb512385617607f682723053988f7b not found: ID does not exist" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.129412 4735 scope.go:117] "RemoveContainer" 
containerID="cc88f8b567ef0f69f9df25a237c40b32bbd518680da75fe3cabd546713423bf2" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.129602 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc88f8b567ef0f69f9df25a237c40b32bbd518680da75fe3cabd546713423bf2"} err="failed to get container status \"cc88f8b567ef0f69f9df25a237c40b32bbd518680da75fe3cabd546713423bf2\": rpc error: code = NotFound desc = could not find container \"cc88f8b567ef0f69f9df25a237c40b32bbd518680da75fe3cabd546713423bf2\": container with ID starting with cc88f8b567ef0f69f9df25a237c40b32bbd518680da75fe3cabd546713423bf2 not found: ID does not exist" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.129615 4735 scope.go:117] "RemoveContainer" containerID="5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.129750 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2"} err="failed to get container status \"5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\": rpc error: code = NotFound desc = could not find container \"5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\": container with ID starting with 5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2 not found: ID does not exist" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.129762 4735 scope.go:117] "RemoveContainer" containerID="74742f2f31c02712067b32f11d523e62c5db7d107e1c74fc1df5c218f22d92f8" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.129903 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74742f2f31c02712067b32f11d523e62c5db7d107e1c74fc1df5c218f22d92f8"} err="failed to get container status \"74742f2f31c02712067b32f11d523e62c5db7d107e1c74fc1df5c218f22d92f8\": rpc error: code = NotFound desc = could 
not find container \"74742f2f31c02712067b32f11d523e62c5db7d107e1c74fc1df5c218f22d92f8\": container with ID starting with 74742f2f31c02712067b32f11d523e62c5db7d107e1c74fc1df5c218f22d92f8 not found: ID does not exist" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.129930 4735 scope.go:117] "RemoveContainer" containerID="173883d0ef0ca36b6f683421d0020cbbd0fcf4aaf08f17c53427387f9ed39d5f" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.130109 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"173883d0ef0ca36b6f683421d0020cbbd0fcf4aaf08f17c53427387f9ed39d5f"} err="failed to get container status \"173883d0ef0ca36b6f683421d0020cbbd0fcf4aaf08f17c53427387f9ed39d5f\": rpc error: code = NotFound desc = could not find container \"173883d0ef0ca36b6f683421d0020cbbd0fcf4aaf08f17c53427387f9ed39d5f\": container with ID starting with 173883d0ef0ca36b6f683421d0020cbbd0fcf4aaf08f17c53427387f9ed39d5f not found: ID does not exist" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.130131 4735 scope.go:117] "RemoveContainer" containerID="1758c6a3e050fee720b9cae06332bc1d34000d2a61aa27470c483dc8757a5546" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.130392 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1758c6a3e050fee720b9cae06332bc1d34000d2a61aa27470c483dc8757a5546"} err="failed to get container status \"1758c6a3e050fee720b9cae06332bc1d34000d2a61aa27470c483dc8757a5546\": rpc error: code = NotFound desc = could not find container \"1758c6a3e050fee720b9cae06332bc1d34000d2a61aa27470c483dc8757a5546\": container with ID starting with 1758c6a3e050fee720b9cae06332bc1d34000d2a61aa27470c483dc8757a5546 not found: ID does not exist" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.130410 4735 scope.go:117] "RemoveContainer" containerID="492da61345c2220eef06ac2e1979e5402ebb8deb0e3747bb3620ecccd4e83a3b" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 
13:08:09.130570 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"492da61345c2220eef06ac2e1979e5402ebb8deb0e3747bb3620ecccd4e83a3b"} err="failed to get container status \"492da61345c2220eef06ac2e1979e5402ebb8deb0e3747bb3620ecccd4e83a3b\": rpc error: code = NotFound desc = could not find container \"492da61345c2220eef06ac2e1979e5402ebb8deb0e3747bb3620ecccd4e83a3b\": container with ID starting with 492da61345c2220eef06ac2e1979e5402ebb8deb0e3747bb3620ecccd4e83a3b not found: ID does not exist" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.130586 4735 scope.go:117] "RemoveContainer" containerID="58cc5c76f1748baef586618dd49d3392757ef32e4f28a019979f84366e8879c4" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.130718 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58cc5c76f1748baef586618dd49d3392757ef32e4f28a019979f84366e8879c4"} err="failed to get container status \"58cc5c76f1748baef586618dd49d3392757ef32e4f28a019979f84366e8879c4\": rpc error: code = NotFound desc = could not find container \"58cc5c76f1748baef586618dd49d3392757ef32e4f28a019979f84366e8879c4\": container with ID starting with 58cc5c76f1748baef586618dd49d3392757ef32e4f28a019979f84366e8879c4 not found: ID does not exist" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.130737 4735 scope.go:117] "RemoveContainer" containerID="f1fc9ee7ab9490b5462704528e6923ee420e0c9f3c0ed22e7041fedc8dfdfeb3" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.130918 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1fc9ee7ab9490b5462704528e6923ee420e0c9f3c0ed22e7041fedc8dfdfeb3"} err="failed to get container status \"f1fc9ee7ab9490b5462704528e6923ee420e0c9f3c0ed22e7041fedc8dfdfeb3\": rpc error: code = NotFound desc = could not find container \"f1fc9ee7ab9490b5462704528e6923ee420e0c9f3c0ed22e7041fedc8dfdfeb3\": container with ID starting with 
f1fc9ee7ab9490b5462704528e6923ee420e0c9f3c0ed22e7041fedc8dfdfeb3 not found: ID does not exist" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.130937 4735 scope.go:117] "RemoveContainer" containerID="b6da8f9945b0943287cecc7782253cd88835d3a5a823ae5e2107423b9bdc4e42" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.131131 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6da8f9945b0943287cecc7782253cd88835d3a5a823ae5e2107423b9bdc4e42"} err="failed to get container status \"b6da8f9945b0943287cecc7782253cd88835d3a5a823ae5e2107423b9bdc4e42\": rpc error: code = NotFound desc = could not find container \"b6da8f9945b0943287cecc7782253cd88835d3a5a823ae5e2107423b9bdc4e42\": container with ID starting with b6da8f9945b0943287cecc7782253cd88835d3a5a823ae5e2107423b9bdc4e42 not found: ID does not exist" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.131147 4735 scope.go:117] "RemoveContainer" containerID="cb1bc51dca9b3bf6c136fe146aa422fa40eb512385617607f682723053988f7b" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.131319 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb1bc51dca9b3bf6c136fe146aa422fa40eb512385617607f682723053988f7b"} err="failed to get container status \"cb1bc51dca9b3bf6c136fe146aa422fa40eb512385617607f682723053988f7b\": rpc error: code = NotFound desc = could not find container \"cb1bc51dca9b3bf6c136fe146aa422fa40eb512385617607f682723053988f7b\": container with ID starting with cb1bc51dca9b3bf6c136fe146aa422fa40eb512385617607f682723053988f7b not found: ID does not exist" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.131337 4735 scope.go:117] "RemoveContainer" containerID="cc88f8b567ef0f69f9df25a237c40b32bbd518680da75fe3cabd546713423bf2" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.131613 4735 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"cc88f8b567ef0f69f9df25a237c40b32bbd518680da75fe3cabd546713423bf2"} err="failed to get container status \"cc88f8b567ef0f69f9df25a237c40b32bbd518680da75fe3cabd546713423bf2\": rpc error: code = NotFound desc = could not find container \"cc88f8b567ef0f69f9df25a237c40b32bbd518680da75fe3cabd546713423bf2\": container with ID starting with cc88f8b567ef0f69f9df25a237c40b32bbd518680da75fe3cabd546713423bf2 not found: ID does not exist" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.131631 4735 scope.go:117] "RemoveContainer" containerID="5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.131909 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2"} err="failed to get container status \"5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\": rpc error: code = NotFound desc = could not find container \"5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\": container with ID starting with 5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2 not found: ID does not exist" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.132017 4735 scope.go:117] "RemoveContainer" containerID="74742f2f31c02712067b32f11d523e62c5db7d107e1c74fc1df5c218f22d92f8" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.132408 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74742f2f31c02712067b32f11d523e62c5db7d107e1c74fc1df5c218f22d92f8"} err="failed to get container status \"74742f2f31c02712067b32f11d523e62c5db7d107e1c74fc1df5c218f22d92f8\": rpc error: code = NotFound desc = could not find container \"74742f2f31c02712067b32f11d523e62c5db7d107e1c74fc1df5c218f22d92f8\": container with ID starting with 74742f2f31c02712067b32f11d523e62c5db7d107e1c74fc1df5c218f22d92f8 not found: ID does not 
exist" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.132537 4735 scope.go:117] "RemoveContainer" containerID="173883d0ef0ca36b6f683421d0020cbbd0fcf4aaf08f17c53427387f9ed39d5f" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.132813 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"173883d0ef0ca36b6f683421d0020cbbd0fcf4aaf08f17c53427387f9ed39d5f"} err="failed to get container status \"173883d0ef0ca36b6f683421d0020cbbd0fcf4aaf08f17c53427387f9ed39d5f\": rpc error: code = NotFound desc = could not find container \"173883d0ef0ca36b6f683421d0020cbbd0fcf4aaf08f17c53427387f9ed39d5f\": container with ID starting with 173883d0ef0ca36b6f683421d0020cbbd0fcf4aaf08f17c53427387f9ed39d5f not found: ID does not exist" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.132888 4735 scope.go:117] "RemoveContainer" containerID="1758c6a3e050fee720b9cae06332bc1d34000d2a61aa27470c483dc8757a5546" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.133207 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1758c6a3e050fee720b9cae06332bc1d34000d2a61aa27470c483dc8757a5546"} err="failed to get container status \"1758c6a3e050fee720b9cae06332bc1d34000d2a61aa27470c483dc8757a5546\": rpc error: code = NotFound desc = could not find container \"1758c6a3e050fee720b9cae06332bc1d34000d2a61aa27470c483dc8757a5546\": container with ID starting with 1758c6a3e050fee720b9cae06332bc1d34000d2a61aa27470c483dc8757a5546 not found: ID does not exist" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.133231 4735 scope.go:117] "RemoveContainer" containerID="492da61345c2220eef06ac2e1979e5402ebb8deb0e3747bb3620ecccd4e83a3b" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.133444 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"492da61345c2220eef06ac2e1979e5402ebb8deb0e3747bb3620ecccd4e83a3b"} err="failed to get container status 
\"492da61345c2220eef06ac2e1979e5402ebb8deb0e3747bb3620ecccd4e83a3b\": rpc error: code = NotFound desc = could not find container \"492da61345c2220eef06ac2e1979e5402ebb8deb0e3747bb3620ecccd4e83a3b\": container with ID starting with 492da61345c2220eef06ac2e1979e5402ebb8deb0e3747bb3620ecccd4e83a3b not found: ID does not exist" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.133463 4735 scope.go:117] "RemoveContainer" containerID="58cc5c76f1748baef586618dd49d3392757ef32e4f28a019979f84366e8879c4" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.134740 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58cc5c76f1748baef586618dd49d3392757ef32e4f28a019979f84366e8879c4"} err="failed to get container status \"58cc5c76f1748baef586618dd49d3392757ef32e4f28a019979f84366e8879c4\": rpc error: code = NotFound desc = could not find container \"58cc5c76f1748baef586618dd49d3392757ef32e4f28a019979f84366e8879c4\": container with ID starting with 58cc5c76f1748baef586618dd49d3392757ef32e4f28a019979f84366e8879c4 not found: ID does not exist" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.134771 4735 scope.go:117] "RemoveContainer" containerID="f1fc9ee7ab9490b5462704528e6923ee420e0c9f3c0ed22e7041fedc8dfdfeb3" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.135012 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1fc9ee7ab9490b5462704528e6923ee420e0c9f3c0ed22e7041fedc8dfdfeb3"} err="failed to get container status \"f1fc9ee7ab9490b5462704528e6923ee420e0c9f3c0ed22e7041fedc8dfdfeb3\": rpc error: code = NotFound desc = could not find container \"f1fc9ee7ab9490b5462704528e6923ee420e0c9f3c0ed22e7041fedc8dfdfeb3\": container with ID starting with f1fc9ee7ab9490b5462704528e6923ee420e0c9f3c0ed22e7041fedc8dfdfeb3 not found: ID does not exist" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.135033 4735 scope.go:117] "RemoveContainer" 
containerID="b6da8f9945b0943287cecc7782253cd88835d3a5a823ae5e2107423b9bdc4e42" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.135307 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6da8f9945b0943287cecc7782253cd88835d3a5a823ae5e2107423b9bdc4e42"} err="failed to get container status \"b6da8f9945b0943287cecc7782253cd88835d3a5a823ae5e2107423b9bdc4e42\": rpc error: code = NotFound desc = could not find container \"b6da8f9945b0943287cecc7782253cd88835d3a5a823ae5e2107423b9bdc4e42\": container with ID starting with b6da8f9945b0943287cecc7782253cd88835d3a5a823ae5e2107423b9bdc4e42 not found: ID does not exist" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.135329 4735 scope.go:117] "RemoveContainer" containerID="cb1bc51dca9b3bf6c136fe146aa422fa40eb512385617607f682723053988f7b" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.135533 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb1bc51dca9b3bf6c136fe146aa422fa40eb512385617607f682723053988f7b"} err="failed to get container status \"cb1bc51dca9b3bf6c136fe146aa422fa40eb512385617607f682723053988f7b\": rpc error: code = NotFound desc = could not find container \"cb1bc51dca9b3bf6c136fe146aa422fa40eb512385617607f682723053988f7b\": container with ID starting with cb1bc51dca9b3bf6c136fe146aa422fa40eb512385617607f682723053988f7b not found: ID does not exist" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.135560 4735 scope.go:117] "RemoveContainer" containerID="cc88f8b567ef0f69f9df25a237c40b32bbd518680da75fe3cabd546713423bf2" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.135787 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.135905 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc88f8b567ef0f69f9df25a237c40b32bbd518680da75fe3cabd546713423bf2"} err="failed to get container status \"cc88f8b567ef0f69f9df25a237c40b32bbd518680da75fe3cabd546713423bf2\": rpc error: code = NotFound desc = could not find container \"cc88f8b567ef0f69f9df25a237c40b32bbd518680da75fe3cabd546713423bf2\": container with ID starting with cc88f8b567ef0f69f9df25a237c40b32bbd518680da75fe3cabd546713423bf2 not found: ID does not exist" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.135937 4735 scope.go:117] "RemoveContainer" containerID="5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.136134 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2"} err="failed to get container status \"5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\": rpc error: code = NotFound desc = could not find container \"5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2\": container with ID starting with 5e39f9fb2f655437c3aeeb56317d4b6723c8b2eeb9a02584fc688545c2a0d7d2 not found: ID does not exist" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.136160 4735 scope.go:117] "RemoveContainer" containerID="74742f2f31c02712067b32f11d523e62c5db7d107e1c74fc1df5c218f22d92f8" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.138745 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74742f2f31c02712067b32f11d523e62c5db7d107e1c74fc1df5c218f22d92f8"} err="failed to get container status \"74742f2f31c02712067b32f11d523e62c5db7d107e1c74fc1df5c218f22d92f8\": rpc error: code = NotFound desc = could not 
find container \"74742f2f31c02712067b32f11d523e62c5db7d107e1c74fc1df5c218f22d92f8\": container with ID starting with 74742f2f31c02712067b32f11d523e62c5db7d107e1c74fc1df5c218f22d92f8 not found: ID does not exist" Oct 08 13:08:09 crc kubenswrapper[4735]: W1008 13:08:09.149831 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9c633e5_ed09_4ded_91e4_496da6dd2b5e.slice/crio-53bb2f35ff7f537918e9f802b2595ccaa6a5d3862529861cd384d6c8114bfa2a WatchSource:0}: Error finding container 53bb2f35ff7f537918e9f802b2595ccaa6a5d3862529861cd384d6c8114bfa2a: Status 404 returned error can't find the container with id 53bb2f35ff7f537918e9f802b2595ccaa6a5d3862529861cd384d6c8114bfa2a Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.947340 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tfdqx_ef6f0aff-a0e5-477a-9392-71bc49bfa7aa/kube-multus/2.log" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.949998 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tfdqx_ef6f0aff-a0e5-477a-9392-71bc49bfa7aa/kube-multus/1.log" Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.950086 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tfdqx" event={"ID":"ef6f0aff-a0e5-477a-9392-71bc49bfa7aa","Type":"ContainerStarted","Data":"0cd5f638e98ce03ad8525f8b29e08167ebcf05d5aa5e4b1acf2306af519c1383"} Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.952847 4735 generic.go:334] "Generic (PLEG): container finished" podID="d9c633e5-ed09-4ded-91e4-496da6dd2b5e" containerID="fa34493e05a8155b15aeede8bc63d3eb1612ee52073d093205626e106a61c730" exitCode=0 Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.952914 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" 
event={"ID":"d9c633e5-ed09-4ded-91e4-496da6dd2b5e","Type":"ContainerDied","Data":"fa34493e05a8155b15aeede8bc63d3eb1612ee52073d093205626e106a61c730"} Oct 08 13:08:09 crc kubenswrapper[4735]: I1008 13:08:09.953005 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" event={"ID":"d9c633e5-ed09-4ded-91e4-496da6dd2b5e","Type":"ContainerStarted","Data":"53bb2f35ff7f537918e9f802b2595ccaa6a5d3862529861cd384d6c8114bfa2a"} Oct 08 13:08:10 crc kubenswrapper[4735]: I1008 13:08:10.908313 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4653370-cd31-4487-8b63-50b145f55853" path="/var/lib/kubelet/pods/c4653370-cd31-4487-8b63-50b145f55853/volumes" Oct 08 13:08:10 crc kubenswrapper[4735]: I1008 13:08:10.962414 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" event={"ID":"d9c633e5-ed09-4ded-91e4-496da6dd2b5e","Type":"ContainerStarted","Data":"7c1877cb5d7f240498959b7bf2c585adf49c95aad7d350d0610e3b7d7fdf2557"} Oct 08 13:08:10 crc kubenswrapper[4735]: I1008 13:08:10.962471 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" event={"ID":"d9c633e5-ed09-4ded-91e4-496da6dd2b5e","Type":"ContainerStarted","Data":"9fbbcdb16582cbec08e1f5de05eef29225878171059f46eb3140d25a5af8ee6a"} Oct 08 13:08:10 crc kubenswrapper[4735]: I1008 13:08:10.962516 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" event={"ID":"d9c633e5-ed09-4ded-91e4-496da6dd2b5e","Type":"ContainerStarted","Data":"2ff0c8a9e7c188c25e4cf97c05c30b471a82ec3a88c82e43ad3195cb00d4ce33"} Oct 08 13:08:10 crc kubenswrapper[4735]: I1008 13:08:10.962527 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" event={"ID":"d9c633e5-ed09-4ded-91e4-496da6dd2b5e","Type":"ContainerStarted","Data":"6f199e67d1ea040513a621ea60d7568d667a4a0b00e30dcd574a84db56fbef14"} 
Oct 08 13:08:10 crc kubenswrapper[4735]: I1008 13:08:10.962542 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" event={"ID":"d9c633e5-ed09-4ded-91e4-496da6dd2b5e","Type":"ContainerStarted","Data":"07684c4974bdcfa5ae4cdee3c55b84d12592ae9ad42bf32fcdc26323df39b6d1"} Oct 08 13:08:10 crc kubenswrapper[4735]: I1008 13:08:10.962551 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" event={"ID":"d9c633e5-ed09-4ded-91e4-496da6dd2b5e","Type":"ContainerStarted","Data":"9764079e0911f62f754cf5ec12e895079e948945d1c60f8b85e2db7e83fb0a95"} Oct 08 13:08:12 crc kubenswrapper[4735]: I1008 13:08:12.975379 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" event={"ID":"d9c633e5-ed09-4ded-91e4-496da6dd2b5e","Type":"ContainerStarted","Data":"19f83dd122db62c1836848f48d9e183dfb88a7c47571242ca367de373faeecc6"} Oct 08 13:08:13 crc kubenswrapper[4735]: I1008 13:08:13.592380 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-tjn9c"] Oct 08 13:08:13 crc kubenswrapper[4735]: I1008 13:08:13.593177 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-tjn9c" Oct 08 13:08:13 crc kubenswrapper[4735]: I1008 13:08:13.595466 4735 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-dllms" Oct 08 13:08:13 crc kubenswrapper[4735]: I1008 13:08:13.596171 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Oct 08 13:08:13 crc kubenswrapper[4735]: I1008 13:08:13.596566 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Oct 08 13:08:13 crc kubenswrapper[4735]: I1008 13:08:13.596614 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Oct 08 13:08:13 crc kubenswrapper[4735]: I1008 13:08:13.656782 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6c1a4018-c324-47f4-b8af-f23613c1257e-crc-storage\") pod \"crc-storage-crc-tjn9c\" (UID: \"6c1a4018-c324-47f4-b8af-f23613c1257e\") " pod="crc-storage/crc-storage-crc-tjn9c" Oct 08 13:08:13 crc kubenswrapper[4735]: I1008 13:08:13.656855 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4zkz\" (UniqueName: \"kubernetes.io/projected/6c1a4018-c324-47f4-b8af-f23613c1257e-kube-api-access-d4zkz\") pod \"crc-storage-crc-tjn9c\" (UID: \"6c1a4018-c324-47f4-b8af-f23613c1257e\") " pod="crc-storage/crc-storage-crc-tjn9c" Oct 08 13:08:13 crc kubenswrapper[4735]: I1008 13:08:13.656897 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6c1a4018-c324-47f4-b8af-f23613c1257e-node-mnt\") pod \"crc-storage-crc-tjn9c\" (UID: \"6c1a4018-c324-47f4-b8af-f23613c1257e\") " pod="crc-storage/crc-storage-crc-tjn9c" Oct 08 13:08:13 crc kubenswrapper[4735]: I1008 13:08:13.758102 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6c1a4018-c324-47f4-b8af-f23613c1257e-crc-storage\") pod \"crc-storage-crc-tjn9c\" (UID: \"6c1a4018-c324-47f4-b8af-f23613c1257e\") " pod="crc-storage/crc-storage-crc-tjn9c" Oct 08 13:08:13 crc kubenswrapper[4735]: I1008 13:08:13.758221 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4zkz\" (UniqueName: \"kubernetes.io/projected/6c1a4018-c324-47f4-b8af-f23613c1257e-kube-api-access-d4zkz\") pod \"crc-storage-crc-tjn9c\" (UID: \"6c1a4018-c324-47f4-b8af-f23613c1257e\") " pod="crc-storage/crc-storage-crc-tjn9c" Oct 08 13:08:13 crc kubenswrapper[4735]: I1008 13:08:13.758312 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6c1a4018-c324-47f4-b8af-f23613c1257e-node-mnt\") pod \"crc-storage-crc-tjn9c\" (UID: \"6c1a4018-c324-47f4-b8af-f23613c1257e\") " pod="crc-storage/crc-storage-crc-tjn9c" Oct 08 13:08:13 crc kubenswrapper[4735]: I1008 13:08:13.758714 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6c1a4018-c324-47f4-b8af-f23613c1257e-node-mnt\") pod \"crc-storage-crc-tjn9c\" (UID: \"6c1a4018-c324-47f4-b8af-f23613c1257e\") " pod="crc-storage/crc-storage-crc-tjn9c" Oct 08 13:08:13 crc kubenswrapper[4735]: I1008 13:08:13.759260 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6c1a4018-c324-47f4-b8af-f23613c1257e-crc-storage\") pod \"crc-storage-crc-tjn9c\" (UID: \"6c1a4018-c324-47f4-b8af-f23613c1257e\") " pod="crc-storage/crc-storage-crc-tjn9c" Oct 08 13:08:13 crc kubenswrapper[4735]: I1008 13:08:13.778726 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4zkz\" (UniqueName: 
\"kubernetes.io/projected/6c1a4018-c324-47f4-b8af-f23613c1257e-kube-api-access-d4zkz\") pod \"crc-storage-crc-tjn9c\" (UID: \"6c1a4018-c324-47f4-b8af-f23613c1257e\") " pod="crc-storage/crc-storage-crc-tjn9c" Oct 08 13:08:13 crc kubenswrapper[4735]: I1008 13:08:13.910410 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-tjn9c" Oct 08 13:08:13 crc kubenswrapper[4735]: E1008 13:08:13.935378 4735 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-tjn9c_crc-storage_6c1a4018-c324-47f4-b8af-f23613c1257e_0(55fd6b977168d1d41ef1394c1c723561c0b2296ff76bb74e929d3122535cd9ee): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 08 13:08:13 crc kubenswrapper[4735]: E1008 13:08:13.935533 4735 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-tjn9c_crc-storage_6c1a4018-c324-47f4-b8af-f23613c1257e_0(55fd6b977168d1d41ef1394c1c723561c0b2296ff76bb74e929d3122535cd9ee): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-tjn9c" Oct 08 13:08:13 crc kubenswrapper[4735]: E1008 13:08:13.935590 4735 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-tjn9c_crc-storage_6c1a4018-c324-47f4-b8af-f23613c1257e_0(55fd6b977168d1d41ef1394c1c723561c0b2296ff76bb74e929d3122535cd9ee): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-tjn9c" Oct 08 13:08:13 crc kubenswrapper[4735]: E1008 13:08:13.935680 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-tjn9c_crc-storage(6c1a4018-c324-47f4-b8af-f23613c1257e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-tjn9c_crc-storage(6c1a4018-c324-47f4-b8af-f23613c1257e)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-tjn9c_crc-storage_6c1a4018-c324-47f4-b8af-f23613c1257e_0(55fd6b977168d1d41ef1394c1c723561c0b2296ff76bb74e929d3122535cd9ee): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-tjn9c" podUID="6c1a4018-c324-47f4-b8af-f23613c1257e" Oct 08 13:08:15 crc kubenswrapper[4735]: I1008 13:08:15.814203 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-tjn9c"] Oct 08 13:08:15 crc kubenswrapper[4735]: I1008 13:08:15.814926 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-tjn9c" Oct 08 13:08:15 crc kubenswrapper[4735]: I1008 13:08:15.815385 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-tjn9c" Oct 08 13:08:15 crc kubenswrapper[4735]: E1008 13:08:15.845807 4735 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-tjn9c_crc-storage_6c1a4018-c324-47f4-b8af-f23613c1257e_0(6b96f7eca0c6a7bdfcb701e75b5629d4268d3466f6bdc9b000a1a2f2c16acd4c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Oct 08 13:08:15 crc kubenswrapper[4735]: E1008 13:08:15.845912 4735 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-tjn9c_crc-storage_6c1a4018-c324-47f4-b8af-f23613c1257e_0(6b96f7eca0c6a7bdfcb701e75b5629d4268d3466f6bdc9b000a1a2f2c16acd4c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-tjn9c" Oct 08 13:08:15 crc kubenswrapper[4735]: E1008 13:08:15.845948 4735 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-tjn9c_crc-storage_6c1a4018-c324-47f4-b8af-f23613c1257e_0(6b96f7eca0c6a7bdfcb701e75b5629d4268d3466f6bdc9b000a1a2f2c16acd4c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-tjn9c" Oct 08 13:08:15 crc kubenswrapper[4735]: E1008 13:08:15.846009 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-tjn9c_crc-storage(6c1a4018-c324-47f4-b8af-f23613c1257e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-tjn9c_crc-storage(6c1a4018-c324-47f4-b8af-f23613c1257e)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-tjn9c_crc-storage_6c1a4018-c324-47f4-b8af-f23613c1257e_0(6b96f7eca0c6a7bdfcb701e75b5629d4268d3466f6bdc9b000a1a2f2c16acd4c): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="crc-storage/crc-storage-crc-tjn9c" podUID="6c1a4018-c324-47f4-b8af-f23613c1257e" Oct 08 13:08:15 crc kubenswrapper[4735]: I1008 13:08:15.995220 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" event={"ID":"d9c633e5-ed09-4ded-91e4-496da6dd2b5e","Type":"ContainerStarted","Data":"3db3329d2a1840d2dc59798b4b2f6df076c3920eb6cce2e082b1e762348b640c"} Oct 08 13:08:15 crc kubenswrapper[4735]: I1008 13:08:15.995500 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" Oct 08 13:08:15 crc kubenswrapper[4735]: I1008 13:08:15.995525 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" Oct 08 13:08:16 crc kubenswrapper[4735]: I1008 13:08:16.028114 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" Oct 08 13:08:16 crc kubenswrapper[4735]: I1008 13:08:16.059142 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" podStartSLOduration=8.05912257 podStartE2EDuration="8.05912257s" podCreationTimestamp="2025-10-08 13:08:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 13:08:16.027816549 +0000 UTC m=+745.836121093" watchObservedRunningTime="2025-10-08 13:08:16.05912257 +0000 UTC m=+745.867427114" Oct 08 13:08:16 crc kubenswrapper[4735]: I1008 13:08:16.998986 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" Oct 08 13:08:17 crc kubenswrapper[4735]: I1008 13:08:17.023219 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" Oct 08 13:08:19 crc kubenswrapper[4735]: I1008 13:08:19.042508 
4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-8rqp7"] Oct 08 13:08:19 crc kubenswrapper[4735]: I1008 13:08:19.042957 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-8rqp7" podUID="15a08015-bcc0-4576-8f54-70e5693469a4" containerName="controller-manager" containerID="cri-o://f05ac887033a6c69e63fce087e28341a3d6f244d07560798ac75f2a609a63eff" gracePeriod=30 Oct 08 13:08:19 crc kubenswrapper[4735]: I1008 13:08:19.177441 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rgtx4"] Oct 08 13:08:19 crc kubenswrapper[4735]: I1008 13:08:19.177691 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rgtx4" podUID="49e6e79f-93b3-453f-83d1-2afc40a4637c" containerName="route-controller-manager" containerID="cri-o://f7568c93b35a26c2547d77c4962c3ad18071544a35b177809be0f73830ea8077" gracePeriod=30 Oct 08 13:08:19 crc kubenswrapper[4735]: E1008 13:08:19.275171 4735 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15a08015_bcc0_4576_8f54_70e5693469a4.slice/crio-conmon-f05ac887033a6c69e63fce087e28341a3d6f244d07560798ac75f2a609a63eff.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49e6e79f_93b3_453f_83d1_2afc40a4637c.slice/crio-f7568c93b35a26c2547d77c4962c3ad18071544a35b177809be0f73830ea8077.scope\": RecentStats: unable to find data in memory cache]" Oct 08 13:08:19 crc kubenswrapper[4735]: I1008 13:08:19.584645 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-8rqp7" Oct 08 13:08:19 crc kubenswrapper[4735]: I1008 13:08:19.733745 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/15a08015-bcc0-4576-8f54-70e5693469a4-proxy-ca-bundles\") pod \"15a08015-bcc0-4576-8f54-70e5693469a4\" (UID: \"15a08015-bcc0-4576-8f54-70e5693469a4\") " Oct 08 13:08:19 crc kubenswrapper[4735]: I1008 13:08:19.733811 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cf88x\" (UniqueName: \"kubernetes.io/projected/15a08015-bcc0-4576-8f54-70e5693469a4-kube-api-access-cf88x\") pod \"15a08015-bcc0-4576-8f54-70e5693469a4\" (UID: \"15a08015-bcc0-4576-8f54-70e5693469a4\") " Oct 08 13:08:19 crc kubenswrapper[4735]: I1008 13:08:19.733846 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15a08015-bcc0-4576-8f54-70e5693469a4-config\") pod \"15a08015-bcc0-4576-8f54-70e5693469a4\" (UID: \"15a08015-bcc0-4576-8f54-70e5693469a4\") " Oct 08 13:08:19 crc kubenswrapper[4735]: I1008 13:08:19.733862 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/15a08015-bcc0-4576-8f54-70e5693469a4-client-ca\") pod \"15a08015-bcc0-4576-8f54-70e5693469a4\" (UID: \"15a08015-bcc0-4576-8f54-70e5693469a4\") " Oct 08 13:08:19 crc kubenswrapper[4735]: I1008 13:08:19.733894 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/15a08015-bcc0-4576-8f54-70e5693469a4-serving-cert\") pod \"15a08015-bcc0-4576-8f54-70e5693469a4\" (UID: \"15a08015-bcc0-4576-8f54-70e5693469a4\") " Oct 08 13:08:19 crc kubenswrapper[4735]: I1008 13:08:19.735690 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/15a08015-bcc0-4576-8f54-70e5693469a4-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "15a08015-bcc0-4576-8f54-70e5693469a4" (UID: "15a08015-bcc0-4576-8f54-70e5693469a4"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:08:19 crc kubenswrapper[4735]: I1008 13:08:19.735722 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15a08015-bcc0-4576-8f54-70e5693469a4-config" (OuterVolumeSpecName: "config") pod "15a08015-bcc0-4576-8f54-70e5693469a4" (UID: "15a08015-bcc0-4576-8f54-70e5693469a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:08:19 crc kubenswrapper[4735]: I1008 13:08:19.735772 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15a08015-bcc0-4576-8f54-70e5693469a4-client-ca" (OuterVolumeSpecName: "client-ca") pod "15a08015-bcc0-4576-8f54-70e5693469a4" (UID: "15a08015-bcc0-4576-8f54-70e5693469a4"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:08:19 crc kubenswrapper[4735]: I1008 13:08:19.742072 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15a08015-bcc0-4576-8f54-70e5693469a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "15a08015-bcc0-4576-8f54-70e5693469a4" (UID: "15a08015-bcc0-4576-8f54-70e5693469a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:08:19 crc kubenswrapper[4735]: I1008 13:08:19.743234 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15a08015-bcc0-4576-8f54-70e5693469a4-kube-api-access-cf88x" (OuterVolumeSpecName: "kube-api-access-cf88x") pod "15a08015-bcc0-4576-8f54-70e5693469a4" (UID: "15a08015-bcc0-4576-8f54-70e5693469a4"). InnerVolumeSpecName "kube-api-access-cf88x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:08:19 crc kubenswrapper[4735]: I1008 13:08:19.835695 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cf88x\" (UniqueName: \"kubernetes.io/projected/15a08015-bcc0-4576-8f54-70e5693469a4-kube-api-access-cf88x\") on node \"crc\" DevicePath \"\"" Oct 08 13:08:19 crc kubenswrapper[4735]: I1008 13:08:19.836010 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15a08015-bcc0-4576-8f54-70e5693469a4-config\") on node \"crc\" DevicePath \"\"" Oct 08 13:08:19 crc kubenswrapper[4735]: I1008 13:08:19.836088 4735 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/15a08015-bcc0-4576-8f54-70e5693469a4-client-ca\") on node \"crc\" DevicePath \"\"" Oct 08 13:08:19 crc kubenswrapper[4735]: I1008 13:08:19.836152 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/15a08015-bcc0-4576-8f54-70e5693469a4-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 13:08:19 crc kubenswrapper[4735]: I1008 13:08:19.836226 4735 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/15a08015-bcc0-4576-8f54-70e5693469a4-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.028248 4735 generic.go:334] "Generic (PLEG): container finished" podID="49e6e79f-93b3-453f-83d1-2afc40a4637c" containerID="f7568c93b35a26c2547d77c4962c3ad18071544a35b177809be0f73830ea8077" exitCode=0 Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.028354 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rgtx4" event={"ID":"49e6e79f-93b3-453f-83d1-2afc40a4637c","Type":"ContainerDied","Data":"f7568c93b35a26c2547d77c4962c3ad18071544a35b177809be0f73830ea8077"} Oct 08 
13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.035254 4735 generic.go:334] "Generic (PLEG): container finished" podID="15a08015-bcc0-4576-8f54-70e5693469a4" containerID="f05ac887033a6c69e63fce087e28341a3d6f244d07560798ac75f2a609a63eff" exitCode=0 Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.035292 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-8rqp7" event={"ID":"15a08015-bcc0-4576-8f54-70e5693469a4","Type":"ContainerDied","Data":"f05ac887033a6c69e63fce087e28341a3d6f244d07560798ac75f2a609a63eff"} Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.035317 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-8rqp7" event={"ID":"15a08015-bcc0-4576-8f54-70e5693469a4","Type":"ContainerDied","Data":"23d4121b6bd6638ba3fe161ed79d7545652bbec07e722d15a16b1b8fb4f83c37"} Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.035334 4735 scope.go:117] "RemoveContainer" containerID="f05ac887033a6c69e63fce087e28341a3d6f244d07560798ac75f2a609a63eff" Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.035450 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-8rqp7" Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.056729 4735 scope.go:117] "RemoveContainer" containerID="f05ac887033a6c69e63fce087e28341a3d6f244d07560798ac75f2a609a63eff" Oct 08 13:08:20 crc kubenswrapper[4735]: E1008 13:08:20.057913 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f05ac887033a6c69e63fce087e28341a3d6f244d07560798ac75f2a609a63eff\": container with ID starting with f05ac887033a6c69e63fce087e28341a3d6f244d07560798ac75f2a609a63eff not found: ID does not exist" containerID="f05ac887033a6c69e63fce087e28341a3d6f244d07560798ac75f2a609a63eff" Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.057956 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f05ac887033a6c69e63fce087e28341a3d6f244d07560798ac75f2a609a63eff"} err="failed to get container status \"f05ac887033a6c69e63fce087e28341a3d6f244d07560798ac75f2a609a63eff\": rpc error: code = NotFound desc = could not find container \"f05ac887033a6c69e63fce087e28341a3d6f244d07560798ac75f2a609a63eff\": container with ID starting with f05ac887033a6c69e63fce087e28341a3d6f244d07560798ac75f2a609a63eff not found: ID does not exist" Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.068606 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-8rqp7"] Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.070912 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-8rqp7"] Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.104393 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rgtx4" Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.240541 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/49e6e79f-93b3-453f-83d1-2afc40a4637c-client-ca\") pod \"49e6e79f-93b3-453f-83d1-2afc40a4637c\" (UID: \"49e6e79f-93b3-453f-83d1-2afc40a4637c\") " Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.240652 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49e6e79f-93b3-453f-83d1-2afc40a4637c-config\") pod \"49e6e79f-93b3-453f-83d1-2afc40a4637c\" (UID: \"49e6e79f-93b3-453f-83d1-2afc40a4637c\") " Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.240685 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49e6e79f-93b3-453f-83d1-2afc40a4637c-serving-cert\") pod \"49e6e79f-93b3-453f-83d1-2afc40a4637c\" (UID: \"49e6e79f-93b3-453f-83d1-2afc40a4637c\") " Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.240735 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spz7k\" (UniqueName: \"kubernetes.io/projected/49e6e79f-93b3-453f-83d1-2afc40a4637c-kube-api-access-spz7k\") pod \"49e6e79f-93b3-453f-83d1-2afc40a4637c\" (UID: \"49e6e79f-93b3-453f-83d1-2afc40a4637c\") " Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.241255 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49e6e79f-93b3-453f-83d1-2afc40a4637c-client-ca" (OuterVolumeSpecName: "client-ca") pod "49e6e79f-93b3-453f-83d1-2afc40a4637c" (UID: "49e6e79f-93b3-453f-83d1-2afc40a4637c"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.241493 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49e6e79f-93b3-453f-83d1-2afc40a4637c-config" (OuterVolumeSpecName: "config") pod "49e6e79f-93b3-453f-83d1-2afc40a4637c" (UID: "49e6e79f-93b3-453f-83d1-2afc40a4637c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.244239 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49e6e79f-93b3-453f-83d1-2afc40a4637c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "49e6e79f-93b3-453f-83d1-2afc40a4637c" (UID: "49e6e79f-93b3-453f-83d1-2afc40a4637c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.244265 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49e6e79f-93b3-453f-83d1-2afc40a4637c-kube-api-access-spz7k" (OuterVolumeSpecName: "kube-api-access-spz7k") pod "49e6e79f-93b3-453f-83d1-2afc40a4637c" (UID: "49e6e79f-93b3-453f-83d1-2afc40a4637c"). InnerVolumeSpecName "kube-api-access-spz7k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.342411 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-spz7k\" (UniqueName: \"kubernetes.io/projected/49e6e79f-93b3-453f-83d1-2afc40a4637c-kube-api-access-spz7k\") on node \"crc\" DevicePath \"\"" Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.342742 4735 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/49e6e79f-93b3-453f-83d1-2afc40a4637c-client-ca\") on node \"crc\" DevicePath \"\"" Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.342753 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49e6e79f-93b3-453f-83d1-2afc40a4637c-config\") on node \"crc\" DevicePath \"\"" Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.342763 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49e6e79f-93b3-453f-83d1-2afc40a4637c-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.449285 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58cb56ffbd-qppvv"] Oct 08 13:08:20 crc kubenswrapper[4735]: E1008 13:08:20.449560 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15a08015-bcc0-4576-8f54-70e5693469a4" containerName="controller-manager" Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.449577 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="15a08015-bcc0-4576-8f54-70e5693469a4" containerName="controller-manager" Oct 08 13:08:20 crc kubenswrapper[4735]: E1008 13:08:20.449591 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49e6e79f-93b3-453f-83d1-2afc40a4637c" containerName="route-controller-manager" Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.449602 4735 
state_mem.go:107] "Deleted CPUSet assignment" podUID="49e6e79f-93b3-453f-83d1-2afc40a4637c" containerName="route-controller-manager" Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.449729 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="49e6e79f-93b3-453f-83d1-2afc40a4637c" containerName="route-controller-manager" Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.449745 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="15a08015-bcc0-4576-8f54-70e5693469a4" containerName="controller-manager" Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.450106 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-58cb56ffbd-qppvv" Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.454283 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5fdfb999c4-2pvlz"] Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.454915 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5fdfb999c4-2pvlz" Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.457085 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.457490 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.457688 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.457717 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.457726 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.460909 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.465061 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5fdfb999c4-2pvlz"] Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.468098 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.468180 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58cb56ffbd-qppvv"] Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.545601 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/80b9e44c-6775-4226-abdf-c19cd3d6ef28-serving-cert\") pod \"route-controller-manager-58cb56ffbd-qppvv\" (UID: \"80b9e44c-6775-4226-abdf-c19cd3d6ef28\") " pod="openshift-route-controller-manager/route-controller-manager-58cb56ffbd-qppvv" Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.545684 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k26wh\" (UniqueName: \"kubernetes.io/projected/80b9e44c-6775-4226-abdf-c19cd3d6ef28-kube-api-access-k26wh\") pod \"route-controller-manager-58cb56ffbd-qppvv\" (UID: \"80b9e44c-6775-4226-abdf-c19cd3d6ef28\") " pod="openshift-route-controller-manager/route-controller-manager-58cb56ffbd-qppvv" Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.545834 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80b9e44c-6775-4226-abdf-c19cd3d6ef28-config\") pod \"route-controller-manager-58cb56ffbd-qppvv\" (UID: \"80b9e44c-6775-4226-abdf-c19cd3d6ef28\") " pod="openshift-route-controller-manager/route-controller-manager-58cb56ffbd-qppvv" Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.545928 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/80b9e44c-6775-4226-abdf-c19cd3d6ef28-client-ca\") pod \"route-controller-manager-58cb56ffbd-qppvv\" (UID: \"80b9e44c-6775-4226-abdf-c19cd3d6ef28\") " pod="openshift-route-controller-manager/route-controller-manager-58cb56ffbd-qppvv" Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.647602 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80b9e44c-6775-4226-abdf-c19cd3d6ef28-serving-cert\") pod \"route-controller-manager-58cb56ffbd-qppvv\" (UID: \"80b9e44c-6775-4226-abdf-c19cd3d6ef28\") " 
pod="openshift-route-controller-manager/route-controller-manager-58cb56ffbd-qppvv" Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.647654 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0189b6bb-e01d-4c9b-95e4-82e9e64f55f4-proxy-ca-bundles\") pod \"controller-manager-5fdfb999c4-2pvlz\" (UID: \"0189b6bb-e01d-4c9b-95e4-82e9e64f55f4\") " pod="openshift-controller-manager/controller-manager-5fdfb999c4-2pvlz" Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.647680 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k26wh\" (UniqueName: \"kubernetes.io/projected/80b9e44c-6775-4226-abdf-c19cd3d6ef28-kube-api-access-k26wh\") pod \"route-controller-manager-58cb56ffbd-qppvv\" (UID: \"80b9e44c-6775-4226-abdf-c19cd3d6ef28\") " pod="openshift-route-controller-manager/route-controller-manager-58cb56ffbd-qppvv" Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.647713 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0189b6bb-e01d-4c9b-95e4-82e9e64f55f4-serving-cert\") pod \"controller-manager-5fdfb999c4-2pvlz\" (UID: \"0189b6bb-e01d-4c9b-95e4-82e9e64f55f4\") " pod="openshift-controller-manager/controller-manager-5fdfb999c4-2pvlz" Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.647745 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjf6d\" (UniqueName: \"kubernetes.io/projected/0189b6bb-e01d-4c9b-95e4-82e9e64f55f4-kube-api-access-tjf6d\") pod \"controller-manager-5fdfb999c4-2pvlz\" (UID: \"0189b6bb-e01d-4c9b-95e4-82e9e64f55f4\") " pod="openshift-controller-manager/controller-manager-5fdfb999c4-2pvlz" Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.647777 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0189b6bb-e01d-4c9b-95e4-82e9e64f55f4-config\") pod \"controller-manager-5fdfb999c4-2pvlz\" (UID: \"0189b6bb-e01d-4c9b-95e4-82e9e64f55f4\") " pod="openshift-controller-manager/controller-manager-5fdfb999c4-2pvlz" Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.647810 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80b9e44c-6775-4226-abdf-c19cd3d6ef28-config\") pod \"route-controller-manager-58cb56ffbd-qppvv\" (UID: \"80b9e44c-6775-4226-abdf-c19cd3d6ef28\") " pod="openshift-route-controller-manager/route-controller-manager-58cb56ffbd-qppvv" Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.647838 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0189b6bb-e01d-4c9b-95e4-82e9e64f55f4-client-ca\") pod \"controller-manager-5fdfb999c4-2pvlz\" (UID: \"0189b6bb-e01d-4c9b-95e4-82e9e64f55f4\") " pod="openshift-controller-manager/controller-manager-5fdfb999c4-2pvlz" Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.647865 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/80b9e44c-6775-4226-abdf-c19cd3d6ef28-client-ca\") pod \"route-controller-manager-58cb56ffbd-qppvv\" (UID: \"80b9e44c-6775-4226-abdf-c19cd3d6ef28\") " pod="openshift-route-controller-manager/route-controller-manager-58cb56ffbd-qppvv" Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.648855 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/80b9e44c-6775-4226-abdf-c19cd3d6ef28-client-ca\") pod \"route-controller-manager-58cb56ffbd-qppvv\" (UID: \"80b9e44c-6775-4226-abdf-c19cd3d6ef28\") " 
pod="openshift-route-controller-manager/route-controller-manager-58cb56ffbd-qppvv" Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.649350 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80b9e44c-6775-4226-abdf-c19cd3d6ef28-config\") pod \"route-controller-manager-58cb56ffbd-qppvv\" (UID: \"80b9e44c-6775-4226-abdf-c19cd3d6ef28\") " pod="openshift-route-controller-manager/route-controller-manager-58cb56ffbd-qppvv" Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.650884 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80b9e44c-6775-4226-abdf-c19cd3d6ef28-serving-cert\") pod \"route-controller-manager-58cb56ffbd-qppvv\" (UID: \"80b9e44c-6775-4226-abdf-c19cd3d6ef28\") " pod="openshift-route-controller-manager/route-controller-manager-58cb56ffbd-qppvv" Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.666957 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k26wh\" (UniqueName: \"kubernetes.io/projected/80b9e44c-6775-4226-abdf-c19cd3d6ef28-kube-api-access-k26wh\") pod \"route-controller-manager-58cb56ffbd-qppvv\" (UID: \"80b9e44c-6775-4226-abdf-c19cd3d6ef28\") " pod="openshift-route-controller-manager/route-controller-manager-58cb56ffbd-qppvv" Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.748723 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0189b6bb-e01d-4c9b-95e4-82e9e64f55f4-proxy-ca-bundles\") pod \"controller-manager-5fdfb999c4-2pvlz\" (UID: \"0189b6bb-e01d-4c9b-95e4-82e9e64f55f4\") " pod="openshift-controller-manager/controller-manager-5fdfb999c4-2pvlz" Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.748776 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0189b6bb-e01d-4c9b-95e4-82e9e64f55f4-serving-cert\") pod \"controller-manager-5fdfb999c4-2pvlz\" (UID: \"0189b6bb-e01d-4c9b-95e4-82e9e64f55f4\") " pod="openshift-controller-manager/controller-manager-5fdfb999c4-2pvlz" Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.748803 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjf6d\" (UniqueName: \"kubernetes.io/projected/0189b6bb-e01d-4c9b-95e4-82e9e64f55f4-kube-api-access-tjf6d\") pod \"controller-manager-5fdfb999c4-2pvlz\" (UID: \"0189b6bb-e01d-4c9b-95e4-82e9e64f55f4\") " pod="openshift-controller-manager/controller-manager-5fdfb999c4-2pvlz" Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.748844 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0189b6bb-e01d-4c9b-95e4-82e9e64f55f4-config\") pod \"controller-manager-5fdfb999c4-2pvlz\" (UID: \"0189b6bb-e01d-4c9b-95e4-82e9e64f55f4\") " pod="openshift-controller-manager/controller-manager-5fdfb999c4-2pvlz" Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.748872 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0189b6bb-e01d-4c9b-95e4-82e9e64f55f4-client-ca\") pod \"controller-manager-5fdfb999c4-2pvlz\" (UID: \"0189b6bb-e01d-4c9b-95e4-82e9e64f55f4\") " pod="openshift-controller-manager/controller-manager-5fdfb999c4-2pvlz" Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.749784 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0189b6bb-e01d-4c9b-95e4-82e9e64f55f4-client-ca\") pod \"controller-manager-5fdfb999c4-2pvlz\" (UID: \"0189b6bb-e01d-4c9b-95e4-82e9e64f55f4\") " pod="openshift-controller-manager/controller-manager-5fdfb999c4-2pvlz" Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.750669 4735 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0189b6bb-e01d-4c9b-95e4-82e9e64f55f4-config\") pod \"controller-manager-5fdfb999c4-2pvlz\" (UID: \"0189b6bb-e01d-4c9b-95e4-82e9e64f55f4\") " pod="openshift-controller-manager/controller-manager-5fdfb999c4-2pvlz" Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.751183 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0189b6bb-e01d-4c9b-95e4-82e9e64f55f4-proxy-ca-bundles\") pod \"controller-manager-5fdfb999c4-2pvlz\" (UID: \"0189b6bb-e01d-4c9b-95e4-82e9e64f55f4\") " pod="openshift-controller-manager/controller-manager-5fdfb999c4-2pvlz" Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.752524 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0189b6bb-e01d-4c9b-95e4-82e9e64f55f4-serving-cert\") pod \"controller-manager-5fdfb999c4-2pvlz\" (UID: \"0189b6bb-e01d-4c9b-95e4-82e9e64f55f4\") " pod="openshift-controller-manager/controller-manager-5fdfb999c4-2pvlz" Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.765442 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjf6d\" (UniqueName: \"kubernetes.io/projected/0189b6bb-e01d-4c9b-95e4-82e9e64f55f4-kube-api-access-tjf6d\") pod \"controller-manager-5fdfb999c4-2pvlz\" (UID: \"0189b6bb-e01d-4c9b-95e4-82e9e64f55f4\") " pod="openshift-controller-manager/controller-manager-5fdfb999c4-2pvlz" Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.771412 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-58cb56ffbd-qppvv" Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.781364 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5fdfb999c4-2pvlz" Oct 08 13:08:20 crc kubenswrapper[4735]: I1008 13:08:20.903816 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15a08015-bcc0-4576-8f54-70e5693469a4" path="/var/lib/kubelet/pods/15a08015-bcc0-4576-8f54-70e5693469a4/volumes" Oct 08 13:08:21 crc kubenswrapper[4735]: I1008 13:08:21.042172 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rgtx4" event={"ID":"49e6e79f-93b3-453f-83d1-2afc40a4637c","Type":"ContainerDied","Data":"c416ddbd68459bea0c1a5952177059ec088806ee43e6f1eefa8bd94d8e1c04f3"} Oct 08 13:08:21 crc kubenswrapper[4735]: I1008 13:08:21.042207 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rgtx4" Oct 08 13:08:21 crc kubenswrapper[4735]: I1008 13:08:21.042636 4735 scope.go:117] "RemoveContainer" containerID="f7568c93b35a26c2547d77c4962c3ad18071544a35b177809be0f73830ea8077" Oct 08 13:08:21 crc kubenswrapper[4735]: I1008 13:08:21.061880 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rgtx4"] Oct 08 13:08:21 crc kubenswrapper[4735]: I1008 13:08:21.066331 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rgtx4"] Oct 08 13:08:21 crc kubenswrapper[4735]: I1008 13:08:21.163262 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58cb56ffbd-qppvv"] Oct 08 13:08:21 crc kubenswrapper[4735]: I1008 13:08:21.167833 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5fdfb999c4-2pvlz"] Oct 08 13:08:22 crc kubenswrapper[4735]: I1008 13:08:22.050351 4735 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5fdfb999c4-2pvlz" event={"ID":"0189b6bb-e01d-4c9b-95e4-82e9e64f55f4","Type":"ContainerStarted","Data":"a087fca1b2ca22098cb1f07b97bc23fd59df3042e6e54512aba41565c25521f4"} Oct 08 13:08:22 crc kubenswrapper[4735]: I1008 13:08:22.051164 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5fdfb999c4-2pvlz" Oct 08 13:08:22 crc kubenswrapper[4735]: I1008 13:08:22.051277 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5fdfb999c4-2pvlz" event={"ID":"0189b6bb-e01d-4c9b-95e4-82e9e64f55f4","Type":"ContainerStarted","Data":"90f016e29023051b896b8b54464123353e7c7d1cb169cca9ef610dc511ef6caa"} Oct 08 13:08:22 crc kubenswrapper[4735]: I1008 13:08:22.053022 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-58cb56ffbd-qppvv" event={"ID":"80b9e44c-6775-4226-abdf-c19cd3d6ef28","Type":"ContainerStarted","Data":"b889ce1a5e88771217cc40e277d94f2c58a12c19e93ee8f03bdbe588b29b759e"} Oct 08 13:08:22 crc kubenswrapper[4735]: I1008 13:08:22.053051 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-58cb56ffbd-qppvv" event={"ID":"80b9e44c-6775-4226-abdf-c19cd3d6ef28","Type":"ContainerStarted","Data":"93891717ee4ca9cf3236e8e50edb5c6f8e3ac761e674ade45d2c8b11bc6ec980"} Oct 08 13:08:22 crc kubenswrapper[4735]: I1008 13:08:22.053558 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-58cb56ffbd-qppvv" Oct 08 13:08:22 crc kubenswrapper[4735]: I1008 13:08:22.057126 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5fdfb999c4-2pvlz" Oct 08 13:08:22 crc kubenswrapper[4735]: I1008 13:08:22.057953 4735 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-58cb56ffbd-qppvv" Oct 08 13:08:22 crc kubenswrapper[4735]: I1008 13:08:22.067574 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5fdfb999c4-2pvlz" podStartSLOduration=3.067558788 podStartE2EDuration="3.067558788s" podCreationTimestamp="2025-10-08 13:08:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 13:08:22.065820771 +0000 UTC m=+751.874125315" watchObservedRunningTime="2025-10-08 13:08:22.067558788 +0000 UTC m=+751.875863332" Oct 08 13:08:22 crc kubenswrapper[4735]: I1008 13:08:22.097630 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-58cb56ffbd-qppvv" podStartSLOduration=3.097610146 podStartE2EDuration="3.097610146s" podCreationTimestamp="2025-10-08 13:08:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 13:08:22.095071107 +0000 UTC m=+751.903375651" watchObservedRunningTime="2025-10-08 13:08:22.097610146 +0000 UTC m=+751.905914680" Oct 08 13:08:22 crc kubenswrapper[4735]: I1008 13:08:22.901381 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49e6e79f-93b3-453f-83d1-2afc40a4637c" path="/var/lib/kubelet/pods/49e6e79f-93b3-453f-83d1-2afc40a4637c/volumes" Oct 08 13:08:25 crc kubenswrapper[4735]: I1008 13:08:25.484673 4735 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 08 13:08:26 crc kubenswrapper[4735]: I1008 13:08:26.894110 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-tjn9c" Oct 08 13:08:26 crc kubenswrapper[4735]: I1008 13:08:26.894436 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-tjn9c" Oct 08 13:08:27 crc kubenswrapper[4735]: I1008 13:08:27.289352 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-tjn9c"] Oct 08 13:08:27 crc kubenswrapper[4735]: W1008 13:08:27.298649 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c1a4018_c324_47f4_b8af_f23613c1257e.slice/crio-187ff429438c985aa771e6a7fa022fe47b1897911533071f2c6d91155760e4fd WatchSource:0}: Error finding container 187ff429438c985aa771e6a7fa022fe47b1897911533071f2c6d91155760e4fd: Status 404 returned error can't find the container with id 187ff429438c985aa771e6a7fa022fe47b1897911533071f2c6d91155760e4fd Oct 08 13:08:27 crc kubenswrapper[4735]: I1008 13:08:27.301074 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 08 13:08:28 crc kubenswrapper[4735]: I1008 13:08:28.084702 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-tjn9c" event={"ID":"6c1a4018-c324-47f4-b8af-f23613c1257e","Type":"ContainerStarted","Data":"187ff429438c985aa771e6a7fa022fe47b1897911533071f2c6d91155760e4fd"} Oct 08 13:08:29 crc kubenswrapper[4735]: I1008 13:08:29.094307 4735 generic.go:334] "Generic (PLEG): container finished" podID="6c1a4018-c324-47f4-b8af-f23613c1257e" containerID="bfbbfc5ac340ef793cb286610eff933f0fa63c6bcc8b318d43d5bd49c977ea59" exitCode=0 Oct 08 13:08:29 crc kubenswrapper[4735]: I1008 13:08:29.094610 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-tjn9c" event={"ID":"6c1a4018-c324-47f4-b8af-f23613c1257e","Type":"ContainerDied","Data":"bfbbfc5ac340ef793cb286610eff933f0fa63c6bcc8b318d43d5bd49c977ea59"} Oct 08 13:08:30 
crc kubenswrapper[4735]: I1008 13:08:30.460781 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-tjn9c" Oct 08 13:08:30 crc kubenswrapper[4735]: I1008 13:08:30.566122 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6c1a4018-c324-47f4-b8af-f23613c1257e-crc-storage\") pod \"6c1a4018-c324-47f4-b8af-f23613c1257e\" (UID: \"6c1a4018-c324-47f4-b8af-f23613c1257e\") " Oct 08 13:08:30 crc kubenswrapper[4735]: I1008 13:08:30.566203 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4zkz\" (UniqueName: \"kubernetes.io/projected/6c1a4018-c324-47f4-b8af-f23613c1257e-kube-api-access-d4zkz\") pod \"6c1a4018-c324-47f4-b8af-f23613c1257e\" (UID: \"6c1a4018-c324-47f4-b8af-f23613c1257e\") " Oct 08 13:08:30 crc kubenswrapper[4735]: I1008 13:08:30.566250 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6c1a4018-c324-47f4-b8af-f23613c1257e-node-mnt\") pod \"6c1a4018-c324-47f4-b8af-f23613c1257e\" (UID: \"6c1a4018-c324-47f4-b8af-f23613c1257e\") " Oct 08 13:08:30 crc kubenswrapper[4735]: I1008 13:08:30.566470 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6c1a4018-c324-47f4-b8af-f23613c1257e-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "6c1a4018-c324-47f4-b8af-f23613c1257e" (UID: "6c1a4018-c324-47f4-b8af-f23613c1257e"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 13:08:30 crc kubenswrapper[4735]: I1008 13:08:30.572118 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c1a4018-c324-47f4-b8af-f23613c1257e-kube-api-access-d4zkz" (OuterVolumeSpecName: "kube-api-access-d4zkz") pod "6c1a4018-c324-47f4-b8af-f23613c1257e" (UID: "6c1a4018-c324-47f4-b8af-f23613c1257e"). InnerVolumeSpecName "kube-api-access-d4zkz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:08:30 crc kubenswrapper[4735]: I1008 13:08:30.594003 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c1a4018-c324-47f4-b8af-f23613c1257e-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "6c1a4018-c324-47f4-b8af-f23613c1257e" (UID: "6c1a4018-c324-47f4-b8af-f23613c1257e"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:08:30 crc kubenswrapper[4735]: I1008 13:08:30.667731 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4zkz\" (UniqueName: \"kubernetes.io/projected/6c1a4018-c324-47f4-b8af-f23613c1257e-kube-api-access-d4zkz\") on node \"crc\" DevicePath \"\"" Oct 08 13:08:30 crc kubenswrapper[4735]: I1008 13:08:30.667762 4735 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6c1a4018-c324-47f4-b8af-f23613c1257e-node-mnt\") on node \"crc\" DevicePath \"\"" Oct 08 13:08:30 crc kubenswrapper[4735]: I1008 13:08:30.667771 4735 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6c1a4018-c324-47f4-b8af-f23613c1257e-crc-storage\") on node \"crc\" DevicePath \"\"" Oct 08 13:08:31 crc kubenswrapper[4735]: I1008 13:08:31.107990 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-tjn9c" 
event={"ID":"6c1a4018-c324-47f4-b8af-f23613c1257e","Type":"ContainerDied","Data":"187ff429438c985aa771e6a7fa022fe47b1897911533071f2c6d91155760e4fd"} Oct 08 13:08:31 crc kubenswrapper[4735]: I1008 13:08:31.108026 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="187ff429438c985aa771e6a7fa022fe47b1897911533071f2c6d91155760e4fd" Oct 08 13:08:31 crc kubenswrapper[4735]: I1008 13:08:31.108038 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-tjn9c" Oct 08 13:08:37 crc kubenswrapper[4735]: I1008 13:08:37.651801 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cz59kl"] Oct 08 13:08:37 crc kubenswrapper[4735]: E1008 13:08:37.652429 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c1a4018-c324-47f4-b8af-f23613c1257e" containerName="storage" Oct 08 13:08:37 crc kubenswrapper[4735]: I1008 13:08:37.652440 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c1a4018-c324-47f4-b8af-f23613c1257e" containerName="storage" Oct 08 13:08:37 crc kubenswrapper[4735]: I1008 13:08:37.652572 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c1a4018-c324-47f4-b8af-f23613c1257e" containerName="storage" Oct 08 13:08:37 crc kubenswrapper[4735]: I1008 13:08:37.653201 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cz59kl" Oct 08 13:08:37 crc kubenswrapper[4735]: I1008 13:08:37.655423 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 08 13:08:37 crc kubenswrapper[4735]: I1008 13:08:37.669016 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cz59kl"] Oct 08 13:08:37 crc kubenswrapper[4735]: I1008 13:08:37.755182 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5084e332-dc3f-46c3-8e9b-9c2c7795611a-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cz59kl\" (UID: \"5084e332-dc3f-46c3-8e9b-9c2c7795611a\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cz59kl" Oct 08 13:08:37 crc kubenswrapper[4735]: I1008 13:08:37.755218 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxq9t\" (UniqueName: \"kubernetes.io/projected/5084e332-dc3f-46c3-8e9b-9c2c7795611a-kube-api-access-dxq9t\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cz59kl\" (UID: \"5084e332-dc3f-46c3-8e9b-9c2c7795611a\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cz59kl" Oct 08 13:08:37 crc kubenswrapper[4735]: I1008 13:08:37.755289 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5084e332-dc3f-46c3-8e9b-9c2c7795611a-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cz59kl\" (UID: \"5084e332-dc3f-46c3-8e9b-9c2c7795611a\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cz59kl" Oct 08 13:08:37 crc kubenswrapper[4735]: 
I1008 13:08:37.856619 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5084e332-dc3f-46c3-8e9b-9c2c7795611a-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cz59kl\" (UID: \"5084e332-dc3f-46c3-8e9b-9c2c7795611a\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cz59kl" Oct 08 13:08:37 crc kubenswrapper[4735]: I1008 13:08:37.857043 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5084e332-dc3f-46c3-8e9b-9c2c7795611a-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cz59kl\" (UID: \"5084e332-dc3f-46c3-8e9b-9c2c7795611a\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cz59kl" Oct 08 13:08:37 crc kubenswrapper[4735]: I1008 13:08:37.857234 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxq9t\" (UniqueName: \"kubernetes.io/projected/5084e332-dc3f-46c3-8e9b-9c2c7795611a-kube-api-access-dxq9t\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cz59kl\" (UID: \"5084e332-dc3f-46c3-8e9b-9c2c7795611a\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cz59kl" Oct 08 13:08:37 crc kubenswrapper[4735]: I1008 13:08:37.857291 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5084e332-dc3f-46c3-8e9b-9c2c7795611a-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cz59kl\" (UID: \"5084e332-dc3f-46c3-8e9b-9c2c7795611a\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cz59kl" Oct 08 13:08:37 crc kubenswrapper[4735]: I1008 13:08:37.857945 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/5084e332-dc3f-46c3-8e9b-9c2c7795611a-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cz59kl\" (UID: \"5084e332-dc3f-46c3-8e9b-9c2c7795611a\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cz59kl" Oct 08 13:08:37 crc kubenswrapper[4735]: I1008 13:08:37.880925 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxq9t\" (UniqueName: \"kubernetes.io/projected/5084e332-dc3f-46c3-8e9b-9c2c7795611a-kube-api-access-dxq9t\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cz59kl\" (UID: \"5084e332-dc3f-46c3-8e9b-9c2c7795611a\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cz59kl" Oct 08 13:08:37 crc kubenswrapper[4735]: I1008 13:08:37.974365 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cz59kl" Oct 08 13:08:38 crc kubenswrapper[4735]: I1008 13:08:38.383117 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cz59kl"] Oct 08 13:08:39 crc kubenswrapper[4735]: I1008 13:08:39.156838 4735 generic.go:334] "Generic (PLEG): container finished" podID="5084e332-dc3f-46c3-8e9b-9c2c7795611a" containerID="cee333ff979b09d4141a6adc5c6548484755d35ed8b55dafdc0e01dc5f075648" exitCode=0 Oct 08 13:08:39 crc kubenswrapper[4735]: I1008 13:08:39.156956 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cz59kl" event={"ID":"5084e332-dc3f-46c3-8e9b-9c2c7795611a","Type":"ContainerDied","Data":"cee333ff979b09d4141a6adc5c6548484755d35ed8b55dafdc0e01dc5f075648"} Oct 08 13:08:39 crc kubenswrapper[4735]: I1008 13:08:39.157231 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cz59kl" event={"ID":"5084e332-dc3f-46c3-8e9b-9c2c7795611a","Type":"ContainerStarted","Data":"6a04c564a693f87975a845de823421c7b5d68b2a6a9c2a16b7f1aefda2ed2dea"} Oct 08 13:08:39 crc kubenswrapper[4735]: I1008 13:08:39.173895 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-m42qm" Oct 08 13:08:39 crc kubenswrapper[4735]: I1008 13:08:39.997641 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-w7vft"] Oct 08 13:08:39 crc kubenswrapper[4735]: I1008 13:08:39.998615 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w7vft" Oct 08 13:08:40 crc kubenswrapper[4735]: I1008 13:08:40.038835 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w7vft"] Oct 08 13:08:40 crc kubenswrapper[4735]: I1008 13:08:40.089233 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/139344c9-3c05-4dc5-b700-daac6e83be20-catalog-content\") pod \"redhat-operators-w7vft\" (UID: \"139344c9-3c05-4dc5-b700-daac6e83be20\") " pod="openshift-marketplace/redhat-operators-w7vft" Oct 08 13:08:40 crc kubenswrapper[4735]: I1008 13:08:40.089275 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fv8q\" (UniqueName: \"kubernetes.io/projected/139344c9-3c05-4dc5-b700-daac6e83be20-kube-api-access-5fv8q\") pod \"redhat-operators-w7vft\" (UID: \"139344c9-3c05-4dc5-b700-daac6e83be20\") " pod="openshift-marketplace/redhat-operators-w7vft" Oct 08 13:08:40 crc kubenswrapper[4735]: I1008 13:08:40.089340 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/139344c9-3c05-4dc5-b700-daac6e83be20-utilities\") pod \"redhat-operators-w7vft\" (UID: \"139344c9-3c05-4dc5-b700-daac6e83be20\") " pod="openshift-marketplace/redhat-operators-w7vft" Oct 08 13:08:40 crc kubenswrapper[4735]: I1008 13:08:40.190534 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/139344c9-3c05-4dc5-b700-daac6e83be20-utilities\") pod \"redhat-operators-w7vft\" (UID: \"139344c9-3c05-4dc5-b700-daac6e83be20\") " pod="openshift-marketplace/redhat-operators-w7vft" Oct 08 13:08:40 crc kubenswrapper[4735]: I1008 13:08:40.190588 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/139344c9-3c05-4dc5-b700-daac6e83be20-catalog-content\") pod \"redhat-operators-w7vft\" (UID: \"139344c9-3c05-4dc5-b700-daac6e83be20\") " pod="openshift-marketplace/redhat-operators-w7vft" Oct 08 13:08:40 crc kubenswrapper[4735]: I1008 13:08:40.190610 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fv8q\" (UniqueName: \"kubernetes.io/projected/139344c9-3c05-4dc5-b700-daac6e83be20-kube-api-access-5fv8q\") pod \"redhat-operators-w7vft\" (UID: \"139344c9-3c05-4dc5-b700-daac6e83be20\") " pod="openshift-marketplace/redhat-operators-w7vft" Oct 08 13:08:40 crc kubenswrapper[4735]: I1008 13:08:40.191188 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/139344c9-3c05-4dc5-b700-daac6e83be20-utilities\") pod \"redhat-operators-w7vft\" (UID: \"139344c9-3c05-4dc5-b700-daac6e83be20\") " pod="openshift-marketplace/redhat-operators-w7vft" Oct 08 13:08:40 crc kubenswrapper[4735]: I1008 13:08:40.191235 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/139344c9-3c05-4dc5-b700-daac6e83be20-catalog-content\") pod \"redhat-operators-w7vft\" (UID: \"139344c9-3c05-4dc5-b700-daac6e83be20\") " pod="openshift-marketplace/redhat-operators-w7vft" Oct 08 13:08:40 crc kubenswrapper[4735]: I1008 13:08:40.212326 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fv8q\" (UniqueName: \"kubernetes.io/projected/139344c9-3c05-4dc5-b700-daac6e83be20-kube-api-access-5fv8q\") pod \"redhat-operators-w7vft\" (UID: \"139344c9-3c05-4dc5-b700-daac6e83be20\") " pod="openshift-marketplace/redhat-operators-w7vft" Oct 08 13:08:40 crc kubenswrapper[4735]: I1008 13:08:40.313115 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w7vft" Oct 08 13:08:40 crc kubenswrapper[4735]: I1008 13:08:40.768598 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w7vft"] Oct 08 13:08:41 crc kubenswrapper[4735]: I1008 13:08:41.167537 4735 generic.go:334] "Generic (PLEG): container finished" podID="139344c9-3c05-4dc5-b700-daac6e83be20" containerID="ffe50c92533c06815008768afec27ab8ea04189bf2b4718ae3a6b564279fecb1" exitCode=0 Oct 08 13:08:41 crc kubenswrapper[4735]: I1008 13:08:41.167590 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w7vft" event={"ID":"139344c9-3c05-4dc5-b700-daac6e83be20","Type":"ContainerDied","Data":"ffe50c92533c06815008768afec27ab8ea04189bf2b4718ae3a6b564279fecb1"} Oct 08 13:08:41 crc kubenswrapper[4735]: I1008 13:08:41.167634 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w7vft" event={"ID":"139344c9-3c05-4dc5-b700-daac6e83be20","Type":"ContainerStarted","Data":"a5588e25663f3c6ba244766be644380fdb11af4971dcf0df340fd293d9ffabe1"} Oct 08 13:08:41 crc kubenswrapper[4735]: I1008 13:08:41.170270 4735 generic.go:334] "Generic (PLEG): container finished" 
podID="5084e332-dc3f-46c3-8e9b-9c2c7795611a" containerID="3a7bf609893c525a6b8468a821d3fcdbdebcda8a1684ad6258186ecb28fa8ed8" exitCode=0 Oct 08 13:08:41 crc kubenswrapper[4735]: I1008 13:08:41.170320 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cz59kl" event={"ID":"5084e332-dc3f-46c3-8e9b-9c2c7795611a","Type":"ContainerDied","Data":"3a7bf609893c525a6b8468a821d3fcdbdebcda8a1684ad6258186ecb28fa8ed8"} Oct 08 13:08:42 crc kubenswrapper[4735]: I1008 13:08:42.178339 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w7vft" event={"ID":"139344c9-3c05-4dc5-b700-daac6e83be20","Type":"ContainerStarted","Data":"adee77f3398fcf588de58017d5aecd78bbda2666805221d82d81913e1c0a9a03"} Oct 08 13:08:42 crc kubenswrapper[4735]: I1008 13:08:42.181566 4735 generic.go:334] "Generic (PLEG): container finished" podID="5084e332-dc3f-46c3-8e9b-9c2c7795611a" containerID="4f897fca6d0533d1fa9b4f687cd4c82debf68eb978b0e8e1187d2783d41cc1a4" exitCode=0 Oct 08 13:08:42 crc kubenswrapper[4735]: I1008 13:08:42.181611 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cz59kl" event={"ID":"5084e332-dc3f-46c3-8e9b-9c2c7795611a","Type":"ContainerDied","Data":"4f897fca6d0533d1fa9b4f687cd4c82debf68eb978b0e8e1187d2783d41cc1a4"} Oct 08 13:08:43 crc kubenswrapper[4735]: I1008 13:08:43.192710 4735 generic.go:334] "Generic (PLEG): container finished" podID="139344c9-3c05-4dc5-b700-daac6e83be20" containerID="adee77f3398fcf588de58017d5aecd78bbda2666805221d82d81913e1c0a9a03" exitCode=0 Oct 08 13:08:43 crc kubenswrapper[4735]: I1008 13:08:43.193600 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w7vft" 
event={"ID":"139344c9-3c05-4dc5-b700-daac6e83be20","Type":"ContainerDied","Data":"adee77f3398fcf588de58017d5aecd78bbda2666805221d82d81913e1c0a9a03"} Oct 08 13:08:43 crc kubenswrapper[4735]: I1008 13:08:43.595687 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cz59kl" Oct 08 13:08:43 crc kubenswrapper[4735]: I1008 13:08:43.636913 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5084e332-dc3f-46c3-8e9b-9c2c7795611a-bundle\") pod \"5084e332-dc3f-46c3-8e9b-9c2c7795611a\" (UID: \"5084e332-dc3f-46c3-8e9b-9c2c7795611a\") " Oct 08 13:08:43 crc kubenswrapper[4735]: I1008 13:08:43.637092 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5084e332-dc3f-46c3-8e9b-9c2c7795611a-util\") pod \"5084e332-dc3f-46c3-8e9b-9c2c7795611a\" (UID: \"5084e332-dc3f-46c3-8e9b-9c2c7795611a\") " Oct 08 13:08:43 crc kubenswrapper[4735]: I1008 13:08:43.637240 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxq9t\" (UniqueName: \"kubernetes.io/projected/5084e332-dc3f-46c3-8e9b-9c2c7795611a-kube-api-access-dxq9t\") pod \"5084e332-dc3f-46c3-8e9b-9c2c7795611a\" (UID: \"5084e332-dc3f-46c3-8e9b-9c2c7795611a\") " Oct 08 13:08:43 crc kubenswrapper[4735]: I1008 13:08:43.637805 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5084e332-dc3f-46c3-8e9b-9c2c7795611a-bundle" (OuterVolumeSpecName: "bundle") pod "5084e332-dc3f-46c3-8e9b-9c2c7795611a" (UID: "5084e332-dc3f-46c3-8e9b-9c2c7795611a"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:08:43 crc kubenswrapper[4735]: I1008 13:08:43.643612 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5084e332-dc3f-46c3-8e9b-9c2c7795611a-kube-api-access-dxq9t" (OuterVolumeSpecName: "kube-api-access-dxq9t") pod "5084e332-dc3f-46c3-8e9b-9c2c7795611a" (UID: "5084e332-dc3f-46c3-8e9b-9c2c7795611a"). InnerVolumeSpecName "kube-api-access-dxq9t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:08:43 crc kubenswrapper[4735]: I1008 13:08:43.661220 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5084e332-dc3f-46c3-8e9b-9c2c7795611a-util" (OuterVolumeSpecName: "util") pod "5084e332-dc3f-46c3-8e9b-9c2c7795611a" (UID: "5084e332-dc3f-46c3-8e9b-9c2c7795611a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:08:43 crc kubenswrapper[4735]: I1008 13:08:43.738046 4735 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5084e332-dc3f-46c3-8e9b-9c2c7795611a-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 13:08:43 crc kubenswrapper[4735]: I1008 13:08:43.738075 4735 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5084e332-dc3f-46c3-8e9b-9c2c7795611a-util\") on node \"crc\" DevicePath \"\"" Oct 08 13:08:43 crc kubenswrapper[4735]: I1008 13:08:43.738085 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxq9t\" (UniqueName: \"kubernetes.io/projected/5084e332-dc3f-46c3-8e9b-9c2c7795611a-kube-api-access-dxq9t\") on node \"crc\" DevicePath \"\"" Oct 08 13:08:44 crc kubenswrapper[4735]: I1008 13:08:44.205160 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w7vft" 
event={"ID":"139344c9-3c05-4dc5-b700-daac6e83be20","Type":"ContainerStarted","Data":"ddcc50bc0276af086d575724afa614e73aeece8512ee624097d86254bae84a9c"} Oct 08 13:08:44 crc kubenswrapper[4735]: I1008 13:08:44.209187 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cz59kl" event={"ID":"5084e332-dc3f-46c3-8e9b-9c2c7795611a","Type":"ContainerDied","Data":"6a04c564a693f87975a845de823421c7b5d68b2a6a9c2a16b7f1aefda2ed2dea"} Oct 08 13:08:44 crc kubenswrapper[4735]: I1008 13:08:44.209226 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a04c564a693f87975a845de823421c7b5d68b2a6a9c2a16b7f1aefda2ed2dea" Oct 08 13:08:44 crc kubenswrapper[4735]: I1008 13:08:44.209283 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cz59kl" Oct 08 13:08:44 crc kubenswrapper[4735]: I1008 13:08:44.227654 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-w7vft" podStartSLOduration=2.763153765 podStartE2EDuration="5.227634721s" podCreationTimestamp="2025-10-08 13:08:39 +0000 UTC" firstStartedPulling="2025-10-08 13:08:41.168861237 +0000 UTC m=+770.977165791" lastFinishedPulling="2025-10-08 13:08:43.633342213 +0000 UTC m=+773.441646747" observedRunningTime="2025-10-08 13:08:44.225346008 +0000 UTC m=+774.033650592" watchObservedRunningTime="2025-10-08 13:08:44.227634721 +0000 UTC m=+774.035939275" Oct 08 13:08:47 crc kubenswrapper[4735]: I1008 13:08:47.893364 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-xvhhz"] Oct 08 13:08:47 crc kubenswrapper[4735]: E1008 13:08:47.894452 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5084e332-dc3f-46c3-8e9b-9c2c7795611a" containerName="pull" Oct 08 13:08:47 crc 
kubenswrapper[4735]: I1008 13:08:47.894549 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5084e332-dc3f-46c3-8e9b-9c2c7795611a" containerName="pull" Oct 08 13:08:47 crc kubenswrapper[4735]: E1008 13:08:47.894612 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5084e332-dc3f-46c3-8e9b-9c2c7795611a" containerName="extract" Oct 08 13:08:47 crc kubenswrapper[4735]: I1008 13:08:47.894680 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5084e332-dc3f-46c3-8e9b-9c2c7795611a" containerName="extract" Oct 08 13:08:47 crc kubenswrapper[4735]: E1008 13:08:47.894781 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5084e332-dc3f-46c3-8e9b-9c2c7795611a" containerName="util" Oct 08 13:08:47 crc kubenswrapper[4735]: I1008 13:08:47.894834 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5084e332-dc3f-46c3-8e9b-9c2c7795611a" containerName="util" Oct 08 13:08:47 crc kubenswrapper[4735]: I1008 13:08:47.894980 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="5084e332-dc3f-46c3-8e9b-9c2c7795611a" containerName="extract" Oct 08 13:08:47 crc kubenswrapper[4735]: I1008 13:08:47.895449 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-xvhhz" Oct 08 13:08:47 crc kubenswrapper[4735]: I1008 13:08:47.897323 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-rx629" Oct 08 13:08:47 crc kubenswrapper[4735]: I1008 13:08:47.897750 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Oct 08 13:08:47 crc kubenswrapper[4735]: I1008 13:08:47.906059 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Oct 08 13:08:47 crc kubenswrapper[4735]: I1008 13:08:47.921563 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-xvhhz"] Oct 08 13:08:48 crc kubenswrapper[4735]: I1008 13:08:48.090705 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lh6c\" (UniqueName: \"kubernetes.io/projected/0751005e-8ad1-4fc8-81f8-c0e7074fccd3-kube-api-access-2lh6c\") pod \"nmstate-operator-858ddd8f98-xvhhz\" (UID: \"0751005e-8ad1-4fc8-81f8-c0e7074fccd3\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-xvhhz" Oct 08 13:08:48 crc kubenswrapper[4735]: I1008 13:08:48.192182 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lh6c\" (UniqueName: \"kubernetes.io/projected/0751005e-8ad1-4fc8-81f8-c0e7074fccd3-kube-api-access-2lh6c\") pod \"nmstate-operator-858ddd8f98-xvhhz\" (UID: \"0751005e-8ad1-4fc8-81f8-c0e7074fccd3\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-xvhhz" Oct 08 13:08:48 crc kubenswrapper[4735]: I1008 13:08:48.209445 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lh6c\" (UniqueName: \"kubernetes.io/projected/0751005e-8ad1-4fc8-81f8-c0e7074fccd3-kube-api-access-2lh6c\") pod \"nmstate-operator-858ddd8f98-xvhhz\" (UID: 
\"0751005e-8ad1-4fc8-81f8-c0e7074fccd3\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-xvhhz" Oct 08 13:08:48 crc kubenswrapper[4735]: I1008 13:08:48.262850 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-xvhhz" Oct 08 13:08:48 crc kubenswrapper[4735]: I1008 13:08:48.711986 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-xvhhz"] Oct 08 13:08:48 crc kubenswrapper[4735]: W1008 13:08:48.719343 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0751005e_8ad1_4fc8_81f8_c0e7074fccd3.slice/crio-3f8e956e8bec9b3f77921e2fb639f2beed1d20aab39f7bf4f4e57c44279ab7c3 WatchSource:0}: Error finding container 3f8e956e8bec9b3f77921e2fb639f2beed1d20aab39f7bf4f4e57c44279ab7c3: Status 404 returned error can't find the container with id 3f8e956e8bec9b3f77921e2fb639f2beed1d20aab39f7bf4f4e57c44279ab7c3 Oct 08 13:08:49 crc kubenswrapper[4735]: I1008 13:08:49.236930 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-xvhhz" event={"ID":"0751005e-8ad1-4fc8-81f8-c0e7074fccd3","Type":"ContainerStarted","Data":"3f8e956e8bec9b3f77921e2fb639f2beed1d20aab39f7bf4f4e57c44279ab7c3"} Oct 08 13:08:50 crc kubenswrapper[4735]: I1008 13:08:50.313708 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-w7vft" Oct 08 13:08:50 crc kubenswrapper[4735]: I1008 13:08:50.314026 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-w7vft" Oct 08 13:08:50 crc kubenswrapper[4735]: I1008 13:08:50.366393 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-w7vft" Oct 08 13:08:51 crc kubenswrapper[4735]: I1008 13:08:51.085210 4735 scope.go:117] 
"RemoveContainer" containerID="4a82f845f92e94e6df109ce1bf748570e0bcbe66989bc38894371604d541aa14" Oct 08 13:08:51 crc kubenswrapper[4735]: I1008 13:08:51.252348 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tfdqx_ef6f0aff-a0e5-477a-9392-71bc49bfa7aa/kube-multus/2.log" Oct 08 13:08:51 crc kubenswrapper[4735]: I1008 13:08:51.294843 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-w7vft" Oct 08 13:08:52 crc kubenswrapper[4735]: I1008 13:08:52.260080 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-xvhhz" event={"ID":"0751005e-8ad1-4fc8-81f8-c0e7074fccd3","Type":"ContainerStarted","Data":"7b15e222444a126104beb997c1ed53be40b9a4132b46ae471895704d6d6e0fca"} Oct 08 13:08:52 crc kubenswrapper[4735]: I1008 13:08:52.282854 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-858ddd8f98-xvhhz" podStartSLOduration=2.902343752 podStartE2EDuration="5.282828652s" podCreationTimestamp="2025-10-08 13:08:47 +0000 UTC" firstStartedPulling="2025-10-08 13:08:48.721812005 +0000 UTC m=+778.530116549" lastFinishedPulling="2025-10-08 13:08:51.102296905 +0000 UTC m=+780.910601449" observedRunningTime="2025-10-08 13:08:52.279913523 +0000 UTC m=+782.088218067" watchObservedRunningTime="2025-10-08 13:08:52.282828652 +0000 UTC m=+782.091133236" Oct 08 13:08:52 crc kubenswrapper[4735]: I1008 13:08:52.989085 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w7vft"] Oct 08 13:08:53 crc kubenswrapper[4735]: I1008 13:08:53.265707 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-w7vft" podUID="139344c9-3c05-4dc5-b700-daac6e83be20" containerName="registry-server" containerID="cri-o://ddcc50bc0276af086d575724afa614e73aeece8512ee624097d86254bae84a9c" gracePeriod=2 
Oct 08 13:08:53 crc kubenswrapper[4735]: I1008 13:08:53.698822 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w7vft" Oct 08 13:08:53 crc kubenswrapper[4735]: I1008 13:08:53.866148 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/139344c9-3c05-4dc5-b700-daac6e83be20-utilities\") pod \"139344c9-3c05-4dc5-b700-daac6e83be20\" (UID: \"139344c9-3c05-4dc5-b700-daac6e83be20\") " Oct 08 13:08:53 crc kubenswrapper[4735]: I1008 13:08:53.866232 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/139344c9-3c05-4dc5-b700-daac6e83be20-catalog-content\") pod \"139344c9-3c05-4dc5-b700-daac6e83be20\" (UID: \"139344c9-3c05-4dc5-b700-daac6e83be20\") " Oct 08 13:08:53 crc kubenswrapper[4735]: I1008 13:08:53.866269 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5fv8q\" (UniqueName: \"kubernetes.io/projected/139344c9-3c05-4dc5-b700-daac6e83be20-kube-api-access-5fv8q\") pod \"139344c9-3c05-4dc5-b700-daac6e83be20\" (UID: \"139344c9-3c05-4dc5-b700-daac6e83be20\") " Oct 08 13:08:53 crc kubenswrapper[4735]: I1008 13:08:53.867601 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/139344c9-3c05-4dc5-b700-daac6e83be20-utilities" (OuterVolumeSpecName: "utilities") pod "139344c9-3c05-4dc5-b700-daac6e83be20" (UID: "139344c9-3c05-4dc5-b700-daac6e83be20"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:08:53 crc kubenswrapper[4735]: I1008 13:08:53.883601 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/139344c9-3c05-4dc5-b700-daac6e83be20-kube-api-access-5fv8q" (OuterVolumeSpecName: "kube-api-access-5fv8q") pod "139344c9-3c05-4dc5-b700-daac6e83be20" (UID: "139344c9-3c05-4dc5-b700-daac6e83be20"). InnerVolumeSpecName "kube-api-access-5fv8q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:08:53 crc kubenswrapper[4735]: I1008 13:08:53.949225 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/139344c9-3c05-4dc5-b700-daac6e83be20-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "139344c9-3c05-4dc5-b700-daac6e83be20" (UID: "139344c9-3c05-4dc5-b700-daac6e83be20"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:08:53 crc kubenswrapper[4735]: I1008 13:08:53.967351 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/139344c9-3c05-4dc5-b700-daac6e83be20-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 13:08:53 crc kubenswrapper[4735]: I1008 13:08:53.967384 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5fv8q\" (UniqueName: \"kubernetes.io/projected/139344c9-3c05-4dc5-b700-daac6e83be20-kube-api-access-5fv8q\") on node \"crc\" DevicePath \"\"" Oct 08 13:08:53 crc kubenswrapper[4735]: I1008 13:08:53.967396 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/139344c9-3c05-4dc5-b700-daac6e83be20-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 13:08:54 crc kubenswrapper[4735]: I1008 13:08:54.274208 4735 generic.go:334] "Generic (PLEG): container finished" podID="139344c9-3c05-4dc5-b700-daac6e83be20" 
containerID="ddcc50bc0276af086d575724afa614e73aeece8512ee624097d86254bae84a9c" exitCode=0 Oct 08 13:08:54 crc kubenswrapper[4735]: I1008 13:08:54.274266 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w7vft" event={"ID":"139344c9-3c05-4dc5-b700-daac6e83be20","Type":"ContainerDied","Data":"ddcc50bc0276af086d575724afa614e73aeece8512ee624097d86254bae84a9c"} Oct 08 13:08:54 crc kubenswrapper[4735]: I1008 13:08:54.274298 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w7vft" event={"ID":"139344c9-3c05-4dc5-b700-daac6e83be20","Type":"ContainerDied","Data":"a5588e25663f3c6ba244766be644380fdb11af4971dcf0df340fd293d9ffabe1"} Oct 08 13:08:54 crc kubenswrapper[4735]: I1008 13:08:54.274319 4735 scope.go:117] "RemoveContainer" containerID="ddcc50bc0276af086d575724afa614e73aeece8512ee624097d86254bae84a9c" Oct 08 13:08:54 crc kubenswrapper[4735]: I1008 13:08:54.274365 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w7vft" Oct 08 13:08:54 crc kubenswrapper[4735]: I1008 13:08:54.290040 4735 scope.go:117] "RemoveContainer" containerID="adee77f3398fcf588de58017d5aecd78bbda2666805221d82d81913e1c0a9a03" Oct 08 13:08:54 crc kubenswrapper[4735]: I1008 13:08:54.301318 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w7vft"] Oct 08 13:08:54 crc kubenswrapper[4735]: I1008 13:08:54.306542 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-w7vft"] Oct 08 13:08:54 crc kubenswrapper[4735]: I1008 13:08:54.307400 4735 scope.go:117] "RemoveContainer" containerID="ffe50c92533c06815008768afec27ab8ea04189bf2b4718ae3a6b564279fecb1" Oct 08 13:08:54 crc kubenswrapper[4735]: I1008 13:08:54.327056 4735 scope.go:117] "RemoveContainer" containerID="ddcc50bc0276af086d575724afa614e73aeece8512ee624097d86254bae84a9c" Oct 08 13:08:54 crc kubenswrapper[4735]: E1008 13:08:54.327680 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddcc50bc0276af086d575724afa614e73aeece8512ee624097d86254bae84a9c\": container with ID starting with ddcc50bc0276af086d575724afa614e73aeece8512ee624097d86254bae84a9c not found: ID does not exist" containerID="ddcc50bc0276af086d575724afa614e73aeece8512ee624097d86254bae84a9c" Oct 08 13:08:54 crc kubenswrapper[4735]: I1008 13:08:54.327722 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddcc50bc0276af086d575724afa614e73aeece8512ee624097d86254bae84a9c"} err="failed to get container status \"ddcc50bc0276af086d575724afa614e73aeece8512ee624097d86254bae84a9c\": rpc error: code = NotFound desc = could not find container \"ddcc50bc0276af086d575724afa614e73aeece8512ee624097d86254bae84a9c\": container with ID starting with ddcc50bc0276af086d575724afa614e73aeece8512ee624097d86254bae84a9c not found: ID does 
not exist" Oct 08 13:08:54 crc kubenswrapper[4735]: I1008 13:08:54.327747 4735 scope.go:117] "RemoveContainer" containerID="adee77f3398fcf588de58017d5aecd78bbda2666805221d82d81913e1c0a9a03" Oct 08 13:08:54 crc kubenswrapper[4735]: E1008 13:08:54.328321 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adee77f3398fcf588de58017d5aecd78bbda2666805221d82d81913e1c0a9a03\": container with ID starting with adee77f3398fcf588de58017d5aecd78bbda2666805221d82d81913e1c0a9a03 not found: ID does not exist" containerID="adee77f3398fcf588de58017d5aecd78bbda2666805221d82d81913e1c0a9a03" Oct 08 13:08:54 crc kubenswrapper[4735]: I1008 13:08:54.328356 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adee77f3398fcf588de58017d5aecd78bbda2666805221d82d81913e1c0a9a03"} err="failed to get container status \"adee77f3398fcf588de58017d5aecd78bbda2666805221d82d81913e1c0a9a03\": rpc error: code = NotFound desc = could not find container \"adee77f3398fcf588de58017d5aecd78bbda2666805221d82d81913e1c0a9a03\": container with ID starting with adee77f3398fcf588de58017d5aecd78bbda2666805221d82d81913e1c0a9a03 not found: ID does not exist" Oct 08 13:08:54 crc kubenswrapper[4735]: I1008 13:08:54.328377 4735 scope.go:117] "RemoveContainer" containerID="ffe50c92533c06815008768afec27ab8ea04189bf2b4718ae3a6b564279fecb1" Oct 08 13:08:54 crc kubenswrapper[4735]: E1008 13:08:54.328645 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffe50c92533c06815008768afec27ab8ea04189bf2b4718ae3a6b564279fecb1\": container with ID starting with ffe50c92533c06815008768afec27ab8ea04189bf2b4718ae3a6b564279fecb1 not found: ID does not exist" containerID="ffe50c92533c06815008768afec27ab8ea04189bf2b4718ae3a6b564279fecb1" Oct 08 13:08:54 crc kubenswrapper[4735]: I1008 13:08:54.328677 4735 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffe50c92533c06815008768afec27ab8ea04189bf2b4718ae3a6b564279fecb1"} err="failed to get container status \"ffe50c92533c06815008768afec27ab8ea04189bf2b4718ae3a6b564279fecb1\": rpc error: code = NotFound desc = could not find container \"ffe50c92533c06815008768afec27ab8ea04189bf2b4718ae3a6b564279fecb1\": container with ID starting with ffe50c92533c06815008768afec27ab8ea04189bf2b4718ae3a6b564279fecb1 not found: ID does not exist" Oct 08 13:08:54 crc kubenswrapper[4735]: I1008 13:08:54.878292 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 13:08:54 crc kubenswrapper[4735]: I1008 13:08:54.878377 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 13:08:54 crc kubenswrapper[4735]: I1008 13:08:54.904097 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="139344c9-3c05-4dc5-b700-daac6e83be20" path="/var/lib/kubelet/pods/139344c9-3c05-4dc5-b700-daac6e83be20/volumes" Oct 08 13:08:57 crc kubenswrapper[4735]: I1008 13:08:57.880517 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-cjhw7"] Oct 08 13:08:57 crc kubenswrapper[4735]: E1008 13:08:57.880892 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="139344c9-3c05-4dc5-b700-daac6e83be20" containerName="extract-utilities" Oct 08 13:08:57 crc kubenswrapper[4735]: I1008 13:08:57.880905 4735 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="139344c9-3c05-4dc5-b700-daac6e83be20" containerName="extract-utilities" Oct 08 13:08:57 crc kubenswrapper[4735]: E1008 13:08:57.880914 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="139344c9-3c05-4dc5-b700-daac6e83be20" containerName="registry-server" Oct 08 13:08:57 crc kubenswrapper[4735]: I1008 13:08:57.880920 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="139344c9-3c05-4dc5-b700-daac6e83be20" containerName="registry-server" Oct 08 13:08:57 crc kubenswrapper[4735]: E1008 13:08:57.880934 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="139344c9-3c05-4dc5-b700-daac6e83be20" containerName="extract-content" Oct 08 13:08:57 crc kubenswrapper[4735]: I1008 13:08:57.880942 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="139344c9-3c05-4dc5-b700-daac6e83be20" containerName="extract-content" Oct 08 13:08:57 crc kubenswrapper[4735]: I1008 13:08:57.881027 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="139344c9-3c05-4dc5-b700-daac6e83be20" containerName="registry-server" Oct 08 13:08:57 crc kubenswrapper[4735]: I1008 13:08:57.881510 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-cjhw7" Oct 08 13:08:57 crc kubenswrapper[4735]: I1008 13:08:57.889642 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-2hdn6" Oct 08 13:08:57 crc kubenswrapper[4735]: I1008 13:08:57.892508 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-cjhw7"] Oct 08 13:08:57 crc kubenswrapper[4735]: I1008 13:08:57.902915 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-r9lhp"] Oct 08 13:08:57 crc kubenswrapper[4735]: I1008 13:08:57.903714 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-r9lhp" Oct 08 13:08:57 crc kubenswrapper[4735]: I1008 13:08:57.917323 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-dbz7p"] Oct 08 13:08:57 crc kubenswrapper[4735]: I1008 13:08:57.918226 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-dbz7p" Oct 08 13:08:57 crc kubenswrapper[4735]: I1008 13:08:57.921697 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Oct 08 13:08:57 crc kubenswrapper[4735]: I1008 13:08:57.928572 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-r9lhp"] Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.011350 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-mz79v"] Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.012160 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-mz79v" Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.013968 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.014359 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.017119 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-rqx62" Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.022916 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrjbs\" (UniqueName: \"kubernetes.io/projected/6cc8c3dc-00ba-45e7-9d57-6ad40a580c16-kube-api-access-xrjbs\") pod \"nmstate-handler-dbz7p\" (UID: \"6cc8c3dc-00ba-45e7-9d57-6ad40a580c16\") " pod="openshift-nmstate/nmstate-handler-dbz7p" Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.022990 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/6cc8c3dc-00ba-45e7-9d57-6ad40a580c16-nmstate-lock\") pod \"nmstate-handler-dbz7p\" (UID: \"6cc8c3dc-00ba-45e7-9d57-6ad40a580c16\") " pod="openshift-nmstate/nmstate-handler-dbz7p" Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.023032 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/6cc8c3dc-00ba-45e7-9d57-6ad40a580c16-dbus-socket\") pod \"nmstate-handler-dbz7p\" (UID: \"6cc8c3dc-00ba-45e7-9d57-6ad40a580c16\") " pod="openshift-nmstate/nmstate-handler-dbz7p" Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.023051 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-9sqhk\" (UniqueName: \"kubernetes.io/projected/f34fd5b9-a3a6-47e7-8070-857df4befa39-kube-api-access-9sqhk\") pod \"nmstate-metrics-fdff9cb8d-cjhw7\" (UID: \"f34fd5b9-a3a6-47e7-8070-857df4befa39\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-cjhw7" Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.023083 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/aac9fb9c-bf97-4090-b212-ea396b0443c4-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-r9lhp\" (UID: \"aac9fb9c-bf97-4090-b212-ea396b0443c4\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-r9lhp" Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.023151 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cppg\" (UniqueName: \"kubernetes.io/projected/aac9fb9c-bf97-4090-b212-ea396b0443c4-kube-api-access-7cppg\") pod \"nmstate-webhook-6cdbc54649-r9lhp\" (UID: \"aac9fb9c-bf97-4090-b212-ea396b0443c4\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-r9lhp" Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.023190 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/6cc8c3dc-00ba-45e7-9d57-6ad40a580c16-ovs-socket\") pod \"nmstate-handler-dbz7p\" (UID: \"6cc8c3dc-00ba-45e7-9d57-6ad40a580c16\") " pod="openshift-nmstate/nmstate-handler-dbz7p" Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.023532 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-mz79v"] Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.124164 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/6cc8c3dc-00ba-45e7-9d57-6ad40a580c16-nmstate-lock\") pod \"nmstate-handler-dbz7p\" (UID: 
\"6cc8c3dc-00ba-45e7-9d57-6ad40a580c16\") " pod="openshift-nmstate/nmstate-handler-dbz7p" Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.124222 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/6cc8c3dc-00ba-45e7-9d57-6ad40a580c16-dbus-socket\") pod \"nmstate-handler-dbz7p\" (UID: \"6cc8c3dc-00ba-45e7-9d57-6ad40a580c16\") " pod="openshift-nmstate/nmstate-handler-dbz7p" Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.124248 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sqhk\" (UniqueName: \"kubernetes.io/projected/f34fd5b9-a3a6-47e7-8070-857df4befa39-kube-api-access-9sqhk\") pod \"nmstate-metrics-fdff9cb8d-cjhw7\" (UID: \"f34fd5b9-a3a6-47e7-8070-857df4befa39\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-cjhw7" Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.124283 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/aac9fb9c-bf97-4090-b212-ea396b0443c4-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-r9lhp\" (UID: \"aac9fb9c-bf97-4090-b212-ea396b0443c4\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-r9lhp" Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.124317 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/98be5146-db56-4a21-9ffb-9875c6e07a0d-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-mz79v\" (UID: \"98be5146-db56-4a21-9ffb-9875c6e07a0d\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-mz79v" Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.124340 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/98be5146-db56-4a21-9ffb-9875c6e07a0d-nginx-conf\") pod 
\"nmstate-console-plugin-6b874cbd85-mz79v\" (UID: \"98be5146-db56-4a21-9ffb-9875c6e07a0d\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-mz79v" Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.124386 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cppg\" (UniqueName: \"kubernetes.io/projected/aac9fb9c-bf97-4090-b212-ea396b0443c4-kube-api-access-7cppg\") pod \"nmstate-webhook-6cdbc54649-r9lhp\" (UID: \"aac9fb9c-bf97-4090-b212-ea396b0443c4\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-r9lhp" Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.124412 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/6cc8c3dc-00ba-45e7-9d57-6ad40a580c16-ovs-socket\") pod \"nmstate-handler-dbz7p\" (UID: \"6cc8c3dc-00ba-45e7-9d57-6ad40a580c16\") " pod="openshift-nmstate/nmstate-handler-dbz7p" Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.124437 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfsnx\" (UniqueName: \"kubernetes.io/projected/98be5146-db56-4a21-9ffb-9875c6e07a0d-kube-api-access-bfsnx\") pod \"nmstate-console-plugin-6b874cbd85-mz79v\" (UID: \"98be5146-db56-4a21-9ffb-9875c6e07a0d\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-mz79v" Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.124500 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrjbs\" (UniqueName: \"kubernetes.io/projected/6cc8c3dc-00ba-45e7-9d57-6ad40a580c16-kube-api-access-xrjbs\") pod \"nmstate-handler-dbz7p\" (UID: \"6cc8c3dc-00ba-45e7-9d57-6ad40a580c16\") " pod="openshift-nmstate/nmstate-handler-dbz7p" Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.124841 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: 
\"kubernetes.io/host-path/6cc8c3dc-00ba-45e7-9d57-6ad40a580c16-nmstate-lock\") pod \"nmstate-handler-dbz7p\" (UID: \"6cc8c3dc-00ba-45e7-9d57-6ad40a580c16\") " pod="openshift-nmstate/nmstate-handler-dbz7p" Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.125131 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/6cc8c3dc-00ba-45e7-9d57-6ad40a580c16-dbus-socket\") pod \"nmstate-handler-dbz7p\" (UID: \"6cc8c3dc-00ba-45e7-9d57-6ad40a580c16\") " pod="openshift-nmstate/nmstate-handler-dbz7p" Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.126640 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/6cc8c3dc-00ba-45e7-9d57-6ad40a580c16-ovs-socket\") pod \"nmstate-handler-dbz7p\" (UID: \"6cc8c3dc-00ba-45e7-9d57-6ad40a580c16\") " pod="openshift-nmstate/nmstate-handler-dbz7p" Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.130039 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/aac9fb9c-bf97-4090-b212-ea396b0443c4-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-r9lhp\" (UID: \"aac9fb9c-bf97-4090-b212-ea396b0443c4\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-r9lhp" Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.142866 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrjbs\" (UniqueName: \"kubernetes.io/projected/6cc8c3dc-00ba-45e7-9d57-6ad40a580c16-kube-api-access-xrjbs\") pod \"nmstate-handler-dbz7p\" (UID: \"6cc8c3dc-00ba-45e7-9d57-6ad40a580c16\") " pod="openshift-nmstate/nmstate-handler-dbz7p" Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.146121 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sqhk\" (UniqueName: \"kubernetes.io/projected/f34fd5b9-a3a6-47e7-8070-857df4befa39-kube-api-access-9sqhk\") pod 
\"nmstate-metrics-fdff9cb8d-cjhw7\" (UID: \"f34fd5b9-a3a6-47e7-8070-857df4befa39\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-cjhw7" Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.157025 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cppg\" (UniqueName: \"kubernetes.io/projected/aac9fb9c-bf97-4090-b212-ea396b0443c4-kube-api-access-7cppg\") pod \"nmstate-webhook-6cdbc54649-r9lhp\" (UID: \"aac9fb9c-bf97-4090-b212-ea396b0443c4\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-r9lhp" Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.201084 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-cjhw7" Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.216503 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6467ddc885-wbjr7"] Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.217174 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6467ddc885-wbjr7" Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.229083 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfsnx\" (UniqueName: \"kubernetes.io/projected/98be5146-db56-4a21-9ffb-9875c6e07a0d-kube-api-access-bfsnx\") pod \"nmstate-console-plugin-6b874cbd85-mz79v\" (UID: \"98be5146-db56-4a21-9ffb-9875c6e07a0d\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-mz79v" Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.229167 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/98be5146-db56-4a21-9ffb-9875c6e07a0d-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-mz79v\" (UID: \"98be5146-db56-4a21-9ffb-9875c6e07a0d\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-mz79v" Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.229188 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/98be5146-db56-4a21-9ffb-9875c6e07a0d-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-mz79v\" (UID: \"98be5146-db56-4a21-9ffb-9875c6e07a0d\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-mz79v" Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.230021 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/98be5146-db56-4a21-9ffb-9875c6e07a0d-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-mz79v\" (UID: \"98be5146-db56-4a21-9ffb-9875c6e07a0d\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-mz79v" Oct 08 13:08:58 crc kubenswrapper[4735]: E1008 13:08:58.230313 4735 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Oct 08 13:08:58 crc kubenswrapper[4735]: E1008 13:08:58.230351 4735 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98be5146-db56-4a21-9ffb-9875c6e07a0d-plugin-serving-cert podName:98be5146-db56-4a21-9ffb-9875c6e07a0d nodeName:}" failed. No retries permitted until 2025-10-08 13:08:58.730339713 +0000 UTC m=+788.538644257 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/98be5146-db56-4a21-9ffb-9875c6e07a0d-plugin-serving-cert") pod "nmstate-console-plugin-6b874cbd85-mz79v" (UID: "98be5146-db56-4a21-9ffb-9875c6e07a0d") : secret "plugin-serving-cert" not found Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.230919 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-r9lhp" Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.233322 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6467ddc885-wbjr7"] Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.234602 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-dbz7p" Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.249673 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfsnx\" (UniqueName: \"kubernetes.io/projected/98be5146-db56-4a21-9ffb-9875c6e07a0d-kube-api-access-bfsnx\") pod \"nmstate-console-plugin-6b874cbd85-mz79v\" (UID: \"98be5146-db56-4a21-9ffb-9875c6e07a0d\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-mz79v" Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.299403 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-dbz7p" event={"ID":"6cc8c3dc-00ba-45e7-9d57-6ad40a580c16","Type":"ContainerStarted","Data":"bba0ae30f7761019ef284f1337faff7a9b2bf84544c4b64f9122edfef5553c89"} Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.330222 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d75e6c14-36aa-4fd6-951e-3d508e54c024-console-oauth-config\") pod \"console-6467ddc885-wbjr7\" (UID: \"d75e6c14-36aa-4fd6-951e-3d508e54c024\") " pod="openshift-console/console-6467ddc885-wbjr7" Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.330418 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d75e6c14-36aa-4fd6-951e-3d508e54c024-trusted-ca-bundle\") pod \"console-6467ddc885-wbjr7\" (UID: \"d75e6c14-36aa-4fd6-951e-3d508e54c024\") " pod="openshift-console/console-6467ddc885-wbjr7" Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.330444 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d75e6c14-36aa-4fd6-951e-3d508e54c024-service-ca\") pod \"console-6467ddc885-wbjr7\" (UID: \"d75e6c14-36aa-4fd6-951e-3d508e54c024\") " 
pod="openshift-console/console-6467ddc885-wbjr7" Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.330469 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d75e6c14-36aa-4fd6-951e-3d508e54c024-oauth-serving-cert\") pod \"console-6467ddc885-wbjr7\" (UID: \"d75e6c14-36aa-4fd6-951e-3d508e54c024\") " pod="openshift-console/console-6467ddc885-wbjr7" Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.330522 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d75e6c14-36aa-4fd6-951e-3d508e54c024-console-config\") pod \"console-6467ddc885-wbjr7\" (UID: \"d75e6c14-36aa-4fd6-951e-3d508e54c024\") " pod="openshift-console/console-6467ddc885-wbjr7" Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.330547 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d75e6c14-36aa-4fd6-951e-3d508e54c024-console-serving-cert\") pod \"console-6467ddc885-wbjr7\" (UID: \"d75e6c14-36aa-4fd6-951e-3d508e54c024\") " pod="openshift-console/console-6467ddc885-wbjr7" Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.330777 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gg6r\" (UniqueName: \"kubernetes.io/projected/d75e6c14-36aa-4fd6-951e-3d508e54c024-kube-api-access-5gg6r\") pod \"console-6467ddc885-wbjr7\" (UID: \"d75e6c14-36aa-4fd6-951e-3d508e54c024\") " pod="openshift-console/console-6467ddc885-wbjr7" Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.432342 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d75e6c14-36aa-4fd6-951e-3d508e54c024-console-serving-cert\") pod 
\"console-6467ddc885-wbjr7\" (UID: \"d75e6c14-36aa-4fd6-951e-3d508e54c024\") " pod="openshift-console/console-6467ddc885-wbjr7" Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.432401 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gg6r\" (UniqueName: \"kubernetes.io/projected/d75e6c14-36aa-4fd6-951e-3d508e54c024-kube-api-access-5gg6r\") pod \"console-6467ddc885-wbjr7\" (UID: \"d75e6c14-36aa-4fd6-951e-3d508e54c024\") " pod="openshift-console/console-6467ddc885-wbjr7" Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.432423 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d75e6c14-36aa-4fd6-951e-3d508e54c024-console-oauth-config\") pod \"console-6467ddc885-wbjr7\" (UID: \"d75e6c14-36aa-4fd6-951e-3d508e54c024\") " pod="openshift-console/console-6467ddc885-wbjr7" Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.432440 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d75e6c14-36aa-4fd6-951e-3d508e54c024-trusted-ca-bundle\") pod \"console-6467ddc885-wbjr7\" (UID: \"d75e6c14-36aa-4fd6-951e-3d508e54c024\") " pod="openshift-console/console-6467ddc885-wbjr7" Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.432461 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d75e6c14-36aa-4fd6-951e-3d508e54c024-service-ca\") pod \"console-6467ddc885-wbjr7\" (UID: \"d75e6c14-36aa-4fd6-951e-3d508e54c024\") " pod="openshift-console/console-6467ddc885-wbjr7" Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.432492 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d75e6c14-36aa-4fd6-951e-3d508e54c024-oauth-serving-cert\") pod 
\"console-6467ddc885-wbjr7\" (UID: \"d75e6c14-36aa-4fd6-951e-3d508e54c024\") " pod="openshift-console/console-6467ddc885-wbjr7" Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.432525 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d75e6c14-36aa-4fd6-951e-3d508e54c024-console-config\") pod \"console-6467ddc885-wbjr7\" (UID: \"d75e6c14-36aa-4fd6-951e-3d508e54c024\") " pod="openshift-console/console-6467ddc885-wbjr7" Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.433276 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d75e6c14-36aa-4fd6-951e-3d508e54c024-console-config\") pod \"console-6467ddc885-wbjr7\" (UID: \"d75e6c14-36aa-4fd6-951e-3d508e54c024\") " pod="openshift-console/console-6467ddc885-wbjr7" Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.434168 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d75e6c14-36aa-4fd6-951e-3d508e54c024-trusted-ca-bundle\") pod \"console-6467ddc885-wbjr7\" (UID: \"d75e6c14-36aa-4fd6-951e-3d508e54c024\") " pod="openshift-console/console-6467ddc885-wbjr7" Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.434427 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d75e6c14-36aa-4fd6-951e-3d508e54c024-service-ca\") pod \"console-6467ddc885-wbjr7\" (UID: \"d75e6c14-36aa-4fd6-951e-3d508e54c024\") " pod="openshift-console/console-6467ddc885-wbjr7" Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.434662 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d75e6c14-36aa-4fd6-951e-3d508e54c024-oauth-serving-cert\") pod \"console-6467ddc885-wbjr7\" (UID: \"d75e6c14-36aa-4fd6-951e-3d508e54c024\") " 
pod="openshift-console/console-6467ddc885-wbjr7" Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.439319 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d75e6c14-36aa-4fd6-951e-3d508e54c024-console-serving-cert\") pod \"console-6467ddc885-wbjr7\" (UID: \"d75e6c14-36aa-4fd6-951e-3d508e54c024\") " pod="openshift-console/console-6467ddc885-wbjr7" Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.446929 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d75e6c14-36aa-4fd6-951e-3d508e54c024-console-oauth-config\") pod \"console-6467ddc885-wbjr7\" (UID: \"d75e6c14-36aa-4fd6-951e-3d508e54c024\") " pod="openshift-console/console-6467ddc885-wbjr7" Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.448967 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gg6r\" (UniqueName: \"kubernetes.io/projected/d75e6c14-36aa-4fd6-951e-3d508e54c024-kube-api-access-5gg6r\") pod \"console-6467ddc885-wbjr7\" (UID: \"d75e6c14-36aa-4fd6-951e-3d508e54c024\") " pod="openshift-console/console-6467ddc885-wbjr7" Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.594684 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6467ddc885-wbjr7" Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.635322 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-r9lhp"] Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.678661 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-cjhw7"] Oct 08 13:08:58 crc kubenswrapper[4735]: W1008 13:08:58.701301 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf34fd5b9_a3a6_47e7_8070_857df4befa39.slice/crio-aa577595816596405303f9b29a9778273f7b43f3cb70cd41c92c04749f1bc8ce WatchSource:0}: Error finding container aa577595816596405303f9b29a9778273f7b43f3cb70cd41c92c04749f1bc8ce: Status 404 returned error can't find the container with id aa577595816596405303f9b29a9778273f7b43f3cb70cd41c92c04749f1bc8ce Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.737329 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/98be5146-db56-4a21-9ffb-9875c6e07a0d-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-mz79v\" (UID: \"98be5146-db56-4a21-9ffb-9875c6e07a0d\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-mz79v" Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.740300 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/98be5146-db56-4a21-9ffb-9875c6e07a0d-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-mz79v\" (UID: \"98be5146-db56-4a21-9ffb-9875c6e07a0d\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-mz79v" Oct 08 13:08:58 crc kubenswrapper[4735]: I1008 13:08:58.927203 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-mz79v" Oct 08 13:08:59 crc kubenswrapper[4735]: I1008 13:08:59.064056 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6467ddc885-wbjr7"] Oct 08 13:08:59 crc kubenswrapper[4735]: I1008 13:08:59.308214 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-cjhw7" event={"ID":"f34fd5b9-a3a6-47e7-8070-857df4befa39","Type":"ContainerStarted","Data":"aa577595816596405303f9b29a9778273f7b43f3cb70cd41c92c04749f1bc8ce"} Oct 08 13:08:59 crc kubenswrapper[4735]: I1008 13:08:59.315192 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6467ddc885-wbjr7" event={"ID":"d75e6c14-36aa-4fd6-951e-3d508e54c024","Type":"ContainerStarted","Data":"6c9b9903b1185a9bf7ea9a0a956bb2d59a303b70a7554ba59b4beb833503f661"} Oct 08 13:08:59 crc kubenswrapper[4735]: I1008 13:08:59.315280 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6467ddc885-wbjr7" event={"ID":"d75e6c14-36aa-4fd6-951e-3d508e54c024","Type":"ContainerStarted","Data":"2739d086f28bbb40393c2b319d2e96c91b053420a177bb5c89404421adafae65"} Oct 08 13:08:59 crc kubenswrapper[4735]: I1008 13:08:59.317300 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-r9lhp" event={"ID":"aac9fb9c-bf97-4090-b212-ea396b0443c4","Type":"ContainerStarted","Data":"b76efa091541550126f7b00d2c37b062cba75b6dcd93436c3dccfd23f31a6fb0"} Oct 08 13:08:59 crc kubenswrapper[4735]: I1008 13:08:59.318233 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-mz79v"] Oct 08 13:08:59 crc kubenswrapper[4735]: W1008 13:08:59.329385 4735 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98be5146_db56_4a21_9ffb_9875c6e07a0d.slice/crio-c20dfcfdfe72386362c437500a3da37c01ac73c8d0218a93592559ae3b9eee79 WatchSource:0}: Error finding container c20dfcfdfe72386362c437500a3da37c01ac73c8d0218a93592559ae3b9eee79: Status 404 returned error can't find the container with id c20dfcfdfe72386362c437500a3da37c01ac73c8d0218a93592559ae3b9eee79 Oct 08 13:08:59 crc kubenswrapper[4735]: I1008 13:08:59.336586 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6467ddc885-wbjr7" podStartSLOduration=1.336563008 podStartE2EDuration="1.336563008s" podCreationTimestamp="2025-10-08 13:08:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 13:08:59.332249531 +0000 UTC m=+789.140554095" watchObservedRunningTime="2025-10-08 13:08:59.336563008 +0000 UTC m=+789.144867572" Oct 08 13:09:00 crc kubenswrapper[4735]: I1008 13:09:00.326366 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-mz79v" event={"ID":"98be5146-db56-4a21-9ffb-9875c6e07a0d","Type":"ContainerStarted","Data":"c20dfcfdfe72386362c437500a3da37c01ac73c8d0218a93592559ae3b9eee79"} Oct 08 13:09:01 crc kubenswrapper[4735]: I1008 13:09:01.333121 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-dbz7p" event={"ID":"6cc8c3dc-00ba-45e7-9d57-6ad40a580c16","Type":"ContainerStarted","Data":"852931e0288593b6cd9e02c59a5958505b17f52e54e0b8fe737dee8ed9b9c716"} Oct 08 13:09:01 crc kubenswrapper[4735]: I1008 13:09:01.333548 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-dbz7p" Oct 08 13:09:01 crc kubenswrapper[4735]: I1008 13:09:01.334859 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-r9lhp" 
event={"ID":"aac9fb9c-bf97-4090-b212-ea396b0443c4","Type":"ContainerStarted","Data":"fbff9c3c5005c6293f36e71cf1945bd27dd0a802d9f8dbe5fc4e02ad3c2748c5"} Oct 08 13:09:01 crc kubenswrapper[4735]: I1008 13:09:01.334980 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-r9lhp" Oct 08 13:09:01 crc kubenswrapper[4735]: I1008 13:09:01.338365 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-cjhw7" event={"ID":"f34fd5b9-a3a6-47e7-8070-857df4befa39","Type":"ContainerStarted","Data":"25b70c40e8b1d226ba46e5db304b46ea9c6d4330fec4b7d8668d053d0713e4fb"} Oct 08 13:09:01 crc kubenswrapper[4735]: I1008 13:09:01.351648 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-dbz7p" podStartSLOduration=1.8253668699999999 podStartE2EDuration="4.351631037s" podCreationTimestamp="2025-10-08 13:08:57 +0000 UTC" firstStartedPulling="2025-10-08 13:08:58.285578326 +0000 UTC m=+788.093882870" lastFinishedPulling="2025-10-08 13:09:00.811842483 +0000 UTC m=+790.620147037" observedRunningTime="2025-10-08 13:09:01.346116787 +0000 UTC m=+791.154421331" watchObservedRunningTime="2025-10-08 13:09:01.351631037 +0000 UTC m=+791.159935581" Oct 08 13:09:01 crc kubenswrapper[4735]: I1008 13:09:01.363007 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-r9lhp" podStartSLOduration=2.18193831 podStartE2EDuration="4.362987156s" podCreationTimestamp="2025-10-08 13:08:57 +0000 UTC" firstStartedPulling="2025-10-08 13:08:58.654708908 +0000 UTC m=+788.463013452" lastFinishedPulling="2025-10-08 13:09:00.835757754 +0000 UTC m=+790.644062298" observedRunningTime="2025-10-08 13:09:01.360223931 +0000 UTC m=+791.168528475" watchObservedRunningTime="2025-10-08 13:09:01.362987156 +0000 UTC m=+791.171291690" Oct 08 13:09:02 crc kubenswrapper[4735]: I1008 13:09:02.344750 4735 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-mz79v" event={"ID":"98be5146-db56-4a21-9ffb-9875c6e07a0d","Type":"ContainerStarted","Data":"6dc3324911afd03b47740c542ea008f5a282bcabef26205de3b26c9935a94078"} Oct 08 13:09:02 crc kubenswrapper[4735]: I1008 13:09:02.361846 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-mz79v" podStartSLOduration=3.013746991 podStartE2EDuration="5.36182772s" podCreationTimestamp="2025-10-08 13:08:57 +0000 UTC" firstStartedPulling="2025-10-08 13:08:59.334764349 +0000 UTC m=+789.143068903" lastFinishedPulling="2025-10-08 13:09:01.682845078 +0000 UTC m=+791.491149632" observedRunningTime="2025-10-08 13:09:02.359864896 +0000 UTC m=+792.168169460" watchObservedRunningTime="2025-10-08 13:09:02.36182772 +0000 UTC m=+792.170132264" Oct 08 13:09:03 crc kubenswrapper[4735]: I1008 13:09:03.355175 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-cjhw7" event={"ID":"f34fd5b9-a3a6-47e7-8070-857df4befa39","Type":"ContainerStarted","Data":"9911a0ad705b1dd1dddddfb12ccb0692ce2e340243f82ef6ba715dad5d13a468"} Oct 08 13:09:08 crc kubenswrapper[4735]: I1008 13:09:08.269992 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-dbz7p" Oct 08 13:09:08 crc kubenswrapper[4735]: I1008 13:09:08.294923 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-cjhw7" podStartSLOduration=6.972387405 podStartE2EDuration="11.294892858s" podCreationTimestamp="2025-10-08 13:08:57 +0000 UTC" firstStartedPulling="2025-10-08 13:08:58.71101452 +0000 UTC m=+788.519319064" lastFinishedPulling="2025-10-08 13:09:03.033519973 +0000 UTC m=+792.841824517" observedRunningTime="2025-10-08 13:09:03.382714323 +0000 UTC m=+793.191019037" watchObservedRunningTime="2025-10-08 
13:09:08.294892858 +0000 UTC m=+798.103197432" Oct 08 13:09:08 crc kubenswrapper[4735]: I1008 13:09:08.596368 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6467ddc885-wbjr7" Oct 08 13:09:08 crc kubenswrapper[4735]: I1008 13:09:08.596423 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6467ddc885-wbjr7" Oct 08 13:09:08 crc kubenswrapper[4735]: I1008 13:09:08.601569 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6467ddc885-wbjr7" Oct 08 13:09:09 crc kubenswrapper[4735]: I1008 13:09:09.397442 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6467ddc885-wbjr7" Oct 08 13:09:09 crc kubenswrapper[4735]: I1008 13:09:09.483539 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-nnp5j"] Oct 08 13:09:12 crc kubenswrapper[4735]: I1008 13:09:12.454882 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-g4rk4"] Oct 08 13:09:12 crc kubenswrapper[4735]: I1008 13:09:12.456534 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g4rk4" Oct 08 13:09:12 crc kubenswrapper[4735]: I1008 13:09:12.472037 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-g4rk4"] Oct 08 13:09:12 crc kubenswrapper[4735]: I1008 13:09:12.626763 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fba7b386-8f82-48f8-8060-797a0d88d0b9-utilities\") pod \"redhat-marketplace-g4rk4\" (UID: \"fba7b386-8f82-48f8-8060-797a0d88d0b9\") " pod="openshift-marketplace/redhat-marketplace-g4rk4" Oct 08 13:09:12 crc kubenswrapper[4735]: I1008 13:09:12.626872 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hm49h\" (UniqueName: \"kubernetes.io/projected/fba7b386-8f82-48f8-8060-797a0d88d0b9-kube-api-access-hm49h\") pod \"redhat-marketplace-g4rk4\" (UID: \"fba7b386-8f82-48f8-8060-797a0d88d0b9\") " pod="openshift-marketplace/redhat-marketplace-g4rk4" Oct 08 13:09:12 crc kubenswrapper[4735]: I1008 13:09:12.626911 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fba7b386-8f82-48f8-8060-797a0d88d0b9-catalog-content\") pod \"redhat-marketplace-g4rk4\" (UID: \"fba7b386-8f82-48f8-8060-797a0d88d0b9\") " pod="openshift-marketplace/redhat-marketplace-g4rk4" Oct 08 13:09:12 crc kubenswrapper[4735]: I1008 13:09:12.728671 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hm49h\" (UniqueName: \"kubernetes.io/projected/fba7b386-8f82-48f8-8060-797a0d88d0b9-kube-api-access-hm49h\") pod \"redhat-marketplace-g4rk4\" (UID: \"fba7b386-8f82-48f8-8060-797a0d88d0b9\") " pod="openshift-marketplace/redhat-marketplace-g4rk4" Oct 08 13:09:12 crc kubenswrapper[4735]: I1008 13:09:12.728733 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fba7b386-8f82-48f8-8060-797a0d88d0b9-catalog-content\") pod \"redhat-marketplace-g4rk4\" (UID: \"fba7b386-8f82-48f8-8060-797a0d88d0b9\") " pod="openshift-marketplace/redhat-marketplace-g4rk4" Oct 08 13:09:12 crc kubenswrapper[4735]: I1008 13:09:12.728796 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fba7b386-8f82-48f8-8060-797a0d88d0b9-utilities\") pod \"redhat-marketplace-g4rk4\" (UID: \"fba7b386-8f82-48f8-8060-797a0d88d0b9\") " pod="openshift-marketplace/redhat-marketplace-g4rk4" Oct 08 13:09:12 crc kubenswrapper[4735]: I1008 13:09:12.729293 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fba7b386-8f82-48f8-8060-797a0d88d0b9-utilities\") pod \"redhat-marketplace-g4rk4\" (UID: \"fba7b386-8f82-48f8-8060-797a0d88d0b9\") " pod="openshift-marketplace/redhat-marketplace-g4rk4" Oct 08 13:09:12 crc kubenswrapper[4735]: I1008 13:09:12.729927 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fba7b386-8f82-48f8-8060-797a0d88d0b9-catalog-content\") pod \"redhat-marketplace-g4rk4\" (UID: \"fba7b386-8f82-48f8-8060-797a0d88d0b9\") " pod="openshift-marketplace/redhat-marketplace-g4rk4" Oct 08 13:09:12 crc kubenswrapper[4735]: I1008 13:09:12.749674 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hm49h\" (UniqueName: \"kubernetes.io/projected/fba7b386-8f82-48f8-8060-797a0d88d0b9-kube-api-access-hm49h\") pod \"redhat-marketplace-g4rk4\" (UID: \"fba7b386-8f82-48f8-8060-797a0d88d0b9\") " pod="openshift-marketplace/redhat-marketplace-g4rk4" Oct 08 13:09:12 crc kubenswrapper[4735]: I1008 13:09:12.788361 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g4rk4" Oct 08 13:09:13 crc kubenswrapper[4735]: I1008 13:09:13.218051 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-g4rk4"] Oct 08 13:09:13 crc kubenswrapper[4735]: I1008 13:09:13.419846 4735 generic.go:334] "Generic (PLEG): container finished" podID="fba7b386-8f82-48f8-8060-797a0d88d0b9" containerID="6a8622bb5c20741a610fd0baa9560b3ae109f5d1b27f8f434a9aea3c02be3c3e" exitCode=0 Oct 08 13:09:13 crc kubenswrapper[4735]: I1008 13:09:13.419885 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g4rk4" event={"ID":"fba7b386-8f82-48f8-8060-797a0d88d0b9","Type":"ContainerDied","Data":"6a8622bb5c20741a610fd0baa9560b3ae109f5d1b27f8f434a9aea3c02be3c3e"} Oct 08 13:09:13 crc kubenswrapper[4735]: I1008 13:09:13.419909 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g4rk4" event={"ID":"fba7b386-8f82-48f8-8060-797a0d88d0b9","Type":"ContainerStarted","Data":"60d42393e58ae96f9fee1795b3d2e4f368e2ebf83fd99f66a796735c66852a01"} Oct 08 13:09:14 crc kubenswrapper[4735]: I1008 13:09:14.426769 4735 generic.go:334] "Generic (PLEG): container finished" podID="fba7b386-8f82-48f8-8060-797a0d88d0b9" containerID="0ab9a795cace9ce9beaea7100d49aacd763b6746edee6e4ea44574dc0ff09bad" exitCode=0 Oct 08 13:09:14 crc kubenswrapper[4735]: I1008 13:09:14.426939 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g4rk4" event={"ID":"fba7b386-8f82-48f8-8060-797a0d88d0b9","Type":"ContainerDied","Data":"0ab9a795cace9ce9beaea7100d49aacd763b6746edee6e4ea44574dc0ff09bad"} Oct 08 13:09:15 crc kubenswrapper[4735]: I1008 13:09:15.437710 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g4rk4" 
event={"ID":"fba7b386-8f82-48f8-8060-797a0d88d0b9","Type":"ContainerStarted","Data":"7d0927739d0453b330404a8fee4ef28af07332fbcbda963edf3d7287542253b3"} Oct 08 13:09:15 crc kubenswrapper[4735]: I1008 13:09:15.464809 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-g4rk4" podStartSLOduration=2.037257237 podStartE2EDuration="3.464792553s" podCreationTimestamp="2025-10-08 13:09:12 +0000 UTC" firstStartedPulling="2025-10-08 13:09:13.421975089 +0000 UTC m=+803.230279633" lastFinishedPulling="2025-10-08 13:09:14.849510405 +0000 UTC m=+804.657814949" observedRunningTime="2025-10-08 13:09:15.461116073 +0000 UTC m=+805.269420667" watchObservedRunningTime="2025-10-08 13:09:15.464792553 +0000 UTC m=+805.273097097" Oct 08 13:09:18 crc kubenswrapper[4735]: I1008 13:09:18.241996 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-r9lhp" Oct 08 13:09:22 crc kubenswrapper[4735]: I1008 13:09:22.789787 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-g4rk4" Oct 08 13:09:22 crc kubenswrapper[4735]: I1008 13:09:22.790098 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-g4rk4" Oct 08 13:09:22 crc kubenswrapper[4735]: I1008 13:09:22.824298 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-g4rk4" Oct 08 13:09:23 crc kubenswrapper[4735]: I1008 13:09:23.519049 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-g4rk4" Oct 08 13:09:23 crc kubenswrapper[4735]: I1008 13:09:23.561509 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-g4rk4"] Oct 08 13:09:24 crc kubenswrapper[4735]: I1008 13:09:24.878498 4735 patch_prober.go:28] 
interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 13:09:24 crc kubenswrapper[4735]: I1008 13:09:24.878793 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 13:09:25 crc kubenswrapper[4735]: I1008 13:09:25.493136 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-g4rk4" podUID="fba7b386-8f82-48f8-8060-797a0d88d0b9" containerName="registry-server" containerID="cri-o://7d0927739d0453b330404a8fee4ef28af07332fbcbda963edf3d7287542253b3" gracePeriod=2 Oct 08 13:09:25 crc kubenswrapper[4735]: I1008 13:09:25.878769 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g4rk4" Oct 08 13:09:25 crc kubenswrapper[4735]: I1008 13:09:25.993467 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fba7b386-8f82-48f8-8060-797a0d88d0b9-catalog-content\") pod \"fba7b386-8f82-48f8-8060-797a0d88d0b9\" (UID: \"fba7b386-8f82-48f8-8060-797a0d88d0b9\") " Oct 08 13:09:25 crc kubenswrapper[4735]: I1008 13:09:25.993576 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fba7b386-8f82-48f8-8060-797a0d88d0b9-utilities\") pod \"fba7b386-8f82-48f8-8060-797a0d88d0b9\" (UID: \"fba7b386-8f82-48f8-8060-797a0d88d0b9\") " Oct 08 13:09:25 crc kubenswrapper[4735]: I1008 13:09:25.994128 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hm49h\" (UniqueName: \"kubernetes.io/projected/fba7b386-8f82-48f8-8060-797a0d88d0b9-kube-api-access-hm49h\") pod \"fba7b386-8f82-48f8-8060-797a0d88d0b9\" (UID: \"fba7b386-8f82-48f8-8060-797a0d88d0b9\") " Oct 08 13:09:25 crc kubenswrapper[4735]: I1008 13:09:25.994567 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fba7b386-8f82-48f8-8060-797a0d88d0b9-utilities" (OuterVolumeSpecName: "utilities") pod "fba7b386-8f82-48f8-8060-797a0d88d0b9" (UID: "fba7b386-8f82-48f8-8060-797a0d88d0b9"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:09:25 crc kubenswrapper[4735]: I1008 13:09:25.994940 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fba7b386-8f82-48f8-8060-797a0d88d0b9-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 13:09:26 crc kubenswrapper[4735]: I1008 13:09:26.008098 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fba7b386-8f82-48f8-8060-797a0d88d0b9-kube-api-access-hm49h" (OuterVolumeSpecName: "kube-api-access-hm49h") pod "fba7b386-8f82-48f8-8060-797a0d88d0b9" (UID: "fba7b386-8f82-48f8-8060-797a0d88d0b9"). InnerVolumeSpecName "kube-api-access-hm49h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:09:26 crc kubenswrapper[4735]: I1008 13:09:26.009774 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fba7b386-8f82-48f8-8060-797a0d88d0b9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fba7b386-8f82-48f8-8060-797a0d88d0b9" (UID: "fba7b386-8f82-48f8-8060-797a0d88d0b9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:09:26 crc kubenswrapper[4735]: I1008 13:09:26.095704 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fba7b386-8f82-48f8-8060-797a0d88d0b9-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 13:09:26 crc kubenswrapper[4735]: I1008 13:09:26.096086 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hm49h\" (UniqueName: \"kubernetes.io/projected/fba7b386-8f82-48f8-8060-797a0d88d0b9-kube-api-access-hm49h\") on node \"crc\" DevicePath \"\"" Oct 08 13:09:26 crc kubenswrapper[4735]: I1008 13:09:26.504221 4735 generic.go:334] "Generic (PLEG): container finished" podID="fba7b386-8f82-48f8-8060-797a0d88d0b9" containerID="7d0927739d0453b330404a8fee4ef28af07332fbcbda963edf3d7287542253b3" exitCode=0 Oct 08 13:09:26 crc kubenswrapper[4735]: I1008 13:09:26.504263 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g4rk4" event={"ID":"fba7b386-8f82-48f8-8060-797a0d88d0b9","Type":"ContainerDied","Data":"7d0927739d0453b330404a8fee4ef28af07332fbcbda963edf3d7287542253b3"} Oct 08 13:09:26 crc kubenswrapper[4735]: I1008 13:09:26.504291 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g4rk4" event={"ID":"fba7b386-8f82-48f8-8060-797a0d88d0b9","Type":"ContainerDied","Data":"60d42393e58ae96f9fee1795b3d2e4f368e2ebf83fd99f66a796735c66852a01"} Oct 08 13:09:26 crc kubenswrapper[4735]: I1008 13:09:26.504300 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g4rk4" Oct 08 13:09:26 crc kubenswrapper[4735]: I1008 13:09:26.504311 4735 scope.go:117] "RemoveContainer" containerID="7d0927739d0453b330404a8fee4ef28af07332fbcbda963edf3d7287542253b3" Oct 08 13:09:26 crc kubenswrapper[4735]: I1008 13:09:26.526629 4735 scope.go:117] "RemoveContainer" containerID="0ab9a795cace9ce9beaea7100d49aacd763b6746edee6e4ea44574dc0ff09bad" Oct 08 13:09:26 crc kubenswrapper[4735]: I1008 13:09:26.539575 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-g4rk4"] Oct 08 13:09:26 crc kubenswrapper[4735]: I1008 13:09:26.544309 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-g4rk4"] Oct 08 13:09:26 crc kubenswrapper[4735]: I1008 13:09:26.565218 4735 scope.go:117] "RemoveContainer" containerID="6a8622bb5c20741a610fd0baa9560b3ae109f5d1b27f8f434a9aea3c02be3c3e" Oct 08 13:09:26 crc kubenswrapper[4735]: I1008 13:09:26.579981 4735 scope.go:117] "RemoveContainer" containerID="7d0927739d0453b330404a8fee4ef28af07332fbcbda963edf3d7287542253b3" Oct 08 13:09:26 crc kubenswrapper[4735]: E1008 13:09:26.580652 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d0927739d0453b330404a8fee4ef28af07332fbcbda963edf3d7287542253b3\": container with ID starting with 7d0927739d0453b330404a8fee4ef28af07332fbcbda963edf3d7287542253b3 not found: ID does not exist" containerID="7d0927739d0453b330404a8fee4ef28af07332fbcbda963edf3d7287542253b3" Oct 08 13:09:26 crc kubenswrapper[4735]: I1008 13:09:26.580828 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d0927739d0453b330404a8fee4ef28af07332fbcbda963edf3d7287542253b3"} err="failed to get container status \"7d0927739d0453b330404a8fee4ef28af07332fbcbda963edf3d7287542253b3\": rpc error: code = NotFound desc = could not find container 
\"7d0927739d0453b330404a8fee4ef28af07332fbcbda963edf3d7287542253b3\": container with ID starting with 7d0927739d0453b330404a8fee4ef28af07332fbcbda963edf3d7287542253b3 not found: ID does not exist" Oct 08 13:09:26 crc kubenswrapper[4735]: I1008 13:09:26.580980 4735 scope.go:117] "RemoveContainer" containerID="0ab9a795cace9ce9beaea7100d49aacd763b6746edee6e4ea44574dc0ff09bad" Oct 08 13:09:26 crc kubenswrapper[4735]: E1008 13:09:26.581530 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ab9a795cace9ce9beaea7100d49aacd763b6746edee6e4ea44574dc0ff09bad\": container with ID starting with 0ab9a795cace9ce9beaea7100d49aacd763b6746edee6e4ea44574dc0ff09bad not found: ID does not exist" containerID="0ab9a795cace9ce9beaea7100d49aacd763b6746edee6e4ea44574dc0ff09bad" Oct 08 13:09:26 crc kubenswrapper[4735]: I1008 13:09:26.581639 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ab9a795cace9ce9beaea7100d49aacd763b6746edee6e4ea44574dc0ff09bad"} err="failed to get container status \"0ab9a795cace9ce9beaea7100d49aacd763b6746edee6e4ea44574dc0ff09bad\": rpc error: code = NotFound desc = could not find container \"0ab9a795cace9ce9beaea7100d49aacd763b6746edee6e4ea44574dc0ff09bad\": container with ID starting with 0ab9a795cace9ce9beaea7100d49aacd763b6746edee6e4ea44574dc0ff09bad not found: ID does not exist" Oct 08 13:09:26 crc kubenswrapper[4735]: I1008 13:09:26.581731 4735 scope.go:117] "RemoveContainer" containerID="6a8622bb5c20741a610fd0baa9560b3ae109f5d1b27f8f434a9aea3c02be3c3e" Oct 08 13:09:26 crc kubenswrapper[4735]: E1008 13:09:26.582187 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a8622bb5c20741a610fd0baa9560b3ae109f5d1b27f8f434a9aea3c02be3c3e\": container with ID starting with 6a8622bb5c20741a610fd0baa9560b3ae109f5d1b27f8f434a9aea3c02be3c3e not found: ID does not exist" 
containerID="6a8622bb5c20741a610fd0baa9560b3ae109f5d1b27f8f434a9aea3c02be3c3e" Oct 08 13:09:26 crc kubenswrapper[4735]: I1008 13:09:26.582237 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a8622bb5c20741a610fd0baa9560b3ae109f5d1b27f8f434a9aea3c02be3c3e"} err="failed to get container status \"6a8622bb5c20741a610fd0baa9560b3ae109f5d1b27f8f434a9aea3c02be3c3e\": rpc error: code = NotFound desc = could not find container \"6a8622bb5c20741a610fd0baa9560b3ae109f5d1b27f8f434a9aea3c02be3c3e\": container with ID starting with 6a8622bb5c20741a610fd0baa9560b3ae109f5d1b27f8f434a9aea3c02be3c3e not found: ID does not exist" Oct 08 13:09:26 crc kubenswrapper[4735]: I1008 13:09:26.901958 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fba7b386-8f82-48f8-8060-797a0d88d0b9" path="/var/lib/kubelet/pods/fba7b386-8f82-48f8-8060-797a0d88d0b9/volumes" Oct 08 13:09:30 crc kubenswrapper[4735]: I1008 13:09:30.385168 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tf9cj"] Oct 08 13:09:30 crc kubenswrapper[4735]: E1008 13:09:30.385976 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fba7b386-8f82-48f8-8060-797a0d88d0b9" containerName="extract-utilities" Oct 08 13:09:30 crc kubenswrapper[4735]: I1008 13:09:30.385993 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="fba7b386-8f82-48f8-8060-797a0d88d0b9" containerName="extract-utilities" Oct 08 13:09:30 crc kubenswrapper[4735]: E1008 13:09:30.386004 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fba7b386-8f82-48f8-8060-797a0d88d0b9" containerName="extract-content" Oct 08 13:09:30 crc kubenswrapper[4735]: I1008 13:09:30.386011 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="fba7b386-8f82-48f8-8060-797a0d88d0b9" containerName="extract-content" Oct 08 13:09:30 crc kubenswrapper[4735]: E1008 13:09:30.386036 4735 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fba7b386-8f82-48f8-8060-797a0d88d0b9" containerName="registry-server" Oct 08 13:09:30 crc kubenswrapper[4735]: I1008 13:09:30.386044 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="fba7b386-8f82-48f8-8060-797a0d88d0b9" containerName="registry-server" Oct 08 13:09:30 crc kubenswrapper[4735]: I1008 13:09:30.386161 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="fba7b386-8f82-48f8-8060-797a0d88d0b9" containerName="registry-server" Oct 08 13:09:30 crc kubenswrapper[4735]: I1008 13:09:30.387082 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tf9cj" Oct 08 13:09:30 crc kubenswrapper[4735]: I1008 13:09:30.388752 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 08 13:09:30 crc kubenswrapper[4735]: I1008 13:09:30.395190 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tf9cj"] Oct 08 13:09:30 crc kubenswrapper[4735]: I1008 13:09:30.444144 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ea47ffd3-7063-48a5-82ca-0121e24a845e-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tf9cj\" (UID: \"ea47ffd3-7063-48a5-82ca-0121e24a845e\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tf9cj" Oct 08 13:09:30 crc kubenswrapper[4735]: I1008 13:09:30.444203 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ea47ffd3-7063-48a5-82ca-0121e24a845e-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tf9cj\" (UID: 
\"ea47ffd3-7063-48a5-82ca-0121e24a845e\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tf9cj" Oct 08 13:09:30 crc kubenswrapper[4735]: I1008 13:09:30.444295 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm2z8\" (UniqueName: \"kubernetes.io/projected/ea47ffd3-7063-48a5-82ca-0121e24a845e-kube-api-access-cm2z8\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tf9cj\" (UID: \"ea47ffd3-7063-48a5-82ca-0121e24a845e\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tf9cj" Oct 08 13:09:30 crc kubenswrapper[4735]: I1008 13:09:30.544826 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cm2z8\" (UniqueName: \"kubernetes.io/projected/ea47ffd3-7063-48a5-82ca-0121e24a845e-kube-api-access-cm2z8\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tf9cj\" (UID: \"ea47ffd3-7063-48a5-82ca-0121e24a845e\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tf9cj" Oct 08 13:09:30 crc kubenswrapper[4735]: I1008 13:09:30.544899 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ea47ffd3-7063-48a5-82ca-0121e24a845e-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tf9cj\" (UID: \"ea47ffd3-7063-48a5-82ca-0121e24a845e\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tf9cj" Oct 08 13:09:30 crc kubenswrapper[4735]: I1008 13:09:30.544925 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ea47ffd3-7063-48a5-82ca-0121e24a845e-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tf9cj\" (UID: \"ea47ffd3-7063-48a5-82ca-0121e24a845e\") " 
pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tf9cj" Oct 08 13:09:30 crc kubenswrapper[4735]: I1008 13:09:30.545396 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ea47ffd3-7063-48a5-82ca-0121e24a845e-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tf9cj\" (UID: \"ea47ffd3-7063-48a5-82ca-0121e24a845e\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tf9cj" Oct 08 13:09:30 crc kubenswrapper[4735]: I1008 13:09:30.545425 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ea47ffd3-7063-48a5-82ca-0121e24a845e-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tf9cj\" (UID: \"ea47ffd3-7063-48a5-82ca-0121e24a845e\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tf9cj" Oct 08 13:09:30 crc kubenswrapper[4735]: I1008 13:09:30.562647 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm2z8\" (UniqueName: \"kubernetes.io/projected/ea47ffd3-7063-48a5-82ca-0121e24a845e-kube-api-access-cm2z8\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tf9cj\" (UID: \"ea47ffd3-7063-48a5-82ca-0121e24a845e\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tf9cj" Oct 08 13:09:30 crc kubenswrapper[4735]: I1008 13:09:30.703659 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tf9cj" Oct 08 13:09:31 crc kubenswrapper[4735]: I1008 13:09:31.084766 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tf9cj"] Oct 08 13:09:31 crc kubenswrapper[4735]: W1008 13:09:31.091417 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea47ffd3_7063_48a5_82ca_0121e24a845e.slice/crio-45fae118bf289fa8d2ddb3612e2fa6d952ebd8859f1f8847acc50fc4f2cec458 WatchSource:0}: Error finding container 45fae118bf289fa8d2ddb3612e2fa6d952ebd8859f1f8847acc50fc4f2cec458: Status 404 returned error can't find the container with id 45fae118bf289fa8d2ddb3612e2fa6d952ebd8859f1f8847acc50fc4f2cec458 Oct 08 13:09:31 crc kubenswrapper[4735]: I1008 13:09:31.539329 4735 generic.go:334] "Generic (PLEG): container finished" podID="ea47ffd3-7063-48a5-82ca-0121e24a845e" containerID="dfafbc274ccec0996ec91fc874ff5143082905cd724c7e85d57f403ad16bb607" exitCode=0 Oct 08 13:09:31 crc kubenswrapper[4735]: I1008 13:09:31.539518 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tf9cj" event={"ID":"ea47ffd3-7063-48a5-82ca-0121e24a845e","Type":"ContainerDied","Data":"dfafbc274ccec0996ec91fc874ff5143082905cd724c7e85d57f403ad16bb607"} Oct 08 13:09:31 crc kubenswrapper[4735]: I1008 13:09:31.539633 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tf9cj" event={"ID":"ea47ffd3-7063-48a5-82ca-0121e24a845e","Type":"ContainerStarted","Data":"45fae118bf289fa8d2ddb3612e2fa6d952ebd8859f1f8847acc50fc4f2cec458"} Oct 08 13:09:33 crc kubenswrapper[4735]: I1008 13:09:33.552437 4735 generic.go:334] "Generic (PLEG): container finished" 
podID="ea47ffd3-7063-48a5-82ca-0121e24a845e" containerID="fd06b8b43c28e7d3186d0af61f629dad34c35f42d353f52b2f71ac9dd3c59d9d" exitCode=0 Oct 08 13:09:33 crc kubenswrapper[4735]: I1008 13:09:33.552515 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tf9cj" event={"ID":"ea47ffd3-7063-48a5-82ca-0121e24a845e","Type":"ContainerDied","Data":"fd06b8b43c28e7d3186d0af61f629dad34c35f42d353f52b2f71ac9dd3c59d9d"} Oct 08 13:09:34 crc kubenswrapper[4735]: I1008 13:09:34.523225 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-nnp5j" podUID="ecbba0bb-2230-4624-8185-5d116632343f" containerName="console" containerID="cri-o://62b5627c183bc7d9fe412f98b9322cb565e9b1766cdb8bbbb368d29d09f30233" gracePeriod=15 Oct 08 13:09:34 crc kubenswrapper[4735]: I1008 13:09:34.560148 4735 generic.go:334] "Generic (PLEG): container finished" podID="ea47ffd3-7063-48a5-82ca-0121e24a845e" containerID="8d853b25d0d4ccbffa6bcc20390244ecf4e03498fc144145d7ab6a423c8fb8fe" exitCode=0 Oct 08 13:09:34 crc kubenswrapper[4735]: I1008 13:09:34.560195 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tf9cj" event={"ID":"ea47ffd3-7063-48a5-82ca-0121e24a845e","Type":"ContainerDied","Data":"8d853b25d0d4ccbffa6bcc20390244ecf4e03498fc144145d7ab6a423c8fb8fe"} Oct 08 13:09:34 crc kubenswrapper[4735]: I1008 13:09:34.861662 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-nnp5j_ecbba0bb-2230-4624-8185-5d116632343f/console/0.log" Oct 08 13:09:34 crc kubenswrapper[4735]: I1008 13:09:34.861765 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-nnp5j" Oct 08 13:09:34 crc kubenswrapper[4735]: I1008 13:09:34.994849 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ecbba0bb-2230-4624-8185-5d116632343f-console-oauth-config\") pod \"ecbba0bb-2230-4624-8185-5d116632343f\" (UID: \"ecbba0bb-2230-4624-8185-5d116632343f\") " Oct 08 13:09:34 crc kubenswrapper[4735]: I1008 13:09:34.995205 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ecbba0bb-2230-4624-8185-5d116632343f-console-config\") pod \"ecbba0bb-2230-4624-8185-5d116632343f\" (UID: \"ecbba0bb-2230-4624-8185-5d116632343f\") " Oct 08 13:09:34 crc kubenswrapper[4735]: I1008 13:09:34.995253 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ecbba0bb-2230-4624-8185-5d116632343f-console-serving-cert\") pod \"ecbba0bb-2230-4624-8185-5d116632343f\" (UID: \"ecbba0bb-2230-4624-8185-5d116632343f\") " Oct 08 13:09:34 crc kubenswrapper[4735]: I1008 13:09:34.995286 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ecbba0bb-2230-4624-8185-5d116632343f-oauth-serving-cert\") pod \"ecbba0bb-2230-4624-8185-5d116632343f\" (UID: \"ecbba0bb-2230-4624-8185-5d116632343f\") " Oct 08 13:09:34 crc kubenswrapper[4735]: I1008 13:09:34.995314 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ecbba0bb-2230-4624-8185-5d116632343f-trusted-ca-bundle\") pod \"ecbba0bb-2230-4624-8185-5d116632343f\" (UID: \"ecbba0bb-2230-4624-8185-5d116632343f\") " Oct 08 13:09:34 crc kubenswrapper[4735]: I1008 13:09:34.995367 4735 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-dhj9q\" (UniqueName: \"kubernetes.io/projected/ecbba0bb-2230-4624-8185-5d116632343f-kube-api-access-dhj9q\") pod \"ecbba0bb-2230-4624-8185-5d116632343f\" (UID: \"ecbba0bb-2230-4624-8185-5d116632343f\") " Oct 08 13:09:34 crc kubenswrapper[4735]: I1008 13:09:34.995407 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ecbba0bb-2230-4624-8185-5d116632343f-service-ca\") pod \"ecbba0bb-2230-4624-8185-5d116632343f\" (UID: \"ecbba0bb-2230-4624-8185-5d116632343f\") " Oct 08 13:09:34 crc kubenswrapper[4735]: I1008 13:09:34.996151 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecbba0bb-2230-4624-8185-5d116632343f-service-ca" (OuterVolumeSpecName: "service-ca") pod "ecbba0bb-2230-4624-8185-5d116632343f" (UID: "ecbba0bb-2230-4624-8185-5d116632343f"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:09:34 crc kubenswrapper[4735]: I1008 13:09:34.996356 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecbba0bb-2230-4624-8185-5d116632343f-console-config" (OuterVolumeSpecName: "console-config") pod "ecbba0bb-2230-4624-8185-5d116632343f" (UID: "ecbba0bb-2230-4624-8185-5d116632343f"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:09:34 crc kubenswrapper[4735]: I1008 13:09:34.996352 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecbba0bb-2230-4624-8185-5d116632343f-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "ecbba0bb-2230-4624-8185-5d116632343f" (UID: "ecbba0bb-2230-4624-8185-5d116632343f"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:09:34 crc kubenswrapper[4735]: I1008 13:09:34.996425 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecbba0bb-2230-4624-8185-5d116632343f-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "ecbba0bb-2230-4624-8185-5d116632343f" (UID: "ecbba0bb-2230-4624-8185-5d116632343f"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:09:35 crc kubenswrapper[4735]: I1008 13:09:35.001557 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecbba0bb-2230-4624-8185-5d116632343f-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "ecbba0bb-2230-4624-8185-5d116632343f" (UID: "ecbba0bb-2230-4624-8185-5d116632343f"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:09:35 crc kubenswrapper[4735]: I1008 13:09:35.001917 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecbba0bb-2230-4624-8185-5d116632343f-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "ecbba0bb-2230-4624-8185-5d116632343f" (UID: "ecbba0bb-2230-4624-8185-5d116632343f"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:09:35 crc kubenswrapper[4735]: I1008 13:09:35.008087 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecbba0bb-2230-4624-8185-5d116632343f-kube-api-access-dhj9q" (OuterVolumeSpecName: "kube-api-access-dhj9q") pod "ecbba0bb-2230-4624-8185-5d116632343f" (UID: "ecbba0bb-2230-4624-8185-5d116632343f"). InnerVolumeSpecName "kube-api-access-dhj9q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:09:35 crc kubenswrapper[4735]: I1008 13:09:35.096126 4735 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ecbba0bb-2230-4624-8185-5d116632343f-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 08 13:09:35 crc kubenswrapper[4735]: I1008 13:09:35.096171 4735 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ecbba0bb-2230-4624-8185-5d116632343f-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 13:09:35 crc kubenswrapper[4735]: I1008 13:09:35.096185 4735 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ecbba0bb-2230-4624-8185-5d116632343f-console-config\") on node \"crc\" DevicePath \"\"" Oct 08 13:09:35 crc kubenswrapper[4735]: I1008 13:09:35.096198 4735 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ecbba0bb-2230-4624-8185-5d116632343f-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 13:09:35 crc kubenswrapper[4735]: I1008 13:09:35.096212 4735 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ecbba0bb-2230-4624-8185-5d116632343f-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 13:09:35 crc kubenswrapper[4735]: I1008 13:09:35.096224 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhj9q\" (UniqueName: \"kubernetes.io/projected/ecbba0bb-2230-4624-8185-5d116632343f-kube-api-access-dhj9q\") on node \"crc\" DevicePath \"\"" Oct 08 13:09:35 crc kubenswrapper[4735]: I1008 13:09:35.096237 4735 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ecbba0bb-2230-4624-8185-5d116632343f-service-ca\") on node \"crc\" DevicePath \"\"" Oct 08 13:09:35 crc 
kubenswrapper[4735]: I1008 13:09:35.569090 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-nnp5j_ecbba0bb-2230-4624-8185-5d116632343f/console/0.log" Oct 08 13:09:35 crc kubenswrapper[4735]: I1008 13:09:35.569171 4735 generic.go:334] "Generic (PLEG): container finished" podID="ecbba0bb-2230-4624-8185-5d116632343f" containerID="62b5627c183bc7d9fe412f98b9322cb565e9b1766cdb8bbbb368d29d09f30233" exitCode=2 Oct 08 13:09:35 crc kubenswrapper[4735]: I1008 13:09:35.569464 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-nnp5j" Oct 08 13:09:35 crc kubenswrapper[4735]: I1008 13:09:35.570281 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-nnp5j" event={"ID":"ecbba0bb-2230-4624-8185-5d116632343f","Type":"ContainerDied","Data":"62b5627c183bc7d9fe412f98b9322cb565e9b1766cdb8bbbb368d29d09f30233"} Oct 08 13:09:35 crc kubenswrapper[4735]: I1008 13:09:35.570364 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-nnp5j" event={"ID":"ecbba0bb-2230-4624-8185-5d116632343f","Type":"ContainerDied","Data":"dd61ad4d430fee166999d1361e0bd2ee2bda1339e6bdde16ca238a95cc1d9548"} Oct 08 13:09:35 crc kubenswrapper[4735]: I1008 13:09:35.570389 4735 scope.go:117] "RemoveContainer" containerID="62b5627c183bc7d9fe412f98b9322cb565e9b1766cdb8bbbb368d29d09f30233" Oct 08 13:09:35 crc kubenswrapper[4735]: I1008 13:09:35.591547 4735 scope.go:117] "RemoveContainer" containerID="62b5627c183bc7d9fe412f98b9322cb565e9b1766cdb8bbbb368d29d09f30233" Oct 08 13:09:35 crc kubenswrapper[4735]: E1008 13:09:35.592133 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62b5627c183bc7d9fe412f98b9322cb565e9b1766cdb8bbbb368d29d09f30233\": container with ID starting with 62b5627c183bc7d9fe412f98b9322cb565e9b1766cdb8bbbb368d29d09f30233 not 
found: ID does not exist" containerID="62b5627c183bc7d9fe412f98b9322cb565e9b1766cdb8bbbb368d29d09f30233" Oct 08 13:09:35 crc kubenswrapper[4735]: I1008 13:09:35.592179 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62b5627c183bc7d9fe412f98b9322cb565e9b1766cdb8bbbb368d29d09f30233"} err="failed to get container status \"62b5627c183bc7d9fe412f98b9322cb565e9b1766cdb8bbbb368d29d09f30233\": rpc error: code = NotFound desc = could not find container \"62b5627c183bc7d9fe412f98b9322cb565e9b1766cdb8bbbb368d29d09f30233\": container with ID starting with 62b5627c183bc7d9fe412f98b9322cb565e9b1766cdb8bbbb368d29d09f30233 not found: ID does not exist" Oct 08 13:09:35 crc kubenswrapper[4735]: I1008 13:09:35.600529 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-nnp5j"] Oct 08 13:09:35 crc kubenswrapper[4735]: I1008 13:09:35.603059 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-nnp5j"] Oct 08 13:09:35 crc kubenswrapper[4735]: I1008 13:09:35.858694 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tf9cj" Oct 08 13:09:36 crc kubenswrapper[4735]: I1008 13:09:36.008080 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cm2z8\" (UniqueName: \"kubernetes.io/projected/ea47ffd3-7063-48a5-82ca-0121e24a845e-kube-api-access-cm2z8\") pod \"ea47ffd3-7063-48a5-82ca-0121e24a845e\" (UID: \"ea47ffd3-7063-48a5-82ca-0121e24a845e\") " Oct 08 13:09:36 crc kubenswrapper[4735]: I1008 13:09:36.008138 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ea47ffd3-7063-48a5-82ca-0121e24a845e-bundle\") pod \"ea47ffd3-7063-48a5-82ca-0121e24a845e\" (UID: \"ea47ffd3-7063-48a5-82ca-0121e24a845e\") " Oct 08 13:09:36 crc kubenswrapper[4735]: I1008 13:09:36.008161 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ea47ffd3-7063-48a5-82ca-0121e24a845e-util\") pod \"ea47ffd3-7063-48a5-82ca-0121e24a845e\" (UID: \"ea47ffd3-7063-48a5-82ca-0121e24a845e\") " Oct 08 13:09:36 crc kubenswrapper[4735]: I1008 13:09:36.009333 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea47ffd3-7063-48a5-82ca-0121e24a845e-bundle" (OuterVolumeSpecName: "bundle") pod "ea47ffd3-7063-48a5-82ca-0121e24a845e" (UID: "ea47ffd3-7063-48a5-82ca-0121e24a845e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:09:36 crc kubenswrapper[4735]: I1008 13:09:36.015323 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea47ffd3-7063-48a5-82ca-0121e24a845e-kube-api-access-cm2z8" (OuterVolumeSpecName: "kube-api-access-cm2z8") pod "ea47ffd3-7063-48a5-82ca-0121e24a845e" (UID: "ea47ffd3-7063-48a5-82ca-0121e24a845e"). InnerVolumeSpecName "kube-api-access-cm2z8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:09:36 crc kubenswrapper[4735]: I1008 13:09:36.034231 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea47ffd3-7063-48a5-82ca-0121e24a845e-util" (OuterVolumeSpecName: "util") pod "ea47ffd3-7063-48a5-82ca-0121e24a845e" (UID: "ea47ffd3-7063-48a5-82ca-0121e24a845e"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:09:36 crc kubenswrapper[4735]: I1008 13:09:36.110391 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cm2z8\" (UniqueName: \"kubernetes.io/projected/ea47ffd3-7063-48a5-82ca-0121e24a845e-kube-api-access-cm2z8\") on node \"crc\" DevicePath \"\"" Oct 08 13:09:36 crc kubenswrapper[4735]: I1008 13:09:36.110444 4735 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ea47ffd3-7063-48a5-82ca-0121e24a845e-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 13:09:36 crc kubenswrapper[4735]: I1008 13:09:36.110456 4735 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ea47ffd3-7063-48a5-82ca-0121e24a845e-util\") on node \"crc\" DevicePath \"\"" Oct 08 13:09:36 crc kubenswrapper[4735]: I1008 13:09:36.578054 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tf9cj" event={"ID":"ea47ffd3-7063-48a5-82ca-0121e24a845e","Type":"ContainerDied","Data":"45fae118bf289fa8d2ddb3612e2fa6d952ebd8859f1f8847acc50fc4f2cec458"} Oct 08 13:09:36 crc kubenswrapper[4735]: I1008 13:09:36.578101 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45fae118bf289fa8d2ddb3612e2fa6d952ebd8859f1f8847acc50fc4f2cec458" Oct 08 13:09:36 crc kubenswrapper[4735]: I1008 13:09:36.578076 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tf9cj" Oct 08 13:09:36 crc kubenswrapper[4735]: I1008 13:09:36.904514 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecbba0bb-2230-4624-8185-5d116632343f" path="/var/lib/kubelet/pods/ecbba0bb-2230-4624-8185-5d116632343f/volumes" Oct 08 13:09:45 crc kubenswrapper[4735]: I1008 13:09:45.280354 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-568b68fc89-tcglq"] Oct 08 13:09:45 crc kubenswrapper[4735]: E1008 13:09:45.281152 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea47ffd3-7063-48a5-82ca-0121e24a845e" containerName="util" Oct 08 13:09:45 crc kubenswrapper[4735]: I1008 13:09:45.281168 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea47ffd3-7063-48a5-82ca-0121e24a845e" containerName="util" Oct 08 13:09:45 crc kubenswrapper[4735]: E1008 13:09:45.281183 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea47ffd3-7063-48a5-82ca-0121e24a845e" containerName="extract" Oct 08 13:09:45 crc kubenswrapper[4735]: I1008 13:09:45.281194 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea47ffd3-7063-48a5-82ca-0121e24a845e" containerName="extract" Oct 08 13:09:45 crc kubenswrapper[4735]: E1008 13:09:45.281205 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecbba0bb-2230-4624-8185-5d116632343f" containerName="console" Oct 08 13:09:45 crc kubenswrapper[4735]: I1008 13:09:45.281214 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecbba0bb-2230-4624-8185-5d116632343f" containerName="console" Oct 08 13:09:45 crc kubenswrapper[4735]: E1008 13:09:45.281237 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea47ffd3-7063-48a5-82ca-0121e24a845e" containerName="pull" Oct 08 13:09:45 crc kubenswrapper[4735]: I1008 13:09:45.281244 4735 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="ea47ffd3-7063-48a5-82ca-0121e24a845e" containerName="pull"
Oct 08 13:09:45 crc kubenswrapper[4735]: I1008 13:09:45.281353 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecbba0bb-2230-4624-8185-5d116632343f" containerName="console"
Oct 08 13:09:45 crc kubenswrapper[4735]: I1008 13:09:45.281377 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea47ffd3-7063-48a5-82ca-0121e24a845e" containerName="extract"
Oct 08 13:09:45 crc kubenswrapper[4735]: I1008 13:09:45.281941 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-568b68fc89-tcglq"
Oct 08 13:09:45 crc kubenswrapper[4735]: I1008 13:09:45.285284 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt"
Oct 08 13:09:45 crc kubenswrapper[4735]: I1008 13:09:45.285340 4735 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert"
Oct 08 13:09:45 crc kubenswrapper[4735]: I1008 13:09:45.285550 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt"
Oct 08 13:09:45 crc kubenswrapper[4735]: I1008 13:09:45.285812 4735 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert"
Oct 08 13:09:45 crc kubenswrapper[4735]: I1008 13:09:45.291355 4735 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-tz62n"
Oct 08 13:09:45 crc kubenswrapper[4735]: I1008 13:09:45.303169 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-568b68fc89-tcglq"]
Oct 08 13:09:45 crc kubenswrapper[4735]: I1008 13:09:45.417792 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/492d2d30-e201-405e-97aa-5457a763411d-apiservice-cert\") pod \"metallb-operator-controller-manager-568b68fc89-tcglq\" (UID: \"492d2d30-e201-405e-97aa-5457a763411d\") " pod="metallb-system/metallb-operator-controller-manager-568b68fc89-tcglq"
Oct 08 13:09:45 crc kubenswrapper[4735]: I1008 13:09:45.418430 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/492d2d30-e201-405e-97aa-5457a763411d-webhook-cert\") pod \"metallb-operator-controller-manager-568b68fc89-tcglq\" (UID: \"492d2d30-e201-405e-97aa-5457a763411d\") " pod="metallb-system/metallb-operator-controller-manager-568b68fc89-tcglq"
Oct 08 13:09:45 crc kubenswrapper[4735]: I1008 13:09:45.418571 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqtck\" (UniqueName: \"kubernetes.io/projected/492d2d30-e201-405e-97aa-5457a763411d-kube-api-access-bqtck\") pod \"metallb-operator-controller-manager-568b68fc89-tcglq\" (UID: \"492d2d30-e201-405e-97aa-5457a763411d\") " pod="metallb-system/metallb-operator-controller-manager-568b68fc89-tcglq"
Oct 08 13:09:45 crc kubenswrapper[4735]: I1008 13:09:45.519404 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/492d2d30-e201-405e-97aa-5457a763411d-apiservice-cert\") pod \"metallb-operator-controller-manager-568b68fc89-tcglq\" (UID: \"492d2d30-e201-405e-97aa-5457a763411d\") " pod="metallb-system/metallb-operator-controller-manager-568b68fc89-tcglq"
Oct 08 13:09:45 crc kubenswrapper[4735]: I1008 13:09:45.519791 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/492d2d30-e201-405e-97aa-5457a763411d-webhook-cert\") pod \"metallb-operator-controller-manager-568b68fc89-tcglq\" (UID: \"492d2d30-e201-405e-97aa-5457a763411d\") " pod="metallb-system/metallb-operator-controller-manager-568b68fc89-tcglq"
Oct 08 13:09:45 crc kubenswrapper[4735]: I1008 13:09:45.519939 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqtck\" (UniqueName: \"kubernetes.io/projected/492d2d30-e201-405e-97aa-5457a763411d-kube-api-access-bqtck\") pod \"metallb-operator-controller-manager-568b68fc89-tcglq\" (UID: \"492d2d30-e201-405e-97aa-5457a763411d\") " pod="metallb-system/metallb-operator-controller-manager-568b68fc89-tcglq"
Oct 08 13:09:45 crc kubenswrapper[4735]: I1008 13:09:45.525647 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/492d2d30-e201-405e-97aa-5457a763411d-webhook-cert\") pod \"metallb-operator-controller-manager-568b68fc89-tcglq\" (UID: \"492d2d30-e201-405e-97aa-5457a763411d\") " pod="metallb-system/metallb-operator-controller-manager-568b68fc89-tcglq"
Oct 08 13:09:45 crc kubenswrapper[4735]: I1008 13:09:45.528001 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/492d2d30-e201-405e-97aa-5457a763411d-apiservice-cert\") pod \"metallb-operator-controller-manager-568b68fc89-tcglq\" (UID: \"492d2d30-e201-405e-97aa-5457a763411d\") " pod="metallb-system/metallb-operator-controller-manager-568b68fc89-tcglq"
Oct 08 13:09:45 crc kubenswrapper[4735]: I1008 13:09:45.542134 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqtck\" (UniqueName: \"kubernetes.io/projected/492d2d30-e201-405e-97aa-5457a763411d-kube-api-access-bqtck\") pod \"metallb-operator-controller-manager-568b68fc89-tcglq\" (UID: \"492d2d30-e201-405e-97aa-5457a763411d\") " pod="metallb-system/metallb-operator-controller-manager-568b68fc89-tcglq"
Oct 08 13:09:45 crc kubenswrapper[4735]: I1008 13:09:45.596134 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-568b68fc89-tcglq"
Oct 08 13:09:45 crc kubenswrapper[4735]: I1008 13:09:45.705901 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-85bdc78dfc-4j6pd"]
Oct 08 13:09:45 crc kubenswrapper[4735]: I1008 13:09:45.706777 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-85bdc78dfc-4j6pd"
Oct 08 13:09:45 crc kubenswrapper[4735]: I1008 13:09:45.710072 4735 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert"
Oct 08 13:09:45 crc kubenswrapper[4735]: I1008 13:09:45.710306 4735 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-6xltn"
Oct 08 13:09:45 crc kubenswrapper[4735]: I1008 13:09:45.710456 4735 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Oct 08 13:09:45 crc kubenswrapper[4735]: I1008 13:09:45.755002 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-85bdc78dfc-4j6pd"]
Oct 08 13:09:45 crc kubenswrapper[4735]: I1008 13:09:45.828209 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6646229c-d6d5-41ac-bcc2-22fcb3b2c737-webhook-cert\") pod \"metallb-operator-webhook-server-85bdc78dfc-4j6pd\" (UID: \"6646229c-d6d5-41ac-bcc2-22fcb3b2c737\") " pod="metallb-system/metallb-operator-webhook-server-85bdc78dfc-4j6pd"
Oct 08 13:09:45 crc kubenswrapper[4735]: I1008 13:09:45.828266 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwj5s\" (UniqueName: \"kubernetes.io/projected/6646229c-d6d5-41ac-bcc2-22fcb3b2c737-kube-api-access-kwj5s\") pod \"metallb-operator-webhook-server-85bdc78dfc-4j6pd\" (UID: \"6646229c-d6d5-41ac-bcc2-22fcb3b2c737\") " pod="metallb-system/metallb-operator-webhook-server-85bdc78dfc-4j6pd"
Oct 08 13:09:45 crc kubenswrapper[4735]: I1008 13:09:45.828296 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6646229c-d6d5-41ac-bcc2-22fcb3b2c737-apiservice-cert\") pod \"metallb-operator-webhook-server-85bdc78dfc-4j6pd\" (UID: \"6646229c-d6d5-41ac-bcc2-22fcb3b2c737\") " pod="metallb-system/metallb-operator-webhook-server-85bdc78dfc-4j6pd"
Oct 08 13:09:45 crc kubenswrapper[4735]: I1008 13:09:45.929753 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6646229c-d6d5-41ac-bcc2-22fcb3b2c737-webhook-cert\") pod \"metallb-operator-webhook-server-85bdc78dfc-4j6pd\" (UID: \"6646229c-d6d5-41ac-bcc2-22fcb3b2c737\") " pod="metallb-system/metallb-operator-webhook-server-85bdc78dfc-4j6pd"
Oct 08 13:09:45 crc kubenswrapper[4735]: I1008 13:09:45.929792 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwj5s\" (UniqueName: \"kubernetes.io/projected/6646229c-d6d5-41ac-bcc2-22fcb3b2c737-kube-api-access-kwj5s\") pod \"metallb-operator-webhook-server-85bdc78dfc-4j6pd\" (UID: \"6646229c-d6d5-41ac-bcc2-22fcb3b2c737\") " pod="metallb-system/metallb-operator-webhook-server-85bdc78dfc-4j6pd"
Oct 08 13:09:45 crc kubenswrapper[4735]: I1008 13:09:45.929816 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6646229c-d6d5-41ac-bcc2-22fcb3b2c737-apiservice-cert\") pod \"metallb-operator-webhook-server-85bdc78dfc-4j6pd\" (UID: \"6646229c-d6d5-41ac-bcc2-22fcb3b2c737\") " pod="metallb-system/metallb-operator-webhook-server-85bdc78dfc-4j6pd"
Oct 08 13:09:45 crc kubenswrapper[4735]: I1008 13:09:45.935713 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6646229c-d6d5-41ac-bcc2-22fcb3b2c737-webhook-cert\") pod \"metallb-operator-webhook-server-85bdc78dfc-4j6pd\" (UID: \"6646229c-d6d5-41ac-bcc2-22fcb3b2c737\") " pod="metallb-system/metallb-operator-webhook-server-85bdc78dfc-4j6pd"
Oct 08 13:09:45 crc kubenswrapper[4735]: I1008 13:09:45.937081 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6646229c-d6d5-41ac-bcc2-22fcb3b2c737-apiservice-cert\") pod \"metallb-operator-webhook-server-85bdc78dfc-4j6pd\" (UID: \"6646229c-d6d5-41ac-bcc2-22fcb3b2c737\") " pod="metallb-system/metallb-operator-webhook-server-85bdc78dfc-4j6pd"
Oct 08 13:09:45 crc kubenswrapper[4735]: I1008 13:09:45.948166 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwj5s\" (UniqueName: \"kubernetes.io/projected/6646229c-d6d5-41ac-bcc2-22fcb3b2c737-kube-api-access-kwj5s\") pod \"metallb-operator-webhook-server-85bdc78dfc-4j6pd\" (UID: \"6646229c-d6d5-41ac-bcc2-22fcb3b2c737\") " pod="metallb-system/metallb-operator-webhook-server-85bdc78dfc-4j6pd"
Oct 08 13:09:46 crc kubenswrapper[4735]: I1008 13:09:46.051420 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-85bdc78dfc-4j6pd"
Oct 08 13:09:46 crc kubenswrapper[4735]: I1008 13:09:46.158798 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-568b68fc89-tcglq"]
Oct 08 13:09:46 crc kubenswrapper[4735]: W1008 13:09:46.171413 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod492d2d30_e201_405e_97aa_5457a763411d.slice/crio-fbea3c4f53c2130e926c8cc1850e88361225524abb6ed795e5d92d56859fa0e0 WatchSource:0}: Error finding container fbea3c4f53c2130e926c8cc1850e88361225524abb6ed795e5d92d56859fa0e0: Status 404 returned error can't find the container with id fbea3c4f53c2130e926c8cc1850e88361225524abb6ed795e5d92d56859fa0e0
Oct 08 13:09:46 crc kubenswrapper[4735]: I1008 13:09:46.456764 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-85bdc78dfc-4j6pd"]
Oct 08 13:09:46 crc kubenswrapper[4735]: W1008 13:09:46.458502 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6646229c_d6d5_41ac_bcc2_22fcb3b2c737.slice/crio-71f644b5401743993d96bebd6f35d7f51065b1fb5d31a7e6f6a370dc32abc317 WatchSource:0}: Error finding container 71f644b5401743993d96bebd6f35d7f51065b1fb5d31a7e6f6a370dc32abc317: Status 404 returned error can't find the container with id 71f644b5401743993d96bebd6f35d7f51065b1fb5d31a7e6f6a370dc32abc317
Oct 08 13:09:46 crc kubenswrapper[4735]: I1008 13:09:46.633634 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-568b68fc89-tcglq" event={"ID":"492d2d30-e201-405e-97aa-5457a763411d","Type":"ContainerStarted","Data":"fbea3c4f53c2130e926c8cc1850e88361225524abb6ed795e5d92d56859fa0e0"}
Oct 08 13:09:46 crc kubenswrapper[4735]: I1008 13:09:46.634671 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-85bdc78dfc-4j6pd" event={"ID":"6646229c-d6d5-41ac-bcc2-22fcb3b2c737","Type":"ContainerStarted","Data":"71f644b5401743993d96bebd6f35d7f51065b1fb5d31a7e6f6a370dc32abc317"}
Oct 08 13:09:52 crc kubenswrapper[4735]: I1008 13:09:52.679930 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-85bdc78dfc-4j6pd" event={"ID":"6646229c-d6d5-41ac-bcc2-22fcb3b2c737","Type":"ContainerStarted","Data":"f68f16b5c2ddad117dd3b7beb15d3a015f3949448e9b90a9e9a357c72967d656"}
Oct 08 13:09:52 crc kubenswrapper[4735]: I1008 13:09:52.680470 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-85bdc78dfc-4j6pd"
Oct 08 13:09:52 crc kubenswrapper[4735]: I1008 13:09:52.708519 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-85bdc78dfc-4j6pd" podStartSLOduration=2.2343369969999998 podStartE2EDuration="7.7085s" podCreationTimestamp="2025-10-08 13:09:45 +0000 UTC" firstStartedPulling="2025-10-08 13:09:46.461510212 +0000 UTC m=+836.269814756" lastFinishedPulling="2025-10-08 13:09:51.935673225 +0000 UTC m=+841.743977759" observedRunningTime="2025-10-08 13:09:52.70701068 +0000 UTC m=+842.515315224" watchObservedRunningTime="2025-10-08 13:09:52.7085 +0000 UTC m=+842.516804554"
Oct 08 13:09:54 crc kubenswrapper[4735]: I1008 13:09:54.878934 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 08 13:09:54 crc kubenswrapper[4735]: I1008 13:09:54.879002 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 08 13:09:54 crc kubenswrapper[4735]: I1008 13:09:54.879057 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd"
Oct 08 13:09:54 crc kubenswrapper[4735]: I1008 13:09:54.879727 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"33eef7770cb538234e6d07e0c65f9624f3726c56e48ea2376d339214ceee54ee"} pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 08 13:09:54 crc kubenswrapper[4735]: I1008 13:09:54.879791 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" containerID="cri-o://33eef7770cb538234e6d07e0c65f9624f3726c56e48ea2376d339214ceee54ee" gracePeriod=600
Oct 08 13:09:55 crc kubenswrapper[4735]: I1008 13:09:55.717528 4735 generic.go:334] "Generic (PLEG): container finished" podID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerID="33eef7770cb538234e6d07e0c65f9624f3726c56e48ea2376d339214ceee54ee" exitCode=0
Oct 08 13:09:55 crc kubenswrapper[4735]: I1008 13:09:55.717601 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" event={"ID":"2dfd0474-f64a-4d3b-9990-df182d079b10","Type":"ContainerDied","Data":"33eef7770cb538234e6d07e0c65f9624f3726c56e48ea2376d339214ceee54ee"}
Oct 08 13:09:55 crc kubenswrapper[4735]: I1008 13:09:55.717998 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" event={"ID":"2dfd0474-f64a-4d3b-9990-df182d079b10","Type":"ContainerStarted","Data":"933a14f7e22905eb9791da68c8e9bb68f5475b7d23363338a177f03a39161d6c"}
Oct 08 13:09:55 crc kubenswrapper[4735]: I1008 13:09:55.718019 4735 scope.go:117] "RemoveContainer" containerID="173789e0d046afd36eed4db36bb52d30421c87957b91d798bdb18e553a36d54d"
Oct 08 13:10:00 crc kubenswrapper[4735]: I1008 13:10:00.751250 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-568b68fc89-tcglq" event={"ID":"492d2d30-e201-405e-97aa-5457a763411d","Type":"ContainerStarted","Data":"5e50e2be163e91eb3d9550f523e1e4544a744ef5ba43ee3626977d5f0be121c1"}
Oct 08 13:10:00 crc kubenswrapper[4735]: I1008 13:10:00.752695 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-568b68fc89-tcglq"
Oct 08 13:10:00 crc kubenswrapper[4735]: I1008 13:10:00.776087 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-568b68fc89-tcglq" podStartSLOduration=2.305545585 podStartE2EDuration="15.776066807s" podCreationTimestamp="2025-10-08 13:09:45 +0000 UTC" firstStartedPulling="2025-10-08 13:09:46.197820519 +0000 UTC m=+836.006125073" lastFinishedPulling="2025-10-08 13:09:59.668341761 +0000 UTC m=+849.476646295" observedRunningTime="2025-10-08 13:10:00.773562349 +0000 UTC m=+850.581866903" watchObservedRunningTime="2025-10-08 13:10:00.776066807 +0000 UTC m=+850.584371351"
Oct 08 13:10:06 crc kubenswrapper[4735]: I1008 13:10:06.055869 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-85bdc78dfc-4j6pd"
Oct 08 13:10:25 crc kubenswrapper[4735]: I1008 13:10:25.541588 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-48k4v"]
Oct 08 13:10:25 crc kubenswrapper[4735]: I1008 13:10:25.543578 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-48k4v"
Oct 08 13:10:25 crc kubenswrapper[4735]: I1008 13:10:25.561661 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-48k4v"]
Oct 08 13:10:25 crc kubenswrapper[4735]: I1008 13:10:25.658512 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96042caa-f4e2-4a05-af44-8883a06394de-catalog-content\") pod \"certified-operators-48k4v\" (UID: \"96042caa-f4e2-4a05-af44-8883a06394de\") " pod="openshift-marketplace/certified-operators-48k4v"
Oct 08 13:10:25 crc kubenswrapper[4735]: I1008 13:10:25.658807 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpx2t\" (UniqueName: \"kubernetes.io/projected/96042caa-f4e2-4a05-af44-8883a06394de-kube-api-access-mpx2t\") pod \"certified-operators-48k4v\" (UID: \"96042caa-f4e2-4a05-af44-8883a06394de\") " pod="openshift-marketplace/certified-operators-48k4v"
Oct 08 13:10:25 crc kubenswrapper[4735]: I1008 13:10:25.658907 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96042caa-f4e2-4a05-af44-8883a06394de-utilities\") pod \"certified-operators-48k4v\" (UID: \"96042caa-f4e2-4a05-af44-8883a06394de\") " pod="openshift-marketplace/certified-operators-48k4v"
Oct 08 13:10:25 crc kubenswrapper[4735]: I1008 13:10:25.759908 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpx2t\" (UniqueName: \"kubernetes.io/projected/96042caa-f4e2-4a05-af44-8883a06394de-kube-api-access-mpx2t\") pod \"certified-operators-48k4v\" (UID: \"96042caa-f4e2-4a05-af44-8883a06394de\") " pod="openshift-marketplace/certified-operators-48k4v"
Oct 08 13:10:25 crc kubenswrapper[4735]: I1008 13:10:25.760319 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96042caa-f4e2-4a05-af44-8883a06394de-utilities\") pod \"certified-operators-48k4v\" (UID: \"96042caa-f4e2-4a05-af44-8883a06394de\") " pod="openshift-marketplace/certified-operators-48k4v"
Oct 08 13:10:25 crc kubenswrapper[4735]: I1008 13:10:25.760536 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96042caa-f4e2-4a05-af44-8883a06394de-catalog-content\") pod \"certified-operators-48k4v\" (UID: \"96042caa-f4e2-4a05-af44-8883a06394de\") " pod="openshift-marketplace/certified-operators-48k4v"
Oct 08 13:10:25 crc kubenswrapper[4735]: I1008 13:10:25.760830 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96042caa-f4e2-4a05-af44-8883a06394de-utilities\") pod \"certified-operators-48k4v\" (UID: \"96042caa-f4e2-4a05-af44-8883a06394de\") " pod="openshift-marketplace/certified-operators-48k4v"
Oct 08 13:10:25 crc kubenswrapper[4735]: I1008 13:10:25.760994 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96042caa-f4e2-4a05-af44-8883a06394de-catalog-content\") pod \"certified-operators-48k4v\" (UID: \"96042caa-f4e2-4a05-af44-8883a06394de\") " pod="openshift-marketplace/certified-operators-48k4v"
Oct 08 13:10:25 crc kubenswrapper[4735]: I1008 13:10:25.785593 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpx2t\" (UniqueName: \"kubernetes.io/projected/96042caa-f4e2-4a05-af44-8883a06394de-kube-api-access-mpx2t\") pod \"certified-operators-48k4v\" (UID: \"96042caa-f4e2-4a05-af44-8883a06394de\") " pod="openshift-marketplace/certified-operators-48k4v"
Oct 08 13:10:25 crc kubenswrapper[4735]: I1008 13:10:25.870327 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-48k4v"
Oct 08 13:10:26 crc kubenswrapper[4735]: I1008 13:10:26.126964 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-48k4v"]
Oct 08 13:10:26 crc kubenswrapper[4735]: I1008 13:10:26.884598 4735 generic.go:334] "Generic (PLEG): container finished" podID="96042caa-f4e2-4a05-af44-8883a06394de" containerID="20296ca5143a0bce79c8e46e0e920bbdecee7bbe4ac1065aaf38b8cd75ca7be5" exitCode=0
Oct 08 13:10:26 crc kubenswrapper[4735]: I1008 13:10:26.884658 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-48k4v" event={"ID":"96042caa-f4e2-4a05-af44-8883a06394de","Type":"ContainerDied","Data":"20296ca5143a0bce79c8e46e0e920bbdecee7bbe4ac1065aaf38b8cd75ca7be5"}
Oct 08 13:10:26 crc kubenswrapper[4735]: I1008 13:10:26.884708 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-48k4v" event={"ID":"96042caa-f4e2-4a05-af44-8883a06394de","Type":"ContainerStarted","Data":"513d3f5f811d1f23b5625c987b7069132080d95f8d260cc3d04ba9f60c3f7a03"}
Oct 08 13:10:30 crc kubenswrapper[4735]: I1008 13:10:30.906115 4735 generic.go:334] "Generic (PLEG): container finished" podID="96042caa-f4e2-4a05-af44-8883a06394de" containerID="705e78c208e70e10b473ad5e457818244cf518cc5d5c39b00b8049cadc94890c" exitCode=0
Oct 08 13:10:30 crc kubenswrapper[4735]: I1008 13:10:30.906208 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-48k4v" event={"ID":"96042caa-f4e2-4a05-af44-8883a06394de","Type":"ContainerDied","Data":"705e78c208e70e10b473ad5e457818244cf518cc5d5c39b00b8049cadc94890c"}
Oct 08 13:10:32 crc kubenswrapper[4735]: I1008 13:10:32.918582 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-48k4v" event={"ID":"96042caa-f4e2-4a05-af44-8883a06394de","Type":"ContainerStarted","Data":"79d58ad9d943eda3cee489443e428d1da451a99b820b7161f642b42d0772d78a"}
Oct 08 13:10:32 crc kubenswrapper[4735]: I1008 13:10:32.939918 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-48k4v" podStartSLOduration=2.980641284 podStartE2EDuration="7.939897958s" podCreationTimestamp="2025-10-08 13:10:25 +0000 UTC" firstStartedPulling="2025-10-08 13:10:26.886422646 +0000 UTC m=+876.694727190" lastFinishedPulling="2025-10-08 13:10:31.84567932 +0000 UTC m=+881.653983864" observedRunningTime="2025-10-08 13:10:32.936766393 +0000 UTC m=+882.745070937" watchObservedRunningTime="2025-10-08 13:10:32.939897958 +0000 UTC m=+882.748202502"
Oct 08 13:10:35 crc kubenswrapper[4735]: I1008 13:10:35.600881 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-568b68fc89-tcglq"
Oct 08 13:10:35 crc kubenswrapper[4735]: I1008 13:10:35.871112 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-48k4v"
Oct 08 13:10:35 crc kubenswrapper[4735]: I1008 13:10:35.871176 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-48k4v"
Oct 08 13:10:35 crc kubenswrapper[4735]: I1008 13:10:35.915275 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-48k4v"
Oct 08 13:10:36 crc kubenswrapper[4735]: I1008 13:10:36.273870 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-h2xst"]
Oct 08 13:10:36 crc kubenswrapper[4735]: I1008 13:10:36.275968 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-h2xst"
Oct 08 13:10:36 crc kubenswrapper[4735]: I1008 13:10:36.278028 4735 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret"
Oct 08 13:10:36 crc kubenswrapper[4735]: I1008 13:10:36.278142 4735 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-42bt9"
Oct 08 13:10:36 crc kubenswrapper[4735]: I1008 13:10:36.278696 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup"
Oct 08 13:10:36 crc kubenswrapper[4735]: I1008 13:10:36.286805 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-frhpw"]
Oct 08 13:10:36 crc kubenswrapper[4735]: I1008 13:10:36.287621 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-frhpw"
Oct 08 13:10:36 crc kubenswrapper[4735]: I1008 13:10:36.291131 4735 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert"
Oct 08 13:10:36 crc kubenswrapper[4735]: I1008 13:10:36.296400 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-frhpw"]
Oct 08 13:10:36 crc kubenswrapper[4735]: I1008 13:10:36.364099 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-cf2dv"]
Oct 08 13:10:36 crc kubenswrapper[4735]: I1008 13:10:36.365054 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-cf2dv"
Oct 08 13:10:36 crc kubenswrapper[4735]: I1008 13:10:36.366451 4735 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret"
Oct 08 13:10:36 crc kubenswrapper[4735]: I1008 13:10:36.366931 4735 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist"
Oct 08 13:10:36 crc kubenswrapper[4735]: I1008 13:10:36.368652 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2"
Oct 08 13:10:36 crc kubenswrapper[4735]: I1008 13:10:36.368954 4735 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-6s9ml"
Oct 08 13:10:36 crc kubenswrapper[4735]: I1008 13:10:36.404899 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-68d546b9d8-6r8bv"]
Oct 08 13:10:36 crc kubenswrapper[4735]: I1008 13:10:36.406075 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-68d546b9d8-6r8bv"
Oct 08 13:10:36 crc kubenswrapper[4735]: I1008 13:10:36.407010 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lxvg\" (UniqueName: \"kubernetes.io/projected/89ab93e0-026b-4e98-a3d4-137f0d56accd-kube-api-access-7lxvg\") pod \"frr-k8s-webhook-server-64bf5d555-frhpw\" (UID: \"89ab93e0-026b-4e98-a3d4-137f0d56accd\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-frhpw"
Oct 08 13:10:36 crc kubenswrapper[4735]: I1008 13:10:36.407074 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/da1954e4-438d-4fb3-a435-1fbf29da5382-metrics\") pod \"frr-k8s-h2xst\" (UID: \"da1954e4-438d-4fb3-a435-1fbf29da5382\") " pod="metallb-system/frr-k8s-h2xst"
Oct 08 13:10:36 crc kubenswrapper[4735]: I1008 13:10:36.407106 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/da1954e4-438d-4fb3-a435-1fbf29da5382-reloader\") pod \"frr-k8s-h2xst\" (UID: \"da1954e4-438d-4fb3-a435-1fbf29da5382\") " pod="metallb-system/frr-k8s-h2xst"
Oct 08 13:10:36 crc kubenswrapper[4735]: I1008 13:10:36.407185 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/da1954e4-438d-4fb3-a435-1fbf29da5382-frr-sockets\") pod \"frr-k8s-h2xst\" (UID: \"da1954e4-438d-4fb3-a435-1fbf29da5382\") " pod="metallb-system/frr-k8s-h2xst"
Oct 08 13:10:36 crc kubenswrapper[4735]: I1008 13:10:36.407325 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/da1954e4-438d-4fb3-a435-1fbf29da5382-frr-conf\") pod \"frr-k8s-h2xst\" (UID: \"da1954e4-438d-4fb3-a435-1fbf29da5382\") " pod="metallb-system/frr-k8s-h2xst"
Oct 08 13:10:36 crc kubenswrapper[4735]: I1008 13:10:36.407390 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da1954e4-438d-4fb3-a435-1fbf29da5382-metrics-certs\") pod \"frr-k8s-h2xst\" (UID: \"da1954e4-438d-4fb3-a435-1fbf29da5382\") " pod="metallb-system/frr-k8s-h2xst"
Oct 08 13:10:36 crc kubenswrapper[4735]: I1008 13:10:36.407591 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/89ab93e0-026b-4e98-a3d4-137f0d56accd-cert\") pod \"frr-k8s-webhook-server-64bf5d555-frhpw\" (UID: \"89ab93e0-026b-4e98-a3d4-137f0d56accd\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-frhpw"
Oct 08 13:10:36 crc kubenswrapper[4735]: I1008 13:10:36.407624 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/da1954e4-438d-4fb3-a435-1fbf29da5382-frr-startup\") pod \"frr-k8s-h2xst\" (UID: \"da1954e4-438d-4fb3-a435-1fbf29da5382\") " pod="metallb-system/frr-k8s-h2xst"
Oct 08 13:10:36 crc kubenswrapper[4735]: I1008 13:10:36.407651 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4z2fs\" (UniqueName: \"kubernetes.io/projected/da1954e4-438d-4fb3-a435-1fbf29da5382-kube-api-access-4z2fs\") pod \"frr-k8s-h2xst\" (UID: \"da1954e4-438d-4fb3-a435-1fbf29da5382\") " pod="metallb-system/frr-k8s-h2xst"
Oct 08 13:10:36 crc kubenswrapper[4735]: I1008 13:10:36.407851 4735 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret"
Oct 08 13:10:36 crc kubenswrapper[4735]: I1008 13:10:36.424635 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-6r8bv"]
Oct 08 13:10:36 crc kubenswrapper[4735]: I1008 13:10:36.509243 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/da1954e4-438d-4fb3-a435-1fbf29da5382-frr-sockets\") pod \"frr-k8s-h2xst\" (UID: \"da1954e4-438d-4fb3-a435-1fbf29da5382\") " pod="metallb-system/frr-k8s-h2xst"
Oct 08 13:10:36 crc kubenswrapper[4735]: I1008 13:10:36.509310 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/ca9e15c8-5fa1-4b84-a47b-4662865b29ff-metallb-excludel2\") pod \"speaker-cf2dv\" (UID: \"ca9e15c8-5fa1-4b84-a47b-4662865b29ff\") " pod="metallb-system/speaker-cf2dv"
Oct 08 13:10:36 crc kubenswrapper[4735]: I1008 13:10:36.509364 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/da1954e4-438d-4fb3-a435-1fbf29da5382-frr-conf\") pod \"frr-k8s-h2xst\" (UID: \"da1954e4-438d-4fb3-a435-1fbf29da5382\") " pod="metallb-system/frr-k8s-h2xst"
Oct 08 13:10:36 crc kubenswrapper[4735]: I1008 13:10:36.509385 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da1954e4-438d-4fb3-a435-1fbf29da5382-metrics-certs\") pod \"frr-k8s-h2xst\" (UID: \"da1954e4-438d-4fb3-a435-1fbf29da5382\") " pod="metallb-system/frr-k8s-h2xst"
Oct 08 13:10:36 crc kubenswrapper[4735]: I1008 13:10:36.509418 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/54d730e4-2333-42b5-ac3f-fd36eaccf6c6-cert\") pod \"controller-68d546b9d8-6r8bv\" (UID: \"54d730e4-2333-42b5-ac3f-fd36eaccf6c6\") " pod="metallb-system/controller-68d546b9d8-6r8bv"
Oct 08 13:10:36 crc kubenswrapper[4735]: E1008 13:10:36.509553 4735 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found
Oct 08 13:10:36 crc kubenswrapper[4735]: E1008 13:10:36.509620 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da1954e4-438d-4fb3-a435-1fbf29da5382-metrics-certs podName:da1954e4-438d-4fb3-a435-1fbf29da5382 nodeName:}" failed. No retries permitted until 2025-10-08 13:10:37.009599751 +0000 UTC m=+886.817904305 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/da1954e4-438d-4fb3-a435-1fbf29da5382-metrics-certs") pod "frr-k8s-h2xst" (UID: "da1954e4-438d-4fb3-a435-1fbf29da5382") : secret "frr-k8s-certs-secret" not found
Oct 08 13:10:36 crc kubenswrapper[4735]: I1008 13:10:36.509685 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5w26\" (UniqueName: \"kubernetes.io/projected/54d730e4-2333-42b5-ac3f-fd36eaccf6c6-kube-api-access-s5w26\") pod \"controller-68d546b9d8-6r8bv\" (UID: \"54d730e4-2333-42b5-ac3f-fd36eaccf6c6\") " pod="metallb-system/controller-68d546b9d8-6r8bv"
Oct 08 13:10:36 crc kubenswrapper[4735]: I1008 13:10:36.509771 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxjn8\" (UniqueName: \"kubernetes.io/projected/ca9e15c8-5fa1-4b84-a47b-4662865b29ff-kube-api-access-wxjn8\") pod \"speaker-cf2dv\" (UID: \"ca9e15c8-5fa1-4b84-a47b-4662865b29ff\") " pod="metallb-system/speaker-cf2dv"
Oct 08 13:10:36 crc kubenswrapper[4735]: I1008 13:10:36.509784 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/da1954e4-438d-4fb3-a435-1fbf29da5382-frr-sockets\") pod \"frr-k8s-h2xst\" (UID: \"da1954e4-438d-4fb3-a435-1fbf29da5382\") " pod="metallb-system/frr-k8s-h2xst"
Oct 08 13:10:36 crc kubenswrapper[4735]: I1008 13:10:36.509874 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/da1954e4-438d-4fb3-a435-1fbf29da5382-frr-conf\") pod \"frr-k8s-h2xst\" (UID: \"da1954e4-438d-4fb3-a435-1fbf29da5382\") " pod="metallb-system/frr-k8s-h2xst"
Oct 08 13:10:36 crc kubenswrapper[4735]: I1008 13:10:36.509931 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ca9e15c8-5fa1-4b84-a47b-4662865b29ff-metrics-certs\") pod \"speaker-cf2dv\" (UID: \"ca9e15c8-5fa1-4b84-a47b-4662865b29ff\") " pod="metallb-system/speaker-cf2dv"
Oct 08 13:10:36 crc kubenswrapper[4735]: I1008 13:10:36.510028 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/89ab93e0-026b-4e98-a3d4-137f0d56accd-cert\") pod \"frr-k8s-webhook-server-64bf5d555-frhpw\" (UID: \"89ab93e0-026b-4e98-a3d4-137f0d56accd\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-frhpw"
Oct 08 13:10:36 crc kubenswrapper[4735]: I1008 13:10:36.510425 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ca9e15c8-5fa1-4b84-a47b-4662865b29ff-memberlist\") pod \"speaker-cf2dv\" (UID: \"ca9e15c8-5fa1-4b84-a47b-4662865b29ff\") " pod="metallb-system/speaker-cf2dv"
Oct 08 13:10:36 crc kubenswrapper[4735]: I1008 13:10:36.510495 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/da1954e4-438d-4fb3-a435-1fbf29da5382-frr-startup\") pod \"frr-k8s-h2xst\" (UID: \"da1954e4-438d-4fb3-a435-1fbf29da5382\") " pod="metallb-system/frr-k8s-h2xst"
Oct 08 13:10:36 crc kubenswrapper[4735]: I1008 13:10:36.510527 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4z2fs\" (UniqueName: \"kubernetes.io/projected/da1954e4-438d-4fb3-a435-1fbf29da5382-kube-api-access-4z2fs\") pod \"frr-k8s-h2xst\" (UID: \"da1954e4-438d-4fb3-a435-1fbf29da5382\") " pod="metallb-system/frr-k8s-h2xst"
Oct 08 13:10:36 crc kubenswrapper[4735]: I1008 13:10:36.510604 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/54d730e4-2333-42b5-ac3f-fd36eaccf6c6-metrics-certs\") pod \"controller-68d546b9d8-6r8bv\" (UID: \"54d730e4-2333-42b5-ac3f-fd36eaccf6c6\") " pod="metallb-system/controller-68d546b9d8-6r8bv"
Oct 08 13:10:36 crc kubenswrapper[4735]: I1008 13:10:36.510652 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lxvg\" (UniqueName: \"kubernetes.io/projected/89ab93e0-026b-4e98-a3d4-137f0d56accd-kube-api-access-7lxvg\") pod \"frr-k8s-webhook-server-64bf5d555-frhpw\" (UID: \"89ab93e0-026b-4e98-a3d4-137f0d56accd\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-frhpw"
Oct 08 13:10:36 crc kubenswrapper[4735]: I1008 13:10:36.510701 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/da1954e4-438d-4fb3-a435-1fbf29da5382-metrics\") pod \"frr-k8s-h2xst\" (UID: \"da1954e4-438d-4fb3-a435-1fbf29da5382\") " pod="metallb-system/frr-k8s-h2xst"
Oct 08 13:10:36 crc kubenswrapper[4735]: I1008 13:10:36.510746 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/da1954e4-438d-4fb3-a435-1fbf29da5382-reloader\") pod \"frr-k8s-h2xst\" (UID: \"da1954e4-438d-4fb3-a435-1fbf29da5382\") " pod="metallb-system/frr-k8s-h2xst"
Oct 08 13:10:36 crc kubenswrapper[4735]: I1008 13:10:36.510953 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/da1954e4-438d-4fb3-a435-1fbf29da5382-metrics\") pod \"frr-k8s-h2xst\" (UID: \"da1954e4-438d-4fb3-a435-1fbf29da5382\") " pod="metallb-system/frr-k8s-h2xst"
Oct 08 13:10:36 crc kubenswrapper[4735]:
I1008 13:10:36.511177 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/da1954e4-438d-4fb3-a435-1fbf29da5382-reloader\") pod \"frr-k8s-h2xst\" (UID: \"da1954e4-438d-4fb3-a435-1fbf29da5382\") " pod="metallb-system/frr-k8s-h2xst" Oct 08 13:10:36 crc kubenswrapper[4735]: I1008 13:10:36.511899 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/da1954e4-438d-4fb3-a435-1fbf29da5382-frr-startup\") pod \"frr-k8s-h2xst\" (UID: \"da1954e4-438d-4fb3-a435-1fbf29da5382\") " pod="metallb-system/frr-k8s-h2xst" Oct 08 13:10:36 crc kubenswrapper[4735]: I1008 13:10:36.528511 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/89ab93e0-026b-4e98-a3d4-137f0d56accd-cert\") pod \"frr-k8s-webhook-server-64bf5d555-frhpw\" (UID: \"89ab93e0-026b-4e98-a3d4-137f0d56accd\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-frhpw" Oct 08 13:10:36 crc kubenswrapper[4735]: I1008 13:10:36.528909 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lxvg\" (UniqueName: \"kubernetes.io/projected/89ab93e0-026b-4e98-a3d4-137f0d56accd-kube-api-access-7lxvg\") pod \"frr-k8s-webhook-server-64bf5d555-frhpw\" (UID: \"89ab93e0-026b-4e98-a3d4-137f0d56accd\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-frhpw" Oct 08 13:10:36 crc kubenswrapper[4735]: I1008 13:10:36.528991 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4z2fs\" (UniqueName: \"kubernetes.io/projected/da1954e4-438d-4fb3-a435-1fbf29da5382-kube-api-access-4z2fs\") pod \"frr-k8s-h2xst\" (UID: \"da1954e4-438d-4fb3-a435-1fbf29da5382\") " pod="metallb-system/frr-k8s-h2xst" Oct 08 13:10:36 crc kubenswrapper[4735]: I1008 13:10:36.606828 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-frhpw" Oct 08 13:10:36 crc kubenswrapper[4735]: I1008 13:10:36.615376 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ca9e15c8-5fa1-4b84-a47b-4662865b29ff-memberlist\") pod \"speaker-cf2dv\" (UID: \"ca9e15c8-5fa1-4b84-a47b-4662865b29ff\") " pod="metallb-system/speaker-cf2dv" Oct 08 13:10:36 crc kubenswrapper[4735]: I1008 13:10:36.615437 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/54d730e4-2333-42b5-ac3f-fd36eaccf6c6-metrics-certs\") pod \"controller-68d546b9d8-6r8bv\" (UID: \"54d730e4-2333-42b5-ac3f-fd36eaccf6c6\") " pod="metallb-system/controller-68d546b9d8-6r8bv" Oct 08 13:10:36 crc kubenswrapper[4735]: I1008 13:10:36.615499 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/ca9e15c8-5fa1-4b84-a47b-4662865b29ff-metallb-excludel2\") pod \"speaker-cf2dv\" (UID: \"ca9e15c8-5fa1-4b84-a47b-4662865b29ff\") " pod="metallb-system/speaker-cf2dv" Oct 08 13:10:36 crc kubenswrapper[4735]: I1008 13:10:36.615576 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/54d730e4-2333-42b5-ac3f-fd36eaccf6c6-cert\") pod \"controller-68d546b9d8-6r8bv\" (UID: \"54d730e4-2333-42b5-ac3f-fd36eaccf6c6\") " pod="metallb-system/controller-68d546b9d8-6r8bv" Oct 08 13:10:36 crc kubenswrapper[4735]: E1008 13:10:36.615588 4735 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 08 13:10:36 crc kubenswrapper[4735]: E1008 13:10:36.615656 4735 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Oct 08 13:10:36 crc kubenswrapper[4735]: E1008 13:10:36.615679 4735 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca9e15c8-5fa1-4b84-a47b-4662865b29ff-memberlist podName:ca9e15c8-5fa1-4b84-a47b-4662865b29ff nodeName:}" failed. No retries permitted until 2025-10-08 13:10:37.115659226 +0000 UTC m=+886.923963770 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/ca9e15c8-5fa1-4b84-a47b-4662865b29ff-memberlist") pod "speaker-cf2dv" (UID: "ca9e15c8-5fa1-4b84-a47b-4662865b29ff") : secret "metallb-memberlist" not found Oct 08 13:10:36 crc kubenswrapper[4735]: I1008 13:10:36.615604 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5w26\" (UniqueName: \"kubernetes.io/projected/54d730e4-2333-42b5-ac3f-fd36eaccf6c6-kube-api-access-s5w26\") pod \"controller-68d546b9d8-6r8bv\" (UID: \"54d730e4-2333-42b5-ac3f-fd36eaccf6c6\") " pod="metallb-system/controller-68d546b9d8-6r8bv" Oct 08 13:10:36 crc kubenswrapper[4735]: E1008 13:10:36.615714 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/54d730e4-2333-42b5-ac3f-fd36eaccf6c6-metrics-certs podName:54d730e4-2333-42b5-ac3f-fd36eaccf6c6 nodeName:}" failed. No retries permitted until 2025-10-08 13:10:37.115696957 +0000 UTC m=+886.924001501 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/54d730e4-2333-42b5-ac3f-fd36eaccf6c6-metrics-certs") pod "controller-68d546b9d8-6r8bv" (UID: "54d730e4-2333-42b5-ac3f-fd36eaccf6c6") : secret "controller-certs-secret" not found Oct 08 13:10:36 crc kubenswrapper[4735]: I1008 13:10:36.615732 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxjn8\" (UniqueName: \"kubernetes.io/projected/ca9e15c8-5fa1-4b84-a47b-4662865b29ff-kube-api-access-wxjn8\") pod \"speaker-cf2dv\" (UID: \"ca9e15c8-5fa1-4b84-a47b-4662865b29ff\") " pod="metallb-system/speaker-cf2dv" Oct 08 13:10:36 crc kubenswrapper[4735]: I1008 13:10:36.615765 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ca9e15c8-5fa1-4b84-a47b-4662865b29ff-metrics-certs\") pod \"speaker-cf2dv\" (UID: \"ca9e15c8-5fa1-4b84-a47b-4662865b29ff\") " pod="metallb-system/speaker-cf2dv" Oct 08 13:10:36 crc kubenswrapper[4735]: E1008 13:10:36.615843 4735 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Oct 08 13:10:36 crc kubenswrapper[4735]: E1008 13:10:36.615863 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca9e15c8-5fa1-4b84-a47b-4662865b29ff-metrics-certs podName:ca9e15c8-5fa1-4b84-a47b-4662865b29ff nodeName:}" failed. No retries permitted until 2025-10-08 13:10:37.115857372 +0000 UTC m=+886.924161916 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ca9e15c8-5fa1-4b84-a47b-4662865b29ff-metrics-certs") pod "speaker-cf2dv" (UID: "ca9e15c8-5fa1-4b84-a47b-4662865b29ff") : secret "speaker-certs-secret" not found Oct 08 13:10:36 crc kubenswrapper[4735]: I1008 13:10:36.616463 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/ca9e15c8-5fa1-4b84-a47b-4662865b29ff-metallb-excludel2\") pod \"speaker-cf2dv\" (UID: \"ca9e15c8-5fa1-4b84-a47b-4662865b29ff\") " pod="metallb-system/speaker-cf2dv" Oct 08 13:10:36 crc kubenswrapper[4735]: I1008 13:10:36.623784 4735 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 08 13:10:36 crc kubenswrapper[4735]: I1008 13:10:36.631957 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/54d730e4-2333-42b5-ac3f-fd36eaccf6c6-cert\") pod \"controller-68d546b9d8-6r8bv\" (UID: \"54d730e4-2333-42b5-ac3f-fd36eaccf6c6\") " pod="metallb-system/controller-68d546b9d8-6r8bv" Oct 08 13:10:36 crc kubenswrapper[4735]: I1008 13:10:36.645538 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5w26\" (UniqueName: \"kubernetes.io/projected/54d730e4-2333-42b5-ac3f-fd36eaccf6c6-kube-api-access-s5w26\") pod \"controller-68d546b9d8-6r8bv\" (UID: \"54d730e4-2333-42b5-ac3f-fd36eaccf6c6\") " pod="metallb-system/controller-68d546b9d8-6r8bv" Oct 08 13:10:36 crc kubenswrapper[4735]: I1008 13:10:36.646195 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxjn8\" (UniqueName: \"kubernetes.io/projected/ca9e15c8-5fa1-4b84-a47b-4662865b29ff-kube-api-access-wxjn8\") pod \"speaker-cf2dv\" (UID: \"ca9e15c8-5fa1-4b84-a47b-4662865b29ff\") " pod="metallb-system/speaker-cf2dv" Oct 08 13:10:37 crc kubenswrapper[4735]: I1008 13:10:37.009234 4735 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-frhpw"] Oct 08 13:10:37 crc kubenswrapper[4735]: I1008 13:10:37.023119 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da1954e4-438d-4fb3-a435-1fbf29da5382-metrics-certs\") pod \"frr-k8s-h2xst\" (UID: \"da1954e4-438d-4fb3-a435-1fbf29da5382\") " pod="metallb-system/frr-k8s-h2xst" Oct 08 13:10:37 crc kubenswrapper[4735]: I1008 13:10:37.029382 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da1954e4-438d-4fb3-a435-1fbf29da5382-metrics-certs\") pod \"frr-k8s-h2xst\" (UID: \"da1954e4-438d-4fb3-a435-1fbf29da5382\") " pod="metallb-system/frr-k8s-h2xst" Oct 08 13:10:37 crc kubenswrapper[4735]: I1008 13:10:37.124237 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ca9e15c8-5fa1-4b84-a47b-4662865b29ff-metrics-certs\") pod \"speaker-cf2dv\" (UID: \"ca9e15c8-5fa1-4b84-a47b-4662865b29ff\") " pod="metallb-system/speaker-cf2dv" Oct 08 13:10:37 crc kubenswrapper[4735]: I1008 13:10:37.124289 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ca9e15c8-5fa1-4b84-a47b-4662865b29ff-memberlist\") pod \"speaker-cf2dv\" (UID: \"ca9e15c8-5fa1-4b84-a47b-4662865b29ff\") " pod="metallb-system/speaker-cf2dv" Oct 08 13:10:37 crc kubenswrapper[4735]: I1008 13:10:37.124317 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/54d730e4-2333-42b5-ac3f-fd36eaccf6c6-metrics-certs\") pod \"controller-68d546b9d8-6r8bv\" (UID: \"54d730e4-2333-42b5-ac3f-fd36eaccf6c6\") " pod="metallb-system/controller-68d546b9d8-6r8bv" Oct 08 13:10:37 crc kubenswrapper[4735]: E1008 13:10:37.124746 4735 
secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 08 13:10:37 crc kubenswrapper[4735]: E1008 13:10:37.124840 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca9e15c8-5fa1-4b84-a47b-4662865b29ff-memberlist podName:ca9e15c8-5fa1-4b84-a47b-4662865b29ff nodeName:}" failed. No retries permitted until 2025-10-08 13:10:38.124818069 +0000 UTC m=+887.933122623 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/ca9e15c8-5fa1-4b84-a47b-4662865b29ff-memberlist") pod "speaker-cf2dv" (UID: "ca9e15c8-5fa1-4b84-a47b-4662865b29ff") : secret "metallb-memberlist" not found Oct 08 13:10:37 crc kubenswrapper[4735]: I1008 13:10:37.128642 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ca9e15c8-5fa1-4b84-a47b-4662865b29ff-metrics-certs\") pod \"speaker-cf2dv\" (UID: \"ca9e15c8-5fa1-4b84-a47b-4662865b29ff\") " pod="metallb-system/speaker-cf2dv" Oct 08 13:10:37 crc kubenswrapper[4735]: I1008 13:10:37.128972 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/54d730e4-2333-42b5-ac3f-fd36eaccf6c6-metrics-certs\") pod \"controller-68d546b9d8-6r8bv\" (UID: \"54d730e4-2333-42b5-ac3f-fd36eaccf6c6\") " pod="metallb-system/controller-68d546b9d8-6r8bv" Oct 08 13:10:37 crc kubenswrapper[4735]: I1008 13:10:37.197769 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-h2xst" Oct 08 13:10:37 crc kubenswrapper[4735]: I1008 13:10:37.318922 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-68d546b9d8-6r8bv" Oct 08 13:10:37 crc kubenswrapper[4735]: I1008 13:10:37.596860 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-6r8bv"] Oct 08 13:10:37 crc kubenswrapper[4735]: W1008 13:10:37.604227 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54d730e4_2333_42b5_ac3f_fd36eaccf6c6.slice/crio-bf8a78ce511b9974a08fc638b9dbf6237f86844dac0962d54869da707c662a95 WatchSource:0}: Error finding container bf8a78ce511b9974a08fc638b9dbf6237f86844dac0962d54869da707c662a95: Status 404 returned error can't find the container with id bf8a78ce511b9974a08fc638b9dbf6237f86844dac0962d54869da707c662a95 Oct 08 13:10:37 crc kubenswrapper[4735]: I1008 13:10:37.952572 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h2xst" event={"ID":"da1954e4-438d-4fb3-a435-1fbf29da5382","Type":"ContainerStarted","Data":"16d45d01d7ca875110516edb0ebbd1b09d3e1636c8e9ecec4d81aed7a97eb30d"} Oct 08 13:10:37 crc kubenswrapper[4735]: I1008 13:10:37.954388 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-6r8bv" event={"ID":"54d730e4-2333-42b5-ac3f-fd36eaccf6c6","Type":"ContainerStarted","Data":"56a8e30ff1230c8b43ce96104d67c5618bf36e7bf0cf1e9dd2eb856c1598157e"} Oct 08 13:10:37 crc kubenswrapper[4735]: I1008 13:10:37.954432 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-6r8bv" event={"ID":"54d730e4-2333-42b5-ac3f-fd36eaccf6c6","Type":"ContainerStarted","Data":"a351b8002c66a712afab3f493e3aa3a7bdb10de987c56ba4f673a022c97085c4"} Oct 08 13:10:37 crc kubenswrapper[4735]: I1008 13:10:37.954443 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-6r8bv" 
event={"ID":"54d730e4-2333-42b5-ac3f-fd36eaccf6c6","Type":"ContainerStarted","Data":"bf8a78ce511b9974a08fc638b9dbf6237f86844dac0962d54869da707c662a95"} Oct 08 13:10:37 crc kubenswrapper[4735]: I1008 13:10:37.954534 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-68d546b9d8-6r8bv" Oct 08 13:10:37 crc kubenswrapper[4735]: I1008 13:10:37.955724 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-frhpw" event={"ID":"89ab93e0-026b-4e98-a3d4-137f0d56accd","Type":"ContainerStarted","Data":"67d1b0e2858695a2b97b9bbdb961b67104ec0e54729627e56d7e7acfd3014f17"} Oct 08 13:10:37 crc kubenswrapper[4735]: I1008 13:10:37.971287 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-68d546b9d8-6r8bv" podStartSLOduration=1.971272675 podStartE2EDuration="1.971272675s" podCreationTimestamp="2025-10-08 13:10:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 13:10:37.969575229 +0000 UTC m=+887.777879773" watchObservedRunningTime="2025-10-08 13:10:37.971272675 +0000 UTC m=+887.779577219" Oct 08 13:10:38 crc kubenswrapper[4735]: I1008 13:10:38.146183 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ca9e15c8-5fa1-4b84-a47b-4662865b29ff-memberlist\") pod \"speaker-cf2dv\" (UID: \"ca9e15c8-5fa1-4b84-a47b-4662865b29ff\") " pod="metallb-system/speaker-cf2dv" Oct 08 13:10:38 crc kubenswrapper[4735]: I1008 13:10:38.150585 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ca9e15c8-5fa1-4b84-a47b-4662865b29ff-memberlist\") pod \"speaker-cf2dv\" (UID: \"ca9e15c8-5fa1-4b84-a47b-4662865b29ff\") " pod="metallb-system/speaker-cf2dv" Oct 08 13:10:38 crc kubenswrapper[4735]: I1008 13:10:38.178434 
4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-cf2dv" Oct 08 13:10:38 crc kubenswrapper[4735]: I1008 13:10:38.962949 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-cf2dv" event={"ID":"ca9e15c8-5fa1-4b84-a47b-4662865b29ff","Type":"ContainerStarted","Data":"cdcf511b80e004c0df04995de2416788736b81a20ea63d4b4b05e4c17b25e97a"} Oct 08 13:10:38 crc kubenswrapper[4735]: I1008 13:10:38.963266 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-cf2dv" event={"ID":"ca9e15c8-5fa1-4b84-a47b-4662865b29ff","Type":"ContainerStarted","Data":"752b513c7a79914a055137bb83e3f8ea28e69d3c6bfda205b2ea10c7de2610ce"} Oct 08 13:10:38 crc kubenswrapper[4735]: I1008 13:10:38.963292 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-cf2dv" event={"ID":"ca9e15c8-5fa1-4b84-a47b-4662865b29ff","Type":"ContainerStarted","Data":"2d3f1d4e0b8bd1ef35b6861eced1582dad107503d304d6cd2205e2561a38cad3"} Oct 08 13:10:38 crc kubenswrapper[4735]: I1008 13:10:38.963567 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-cf2dv" Oct 08 13:10:40 crc kubenswrapper[4735]: I1008 13:10:40.954463 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-cf2dv" podStartSLOduration=4.954445552 podStartE2EDuration="4.954445552s" podCreationTimestamp="2025-10-08 13:10:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 13:10:38.9940503 +0000 UTC m=+888.802354854" watchObservedRunningTime="2025-10-08 13:10:40.954445552 +0000 UTC m=+890.762750096" Oct 08 13:10:45 crc kubenswrapper[4735]: I1008 13:10:45.005390 4735 generic.go:334] "Generic (PLEG): container finished" podID="da1954e4-438d-4fb3-a435-1fbf29da5382" containerID="0d9012e032f57e711e9e0d7a0d563bd59b6cb61c82680cae38d79c710bd49da3" 
exitCode=0 Oct 08 13:10:45 crc kubenswrapper[4735]: I1008 13:10:45.005455 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h2xst" event={"ID":"da1954e4-438d-4fb3-a435-1fbf29da5382","Type":"ContainerDied","Data":"0d9012e032f57e711e9e0d7a0d563bd59b6cb61c82680cae38d79c710bd49da3"} Oct 08 13:10:45 crc kubenswrapper[4735]: I1008 13:10:45.011537 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-frhpw" event={"ID":"89ab93e0-026b-4e98-a3d4-137f0d56accd","Type":"ContainerStarted","Data":"da79b5b8600b288698259fc2c361d219e1a5157592f5979ef0a3a1674a29aff7"} Oct 08 13:10:45 crc kubenswrapper[4735]: I1008 13:10:45.011750 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-frhpw" Oct 08 13:10:45 crc kubenswrapper[4735]: I1008 13:10:45.052222 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-frhpw" podStartSLOduration=1.26422963 podStartE2EDuration="9.05220586s" podCreationTimestamp="2025-10-08 13:10:36 +0000 UTC" firstStartedPulling="2025-10-08 13:10:37.014849627 +0000 UTC m=+886.823154171" lastFinishedPulling="2025-10-08 13:10:44.802825857 +0000 UTC m=+894.611130401" observedRunningTime="2025-10-08 13:10:45.047858382 +0000 UTC m=+894.856162966" watchObservedRunningTime="2025-10-08 13:10:45.05220586 +0000 UTC m=+894.860510404" Oct 08 13:10:45 crc kubenswrapper[4735]: I1008 13:10:45.917559 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-48k4v" Oct 08 13:10:45 crc kubenswrapper[4735]: I1008 13:10:45.999791 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-48k4v"] Oct 08 13:10:46 crc kubenswrapper[4735]: I1008 13:10:46.023042 4735 generic.go:334] "Generic (PLEG): container finished" podID="da1954e4-438d-4fb3-a435-1fbf29da5382" 
containerID="f5c77a9be8fb6c616ea17947dc6b9ec959c3523caf96e98fc194d4c038ae5b37" exitCode=0 Oct 08 13:10:46 crc kubenswrapper[4735]: I1008 13:10:46.023110 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h2xst" event={"ID":"da1954e4-438d-4fb3-a435-1fbf29da5382","Type":"ContainerDied","Data":"f5c77a9be8fb6c616ea17947dc6b9ec959c3523caf96e98fc194d4c038ae5b37"} Oct 08 13:10:46 crc kubenswrapper[4735]: I1008 13:10:46.050606 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t9ktd"] Oct 08 13:10:46 crc kubenswrapper[4735]: I1008 13:10:46.050891 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-t9ktd" podUID="8a2c0c5f-5bfa-40ba-87da-87d1e498abda" containerName="registry-server" containerID="cri-o://592974a881a920f93fc2ae822a78e91f2ed80d26102fc3161c898578ca8b1904" gracePeriod=2 Oct 08 13:10:46 crc kubenswrapper[4735]: I1008 13:10:46.945447 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-t9ktd" Oct 08 13:10:47 crc kubenswrapper[4735]: I1008 13:10:47.033197 4735 generic.go:334] "Generic (PLEG): container finished" podID="da1954e4-438d-4fb3-a435-1fbf29da5382" containerID="02f1ba4456371981dae4b8d456f7bf518aa5d8fa1d5fe70b9194897c8934e27d" exitCode=0 Oct 08 13:10:47 crc kubenswrapper[4735]: I1008 13:10:47.033294 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h2xst" event={"ID":"da1954e4-438d-4fb3-a435-1fbf29da5382","Type":"ContainerDied","Data":"02f1ba4456371981dae4b8d456f7bf518aa5d8fa1d5fe70b9194897c8934e27d"} Oct 08 13:10:47 crc kubenswrapper[4735]: I1008 13:10:47.036321 4735 generic.go:334] "Generic (PLEG): container finished" podID="8a2c0c5f-5bfa-40ba-87da-87d1e498abda" containerID="592974a881a920f93fc2ae822a78e91f2ed80d26102fc3161c898578ca8b1904" exitCode=0 Oct 08 13:10:47 crc kubenswrapper[4735]: I1008 13:10:47.036359 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t9ktd" event={"ID":"8a2c0c5f-5bfa-40ba-87da-87d1e498abda","Type":"ContainerDied","Data":"592974a881a920f93fc2ae822a78e91f2ed80d26102fc3161c898578ca8b1904"} Oct 08 13:10:47 crc kubenswrapper[4735]: I1008 13:10:47.036383 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t9ktd" event={"ID":"8a2c0c5f-5bfa-40ba-87da-87d1e498abda","Type":"ContainerDied","Data":"c3d0f029ac12a74b3c615071b746d7ae44fa57d327c3e7829027cfe5695319b3"} Oct 08 13:10:47 crc kubenswrapper[4735]: I1008 13:10:47.036387 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-t9ktd" Oct 08 13:10:47 crc kubenswrapper[4735]: I1008 13:10:47.036403 4735 scope.go:117] "RemoveContainer" containerID="592974a881a920f93fc2ae822a78e91f2ed80d26102fc3161c898578ca8b1904" Oct 08 13:10:47 crc kubenswrapper[4735]: I1008 13:10:47.070164 4735 scope.go:117] "RemoveContainer" containerID="b4251c754819e270814867e78454c31f92c7e27895f6e8a14e22ea30600a6b80" Oct 08 13:10:47 crc kubenswrapper[4735]: I1008 13:10:47.071060 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a2c0c5f-5bfa-40ba-87da-87d1e498abda-utilities\") pod \"8a2c0c5f-5bfa-40ba-87da-87d1e498abda\" (UID: \"8a2c0c5f-5bfa-40ba-87da-87d1e498abda\") " Oct 08 13:10:47 crc kubenswrapper[4735]: I1008 13:10:47.071142 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a2c0c5f-5bfa-40ba-87da-87d1e498abda-catalog-content\") pod \"8a2c0c5f-5bfa-40ba-87da-87d1e498abda\" (UID: \"8a2c0c5f-5bfa-40ba-87da-87d1e498abda\") " Oct 08 13:10:47 crc kubenswrapper[4735]: I1008 13:10:47.071241 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7jjc\" (UniqueName: \"kubernetes.io/projected/8a2c0c5f-5bfa-40ba-87da-87d1e498abda-kube-api-access-r7jjc\") pod \"8a2c0c5f-5bfa-40ba-87da-87d1e498abda\" (UID: \"8a2c0c5f-5bfa-40ba-87da-87d1e498abda\") " Oct 08 13:10:47 crc kubenswrapper[4735]: I1008 13:10:47.072015 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a2c0c5f-5bfa-40ba-87da-87d1e498abda-utilities" (OuterVolumeSpecName: "utilities") pod "8a2c0c5f-5bfa-40ba-87da-87d1e498abda" (UID: "8a2c0c5f-5bfa-40ba-87da-87d1e498abda"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:10:47 crc kubenswrapper[4735]: I1008 13:10:47.072689 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a2c0c5f-5bfa-40ba-87da-87d1e498abda-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 13:10:47 crc kubenswrapper[4735]: I1008 13:10:47.077607 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a2c0c5f-5bfa-40ba-87da-87d1e498abda-kube-api-access-r7jjc" (OuterVolumeSpecName: "kube-api-access-r7jjc") pod "8a2c0c5f-5bfa-40ba-87da-87d1e498abda" (UID: "8a2c0c5f-5bfa-40ba-87da-87d1e498abda"). InnerVolumeSpecName "kube-api-access-r7jjc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:10:47 crc kubenswrapper[4735]: I1008 13:10:47.119534 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a2c0c5f-5bfa-40ba-87da-87d1e498abda-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8a2c0c5f-5bfa-40ba-87da-87d1e498abda" (UID: "8a2c0c5f-5bfa-40ba-87da-87d1e498abda"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:10:47 crc kubenswrapper[4735]: I1008 13:10:47.133733 4735 scope.go:117] "RemoveContainer" containerID="60ed4ab1066b008b245ddaa2c00335651f7a9c3ff7c6ba0597e2daa1e41a6ea3" Oct 08 13:10:47 crc kubenswrapper[4735]: I1008 13:10:47.173104 4735 scope.go:117] "RemoveContainer" containerID="592974a881a920f93fc2ae822a78e91f2ed80d26102fc3161c898578ca8b1904" Oct 08 13:10:47 crc kubenswrapper[4735]: E1008 13:10:47.173458 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"592974a881a920f93fc2ae822a78e91f2ed80d26102fc3161c898578ca8b1904\": container with ID starting with 592974a881a920f93fc2ae822a78e91f2ed80d26102fc3161c898578ca8b1904 not found: ID does not exist" containerID="592974a881a920f93fc2ae822a78e91f2ed80d26102fc3161c898578ca8b1904" Oct 08 13:10:47 crc kubenswrapper[4735]: I1008 13:10:47.173510 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"592974a881a920f93fc2ae822a78e91f2ed80d26102fc3161c898578ca8b1904"} err="failed to get container status \"592974a881a920f93fc2ae822a78e91f2ed80d26102fc3161c898578ca8b1904\": rpc error: code = NotFound desc = could not find container \"592974a881a920f93fc2ae822a78e91f2ed80d26102fc3161c898578ca8b1904\": container with ID starting with 592974a881a920f93fc2ae822a78e91f2ed80d26102fc3161c898578ca8b1904 not found: ID does not exist" Oct 08 13:10:47 crc kubenswrapper[4735]: I1008 13:10:47.173535 4735 scope.go:117] "RemoveContainer" containerID="b4251c754819e270814867e78454c31f92c7e27895f6e8a14e22ea30600a6b80" Oct 08 13:10:47 crc kubenswrapper[4735]: I1008 13:10:47.173596 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a2c0c5f-5bfa-40ba-87da-87d1e498abda-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 13:10:47 crc kubenswrapper[4735]: I1008 13:10:47.173629 4735 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7jjc\" (UniqueName: \"kubernetes.io/projected/8a2c0c5f-5bfa-40ba-87da-87d1e498abda-kube-api-access-r7jjc\") on node \"crc\" DevicePath \"\"" Oct 08 13:10:47 crc kubenswrapper[4735]: E1008 13:10:47.173770 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4251c754819e270814867e78454c31f92c7e27895f6e8a14e22ea30600a6b80\": container with ID starting with b4251c754819e270814867e78454c31f92c7e27895f6e8a14e22ea30600a6b80 not found: ID does not exist" containerID="b4251c754819e270814867e78454c31f92c7e27895f6e8a14e22ea30600a6b80" Oct 08 13:10:47 crc kubenswrapper[4735]: I1008 13:10:47.173802 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4251c754819e270814867e78454c31f92c7e27895f6e8a14e22ea30600a6b80"} err="failed to get container status \"b4251c754819e270814867e78454c31f92c7e27895f6e8a14e22ea30600a6b80\": rpc error: code = NotFound desc = could not find container \"b4251c754819e270814867e78454c31f92c7e27895f6e8a14e22ea30600a6b80\": container with ID starting with b4251c754819e270814867e78454c31f92c7e27895f6e8a14e22ea30600a6b80 not found: ID does not exist" Oct 08 13:10:47 crc kubenswrapper[4735]: I1008 13:10:47.173823 4735 scope.go:117] "RemoveContainer" containerID="60ed4ab1066b008b245ddaa2c00335651f7a9c3ff7c6ba0597e2daa1e41a6ea3" Oct 08 13:10:47 crc kubenswrapper[4735]: E1008 13:10:47.174155 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60ed4ab1066b008b245ddaa2c00335651f7a9c3ff7c6ba0597e2daa1e41a6ea3\": container with ID starting with 60ed4ab1066b008b245ddaa2c00335651f7a9c3ff7c6ba0597e2daa1e41a6ea3 not found: ID does not exist" containerID="60ed4ab1066b008b245ddaa2c00335651f7a9c3ff7c6ba0597e2daa1e41a6ea3" Oct 08 13:10:47 crc kubenswrapper[4735]: I1008 13:10:47.174203 4735 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60ed4ab1066b008b245ddaa2c00335651f7a9c3ff7c6ba0597e2daa1e41a6ea3"} err="failed to get container status \"60ed4ab1066b008b245ddaa2c00335651f7a9c3ff7c6ba0597e2daa1e41a6ea3\": rpc error: code = NotFound desc = could not find container \"60ed4ab1066b008b245ddaa2c00335651f7a9c3ff7c6ba0597e2daa1e41a6ea3\": container with ID starting with 60ed4ab1066b008b245ddaa2c00335651f7a9c3ff7c6ba0597e2daa1e41a6ea3 not found: ID does not exist" Oct 08 13:10:47 crc kubenswrapper[4735]: I1008 13:10:47.323990 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-68d546b9d8-6r8bv" Oct 08 13:10:47 crc kubenswrapper[4735]: I1008 13:10:47.362361 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t9ktd"] Oct 08 13:10:47 crc kubenswrapper[4735]: I1008 13:10:47.366900 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-t9ktd"] Oct 08 13:10:48 crc kubenswrapper[4735]: I1008 13:10:48.044691 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h2xst" event={"ID":"da1954e4-438d-4fb3-a435-1fbf29da5382","Type":"ContainerStarted","Data":"241050c8e32889d3ae766bc319ab9400ee60954ec4c0819e15f3cd2c3cbfe978"} Oct 08 13:10:48 crc kubenswrapper[4735]: I1008 13:10:48.045016 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h2xst" event={"ID":"da1954e4-438d-4fb3-a435-1fbf29da5382","Type":"ContainerStarted","Data":"a449939b28bfa3acc59fb1a1c9eb1c24ccf444f34f97bb1a006c332f18ea985a"} Oct 08 13:10:48 crc kubenswrapper[4735]: I1008 13:10:48.183619 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-cf2dv" Oct 08 13:10:48 crc kubenswrapper[4735]: I1008 13:10:48.763943 4735 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-vgqxz"] Oct 08 13:10:48 crc kubenswrapper[4735]: E1008 13:10:48.764153 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a2c0c5f-5bfa-40ba-87da-87d1e498abda" containerName="registry-server" Oct 08 13:10:48 crc kubenswrapper[4735]: I1008 13:10:48.764164 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a2c0c5f-5bfa-40ba-87da-87d1e498abda" containerName="registry-server" Oct 08 13:10:48 crc kubenswrapper[4735]: E1008 13:10:48.764183 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a2c0c5f-5bfa-40ba-87da-87d1e498abda" containerName="extract-utilities" Oct 08 13:10:48 crc kubenswrapper[4735]: I1008 13:10:48.764189 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a2c0c5f-5bfa-40ba-87da-87d1e498abda" containerName="extract-utilities" Oct 08 13:10:48 crc kubenswrapper[4735]: E1008 13:10:48.764203 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a2c0c5f-5bfa-40ba-87da-87d1e498abda" containerName="extract-content" Oct 08 13:10:48 crc kubenswrapper[4735]: I1008 13:10:48.764209 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a2c0c5f-5bfa-40ba-87da-87d1e498abda" containerName="extract-content" Oct 08 13:10:48 crc kubenswrapper[4735]: I1008 13:10:48.764303 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a2c0c5f-5bfa-40ba-87da-87d1e498abda" containerName="registry-server" Oct 08 13:10:48 crc kubenswrapper[4735]: I1008 13:10:48.765016 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vgqxz" Oct 08 13:10:48 crc kubenswrapper[4735]: I1008 13:10:48.777461 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vgqxz"] Oct 08 13:10:48 crc kubenswrapper[4735]: I1008 13:10:48.797160 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5df03f05-01ff-4f82-8859-6d9fd0bd0dd7-catalog-content\") pod \"community-operators-vgqxz\" (UID: \"5df03f05-01ff-4f82-8859-6d9fd0bd0dd7\") " pod="openshift-marketplace/community-operators-vgqxz" Oct 08 13:10:48 crc kubenswrapper[4735]: I1008 13:10:48.797213 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5df03f05-01ff-4f82-8859-6d9fd0bd0dd7-utilities\") pod \"community-operators-vgqxz\" (UID: \"5df03f05-01ff-4f82-8859-6d9fd0bd0dd7\") " pod="openshift-marketplace/community-operators-vgqxz" Oct 08 13:10:48 crc kubenswrapper[4735]: I1008 13:10:48.797240 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mrd7\" (UniqueName: \"kubernetes.io/projected/5df03f05-01ff-4f82-8859-6d9fd0bd0dd7-kube-api-access-9mrd7\") pod \"community-operators-vgqxz\" (UID: \"5df03f05-01ff-4f82-8859-6d9fd0bd0dd7\") " pod="openshift-marketplace/community-operators-vgqxz" Oct 08 13:10:48 crc kubenswrapper[4735]: I1008 13:10:48.899359 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5df03f05-01ff-4f82-8859-6d9fd0bd0dd7-utilities\") pod \"community-operators-vgqxz\" (UID: \"5df03f05-01ff-4f82-8859-6d9fd0bd0dd7\") " pod="openshift-marketplace/community-operators-vgqxz" Oct 08 13:10:48 crc kubenswrapper[4735]: I1008 13:10:48.899484 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-9mrd7\" (UniqueName: \"kubernetes.io/projected/5df03f05-01ff-4f82-8859-6d9fd0bd0dd7-kube-api-access-9mrd7\") pod \"community-operators-vgqxz\" (UID: \"5df03f05-01ff-4f82-8859-6d9fd0bd0dd7\") " pod="openshift-marketplace/community-operators-vgqxz" Oct 08 13:10:48 crc kubenswrapper[4735]: I1008 13:10:48.899767 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5df03f05-01ff-4f82-8859-6d9fd0bd0dd7-catalog-content\") pod \"community-operators-vgqxz\" (UID: \"5df03f05-01ff-4f82-8859-6d9fd0bd0dd7\") " pod="openshift-marketplace/community-operators-vgqxz" Oct 08 13:10:48 crc kubenswrapper[4735]: I1008 13:10:48.900629 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5df03f05-01ff-4f82-8859-6d9fd0bd0dd7-utilities\") pod \"community-operators-vgqxz\" (UID: \"5df03f05-01ff-4f82-8859-6d9fd0bd0dd7\") " pod="openshift-marketplace/community-operators-vgqxz" Oct 08 13:10:48 crc kubenswrapper[4735]: I1008 13:10:48.900738 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5df03f05-01ff-4f82-8859-6d9fd0bd0dd7-catalog-content\") pod \"community-operators-vgqxz\" (UID: \"5df03f05-01ff-4f82-8859-6d9fd0bd0dd7\") " pod="openshift-marketplace/community-operators-vgqxz" Oct 08 13:10:48 crc kubenswrapper[4735]: I1008 13:10:48.909163 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a2c0c5f-5bfa-40ba-87da-87d1e498abda" path="/var/lib/kubelet/pods/8a2c0c5f-5bfa-40ba-87da-87d1e498abda/volumes" Oct 08 13:10:48 crc kubenswrapper[4735]: I1008 13:10:48.939306 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mrd7\" (UniqueName: \"kubernetes.io/projected/5df03f05-01ff-4f82-8859-6d9fd0bd0dd7-kube-api-access-9mrd7\") pod 
\"community-operators-vgqxz\" (UID: \"5df03f05-01ff-4f82-8859-6d9fd0bd0dd7\") " pod="openshift-marketplace/community-operators-vgqxz" Oct 08 13:10:49 crc kubenswrapper[4735]: I1008 13:10:49.061041 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h2xst" event={"ID":"da1954e4-438d-4fb3-a435-1fbf29da5382","Type":"ContainerStarted","Data":"17ca6ce2178125b2f97c546094277a22c7da9d4610fdb821b3fd55bfe8f1f2ba"} Oct 08 13:10:49 crc kubenswrapper[4735]: I1008 13:10:49.061885 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h2xst" event={"ID":"da1954e4-438d-4fb3-a435-1fbf29da5382","Type":"ContainerStarted","Data":"d54d77a073510f6f6819664de8c2dde5712be8d5615dbeafb4703d049ee7137f"} Oct 08 13:10:49 crc kubenswrapper[4735]: I1008 13:10:49.061961 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h2xst" event={"ID":"da1954e4-438d-4fb3-a435-1fbf29da5382","Type":"ContainerStarted","Data":"839c136ad352a232568bec3edb929e27a5ab052f3d1943c56f619436a6e37e3f"} Oct 08 13:10:49 crc kubenswrapper[4735]: I1008 13:10:49.090318 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vgqxz" Oct 08 13:10:49 crc kubenswrapper[4735]: I1008 13:10:49.411953 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vgqxz"] Oct 08 13:10:50 crc kubenswrapper[4735]: I1008 13:10:50.071372 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h2xst" event={"ID":"da1954e4-438d-4fb3-a435-1fbf29da5382","Type":"ContainerStarted","Data":"9f407c6be271e484460c06634612c6fcf8d611a31e71d5082caf415bbf581e05"} Oct 08 13:10:50 crc kubenswrapper[4735]: I1008 13:10:50.071746 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-h2xst" Oct 08 13:10:50 crc kubenswrapper[4735]: I1008 13:10:50.072915 4735 generic.go:334] "Generic (PLEG): container finished" podID="5df03f05-01ff-4f82-8859-6d9fd0bd0dd7" containerID="e14b86a757e0cf2d548b0483f190ee8098ffa9a14daba9622ada629181811150" exitCode=0 Oct 08 13:10:50 crc kubenswrapper[4735]: I1008 13:10:50.072949 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vgqxz" event={"ID":"5df03f05-01ff-4f82-8859-6d9fd0bd0dd7","Type":"ContainerDied","Data":"e14b86a757e0cf2d548b0483f190ee8098ffa9a14daba9622ada629181811150"} Oct 08 13:10:50 crc kubenswrapper[4735]: I1008 13:10:50.073003 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vgqxz" event={"ID":"5df03f05-01ff-4f82-8859-6d9fd0bd0dd7","Type":"ContainerStarted","Data":"5a24d5c3cce387970f7f1ba88c3ca0e7d8c810af4cf68d0de9c30d36fd4d12d1"} Oct 08 13:10:50 crc kubenswrapper[4735]: I1008 13:10:50.103740 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-h2xst" podStartSLOduration=6.61432018 podStartE2EDuration="14.103725807s" podCreationTimestamp="2025-10-08 13:10:36 +0000 UTC" firstStartedPulling="2025-10-08 13:10:37.333908227 +0000 UTC m=+887.142212761" 
lastFinishedPulling="2025-10-08 13:10:44.823313834 +0000 UTC m=+894.631618388" observedRunningTime="2025-10-08 13:10:50.102229776 +0000 UTC m=+899.910534320" watchObservedRunningTime="2025-10-08 13:10:50.103725807 +0000 UTC m=+899.912030351" Oct 08 13:10:50 crc kubenswrapper[4735]: I1008 13:10:50.407564 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69zq75m"] Oct 08 13:10:50 crc kubenswrapper[4735]: I1008 13:10:50.408854 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69zq75m" Oct 08 13:10:50 crc kubenswrapper[4735]: I1008 13:10:50.411565 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 08 13:10:50 crc kubenswrapper[4735]: I1008 13:10:50.429691 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69zq75m"] Oct 08 13:10:50 crc kubenswrapper[4735]: I1008 13:10:50.523548 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e3eae11a-5682-4826-9eca-ebc268c2114b-util\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69zq75m\" (UID: \"e3eae11a-5682-4826-9eca-ebc268c2114b\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69zq75m" Oct 08 13:10:50 crc kubenswrapper[4735]: I1008 13:10:50.523642 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfpxt\" (UniqueName: \"kubernetes.io/projected/e3eae11a-5682-4826-9eca-ebc268c2114b-kube-api-access-jfpxt\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69zq75m\" (UID: \"e3eae11a-5682-4826-9eca-ebc268c2114b\") " 
pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69zq75m" Oct 08 13:10:50 crc kubenswrapper[4735]: I1008 13:10:50.523716 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e3eae11a-5682-4826-9eca-ebc268c2114b-bundle\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69zq75m\" (UID: \"e3eae11a-5682-4826-9eca-ebc268c2114b\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69zq75m" Oct 08 13:10:50 crc kubenswrapper[4735]: I1008 13:10:50.624562 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfpxt\" (UniqueName: \"kubernetes.io/projected/e3eae11a-5682-4826-9eca-ebc268c2114b-kube-api-access-jfpxt\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69zq75m\" (UID: \"e3eae11a-5682-4826-9eca-ebc268c2114b\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69zq75m" Oct 08 13:10:50 crc kubenswrapper[4735]: I1008 13:10:50.624685 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e3eae11a-5682-4826-9eca-ebc268c2114b-bundle\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69zq75m\" (UID: \"e3eae11a-5682-4826-9eca-ebc268c2114b\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69zq75m" Oct 08 13:10:50 crc kubenswrapper[4735]: I1008 13:10:50.625341 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e3eae11a-5682-4826-9eca-ebc268c2114b-util\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69zq75m\" (UID: \"e3eae11a-5682-4826-9eca-ebc268c2114b\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69zq75m" Oct 08 13:10:50 crc kubenswrapper[4735]: I1008 
13:10:50.625703 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e3eae11a-5682-4826-9eca-ebc268c2114b-bundle\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69zq75m\" (UID: \"e3eae11a-5682-4826-9eca-ebc268c2114b\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69zq75m" Oct 08 13:10:50 crc kubenswrapper[4735]: I1008 13:10:50.625748 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e3eae11a-5682-4826-9eca-ebc268c2114b-util\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69zq75m\" (UID: \"e3eae11a-5682-4826-9eca-ebc268c2114b\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69zq75m" Oct 08 13:10:50 crc kubenswrapper[4735]: I1008 13:10:50.677043 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfpxt\" (UniqueName: \"kubernetes.io/projected/e3eae11a-5682-4826-9eca-ebc268c2114b-kube-api-access-jfpxt\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69zq75m\" (UID: \"e3eae11a-5682-4826-9eca-ebc268c2114b\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69zq75m" Oct 08 13:10:50 crc kubenswrapper[4735]: I1008 13:10:50.724242 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69zq75m" Oct 08 13:10:51 crc kubenswrapper[4735]: I1008 13:10:51.081265 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vgqxz" event={"ID":"5df03f05-01ff-4f82-8859-6d9fd0bd0dd7","Type":"ContainerStarted","Data":"87e67f7032bc295ed8b6fbbd939f9549aebf53a1642aea26b593239abdf7d01d"} Oct 08 13:10:51 crc kubenswrapper[4735]: I1008 13:10:51.163291 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69zq75m"] Oct 08 13:10:51 crc kubenswrapper[4735]: E1008 13:10:51.241851 4735 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5df03f05_01ff_4f82_8859_6d9fd0bd0dd7.slice/crio-87e67f7032bc295ed8b6fbbd939f9549aebf53a1642aea26b593239abdf7d01d.scope\": RecentStats: unable to find data in memory cache]" Oct 08 13:10:52 crc kubenswrapper[4735]: I1008 13:10:52.088774 4735 generic.go:334] "Generic (PLEG): container finished" podID="e3eae11a-5682-4826-9eca-ebc268c2114b" containerID="4d8dcd605696a9d0ffd2c647153bc8079f938f0678907801bb2e333618cb60c5" exitCode=0 Oct 08 13:10:52 crc kubenswrapper[4735]: I1008 13:10:52.088856 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69zq75m" event={"ID":"e3eae11a-5682-4826-9eca-ebc268c2114b","Type":"ContainerDied","Data":"4d8dcd605696a9d0ffd2c647153bc8079f938f0678907801bb2e333618cb60c5"} Oct 08 13:10:52 crc kubenswrapper[4735]: I1008 13:10:52.088887 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69zq75m" 
event={"ID":"e3eae11a-5682-4826-9eca-ebc268c2114b","Type":"ContainerStarted","Data":"cb4c242f7a51b42e6bf4f99471c8d3266d3add1f482c68c03695b4dbdabf078f"} Oct 08 13:10:52 crc kubenswrapper[4735]: I1008 13:10:52.092323 4735 generic.go:334] "Generic (PLEG): container finished" podID="5df03f05-01ff-4f82-8859-6d9fd0bd0dd7" containerID="87e67f7032bc295ed8b6fbbd939f9549aebf53a1642aea26b593239abdf7d01d" exitCode=0 Oct 08 13:10:52 crc kubenswrapper[4735]: I1008 13:10:52.092366 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vgqxz" event={"ID":"5df03f05-01ff-4f82-8859-6d9fd0bd0dd7","Type":"ContainerDied","Data":"87e67f7032bc295ed8b6fbbd939f9549aebf53a1642aea26b593239abdf7d01d"} Oct 08 13:10:52 crc kubenswrapper[4735]: I1008 13:10:52.198205 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-h2xst" Oct 08 13:10:52 crc kubenswrapper[4735]: I1008 13:10:52.234003 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-h2xst" Oct 08 13:10:53 crc kubenswrapper[4735]: I1008 13:10:53.102010 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vgqxz" event={"ID":"5df03f05-01ff-4f82-8859-6d9fd0bd0dd7","Type":"ContainerStarted","Data":"855b35cd9ddb480e243eb5d80e9d20f185e889b4ee3252c57706fb2da9741db5"} Oct 08 13:10:53 crc kubenswrapper[4735]: I1008 13:10:53.143656 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vgqxz" podStartSLOduration=2.542217446 podStartE2EDuration="5.143640387s" podCreationTimestamp="2025-10-08 13:10:48 +0000 UTC" firstStartedPulling="2025-10-08 13:10:50.074682796 +0000 UTC m=+899.882987340" lastFinishedPulling="2025-10-08 13:10:52.676105737 +0000 UTC m=+902.484410281" observedRunningTime="2025-10-08 13:10:53.140792309 +0000 UTC m=+902.949096853" watchObservedRunningTime="2025-10-08 
13:10:53.143640387 +0000 UTC m=+902.951944931" Oct 08 13:10:55 crc kubenswrapper[4735]: I1008 13:10:55.114850 4735 generic.go:334] "Generic (PLEG): container finished" podID="e3eae11a-5682-4826-9eca-ebc268c2114b" containerID="b714a32b227adc2721cd03c1634a99ba21a59df892129d7b5e39d84883dccdb0" exitCode=0 Oct 08 13:10:55 crc kubenswrapper[4735]: I1008 13:10:55.114927 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69zq75m" event={"ID":"e3eae11a-5682-4826-9eca-ebc268c2114b","Type":"ContainerDied","Data":"b714a32b227adc2721cd03c1634a99ba21a59df892129d7b5e39d84883dccdb0"} Oct 08 13:10:56 crc kubenswrapper[4735]: I1008 13:10:56.122994 4735 generic.go:334] "Generic (PLEG): container finished" podID="e3eae11a-5682-4826-9eca-ebc268c2114b" containerID="595beb10ec781a1b18ab92ad6b4c67e9a4023237e42af38c595dd4a14206ad9e" exitCode=0 Oct 08 13:10:56 crc kubenswrapper[4735]: I1008 13:10:56.123068 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69zq75m" event={"ID":"e3eae11a-5682-4826-9eca-ebc268c2114b","Type":"ContainerDied","Data":"595beb10ec781a1b18ab92ad6b4c67e9a4023237e42af38c595dd4a14206ad9e"} Oct 08 13:10:56 crc kubenswrapper[4735]: I1008 13:10:56.611320 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-frhpw" Oct 08 13:10:57 crc kubenswrapper[4735]: I1008 13:10:57.209234 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-h2xst" Oct 08 13:10:57 crc kubenswrapper[4735]: I1008 13:10:57.343778 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69zq75m" Oct 08 13:10:57 crc kubenswrapper[4735]: I1008 13:10:57.530684 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e3eae11a-5682-4826-9eca-ebc268c2114b-util\") pod \"e3eae11a-5682-4826-9eca-ebc268c2114b\" (UID: \"e3eae11a-5682-4826-9eca-ebc268c2114b\") " Oct 08 13:10:57 crc kubenswrapper[4735]: I1008 13:10:57.530783 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfpxt\" (UniqueName: \"kubernetes.io/projected/e3eae11a-5682-4826-9eca-ebc268c2114b-kube-api-access-jfpxt\") pod \"e3eae11a-5682-4826-9eca-ebc268c2114b\" (UID: \"e3eae11a-5682-4826-9eca-ebc268c2114b\") " Oct 08 13:10:57 crc kubenswrapper[4735]: I1008 13:10:57.530857 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e3eae11a-5682-4826-9eca-ebc268c2114b-bundle\") pod \"e3eae11a-5682-4826-9eca-ebc268c2114b\" (UID: \"e3eae11a-5682-4826-9eca-ebc268c2114b\") " Oct 08 13:10:57 crc kubenswrapper[4735]: I1008 13:10:57.532005 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3eae11a-5682-4826-9eca-ebc268c2114b-bundle" (OuterVolumeSpecName: "bundle") pod "e3eae11a-5682-4826-9eca-ebc268c2114b" (UID: "e3eae11a-5682-4826-9eca-ebc268c2114b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:10:57 crc kubenswrapper[4735]: I1008 13:10:57.537050 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3eae11a-5682-4826-9eca-ebc268c2114b-kube-api-access-jfpxt" (OuterVolumeSpecName: "kube-api-access-jfpxt") pod "e3eae11a-5682-4826-9eca-ebc268c2114b" (UID: "e3eae11a-5682-4826-9eca-ebc268c2114b"). InnerVolumeSpecName "kube-api-access-jfpxt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:10:57 crc kubenswrapper[4735]: I1008 13:10:57.541381 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3eae11a-5682-4826-9eca-ebc268c2114b-util" (OuterVolumeSpecName: "util") pod "e3eae11a-5682-4826-9eca-ebc268c2114b" (UID: "e3eae11a-5682-4826-9eca-ebc268c2114b"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:10:57 crc kubenswrapper[4735]: I1008 13:10:57.632110 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfpxt\" (UniqueName: \"kubernetes.io/projected/e3eae11a-5682-4826-9eca-ebc268c2114b-kube-api-access-jfpxt\") on node \"crc\" DevicePath \"\"" Oct 08 13:10:57 crc kubenswrapper[4735]: I1008 13:10:57.632141 4735 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e3eae11a-5682-4826-9eca-ebc268c2114b-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 13:10:57 crc kubenswrapper[4735]: I1008 13:10:57.632149 4735 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e3eae11a-5682-4826-9eca-ebc268c2114b-util\") on node \"crc\" DevicePath \"\"" Oct 08 13:10:58 crc kubenswrapper[4735]: I1008 13:10:58.138103 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69zq75m" event={"ID":"e3eae11a-5682-4826-9eca-ebc268c2114b","Type":"ContainerDied","Data":"cb4c242f7a51b42e6bf4f99471c8d3266d3add1f482c68c03695b4dbdabf078f"} Oct 08 13:10:58 crc kubenswrapper[4735]: I1008 13:10:58.138158 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb4c242f7a51b42e6bf4f99471c8d3266d3add1f482c68c03695b4dbdabf078f" Oct 08 13:10:58 crc kubenswrapper[4735]: I1008 13:10:58.138154 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69zq75m" Oct 08 13:10:59 crc kubenswrapper[4735]: I1008 13:10:59.090791 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vgqxz" Oct 08 13:10:59 crc kubenswrapper[4735]: I1008 13:10:59.090853 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vgqxz" Oct 08 13:10:59 crc kubenswrapper[4735]: I1008 13:10:59.174934 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vgqxz" Oct 08 13:10:59 crc kubenswrapper[4735]: I1008 13:10:59.294369 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vgqxz" Oct 08 13:11:00 crc kubenswrapper[4735]: I1008 13:11:00.299748 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-rjb74"] Oct 08 13:11:00 crc kubenswrapper[4735]: E1008 13:11:00.299953 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3eae11a-5682-4826-9eca-ebc268c2114b" containerName="pull" Oct 08 13:11:00 crc kubenswrapper[4735]: I1008 13:11:00.299963 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3eae11a-5682-4826-9eca-ebc268c2114b" containerName="pull" Oct 08 13:11:00 crc kubenswrapper[4735]: E1008 13:11:00.299977 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3eae11a-5682-4826-9eca-ebc268c2114b" containerName="extract" Oct 08 13:11:00 crc kubenswrapper[4735]: I1008 13:11:00.299983 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3eae11a-5682-4826-9eca-ebc268c2114b" containerName="extract" Oct 08 13:11:00 crc kubenswrapper[4735]: E1008 13:11:00.299993 4735 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e3eae11a-5682-4826-9eca-ebc268c2114b" containerName="util" Oct 08 13:11:00 crc kubenswrapper[4735]: I1008 13:11:00.299998 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3eae11a-5682-4826-9eca-ebc268c2114b" containerName="util" Oct 08 13:11:00 crc kubenswrapper[4735]: I1008 13:11:00.300121 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3eae11a-5682-4826-9eca-ebc268c2114b" containerName="extract" Oct 08 13:11:00 crc kubenswrapper[4735]: I1008 13:11:00.300624 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-rjb74" Oct 08 13:11:00 crc kubenswrapper[4735]: I1008 13:11:00.302876 4735 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-bqk5g" Oct 08 13:11:00 crc kubenswrapper[4735]: I1008 13:11:00.304393 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Oct 08 13:11:00 crc kubenswrapper[4735]: I1008 13:11:00.304606 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Oct 08 13:11:00 crc kubenswrapper[4735]: I1008 13:11:00.322138 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-rjb74"] Oct 08 13:11:00 crc kubenswrapper[4735]: I1008 13:11:00.377923 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2l9qs\" (UniqueName: \"kubernetes.io/projected/c40ac050-19b6-4d01-b8c7-a9918a4c0b44-kube-api-access-2l9qs\") pod \"cert-manager-operator-controller-manager-57cd46d6d-rjb74\" (UID: \"c40ac050-19b6-4d01-b8c7-a9918a4c0b44\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-rjb74" Oct 08 13:11:00 crc kubenswrapper[4735]: I1008 13:11:00.479116 
4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2l9qs\" (UniqueName: \"kubernetes.io/projected/c40ac050-19b6-4d01-b8c7-a9918a4c0b44-kube-api-access-2l9qs\") pod \"cert-manager-operator-controller-manager-57cd46d6d-rjb74\" (UID: \"c40ac050-19b6-4d01-b8c7-a9918a4c0b44\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-rjb74" Oct 08 13:11:00 crc kubenswrapper[4735]: I1008 13:11:00.502047 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2l9qs\" (UniqueName: \"kubernetes.io/projected/c40ac050-19b6-4d01-b8c7-a9918a4c0b44-kube-api-access-2l9qs\") pod \"cert-manager-operator-controller-manager-57cd46d6d-rjb74\" (UID: \"c40ac050-19b6-4d01-b8c7-a9918a4c0b44\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-rjb74" Oct 08 13:11:00 crc kubenswrapper[4735]: I1008 13:11:00.616123 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-rjb74" Oct 08 13:11:00 crc kubenswrapper[4735]: I1008 13:11:00.824270 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-rjb74"] Oct 08 13:11:00 crc kubenswrapper[4735]: W1008 13:11:00.833803 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc40ac050_19b6_4d01_b8c7_a9918a4c0b44.slice/crio-36d76be5dc691c72df7f6aff28736883ce7bab04e95fb59edcc7dbb5314fee5b WatchSource:0}: Error finding container 36d76be5dc691c72df7f6aff28736883ce7bab04e95fb59edcc7dbb5314fee5b: Status 404 returned error can't find the container with id 36d76be5dc691c72df7f6aff28736883ce7bab04e95fb59edcc7dbb5314fee5b Oct 08 13:11:01 crc kubenswrapper[4735]: I1008 13:11:01.190155 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-rjb74" event={"ID":"c40ac050-19b6-4d01-b8c7-a9918a4c0b44","Type":"ContainerStarted","Data":"36d76be5dc691c72df7f6aff28736883ce7bab04e95fb59edcc7dbb5314fee5b"} Oct 08 13:11:01 crc kubenswrapper[4735]: I1008 13:11:01.546236 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vgqxz"] Oct 08 13:11:01 crc kubenswrapper[4735]: I1008 13:11:01.546719 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vgqxz" podUID="5df03f05-01ff-4f82-8859-6d9fd0bd0dd7" containerName="registry-server" containerID="cri-o://855b35cd9ddb480e243eb5d80e9d20f185e889b4ee3252c57706fb2da9741db5" gracePeriod=2 Oct 08 13:11:01 crc kubenswrapper[4735]: I1008 13:11:01.973326 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vgqxz" Oct 08 13:11:02 crc kubenswrapper[4735]: I1008 13:11:02.098595 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5df03f05-01ff-4f82-8859-6d9fd0bd0dd7-utilities\") pod \"5df03f05-01ff-4f82-8859-6d9fd0bd0dd7\" (UID: \"5df03f05-01ff-4f82-8859-6d9fd0bd0dd7\") " Oct 08 13:11:02 crc kubenswrapper[4735]: I1008 13:11:02.099012 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mrd7\" (UniqueName: \"kubernetes.io/projected/5df03f05-01ff-4f82-8859-6d9fd0bd0dd7-kube-api-access-9mrd7\") pod \"5df03f05-01ff-4f82-8859-6d9fd0bd0dd7\" (UID: \"5df03f05-01ff-4f82-8859-6d9fd0bd0dd7\") " Oct 08 13:11:02 crc kubenswrapper[4735]: I1008 13:11:02.099139 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5df03f05-01ff-4f82-8859-6d9fd0bd0dd7-catalog-content\") pod \"5df03f05-01ff-4f82-8859-6d9fd0bd0dd7\" 
(UID: \"5df03f05-01ff-4f82-8859-6d9fd0bd0dd7\") " Oct 08 13:11:02 crc kubenswrapper[4735]: I1008 13:11:02.099746 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5df03f05-01ff-4f82-8859-6d9fd0bd0dd7-utilities" (OuterVolumeSpecName: "utilities") pod "5df03f05-01ff-4f82-8859-6d9fd0bd0dd7" (UID: "5df03f05-01ff-4f82-8859-6d9fd0bd0dd7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:11:02 crc kubenswrapper[4735]: I1008 13:11:02.104282 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5df03f05-01ff-4f82-8859-6d9fd0bd0dd7-kube-api-access-9mrd7" (OuterVolumeSpecName: "kube-api-access-9mrd7") pod "5df03f05-01ff-4f82-8859-6d9fd0bd0dd7" (UID: "5df03f05-01ff-4f82-8859-6d9fd0bd0dd7"). InnerVolumeSpecName "kube-api-access-9mrd7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:11:02 crc kubenswrapper[4735]: I1008 13:11:02.200204 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mrd7\" (UniqueName: \"kubernetes.io/projected/5df03f05-01ff-4f82-8859-6d9fd0bd0dd7-kube-api-access-9mrd7\") on node \"crc\" DevicePath \"\"" Oct 08 13:11:02 crc kubenswrapper[4735]: I1008 13:11:02.200452 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5df03f05-01ff-4f82-8859-6d9fd0bd0dd7-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 13:11:02 crc kubenswrapper[4735]: I1008 13:11:02.201938 4735 generic.go:334] "Generic (PLEG): container finished" podID="5df03f05-01ff-4f82-8859-6d9fd0bd0dd7" containerID="855b35cd9ddb480e243eb5d80e9d20f185e889b4ee3252c57706fb2da9741db5" exitCode=0 Oct 08 13:11:02 crc kubenswrapper[4735]: I1008 13:11:02.202046 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vgqxz" 
event={"ID":"5df03f05-01ff-4f82-8859-6d9fd0bd0dd7","Type":"ContainerDied","Data":"855b35cd9ddb480e243eb5d80e9d20f185e889b4ee3252c57706fb2da9741db5"} Oct 08 13:11:02 crc kubenswrapper[4735]: I1008 13:11:02.202152 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vgqxz" event={"ID":"5df03f05-01ff-4f82-8859-6d9fd0bd0dd7","Type":"ContainerDied","Data":"5a24d5c3cce387970f7f1ba88c3ca0e7d8c810af4cf68d0de9c30d36fd4d12d1"} Oct 08 13:11:02 crc kubenswrapper[4735]: I1008 13:11:02.202233 4735 scope.go:117] "RemoveContainer" containerID="855b35cd9ddb480e243eb5d80e9d20f185e889b4ee3252c57706fb2da9741db5" Oct 08 13:11:02 crc kubenswrapper[4735]: I1008 13:11:02.202411 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vgqxz" Oct 08 13:11:02 crc kubenswrapper[4735]: I1008 13:11:02.225808 4735 scope.go:117] "RemoveContainer" containerID="87e67f7032bc295ed8b6fbbd939f9549aebf53a1642aea26b593239abdf7d01d" Oct 08 13:11:02 crc kubenswrapper[4735]: I1008 13:11:02.243114 4735 scope.go:117] "RemoveContainer" containerID="e14b86a757e0cf2d548b0483f190ee8098ffa9a14daba9622ada629181811150" Oct 08 13:11:02 crc kubenswrapper[4735]: I1008 13:11:02.263186 4735 scope.go:117] "RemoveContainer" containerID="855b35cd9ddb480e243eb5d80e9d20f185e889b4ee3252c57706fb2da9741db5" Oct 08 13:11:02 crc kubenswrapper[4735]: E1008 13:11:02.263815 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"855b35cd9ddb480e243eb5d80e9d20f185e889b4ee3252c57706fb2da9741db5\": container with ID starting with 855b35cd9ddb480e243eb5d80e9d20f185e889b4ee3252c57706fb2da9741db5 not found: ID does not exist" containerID="855b35cd9ddb480e243eb5d80e9d20f185e889b4ee3252c57706fb2da9741db5" Oct 08 13:11:02 crc kubenswrapper[4735]: I1008 13:11:02.263859 4735 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"855b35cd9ddb480e243eb5d80e9d20f185e889b4ee3252c57706fb2da9741db5"} err="failed to get container status \"855b35cd9ddb480e243eb5d80e9d20f185e889b4ee3252c57706fb2da9741db5\": rpc error: code = NotFound desc = could not find container \"855b35cd9ddb480e243eb5d80e9d20f185e889b4ee3252c57706fb2da9741db5\": container with ID starting with 855b35cd9ddb480e243eb5d80e9d20f185e889b4ee3252c57706fb2da9741db5 not found: ID does not exist" Oct 08 13:11:02 crc kubenswrapper[4735]: I1008 13:11:02.263886 4735 scope.go:117] "RemoveContainer" containerID="87e67f7032bc295ed8b6fbbd939f9549aebf53a1642aea26b593239abdf7d01d" Oct 08 13:11:02 crc kubenswrapper[4735]: E1008 13:11:02.264247 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87e67f7032bc295ed8b6fbbd939f9549aebf53a1642aea26b593239abdf7d01d\": container with ID starting with 87e67f7032bc295ed8b6fbbd939f9549aebf53a1642aea26b593239abdf7d01d not found: ID does not exist" containerID="87e67f7032bc295ed8b6fbbd939f9549aebf53a1642aea26b593239abdf7d01d" Oct 08 13:11:02 crc kubenswrapper[4735]: I1008 13:11:02.264299 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87e67f7032bc295ed8b6fbbd939f9549aebf53a1642aea26b593239abdf7d01d"} err="failed to get container status \"87e67f7032bc295ed8b6fbbd939f9549aebf53a1642aea26b593239abdf7d01d\": rpc error: code = NotFound desc = could not find container \"87e67f7032bc295ed8b6fbbd939f9549aebf53a1642aea26b593239abdf7d01d\": container with ID starting with 87e67f7032bc295ed8b6fbbd939f9549aebf53a1642aea26b593239abdf7d01d not found: ID does not exist" Oct 08 13:11:02 crc kubenswrapper[4735]: I1008 13:11:02.264325 4735 scope.go:117] "RemoveContainer" containerID="e14b86a757e0cf2d548b0483f190ee8098ffa9a14daba9622ada629181811150" Oct 08 13:11:02 crc kubenswrapper[4735]: E1008 13:11:02.264660 4735 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"e14b86a757e0cf2d548b0483f190ee8098ffa9a14daba9622ada629181811150\": container with ID starting with e14b86a757e0cf2d548b0483f190ee8098ffa9a14daba9622ada629181811150 not found: ID does not exist" containerID="e14b86a757e0cf2d548b0483f190ee8098ffa9a14daba9622ada629181811150" Oct 08 13:11:02 crc kubenswrapper[4735]: I1008 13:11:02.264684 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e14b86a757e0cf2d548b0483f190ee8098ffa9a14daba9622ada629181811150"} err="failed to get container status \"e14b86a757e0cf2d548b0483f190ee8098ffa9a14daba9622ada629181811150\": rpc error: code = NotFound desc = could not find container \"e14b86a757e0cf2d548b0483f190ee8098ffa9a14daba9622ada629181811150\": container with ID starting with e14b86a757e0cf2d548b0483f190ee8098ffa9a14daba9622ada629181811150 not found: ID does not exist" Oct 08 13:11:02 crc kubenswrapper[4735]: I1008 13:11:02.464182 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5df03f05-01ff-4f82-8859-6d9fd0bd0dd7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5df03f05-01ff-4f82-8859-6d9fd0bd0dd7" (UID: "5df03f05-01ff-4f82-8859-6d9fd0bd0dd7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:11:02 crc kubenswrapper[4735]: I1008 13:11:02.504043 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5df03f05-01ff-4f82-8859-6d9fd0bd0dd7-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 13:11:02 crc kubenswrapper[4735]: I1008 13:11:02.534583 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vgqxz"] Oct 08 13:11:02 crc kubenswrapper[4735]: I1008 13:11:02.540592 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vgqxz"] Oct 08 13:11:02 crc kubenswrapper[4735]: I1008 13:11:02.905949 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5df03f05-01ff-4f82-8859-6d9fd0bd0dd7" path="/var/lib/kubelet/pods/5df03f05-01ff-4f82-8859-6d9fd0bd0dd7/volumes" Oct 08 13:11:08 crc kubenswrapper[4735]: I1008 13:11:08.244426 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-rjb74" event={"ID":"c40ac050-19b6-4d01-b8c7-a9918a4c0b44","Type":"ContainerStarted","Data":"d109390bed06eed1d262aa9a04af9f21939c1fe2b4c8166aa3a37c94f89e5015"} Oct 08 13:11:11 crc kubenswrapper[4735]: I1008 13:11:11.114782 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-rjb74" podStartSLOduration=4.331584409 podStartE2EDuration="11.11476666s" podCreationTimestamp="2025-10-08 13:11:00 +0000 UTC" firstStartedPulling="2025-10-08 13:11:00.836124067 +0000 UTC m=+910.644428611" lastFinishedPulling="2025-10-08 13:11:07.619306318 +0000 UTC m=+917.427610862" observedRunningTime="2025-10-08 13:11:08.269137705 +0000 UTC m=+918.077442259" watchObservedRunningTime="2025-10-08 13:11:11.11476666 +0000 UTC m=+920.923071204" Oct 08 13:11:11 crc kubenswrapper[4735]: I1008 
13:11:11.117716 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-8k9ww"] Oct 08 13:11:11 crc kubenswrapper[4735]: E1008 13:11:11.117953 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5df03f05-01ff-4f82-8859-6d9fd0bd0dd7" containerName="extract-content" Oct 08 13:11:11 crc kubenswrapper[4735]: I1008 13:11:11.117973 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5df03f05-01ff-4f82-8859-6d9fd0bd0dd7" containerName="extract-content" Oct 08 13:11:11 crc kubenswrapper[4735]: E1008 13:11:11.117986 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5df03f05-01ff-4f82-8859-6d9fd0bd0dd7" containerName="extract-utilities" Oct 08 13:11:11 crc kubenswrapper[4735]: I1008 13:11:11.117993 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5df03f05-01ff-4f82-8859-6d9fd0bd0dd7" containerName="extract-utilities" Oct 08 13:11:11 crc kubenswrapper[4735]: E1008 13:11:11.118005 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5df03f05-01ff-4f82-8859-6d9fd0bd0dd7" containerName="registry-server" Oct 08 13:11:11 crc kubenswrapper[4735]: I1008 13:11:11.118011 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5df03f05-01ff-4f82-8859-6d9fd0bd0dd7" containerName="registry-server" Oct 08 13:11:11 crc kubenswrapper[4735]: I1008 13:11:11.118105 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="5df03f05-01ff-4f82-8859-6d9fd0bd0dd7" containerName="registry-server" Oct 08 13:11:11 crc kubenswrapper[4735]: I1008 13:11:11.118468 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-d969966f-8k9ww" Oct 08 13:11:11 crc kubenswrapper[4735]: I1008 13:11:11.120578 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Oct 08 13:11:11 crc kubenswrapper[4735]: I1008 13:11:11.120587 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Oct 08 13:11:11 crc kubenswrapper[4735]: I1008 13:11:11.120617 4735 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-x2jxd" Oct 08 13:11:11 crc kubenswrapper[4735]: I1008 13:11:11.134725 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-8k9ww"] Oct 08 13:11:11 crc kubenswrapper[4735]: I1008 13:11:11.215853 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/448cb414-b6c4-41dd-b4c8-231e28be6fcb-bound-sa-token\") pod \"cert-manager-webhook-d969966f-8k9ww\" (UID: \"448cb414-b6c4-41dd-b4c8-231e28be6fcb\") " pod="cert-manager/cert-manager-webhook-d969966f-8k9ww" Oct 08 13:11:11 crc kubenswrapper[4735]: I1008 13:11:11.215906 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k88kc\" (UniqueName: \"kubernetes.io/projected/448cb414-b6c4-41dd-b4c8-231e28be6fcb-kube-api-access-k88kc\") pod \"cert-manager-webhook-d969966f-8k9ww\" (UID: \"448cb414-b6c4-41dd-b4c8-231e28be6fcb\") " pod="cert-manager/cert-manager-webhook-d969966f-8k9ww" Oct 08 13:11:11 crc kubenswrapper[4735]: I1008 13:11:11.316900 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/448cb414-b6c4-41dd-b4c8-231e28be6fcb-bound-sa-token\") pod \"cert-manager-webhook-d969966f-8k9ww\" (UID: \"448cb414-b6c4-41dd-b4c8-231e28be6fcb\") " 
pod="cert-manager/cert-manager-webhook-d969966f-8k9ww" Oct 08 13:11:11 crc kubenswrapper[4735]: I1008 13:11:11.316955 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k88kc\" (UniqueName: \"kubernetes.io/projected/448cb414-b6c4-41dd-b4c8-231e28be6fcb-kube-api-access-k88kc\") pod \"cert-manager-webhook-d969966f-8k9ww\" (UID: \"448cb414-b6c4-41dd-b4c8-231e28be6fcb\") " pod="cert-manager/cert-manager-webhook-d969966f-8k9ww" Oct 08 13:11:11 crc kubenswrapper[4735]: I1008 13:11:11.341059 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/448cb414-b6c4-41dd-b4c8-231e28be6fcb-bound-sa-token\") pod \"cert-manager-webhook-d969966f-8k9ww\" (UID: \"448cb414-b6c4-41dd-b4c8-231e28be6fcb\") " pod="cert-manager/cert-manager-webhook-d969966f-8k9ww" Oct 08 13:11:11 crc kubenswrapper[4735]: I1008 13:11:11.341163 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k88kc\" (UniqueName: \"kubernetes.io/projected/448cb414-b6c4-41dd-b4c8-231e28be6fcb-kube-api-access-k88kc\") pod \"cert-manager-webhook-d969966f-8k9ww\" (UID: \"448cb414-b6c4-41dd-b4c8-231e28be6fcb\") " pod="cert-manager/cert-manager-webhook-d969966f-8k9ww" Oct 08 13:11:11 crc kubenswrapper[4735]: I1008 13:11:11.431708 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-d969966f-8k9ww" Oct 08 13:11:11 crc kubenswrapper[4735]: I1008 13:11:11.911096 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-8k9ww"] Oct 08 13:11:11 crc kubenswrapper[4735]: W1008 13:11:11.924219 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod448cb414_b6c4_41dd_b4c8_231e28be6fcb.slice/crio-783768a063b463e5b0c478fba276cf297a91be0951ec84ce9b77f114c52e9ae9 WatchSource:0}: Error finding container 783768a063b463e5b0c478fba276cf297a91be0951ec84ce9b77f114c52e9ae9: Status 404 returned error can't find the container with id 783768a063b463e5b0c478fba276cf297a91be0951ec84ce9b77f114c52e9ae9 Oct 08 13:11:12 crc kubenswrapper[4735]: I1008 13:11:12.266172 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-d969966f-8k9ww" event={"ID":"448cb414-b6c4-41dd-b4c8-231e28be6fcb","Type":"ContainerStarted","Data":"783768a063b463e5b0c478fba276cf297a91be0951ec84ce9b77f114c52e9ae9"} Oct 08 13:11:12 crc kubenswrapper[4735]: I1008 13:11:12.765835 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-4wpgw"] Oct 08 13:11:12 crc kubenswrapper[4735]: I1008 13:11:12.766825 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-4wpgw" Oct 08 13:11:12 crc kubenswrapper[4735]: I1008 13:11:12.768972 4735 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-h5s6x" Oct 08 13:11:12 crc kubenswrapper[4735]: I1008 13:11:12.774309 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-4wpgw"] Oct 08 13:11:12 crc kubenswrapper[4735]: I1008 13:11:12.936470 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/367d355c-1c4f-465c-b88b-7de1cc0afd3e-bound-sa-token\") pod \"cert-manager-cainjector-7d9f95dbf-4wpgw\" (UID: \"367d355c-1c4f-465c-b88b-7de1cc0afd3e\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-4wpgw" Oct 08 13:11:12 crc kubenswrapper[4735]: I1008 13:11:12.936552 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjmmz\" (UniqueName: \"kubernetes.io/projected/367d355c-1c4f-465c-b88b-7de1cc0afd3e-kube-api-access-vjmmz\") pod \"cert-manager-cainjector-7d9f95dbf-4wpgw\" (UID: \"367d355c-1c4f-465c-b88b-7de1cc0afd3e\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-4wpgw" Oct 08 13:11:13 crc kubenswrapper[4735]: I1008 13:11:13.038359 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/367d355c-1c4f-465c-b88b-7de1cc0afd3e-bound-sa-token\") pod \"cert-manager-cainjector-7d9f95dbf-4wpgw\" (UID: \"367d355c-1c4f-465c-b88b-7de1cc0afd3e\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-4wpgw" Oct 08 13:11:13 crc kubenswrapper[4735]: I1008 13:11:13.038439 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjmmz\" (UniqueName: 
\"kubernetes.io/projected/367d355c-1c4f-465c-b88b-7de1cc0afd3e-kube-api-access-vjmmz\") pod \"cert-manager-cainjector-7d9f95dbf-4wpgw\" (UID: \"367d355c-1c4f-465c-b88b-7de1cc0afd3e\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-4wpgw" Oct 08 13:11:13 crc kubenswrapper[4735]: I1008 13:11:13.059103 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/367d355c-1c4f-465c-b88b-7de1cc0afd3e-bound-sa-token\") pod \"cert-manager-cainjector-7d9f95dbf-4wpgw\" (UID: \"367d355c-1c4f-465c-b88b-7de1cc0afd3e\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-4wpgw" Oct 08 13:11:13 crc kubenswrapper[4735]: I1008 13:11:13.071603 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjmmz\" (UniqueName: \"kubernetes.io/projected/367d355c-1c4f-465c-b88b-7de1cc0afd3e-kube-api-access-vjmmz\") pod \"cert-manager-cainjector-7d9f95dbf-4wpgw\" (UID: \"367d355c-1c4f-465c-b88b-7de1cc0afd3e\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-4wpgw" Oct 08 13:11:13 crc kubenswrapper[4735]: I1008 13:11:13.082826 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-4wpgw" Oct 08 13:11:13 crc kubenswrapper[4735]: I1008 13:11:13.512895 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-4wpgw"] Oct 08 13:11:13 crc kubenswrapper[4735]: W1008 13:11:13.524295 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod367d355c_1c4f_465c_b88b_7de1cc0afd3e.slice/crio-c9c3463b5cb47d4636b2dfaba56c60b3a55b2557bbab3d21aad8b80100328981 WatchSource:0}: Error finding container c9c3463b5cb47d4636b2dfaba56c60b3a55b2557bbab3d21aad8b80100328981: Status 404 returned error can't find the container with id c9c3463b5cb47d4636b2dfaba56c60b3a55b2557bbab3d21aad8b80100328981 Oct 08 13:11:14 crc kubenswrapper[4735]: I1008 13:11:14.285118 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-4wpgw" event={"ID":"367d355c-1c4f-465c-b88b-7de1cc0afd3e","Type":"ContainerStarted","Data":"c9c3463b5cb47d4636b2dfaba56c60b3a55b2557bbab3d21aad8b80100328981"} Oct 08 13:11:18 crc kubenswrapper[4735]: I1008 13:11:18.311957 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-4wpgw" event={"ID":"367d355c-1c4f-465c-b88b-7de1cc0afd3e","Type":"ContainerStarted","Data":"c68a4ccc7297af3f6ad5bddd69a8da0c8e3222447f31d2948c637a23f4f2bffc"} Oct 08 13:11:18 crc kubenswrapper[4735]: I1008 13:11:18.314179 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-d969966f-8k9ww" event={"ID":"448cb414-b6c4-41dd-b4c8-231e28be6fcb","Type":"ContainerStarted","Data":"7942a3422d0707ce95d02e6a9f35739ab10a18d02e698b5acb5c79826ae3736a"} Oct 08 13:11:18 crc kubenswrapper[4735]: I1008 13:11:18.314324 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-d969966f-8k9ww" Oct 08 13:11:18 crc 
kubenswrapper[4735]: I1008 13:11:18.329900 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-4wpgw" podStartSLOduration=2.398599773 podStartE2EDuration="6.329883677s" podCreationTimestamp="2025-10-08 13:11:12 +0000 UTC" firstStartedPulling="2025-10-08 13:11:13.527604021 +0000 UTC m=+923.335908575" lastFinishedPulling="2025-10-08 13:11:17.458887935 +0000 UTC m=+927.267192479" observedRunningTime="2025-10-08 13:11:18.3263426 +0000 UTC m=+928.134647144" watchObservedRunningTime="2025-10-08 13:11:18.329883677 +0000 UTC m=+928.138188221" Oct 08 13:11:18 crc kubenswrapper[4735]: I1008 13:11:18.347339 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-d969966f-8k9ww" podStartSLOduration=1.8245131479999999 podStartE2EDuration="7.347320988s" podCreationTimestamp="2025-10-08 13:11:11 +0000 UTC" firstStartedPulling="2025-10-08 13:11:11.925936254 +0000 UTC m=+921.734240808" lastFinishedPulling="2025-10-08 13:11:17.448744104 +0000 UTC m=+927.257048648" observedRunningTime="2025-10-08 13:11:18.344239572 +0000 UTC m=+928.152544136" watchObservedRunningTime="2025-10-08 13:11:18.347320988 +0000 UTC m=+928.155625532" Oct 08 13:11:26 crc kubenswrapper[4735]: I1008 13:11:26.435649 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-d969966f-8k9ww" Oct 08 13:11:29 crc kubenswrapper[4735]: I1008 13:11:29.870619 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-7d4cc89fcb-fln6n"] Oct 08 13:11:29 crc kubenswrapper[4735]: I1008 13:11:29.871746 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-7d4cc89fcb-fln6n" Oct 08 13:11:29 crc kubenswrapper[4735]: I1008 13:11:29.874607 4735 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-4htpr" Oct 08 13:11:29 crc kubenswrapper[4735]: I1008 13:11:29.876089 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-7d4cc89fcb-fln6n"] Oct 08 13:11:29 crc kubenswrapper[4735]: I1008 13:11:29.953182 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a70a475f-648d-4a84-baa5-b2c2f26c0567-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-fln6n\" (UID: \"a70a475f-648d-4a84-baa5-b2c2f26c0567\") " pod="cert-manager/cert-manager-7d4cc89fcb-fln6n" Oct 08 13:11:29 crc kubenswrapper[4735]: I1008 13:11:29.953259 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4zmk\" (UniqueName: \"kubernetes.io/projected/a70a475f-648d-4a84-baa5-b2c2f26c0567-kube-api-access-f4zmk\") pod \"cert-manager-7d4cc89fcb-fln6n\" (UID: \"a70a475f-648d-4a84-baa5-b2c2f26c0567\") " pod="cert-manager/cert-manager-7d4cc89fcb-fln6n" Oct 08 13:11:30 crc kubenswrapper[4735]: I1008 13:11:30.054369 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a70a475f-648d-4a84-baa5-b2c2f26c0567-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-fln6n\" (UID: \"a70a475f-648d-4a84-baa5-b2c2f26c0567\") " pod="cert-manager/cert-manager-7d4cc89fcb-fln6n" Oct 08 13:11:30 crc kubenswrapper[4735]: I1008 13:11:30.054449 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4zmk\" (UniqueName: \"kubernetes.io/projected/a70a475f-648d-4a84-baa5-b2c2f26c0567-kube-api-access-f4zmk\") pod \"cert-manager-7d4cc89fcb-fln6n\" (UID: 
\"a70a475f-648d-4a84-baa5-b2c2f26c0567\") " pod="cert-manager/cert-manager-7d4cc89fcb-fln6n" Oct 08 13:11:30 crc kubenswrapper[4735]: I1008 13:11:30.075822 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a70a475f-648d-4a84-baa5-b2c2f26c0567-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-fln6n\" (UID: \"a70a475f-648d-4a84-baa5-b2c2f26c0567\") " pod="cert-manager/cert-manager-7d4cc89fcb-fln6n" Oct 08 13:11:30 crc kubenswrapper[4735]: I1008 13:11:30.084301 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4zmk\" (UniqueName: \"kubernetes.io/projected/a70a475f-648d-4a84-baa5-b2c2f26c0567-kube-api-access-f4zmk\") pod \"cert-manager-7d4cc89fcb-fln6n\" (UID: \"a70a475f-648d-4a84-baa5-b2c2f26c0567\") " pod="cert-manager/cert-manager-7d4cc89fcb-fln6n" Oct 08 13:11:30 crc kubenswrapper[4735]: I1008 13:11:30.188616 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-7d4cc89fcb-fln6n" Oct 08 13:11:30 crc kubenswrapper[4735]: I1008 13:11:30.589307 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-7d4cc89fcb-fln6n"] Oct 08 13:11:30 crc kubenswrapper[4735]: W1008 13:11:30.595603 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda70a475f_648d_4a84_baa5_b2c2f26c0567.slice/crio-a2c6c1d4b4e6b1296614acfcb18ec5cb0d1389e4bcab6c00cc31feb33ff1d083 WatchSource:0}: Error finding container a2c6c1d4b4e6b1296614acfcb18ec5cb0d1389e4bcab6c00cc31feb33ff1d083: Status 404 returned error can't find the container with id a2c6c1d4b4e6b1296614acfcb18ec5cb0d1389e4bcab6c00cc31feb33ff1d083 Oct 08 13:11:31 crc kubenswrapper[4735]: I1008 13:11:31.409167 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-7d4cc89fcb-fln6n" 
event={"ID":"a70a475f-648d-4a84-baa5-b2c2f26c0567","Type":"ContainerStarted","Data":"c2c633e4b7b04cac525e280d483fdff1ae9a455d44600163969186e18a4819fb"} Oct 08 13:11:31 crc kubenswrapper[4735]: I1008 13:11:31.409708 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-7d4cc89fcb-fln6n" event={"ID":"a70a475f-648d-4a84-baa5-b2c2f26c0567","Type":"ContainerStarted","Data":"a2c6c1d4b4e6b1296614acfcb18ec5cb0d1389e4bcab6c00cc31feb33ff1d083"} Oct 08 13:11:31 crc kubenswrapper[4735]: I1008 13:11:31.425035 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-7d4cc89fcb-fln6n" podStartSLOduration=2.425018208 podStartE2EDuration="2.425018208s" podCreationTimestamp="2025-10-08 13:11:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 13:11:31.422508696 +0000 UTC m=+941.230813240" watchObservedRunningTime="2025-10-08 13:11:31.425018208 +0000 UTC m=+941.233322762" Oct 08 13:11:39 crc kubenswrapper[4735]: I1008 13:11:39.868247 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-sxwkn"] Oct 08 13:11:39 crc kubenswrapper[4735]: I1008 13:11:39.869723 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-sxwkn" Oct 08 13:11:39 crc kubenswrapper[4735]: I1008 13:11:39.881607 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-9zzhw" Oct 08 13:11:39 crc kubenswrapper[4735]: I1008 13:11:39.881628 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Oct 08 13:11:39 crc kubenswrapper[4735]: I1008 13:11:39.881615 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Oct 08 13:11:39 crc kubenswrapper[4735]: I1008 13:11:39.902189 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-sxwkn"] Oct 08 13:11:39 crc kubenswrapper[4735]: I1008 13:11:39.990113 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6vb5\" (UniqueName: \"kubernetes.io/projected/20ee7dbe-9649-44d9-bb2d-7f15121910c6-kube-api-access-g6vb5\") pod \"openstack-operator-index-sxwkn\" (UID: \"20ee7dbe-9649-44d9-bb2d-7f15121910c6\") " pod="openstack-operators/openstack-operator-index-sxwkn" Oct 08 13:11:40 crc kubenswrapper[4735]: I1008 13:11:40.091169 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6vb5\" (UniqueName: \"kubernetes.io/projected/20ee7dbe-9649-44d9-bb2d-7f15121910c6-kube-api-access-g6vb5\") pod \"openstack-operator-index-sxwkn\" (UID: \"20ee7dbe-9649-44d9-bb2d-7f15121910c6\") " pod="openstack-operators/openstack-operator-index-sxwkn" Oct 08 13:11:40 crc kubenswrapper[4735]: I1008 13:11:40.108889 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6vb5\" (UniqueName: \"kubernetes.io/projected/20ee7dbe-9649-44d9-bb2d-7f15121910c6-kube-api-access-g6vb5\") pod \"openstack-operator-index-sxwkn\" (UID: 
\"20ee7dbe-9649-44d9-bb2d-7f15121910c6\") " pod="openstack-operators/openstack-operator-index-sxwkn" Oct 08 13:11:40 crc kubenswrapper[4735]: I1008 13:11:40.197432 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-sxwkn" Oct 08 13:11:40 crc kubenswrapper[4735]: I1008 13:11:40.619169 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-sxwkn"] Oct 08 13:11:40 crc kubenswrapper[4735]: W1008 13:11:40.623821 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20ee7dbe_9649_44d9_bb2d_7f15121910c6.slice/crio-3c9f7e4c63ca3b9be019f9a06f8c1ec8d808a6a7652e30ae185422d725a3e633 WatchSource:0}: Error finding container 3c9f7e4c63ca3b9be019f9a06f8c1ec8d808a6a7652e30ae185422d725a3e633: Status 404 returned error can't find the container with id 3c9f7e4c63ca3b9be019f9a06f8c1ec8d808a6a7652e30ae185422d725a3e633 Oct 08 13:11:41 crc kubenswrapper[4735]: I1008 13:11:41.475935 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-sxwkn" event={"ID":"20ee7dbe-9649-44d9-bb2d-7f15121910c6","Type":"ContainerStarted","Data":"3c9f7e4c63ca3b9be019f9a06f8c1ec8d808a6a7652e30ae185422d725a3e633"} Oct 08 13:11:43 crc kubenswrapper[4735]: I1008 13:11:43.237977 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-sxwkn"] Oct 08 13:11:43 crc kubenswrapper[4735]: I1008 13:11:43.495758 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-sxwkn" event={"ID":"20ee7dbe-9649-44d9-bb2d-7f15121910c6","Type":"ContainerStarted","Data":"664ab8cee4c7251ee38d1ae179055bf287c826583e6829a789f7747ca4b6e60b"} Oct 08 13:11:43 crc kubenswrapper[4735]: I1008 13:11:43.511337 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/openstack-operator-index-sxwkn" podStartSLOduration=2.817952876 podStartE2EDuration="4.511319738s" podCreationTimestamp="2025-10-08 13:11:39 +0000 UTC" firstStartedPulling="2025-10-08 13:11:40.625731616 +0000 UTC m=+950.434036160" lastFinishedPulling="2025-10-08 13:11:42.319098468 +0000 UTC m=+952.127403022" observedRunningTime="2025-10-08 13:11:43.51058458 +0000 UTC m=+953.318889174" watchObservedRunningTime="2025-10-08 13:11:43.511319738 +0000 UTC m=+953.319624282" Oct 08 13:11:43 crc kubenswrapper[4735]: I1008 13:11:43.859022 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-4pqs9"] Oct 08 13:11:43 crc kubenswrapper[4735]: I1008 13:11:43.860963 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-4pqs9" Oct 08 13:11:43 crc kubenswrapper[4735]: I1008 13:11:43.863240 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-4pqs9"] Oct 08 13:11:43 crc kubenswrapper[4735]: I1008 13:11:43.956179 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzcng\" (UniqueName: \"kubernetes.io/projected/d18440b1-cd19-4186-ae4c-40d0ed7adf79-kube-api-access-fzcng\") pod \"openstack-operator-index-4pqs9\" (UID: \"d18440b1-cd19-4186-ae4c-40d0ed7adf79\") " pod="openstack-operators/openstack-operator-index-4pqs9" Oct 08 13:11:44 crc kubenswrapper[4735]: I1008 13:11:44.057986 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzcng\" (UniqueName: \"kubernetes.io/projected/d18440b1-cd19-4186-ae4c-40d0ed7adf79-kube-api-access-fzcng\") pod \"openstack-operator-index-4pqs9\" (UID: \"d18440b1-cd19-4186-ae4c-40d0ed7adf79\") " pod="openstack-operators/openstack-operator-index-4pqs9" Oct 08 13:11:44 crc kubenswrapper[4735]: I1008 13:11:44.090506 4735 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-fzcng\" (UniqueName: \"kubernetes.io/projected/d18440b1-cd19-4186-ae4c-40d0ed7adf79-kube-api-access-fzcng\") pod \"openstack-operator-index-4pqs9\" (UID: \"d18440b1-cd19-4186-ae4c-40d0ed7adf79\") " pod="openstack-operators/openstack-operator-index-4pqs9" Oct 08 13:11:44 crc kubenswrapper[4735]: I1008 13:11:44.188923 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-4pqs9" Oct 08 13:11:44 crc kubenswrapper[4735]: I1008 13:11:44.502619 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-sxwkn" podUID="20ee7dbe-9649-44d9-bb2d-7f15121910c6" containerName="registry-server" containerID="cri-o://664ab8cee4c7251ee38d1ae179055bf287c826583e6829a789f7747ca4b6e60b" gracePeriod=2 Oct 08 13:11:44 crc kubenswrapper[4735]: I1008 13:11:44.608559 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-4pqs9"] Oct 08 13:11:44 crc kubenswrapper[4735]: W1008 13:11:44.617010 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd18440b1_cd19_4186_ae4c_40d0ed7adf79.slice/crio-1e6077a41fa3dadd59f1720d55f8fcf08e3614fa35d799d49fb0772befa49756 WatchSource:0}: Error finding container 1e6077a41fa3dadd59f1720d55f8fcf08e3614fa35d799d49fb0772befa49756: Status 404 returned error can't find the container with id 1e6077a41fa3dadd59f1720d55f8fcf08e3614fa35d799d49fb0772befa49756 Oct 08 13:11:44 crc kubenswrapper[4735]: I1008 13:11:44.796909 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-sxwkn" Oct 08 13:11:44 crc kubenswrapper[4735]: I1008 13:11:44.971109 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6vb5\" (UniqueName: \"kubernetes.io/projected/20ee7dbe-9649-44d9-bb2d-7f15121910c6-kube-api-access-g6vb5\") pod \"20ee7dbe-9649-44d9-bb2d-7f15121910c6\" (UID: \"20ee7dbe-9649-44d9-bb2d-7f15121910c6\") " Oct 08 13:11:44 crc kubenswrapper[4735]: I1008 13:11:44.978008 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20ee7dbe-9649-44d9-bb2d-7f15121910c6-kube-api-access-g6vb5" (OuterVolumeSpecName: "kube-api-access-g6vb5") pod "20ee7dbe-9649-44d9-bb2d-7f15121910c6" (UID: "20ee7dbe-9649-44d9-bb2d-7f15121910c6"). InnerVolumeSpecName "kube-api-access-g6vb5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:11:45 crc kubenswrapper[4735]: I1008 13:11:45.072829 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6vb5\" (UniqueName: \"kubernetes.io/projected/20ee7dbe-9649-44d9-bb2d-7f15121910c6-kube-api-access-g6vb5\") on node \"crc\" DevicePath \"\"" Oct 08 13:11:45 crc kubenswrapper[4735]: I1008 13:11:45.510192 4735 generic.go:334] "Generic (PLEG): container finished" podID="20ee7dbe-9649-44d9-bb2d-7f15121910c6" containerID="664ab8cee4c7251ee38d1ae179055bf287c826583e6829a789f7747ca4b6e60b" exitCode=0 Oct 08 13:11:45 crc kubenswrapper[4735]: I1008 13:11:45.510240 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-sxwkn" event={"ID":"20ee7dbe-9649-44d9-bb2d-7f15121910c6","Type":"ContainerDied","Data":"664ab8cee4c7251ee38d1ae179055bf287c826583e6829a789f7747ca4b6e60b"} Oct 08 13:11:45 crc kubenswrapper[4735]: I1008 13:11:45.510297 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-sxwkn" 
event={"ID":"20ee7dbe-9649-44d9-bb2d-7f15121910c6","Type":"ContainerDied","Data":"3c9f7e4c63ca3b9be019f9a06f8c1ec8d808a6a7652e30ae185422d725a3e633"} Oct 08 13:11:45 crc kubenswrapper[4735]: I1008 13:11:45.510320 4735 scope.go:117] "RemoveContainer" containerID="664ab8cee4c7251ee38d1ae179055bf287c826583e6829a789f7747ca4b6e60b" Oct 08 13:11:45 crc kubenswrapper[4735]: I1008 13:11:45.510259 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-sxwkn" Oct 08 13:11:45 crc kubenswrapper[4735]: I1008 13:11:45.513409 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-4pqs9" event={"ID":"d18440b1-cd19-4186-ae4c-40d0ed7adf79","Type":"ContainerStarted","Data":"94b08a7f3b9049f6926855e84bc92a997df831996f4b787d7b39cece08355d91"} Oct 08 13:11:45 crc kubenswrapper[4735]: I1008 13:11:45.513836 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-4pqs9" event={"ID":"d18440b1-cd19-4186-ae4c-40d0ed7adf79","Type":"ContainerStarted","Data":"1e6077a41fa3dadd59f1720d55f8fcf08e3614fa35d799d49fb0772befa49756"} Oct 08 13:11:45 crc kubenswrapper[4735]: I1008 13:11:45.531707 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-4pqs9" podStartSLOduration=2.093511913 podStartE2EDuration="2.531689421s" podCreationTimestamp="2025-10-08 13:11:43 +0000 UTC" firstStartedPulling="2025-10-08 13:11:44.622860184 +0000 UTC m=+954.431164738" lastFinishedPulling="2025-10-08 13:11:45.061037662 +0000 UTC m=+954.869342246" observedRunningTime="2025-10-08 13:11:45.53121808 +0000 UTC m=+955.339522704" watchObservedRunningTime="2025-10-08 13:11:45.531689421 +0000 UTC m=+955.339993975" Oct 08 13:11:45 crc kubenswrapper[4735]: I1008 13:11:45.544129 4735 scope.go:117] "RemoveContainer" containerID="664ab8cee4c7251ee38d1ae179055bf287c826583e6829a789f7747ca4b6e60b" Oct 
08 13:11:45 crc kubenswrapper[4735]: E1008 13:11:45.544777 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"664ab8cee4c7251ee38d1ae179055bf287c826583e6829a789f7747ca4b6e60b\": container with ID starting with 664ab8cee4c7251ee38d1ae179055bf287c826583e6829a789f7747ca4b6e60b not found: ID does not exist" containerID="664ab8cee4c7251ee38d1ae179055bf287c826583e6829a789f7747ca4b6e60b" Oct 08 13:11:45 crc kubenswrapper[4735]: I1008 13:11:45.544835 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"664ab8cee4c7251ee38d1ae179055bf287c826583e6829a789f7747ca4b6e60b"} err="failed to get container status \"664ab8cee4c7251ee38d1ae179055bf287c826583e6829a789f7747ca4b6e60b\": rpc error: code = NotFound desc = could not find container \"664ab8cee4c7251ee38d1ae179055bf287c826583e6829a789f7747ca4b6e60b\": container with ID starting with 664ab8cee4c7251ee38d1ae179055bf287c826583e6829a789f7747ca4b6e60b not found: ID does not exist" Oct 08 13:11:45 crc kubenswrapper[4735]: I1008 13:11:45.552363 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-sxwkn"] Oct 08 13:11:45 crc kubenswrapper[4735]: I1008 13:11:45.560107 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-sxwkn"] Oct 08 13:11:46 crc kubenswrapper[4735]: I1008 13:11:46.905703 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20ee7dbe-9649-44d9-bb2d-7f15121910c6" path="/var/lib/kubelet/pods/20ee7dbe-9649-44d9-bb2d-7f15121910c6/volumes" Oct 08 13:11:54 crc kubenswrapper[4735]: I1008 13:11:54.189820 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-4pqs9" Oct 08 13:11:54 crc kubenswrapper[4735]: I1008 13:11:54.190625 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack-operators/openstack-operator-index-4pqs9" Oct 08 13:11:54 crc kubenswrapper[4735]: I1008 13:11:54.229297 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-4pqs9" Oct 08 13:11:54 crc kubenswrapper[4735]: I1008 13:11:54.640309 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-4pqs9" Oct 08 13:12:01 crc kubenswrapper[4735]: I1008 13:12:01.754390 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/eee899de00471dca260c21f15d574ae705f89c0888bdad088fc990c0ecvjpjg"] Oct 08 13:12:01 crc kubenswrapper[4735]: E1008 13:12:01.754941 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20ee7dbe-9649-44d9-bb2d-7f15121910c6" containerName="registry-server" Oct 08 13:12:01 crc kubenswrapper[4735]: I1008 13:12:01.754956 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="20ee7dbe-9649-44d9-bb2d-7f15121910c6" containerName="registry-server" Oct 08 13:12:01 crc kubenswrapper[4735]: I1008 13:12:01.755071 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="20ee7dbe-9649-44d9-bb2d-7f15121910c6" containerName="registry-server" Oct 08 13:12:01 crc kubenswrapper[4735]: I1008 13:12:01.755827 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/eee899de00471dca260c21f15d574ae705f89c0888bdad088fc990c0ecvjpjg" Oct 08 13:12:01 crc kubenswrapper[4735]: I1008 13:12:01.759171 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-xd7nj" Oct 08 13:12:01 crc kubenswrapper[4735]: I1008 13:12:01.768183 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/eee899de00471dca260c21f15d574ae705f89c0888bdad088fc990c0ecvjpjg"] Oct 08 13:12:01 crc kubenswrapper[4735]: I1008 13:12:01.894831 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5ed0dc2a-17a6-41a5-9b0d-3694ac1240d9-bundle\") pod \"eee899de00471dca260c21f15d574ae705f89c0888bdad088fc990c0ecvjpjg\" (UID: \"5ed0dc2a-17a6-41a5-9b0d-3694ac1240d9\") " pod="openstack-operators/eee899de00471dca260c21f15d574ae705f89c0888bdad088fc990c0ecvjpjg" Oct 08 13:12:01 crc kubenswrapper[4735]: I1008 13:12:01.894894 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5ed0dc2a-17a6-41a5-9b0d-3694ac1240d9-util\") pod \"eee899de00471dca260c21f15d574ae705f89c0888bdad088fc990c0ecvjpjg\" (UID: \"5ed0dc2a-17a6-41a5-9b0d-3694ac1240d9\") " pod="openstack-operators/eee899de00471dca260c21f15d574ae705f89c0888bdad088fc990c0ecvjpjg" Oct 08 13:12:01 crc kubenswrapper[4735]: I1008 13:12:01.895785 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znhls\" (UniqueName: \"kubernetes.io/projected/5ed0dc2a-17a6-41a5-9b0d-3694ac1240d9-kube-api-access-znhls\") pod \"eee899de00471dca260c21f15d574ae705f89c0888bdad088fc990c0ecvjpjg\" (UID: \"5ed0dc2a-17a6-41a5-9b0d-3694ac1240d9\") " pod="openstack-operators/eee899de00471dca260c21f15d574ae705f89c0888bdad088fc990c0ecvjpjg" Oct 08 13:12:01 crc kubenswrapper[4735]: I1008 
13:12:01.997490 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5ed0dc2a-17a6-41a5-9b0d-3694ac1240d9-util\") pod \"eee899de00471dca260c21f15d574ae705f89c0888bdad088fc990c0ecvjpjg\" (UID: \"5ed0dc2a-17a6-41a5-9b0d-3694ac1240d9\") " pod="openstack-operators/eee899de00471dca260c21f15d574ae705f89c0888bdad088fc990c0ecvjpjg" Oct 08 13:12:01 crc kubenswrapper[4735]: I1008 13:12:01.997605 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znhls\" (UniqueName: \"kubernetes.io/projected/5ed0dc2a-17a6-41a5-9b0d-3694ac1240d9-kube-api-access-znhls\") pod \"eee899de00471dca260c21f15d574ae705f89c0888bdad088fc990c0ecvjpjg\" (UID: \"5ed0dc2a-17a6-41a5-9b0d-3694ac1240d9\") " pod="openstack-operators/eee899de00471dca260c21f15d574ae705f89c0888bdad088fc990c0ecvjpjg" Oct 08 13:12:01 crc kubenswrapper[4735]: I1008 13:12:01.997713 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5ed0dc2a-17a6-41a5-9b0d-3694ac1240d9-bundle\") pod \"eee899de00471dca260c21f15d574ae705f89c0888bdad088fc990c0ecvjpjg\" (UID: \"5ed0dc2a-17a6-41a5-9b0d-3694ac1240d9\") " pod="openstack-operators/eee899de00471dca260c21f15d574ae705f89c0888bdad088fc990c0ecvjpjg" Oct 08 13:12:01 crc kubenswrapper[4735]: I1008 13:12:01.998307 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5ed0dc2a-17a6-41a5-9b0d-3694ac1240d9-util\") pod \"eee899de00471dca260c21f15d574ae705f89c0888bdad088fc990c0ecvjpjg\" (UID: \"5ed0dc2a-17a6-41a5-9b0d-3694ac1240d9\") " pod="openstack-operators/eee899de00471dca260c21f15d574ae705f89c0888bdad088fc990c0ecvjpjg" Oct 08 13:12:01 crc kubenswrapper[4735]: I1008 13:12:01.998615 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/5ed0dc2a-17a6-41a5-9b0d-3694ac1240d9-bundle\") pod \"eee899de00471dca260c21f15d574ae705f89c0888bdad088fc990c0ecvjpjg\" (UID: \"5ed0dc2a-17a6-41a5-9b0d-3694ac1240d9\") " pod="openstack-operators/eee899de00471dca260c21f15d574ae705f89c0888bdad088fc990c0ecvjpjg" Oct 08 13:12:02 crc kubenswrapper[4735]: I1008 13:12:02.016383 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znhls\" (UniqueName: \"kubernetes.io/projected/5ed0dc2a-17a6-41a5-9b0d-3694ac1240d9-kube-api-access-znhls\") pod \"eee899de00471dca260c21f15d574ae705f89c0888bdad088fc990c0ecvjpjg\" (UID: \"5ed0dc2a-17a6-41a5-9b0d-3694ac1240d9\") " pod="openstack-operators/eee899de00471dca260c21f15d574ae705f89c0888bdad088fc990c0ecvjpjg" Oct 08 13:12:02 crc kubenswrapper[4735]: I1008 13:12:02.086415 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/eee899de00471dca260c21f15d574ae705f89c0888bdad088fc990c0ecvjpjg" Oct 08 13:12:02 crc kubenswrapper[4735]: I1008 13:12:02.490236 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/eee899de00471dca260c21f15d574ae705f89c0888bdad088fc990c0ecvjpjg"] Oct 08 13:12:02 crc kubenswrapper[4735]: I1008 13:12:02.661894 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/eee899de00471dca260c21f15d574ae705f89c0888bdad088fc990c0ecvjpjg" event={"ID":"5ed0dc2a-17a6-41a5-9b0d-3694ac1240d9","Type":"ContainerStarted","Data":"a726abff10a6b68c698b6015310c3bc13b1579b8bc108d68882c9325167ef823"} Oct 08 13:12:02 crc kubenswrapper[4735]: I1008 13:12:02.661951 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/eee899de00471dca260c21f15d574ae705f89c0888bdad088fc990c0ecvjpjg" event={"ID":"5ed0dc2a-17a6-41a5-9b0d-3694ac1240d9","Type":"ContainerStarted","Data":"a03585f5e4077c8d5880bbb25a49d9b2155ce366088851b288b0fd0b4d152af6"} Oct 08 13:12:03 crc kubenswrapper[4735]: I1008 13:12:03.668661 4735 
generic.go:334] "Generic (PLEG): container finished" podID="5ed0dc2a-17a6-41a5-9b0d-3694ac1240d9" containerID="a726abff10a6b68c698b6015310c3bc13b1579b8bc108d68882c9325167ef823" exitCode=0 Oct 08 13:12:03 crc kubenswrapper[4735]: I1008 13:12:03.668732 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/eee899de00471dca260c21f15d574ae705f89c0888bdad088fc990c0ecvjpjg" event={"ID":"5ed0dc2a-17a6-41a5-9b0d-3694ac1240d9","Type":"ContainerDied","Data":"a726abff10a6b68c698b6015310c3bc13b1579b8bc108d68882c9325167ef823"} Oct 08 13:12:05 crc kubenswrapper[4735]: I1008 13:12:05.680023 4735 generic.go:334] "Generic (PLEG): container finished" podID="5ed0dc2a-17a6-41a5-9b0d-3694ac1240d9" containerID="cdb32c83408ebcb7dfe10c23373b67a640eaf2a070cf1ae6324316a016b28d1a" exitCode=0 Oct 08 13:12:05 crc kubenswrapper[4735]: I1008 13:12:05.680064 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/eee899de00471dca260c21f15d574ae705f89c0888bdad088fc990c0ecvjpjg" event={"ID":"5ed0dc2a-17a6-41a5-9b0d-3694ac1240d9","Type":"ContainerDied","Data":"cdb32c83408ebcb7dfe10c23373b67a640eaf2a070cf1ae6324316a016b28d1a"} Oct 08 13:12:06 crc kubenswrapper[4735]: I1008 13:12:06.693722 4735 generic.go:334] "Generic (PLEG): container finished" podID="5ed0dc2a-17a6-41a5-9b0d-3694ac1240d9" containerID="ca2ced221987ba54015f71339cf37c8529e5518bf55e18def09b8e587bde76c6" exitCode=0 Oct 08 13:12:06 crc kubenswrapper[4735]: I1008 13:12:06.693767 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/eee899de00471dca260c21f15d574ae705f89c0888bdad088fc990c0ecvjpjg" event={"ID":"5ed0dc2a-17a6-41a5-9b0d-3694ac1240d9","Type":"ContainerDied","Data":"ca2ced221987ba54015f71339cf37c8529e5518bf55e18def09b8e587bde76c6"} Oct 08 13:12:07 crc kubenswrapper[4735]: I1008 13:12:07.933105 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/eee899de00471dca260c21f15d574ae705f89c0888bdad088fc990c0ecvjpjg" Oct 08 13:12:07 crc kubenswrapper[4735]: I1008 13:12:07.971888 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5ed0dc2a-17a6-41a5-9b0d-3694ac1240d9-bundle\") pod \"5ed0dc2a-17a6-41a5-9b0d-3694ac1240d9\" (UID: \"5ed0dc2a-17a6-41a5-9b0d-3694ac1240d9\") " Oct 08 13:12:07 crc kubenswrapper[4735]: I1008 13:12:07.971996 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5ed0dc2a-17a6-41a5-9b0d-3694ac1240d9-util\") pod \"5ed0dc2a-17a6-41a5-9b0d-3694ac1240d9\" (UID: \"5ed0dc2a-17a6-41a5-9b0d-3694ac1240d9\") " Oct 08 13:12:07 crc kubenswrapper[4735]: I1008 13:12:07.972108 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znhls\" (UniqueName: \"kubernetes.io/projected/5ed0dc2a-17a6-41a5-9b0d-3694ac1240d9-kube-api-access-znhls\") pod \"5ed0dc2a-17a6-41a5-9b0d-3694ac1240d9\" (UID: \"5ed0dc2a-17a6-41a5-9b0d-3694ac1240d9\") " Oct 08 13:12:07 crc kubenswrapper[4735]: I1008 13:12:07.973644 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ed0dc2a-17a6-41a5-9b0d-3694ac1240d9-bundle" (OuterVolumeSpecName: "bundle") pod "5ed0dc2a-17a6-41a5-9b0d-3694ac1240d9" (UID: "5ed0dc2a-17a6-41a5-9b0d-3694ac1240d9"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:12:07 crc kubenswrapper[4735]: I1008 13:12:07.977327 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ed0dc2a-17a6-41a5-9b0d-3694ac1240d9-kube-api-access-znhls" (OuterVolumeSpecName: "kube-api-access-znhls") pod "5ed0dc2a-17a6-41a5-9b0d-3694ac1240d9" (UID: "5ed0dc2a-17a6-41a5-9b0d-3694ac1240d9"). InnerVolumeSpecName "kube-api-access-znhls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:12:08 crc kubenswrapper[4735]: I1008 13:12:08.063718 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ed0dc2a-17a6-41a5-9b0d-3694ac1240d9-util" (OuterVolumeSpecName: "util") pod "5ed0dc2a-17a6-41a5-9b0d-3694ac1240d9" (UID: "5ed0dc2a-17a6-41a5-9b0d-3694ac1240d9"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:12:08 crc kubenswrapper[4735]: I1008 13:12:08.074623 4735 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5ed0dc2a-17a6-41a5-9b0d-3694ac1240d9-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 13:12:08 crc kubenswrapper[4735]: I1008 13:12:08.074778 4735 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5ed0dc2a-17a6-41a5-9b0d-3694ac1240d9-util\") on node \"crc\" DevicePath \"\"" Oct 08 13:12:08 crc kubenswrapper[4735]: I1008 13:12:08.074836 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znhls\" (UniqueName: \"kubernetes.io/projected/5ed0dc2a-17a6-41a5-9b0d-3694ac1240d9-kube-api-access-znhls\") on node \"crc\" DevicePath \"\"" Oct 08 13:12:08 crc kubenswrapper[4735]: I1008 13:12:08.715601 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/eee899de00471dca260c21f15d574ae705f89c0888bdad088fc990c0ecvjpjg" event={"ID":"5ed0dc2a-17a6-41a5-9b0d-3694ac1240d9","Type":"ContainerDied","Data":"a03585f5e4077c8d5880bbb25a49d9b2155ce366088851b288b0fd0b4d152af6"} Oct 08 13:12:08 crc kubenswrapper[4735]: I1008 13:12:08.715969 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a03585f5e4077c8d5880bbb25a49d9b2155ce366088851b288b0fd0b4d152af6" Oct 08 13:12:08 crc kubenswrapper[4735]: I1008 13:12:08.715705 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/eee899de00471dca260c21f15d574ae705f89c0888bdad088fc990c0ecvjpjg" Oct 08 13:12:14 crc kubenswrapper[4735]: I1008 13:12:14.144420 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-55f65988b-4qvmm"] Oct 08 13:12:14 crc kubenswrapper[4735]: E1008 13:12:14.145136 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ed0dc2a-17a6-41a5-9b0d-3694ac1240d9" containerName="pull" Oct 08 13:12:14 crc kubenswrapper[4735]: I1008 13:12:14.145147 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ed0dc2a-17a6-41a5-9b0d-3694ac1240d9" containerName="pull" Oct 08 13:12:14 crc kubenswrapper[4735]: E1008 13:12:14.145160 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ed0dc2a-17a6-41a5-9b0d-3694ac1240d9" containerName="util" Oct 08 13:12:14 crc kubenswrapper[4735]: I1008 13:12:14.145165 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ed0dc2a-17a6-41a5-9b0d-3694ac1240d9" containerName="util" Oct 08 13:12:14 crc kubenswrapper[4735]: E1008 13:12:14.145183 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ed0dc2a-17a6-41a5-9b0d-3694ac1240d9" containerName="extract" Oct 08 13:12:14 crc kubenswrapper[4735]: I1008 13:12:14.145189 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ed0dc2a-17a6-41a5-9b0d-3694ac1240d9" containerName="extract" Oct 08 13:12:14 crc kubenswrapper[4735]: I1008 13:12:14.145289 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ed0dc2a-17a6-41a5-9b0d-3694ac1240d9" containerName="extract" Oct 08 13:12:14 crc kubenswrapper[4735]: I1008 13:12:14.145946 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-55f65988b-4qvmm" Oct 08 13:12:14 crc kubenswrapper[4735]: I1008 13:12:14.150049 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-qnhdc" Oct 08 13:12:14 crc kubenswrapper[4735]: I1008 13:12:14.184111 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-55f65988b-4qvmm"] Oct 08 13:12:14 crc kubenswrapper[4735]: I1008 13:12:14.258981 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tr74b\" (UniqueName: \"kubernetes.io/projected/e2cd19cd-0aba-4e79-b4a5-34da0d1f04ae-kube-api-access-tr74b\") pod \"openstack-operator-controller-operator-55f65988b-4qvmm\" (UID: \"e2cd19cd-0aba-4e79-b4a5-34da0d1f04ae\") " pod="openstack-operators/openstack-operator-controller-operator-55f65988b-4qvmm" Oct 08 13:12:14 crc kubenswrapper[4735]: I1008 13:12:14.360452 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tr74b\" (UniqueName: \"kubernetes.io/projected/e2cd19cd-0aba-4e79-b4a5-34da0d1f04ae-kube-api-access-tr74b\") pod \"openstack-operator-controller-operator-55f65988b-4qvmm\" (UID: \"e2cd19cd-0aba-4e79-b4a5-34da0d1f04ae\") " pod="openstack-operators/openstack-operator-controller-operator-55f65988b-4qvmm" Oct 08 13:12:14 crc kubenswrapper[4735]: I1008 13:12:14.382435 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tr74b\" (UniqueName: \"kubernetes.io/projected/e2cd19cd-0aba-4e79-b4a5-34da0d1f04ae-kube-api-access-tr74b\") pod \"openstack-operator-controller-operator-55f65988b-4qvmm\" (UID: \"e2cd19cd-0aba-4e79-b4a5-34da0d1f04ae\") " pod="openstack-operators/openstack-operator-controller-operator-55f65988b-4qvmm" Oct 08 13:12:14 crc kubenswrapper[4735]: I1008 13:12:14.466243 4735 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-55f65988b-4qvmm" Oct 08 13:12:14 crc kubenswrapper[4735]: I1008 13:12:14.887371 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-55f65988b-4qvmm"] Oct 08 13:12:15 crc kubenswrapper[4735]: I1008 13:12:15.763251 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-55f65988b-4qvmm" event={"ID":"e2cd19cd-0aba-4e79-b4a5-34da0d1f04ae","Type":"ContainerStarted","Data":"67586688cb02318b0af5f1a71c0c4a29b0c49a7fdf78fe2ca8bd8466dfa88e6a"} Oct 08 13:12:19 crc kubenswrapper[4735]: I1008 13:12:19.795337 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-55f65988b-4qvmm" event={"ID":"e2cd19cd-0aba-4e79-b4a5-34da0d1f04ae","Type":"ContainerStarted","Data":"03bdc16d1d359d6e56678d5096b1b8fa5287f17b79cd8184e1cb7d381c0e9007"} Oct 08 13:12:22 crc kubenswrapper[4735]: I1008 13:12:22.818794 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-55f65988b-4qvmm" event={"ID":"e2cd19cd-0aba-4e79-b4a5-34da0d1f04ae","Type":"ContainerStarted","Data":"edc5473e7bca7da8bfbf7131e35ed9919c1c4fb35c9b5773003ac326f56cac54"} Oct 08 13:12:22 crc kubenswrapper[4735]: I1008 13:12:22.820281 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-55f65988b-4qvmm" Oct 08 13:12:23 crc kubenswrapper[4735]: I1008 13:12:23.828691 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-55f65988b-4qvmm" Oct 08 13:12:23 crc kubenswrapper[4735]: I1008 13:12:23.863290 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/openstack-operator-controller-operator-55f65988b-4qvmm" podStartSLOduration=3.060240286 podStartE2EDuration="9.863269247s" podCreationTimestamp="2025-10-08 13:12:14 +0000 UTC" firstStartedPulling="2025-10-08 13:12:14.898985331 +0000 UTC m=+984.707289875" lastFinishedPulling="2025-10-08 13:12:21.702014292 +0000 UTC m=+991.510318836" observedRunningTime="2025-10-08 13:12:22.858696124 +0000 UTC m=+992.667001078" watchObservedRunningTime="2025-10-08 13:12:23.863269247 +0000 UTC m=+993.671573791" Oct 08 13:12:24 crc kubenswrapper[4735]: I1008 13:12:24.878653 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 13:12:24 crc kubenswrapper[4735]: I1008 13:12:24.878709 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 13:12:54 crc kubenswrapper[4735]: I1008 13:12:54.878354 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 13:12:54 crc kubenswrapper[4735]: I1008 13:12:54.879600 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" Oct 08 13:12:58 crc kubenswrapper[4735]: I1008 13:12:58.845899 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-658bdf4b74-wvpgj"] Oct 08 13:12:58 crc kubenswrapper[4735]: I1008 13:12:58.847035 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-wvpgj" Oct 08 13:12:58 crc kubenswrapper[4735]: I1008 13:12:58.849380 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-m54b7" Oct 08 13:12:58 crc kubenswrapper[4735]: I1008 13:12:58.861554 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-658bdf4b74-wvpgj"] Oct 08 13:12:58 crc kubenswrapper[4735]: I1008 13:12:58.867123 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7b7fb68549-9rbsk"] Oct 08 13:12:58 crc kubenswrapper[4735]: I1008 13:12:58.869077 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-9rbsk" Oct 08 13:12:58 crc kubenswrapper[4735]: I1008 13:12:58.870732 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-9ctvm" Oct 08 13:12:58 crc kubenswrapper[4735]: I1008 13:12:58.894145 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdvtc\" (UniqueName: \"kubernetes.io/projected/b5933297-7fc6-41f5-9ac0-55e3ada0e45e-kube-api-access-fdvtc\") pod \"cinder-operator-controller-manager-7b7fb68549-9rbsk\" (UID: \"b5933297-7fc6-41f5-9ac0-55e3ada0e45e\") " pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-9rbsk" Oct 08 13:12:58 crc kubenswrapper[4735]: I1008 13:12:58.894199 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rd9pt\" (UniqueName: \"kubernetes.io/projected/4e11d5ed-b404-4449-88f2-428ee510aa3f-kube-api-access-rd9pt\") pod \"barbican-operator-controller-manager-658bdf4b74-wvpgj\" (UID: \"4e11d5ed-b404-4449-88f2-428ee510aa3f\") " pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-wvpgj" Oct 08 13:12:58 crc kubenswrapper[4735]: I1008 13:12:58.905688 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7b7fb68549-9rbsk"] Oct 08 13:12:58 crc kubenswrapper[4735]: I1008 13:12:58.905727 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-85d5d9dd78-bsvbv"] Oct 08 13:12:58 crc kubenswrapper[4735]: I1008 13:12:58.906532 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-84b9b84486-rzjz4"] Oct 08 13:12:58 crc kubenswrapper[4735]: I1008 13:12:58.907189 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-84b9b84486-rzjz4" Oct 08 13:12:58 crc kubenswrapper[4735]: I1008 13:12:58.907856 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-bsvbv" Oct 08 13:12:58 crc kubenswrapper[4735]: I1008 13:12:58.911964 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-rsv2s" Oct 08 13:12:58 crc kubenswrapper[4735]: I1008 13:12:58.912049 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-cqfpw" Oct 08 13:12:58 crc kubenswrapper[4735]: I1008 13:12:58.915212 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-85d5d9dd78-bsvbv"] Oct 08 13:12:58 crc kubenswrapper[4735]: I1008 13:12:58.940934 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-84b9b84486-rzjz4"] Oct 08 13:12:58 crc kubenswrapper[4735]: I1008 13:12:58.963559 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-7ffbcb7588-vbvt8"] Oct 08 13:12:58 crc kubenswrapper[4735]: I1008 13:12:58.964565 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-vbvt8" Oct 08 13:12:58 crc kubenswrapper[4735]: I1008 13:12:58.969547 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-p8j5t" Oct 08 13:12:58 crc kubenswrapper[4735]: I1008 13:12:58.993335 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-858f76bbdd-98fgg"] Oct 08 13:12:58 crc kubenswrapper[4735]: I1008 13:12:58.994273 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-98fgg" Oct 08 13:12:58 crc kubenswrapper[4735]: I1008 13:12:58.996208 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtn7v\" (UniqueName: \"kubernetes.io/projected/38e7b3e3-9c02-4737-b500-7a9c01d5e92a-kube-api-access-mtn7v\") pod \"horizon-operator-controller-manager-7ffbcb7588-vbvt8\" (UID: \"38e7b3e3-9c02-4737-b500-7a9c01d5e92a\") " pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-vbvt8" Oct 08 13:12:58 crc kubenswrapper[4735]: I1008 13:12:58.996263 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdvtc\" (UniqueName: \"kubernetes.io/projected/b5933297-7fc6-41f5-9ac0-55e3ada0e45e-kube-api-access-fdvtc\") pod \"cinder-operator-controller-manager-7b7fb68549-9rbsk\" (UID: \"b5933297-7fc6-41f5-9ac0-55e3ada0e45e\") " pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-9rbsk" Oct 08 13:12:58 crc kubenswrapper[4735]: I1008 13:12:58.996285 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rd9pt\" (UniqueName: \"kubernetes.io/projected/4e11d5ed-b404-4449-88f2-428ee510aa3f-kube-api-access-rd9pt\") pod \"barbican-operator-controller-manager-658bdf4b74-wvpgj\" (UID: 
\"4e11d5ed-b404-4449-88f2-428ee510aa3f\") " pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-wvpgj" Oct 08 13:12:58 crc kubenswrapper[4735]: I1008 13:12:58.996312 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnfhp\" (UniqueName: \"kubernetes.io/projected/0c811b08-74de-4fd3-a06b-5eb2021544a9-kube-api-access-jnfhp\") pod \"designate-operator-controller-manager-85d5d9dd78-bsvbv\" (UID: \"0c811b08-74de-4fd3-a06b-5eb2021544a9\") " pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-bsvbv" Oct 08 13:12:58 crc kubenswrapper[4735]: I1008 13:12:58.996331 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssvvh\" (UniqueName: \"kubernetes.io/projected/0bdd70f2-2b72-4787-8434-1b3218cf5f68-kube-api-access-ssvvh\") pod \"glance-operator-controller-manager-84b9b84486-rzjz4\" (UID: \"0bdd70f2-2b72-4787-8434-1b3218cf5f68\") " pod="openstack-operators/glance-operator-controller-manager-84b9b84486-rzjz4" Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.008679 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-656bcbd775-xkw5z"] Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.009767 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-xkw5z" Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.013389 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-7ffbcb7588-vbvt8"] Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.016838 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-wrgxb" Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.018407 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-nw88t" Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.022247 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.042420 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rd9pt\" (UniqueName: \"kubernetes.io/projected/4e11d5ed-b404-4449-88f2-428ee510aa3f-kube-api-access-rd9pt\") pod \"barbican-operator-controller-manager-658bdf4b74-wvpgj\" (UID: \"4e11d5ed-b404-4449-88f2-428ee510aa3f\") " pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-wvpgj" Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.042890 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-9c5c78d49-6jbp9"] Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.043835 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-6jbp9" Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.047441 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-svnwc" Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.057856 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdvtc\" (UniqueName: \"kubernetes.io/projected/b5933297-7fc6-41f5-9ac0-55e3ada0e45e-kube-api-access-fdvtc\") pod \"cinder-operator-controller-manager-7b7fb68549-9rbsk\" (UID: \"b5933297-7fc6-41f5-9ac0-55e3ada0e45e\") " pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-9rbsk" Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.057956 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-858f76bbdd-98fgg"] Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.068563 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-656bcbd775-xkw5z"] Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.074577 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-wm2bm"] Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.097589 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-wm2bm" Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.110251 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-q4bch" Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.111977 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnfhp\" (UniqueName: \"kubernetes.io/projected/0c811b08-74de-4fd3-a06b-5eb2021544a9-kube-api-access-jnfhp\") pod \"designate-operator-controller-manager-85d5d9dd78-bsvbv\" (UID: \"0c811b08-74de-4fd3-a06b-5eb2021544a9\") " pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-bsvbv" Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.112908 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssvvh\" (UniqueName: \"kubernetes.io/projected/0bdd70f2-2b72-4787-8434-1b3218cf5f68-kube-api-access-ssvvh\") pod \"glance-operator-controller-manager-84b9b84486-rzjz4\" (UID: \"0bdd70f2-2b72-4787-8434-1b3218cf5f68\") " pod="openstack-operators/glance-operator-controller-manager-84b9b84486-rzjz4" Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.115749 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1fc477a5-eda8-4dbd-87e8-83e4294edc0e-cert\") pod \"infra-operator-controller-manager-656bcbd775-xkw5z\" (UID: \"1fc477a5-eda8-4dbd-87e8-83e4294edc0e\") " pod="openstack-operators/infra-operator-controller-manager-656bcbd775-xkw5z" Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.115816 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nn56\" (UniqueName: \"kubernetes.io/projected/18146224-6500-499d-9fee-7d751700c432-kube-api-access-5nn56\") pod 
\"ironic-operator-controller-manager-9c5c78d49-6jbp9\" (UID: \"18146224-6500-499d-9fee-7d751700c432\") " pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-6jbp9" Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.116172 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtn7v\" (UniqueName: \"kubernetes.io/projected/38e7b3e3-9c02-4737-b500-7a9c01d5e92a-kube-api-access-mtn7v\") pod \"horizon-operator-controller-manager-7ffbcb7588-vbvt8\" (UID: \"38e7b3e3-9c02-4737-b500-7a9c01d5e92a\") " pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-vbvt8" Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.116257 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59llp\" (UniqueName: \"kubernetes.io/projected/1fc477a5-eda8-4dbd-87e8-83e4294edc0e-kube-api-access-59llp\") pod \"infra-operator-controller-manager-656bcbd775-xkw5z\" (UID: \"1fc477a5-eda8-4dbd-87e8-83e4294edc0e\") " pod="openstack-operators/infra-operator-controller-manager-656bcbd775-xkw5z" Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.126050 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-9c5c78d49-6jbp9"] Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.161655 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-wm2bm"] Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.166341 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-wvpgj" Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.177759 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssvvh\" (UniqueName: \"kubernetes.io/projected/0bdd70f2-2b72-4787-8434-1b3218cf5f68-kube-api-access-ssvvh\") pod \"glance-operator-controller-manager-84b9b84486-rzjz4\" (UID: \"0bdd70f2-2b72-4787-8434-1b3218cf5f68\") " pod="openstack-operators/glance-operator-controller-manager-84b9b84486-rzjz4" Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.181648 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-5f67fbc655-r786m"] Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.183969 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-r786m" Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.188038 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-522n9"] Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.189031 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-522n9" Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.189861 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-sfqmd" Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.190401 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-9rbsk" Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.195920 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-5254m" Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.203953 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtn7v\" (UniqueName: \"kubernetes.io/projected/38e7b3e3-9c02-4737-b500-7a9c01d5e92a-kube-api-access-mtn7v\") pod \"horizon-operator-controller-manager-7ffbcb7588-vbvt8\" (UID: \"38e7b3e3-9c02-4737-b500-7a9c01d5e92a\") " pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-vbvt8" Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.206767 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnfhp\" (UniqueName: \"kubernetes.io/projected/0c811b08-74de-4fd3-a06b-5eb2021544a9-kube-api-access-jnfhp\") pod \"designate-operator-controller-manager-85d5d9dd78-bsvbv\" (UID: \"0c811b08-74de-4fd3-a06b-5eb2021544a9\") " pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-bsvbv" Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.211869 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-5f67fbc655-r786m"] Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.217622 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khxrh\" (UniqueName: \"kubernetes.io/projected/ea14d44e-0e35-42d3-8618-cc49e1888e79-kube-api-access-khxrh\") pod \"keystone-operator-controller-manager-55b6b7c7b8-wm2bm\" (UID: \"ea14d44e-0e35-42d3-8618-cc49e1888e79\") " pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-wm2bm" Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.217678 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-59llp\" (UniqueName: \"kubernetes.io/projected/1fc477a5-eda8-4dbd-87e8-83e4294edc0e-kube-api-access-59llp\") pod \"infra-operator-controller-manager-656bcbd775-xkw5z\" (UID: \"1fc477a5-eda8-4dbd-87e8-83e4294edc0e\") " pod="openstack-operators/infra-operator-controller-manager-656bcbd775-xkw5z" Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.217704 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6v677\" (UniqueName: \"kubernetes.io/projected/7899c8e4-5194-486e-b373-b27edda3e745-kube-api-access-6v677\") pod \"heat-operator-controller-manager-858f76bbdd-98fgg\" (UID: \"7899c8e4-5194-486e-b373-b27edda3e745\") " pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-98fgg" Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.218355 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1fc477a5-eda8-4dbd-87e8-83e4294edc0e-cert\") pod \"infra-operator-controller-manager-656bcbd775-xkw5z\" (UID: \"1fc477a5-eda8-4dbd-87e8-83e4294edc0e\") " pod="openstack-operators/infra-operator-controller-manager-656bcbd775-xkw5z" Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.218395 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nn56\" (UniqueName: \"kubernetes.io/projected/18146224-6500-499d-9fee-7d751700c432-kube-api-access-5nn56\") pod \"ironic-operator-controller-manager-9c5c78d49-6jbp9\" (UID: \"18146224-6500-499d-9fee-7d751700c432\") " pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-6jbp9" Oct 08 13:12:59 crc kubenswrapper[4735]: E1008 13:12:59.218719 4735 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Oct 08 13:12:59 crc kubenswrapper[4735]: E1008 13:12:59.218766 4735 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1fc477a5-eda8-4dbd-87e8-83e4294edc0e-cert podName:1fc477a5-eda8-4dbd-87e8-83e4294edc0e nodeName:}" failed. No retries permitted until 2025-10-08 13:12:59.718750654 +0000 UTC m=+1029.527055188 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1fc477a5-eda8-4dbd-87e8-83e4294edc0e-cert") pod "infra-operator-controller-manager-656bcbd775-xkw5z" (UID: "1fc477a5-eda8-4dbd-87e8-83e4294edc0e") : secret "infra-operator-webhook-server-cert" not found Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.229874 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-522n9"] Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.244750 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-84b9b84486-rzjz4" Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.245118 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59llp\" (UniqueName: \"kubernetes.io/projected/1fc477a5-eda8-4dbd-87e8-83e4294edc0e-kube-api-access-59llp\") pod \"infra-operator-controller-manager-656bcbd775-xkw5z\" (UID: \"1fc477a5-eda8-4dbd-87e8-83e4294edc0e\") " pod="openstack-operators/infra-operator-controller-manager-656bcbd775-xkw5z" Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.250140 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nn56\" (UniqueName: \"kubernetes.io/projected/18146224-6500-499d-9fee-7d751700c432-kube-api-access-5nn56\") pod \"ironic-operator-controller-manager-9c5c78d49-6jbp9\" (UID: \"18146224-6500-499d-9fee-7d751700c432\") " pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-6jbp9" Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.250221 4735 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-79d585cb66-c5274"] Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.251546 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-c5274" Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.252039 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-bsvbv" Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.256926 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-xkc7x" Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.257128 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5df598886f-v29rs"] Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.258195 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5df598886f-v29rs" Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.259138 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-8qhgp" Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.275996 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5df598886f-v29rs"] Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.296058 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-vbvt8" Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.299188 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-79d585cb66-c5274"] Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.307175 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-7k9wv"] Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.308478 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-7k9wv" Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.311747 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-9l8mp" Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.319719 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sqgb\" (UniqueName: \"kubernetes.io/projected/9fc075dd-f4c6-4185-bed2-a27ff7e638e6-kube-api-access-5sqgb\") pod \"mariadb-operator-controller-manager-f9fb45f8f-522n9\" (UID: \"9fc075dd-f4c6-4185-bed2-a27ff7e638e6\") " pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-522n9" Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.319761 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khxrh\" (UniqueName: \"kubernetes.io/projected/ea14d44e-0e35-42d3-8618-cc49e1888e79-kube-api-access-khxrh\") pod \"keystone-operator-controller-manager-55b6b7c7b8-wm2bm\" (UID: \"ea14d44e-0e35-42d3-8618-cc49e1888e79\") " pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-wm2bm" Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.319786 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4m2j\" (UniqueName: \"kubernetes.io/projected/a1edbfe6-4871-4748-a468-6c6fcd2db21a-kube-api-access-q4m2j\") pod \"manila-operator-controller-manager-5f67fbc655-r786m\" (UID: \"a1edbfe6-4871-4748-a468-6c6fcd2db21a\") " pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-r786m" Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.319816 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6v677\" (UniqueName: \"kubernetes.io/projected/7899c8e4-5194-486e-b373-b27edda3e745-kube-api-access-6v677\") pod \"heat-operator-controller-manager-858f76bbdd-98fgg\" (UID: \"7899c8e4-5194-486e-b373-b27edda3e745\") " pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-98fgg" Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.336115 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-7k9wv"] Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.346993 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6v677\" (UniqueName: \"kubernetes.io/projected/7899c8e4-5194-486e-b373-b27edda3e745-kube-api-access-6v677\") pod \"heat-operator-controller-manager-858f76bbdd-98fgg\" (UID: \"7899c8e4-5194-486e-b373-b27edda3e745\") " pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-98fgg" Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.348158 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khxrh\" (UniqueName: \"kubernetes.io/projected/ea14d44e-0e35-42d3-8618-cc49e1888e79-kube-api-access-khxrh\") pod \"keystone-operator-controller-manager-55b6b7c7b8-wm2bm\" (UID: \"ea14d44e-0e35-42d3-8618-cc49e1888e79\") " pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-wm2bm" Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 
13:12:59.349443 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-79db49b9fb-gnl2m"] Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.350380 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-79db49b9fb-gnl2m" Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.357266 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-b8w64" Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.357657 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-747747dfccm774j"] Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.358658 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-747747dfccm774j" Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.360699 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.360864 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-6bmld" Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.386629 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-79db49b9fb-gnl2m"] Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.401619 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-68b6c87b68-4vb9q"] Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.408980 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-4vb9q" Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.417135 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-jzs5s" Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.420712 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sqgb\" (UniqueName: \"kubernetes.io/projected/9fc075dd-f4c6-4185-bed2-a27ff7e638e6-kube-api-access-5sqgb\") pod \"mariadb-operator-controller-manager-f9fb45f8f-522n9\" (UID: \"9fc075dd-f4c6-4185-bed2-a27ff7e638e6\") " pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-522n9" Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.420758 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwl5l\" (UniqueName: \"kubernetes.io/projected/9a5fdef5-1d5b-40d0-9675-da52d2a07d80-kube-api-access-zwl5l\") pod \"neutron-operator-controller-manager-79d585cb66-c5274\" (UID: \"9a5fdef5-1d5b-40d0-9675-da52d2a07d80\") " pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-c5274" Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.420780 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdt2b\" (UniqueName: \"kubernetes.io/projected/c5d09b12-6911-4d37-8210-d067b6220b73-kube-api-access-xdt2b\") pod \"nova-operator-controller-manager-5df598886f-v29rs\" (UID: \"c5d09b12-6911-4d37-8210-d067b6220b73\") " pod="openstack-operators/nova-operator-controller-manager-5df598886f-v29rs" Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.420803 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7w5rc\" (UniqueName: 
\"kubernetes.io/projected/d7319151-d690-4595-9b86-72b7d5b812aa-kube-api-access-7w5rc\") pod \"octavia-operator-controller-manager-69fdcfc5f5-7k9wv\" (UID: \"d7319151-d690-4595-9b86-72b7d5b812aa\") " pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-7k9wv" Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.420820 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4m2j\" (UniqueName: \"kubernetes.io/projected/a1edbfe6-4871-4748-a468-6c6fcd2db21a-kube-api-access-q4m2j\") pod \"manila-operator-controller-manager-5f67fbc655-r786m\" (UID: \"a1edbfe6-4871-4748-a468-6c6fcd2db21a\") " pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-r786m" Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.423538 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-db6d7f97b-rmpsl"] Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.437332 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-rmpsl" Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.437493 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-6jbp9"
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.440312 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-58jtg"
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.443966 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sqgb\" (UniqueName: \"kubernetes.io/projected/9fc075dd-f4c6-4185-bed2-a27ff7e638e6-kube-api-access-5sqgb\") pod \"mariadb-operator-controller-manager-f9fb45f8f-522n9\" (UID: \"9fc075dd-f4c6-4185-bed2-a27ff7e638e6\") " pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-522n9"
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.445483 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-68b6c87b68-4vb9q"]
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.454719 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-db6d7f97b-rmpsl"]
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.459604 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4m2j\" (UniqueName: \"kubernetes.io/projected/a1edbfe6-4871-4748-a468-6c6fcd2db21a-kube-api-access-q4m2j\") pod \"manila-operator-controller-manager-5f67fbc655-r786m\" (UID: \"a1edbfe6-4871-4748-a468-6c6fcd2db21a\") " pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-r786m"
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.468580 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-747747dfccm774j"]
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.469457 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-wm2bm"
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.479445 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76796d4c6b-dmw5m"]
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.480838 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76796d4c6b-dmw5m"
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.497991 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76796d4c6b-dmw5m"]
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.504244 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-bv8fm"
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.522427 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmlnb\" (UniqueName: \"kubernetes.io/projected/ad73d925-823e-437b-ba07-9125f43b98b7-kube-api-access-fmlnb\") pod \"openstack-baremetal-operator-controller-manager-747747dfccm774j\" (UID: \"ad73d925-823e-437b-ba07-9125f43b98b7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-747747dfccm774j"
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.522488 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whmsx\" (UniqueName: \"kubernetes.io/projected/46d5fbd9-81ff-4a1e-96c7-01df9c2bd91a-kube-api-access-whmsx\") pod \"ovn-operator-controller-manager-79db49b9fb-gnl2m\" (UID: \"46d5fbd9-81ff-4a1e-96c7-01df9c2bd91a\") " pod="openstack-operators/ovn-operator-controller-manager-79db49b9fb-gnl2m"
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.522545 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwl5l\" (UniqueName: \"kubernetes.io/projected/9a5fdef5-1d5b-40d0-9675-da52d2a07d80-kube-api-access-zwl5l\") pod \"neutron-operator-controller-manager-79d585cb66-c5274\" (UID: \"9a5fdef5-1d5b-40d0-9675-da52d2a07d80\") " pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-c5274"
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.522572 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdt2b\" (UniqueName: \"kubernetes.io/projected/c5d09b12-6911-4d37-8210-d067b6220b73-kube-api-access-xdt2b\") pod \"nova-operator-controller-manager-5df598886f-v29rs\" (UID: \"c5d09b12-6911-4d37-8210-d067b6220b73\") " pod="openstack-operators/nova-operator-controller-manager-5df598886f-v29rs"
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.522594 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fq8lc\" (UniqueName: \"kubernetes.io/projected/157b7f16-11b3-452f-8c03-287aeefb2631-kube-api-access-fq8lc\") pod \"placement-operator-controller-manager-68b6c87b68-4vb9q\" (UID: \"157b7f16-11b3-452f-8c03-287aeefb2631\") " pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-4vb9q"
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.522622 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7w5rc\" (UniqueName: \"kubernetes.io/projected/d7319151-d690-4595-9b86-72b7d5b812aa-kube-api-access-7w5rc\") pod \"octavia-operator-controller-manager-69fdcfc5f5-7k9wv\" (UID: \"d7319151-d690-4595-9b86-72b7d5b812aa\") " pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-7k9wv"
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.522916 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ad73d925-823e-437b-ba07-9125f43b98b7-cert\") pod \"openstack-baremetal-operator-controller-manager-747747dfccm774j\" (UID: \"ad73d925-823e-437b-ba07-9125f43b98b7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-747747dfccm774j"
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.561287 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwl5l\" (UniqueName: \"kubernetes.io/projected/9a5fdef5-1d5b-40d0-9675-da52d2a07d80-kube-api-access-zwl5l\") pod \"neutron-operator-controller-manager-79d585cb66-c5274\" (UID: \"9a5fdef5-1d5b-40d0-9675-da52d2a07d80\") " pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-c5274"
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.562118 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7w5rc\" (UniqueName: \"kubernetes.io/projected/d7319151-d690-4595-9b86-72b7d5b812aa-kube-api-access-7w5rc\") pod \"octavia-operator-controller-manager-69fdcfc5f5-7k9wv\" (UID: \"d7319151-d690-4595-9b86-72b7d5b812aa\") " pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-7k9wv"
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.570873 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-56c698c775-zlnwj"]
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.573062 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdt2b\" (UniqueName: \"kubernetes.io/projected/c5d09b12-6911-4d37-8210-d067b6220b73-kube-api-access-xdt2b\") pod \"nova-operator-controller-manager-5df598886f-v29rs\" (UID: \"c5d09b12-6911-4d37-8210-d067b6220b73\") " pod="openstack-operators/nova-operator-controller-manager-5df598886f-v29rs"
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.574713 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-56c698c775-zlnwj"
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.574863 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-r786m"
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.578402 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-h5js9"
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.592152 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-522n9"
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.615939 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-56c698c775-zlnwj"]
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.616376 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-c5274"
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.638333 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-98fgg"
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.640274 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5df598886f-v29rs"
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.659244 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7hfg\" (UniqueName: \"kubernetes.io/projected/31d49be4-108d-4004-973f-0fd539dd1037-kube-api-access-h7hfg\") pod \"swift-operator-controller-manager-db6d7f97b-rmpsl\" (UID: \"31d49be4-108d-4004-973f-0fd539dd1037\") " pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-rmpsl"
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.659302 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49pbc\" (UniqueName: \"kubernetes.io/projected/11418f7b-2467-49f3-ad5d-1786f70c023b-kube-api-access-49pbc\") pod \"telemetry-operator-controller-manager-76796d4c6b-dmw5m\" (UID: \"11418f7b-2467-49f3-ad5d-1786f70c023b\") " pod="openstack-operators/telemetry-operator-controller-manager-76796d4c6b-dmw5m"
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.659529 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmlnb\" (UniqueName: \"kubernetes.io/projected/ad73d925-823e-437b-ba07-9125f43b98b7-kube-api-access-fmlnb\") pod \"openstack-baremetal-operator-controller-manager-747747dfccm774j\" (UID: \"ad73d925-823e-437b-ba07-9125f43b98b7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-747747dfccm774j"
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.659586 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whmsx\" (UniqueName: \"kubernetes.io/projected/46d5fbd9-81ff-4a1e-96c7-01df9c2bd91a-kube-api-access-whmsx\") pod \"ovn-operator-controller-manager-79db49b9fb-gnl2m\" (UID: \"46d5fbd9-81ff-4a1e-96c7-01df9c2bd91a\") " pod="openstack-operators/ovn-operator-controller-manager-79db49b9fb-gnl2m"
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.659645 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fq8lc\" (UniqueName: \"kubernetes.io/projected/157b7f16-11b3-452f-8c03-287aeefb2631-kube-api-access-fq8lc\") pod \"placement-operator-controller-manager-68b6c87b68-4vb9q\" (UID: \"157b7f16-11b3-452f-8c03-287aeefb2631\") " pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-4vb9q"
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.659708 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ad73d925-823e-437b-ba07-9125f43b98b7-cert\") pod \"openstack-baremetal-operator-controller-manager-747747dfccm774j\" (UID: \"ad73d925-823e-437b-ba07-9125f43b98b7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-747747dfccm774j"
Oct 08 13:12:59 crc kubenswrapper[4735]: E1008 13:12:59.661278 4735 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Oct 08 13:12:59 crc kubenswrapper[4735]: E1008 13:12:59.661347 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad73d925-823e-437b-ba07-9125f43b98b7-cert podName:ad73d925-823e-437b-ba07-9125f43b98b7 nodeName:}" failed. No retries permitted until 2025-10-08 13:13:00.1613214 +0000 UTC m=+1029.969625944 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ad73d925-823e-437b-ba07-9125f43b98b7-cert") pod "openstack-baremetal-operator-controller-manager-747747dfccm774j" (UID: "ad73d925-823e-437b-ba07-9125f43b98b7") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.664082 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-7k9wv"
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.698361 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmlnb\" (UniqueName: \"kubernetes.io/projected/ad73d925-823e-437b-ba07-9125f43b98b7-kube-api-access-fmlnb\") pod \"openstack-baremetal-operator-controller-manager-747747dfccm774j\" (UID: \"ad73d925-823e-437b-ba07-9125f43b98b7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-747747dfccm774j"
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.700387 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whmsx\" (UniqueName: \"kubernetes.io/projected/46d5fbd9-81ff-4a1e-96c7-01df9c2bd91a-kube-api-access-whmsx\") pod \"ovn-operator-controller-manager-79db49b9fb-gnl2m\" (UID: \"46d5fbd9-81ff-4a1e-96c7-01df9c2bd91a\") " pod="openstack-operators/ovn-operator-controller-manager-79db49b9fb-gnl2m"
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.720778 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fq8lc\" (UniqueName: \"kubernetes.io/projected/157b7f16-11b3-452f-8c03-287aeefb2631-kube-api-access-fq8lc\") pod \"placement-operator-controller-manager-68b6c87b68-4vb9q\" (UID: \"157b7f16-11b3-452f-8c03-287aeefb2631\") " pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-4vb9q"
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.723744 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-79db49b9fb-gnl2m"
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.759649 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-7794bc6bd-wfddw"]
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.760991 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-7794bc6bd-wfddw"
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.764022 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-6tx7j"
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.765907 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1fc477a5-eda8-4dbd-87e8-83e4294edc0e-cert\") pod \"infra-operator-controller-manager-656bcbd775-xkw5z\" (UID: \"1fc477a5-eda8-4dbd-87e8-83e4294edc0e\") " pod="openstack-operators/infra-operator-controller-manager-656bcbd775-xkw5z"
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.765958 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fmbv\" (UniqueName: \"kubernetes.io/projected/8683ca58-b437-4d2d-aa45-13789ebc0cb4-kube-api-access-2fmbv\") pod \"test-operator-controller-manager-56c698c775-zlnwj\" (UID: \"8683ca58-b437-4d2d-aa45-13789ebc0cb4\") " pod="openstack-operators/test-operator-controller-manager-56c698c775-zlnwj"
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.765984 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndg7t\" (UniqueName: \"kubernetes.io/projected/7aa296d7-2605-498a-b618-5569accc5974-kube-api-access-ndg7t\") pod \"watcher-operator-controller-manager-7794bc6bd-wfddw\" (UID: \"7aa296d7-2605-498a-b618-5569accc5974\") " pod="openstack-operators/watcher-operator-controller-manager-7794bc6bd-wfddw"
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.766067 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7hfg\" (UniqueName: \"kubernetes.io/projected/31d49be4-108d-4004-973f-0fd539dd1037-kube-api-access-h7hfg\") pod \"swift-operator-controller-manager-db6d7f97b-rmpsl\" (UID: \"31d49be4-108d-4004-973f-0fd539dd1037\") " pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-rmpsl"
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.766096 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49pbc\" (UniqueName: \"kubernetes.io/projected/11418f7b-2467-49f3-ad5d-1786f70c023b-kube-api-access-49pbc\") pod \"telemetry-operator-controller-manager-76796d4c6b-dmw5m\" (UID: \"11418f7b-2467-49f3-ad5d-1786f70c023b\") " pod="openstack-operators/telemetry-operator-controller-manager-76796d4c6b-dmw5m"
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.774113 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-7794bc6bd-wfddw"]
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.780780 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1fc477a5-eda8-4dbd-87e8-83e4294edc0e-cert\") pod \"infra-operator-controller-manager-656bcbd775-xkw5z\" (UID: \"1fc477a5-eda8-4dbd-87e8-83e4294edc0e\") " pod="openstack-operators/infra-operator-controller-manager-656bcbd775-xkw5z"
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.821444 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7hfg\" (UniqueName: \"kubernetes.io/projected/31d49be4-108d-4004-973f-0fd539dd1037-kube-api-access-h7hfg\") pod \"swift-operator-controller-manager-db6d7f97b-rmpsl\" (UID: \"31d49be4-108d-4004-973f-0fd539dd1037\") " pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-rmpsl"
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.832561 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49pbc\" (UniqueName: \"kubernetes.io/projected/11418f7b-2467-49f3-ad5d-1786f70c023b-kube-api-access-49pbc\") pod \"telemetry-operator-controller-manager-76796d4c6b-dmw5m\" (UID: \"11418f7b-2467-49f3-ad5d-1786f70c023b\") " pod="openstack-operators/telemetry-operator-controller-manager-76796d4c6b-dmw5m"
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.832815 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-4vb9q"
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.833581 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-8bc6b8f5b-2z94k"]
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.835193 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-8bc6b8f5b-2z94k"
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.849423 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-8bc6b8f5b-2z94k"]
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.849501 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert"
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.850611 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-sjmqn"
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.851707 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-pqk7r"]
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.855565 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-pqk7r"
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.864498 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-pqk7r"]
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.867392 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6qcq\" (UniqueName: \"kubernetes.io/projected/a0d2585e-cf4a-448e-8739-1591837c0781-kube-api-access-j6qcq\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-pqk7r\" (UID: \"a0d2585e-cf4a-448e-8739-1591837c0781\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-pqk7r"
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.867534 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d2b6aa63-7896-4d74-aced-bd70743b04a6-cert\") pod \"openstack-operator-controller-manager-8bc6b8f5b-2z94k\" (UID: \"d2b6aa63-7896-4d74-aced-bd70743b04a6\") " pod="openstack-operators/openstack-operator-controller-manager-8bc6b8f5b-2z94k"
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.867661 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fmbv\" (UniqueName: \"kubernetes.io/projected/8683ca58-b437-4d2d-aa45-13789ebc0cb4-kube-api-access-2fmbv\") pod \"test-operator-controller-manager-56c698c775-zlnwj\" (UID: \"8683ca58-b437-4d2d-aa45-13789ebc0cb4\") " pod="openstack-operators/test-operator-controller-manager-56c698c775-zlnwj"
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.867732 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndg7t\" (UniqueName: \"kubernetes.io/projected/7aa296d7-2605-498a-b618-5569accc5974-kube-api-access-ndg7t\") pod \"watcher-operator-controller-manager-7794bc6bd-wfddw\" (UID: \"7aa296d7-2605-498a-b618-5569accc5974\") " pod="openstack-operators/watcher-operator-controller-manager-7794bc6bd-wfddw"
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.867799 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvz9w\" (UniqueName: \"kubernetes.io/projected/d2b6aa63-7896-4d74-aced-bd70743b04a6-kube-api-access-dvz9w\") pod \"openstack-operator-controller-manager-8bc6b8f5b-2z94k\" (UID: \"d2b6aa63-7896-4d74-aced-bd70743b04a6\") " pod="openstack-operators/openstack-operator-controller-manager-8bc6b8f5b-2z94k"
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.876439 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-rmpsl"
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.887052 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-wfpjt"
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.899172 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fmbv\" (UniqueName: \"kubernetes.io/projected/8683ca58-b437-4d2d-aa45-13789ebc0cb4-kube-api-access-2fmbv\") pod \"test-operator-controller-manager-56c698c775-zlnwj\" (UID: \"8683ca58-b437-4d2d-aa45-13789ebc0cb4\") " pod="openstack-operators/test-operator-controller-manager-56c698c775-zlnwj"
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.899526 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndg7t\" (UniqueName: \"kubernetes.io/projected/7aa296d7-2605-498a-b618-5569accc5974-kube-api-access-ndg7t\") pod \"watcher-operator-controller-manager-7794bc6bd-wfddw\" (UID: \"7aa296d7-2605-498a-b618-5569accc5974\") " pod="openstack-operators/watcher-operator-controller-manager-7794bc6bd-wfddw"
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.920553 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76796d4c6b-dmw5m"
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.944635 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7b7fb68549-9rbsk"]
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.949773 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-56c698c775-zlnwj"
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.986279 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6qcq\" (UniqueName: \"kubernetes.io/projected/a0d2585e-cf4a-448e-8739-1591837c0781-kube-api-access-j6qcq\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-pqk7r\" (UID: \"a0d2585e-cf4a-448e-8739-1591837c0781\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-pqk7r"
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.986331 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d2b6aa63-7896-4d74-aced-bd70743b04a6-cert\") pod \"openstack-operator-controller-manager-8bc6b8f5b-2z94k\" (UID: \"d2b6aa63-7896-4d74-aced-bd70743b04a6\") " pod="openstack-operators/openstack-operator-controller-manager-8bc6b8f5b-2z94k"
Oct 08 13:12:59 crc kubenswrapper[4735]: I1008 13:12:59.986386 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvz9w\" (UniqueName: \"kubernetes.io/projected/d2b6aa63-7896-4d74-aced-bd70743b04a6-kube-api-access-dvz9w\") pod \"openstack-operator-controller-manager-8bc6b8f5b-2z94k\" (UID: \"d2b6aa63-7896-4d74-aced-bd70743b04a6\") " pod="openstack-operators/openstack-operator-controller-manager-8bc6b8f5b-2z94k"
Oct 08 13:12:59 crc kubenswrapper[4735]: E1008 13:12:59.986781 4735 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Oct 08 13:12:59 crc kubenswrapper[4735]: E1008 13:12:59.986857 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d2b6aa63-7896-4d74-aced-bd70743b04a6-cert podName:d2b6aa63-7896-4d74-aced-bd70743b04a6 nodeName:}" failed. No retries permitted until 2025-10-08 13:13:00.486834203 +0000 UTC m=+1030.295138747 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d2b6aa63-7896-4d74-aced-bd70743b04a6-cert") pod "openstack-operator-controller-manager-8bc6b8f5b-2z94k" (UID: "d2b6aa63-7896-4d74-aced-bd70743b04a6") : secret "webhook-server-cert" not found
Oct 08 13:13:00 crc kubenswrapper[4735]: I1008 13:13:00.008673 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-658bdf4b74-wvpgj"]
Oct 08 13:13:00 crc kubenswrapper[4735]: I1008 13:13:00.011798 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-xkw5z"
Oct 08 13:13:00 crc kubenswrapper[4735]: I1008 13:13:00.038618 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6qcq\" (UniqueName: \"kubernetes.io/projected/a0d2585e-cf4a-448e-8739-1591837c0781-kube-api-access-j6qcq\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-pqk7r\" (UID: \"a0d2585e-cf4a-448e-8739-1591837c0781\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-pqk7r"
Oct 08 13:13:00 crc kubenswrapper[4735]: I1008 13:13:00.039195 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvz9w\" (UniqueName: \"kubernetes.io/projected/d2b6aa63-7896-4d74-aced-bd70743b04a6-kube-api-access-dvz9w\") pod \"openstack-operator-controller-manager-8bc6b8f5b-2z94k\" (UID: \"d2b6aa63-7896-4d74-aced-bd70743b04a6\") " pod="openstack-operators/openstack-operator-controller-manager-8bc6b8f5b-2z94k"
Oct 08 13:13:00 crc kubenswrapper[4735]: I1008 13:13:00.068795 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-9rbsk" event={"ID":"b5933297-7fc6-41f5-9ac0-55e3ada0e45e","Type":"ContainerStarted","Data":"aa1f4f68879889e44bfc3f357cef858b8f60b03538d0a7216965692314fb0ee0"}
Oct 08 13:13:00 crc kubenswrapper[4735]: I1008 13:13:00.171367 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-7794bc6bd-wfddw"
Oct 08 13:13:00 crc kubenswrapper[4735]: I1008 13:13:00.194541 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ad73d925-823e-437b-ba07-9125f43b98b7-cert\") pod \"openstack-baremetal-operator-controller-manager-747747dfccm774j\" (UID: \"ad73d925-823e-437b-ba07-9125f43b98b7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-747747dfccm774j"
Oct 08 13:13:00 crc kubenswrapper[4735]: E1008 13:13:00.194747 4735 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Oct 08 13:13:00 crc kubenswrapper[4735]: E1008 13:13:00.194797 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad73d925-823e-437b-ba07-9125f43b98b7-cert podName:ad73d925-823e-437b-ba07-9125f43b98b7 nodeName:}" failed. No retries permitted until 2025-10-08 13:13:01.194782652 +0000 UTC m=+1031.003087196 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ad73d925-823e-437b-ba07-9125f43b98b7-cert") pod "openstack-baremetal-operator-controller-manager-747747dfccm774j" (UID: "ad73d925-823e-437b-ba07-9125f43b98b7") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Oct 08 13:13:00 crc kubenswrapper[4735]: I1008 13:13:00.207105 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-pqk7r"
Oct 08 13:13:00 crc kubenswrapper[4735]: I1008 13:13:00.499869 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d2b6aa63-7896-4d74-aced-bd70743b04a6-cert\") pod \"openstack-operator-controller-manager-8bc6b8f5b-2z94k\" (UID: \"d2b6aa63-7896-4d74-aced-bd70743b04a6\") " pod="openstack-operators/openstack-operator-controller-manager-8bc6b8f5b-2z94k"
Oct 08 13:13:00 crc kubenswrapper[4735]: I1008 13:13:00.514726 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d2b6aa63-7896-4d74-aced-bd70743b04a6-cert\") pod \"openstack-operator-controller-manager-8bc6b8f5b-2z94k\" (UID: \"d2b6aa63-7896-4d74-aced-bd70743b04a6\") " pod="openstack-operators/openstack-operator-controller-manager-8bc6b8f5b-2z94k"
Oct 08 13:13:00 crc kubenswrapper[4735]: I1008 13:13:00.570621 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-85d5d9dd78-bsvbv"]
Oct 08 13:13:00 crc kubenswrapper[4735]: I1008 13:13:00.579092 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-84b9b84486-rzjz4"]
Oct 08 13:13:00 crc kubenswrapper[4735]: W1008 13:13:00.591197 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c811b08_74de_4fd3_a06b_5eb2021544a9.slice/crio-57235c2765f7321d8fa00df2e7f2cd86c147ed96571646af9fa7be1ac4803629 WatchSource:0}: Error finding container 57235c2765f7321d8fa00df2e7f2cd86c147ed96571646af9fa7be1ac4803629: Status 404 returned error can't find the container with id 57235c2765f7321d8fa00df2e7f2cd86c147ed96571646af9fa7be1ac4803629
Oct 08 13:13:00 crc kubenswrapper[4735]: I1008 13:13:00.634756 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-7ffbcb7588-vbvt8"]
Oct 08 13:13:00 crc kubenswrapper[4735]: W1008 13:13:00.678166 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38e7b3e3_9c02_4737_b500_7a9c01d5e92a.slice/crio-8f1a79955b5200a4306a049893940ef71729d4f3cf542c1a06734b7722435807 WatchSource:0}: Error finding container 8f1a79955b5200a4306a049893940ef71729d4f3cf542c1a06734b7722435807: Status 404 returned error can't find the container with id 8f1a79955b5200a4306a049893940ef71729d4f3cf542c1a06734b7722435807
Oct 08 13:13:00 crc kubenswrapper[4735]: I1008 13:13:00.790867 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-8bc6b8f5b-2z94k"
Oct 08 13:13:00 crc kubenswrapper[4735]: I1008 13:13:00.800983 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-9c5c78d49-6jbp9"]
Oct 08 13:13:00 crc kubenswrapper[4735]: W1008 13:13:00.835992 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18146224_6500_499d_9fee_7d751700c432.slice/crio-8b0918c009f5e879539fb3834e6f005e64093b0f4d79bf7fd08eefac16bc6245 WatchSource:0}: Error finding container 8b0918c009f5e879539fb3834e6f005e64093b0f4d79bf7fd08eefac16bc6245: Status 404 returned error can't find the container with id 8b0918c009f5e879539fb3834e6f005e64093b0f4d79bf7fd08eefac16bc6245
Oct 08 13:13:00 crc kubenswrapper[4735]: W1008 13:13:00.982063 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea14d44e_0e35_42d3_8618_cc49e1888e79.slice/crio-c29515e328c3de9c3fa82f430c340915b6e4621e18abe5a7c56c2845f9b3a7a8 WatchSource:0}: Error finding container c29515e328c3de9c3fa82f430c340915b6e4621e18abe5a7c56c2845f9b3a7a8: Status 404 returned error can't find the container with id c29515e328c3de9c3fa82f430c340915b6e4621e18abe5a7c56c2845f9b3a7a8
Oct 08 13:13:00 crc kubenswrapper[4735]: I1008 13:13:00.984098 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-wm2bm"]
Oct 08 13:13:00 crc kubenswrapper[4735]: W1008 13:13:00.988288 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9fc075dd_f4c6_4185_bed2_a27ff7e638e6.slice/crio-ff4140e24b0693adda16779b7b98755d3e79de96f9f36c11f4cce94fdda0c334 WatchSource:0}: Error finding container ff4140e24b0693adda16779b7b98755d3e79de96f9f36c11f4cce94fdda0c334: Status 404 returned error can't find the container with id ff4140e24b0693adda16779b7b98755d3e79de96f9f36c11f4cce94fdda0c334
Oct 08 13:13:01 crc kubenswrapper[4735]: I1008 13:13:01.005710 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-522n9"]
Oct 08 13:13:01 crc kubenswrapper[4735]: I1008 13:13:01.092392 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-vbvt8" event={"ID":"38e7b3e3-9c02-4737-b500-7a9c01d5e92a","Type":"ContainerStarted","Data":"8f1a79955b5200a4306a049893940ef71729d4f3cf542c1a06734b7722435807"}
Oct 08 13:13:01 crc kubenswrapper[4735]: I1008 13:13:01.095835 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-wm2bm" event={"ID":"ea14d44e-0e35-42d3-8618-cc49e1888e79","Type":"ContainerStarted","Data":"c29515e328c3de9c3fa82f430c340915b6e4621e18abe5a7c56c2845f9b3a7a8"}
Oct 08 13:13:01 crc kubenswrapper[4735]: I1008 13:13:01.096993 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-wvpgj" event={"ID":"4e11d5ed-b404-4449-88f2-428ee510aa3f","Type":"ContainerStarted","Data":"210047de40da02f8d828ef43163a258531b73cb46db2fab565ab17cfc49a4acf"}
Oct 08 13:13:01 crc kubenswrapper[4735]: I1008 13:13:01.098026 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84b9b84486-rzjz4" event={"ID":"0bdd70f2-2b72-4787-8434-1b3218cf5f68","Type":"ContainerStarted","Data":"bf67ef7c77d549c434cebc3b0d7a1d430adba8bc692d2192bc3419a3310592d3"}
Oct 08 13:13:01 crc kubenswrapper[4735]: I1008 13:13:01.099294 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-6jbp9" event={"ID":"18146224-6500-499d-9fee-7d751700c432","Type":"ContainerStarted","Data":"8b0918c009f5e879539fb3834e6f005e64093b0f4d79bf7fd08eefac16bc6245"}
Oct 08 13:13:01 crc kubenswrapper[4735]: I1008 13:13:01.101781 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-bsvbv" event={"ID":"0c811b08-74de-4fd3-a06b-5eb2021544a9","Type":"ContainerStarted","Data":"57235c2765f7321d8fa00df2e7f2cd86c147ed96571646af9fa7be1ac4803629"}
Oct 08 13:13:01 crc kubenswrapper[4735]: I1008 13:13:01.103053 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-522n9" event={"ID":"9fc075dd-f4c6-4185-bed2-a27ff7e638e6","Type":"ContainerStarted","Data":"ff4140e24b0693adda16779b7b98755d3e79de96f9f36c11f4cce94fdda0c334"}
Oct 08 13:13:01 crc kubenswrapper[4735]: I1008 13:13:01.210930 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ad73d925-823e-437b-ba07-9125f43b98b7-cert\") pod \"openstack-baremetal-operator-controller-manager-747747dfccm774j\" (UID: \"ad73d925-823e-437b-ba07-9125f43b98b7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-747747dfccm774j"
Oct 08
13:13:01 crc kubenswrapper[4735]: I1008 13:13:01.215017 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ad73d925-823e-437b-ba07-9125f43b98b7-cert\") pod \"openstack-baremetal-operator-controller-manager-747747dfccm774j\" (UID: \"ad73d925-823e-437b-ba07-9125f43b98b7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-747747dfccm774j" Oct 08 13:13:01 crc kubenswrapper[4735]: I1008 13:13:01.273054 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-858f76bbdd-98fgg"] Oct 08 13:13:01 crc kubenswrapper[4735]: I1008 13:13:01.280635 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-7k9wv"] Oct 08 13:13:01 crc kubenswrapper[4735]: I1008 13:13:01.280914 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-747747dfccm774j" Oct 08 13:13:01 crc kubenswrapper[4735]: W1008 13:13:01.288602 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1edbfe6_4871_4748_a468_6c6fcd2db21a.slice/crio-6d40b0f53b4ab0f8db445749caf222e6bbfa9d96e55912be48ac3d7f96d9d1c3 WatchSource:0}: Error finding container 6d40b0f53b4ab0f8db445749caf222e6bbfa9d96e55912be48ac3d7f96d9d1c3: Status 404 returned error can't find the container with id 6d40b0f53b4ab0f8db445749caf222e6bbfa9d96e55912be48ac3d7f96d9d1c3 Oct 08 13:13:01 crc kubenswrapper[4735]: W1008 13:13:01.291034 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7319151_d690_4595_9b86_72b7d5b812aa.slice/crio-6cdb9072e272100fc37ed4f8f97ba248564eea75c40c717686548b8bdf67c95d WatchSource:0}: Error finding container 
6cdb9072e272100fc37ed4f8f97ba248564eea75c40c717686548b8bdf67c95d: Status 404 returned error can't find the container with id 6cdb9072e272100fc37ed4f8f97ba248564eea75c40c717686548b8bdf67c95d Oct 08 13:13:01 crc kubenswrapper[4735]: W1008 13:13:01.296419 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11418f7b_2467_49f3_ad5d_1786f70c023b.slice/crio-b3ff04c3701690b8e96c25b3b0057abc9d6b7aa55a67878173ab7372d7a75e7b WatchSource:0}: Error finding container b3ff04c3701690b8e96c25b3b0057abc9d6b7aa55a67878173ab7372d7a75e7b: Status 404 returned error can't find the container with id b3ff04c3701690b8e96c25b3b0057abc9d6b7aa55a67878173ab7372d7a75e7b Oct 08 13:13:01 crc kubenswrapper[4735]: I1008 13:13:01.305597 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76796d4c6b-dmw5m"] Oct 08 13:13:01 crc kubenswrapper[4735]: I1008 13:13:01.309668 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-5f67fbc655-r786m"] Oct 08 13:13:01 crc kubenswrapper[4735]: I1008 13:13:01.313859 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-79db49b9fb-gnl2m"] Oct 08 13:13:01 crc kubenswrapper[4735]: I1008 13:13:01.318432 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-pqk7r"] Oct 08 13:13:01 crc kubenswrapper[4735]: W1008 13:13:01.318429 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7899c8e4_5194_486e_b373_b27edda3e745.slice/crio-740dbfce6ad8e7a00f64eef8eed3e2bb8f69712ffc6adffbc739859adb8e114d WatchSource:0}: Error finding container 740dbfce6ad8e7a00f64eef8eed3e2bb8f69712ffc6adffbc739859adb8e114d: Status 404 returned error can't find the container with id 
740dbfce6ad8e7a00f64eef8eed3e2bb8f69712ffc6adffbc739859adb8e114d Oct 08 13:13:01 crc kubenswrapper[4735]: W1008 13:13:01.321693 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46d5fbd9_81ff_4a1e_96c7_01df9c2bd91a.slice/crio-4e6519d970f5f96916bcdb22b77e44e4f70cf4488b308994a4e96a52bad6f7b3 WatchSource:0}: Error finding container 4e6519d970f5f96916bcdb22b77e44e4f70cf4488b308994a4e96a52bad6f7b3: Status 404 returned error can't find the container with id 4e6519d970f5f96916bcdb22b77e44e4f70cf4488b308994a4e96a52bad6f7b3 Oct 08 13:13:01 crc kubenswrapper[4735]: I1008 13:13:01.325307 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-79d585cb66-c5274"] Oct 08 13:13:01 crc kubenswrapper[4735]: W1008 13:13:01.332185 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8683ca58_b437_4d2d_aa45_13789ebc0cb4.slice/crio-842e359b6a9978857371092fc1c10d3e765fc5308689b455b6b420cf29321ce5 WatchSource:0}: Error finding container 842e359b6a9978857371092fc1c10d3e765fc5308689b455b6b420cf29321ce5: Status 404 returned error can't find the container with id 842e359b6a9978857371092fc1c10d3e765fc5308689b455b6b420cf29321ce5 Oct 08 13:13:01 crc kubenswrapper[4735]: I1008 13:13:01.334317 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-7794bc6bd-wfddw"] Oct 08 13:13:01 crc kubenswrapper[4735]: W1008 13:13:01.335907 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7aa296d7_2605_498a_b618_5569accc5974.slice/crio-b305a356798cea55bef0fe16190bb5ed44f6cdfd557c9cce49bd7e80dbbab7db WatchSource:0}: Error finding container b305a356798cea55bef0fe16190bb5ed44f6cdfd557c9cce49bd7e80dbbab7db: Status 404 returned error can't find the 
container with id b305a356798cea55bef0fe16190bb5ed44f6cdfd557c9cce49bd7e80dbbab7db Oct 08 13:13:01 crc kubenswrapper[4735]: E1008 13:13:01.338370 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:efa8fb78cffb573d299ffcc7bab1099affd2dbbab222152092b313074306e0a9,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2fmbv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-56c698c775-zlnwj_openstack-operators(8683ca58-b437-4d2d-aa45-13789ebc0cb4): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 08 13:13:01 crc kubenswrapper[4735]: I1008 13:13:01.338989 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-56c698c775-zlnwj"] Oct 08 13:13:01 crc kubenswrapper[4735]: I1008 13:13:01.344041 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-656bcbd775-xkw5z"] Oct 08 13:13:01 crc kubenswrapper[4735]: W1008 13:13:01.344456 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5d09b12_6911_4d37_8210_d067b6220b73.slice/crio-864215cd73d4fb8c1621495bdca1945d3c85307ca2e77c447e3fc791482e2b08 WatchSource:0}: Error finding container 864215cd73d4fb8c1621495bdca1945d3c85307ca2e77c447e3fc791482e2b08: Status 404 returned error can't find the container with id 864215cd73d4fb8c1621495bdca1945d3c85307ca2e77c447e3fc791482e2b08 Oct 08 13:13:01 crc kubenswrapper[4735]: E1008 13:13:01.346197 4735 kuberuntime_manager.go:1274] 
"Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:b2e9acf568a48c28cf2aed6012e432eeeb7d5f0eb11878fc91b62bc34cba10cd,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xdt2b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-5df598886f-v29rs_openstack-operators(c5d09b12-6911-4d37-8210-d067b6220b73): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 08 13:13:01 crc kubenswrapper[4735]: I1008 13:13:01.349209 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-68b6c87b68-4vb9q"] Oct 08 13:13:01 crc kubenswrapper[4735]: W1008 13:13:01.351805 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a5fdef5_1d5b_40d0_9675_da52d2a07d80.slice/crio-7ff7740a272cbd115eef2df50268835b0579d15124abcea490ccf3a16c00bbcd WatchSource:0}: Error finding container 7ff7740a272cbd115eef2df50268835b0579d15124abcea490ccf3a16c00bbcd: Status 404 returned error can't find the container with id 7ff7740a272cbd115eef2df50268835b0579d15124abcea490ccf3a16c00bbcd Oct 08 13:13:01 crc kubenswrapper[4735]: I1008 13:13:01.353346 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-db6d7f97b-rmpsl"] Oct 08 13:13:01 crc kubenswrapper[4735]: E1008 13:13:01.354269 4735 
kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:33652e75a03a058769019fe8d8c51585a6eeefef5e1ecb96f9965434117954f2,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zwl5l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-79d585cb66-c5274_openstack-operators(9a5fdef5-1d5b-40d0-9675-da52d2a07d80): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 08 13:13:01 crc kubenswrapper[4735]: I1008 13:13:01.357288 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5df598886f-v29rs"] Oct 08 13:13:01 crc kubenswrapper[4735]: I1008 13:13:01.362731 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-8bc6b8f5b-2z94k"] Oct 08 13:13:01 crc kubenswrapper[4735]: W1008 13:13:01.373708 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1fc477a5_eda8_4dbd_87e8_83e4294edc0e.slice/crio-05ccfe9b2969c1f2b8fd8494df503f69b9ba41c9a239057e101a9868cc63edfe WatchSource:0}: Error finding container 05ccfe9b2969c1f2b8fd8494df503f69b9ba41c9a239057e101a9868cc63edfe: Status 404 returned error can't find the container with id 05ccfe9b2969c1f2b8fd8494df503f69b9ba41c9a239057e101a9868cc63edfe Oct 08 13:13:01 crc kubenswrapper[4735]: W1008 13:13:01.375243 4735 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31d49be4_108d_4004_973f_0fd539dd1037.slice/crio-0faf745e059dc54e1575ab7decca755fe813abd0f7f1d178b9501a9a6d15b7f0 WatchSource:0}: Error finding container 0faf745e059dc54e1575ab7decca755fe813abd0f7f1d178b9501a9a6d15b7f0: Status 404 returned error can't find the container with id 0faf745e059dc54e1575ab7decca755fe813abd0f7f1d178b9501a9a6d15b7f0 Oct 08 13:13:01 crc kubenswrapper[4735]: E1008 13:13:01.383246 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:4b4a17fe08ce00e375afaaec6a28835f5c1784f03d11c4558376ac04130f3a9e,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-h7hfg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-db6d7f97b-rmpsl_openstack-operators(31d49be4-108d-4004-973f-0fd539dd1037): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 08 13:13:01 crc kubenswrapper[4735]: W1008 13:13:01.384544 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod157b7f16_11b3_452f_8c03_287aeefb2631.slice/crio-f1d72bb70e3f25ea8099928119ff3a9b0327a082c5dac9e63acd466328fd0c06 WatchSource:0}: Error finding container f1d72bb70e3f25ea8099928119ff3a9b0327a082c5dac9e63acd466328fd0c06: Status 404 returned error can't find the container with id f1d72bb70e3f25ea8099928119ff3a9b0327a082c5dac9e63acd466328fd0c06 Oct 08 13:13:01 crc kubenswrapper[4735]: E1008 13:13:01.403803 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:d33c1f507e1f5b9a4bf226ad98917e92101ac66b36e19d35cbe04ae7014f6bff,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fq8lc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
placement-operator-controller-manager-68b6c87b68-4vb9q_openstack-operators(157b7f16-11b3-452f-8c03-287aeefb2631): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 08 13:13:01 crc kubenswrapper[4735]: E1008 13:13:01.623783 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-56c698c775-zlnwj" podUID="8683ca58-b437-4d2d-aa45-13789ebc0cb4" Oct 08 13:13:01 crc kubenswrapper[4735]: E1008 13:13:01.665256 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/nova-operator-controller-manager-5df598886f-v29rs" podUID="c5d09b12-6911-4d37-8210-d067b6220b73" Oct 08 13:13:01 crc kubenswrapper[4735]: E1008 13:13:01.681264 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-c5274" podUID="9a5fdef5-1d5b-40d0-9675-da52d2a07d80" Oct 08 13:13:01 crc kubenswrapper[4735]: E1008 13:13:01.685622 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-4vb9q" podUID="157b7f16-11b3-452f-8c03-287aeefb2631" Oct 08 13:13:01 crc kubenswrapper[4735]: I1008 13:13:01.844772 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-747747dfccm774j"] Oct 08 13:13:02 crc kubenswrapper[4735]: I1008 13:13:02.117442 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-c5274" 
event={"ID":"9a5fdef5-1d5b-40d0-9675-da52d2a07d80","Type":"ContainerStarted","Data":"44e87f7bcaf54be018826f33977d8742f6e7f99245c0c5e5769c5b2ce3c28f2b"} Oct 08 13:13:02 crc kubenswrapper[4735]: I1008 13:13:02.117734 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-c5274" event={"ID":"9a5fdef5-1d5b-40d0-9675-da52d2a07d80","Type":"ContainerStarted","Data":"7ff7740a272cbd115eef2df50268835b0579d15124abcea490ccf3a16c00bbcd"} Oct 08 13:13:02 crc kubenswrapper[4735]: E1008 13:13:02.122530 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:33652e75a03a058769019fe8d8c51585a6eeefef5e1ecb96f9965434117954f2\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-c5274" podUID="9a5fdef5-1d5b-40d0-9675-da52d2a07d80" Oct 08 13:13:02 crc kubenswrapper[4735]: I1008 13:13:02.130762 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-pqk7r" event={"ID":"a0d2585e-cf4a-448e-8739-1591837c0781","Type":"ContainerStarted","Data":"e91d93ee48aa94b185b3fb26afcf1e82ac0aa557e269c00213a17e01354217be"} Oct 08 13:13:02 crc kubenswrapper[4735]: I1008 13:13:02.133968 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-98fgg" event={"ID":"7899c8e4-5194-486e-b373-b27edda3e745","Type":"ContainerStarted","Data":"740dbfce6ad8e7a00f64eef8eed3e2bb8f69712ffc6adffbc739859adb8e114d"} Oct 08 13:13:02 crc kubenswrapper[4735]: I1008 13:13:02.137396 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-xkw5z" 
event={"ID":"1fc477a5-eda8-4dbd-87e8-83e4294edc0e","Type":"ContainerStarted","Data":"05ccfe9b2969c1f2b8fd8494df503f69b9ba41c9a239057e101a9868cc63edfe"} Oct 08 13:13:02 crc kubenswrapper[4735]: I1008 13:13:02.141794 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-7794bc6bd-wfddw" event={"ID":"7aa296d7-2605-498a-b618-5569accc5974","Type":"ContainerStarted","Data":"b305a356798cea55bef0fe16190bb5ed44f6cdfd557c9cce49bd7e80dbbab7db"} Oct 08 13:13:02 crc kubenswrapper[4735]: I1008 13:13:02.144458 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76796d4c6b-dmw5m" event={"ID":"11418f7b-2467-49f3-ad5d-1786f70c023b","Type":"ContainerStarted","Data":"b3ff04c3701690b8e96c25b3b0057abc9d6b7aa55a67878173ab7372d7a75e7b"} Oct 08 13:13:02 crc kubenswrapper[4735]: I1008 13:13:02.145623 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-7k9wv" event={"ID":"d7319151-d690-4595-9b86-72b7d5b812aa","Type":"ContainerStarted","Data":"6cdb9072e272100fc37ed4f8f97ba248564eea75c40c717686548b8bdf67c95d"} Oct 08 13:13:02 crc kubenswrapper[4735]: I1008 13:13:02.147069 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-79db49b9fb-gnl2m" event={"ID":"46d5fbd9-81ff-4a1e-96c7-01df9c2bd91a","Type":"ContainerStarted","Data":"4e6519d970f5f96916bcdb22b77e44e4f70cf4488b308994a4e96a52bad6f7b3"} Oct 08 13:13:02 crc kubenswrapper[4735]: E1008 13:13:02.158149 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:b2e9acf568a48c28cf2aed6012e432eeeb7d5f0eb11878fc91b62bc34cba10cd\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5df598886f-v29rs" 
podUID="c5d09b12-6911-4d37-8210-d067b6220b73" Oct 08 13:13:02 crc kubenswrapper[4735]: I1008 13:13:02.159409 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5df598886f-v29rs" event={"ID":"c5d09b12-6911-4d37-8210-d067b6220b73","Type":"ContainerStarted","Data":"94614707283bda6551198d97a9e31de15949b5cba8c62fdc75f8426ed95aa563"} Oct 08 13:13:02 crc kubenswrapper[4735]: I1008 13:13:02.159513 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5df598886f-v29rs" event={"ID":"c5d09b12-6911-4d37-8210-d067b6220b73","Type":"ContainerStarted","Data":"864215cd73d4fb8c1621495bdca1945d3c85307ca2e77c447e3fc791482e2b08"} Oct 08 13:13:02 crc kubenswrapper[4735]: I1008 13:13:02.160011 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-rmpsl" event={"ID":"31d49be4-108d-4004-973f-0fd539dd1037","Type":"ContainerStarted","Data":"0faf745e059dc54e1575ab7decca755fe813abd0f7f1d178b9501a9a6d15b7f0"} Oct 08 13:13:02 crc kubenswrapper[4735]: I1008 13:13:02.162009 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-4vb9q" event={"ID":"157b7f16-11b3-452f-8c03-287aeefb2631","Type":"ContainerStarted","Data":"5198903afb057e1ceeb1953c8a94aab6a6b87daac28709e5843a6258f786e194"} Oct 08 13:13:02 crc kubenswrapper[4735]: I1008 13:13:02.162031 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-4vb9q" event={"ID":"157b7f16-11b3-452f-8c03-287aeefb2631","Type":"ContainerStarted","Data":"f1d72bb70e3f25ea8099928119ff3a9b0327a082c5dac9e63acd466328fd0c06"} Oct 08 13:13:02 crc kubenswrapper[4735]: E1008 13:13:02.163417 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling 
image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d33c1f507e1f5b9a4bf226ad98917e92101ac66b36e19d35cbe04ae7014f6bff\\\"\"" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-4vb9q" podUID="157b7f16-11b3-452f-8c03-287aeefb2631" Oct 08 13:13:02 crc kubenswrapper[4735]: I1008 13:13:02.163420 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-r786m" event={"ID":"a1edbfe6-4871-4748-a468-6c6fcd2db21a","Type":"ContainerStarted","Data":"6d40b0f53b4ab0f8db445749caf222e6bbfa9d96e55912be48ac3d7f96d9d1c3"} Oct 08 13:13:02 crc kubenswrapper[4735]: I1008 13:13:02.166327 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-8bc6b8f5b-2z94k" event={"ID":"d2b6aa63-7896-4d74-aced-bd70743b04a6","Type":"ContainerStarted","Data":"182d543ad42d3bd520b167790797995789deb8f4e19e695f4c5e127b87844b50"} Oct 08 13:13:02 crc kubenswrapper[4735]: I1008 13:13:02.166389 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-8bc6b8f5b-2z94k" event={"ID":"d2b6aa63-7896-4d74-aced-bd70743b04a6","Type":"ContainerStarted","Data":"55e538ddbf62faa3dca4e9f0cb0d78878d87f483127e33d610ecf9862f3c5f2e"} Oct 08 13:13:02 crc kubenswrapper[4735]: I1008 13:13:02.169189 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-56c698c775-zlnwj" event={"ID":"8683ca58-b437-4d2d-aa45-13789ebc0cb4","Type":"ContainerStarted","Data":"a0f7c966e35a1a17da9f8741fbbfcceab679fcbbecb13c5cac3df815a74ea376"} Oct 08 13:13:02 crc kubenswrapper[4735]: I1008 13:13:02.169234 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-56c698c775-zlnwj" 
event={"ID":"8683ca58-b437-4d2d-aa45-13789ebc0cb4","Type":"ContainerStarted","Data":"842e359b6a9978857371092fc1c10d3e765fc5308689b455b6b420cf29321ce5"} Oct 08 13:13:02 crc kubenswrapper[4735]: E1008 13:13:02.176189 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:efa8fb78cffb573d299ffcc7bab1099affd2dbbab222152092b313074306e0a9\\\"\"" pod="openstack-operators/test-operator-controller-manager-56c698c775-zlnwj" podUID="8683ca58-b437-4d2d-aa45-13789ebc0cb4" Oct 08 13:13:02 crc kubenswrapper[4735]: E1008 13:13:02.879991 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-rmpsl" podUID="31d49be4-108d-4004-973f-0fd539dd1037" Oct 08 13:13:03 crc kubenswrapper[4735]: I1008 13:13:03.178280 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-rmpsl" event={"ID":"31d49be4-108d-4004-973f-0fd539dd1037","Type":"ContainerStarted","Data":"a49c5c581d583986edd41b2cf667a49b96b18174ce923ae1ad1e0685b5ece875"} Oct 08 13:13:03 crc kubenswrapper[4735]: E1008 13:13:03.179670 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:4b4a17fe08ce00e375afaaec6a28835f5c1784f03d11c4558376ac04130f3a9e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-rmpsl" podUID="31d49be4-108d-4004-973f-0fd539dd1037" Oct 08 13:13:03 crc kubenswrapper[4735]: I1008 13:13:03.182266 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-747747dfccm774j" 
event={"ID":"ad73d925-823e-437b-ba07-9125f43b98b7","Type":"ContainerStarted","Data":"c18ad17f8fe92e52164709b828d2842025e006e4f654992b19da58bc618e707c"} Oct 08 13:13:03 crc kubenswrapper[4735]: E1008 13:13:03.185308 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:efa8fb78cffb573d299ffcc7bab1099affd2dbbab222152092b313074306e0a9\\\"\"" pod="openstack-operators/test-operator-controller-manager-56c698c775-zlnwj" podUID="8683ca58-b437-4d2d-aa45-13789ebc0cb4" Oct 08 13:13:03 crc kubenswrapper[4735]: E1008 13:13:03.185313 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:b2e9acf568a48c28cf2aed6012e432eeeb7d5f0eb11878fc91b62bc34cba10cd\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5df598886f-v29rs" podUID="c5d09b12-6911-4d37-8210-d067b6220b73" Oct 08 13:13:03 crc kubenswrapper[4735]: E1008 13:13:03.185349 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:33652e75a03a058769019fe8d8c51585a6eeefef5e1ecb96f9965434117954f2\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-c5274" podUID="9a5fdef5-1d5b-40d0-9675-da52d2a07d80" Oct 08 13:13:03 crc kubenswrapper[4735]: E1008 13:13:03.185356 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d33c1f507e1f5b9a4bf226ad98917e92101ac66b36e19d35cbe04ae7014f6bff\\\"\"" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-4vb9q" 
podUID="157b7f16-11b3-452f-8c03-287aeefb2631" Oct 08 13:13:04 crc kubenswrapper[4735]: E1008 13:13:04.193912 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:4b4a17fe08ce00e375afaaec6a28835f5c1784f03d11c4558376ac04130f3a9e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-rmpsl" podUID="31d49be4-108d-4004-973f-0fd539dd1037" Oct 08 13:13:12 crc kubenswrapper[4735]: I1008 13:13:12.258414 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84b9b84486-rzjz4" event={"ID":"0bdd70f2-2b72-4787-8434-1b3218cf5f68","Type":"ContainerStarted","Data":"a822e7669b3878a24f8af31174ca073ffb63a45130e38c1f809d1ad98fa189f9"} Oct 08 13:13:12 crc kubenswrapper[4735]: I1008 13:13:12.274063 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-vbvt8" event={"ID":"38e7b3e3-9c02-4737-b500-7a9c01d5e92a","Type":"ContainerStarted","Data":"1292df0c7abfbc43c15e62c64f22959fc026d181b84d29c8154b0198bd8f58e3"} Oct 08 13:13:12 crc kubenswrapper[4735]: I1008 13:13:12.289137 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-wm2bm" event={"ID":"ea14d44e-0e35-42d3-8618-cc49e1888e79","Type":"ContainerStarted","Data":"8d2b74e82ab3009ac6f4bdd886e40df5b7f1706cd0bda4774d27ee37f2c97680"} Oct 08 13:13:12 crc kubenswrapper[4735]: I1008 13:13:12.311095 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-6jbp9" event={"ID":"18146224-6500-499d-9fee-7d751700c432","Type":"ContainerStarted","Data":"4830e243048aced3b2c8c65ca505cd5fea329eefba86b60306e491f275ce8f5d"} Oct 08 13:13:12 crc kubenswrapper[4735]: I1008 13:13:12.314520 4735 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-xkw5z" event={"ID":"1fc477a5-eda8-4dbd-87e8-83e4294edc0e","Type":"ContainerStarted","Data":"fb07b1203045e7ce647177cdf4cd085bd7464071381d95744ef92c8697a5aca4"} Oct 08 13:13:12 crc kubenswrapper[4735]: I1008 13:13:12.334661 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-9rbsk" event={"ID":"b5933297-7fc6-41f5-9ac0-55e3ada0e45e","Type":"ContainerStarted","Data":"b470d3d8c79eb9b0ec926afacc16e537b564b53e2edfdd04ee9609634919d853"} Oct 08 13:13:12 crc kubenswrapper[4735]: I1008 13:13:12.343826 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76796d4c6b-dmw5m" event={"ID":"11418f7b-2467-49f3-ad5d-1786f70c023b","Type":"ContainerStarted","Data":"ec1daf4aac4031fe6a264c4982c7b13235a9a6e5699fbcac86d7e53d1017e2c1"} Oct 08 13:13:12 crc kubenswrapper[4735]: I1008 13:13:12.358135 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-7k9wv" event={"ID":"d7319151-d690-4595-9b86-72b7d5b812aa","Type":"ContainerStarted","Data":"0ad2a108b10b991797c25a8a1a629040fb4efd4151947f97fd2a5c7923ed349c"} Oct 08 13:13:12 crc kubenswrapper[4735]: I1008 13:13:12.383831 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-747747dfccm774j" event={"ID":"ad73d925-823e-437b-ba07-9125f43b98b7","Type":"ContainerStarted","Data":"70ccdb10d2da58ae96f707a66d6cc8671144fa022261ccbc2ccd1daaac7334d2"} Oct 08 13:13:12 crc kubenswrapper[4735]: I1008 13:13:12.437059 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-wvpgj" 
event={"ID":"4e11d5ed-b404-4449-88f2-428ee510aa3f","Type":"ContainerStarted","Data":"fc6d3207f610fd4b24d4a965a1cd65c286603a01ef8fd8066af7dbc01c85239d"} Oct 08 13:13:12 crc kubenswrapper[4735]: I1008 13:13:12.560126 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-8bc6b8f5b-2z94k" event={"ID":"d2b6aa63-7896-4d74-aced-bd70743b04a6","Type":"ContainerStarted","Data":"7ea841a797833fc86ad2d8e8a2263fff9493e131dbbeb1d127b554f7db0b7b14"} Oct 08 13:13:12 crc kubenswrapper[4735]: I1008 13:13:12.562090 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-8bc6b8f5b-2z94k" Oct 08 13:13:12 crc kubenswrapper[4735]: I1008 13:13:12.580357 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-8bc6b8f5b-2z94k" Oct 08 13:13:12 crc kubenswrapper[4735]: I1008 13:13:12.741426 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-8bc6b8f5b-2z94k" podStartSLOduration=13.741403403 podStartE2EDuration="13.741403403s" podCreationTimestamp="2025-10-08 13:12:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 13:13:12.718056213 +0000 UTC m=+1042.526360757" watchObservedRunningTime="2025-10-08 13:13:12.741403403 +0000 UTC m=+1042.549707947" Oct 08 13:13:13 crc kubenswrapper[4735]: I1008 13:13:13.573037 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-7k9wv" event={"ID":"d7319151-d690-4595-9b86-72b7d5b812aa","Type":"ContainerStarted","Data":"c69330d9c80c0dad43a9ce23f5c56f4907608bc09ac53945cb17c7ec2c367f71"} Oct 08 13:13:13 crc kubenswrapper[4735]: I1008 13:13:13.573439 4735 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-7k9wv" Oct 08 13:13:13 crc kubenswrapper[4735]: I1008 13:13:13.575087 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-79db49b9fb-gnl2m" event={"ID":"46d5fbd9-81ff-4a1e-96c7-01df9c2bd91a","Type":"ContainerStarted","Data":"107f1162c0622f95b2dd8c17eb4f49540de9962a854540c363533cee89d08d71"} Oct 08 13:13:13 crc kubenswrapper[4735]: I1008 13:13:13.575116 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-79db49b9fb-gnl2m" event={"ID":"46d5fbd9-81ff-4a1e-96c7-01df9c2bd91a","Type":"ContainerStarted","Data":"586b5fd065b549bd8714cc337dbc02657008cc515fa20b060d9837938b57e1ef"} Oct 08 13:13:13 crc kubenswrapper[4735]: I1008 13:13:13.575157 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-79db49b9fb-gnl2m" Oct 08 13:13:13 crc kubenswrapper[4735]: I1008 13:13:13.576611 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-bsvbv" event={"ID":"0c811b08-74de-4fd3-a06b-5eb2021544a9","Type":"ContainerStarted","Data":"59e0fa02264e91ee5f57dc6168cabf7ebf98a535a77dc0d70744e5f88e3f20c0"} Oct 08 13:13:13 crc kubenswrapper[4735]: I1008 13:13:13.578380 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84b9b84486-rzjz4" event={"ID":"0bdd70f2-2b72-4787-8434-1b3218cf5f68","Type":"ContainerStarted","Data":"b8e3b6b9924e8dce08af1331774304fe1c34c5427e843c0683f9be06340e778e"} Oct 08 13:13:13 crc kubenswrapper[4735]: I1008 13:13:13.579166 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-84b9b84486-rzjz4" Oct 08 13:13:13 crc kubenswrapper[4735]: I1008 13:13:13.581132 4735 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-xkw5z" event={"ID":"1fc477a5-eda8-4dbd-87e8-83e4294edc0e","Type":"ContainerStarted","Data":"db06877f8693021cbd7280c479dabd070bb071f477fe984ced56b6a9076a7861"} Oct 08 13:13:13 crc kubenswrapper[4735]: I1008 13:13:13.581829 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-xkw5z" Oct 08 13:13:13 crc kubenswrapper[4735]: I1008 13:13:13.584010 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-7794bc6bd-wfddw" event={"ID":"7aa296d7-2605-498a-b618-5569accc5974","Type":"ContainerStarted","Data":"df390fc538cfa044dc07f6b271af2d4dde024689ccbb3a862e7114ca8c506d64"} Oct 08 13:13:13 crc kubenswrapper[4735]: I1008 13:13:13.585366 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-pqk7r" event={"ID":"a0d2585e-cf4a-448e-8739-1591837c0781","Type":"ContainerStarted","Data":"e560ccb44f5b79f46b26214751757b73e2975773c863fd621d4b53c4e7dca65e"} Oct 08 13:13:13 crc kubenswrapper[4735]: I1008 13:13:13.590402 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-wvpgj" event={"ID":"4e11d5ed-b404-4449-88f2-428ee510aa3f","Type":"ContainerStarted","Data":"e610c565b0dc47716ca282f4ad84547091abfd10a9a4c5290886899e6134f465"} Oct 08 13:13:13 crc kubenswrapper[4735]: I1008 13:13:13.591094 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-wvpgj" Oct 08 13:13:13 crc kubenswrapper[4735]: I1008 13:13:13.593012 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-747747dfccm774j" 
event={"ID":"ad73d925-823e-437b-ba07-9125f43b98b7","Type":"ContainerStarted","Data":"6887ece9738418102a7cd50f2a34fbe0b937d18b776db3d9d8bf3931e6028b12"} Oct 08 13:13:13 crc kubenswrapper[4735]: I1008 13:13:13.593635 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-747747dfccm774j" Oct 08 13:13:13 crc kubenswrapper[4735]: I1008 13:13:13.596530 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-9rbsk" event={"ID":"b5933297-7fc6-41f5-9ac0-55e3ada0e45e","Type":"ContainerStarted","Data":"e279c6f0c7df26d75065b50502a4f3e21661b08473fc513e7b8641a185e7834e"} Oct 08 13:13:13 crc kubenswrapper[4735]: I1008 13:13:13.597083 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-9rbsk" Oct 08 13:13:13 crc kubenswrapper[4735]: I1008 13:13:13.599637 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-522n9" event={"ID":"9fc075dd-f4c6-4185-bed2-a27ff7e638e6","Type":"ContainerStarted","Data":"55422de38932d8fc7d1e671e4ae75df4a9710ea0a263e8ac6561455dd4200090"} Oct 08 13:13:13 crc kubenswrapper[4735]: I1008 13:13:13.601840 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-vbvt8" event={"ID":"38e7b3e3-9c02-4737-b500-7a9c01d5e92a","Type":"ContainerStarted","Data":"64838808d0e5c98686efa20f492c4957dac8d4badeb04d3f4234959ffa0e2cf7"} Oct 08 13:13:13 crc kubenswrapper[4735]: I1008 13:13:13.602427 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-vbvt8" Oct 08 13:13:13 crc kubenswrapper[4735]: I1008 13:13:13.603927 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-98fgg" event={"ID":"7899c8e4-5194-486e-b373-b27edda3e745","Type":"ContainerStarted","Data":"441bf170f59de36bb239239f2b70f64105462d727738540aa4e50c8e8a0ba2e2"} Oct 08 13:13:13 crc kubenswrapper[4735]: I1008 13:13:13.605577 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-6jbp9" event={"ID":"18146224-6500-499d-9fee-7d751700c432","Type":"ContainerStarted","Data":"78e52ccd591975148c0645c4c554820491da129194471e1fa83526dbe92ecfe1"} Oct 08 13:13:13 crc kubenswrapper[4735]: I1008 13:13:13.606108 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-6jbp9" Oct 08 13:13:13 crc kubenswrapper[4735]: I1008 13:13:13.607838 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-r786m" event={"ID":"a1edbfe6-4871-4748-a468-6c6fcd2db21a","Type":"ContainerStarted","Data":"394b88a6b03cbc983dfefb0811a4e0468eb5be345d69d4153d34fb2fa49c7598"} Oct 08 13:13:13 crc kubenswrapper[4735]: I1008 13:13:13.658998 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-wvpgj" podStartSLOduration=7.164557974 podStartE2EDuration="15.658980588s" podCreationTimestamp="2025-10-08 13:12:58 +0000 UTC" firstStartedPulling="2025-10-08 13:13:00.082293052 +0000 UTC m=+1029.890597596" lastFinishedPulling="2025-10-08 13:13:08.576715666 +0000 UTC m=+1038.385020210" observedRunningTime="2025-10-08 13:13:13.657610714 +0000 UTC m=+1043.465915258" watchObservedRunningTime="2025-10-08 13:13:13.658980588 +0000 UTC m=+1043.467285132" Oct 08 13:13:13 crc kubenswrapper[4735]: I1008 13:13:13.661283 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-7k9wv" podStartSLOduration=4.37181688 podStartE2EDuration="14.661277759s" podCreationTimestamp="2025-10-08 13:12:59 +0000 UTC" firstStartedPulling="2025-10-08 13:13:01.309338531 +0000 UTC m=+1031.117643075" lastFinishedPulling="2025-10-08 13:13:11.59879941 +0000 UTC m=+1041.407103954" observedRunningTime="2025-10-08 13:13:13.626817972 +0000 UTC m=+1043.435122516" watchObservedRunningTime="2025-10-08 13:13:13.661277759 +0000 UTC m=+1043.469582303" Oct 08 13:13:13 crc kubenswrapper[4735]: I1008 13:13:13.737688 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-6jbp9" podStartSLOduration=6.747418366 podStartE2EDuration="15.737668893s" podCreationTimestamp="2025-10-08 13:12:58 +0000 UTC" firstStartedPulling="2025-10-08 13:13:00.839085682 +0000 UTC m=+1030.647390226" lastFinishedPulling="2025-10-08 13:13:09.829336209 +0000 UTC m=+1039.637640753" observedRunningTime="2025-10-08 13:13:13.694245439 +0000 UTC m=+1043.502549983" watchObservedRunningTime="2025-10-08 13:13:13.737668893 +0000 UTC m=+1043.545973437" Oct 08 13:13:13 crc kubenswrapper[4735]: I1008 13:13:13.741043 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-9rbsk" podStartSLOduration=7.537516115 podStartE2EDuration="15.741031422s" podCreationTimestamp="2025-10-08 13:12:58 +0000 UTC" firstStartedPulling="2025-10-08 13:12:59.950337021 +0000 UTC m=+1029.758641565" lastFinishedPulling="2025-10-08 13:13:08.153852328 +0000 UTC m=+1037.962156872" observedRunningTime="2025-10-08 13:13:13.736053084 +0000 UTC m=+1043.544357628" watchObservedRunningTime="2025-10-08 13:13:13.741031422 +0000 UTC m=+1043.549335956" Oct 08 13:13:13 crc kubenswrapper[4735]: I1008 13:13:13.779055 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-pqk7r" podStartSLOduration=4.49676077 podStartE2EDuration="14.779033051s" podCreationTimestamp="2025-10-08 13:12:59 +0000 UTC" firstStartedPulling="2025-10-08 13:13:01.336334148 +0000 UTC m=+1031.144638692" lastFinishedPulling="2025-10-08 13:13:11.618606429 +0000 UTC m=+1041.426910973" observedRunningTime="2025-10-08 13:13:13.774670154 +0000 UTC m=+1043.582974708" watchObservedRunningTime="2025-10-08 13:13:13.779033051 +0000 UTC m=+1043.587337605" Oct 08 13:13:13 crc kubenswrapper[4735]: I1008 13:13:13.818797 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-xkw5z" podStartSLOduration=5.655711266 podStartE2EDuration="15.81877681s" podCreationTimestamp="2025-10-08 13:12:58 +0000 UTC" firstStartedPulling="2025-10-08 13:13:01.40358216 +0000 UTC m=+1031.211886704" lastFinishedPulling="2025-10-08 13:13:11.566647704 +0000 UTC m=+1041.374952248" observedRunningTime="2025-10-08 13:13:13.813883624 +0000 UTC m=+1043.622188168" watchObservedRunningTime="2025-10-08 13:13:13.81877681 +0000 UTC m=+1043.627081364" Oct 08 13:13:13 crc kubenswrapper[4735]: I1008 13:13:13.844453 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-vbvt8" podStartSLOduration=5.001667844 podStartE2EDuration="15.844436931s" podCreationTimestamp="2025-10-08 13:12:58 +0000 UTC" firstStartedPulling="2025-10-08 13:13:00.680536055 +0000 UTC m=+1030.488840599" lastFinishedPulling="2025-10-08 13:13:11.523305142 +0000 UTC m=+1041.331609686" observedRunningTime="2025-10-08 13:13:13.842550398 +0000 UTC m=+1043.650854942" watchObservedRunningTime="2025-10-08 13:13:13.844436931 +0000 UTC m=+1043.652741475" Oct 08 13:13:13 crc kubenswrapper[4735]: I1008 13:13:13.863629 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/ovn-operator-controller-manager-79db49b9fb-gnl2m" podStartSLOduration=4.567771228 podStartE2EDuration="14.863609879s" podCreationTimestamp="2025-10-08 13:12:59 +0000 UTC" firstStartedPulling="2025-10-08 13:13:01.323380398 +0000 UTC m=+1031.131684942" lastFinishedPulling="2025-10-08 13:13:11.619219049 +0000 UTC m=+1041.427523593" observedRunningTime="2025-10-08 13:13:13.860925222 +0000 UTC m=+1043.669229776" watchObservedRunningTime="2025-10-08 13:13:13.863609879 +0000 UTC m=+1043.671914423" Oct 08 13:13:13 crc kubenswrapper[4735]: I1008 13:13:13.896462 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-747747dfccm774j" podStartSLOduration=6.050002406 podStartE2EDuration="14.896443336s" podCreationTimestamp="2025-10-08 13:12:59 +0000 UTC" firstStartedPulling="2025-10-08 13:13:02.76942785 +0000 UTC m=+1032.577732394" lastFinishedPulling="2025-10-08 13:13:11.61586878 +0000 UTC m=+1041.424173324" observedRunningTime="2025-10-08 13:13:13.894832448 +0000 UTC m=+1043.703137002" watchObservedRunningTime="2025-10-08 13:13:13.896443336 +0000 UTC m=+1043.704747870" Oct 08 13:13:14 crc kubenswrapper[4735]: I1008 13:13:14.615564 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76796d4c6b-dmw5m" event={"ID":"11418f7b-2467-49f3-ad5d-1786f70c023b","Type":"ContainerStarted","Data":"99aedca4a34961470b3365dbeba92163d8f9ca5214d9cd9bc0420c91220718e8"} Oct 08 13:13:14 crc kubenswrapper[4735]: I1008 13:13:14.616446 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-76796d4c6b-dmw5m" Oct 08 13:13:14 crc kubenswrapper[4735]: I1008 13:13:14.618185 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-wm2bm" 
event={"ID":"ea14d44e-0e35-42d3-8618-cc49e1888e79","Type":"ContainerStarted","Data":"769da42e1c21f632112fe1095467c133de7b0d7cb1dabd086722044d20052f29"} Oct 08 13:13:14 crc kubenswrapper[4735]: I1008 13:13:14.618405 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-wm2bm" Oct 08 13:13:14 crc kubenswrapper[4735]: I1008 13:13:14.620256 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-98fgg" event={"ID":"7899c8e4-5194-486e-b373-b27edda3e745","Type":"ContainerStarted","Data":"0bae3c913a8bf0212ae32d156795e6600e1c8724bbe50f4736622023d91307b7"} Oct 08 13:13:14 crc kubenswrapper[4735]: I1008 13:13:14.620281 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-98fgg" Oct 08 13:13:14 crc kubenswrapper[4735]: I1008 13:13:14.621961 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-bsvbv" event={"ID":"0c811b08-74de-4fd3-a06b-5eb2021544a9","Type":"ContainerStarted","Data":"c9d9493137262c4f39125c6249dc7329593d76f02ea8846fe76a2bbf6666c04c"} Oct 08 13:13:14 crc kubenswrapper[4735]: I1008 13:13:14.622094 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-bsvbv" Oct 08 13:13:14 crc kubenswrapper[4735]: I1008 13:13:14.623663 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-r786m" event={"ID":"a1edbfe6-4871-4748-a468-6c6fcd2db21a","Type":"ContainerStarted","Data":"b1d625a3dff9c9f0c713eede1266c7d2a9f5753effe6e237e0f35226ae5a753e"} Oct 08 13:13:14 crc kubenswrapper[4735]: I1008 13:13:14.623823 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-r786m" Oct 08 13:13:14 crc kubenswrapper[4735]: I1008 13:13:14.625260 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-7794bc6bd-wfddw" event={"ID":"7aa296d7-2605-498a-b618-5569accc5974","Type":"ContainerStarted","Data":"d8a9b3512fa89dd1859c96d9efc7ea788bfac3b22706e69956069e33af8a8854"} Oct 08 13:13:14 crc kubenswrapper[4735]: I1008 13:13:14.625518 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-7794bc6bd-wfddw" Oct 08 13:13:14 crc kubenswrapper[4735]: I1008 13:13:14.626959 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-522n9" event={"ID":"9fc075dd-f4c6-4185-bed2-a27ff7e638e6","Type":"ContainerStarted","Data":"63ce84c9a03c37fc9c50d8a8f87aaef2a13c0b46713734c89834ac6961dad550"} Oct 08 13:13:14 crc kubenswrapper[4735]: I1008 13:13:14.629085 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-522n9" Oct 08 13:13:14 crc kubenswrapper[4735]: I1008 13:13:14.632525 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-84b9b84486-rzjz4" podStartSLOduration=7.415430047 podStartE2EDuration="16.632460717s" podCreationTimestamp="2025-10-08 13:12:58 +0000 UTC" firstStartedPulling="2025-10-08 13:13:00.612228767 +0000 UTC m=+1030.420533311" lastFinishedPulling="2025-10-08 13:13:09.829259447 +0000 UTC m=+1039.637563981" observedRunningTime="2025-10-08 13:13:13.92734759 +0000 UTC m=+1043.735652134" watchObservedRunningTime="2025-10-08 13:13:14.632460717 +0000 UTC m=+1044.440765261" Oct 08 13:13:14 crc kubenswrapper[4735]: I1008 13:13:14.635267 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/telemetry-operator-controller-manager-76796d4c6b-dmw5m" podStartSLOduration=5.325777589 podStartE2EDuration="15.635256576s" podCreationTimestamp="2025-10-08 13:12:59 +0000 UTC" firstStartedPulling="2025-10-08 13:13:01.309834274 +0000 UTC m=+1031.118138818" lastFinishedPulling="2025-10-08 13:13:11.619313261 +0000 UTC m=+1041.427617805" observedRunningTime="2025-10-08 13:13:14.631630912 +0000 UTC m=+1044.439935456" watchObservedRunningTime="2025-10-08 13:13:14.635256576 +0000 UTC m=+1044.443561110" Oct 08 13:13:14 crc kubenswrapper[4735]: I1008 13:13:14.647893 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-98fgg" podStartSLOduration=6.405713469 podStartE2EDuration="16.647874028s" podCreationTimestamp="2025-10-08 13:12:58 +0000 UTC" firstStartedPulling="2025-10-08 13:13:01.32304683 +0000 UTC m=+1031.131351374" lastFinishedPulling="2025-10-08 13:13:11.565207389 +0000 UTC m=+1041.373511933" observedRunningTime="2025-10-08 13:13:14.642617275 +0000 UTC m=+1044.450921819" watchObservedRunningTime="2025-10-08 13:13:14.647874028 +0000 UTC m=+1044.456178572" Oct 08 13:13:14 crc kubenswrapper[4735]: I1008 13:13:14.657422 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-522n9" podStartSLOduration=5.048460162 podStartE2EDuration="15.657400446s" podCreationTimestamp="2025-10-08 13:12:59 +0000 UTC" firstStartedPulling="2025-10-08 13:13:00.995340353 +0000 UTC m=+1030.803644887" lastFinishedPulling="2025-10-08 13:13:11.604280627 +0000 UTC m=+1041.412585171" observedRunningTime="2025-10-08 13:13:14.655222217 +0000 UTC m=+1044.463526761" watchObservedRunningTime="2025-10-08 13:13:14.657400446 +0000 UTC m=+1044.465704990" Oct 08 13:13:14 crc kubenswrapper[4735]: I1008 13:13:14.691116 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-bsvbv" podStartSLOduration=5.718081885 podStartE2EDuration="16.691099629s" podCreationTimestamp="2025-10-08 13:12:58 +0000 UTC" firstStartedPulling="2025-10-08 13:13:00.594118839 +0000 UTC m=+1030.402423373" lastFinishedPulling="2025-10-08 13:13:11.567136563 +0000 UTC m=+1041.375441117" observedRunningTime="2025-10-08 13:13:14.689429769 +0000 UTC m=+1044.497734313" watchObservedRunningTime="2025-10-08 13:13:14.691099629 +0000 UTC m=+1044.499404173" Oct 08 13:13:14 crc kubenswrapper[4735]: I1008 13:13:14.722102 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-wm2bm" podStartSLOduration=6.151772681 podStartE2EDuration="16.722080584s" podCreationTimestamp="2025-10-08 13:12:58 +0000 UTC" firstStartedPulling="2025-10-08 13:13:00.995420035 +0000 UTC m=+1030.803724579" lastFinishedPulling="2025-10-08 13:13:11.565727918 +0000 UTC m=+1041.374032482" observedRunningTime="2025-10-08 13:13:14.721802219 +0000 UTC m=+1044.530106763" watchObservedRunningTime="2025-10-08 13:13:14.722080584 +0000 UTC m=+1044.530385128" Oct 08 13:13:14 crc kubenswrapper[4735]: I1008 13:13:14.751995 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-7794bc6bd-wfddw" podStartSLOduration=5.460071152 podStartE2EDuration="15.75197309s" podCreationTimestamp="2025-10-08 13:12:59 +0000 UTC" firstStartedPulling="2025-10-08 13:13:01.337981769 +0000 UTC m=+1031.146286313" lastFinishedPulling="2025-10-08 13:13:11.629883707 +0000 UTC m=+1041.438188251" observedRunningTime="2025-10-08 13:13:14.745077128 +0000 UTC m=+1044.553381672" watchObservedRunningTime="2025-10-08 13:13:14.75197309 +0000 UTC m=+1044.560277634" Oct 08 13:13:14 crc kubenswrapper[4735]: I1008 13:13:14.762032 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-r786m" podStartSLOduration=5.473004638 podStartE2EDuration="15.762015566s" podCreationTimestamp="2025-10-08 13:12:59 +0000 UTC" firstStartedPulling="2025-10-08 13:13:01.30966769 +0000 UTC m=+1031.117972234" lastFinishedPulling="2025-10-08 13:13:11.598678618 +0000 UTC m=+1041.406983162" observedRunningTime="2025-10-08 13:13:14.757922994 +0000 UTC m=+1044.566227538" watchObservedRunningTime="2025-10-08 13:13:14.762015566 +0000 UTC m=+1044.570320110" Oct 08 13:13:19 crc kubenswrapper[4735]: I1008 13:13:19.170540 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-wvpgj" Oct 08 13:13:19 crc kubenswrapper[4735]: I1008 13:13:19.195232 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-9rbsk" Oct 08 13:13:19 crc kubenswrapper[4735]: I1008 13:13:19.248516 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-84b9b84486-rzjz4" Oct 08 13:13:19 crc kubenswrapper[4735]: I1008 13:13:19.255604 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-bsvbv" Oct 08 13:13:19 crc kubenswrapper[4735]: I1008 13:13:19.301787 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-vbvt8" Oct 08 13:13:19 crc kubenswrapper[4735]: I1008 13:13:19.441380 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-6jbp9" Oct 08 13:13:19 crc kubenswrapper[4735]: I1008 13:13:19.473092 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-wm2bm" Oct 08 13:13:19 crc kubenswrapper[4735]: I1008 13:13:19.578620 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-r786m" Oct 08 13:13:19 crc kubenswrapper[4735]: I1008 13:13:19.598334 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-522n9" Oct 08 13:13:19 crc kubenswrapper[4735]: I1008 13:13:19.643264 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-98fgg" Oct 08 13:13:19 crc kubenswrapper[4735]: I1008 13:13:19.671301 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-7k9wv" Oct 08 13:13:19 crc kubenswrapper[4735]: I1008 13:13:19.726273 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-79db49b9fb-gnl2m" Oct 08 13:13:19 crc kubenswrapper[4735]: I1008 13:13:19.924711 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-76796d4c6b-dmw5m" Oct 08 13:13:20 crc kubenswrapper[4735]: I1008 13:13:20.018366 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-xkw5z" Oct 08 13:13:20 crc kubenswrapper[4735]: I1008 13:13:20.173979 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-7794bc6bd-wfddw" Oct 08 13:13:21 crc kubenswrapper[4735]: I1008 13:13:21.287413 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-747747dfccm774j" Oct 08 13:13:24 crc kubenswrapper[4735]: I1008 13:13:24.878915 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 13:13:24 crc kubenswrapper[4735]: I1008 13:13:24.879725 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 13:13:24 crc kubenswrapper[4735]: I1008 13:13:24.879791 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" Oct 08 13:13:24 crc kubenswrapper[4735]: I1008 13:13:24.880668 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"933a14f7e22905eb9791da68c8e9bb68f5475b7d23363338a177f03a39161d6c"} pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 13:13:24 crc kubenswrapper[4735]: I1008 13:13:24.880725 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" containerID="cri-o://933a14f7e22905eb9791da68c8e9bb68f5475b7d23363338a177f03a39161d6c" gracePeriod=600 Oct 08 13:13:25 crc kubenswrapper[4735]: I1008 13:13:25.732883 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/test-operator-controller-manager-56c698c775-zlnwj" event={"ID":"8683ca58-b437-4d2d-aa45-13789ebc0cb4","Type":"ContainerStarted","Data":"36d920cf910729d4f26f9a95af7c9cd439d4d02892a88d8f7b207f89da0de6ed"} Oct 08 13:13:25 crc kubenswrapper[4735]: I1008 13:13:25.733415 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-56c698c775-zlnwj" Oct 08 13:13:25 crc kubenswrapper[4735]: I1008 13:13:25.734858 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5df598886f-v29rs" event={"ID":"c5d09b12-6911-4d37-8210-d067b6220b73","Type":"ContainerStarted","Data":"2c822ee4e0e72aa606b775b0387fb4e066f32f034acd869c78885f8480630a8c"} Oct 08 13:13:25 crc kubenswrapper[4735]: I1008 13:13:25.735015 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5df598886f-v29rs" Oct 08 13:13:25 crc kubenswrapper[4735]: I1008 13:13:25.736461 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-c5274" event={"ID":"9a5fdef5-1d5b-40d0-9675-da52d2a07d80","Type":"ContainerStarted","Data":"fa002393461707a1f0f9451817797c245d08f8fcbd44b4af4e527c800a215165"} Oct 08 13:13:25 crc kubenswrapper[4735]: I1008 13:13:25.736654 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-c5274" Oct 08 13:13:25 crc kubenswrapper[4735]: I1008 13:13:25.738051 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-rmpsl" event={"ID":"31d49be4-108d-4004-973f-0fd539dd1037","Type":"ContainerStarted","Data":"fbe574e33f0d2479686772088382d1f490e7b12f339d8bf12c19a89838ee0db8"} Oct 08 13:13:25 crc kubenswrapper[4735]: I1008 13:13:25.738646 4735 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-rmpsl" Oct 08 13:13:25 crc kubenswrapper[4735]: I1008 13:13:25.741250 4735 generic.go:334] "Generic (PLEG): container finished" podID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerID="933a14f7e22905eb9791da68c8e9bb68f5475b7d23363338a177f03a39161d6c" exitCode=0 Oct 08 13:13:25 crc kubenswrapper[4735]: I1008 13:13:25.741398 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" event={"ID":"2dfd0474-f64a-4d3b-9990-df182d079b10","Type":"ContainerDied","Data":"933a14f7e22905eb9791da68c8e9bb68f5475b7d23363338a177f03a39161d6c"} Oct 08 13:13:25 crc kubenswrapper[4735]: I1008 13:13:25.741561 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" event={"ID":"2dfd0474-f64a-4d3b-9990-df182d079b10","Type":"ContainerStarted","Data":"9f7b3e9d44554f3c931f516096c49ab670e3d36bde045f3e21679319bf30124a"} Oct 08 13:13:25 crc kubenswrapper[4735]: I1008 13:13:25.741623 4735 scope.go:117] "RemoveContainer" containerID="33eef7770cb538234e6d07e0c65f9624f3726c56e48ea2376d339214ceee54ee" Oct 08 13:13:25 crc kubenswrapper[4735]: I1008 13:13:25.744178 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-4vb9q" event={"ID":"157b7f16-11b3-452f-8c03-287aeefb2631","Type":"ContainerStarted","Data":"21711d9e6c3f4a053877c53ce52010756b1a06a144cfdabeebcaf4b1364d18a7"} Oct 08 13:13:25 crc kubenswrapper[4735]: I1008 13:13:25.744635 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-4vb9q" Oct 08 13:13:25 crc kubenswrapper[4735]: I1008 13:13:25.764340 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-56c698c775-zlnwj" 
podStartSLOduration=3.5766591549999998 podStartE2EDuration="26.76432009s" podCreationTimestamp="2025-10-08 13:12:59 +0000 UTC" firstStartedPulling="2025-10-08 13:13:01.338236215 +0000 UTC m=+1031.146540749" lastFinishedPulling="2025-10-08 13:13:24.52589712 +0000 UTC m=+1054.334201684" observedRunningTime="2025-10-08 13:13:25.763662458 +0000 UTC m=+1055.571966992" watchObservedRunningTime="2025-10-08 13:13:25.76432009 +0000 UTC m=+1055.572624634" Oct 08 13:13:25 crc kubenswrapper[4735]: I1008 13:13:25.791819 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-c5274" podStartSLOduration=3.620105953 podStartE2EDuration="26.791795723s" podCreationTimestamp="2025-10-08 13:12:59 +0000 UTC" firstStartedPulling="2025-10-08 13:13:01.354160909 +0000 UTC m=+1031.162465453" lastFinishedPulling="2025-10-08 13:13:24.525850679 +0000 UTC m=+1054.334155223" observedRunningTime="2025-10-08 13:13:25.782621712 +0000 UTC m=+1055.590926276" watchObservedRunningTime="2025-10-08 13:13:25.791795723 +0000 UTC m=+1055.600100277" Oct 08 13:13:25 crc kubenswrapper[4735]: I1008 13:13:25.804457 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5df598886f-v29rs" podStartSLOduration=3.572235362 podStartE2EDuration="26.804434355s" podCreationTimestamp="2025-10-08 13:12:59 +0000 UTC" firstStartedPulling="2025-10-08 13:13:01.345993737 +0000 UTC m=+1031.154298281" lastFinishedPulling="2025-10-08 13:13:24.5781927 +0000 UTC m=+1054.386497274" observedRunningTime="2025-10-08 13:13:25.798269577 +0000 UTC m=+1055.606574131" watchObservedRunningTime="2025-10-08 13:13:25.804434355 +0000 UTC m=+1055.612738899" Oct 08 13:13:25 crc kubenswrapper[4735]: I1008 13:13:25.821165 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-4vb9q" 
podStartSLOduration=3.64658041 podStartE2EDuration="26.821146649s" podCreationTimestamp="2025-10-08 13:12:59 +0000 UTC" firstStartedPulling="2025-10-08 13:13:01.403618901 +0000 UTC m=+1031.211923445" lastFinishedPulling="2025-10-08 13:13:24.57818513 +0000 UTC m=+1054.386489684" observedRunningTime="2025-10-08 13:13:25.817837421 +0000 UTC m=+1055.626141965" watchObservedRunningTime="2025-10-08 13:13:25.821146649 +0000 UTC m=+1055.629451193" Oct 08 13:13:25 crc kubenswrapper[4735]: I1008 13:13:25.850384 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-rmpsl" podStartSLOduration=3.7086632980000003 podStartE2EDuration="26.850363444s" podCreationTimestamp="2025-10-08 13:12:59 +0000 UTC" firstStartedPulling="2025-10-08 13:13:01.383047053 +0000 UTC m=+1031.191351597" lastFinishedPulling="2025-10-08 13:13:24.524747199 +0000 UTC m=+1054.333051743" observedRunningTime="2025-10-08 13:13:25.849264514 +0000 UTC m=+1055.657569058" watchObservedRunningTime="2025-10-08 13:13:25.850363444 +0000 UTC m=+1055.658667988" Oct 08 13:13:29 crc kubenswrapper[4735]: I1008 13:13:29.618930 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-c5274" Oct 08 13:13:29 crc kubenswrapper[4735]: I1008 13:13:29.646535 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5df598886f-v29rs" Oct 08 13:13:29 crc kubenswrapper[4735]: I1008 13:13:29.836459 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-4vb9q" Oct 08 13:13:29 crc kubenswrapper[4735]: I1008 13:13:29.879031 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-rmpsl" Oct 08 13:13:29 crc 
kubenswrapper[4735]: I1008 13:13:29.952443 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-56c698c775-zlnwj" Oct 08 13:13:44 crc kubenswrapper[4735]: I1008 13:13:44.548085 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7bfcb9d745-hxfqv"] Oct 08 13:13:44 crc kubenswrapper[4735]: I1008 13:13:44.551469 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bfcb9d745-hxfqv" Oct 08 13:13:44 crc kubenswrapper[4735]: I1008 13:13:44.553859 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-5pxwj" Oct 08 13:13:44 crc kubenswrapper[4735]: I1008 13:13:44.554501 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Oct 08 13:13:44 crc kubenswrapper[4735]: I1008 13:13:44.554784 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 08 13:13:44 crc kubenswrapper[4735]: I1008 13:13:44.556236 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Oct 08 13:13:44 crc kubenswrapper[4735]: I1008 13:13:44.559652 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bfcb9d745-hxfqv"] Oct 08 13:13:44 crc kubenswrapper[4735]: I1008 13:13:44.568230 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1792d89-af88-4a9a-b874-e8e6e7655c04-config\") pod \"dnsmasq-dns-7bfcb9d745-hxfqv\" (UID: \"a1792d89-af88-4a9a-b874-e8e6e7655c04\") " pod="openstack/dnsmasq-dns-7bfcb9d745-hxfqv" Oct 08 13:13:44 crc kubenswrapper[4735]: I1008 13:13:44.568455 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67h52\" (UniqueName: 
\"kubernetes.io/projected/a1792d89-af88-4a9a-b874-e8e6e7655c04-kube-api-access-67h52\") pod \"dnsmasq-dns-7bfcb9d745-hxfqv\" (UID: \"a1792d89-af88-4a9a-b874-e8e6e7655c04\") " pod="openstack/dnsmasq-dns-7bfcb9d745-hxfqv" Oct 08 13:13:44 crc kubenswrapper[4735]: I1008 13:13:44.600158 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-758b79db4c-dx9nh"] Oct 08 13:13:44 crc kubenswrapper[4735]: I1008 13:13:44.601393 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-758b79db4c-dx9nh" Oct 08 13:13:44 crc kubenswrapper[4735]: I1008 13:13:44.607547 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 08 13:13:44 crc kubenswrapper[4735]: I1008 13:13:44.613724 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-758b79db4c-dx9nh"] Oct 08 13:13:44 crc kubenswrapper[4735]: I1008 13:13:44.670324 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/978b94d0-ee60-4ff0-9721-4d24ac634aa4-dns-svc\") pod \"dnsmasq-dns-758b79db4c-dx9nh\" (UID: \"978b94d0-ee60-4ff0-9721-4d24ac634aa4\") " pod="openstack/dnsmasq-dns-758b79db4c-dx9nh" Oct 08 13:13:44 crc kubenswrapper[4735]: I1008 13:13:44.670413 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67h52\" (UniqueName: \"kubernetes.io/projected/a1792d89-af88-4a9a-b874-e8e6e7655c04-kube-api-access-67h52\") pod \"dnsmasq-dns-7bfcb9d745-hxfqv\" (UID: \"a1792d89-af88-4a9a-b874-e8e6e7655c04\") " pod="openstack/dnsmasq-dns-7bfcb9d745-hxfqv" Oct 08 13:13:44 crc kubenswrapper[4735]: I1008 13:13:44.670445 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dj7qw\" (UniqueName: \"kubernetes.io/projected/978b94d0-ee60-4ff0-9721-4d24ac634aa4-kube-api-access-dj7qw\") pod 
\"dnsmasq-dns-758b79db4c-dx9nh\" (UID: \"978b94d0-ee60-4ff0-9721-4d24ac634aa4\") " pod="openstack/dnsmasq-dns-758b79db4c-dx9nh" Oct 08 13:13:44 crc kubenswrapper[4735]: I1008 13:13:44.670488 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1792d89-af88-4a9a-b874-e8e6e7655c04-config\") pod \"dnsmasq-dns-7bfcb9d745-hxfqv\" (UID: \"a1792d89-af88-4a9a-b874-e8e6e7655c04\") " pod="openstack/dnsmasq-dns-7bfcb9d745-hxfqv" Oct 08 13:13:44 crc kubenswrapper[4735]: I1008 13:13:44.670591 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/978b94d0-ee60-4ff0-9721-4d24ac634aa4-config\") pod \"dnsmasq-dns-758b79db4c-dx9nh\" (UID: \"978b94d0-ee60-4ff0-9721-4d24ac634aa4\") " pod="openstack/dnsmasq-dns-758b79db4c-dx9nh" Oct 08 13:13:44 crc kubenswrapper[4735]: I1008 13:13:44.671342 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1792d89-af88-4a9a-b874-e8e6e7655c04-config\") pod \"dnsmasq-dns-7bfcb9d745-hxfqv\" (UID: \"a1792d89-af88-4a9a-b874-e8e6e7655c04\") " pod="openstack/dnsmasq-dns-7bfcb9d745-hxfqv" Oct 08 13:13:44 crc kubenswrapper[4735]: I1008 13:13:44.694600 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67h52\" (UniqueName: \"kubernetes.io/projected/a1792d89-af88-4a9a-b874-e8e6e7655c04-kube-api-access-67h52\") pod \"dnsmasq-dns-7bfcb9d745-hxfqv\" (UID: \"a1792d89-af88-4a9a-b874-e8e6e7655c04\") " pod="openstack/dnsmasq-dns-7bfcb9d745-hxfqv" Oct 08 13:13:44 crc kubenswrapper[4735]: I1008 13:13:44.771856 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dj7qw\" (UniqueName: \"kubernetes.io/projected/978b94d0-ee60-4ff0-9721-4d24ac634aa4-kube-api-access-dj7qw\") pod \"dnsmasq-dns-758b79db4c-dx9nh\" (UID: 
\"978b94d0-ee60-4ff0-9721-4d24ac634aa4\") " pod="openstack/dnsmasq-dns-758b79db4c-dx9nh" Oct 08 13:13:44 crc kubenswrapper[4735]: I1008 13:13:44.771934 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/978b94d0-ee60-4ff0-9721-4d24ac634aa4-config\") pod \"dnsmasq-dns-758b79db4c-dx9nh\" (UID: \"978b94d0-ee60-4ff0-9721-4d24ac634aa4\") " pod="openstack/dnsmasq-dns-758b79db4c-dx9nh" Oct 08 13:13:44 crc kubenswrapper[4735]: I1008 13:13:44.771989 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/978b94d0-ee60-4ff0-9721-4d24ac634aa4-dns-svc\") pod \"dnsmasq-dns-758b79db4c-dx9nh\" (UID: \"978b94d0-ee60-4ff0-9721-4d24ac634aa4\") " pod="openstack/dnsmasq-dns-758b79db4c-dx9nh" Oct 08 13:13:44 crc kubenswrapper[4735]: I1008 13:13:44.772947 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/978b94d0-ee60-4ff0-9721-4d24ac634aa4-config\") pod \"dnsmasq-dns-758b79db4c-dx9nh\" (UID: \"978b94d0-ee60-4ff0-9721-4d24ac634aa4\") " pod="openstack/dnsmasq-dns-758b79db4c-dx9nh" Oct 08 13:13:44 crc kubenswrapper[4735]: I1008 13:13:44.772992 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/978b94d0-ee60-4ff0-9721-4d24ac634aa4-dns-svc\") pod \"dnsmasq-dns-758b79db4c-dx9nh\" (UID: \"978b94d0-ee60-4ff0-9721-4d24ac634aa4\") " pod="openstack/dnsmasq-dns-758b79db4c-dx9nh" Oct 08 13:13:44 crc kubenswrapper[4735]: I1008 13:13:44.795448 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dj7qw\" (UniqueName: \"kubernetes.io/projected/978b94d0-ee60-4ff0-9721-4d24ac634aa4-kube-api-access-dj7qw\") pod \"dnsmasq-dns-758b79db4c-dx9nh\" (UID: \"978b94d0-ee60-4ff0-9721-4d24ac634aa4\") " pod="openstack/dnsmasq-dns-758b79db4c-dx9nh" Oct 08 13:13:44 crc 
kubenswrapper[4735]: I1008 13:13:44.870750 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bfcb9d745-hxfqv" Oct 08 13:13:44 crc kubenswrapper[4735]: I1008 13:13:44.922770 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-758b79db4c-dx9nh" Oct 08 13:13:45 crc kubenswrapper[4735]: I1008 13:13:45.289865 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bfcb9d745-hxfqv"] Oct 08 13:13:45 crc kubenswrapper[4735]: W1008 13:13:45.290746 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1792d89_af88_4a9a_b874_e8e6e7655c04.slice/crio-08892270b6261b9ab40dda7beb57a1d0fdb9540cb72e6a98ad4c65d77b8f0699 WatchSource:0}: Error finding container 08892270b6261b9ab40dda7beb57a1d0fdb9540cb72e6a98ad4c65d77b8f0699: Status 404 returned error can't find the container with id 08892270b6261b9ab40dda7beb57a1d0fdb9540cb72e6a98ad4c65d77b8f0699 Oct 08 13:13:45 crc kubenswrapper[4735]: I1008 13:13:45.292882 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 08 13:13:45 crc kubenswrapper[4735]: I1008 13:13:45.404845 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-758b79db4c-dx9nh"] Oct 08 13:13:45 crc kubenswrapper[4735]: W1008 13:13:45.410763 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod978b94d0_ee60_4ff0_9721_4d24ac634aa4.slice/crio-e3e165e2a4f663b0912240fe60a2f8d61b24888b4009a023d158c0fe2c8ed640 WatchSource:0}: Error finding container e3e165e2a4f663b0912240fe60a2f8d61b24888b4009a023d158c0fe2c8ed640: Status 404 returned error can't find the container with id e3e165e2a4f663b0912240fe60a2f8d61b24888b4009a023d158c0fe2c8ed640 Oct 08 13:13:45 crc kubenswrapper[4735]: I1008 13:13:45.881192 4735 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-758b79db4c-dx9nh" event={"ID":"978b94d0-ee60-4ff0-9721-4d24ac634aa4","Type":"ContainerStarted","Data":"e3e165e2a4f663b0912240fe60a2f8d61b24888b4009a023d158c0fe2c8ed640"} Oct 08 13:13:45 crc kubenswrapper[4735]: I1008 13:13:45.883852 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bfcb9d745-hxfqv" event={"ID":"a1792d89-af88-4a9a-b874-e8e6e7655c04","Type":"ContainerStarted","Data":"08892270b6261b9ab40dda7beb57a1d0fdb9540cb72e6a98ad4c65d77b8f0699"} Oct 08 13:13:46 crc kubenswrapper[4735]: I1008 13:13:46.608675 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-758b79db4c-dx9nh"] Oct 08 13:13:46 crc kubenswrapper[4735]: I1008 13:13:46.645358 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-644597f84c-48kkw"] Oct 08 13:13:46 crc kubenswrapper[4735]: I1008 13:13:46.647990 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-644597f84c-48kkw" Oct 08 13:13:46 crc kubenswrapper[4735]: I1008 13:13:46.660283 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-644597f84c-48kkw"] Oct 08 13:13:46 crc kubenswrapper[4735]: I1008 13:13:46.707275 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/191f3e26-da78-4391-9f78-66d43c758300-dns-svc\") pod \"dnsmasq-dns-644597f84c-48kkw\" (UID: \"191f3e26-da78-4391-9f78-66d43c758300\") " pod="openstack/dnsmasq-dns-644597f84c-48kkw" Oct 08 13:13:46 crc kubenswrapper[4735]: I1008 13:13:46.707725 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdzrt\" (UniqueName: \"kubernetes.io/projected/191f3e26-da78-4391-9f78-66d43c758300-kube-api-access-zdzrt\") pod \"dnsmasq-dns-644597f84c-48kkw\" (UID: 
\"191f3e26-da78-4391-9f78-66d43c758300\") " pod="openstack/dnsmasq-dns-644597f84c-48kkw" Oct 08 13:13:46 crc kubenswrapper[4735]: I1008 13:13:46.707768 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/191f3e26-da78-4391-9f78-66d43c758300-config\") pod \"dnsmasq-dns-644597f84c-48kkw\" (UID: \"191f3e26-da78-4391-9f78-66d43c758300\") " pod="openstack/dnsmasq-dns-644597f84c-48kkw" Oct 08 13:13:46 crc kubenswrapper[4735]: I1008 13:13:46.816388 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdzrt\" (UniqueName: \"kubernetes.io/projected/191f3e26-da78-4391-9f78-66d43c758300-kube-api-access-zdzrt\") pod \"dnsmasq-dns-644597f84c-48kkw\" (UID: \"191f3e26-da78-4391-9f78-66d43c758300\") " pod="openstack/dnsmasq-dns-644597f84c-48kkw" Oct 08 13:13:46 crc kubenswrapper[4735]: I1008 13:13:46.816812 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/191f3e26-da78-4391-9f78-66d43c758300-config\") pod \"dnsmasq-dns-644597f84c-48kkw\" (UID: \"191f3e26-da78-4391-9f78-66d43c758300\") " pod="openstack/dnsmasq-dns-644597f84c-48kkw" Oct 08 13:13:46 crc kubenswrapper[4735]: I1008 13:13:46.816910 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/191f3e26-da78-4391-9f78-66d43c758300-dns-svc\") pod \"dnsmasq-dns-644597f84c-48kkw\" (UID: \"191f3e26-da78-4391-9f78-66d43c758300\") " pod="openstack/dnsmasq-dns-644597f84c-48kkw" Oct 08 13:13:46 crc kubenswrapper[4735]: I1008 13:13:46.818324 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/191f3e26-da78-4391-9f78-66d43c758300-dns-svc\") pod \"dnsmasq-dns-644597f84c-48kkw\" (UID: \"191f3e26-da78-4391-9f78-66d43c758300\") " pod="openstack/dnsmasq-dns-644597f84c-48kkw" 
Oct 08 13:13:46 crc kubenswrapper[4735]: I1008 13:13:46.819438 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/191f3e26-da78-4391-9f78-66d43c758300-config\") pod \"dnsmasq-dns-644597f84c-48kkw\" (UID: \"191f3e26-da78-4391-9f78-66d43c758300\") " pod="openstack/dnsmasq-dns-644597f84c-48kkw" Oct 08 13:13:46 crc kubenswrapper[4735]: I1008 13:13:46.852015 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdzrt\" (UniqueName: \"kubernetes.io/projected/191f3e26-da78-4391-9f78-66d43c758300-kube-api-access-zdzrt\") pod \"dnsmasq-dns-644597f84c-48kkw\" (UID: \"191f3e26-da78-4391-9f78-66d43c758300\") " pod="openstack/dnsmasq-dns-644597f84c-48kkw" Oct 08 13:13:46 crc kubenswrapper[4735]: I1008 13:13:46.944313 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bfcb9d745-hxfqv"] Oct 08 13:13:46 crc kubenswrapper[4735]: I1008 13:13:46.985007 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77597f887-vdnj6"] Oct 08 13:13:46 crc kubenswrapper[4735]: I1008 13:13:46.986265 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77597f887-vdnj6" Oct 08 13:13:46 crc kubenswrapper[4735]: I1008 13:13:46.992298 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77597f887-vdnj6"] Oct 08 13:13:47 crc kubenswrapper[4735]: I1008 13:13:47.019356 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-644597f84c-48kkw" Oct 08 13:13:47 crc kubenswrapper[4735]: I1008 13:13:47.024059 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dbf29341-130d-4282-b4a2-31e745b9a6c1-dns-svc\") pod \"dnsmasq-dns-77597f887-vdnj6\" (UID: \"dbf29341-130d-4282-b4a2-31e745b9a6c1\") " pod="openstack/dnsmasq-dns-77597f887-vdnj6" Oct 08 13:13:47 crc kubenswrapper[4735]: I1008 13:13:47.024222 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbf29341-130d-4282-b4a2-31e745b9a6c1-config\") pod \"dnsmasq-dns-77597f887-vdnj6\" (UID: \"dbf29341-130d-4282-b4a2-31e745b9a6c1\") " pod="openstack/dnsmasq-dns-77597f887-vdnj6" Oct 08 13:13:47 crc kubenswrapper[4735]: I1008 13:13:47.024321 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jk2b9\" (UniqueName: \"kubernetes.io/projected/dbf29341-130d-4282-b4a2-31e745b9a6c1-kube-api-access-jk2b9\") pod \"dnsmasq-dns-77597f887-vdnj6\" (UID: \"dbf29341-130d-4282-b4a2-31e745b9a6c1\") " pod="openstack/dnsmasq-dns-77597f887-vdnj6" Oct 08 13:13:47 crc kubenswrapper[4735]: I1008 13:13:47.149202 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dbf29341-130d-4282-b4a2-31e745b9a6c1-dns-svc\") pod \"dnsmasq-dns-77597f887-vdnj6\" (UID: \"dbf29341-130d-4282-b4a2-31e745b9a6c1\") " pod="openstack/dnsmasq-dns-77597f887-vdnj6" Oct 08 13:13:47 crc kubenswrapper[4735]: I1008 13:13:47.149310 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbf29341-130d-4282-b4a2-31e745b9a6c1-config\") pod \"dnsmasq-dns-77597f887-vdnj6\" (UID: \"dbf29341-130d-4282-b4a2-31e745b9a6c1\") " 
pod="openstack/dnsmasq-dns-77597f887-vdnj6" Oct 08 13:13:47 crc kubenswrapper[4735]: I1008 13:13:47.149391 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jk2b9\" (UniqueName: \"kubernetes.io/projected/dbf29341-130d-4282-b4a2-31e745b9a6c1-kube-api-access-jk2b9\") pod \"dnsmasq-dns-77597f887-vdnj6\" (UID: \"dbf29341-130d-4282-b4a2-31e745b9a6c1\") " pod="openstack/dnsmasq-dns-77597f887-vdnj6" Oct 08 13:13:47 crc kubenswrapper[4735]: I1008 13:13:47.151035 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbf29341-130d-4282-b4a2-31e745b9a6c1-config\") pod \"dnsmasq-dns-77597f887-vdnj6\" (UID: \"dbf29341-130d-4282-b4a2-31e745b9a6c1\") " pod="openstack/dnsmasq-dns-77597f887-vdnj6" Oct 08 13:13:47 crc kubenswrapper[4735]: I1008 13:13:47.158503 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dbf29341-130d-4282-b4a2-31e745b9a6c1-dns-svc\") pod \"dnsmasq-dns-77597f887-vdnj6\" (UID: \"dbf29341-130d-4282-b4a2-31e745b9a6c1\") " pod="openstack/dnsmasq-dns-77597f887-vdnj6" Oct 08 13:13:47 crc kubenswrapper[4735]: I1008 13:13:47.196284 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jk2b9\" (UniqueName: \"kubernetes.io/projected/dbf29341-130d-4282-b4a2-31e745b9a6c1-kube-api-access-jk2b9\") pod \"dnsmasq-dns-77597f887-vdnj6\" (UID: \"dbf29341-130d-4282-b4a2-31e745b9a6c1\") " pod="openstack/dnsmasq-dns-77597f887-vdnj6" Oct 08 13:13:47 crc kubenswrapper[4735]: I1008 13:13:47.313544 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77597f887-vdnj6" Oct 08 13:13:47 crc kubenswrapper[4735]: I1008 13:13:47.534456 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-644597f84c-48kkw"] Oct 08 13:13:47 crc kubenswrapper[4735]: I1008 13:13:47.796761 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 08 13:13:47 crc kubenswrapper[4735]: I1008 13:13:47.798386 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 08 13:13:47 crc kubenswrapper[4735]: I1008 13:13:47.803751 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 08 13:13:47 crc kubenswrapper[4735]: I1008 13:13:47.804114 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 08 13:13:47 crc kubenswrapper[4735]: I1008 13:13:47.804439 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 08 13:13:47 crc kubenswrapper[4735]: I1008 13:13:47.804631 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 08 13:13:47 crc kubenswrapper[4735]: I1008 13:13:47.805325 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-j8cjm" Oct 08 13:13:47 crc kubenswrapper[4735]: I1008 13:13:47.805629 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 08 13:13:47 crc kubenswrapper[4735]: I1008 13:13:47.806124 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 08 13:13:47 crc kubenswrapper[4735]: I1008 13:13:47.807972 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 08 13:13:47 crc kubenswrapper[4735]: I1008 13:13:47.861560 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/644f0099-ec9c-42af-bbe8-54fe5b603bb3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"644f0099-ec9c-42af-bbe8-54fe5b603bb3\") " pod="openstack/rabbitmq-server-0" Oct 08 13:13:47 crc kubenswrapper[4735]: I1008 13:13:47.861606 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t82v4\" (UniqueName: \"kubernetes.io/projected/644f0099-ec9c-42af-bbe8-54fe5b603bb3-kube-api-access-t82v4\") pod \"rabbitmq-server-0\" (UID: \"644f0099-ec9c-42af-bbe8-54fe5b603bb3\") " pod="openstack/rabbitmq-server-0" Oct 08 13:13:47 crc kubenswrapper[4735]: I1008 13:13:47.861629 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/644f0099-ec9c-42af-bbe8-54fe5b603bb3-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"644f0099-ec9c-42af-bbe8-54fe5b603bb3\") " pod="openstack/rabbitmq-server-0" Oct 08 13:13:47 crc kubenswrapper[4735]: I1008 13:13:47.861664 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/644f0099-ec9c-42af-bbe8-54fe5b603bb3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"644f0099-ec9c-42af-bbe8-54fe5b603bb3\") " pod="openstack/rabbitmq-server-0" Oct 08 13:13:47 crc kubenswrapper[4735]: I1008 13:13:47.861695 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/644f0099-ec9c-42af-bbe8-54fe5b603bb3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"644f0099-ec9c-42af-bbe8-54fe5b603bb3\") " pod="openstack/rabbitmq-server-0" Oct 08 13:13:47 crc kubenswrapper[4735]: I1008 13:13:47.861714 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/644f0099-ec9c-42af-bbe8-54fe5b603bb3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"644f0099-ec9c-42af-bbe8-54fe5b603bb3\") " pod="openstack/rabbitmq-server-0" Oct 08 13:13:47 crc kubenswrapper[4735]: I1008 13:13:47.861730 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/644f0099-ec9c-42af-bbe8-54fe5b603bb3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"644f0099-ec9c-42af-bbe8-54fe5b603bb3\") " pod="openstack/rabbitmq-server-0" Oct 08 13:13:47 crc kubenswrapper[4735]: I1008 13:13:47.861757 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/644f0099-ec9c-42af-bbe8-54fe5b603bb3-config-data\") pod \"rabbitmq-server-0\" (UID: \"644f0099-ec9c-42af-bbe8-54fe5b603bb3\") " pod="openstack/rabbitmq-server-0" Oct 08 13:13:47 crc kubenswrapper[4735]: I1008 13:13:47.861783 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"644f0099-ec9c-42af-bbe8-54fe5b603bb3\") " pod="openstack/rabbitmq-server-0" Oct 08 13:13:47 crc kubenswrapper[4735]: I1008 13:13:47.861798 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/644f0099-ec9c-42af-bbe8-54fe5b603bb3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"644f0099-ec9c-42af-bbe8-54fe5b603bb3\") " pod="openstack/rabbitmq-server-0" Oct 08 13:13:47 crc kubenswrapper[4735]: I1008 13:13:47.861818 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/644f0099-ec9c-42af-bbe8-54fe5b603bb3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"644f0099-ec9c-42af-bbe8-54fe5b603bb3\") " pod="openstack/rabbitmq-server-0" Oct 08 13:13:47 crc kubenswrapper[4735]: I1008 13:13:47.878013 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77597f887-vdnj6"] Oct 08 13:13:47 crc kubenswrapper[4735]: I1008 13:13:47.909673 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-644597f84c-48kkw" event={"ID":"191f3e26-da78-4391-9f78-66d43c758300","Type":"ContainerStarted","Data":"8998e17d1790056024c38212245730a5bbc2323a51af687e5f84a2ff3342368e"} Oct 08 13:13:47 crc kubenswrapper[4735]: I1008 13:13:47.917546 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77597f887-vdnj6" event={"ID":"dbf29341-130d-4282-b4a2-31e745b9a6c1","Type":"ContainerStarted","Data":"161d99a3851bb0887a50470038609e001f1a71ebd91f66f1ac98ba8cbc5846f6"} Oct 08 13:13:47 crc kubenswrapper[4735]: I1008 13:13:47.963933 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/644f0099-ec9c-42af-bbe8-54fe5b603bb3-config-data\") pod \"rabbitmq-server-0\" (UID: \"644f0099-ec9c-42af-bbe8-54fe5b603bb3\") " pod="openstack/rabbitmq-server-0" Oct 08 13:13:47 crc kubenswrapper[4735]: I1008 13:13:47.963994 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"644f0099-ec9c-42af-bbe8-54fe5b603bb3\") " pod="openstack/rabbitmq-server-0" Oct 08 13:13:47 crc kubenswrapper[4735]: I1008 13:13:47.964015 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/644f0099-ec9c-42af-bbe8-54fe5b603bb3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: 
\"644f0099-ec9c-42af-bbe8-54fe5b603bb3\") " pod="openstack/rabbitmq-server-0" Oct 08 13:13:47 crc kubenswrapper[4735]: I1008 13:13:47.964039 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/644f0099-ec9c-42af-bbe8-54fe5b603bb3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"644f0099-ec9c-42af-bbe8-54fe5b603bb3\") " pod="openstack/rabbitmq-server-0" Oct 08 13:13:47 crc kubenswrapper[4735]: I1008 13:13:47.964111 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/644f0099-ec9c-42af-bbe8-54fe5b603bb3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"644f0099-ec9c-42af-bbe8-54fe5b603bb3\") " pod="openstack/rabbitmq-server-0" Oct 08 13:13:47 crc kubenswrapper[4735]: I1008 13:13:47.964135 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t82v4\" (UniqueName: \"kubernetes.io/projected/644f0099-ec9c-42af-bbe8-54fe5b603bb3-kube-api-access-t82v4\") pod \"rabbitmq-server-0\" (UID: \"644f0099-ec9c-42af-bbe8-54fe5b603bb3\") " pod="openstack/rabbitmq-server-0" Oct 08 13:13:47 crc kubenswrapper[4735]: I1008 13:13:47.964158 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/644f0099-ec9c-42af-bbe8-54fe5b603bb3-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"644f0099-ec9c-42af-bbe8-54fe5b603bb3\") " pod="openstack/rabbitmq-server-0" Oct 08 13:13:47 crc kubenswrapper[4735]: I1008 13:13:47.964201 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/644f0099-ec9c-42af-bbe8-54fe5b603bb3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"644f0099-ec9c-42af-bbe8-54fe5b603bb3\") " pod="openstack/rabbitmq-server-0" Oct 08 13:13:47 crc kubenswrapper[4735]: I1008 
13:13:47.964240 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/644f0099-ec9c-42af-bbe8-54fe5b603bb3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"644f0099-ec9c-42af-bbe8-54fe5b603bb3\") " pod="openstack/rabbitmq-server-0" Oct 08 13:13:47 crc kubenswrapper[4735]: I1008 13:13:47.964264 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/644f0099-ec9c-42af-bbe8-54fe5b603bb3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"644f0099-ec9c-42af-bbe8-54fe5b603bb3\") " pod="openstack/rabbitmq-server-0" Oct 08 13:13:47 crc kubenswrapper[4735]: I1008 13:13:47.964292 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/644f0099-ec9c-42af-bbe8-54fe5b603bb3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"644f0099-ec9c-42af-bbe8-54fe5b603bb3\") " pod="openstack/rabbitmq-server-0" Oct 08 13:13:47 crc kubenswrapper[4735]: I1008 13:13:47.964436 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"644f0099-ec9c-42af-bbe8-54fe5b603bb3\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/rabbitmq-server-0" Oct 08 13:13:47 crc kubenswrapper[4735]: I1008 13:13:47.964669 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/644f0099-ec9c-42af-bbe8-54fe5b603bb3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"644f0099-ec9c-42af-bbe8-54fe5b603bb3\") " pod="openstack/rabbitmq-server-0" Oct 08 13:13:47 crc kubenswrapper[4735]: I1008 13:13:47.965209 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/644f0099-ec9c-42af-bbe8-54fe5b603bb3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"644f0099-ec9c-42af-bbe8-54fe5b603bb3\") " pod="openstack/rabbitmq-server-0" Oct 08 13:13:47 crc kubenswrapper[4735]: I1008 13:13:47.965543 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/644f0099-ec9c-42af-bbe8-54fe5b603bb3-config-data\") pod \"rabbitmq-server-0\" (UID: \"644f0099-ec9c-42af-bbe8-54fe5b603bb3\") " pod="openstack/rabbitmq-server-0" Oct 08 13:13:47 crc kubenswrapper[4735]: I1008 13:13:47.965562 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/644f0099-ec9c-42af-bbe8-54fe5b603bb3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"644f0099-ec9c-42af-bbe8-54fe5b603bb3\") " pod="openstack/rabbitmq-server-0" Oct 08 13:13:47 crc kubenswrapper[4735]: I1008 13:13:47.966077 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/644f0099-ec9c-42af-bbe8-54fe5b603bb3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"644f0099-ec9c-42af-bbe8-54fe5b603bb3\") " pod="openstack/rabbitmq-server-0" Oct 08 13:13:47 crc kubenswrapper[4735]: I1008 13:13:47.971427 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/644f0099-ec9c-42af-bbe8-54fe5b603bb3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"644f0099-ec9c-42af-bbe8-54fe5b603bb3\") " pod="openstack/rabbitmq-server-0" Oct 08 13:13:47 crc kubenswrapper[4735]: I1008 13:13:47.972678 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/644f0099-ec9c-42af-bbe8-54fe5b603bb3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"644f0099-ec9c-42af-bbe8-54fe5b603bb3\") " pod="openstack/rabbitmq-server-0" Oct 08 13:13:47 crc 
kubenswrapper[4735]: I1008 13:13:47.973355 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/644f0099-ec9c-42af-bbe8-54fe5b603bb3-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"644f0099-ec9c-42af-bbe8-54fe5b603bb3\") " pod="openstack/rabbitmq-server-0" Oct 08 13:13:47 crc kubenswrapper[4735]: I1008 13:13:47.973930 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/644f0099-ec9c-42af-bbe8-54fe5b603bb3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"644f0099-ec9c-42af-bbe8-54fe5b603bb3\") " pod="openstack/rabbitmq-server-0" Oct 08 13:13:47 crc kubenswrapper[4735]: I1008 13:13:47.995721 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t82v4\" (UniqueName: \"kubernetes.io/projected/644f0099-ec9c-42af-bbe8-54fe5b603bb3-kube-api-access-t82v4\") pod \"rabbitmq-server-0\" (UID: \"644f0099-ec9c-42af-bbe8-54fe5b603bb3\") " pod="openstack/rabbitmq-server-0" Oct 08 13:13:48 crc kubenswrapper[4735]: I1008 13:13:47.999864 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"644f0099-ec9c-42af-bbe8-54fe5b603bb3\") " pod="openstack/rabbitmq-server-0" Oct 08 13:13:48 crc kubenswrapper[4735]: I1008 13:13:48.191949 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 08 13:13:48 crc kubenswrapper[4735]: I1008 13:13:48.209395 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 08 13:13:48 crc kubenswrapper[4735]: I1008 13:13:48.210694 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 08 13:13:48 crc kubenswrapper[4735]: I1008 13:13:48.213545 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-tx7g2" Oct 08 13:13:48 crc kubenswrapper[4735]: I1008 13:13:48.216680 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 08 13:13:48 crc kubenswrapper[4735]: I1008 13:13:48.219011 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 08 13:13:48 crc kubenswrapper[4735]: I1008 13:13:48.219160 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 08 13:13:48 crc kubenswrapper[4735]: I1008 13:13:48.219318 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 08 13:13:48 crc kubenswrapper[4735]: I1008 13:13:48.219519 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 08 13:13:48 crc kubenswrapper[4735]: I1008 13:13:48.219653 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 08 13:13:48 crc kubenswrapper[4735]: I1008 13:13:48.233808 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 08 13:13:48 crc kubenswrapper[4735]: I1008 13:13:48.268696 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/01ad2819-cfc3-4416-8a27-ecb65c2fa366-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"01ad2819-cfc3-4416-8a27-ecb65c2fa366\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 13:13:48 crc kubenswrapper[4735]: I1008 13:13:48.268755 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/01ad2819-cfc3-4416-8a27-ecb65c2fa366-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"01ad2819-cfc3-4416-8a27-ecb65c2fa366\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 13:13:48 crc kubenswrapper[4735]: I1008 13:13:48.268806 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/01ad2819-cfc3-4416-8a27-ecb65c2fa366-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"01ad2819-cfc3-4416-8a27-ecb65c2fa366\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 13:13:48 crc kubenswrapper[4735]: I1008 13:13:48.268840 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"01ad2819-cfc3-4416-8a27-ecb65c2fa366\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 13:13:48 crc kubenswrapper[4735]: I1008 13:13:48.268868 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/01ad2819-cfc3-4416-8a27-ecb65c2fa366-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"01ad2819-cfc3-4416-8a27-ecb65c2fa366\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 13:13:48 crc kubenswrapper[4735]: I1008 13:13:48.268890 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/01ad2819-cfc3-4416-8a27-ecb65c2fa366-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"01ad2819-cfc3-4416-8a27-ecb65c2fa366\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 13:13:48 crc kubenswrapper[4735]: I1008 13:13:48.268912 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" 
(UniqueName: \"kubernetes.io/projected/01ad2819-cfc3-4416-8a27-ecb65c2fa366-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"01ad2819-cfc3-4416-8a27-ecb65c2fa366\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 13:13:48 crc kubenswrapper[4735]: I1008 13:13:48.268937 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/01ad2819-cfc3-4416-8a27-ecb65c2fa366-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"01ad2819-cfc3-4416-8a27-ecb65c2fa366\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 13:13:48 crc kubenswrapper[4735]: I1008 13:13:48.268966 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/01ad2819-cfc3-4416-8a27-ecb65c2fa366-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"01ad2819-cfc3-4416-8a27-ecb65c2fa366\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 13:13:48 crc kubenswrapper[4735]: I1008 13:13:48.269000 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/01ad2819-cfc3-4416-8a27-ecb65c2fa366-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"01ad2819-cfc3-4416-8a27-ecb65c2fa366\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 13:13:48 crc kubenswrapper[4735]: I1008 13:13:48.269025 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkx52\" (UniqueName: \"kubernetes.io/projected/01ad2819-cfc3-4416-8a27-ecb65c2fa366-kube-api-access-bkx52\") pod \"rabbitmq-cell1-server-0\" (UID: \"01ad2819-cfc3-4416-8a27-ecb65c2fa366\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 13:13:48 crc kubenswrapper[4735]: I1008 13:13:48.369924 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/01ad2819-cfc3-4416-8a27-ecb65c2fa366-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"01ad2819-cfc3-4416-8a27-ecb65c2fa366\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 13:13:48 crc kubenswrapper[4735]: I1008 13:13:48.369987 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/01ad2819-cfc3-4416-8a27-ecb65c2fa366-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"01ad2819-cfc3-4416-8a27-ecb65c2fa366\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 13:13:48 crc kubenswrapper[4735]: I1008 13:13:48.370034 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/01ad2819-cfc3-4416-8a27-ecb65c2fa366-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"01ad2819-cfc3-4416-8a27-ecb65c2fa366\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 13:13:48 crc kubenswrapper[4735]: I1008 13:13:48.370075 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"01ad2819-cfc3-4416-8a27-ecb65c2fa366\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 13:13:48 crc kubenswrapper[4735]: I1008 13:13:48.370102 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/01ad2819-cfc3-4416-8a27-ecb65c2fa366-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"01ad2819-cfc3-4416-8a27-ecb65c2fa366\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 13:13:48 crc kubenswrapper[4735]: I1008 13:13:48.370129 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/01ad2819-cfc3-4416-8a27-ecb65c2fa366-config-data\") pod \"rabbitmq-cell1-server-0\" 
(UID: \"01ad2819-cfc3-4416-8a27-ecb65c2fa366\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 13:13:48 crc kubenswrapper[4735]: I1008 13:13:48.370149 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/01ad2819-cfc3-4416-8a27-ecb65c2fa366-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"01ad2819-cfc3-4416-8a27-ecb65c2fa366\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 13:13:48 crc kubenswrapper[4735]: I1008 13:13:48.370164 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/01ad2819-cfc3-4416-8a27-ecb65c2fa366-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"01ad2819-cfc3-4416-8a27-ecb65c2fa366\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 13:13:48 crc kubenswrapper[4735]: I1008 13:13:48.370203 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/01ad2819-cfc3-4416-8a27-ecb65c2fa366-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"01ad2819-cfc3-4416-8a27-ecb65c2fa366\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 13:13:48 crc kubenswrapper[4735]: I1008 13:13:48.370235 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/01ad2819-cfc3-4416-8a27-ecb65c2fa366-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"01ad2819-cfc3-4416-8a27-ecb65c2fa366\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 13:13:48 crc kubenswrapper[4735]: I1008 13:13:48.370257 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkx52\" (UniqueName: \"kubernetes.io/projected/01ad2819-cfc3-4416-8a27-ecb65c2fa366-kube-api-access-bkx52\") pod \"rabbitmq-cell1-server-0\" (UID: \"01ad2819-cfc3-4416-8a27-ecb65c2fa366\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 
13:13:48 crc kubenswrapper[4735]: I1008 13:13:48.370347 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"01ad2819-cfc3-4416-8a27-ecb65c2fa366\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-cell1-server-0" Oct 08 13:13:48 crc kubenswrapper[4735]: I1008 13:13:48.370684 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/01ad2819-cfc3-4416-8a27-ecb65c2fa366-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"01ad2819-cfc3-4416-8a27-ecb65c2fa366\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 13:13:48 crc kubenswrapper[4735]: I1008 13:13:48.370976 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/01ad2819-cfc3-4416-8a27-ecb65c2fa366-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"01ad2819-cfc3-4416-8a27-ecb65c2fa366\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 13:13:48 crc kubenswrapper[4735]: I1008 13:13:48.373265 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/01ad2819-cfc3-4416-8a27-ecb65c2fa366-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"01ad2819-cfc3-4416-8a27-ecb65c2fa366\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 13:13:48 crc kubenswrapper[4735]: I1008 13:13:48.374253 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/01ad2819-cfc3-4416-8a27-ecb65c2fa366-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"01ad2819-cfc3-4416-8a27-ecb65c2fa366\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 13:13:48 crc kubenswrapper[4735]: I1008 13:13:48.375095 4735 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/01ad2819-cfc3-4416-8a27-ecb65c2fa366-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"01ad2819-cfc3-4416-8a27-ecb65c2fa366\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 13:13:48 crc kubenswrapper[4735]: I1008 13:13:48.378492 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/01ad2819-cfc3-4416-8a27-ecb65c2fa366-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"01ad2819-cfc3-4416-8a27-ecb65c2fa366\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 13:13:48 crc kubenswrapper[4735]: I1008 13:13:48.378663 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/01ad2819-cfc3-4416-8a27-ecb65c2fa366-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"01ad2819-cfc3-4416-8a27-ecb65c2fa366\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 13:13:48 crc kubenswrapper[4735]: I1008 13:13:48.378774 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/01ad2819-cfc3-4416-8a27-ecb65c2fa366-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"01ad2819-cfc3-4416-8a27-ecb65c2fa366\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 13:13:48 crc kubenswrapper[4735]: I1008 13:13:48.379179 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/01ad2819-cfc3-4416-8a27-ecb65c2fa366-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"01ad2819-cfc3-4416-8a27-ecb65c2fa366\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 13:13:48 crc kubenswrapper[4735]: I1008 13:13:48.387270 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkx52\" (UniqueName: \"kubernetes.io/projected/01ad2819-cfc3-4416-8a27-ecb65c2fa366-kube-api-access-bkx52\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"01ad2819-cfc3-4416-8a27-ecb65c2fa366\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 13:13:48 crc kubenswrapper[4735]: I1008 13:13:48.395207 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"01ad2819-cfc3-4416-8a27-ecb65c2fa366\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 13:13:48 crc kubenswrapper[4735]: I1008 13:13:48.543739 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 08 13:13:48 crc kubenswrapper[4735]: I1008 13:13:48.675315 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 08 13:13:48 crc kubenswrapper[4735]: W1008 13:13:48.684576 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod644f0099_ec9c_42af_bbe8_54fe5b603bb3.slice/crio-25bf65159126ff4da297a51df6ff72a110c317913e73192603856a8dc3503a86 WatchSource:0}: Error finding container 25bf65159126ff4da297a51df6ff72a110c317913e73192603856a8dc3503a86: Status 404 returned error can't find the container with id 25bf65159126ff4da297a51df6ff72a110c317913e73192603856a8dc3503a86 Oct 08 13:13:48 crc kubenswrapper[4735]: I1008 13:13:48.937027 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"644f0099-ec9c-42af-bbe8-54fe5b603bb3","Type":"ContainerStarted","Data":"25bf65159126ff4da297a51df6ff72a110c317913e73192603856a8dc3503a86"} Oct 08 13:13:49 crc kubenswrapper[4735]: I1008 13:13:49.069746 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 08 13:13:50 crc kubenswrapper[4735]: I1008 13:13:50.921556 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Oct 08 13:13:50 crc kubenswrapper[4735]: I1008 
13:13:50.938414 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 08 13:13:50 crc kubenswrapper[4735]: I1008 13:13:50.951390 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Oct 08 13:13:50 crc kubenswrapper[4735]: I1008 13:13:50.953194 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Oct 08 13:13:50 crc kubenswrapper[4735]: I1008 13:13:50.953316 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Oct 08 13:13:50 crc kubenswrapper[4735]: I1008 13:13:50.954932 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-w8hmv" Oct 08 13:13:50 crc kubenswrapper[4735]: I1008 13:13:50.954939 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Oct 08 13:13:50 crc kubenswrapper[4735]: I1008 13:13:50.959212 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Oct 08 13:13:50 crc kubenswrapper[4735]: I1008 13:13:50.960016 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 08 13:13:50 crc kubenswrapper[4735]: I1008 13:13:50.967356 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 08 13:13:50 crc kubenswrapper[4735]: I1008 13:13:50.968924 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 08 13:13:50 crc kubenswrapper[4735]: I1008 13:13:50.972134 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-2lcgz" Oct 08 13:13:50 crc kubenswrapper[4735]: I1008 13:13:50.972341 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Oct 08 13:13:50 crc kubenswrapper[4735]: I1008 13:13:50.972538 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Oct 08 13:13:50 crc kubenswrapper[4735]: I1008 13:13:50.972715 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Oct 08 13:13:50 crc kubenswrapper[4735]: I1008 13:13:50.974280 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.116836 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2868ea84-fef0-4728-9a62-e2ba97a9aae1-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"2868ea84-fef0-4728-9a62-e2ba97a9aae1\") " pod="openstack/openstack-cell1-galera-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.116874 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f2428d97-aa22-413f-ab58-5b077fdde95c-kolla-config\") pod \"openstack-galera-0\" (UID: \"f2428d97-aa22-413f-ab58-5b077fdde95c\") " pod="openstack/openstack-galera-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.116897 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" 
(UID: \"f2428d97-aa22-413f-ab58-5b077fdde95c\") " pod="openstack/openstack-galera-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.116930 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kbl4\" (UniqueName: \"kubernetes.io/projected/f2428d97-aa22-413f-ab58-5b077fdde95c-kube-api-access-8kbl4\") pod \"openstack-galera-0\" (UID: \"f2428d97-aa22-413f-ab58-5b077fdde95c\") " pod="openstack/openstack-galera-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.117108 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f2428d97-aa22-413f-ab58-5b077fdde95c-config-data-default\") pod \"openstack-galera-0\" (UID: \"f2428d97-aa22-413f-ab58-5b077fdde95c\") " pod="openstack/openstack-galera-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.117163 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2868ea84-fef0-4728-9a62-e2ba97a9aae1-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"2868ea84-fef0-4728-9a62-e2ba97a9aae1\") " pod="openstack/openstack-cell1-galera-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.117218 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2868ea84-fef0-4728-9a62-e2ba97a9aae1-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"2868ea84-fef0-4728-9a62-e2ba97a9aae1\") " pod="openstack/openstack-cell1-galera-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.117255 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2868ea84-fef0-4728-9a62-e2ba97a9aae1-config-data-generated\") pod 
\"openstack-cell1-galera-0\" (UID: \"2868ea84-fef0-4728-9a62-e2ba97a9aae1\") " pod="openstack/openstack-cell1-galera-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.117369 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"2868ea84-fef0-4728-9a62-e2ba97a9aae1\") " pod="openstack/openstack-cell1-galera-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.117411 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2428d97-aa22-413f-ab58-5b077fdde95c-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"f2428d97-aa22-413f-ab58-5b077fdde95c\") " pod="openstack/openstack-galera-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.117462 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75qpb\" (UniqueName: \"kubernetes.io/projected/2868ea84-fef0-4728-9a62-e2ba97a9aae1-kube-api-access-75qpb\") pod \"openstack-cell1-galera-0\" (UID: \"2868ea84-fef0-4728-9a62-e2ba97a9aae1\") " pod="openstack/openstack-cell1-galera-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.117554 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2868ea84-fef0-4728-9a62-e2ba97a9aae1-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"2868ea84-fef0-4728-9a62-e2ba97a9aae1\") " pod="openstack/openstack-cell1-galera-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.117621 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2428d97-aa22-413f-ab58-5b077fdde95c-operator-scripts\") pod 
\"openstack-galera-0\" (UID: \"f2428d97-aa22-413f-ab58-5b077fdde95c\") " pod="openstack/openstack-galera-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.117723 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2428d97-aa22-413f-ab58-5b077fdde95c-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"f2428d97-aa22-413f-ab58-5b077fdde95c\") " pod="openstack/openstack-galera-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.117775 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2868ea84-fef0-4728-9a62-e2ba97a9aae1-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"2868ea84-fef0-4728-9a62-e2ba97a9aae1\") " pod="openstack/openstack-cell1-galera-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.117799 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/f2428d97-aa22-413f-ab58-5b077fdde95c-secrets\") pod \"openstack-galera-0\" (UID: \"f2428d97-aa22-413f-ab58-5b077fdde95c\") " pod="openstack/openstack-galera-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.117818 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f2428d97-aa22-413f-ab58-5b077fdde95c-config-data-generated\") pod \"openstack-galera-0\" (UID: \"f2428d97-aa22-413f-ab58-5b077fdde95c\") " pod="openstack/openstack-galera-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.117852 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/2868ea84-fef0-4728-9a62-e2ba97a9aae1-secrets\") pod \"openstack-cell1-galera-0\" (UID: 
\"2868ea84-fef0-4728-9a62-e2ba97a9aae1\") " pod="openstack/openstack-cell1-galera-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.218813 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/2868ea84-fef0-4728-9a62-e2ba97a9aae1-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"2868ea84-fef0-4728-9a62-e2ba97a9aae1\") " pod="openstack/openstack-cell1-galera-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.218872 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2868ea84-fef0-4728-9a62-e2ba97a9aae1-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"2868ea84-fef0-4728-9a62-e2ba97a9aae1\") " pod="openstack/openstack-cell1-galera-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.218899 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f2428d97-aa22-413f-ab58-5b077fdde95c-kolla-config\") pod \"openstack-galera-0\" (UID: \"f2428d97-aa22-413f-ab58-5b077fdde95c\") " pod="openstack/openstack-galera-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.218924 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"f2428d97-aa22-413f-ab58-5b077fdde95c\") " pod="openstack/openstack-galera-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.218978 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kbl4\" (UniqueName: \"kubernetes.io/projected/f2428d97-aa22-413f-ab58-5b077fdde95c-kube-api-access-8kbl4\") pod \"openstack-galera-0\" (UID: \"f2428d97-aa22-413f-ab58-5b077fdde95c\") " pod="openstack/openstack-galera-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.219020 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f2428d97-aa22-413f-ab58-5b077fdde95c-config-data-default\") pod \"openstack-galera-0\" (UID: \"f2428d97-aa22-413f-ab58-5b077fdde95c\") " pod="openstack/openstack-galera-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.219042 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2868ea84-fef0-4728-9a62-e2ba97a9aae1-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"2868ea84-fef0-4728-9a62-e2ba97a9aae1\") " pod="openstack/openstack-cell1-galera-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.219069 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2868ea84-fef0-4728-9a62-e2ba97a9aae1-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"2868ea84-fef0-4728-9a62-e2ba97a9aae1\") " pod="openstack/openstack-cell1-galera-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.219096 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2868ea84-fef0-4728-9a62-e2ba97a9aae1-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"2868ea84-fef0-4728-9a62-e2ba97a9aae1\") " pod="openstack/openstack-cell1-galera-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.219122 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"2868ea84-fef0-4728-9a62-e2ba97a9aae1\") " pod="openstack/openstack-cell1-galera-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.219141 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/f2428d97-aa22-413f-ab58-5b077fdde95c-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"f2428d97-aa22-413f-ab58-5b077fdde95c\") " pod="openstack/openstack-galera-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.219174 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75qpb\" (UniqueName: \"kubernetes.io/projected/2868ea84-fef0-4728-9a62-e2ba97a9aae1-kube-api-access-75qpb\") pod \"openstack-cell1-galera-0\" (UID: \"2868ea84-fef0-4728-9a62-e2ba97a9aae1\") " pod="openstack/openstack-cell1-galera-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.219198 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2868ea84-fef0-4728-9a62-e2ba97a9aae1-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"2868ea84-fef0-4728-9a62-e2ba97a9aae1\") " pod="openstack/openstack-cell1-galera-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.219224 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2428d97-aa22-413f-ab58-5b077fdde95c-operator-scripts\") pod \"openstack-galera-0\" (UID: \"f2428d97-aa22-413f-ab58-5b077fdde95c\") " pod="openstack/openstack-galera-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.219265 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2428d97-aa22-413f-ab58-5b077fdde95c-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"f2428d97-aa22-413f-ab58-5b077fdde95c\") " pod="openstack/openstack-galera-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.219296 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2868ea84-fef0-4728-9a62-e2ba97a9aae1-kolla-config\") pod 
\"openstack-cell1-galera-0\" (UID: \"2868ea84-fef0-4728-9a62-e2ba97a9aae1\") " pod="openstack/openstack-cell1-galera-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.219318 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/f2428d97-aa22-413f-ab58-5b077fdde95c-secrets\") pod \"openstack-galera-0\" (UID: \"f2428d97-aa22-413f-ab58-5b077fdde95c\") " pod="openstack/openstack-galera-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.219341 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f2428d97-aa22-413f-ab58-5b077fdde95c-config-data-generated\") pod \"openstack-galera-0\" (UID: \"f2428d97-aa22-413f-ab58-5b077fdde95c\") " pod="openstack/openstack-galera-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.219389 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"f2428d97-aa22-413f-ab58-5b077fdde95c\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/openstack-galera-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.219671 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"2868ea84-fef0-4728-9a62-e2ba97a9aae1\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/openstack-cell1-galera-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.219934 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2868ea84-fef0-4728-9a62-e2ba97a9aae1-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"2868ea84-fef0-4728-9a62-e2ba97a9aae1\") " 
pod="openstack/openstack-cell1-galera-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.219949 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f2428d97-aa22-413f-ab58-5b077fdde95c-config-data-generated\") pod \"openstack-galera-0\" (UID: \"f2428d97-aa22-413f-ab58-5b077fdde95c\") " pod="openstack/openstack-galera-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.220062 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f2428d97-aa22-413f-ab58-5b077fdde95c-kolla-config\") pod \"openstack-galera-0\" (UID: \"f2428d97-aa22-413f-ab58-5b077fdde95c\") " pod="openstack/openstack-galera-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.220328 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2868ea84-fef0-4728-9a62-e2ba97a9aae1-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"2868ea84-fef0-4728-9a62-e2ba97a9aae1\") " pod="openstack/openstack-cell1-galera-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.220935 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2868ea84-fef0-4728-9a62-e2ba97a9aae1-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"2868ea84-fef0-4728-9a62-e2ba97a9aae1\") " pod="openstack/openstack-cell1-galera-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.221605 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2428d97-aa22-413f-ab58-5b077fdde95c-operator-scripts\") pod \"openstack-galera-0\" (UID: \"f2428d97-aa22-413f-ab58-5b077fdde95c\") " pod="openstack/openstack-galera-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.221843 4735 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2868ea84-fef0-4728-9a62-e2ba97a9aae1-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"2868ea84-fef0-4728-9a62-e2ba97a9aae1\") " pod="openstack/openstack-cell1-galera-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.222645 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f2428d97-aa22-413f-ab58-5b077fdde95c-config-data-default\") pod \"openstack-galera-0\" (UID: \"f2428d97-aa22-413f-ab58-5b077fdde95c\") " pod="openstack/openstack-galera-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.228070 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/2868ea84-fef0-4728-9a62-e2ba97a9aae1-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"2868ea84-fef0-4728-9a62-e2ba97a9aae1\") " pod="openstack/openstack-cell1-galera-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.234335 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2428d97-aa22-413f-ab58-5b077fdde95c-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"f2428d97-aa22-413f-ab58-5b077fdde95c\") " pod="openstack/openstack-galera-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.234613 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2428d97-aa22-413f-ab58-5b077fdde95c-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"f2428d97-aa22-413f-ab58-5b077fdde95c\") " pod="openstack/openstack-galera-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.234876 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2868ea84-fef0-4728-9a62-e2ba97a9aae1-combined-ca-bundle\") pod 
\"openstack-cell1-galera-0\" (UID: \"2868ea84-fef0-4728-9a62-e2ba97a9aae1\") " pod="openstack/openstack-cell1-galera-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.236612 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2868ea84-fef0-4728-9a62-e2ba97a9aae1-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"2868ea84-fef0-4728-9a62-e2ba97a9aae1\") " pod="openstack/openstack-cell1-galera-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.238190 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/f2428d97-aa22-413f-ab58-5b077fdde95c-secrets\") pod \"openstack-galera-0\" (UID: \"f2428d97-aa22-413f-ab58-5b077fdde95c\") " pod="openstack/openstack-galera-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.250735 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75qpb\" (UniqueName: \"kubernetes.io/projected/2868ea84-fef0-4728-9a62-e2ba97a9aae1-kube-api-access-75qpb\") pod \"openstack-cell1-galera-0\" (UID: \"2868ea84-fef0-4728-9a62-e2ba97a9aae1\") " pod="openstack/openstack-cell1-galera-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.266594 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kbl4\" (UniqueName: \"kubernetes.io/projected/f2428d97-aa22-413f-ab58-5b077fdde95c-kube-api-access-8kbl4\") pod \"openstack-galera-0\" (UID: \"f2428d97-aa22-413f-ab58-5b077fdde95c\") " pod="openstack/openstack-galera-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.279504 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"f2428d97-aa22-413f-ab58-5b077fdde95c\") " pod="openstack/openstack-galera-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.293239 
4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"2868ea84-fef0-4728-9a62-e2ba97a9aae1\") " pod="openstack/openstack-cell1-galera-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.382238 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.383880 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.388955 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.389387 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.389542 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-5lvkn" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.414556 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.524893 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f6927451-d206-4948-a43e-5b4ee59dddcc-config-data\") pod \"memcached-0\" (UID: \"f6927451-d206-4948-a43e-5b4ee59dddcc\") " pod="openstack/memcached-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.525033 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qbrb\" (UniqueName: \"kubernetes.io/projected/f6927451-d206-4948-a43e-5b4ee59dddcc-kube-api-access-2qbrb\") pod \"memcached-0\" (UID: \"f6927451-d206-4948-a43e-5b4ee59dddcc\") " 
pod="openstack/memcached-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.525097 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f6927451-d206-4948-a43e-5b4ee59dddcc-kolla-config\") pod \"memcached-0\" (UID: \"f6927451-d206-4948-a43e-5b4ee59dddcc\") " pod="openstack/memcached-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.525191 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6927451-d206-4948-a43e-5b4ee59dddcc-memcached-tls-certs\") pod \"memcached-0\" (UID: \"f6927451-d206-4948-a43e-5b4ee59dddcc\") " pod="openstack/memcached-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.525227 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6927451-d206-4948-a43e-5b4ee59dddcc-combined-ca-bundle\") pod \"memcached-0\" (UID: \"f6927451-d206-4948-a43e-5b4ee59dddcc\") " pod="openstack/memcached-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.570231 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.595042 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.628660 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f6927451-d206-4948-a43e-5b4ee59dddcc-config-data\") pod \"memcached-0\" (UID: \"f6927451-d206-4948-a43e-5b4ee59dddcc\") " pod="openstack/memcached-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.629709 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f6927451-d206-4948-a43e-5b4ee59dddcc-config-data\") pod \"memcached-0\" (UID: \"f6927451-d206-4948-a43e-5b4ee59dddcc\") " pod="openstack/memcached-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.631306 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qbrb\" (UniqueName: \"kubernetes.io/projected/f6927451-d206-4948-a43e-5b4ee59dddcc-kube-api-access-2qbrb\") pod \"memcached-0\" (UID: \"f6927451-d206-4948-a43e-5b4ee59dddcc\") " pod="openstack/memcached-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.631379 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f6927451-d206-4948-a43e-5b4ee59dddcc-kolla-config\") pod \"memcached-0\" (UID: \"f6927451-d206-4948-a43e-5b4ee59dddcc\") " pod="openstack/memcached-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.631529 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6927451-d206-4948-a43e-5b4ee59dddcc-memcached-tls-certs\") pod \"memcached-0\" (UID: \"f6927451-d206-4948-a43e-5b4ee59dddcc\") " pod="openstack/memcached-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.631604 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6927451-d206-4948-a43e-5b4ee59dddcc-combined-ca-bundle\") pod \"memcached-0\" (UID: \"f6927451-d206-4948-a43e-5b4ee59dddcc\") " pod="openstack/memcached-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.632066 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f6927451-d206-4948-a43e-5b4ee59dddcc-kolla-config\") pod \"memcached-0\" (UID: \"f6927451-d206-4948-a43e-5b4ee59dddcc\") " pod="openstack/memcached-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.637102 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6927451-d206-4948-a43e-5b4ee59dddcc-combined-ca-bundle\") pod \"memcached-0\" (UID: \"f6927451-d206-4948-a43e-5b4ee59dddcc\") " pod="openstack/memcached-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.637511 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6927451-d206-4948-a43e-5b4ee59dddcc-memcached-tls-certs\") pod \"memcached-0\" (UID: \"f6927451-d206-4948-a43e-5b4ee59dddcc\") " pod="openstack/memcached-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.652894 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qbrb\" (UniqueName: \"kubernetes.io/projected/f6927451-d206-4948-a43e-5b4ee59dddcc-kube-api-access-2qbrb\") pod \"memcached-0\" (UID: \"f6927451-d206-4948-a43e-5b4ee59dddcc\") " pod="openstack/memcached-0" Oct 08 13:13:51 crc kubenswrapper[4735]: I1008 13:13:51.722264 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Oct 08 13:13:53 crc kubenswrapper[4735]: I1008 13:13:53.167514 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 08 13:13:53 crc kubenswrapper[4735]: I1008 13:13:53.168846 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 08 13:13:53 crc kubenswrapper[4735]: I1008 13:13:53.171444 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-9xstt" Oct 08 13:13:53 crc kubenswrapper[4735]: I1008 13:13:53.176891 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 08 13:13:53 crc kubenswrapper[4735]: I1008 13:13:53.264745 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x86hz\" (UniqueName: \"kubernetes.io/projected/add5914b-96e1-4d52-8648-d3429ea33b0d-kube-api-access-x86hz\") pod \"kube-state-metrics-0\" (UID: \"add5914b-96e1-4d52-8648-d3429ea33b0d\") " pod="openstack/kube-state-metrics-0" Oct 08 13:13:53 crc kubenswrapper[4735]: I1008 13:13:53.366020 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x86hz\" (UniqueName: \"kubernetes.io/projected/add5914b-96e1-4d52-8648-d3429ea33b0d-kube-api-access-x86hz\") pod \"kube-state-metrics-0\" (UID: \"add5914b-96e1-4d52-8648-d3429ea33b0d\") " pod="openstack/kube-state-metrics-0" Oct 08 13:13:53 crc kubenswrapper[4735]: I1008 13:13:53.385346 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x86hz\" (UniqueName: \"kubernetes.io/projected/add5914b-96e1-4d52-8648-d3429ea33b0d-kube-api-access-x86hz\") pod \"kube-state-metrics-0\" (UID: \"add5914b-96e1-4d52-8648-d3429ea33b0d\") " pod="openstack/kube-state-metrics-0" Oct 08 13:13:53 crc kubenswrapper[4735]: I1008 13:13:53.519720 4735 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 08 13:13:53 crc kubenswrapper[4735]: I1008 13:13:53.999525 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"01ad2819-cfc3-4416-8a27-ecb65c2fa366","Type":"ContainerStarted","Data":"b9112f5686bf87b44a22cb2a26b7ecaabb68baa979c8ffde3f8994321551ccf2"} Oct 08 13:13:56 crc kubenswrapper[4735]: I1008 13:13:56.948666 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-pbkkt"] Oct 08 13:13:56 crc kubenswrapper[4735]: I1008 13:13:56.950450 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-pbkkt" Oct 08 13:13:56 crc kubenswrapper[4735]: I1008 13:13:56.955078 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-8b5q8" Oct 08 13:13:56 crc kubenswrapper[4735]: I1008 13:13:56.955111 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Oct 08 13:13:56 crc kubenswrapper[4735]: I1008 13:13:56.955326 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Oct 08 13:13:56 crc kubenswrapper[4735]: I1008 13:13:56.958767 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-pbkkt"] Oct 08 13:13:57 crc kubenswrapper[4735]: I1008 13:13:57.032188 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-9frwj"] Oct 08 13:13:57 crc kubenswrapper[4735]: I1008 13:13:57.034167 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-9frwj" Oct 08 13:13:57 crc kubenswrapper[4735]: I1008 13:13:57.036654 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/73b19a4f-4794-49fc-bfd9-727618f11e7c-var-run-ovn\") pod \"ovn-controller-pbkkt\" (UID: \"73b19a4f-4794-49fc-bfd9-727618f11e7c\") " pod="openstack/ovn-controller-pbkkt" Oct 08 13:13:57 crc kubenswrapper[4735]: I1008 13:13:57.036700 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/73b19a4f-4794-49fc-bfd9-727618f11e7c-var-run\") pod \"ovn-controller-pbkkt\" (UID: \"73b19a4f-4794-49fc-bfd9-727618f11e7c\") " pod="openstack/ovn-controller-pbkkt" Oct 08 13:13:57 crc kubenswrapper[4735]: I1008 13:13:57.036730 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/73b19a4f-4794-49fc-bfd9-727618f11e7c-scripts\") pod \"ovn-controller-pbkkt\" (UID: \"73b19a4f-4794-49fc-bfd9-727618f11e7c\") " pod="openstack/ovn-controller-pbkkt" Oct 08 13:13:57 crc kubenswrapper[4735]: I1008 13:13:57.036800 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73b19a4f-4794-49fc-bfd9-727618f11e7c-combined-ca-bundle\") pod \"ovn-controller-pbkkt\" (UID: \"73b19a4f-4794-49fc-bfd9-727618f11e7c\") " pod="openstack/ovn-controller-pbkkt" Oct 08 13:13:57 crc kubenswrapper[4735]: I1008 13:13:57.036831 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqbpj\" (UniqueName: \"kubernetes.io/projected/73b19a4f-4794-49fc-bfd9-727618f11e7c-kube-api-access-mqbpj\") pod \"ovn-controller-pbkkt\" (UID: \"73b19a4f-4794-49fc-bfd9-727618f11e7c\") " 
pod="openstack/ovn-controller-pbkkt" Oct 08 13:13:57 crc kubenswrapper[4735]: I1008 13:13:57.036871 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/73b19a4f-4794-49fc-bfd9-727618f11e7c-ovn-controller-tls-certs\") pod \"ovn-controller-pbkkt\" (UID: \"73b19a4f-4794-49fc-bfd9-727618f11e7c\") " pod="openstack/ovn-controller-pbkkt" Oct 08 13:13:57 crc kubenswrapper[4735]: I1008 13:13:57.036898 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/73b19a4f-4794-49fc-bfd9-727618f11e7c-var-log-ovn\") pod \"ovn-controller-pbkkt\" (UID: \"73b19a4f-4794-49fc-bfd9-727618f11e7c\") " pod="openstack/ovn-controller-pbkkt" Oct 08 13:13:57 crc kubenswrapper[4735]: I1008 13:13:57.040299 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-9frwj"] Oct 08 13:13:57 crc kubenswrapper[4735]: I1008 13:13:57.138867 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb-scripts\") pod \"ovn-controller-ovs-9frwj\" (UID: \"b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb\") " pod="openstack/ovn-controller-ovs-9frwj" Oct 08 13:13:57 crc kubenswrapper[4735]: I1008 13:13:57.138924 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/73b19a4f-4794-49fc-bfd9-727618f11e7c-var-run-ovn\") pod \"ovn-controller-pbkkt\" (UID: \"73b19a4f-4794-49fc-bfd9-727618f11e7c\") " pod="openstack/ovn-controller-pbkkt" Oct 08 13:13:57 crc kubenswrapper[4735]: I1008 13:13:57.138966 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/73b19a4f-4794-49fc-bfd9-727618f11e7c-var-run\") pod 
\"ovn-controller-pbkkt\" (UID: \"73b19a4f-4794-49fc-bfd9-727618f11e7c\") " pod="openstack/ovn-controller-pbkkt" Oct 08 13:13:57 crc kubenswrapper[4735]: I1008 13:13:57.138998 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/73b19a4f-4794-49fc-bfd9-727618f11e7c-scripts\") pod \"ovn-controller-pbkkt\" (UID: \"73b19a4f-4794-49fc-bfd9-727618f11e7c\") " pod="openstack/ovn-controller-pbkkt" Oct 08 13:13:57 crc kubenswrapper[4735]: I1008 13:13:57.139029 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb-var-lib\") pod \"ovn-controller-ovs-9frwj\" (UID: \"b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb\") " pod="openstack/ovn-controller-ovs-9frwj" Oct 08 13:13:57 crc kubenswrapper[4735]: I1008 13:13:57.139073 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb-etc-ovs\") pod \"ovn-controller-ovs-9frwj\" (UID: \"b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb\") " pod="openstack/ovn-controller-ovs-9frwj" Oct 08 13:13:57 crc kubenswrapper[4735]: I1008 13:13:57.139105 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb-var-run\") pod \"ovn-controller-ovs-9frwj\" (UID: \"b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb\") " pod="openstack/ovn-controller-ovs-9frwj" Oct 08 13:13:57 crc kubenswrapper[4735]: I1008 13:13:57.139170 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb-var-log\") pod \"ovn-controller-ovs-9frwj\" (UID: \"b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb\") " 
pod="openstack/ovn-controller-ovs-9frwj" Oct 08 13:13:57 crc kubenswrapper[4735]: I1008 13:13:57.139211 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73b19a4f-4794-49fc-bfd9-727618f11e7c-combined-ca-bundle\") pod \"ovn-controller-pbkkt\" (UID: \"73b19a4f-4794-49fc-bfd9-727618f11e7c\") " pod="openstack/ovn-controller-pbkkt" Oct 08 13:13:57 crc kubenswrapper[4735]: I1008 13:13:57.139274 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqbpj\" (UniqueName: \"kubernetes.io/projected/73b19a4f-4794-49fc-bfd9-727618f11e7c-kube-api-access-mqbpj\") pod \"ovn-controller-pbkkt\" (UID: \"73b19a4f-4794-49fc-bfd9-727618f11e7c\") " pod="openstack/ovn-controller-pbkkt" Oct 08 13:13:57 crc kubenswrapper[4735]: I1008 13:13:57.139348 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/73b19a4f-4794-49fc-bfd9-727618f11e7c-ovn-controller-tls-certs\") pod \"ovn-controller-pbkkt\" (UID: \"73b19a4f-4794-49fc-bfd9-727618f11e7c\") " pod="openstack/ovn-controller-pbkkt" Oct 08 13:13:57 crc kubenswrapper[4735]: I1008 13:13:57.139378 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/73b19a4f-4794-49fc-bfd9-727618f11e7c-var-log-ovn\") pod \"ovn-controller-pbkkt\" (UID: \"73b19a4f-4794-49fc-bfd9-727618f11e7c\") " pod="openstack/ovn-controller-pbkkt" Oct 08 13:13:57 crc kubenswrapper[4735]: I1008 13:13:57.139454 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nq95g\" (UniqueName: \"kubernetes.io/projected/b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb-kube-api-access-nq95g\") pod \"ovn-controller-ovs-9frwj\" (UID: \"b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb\") " pod="openstack/ovn-controller-ovs-9frwj" Oct 08 13:13:57 crc 
kubenswrapper[4735]: I1008 13:13:57.139569 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/73b19a4f-4794-49fc-bfd9-727618f11e7c-var-run-ovn\") pod \"ovn-controller-pbkkt\" (UID: \"73b19a4f-4794-49fc-bfd9-727618f11e7c\") " pod="openstack/ovn-controller-pbkkt" Oct 08 13:13:57 crc kubenswrapper[4735]: I1008 13:13:57.139714 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/73b19a4f-4794-49fc-bfd9-727618f11e7c-var-run\") pod \"ovn-controller-pbkkt\" (UID: \"73b19a4f-4794-49fc-bfd9-727618f11e7c\") " pod="openstack/ovn-controller-pbkkt" Oct 08 13:13:57 crc kubenswrapper[4735]: I1008 13:13:57.140865 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/73b19a4f-4794-49fc-bfd9-727618f11e7c-var-log-ovn\") pod \"ovn-controller-pbkkt\" (UID: \"73b19a4f-4794-49fc-bfd9-727618f11e7c\") " pod="openstack/ovn-controller-pbkkt" Oct 08 13:13:57 crc kubenswrapper[4735]: I1008 13:13:57.142069 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/73b19a4f-4794-49fc-bfd9-727618f11e7c-scripts\") pod \"ovn-controller-pbkkt\" (UID: \"73b19a4f-4794-49fc-bfd9-727618f11e7c\") " pod="openstack/ovn-controller-pbkkt" Oct 08 13:13:57 crc kubenswrapper[4735]: I1008 13:13:57.145036 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/73b19a4f-4794-49fc-bfd9-727618f11e7c-ovn-controller-tls-certs\") pod \"ovn-controller-pbkkt\" (UID: \"73b19a4f-4794-49fc-bfd9-727618f11e7c\") " pod="openstack/ovn-controller-pbkkt" Oct 08 13:13:57 crc kubenswrapper[4735]: I1008 13:13:57.145917 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/73b19a4f-4794-49fc-bfd9-727618f11e7c-combined-ca-bundle\") pod \"ovn-controller-pbkkt\" (UID: \"73b19a4f-4794-49fc-bfd9-727618f11e7c\") " pod="openstack/ovn-controller-pbkkt" Oct 08 13:13:57 crc kubenswrapper[4735]: I1008 13:13:57.161150 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqbpj\" (UniqueName: \"kubernetes.io/projected/73b19a4f-4794-49fc-bfd9-727618f11e7c-kube-api-access-mqbpj\") pod \"ovn-controller-pbkkt\" (UID: \"73b19a4f-4794-49fc-bfd9-727618f11e7c\") " pod="openstack/ovn-controller-pbkkt" Oct 08 13:13:57 crc kubenswrapper[4735]: I1008 13:13:57.240483 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nq95g\" (UniqueName: \"kubernetes.io/projected/b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb-kube-api-access-nq95g\") pod \"ovn-controller-ovs-9frwj\" (UID: \"b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb\") " pod="openstack/ovn-controller-ovs-9frwj" Oct 08 13:13:57 crc kubenswrapper[4735]: I1008 13:13:57.240546 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb-scripts\") pod \"ovn-controller-ovs-9frwj\" (UID: \"b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb\") " pod="openstack/ovn-controller-ovs-9frwj" Oct 08 13:13:57 crc kubenswrapper[4735]: I1008 13:13:57.240585 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb-var-lib\") pod \"ovn-controller-ovs-9frwj\" (UID: \"b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb\") " pod="openstack/ovn-controller-ovs-9frwj" Oct 08 13:13:57 crc kubenswrapper[4735]: I1008 13:13:57.240612 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb-etc-ovs\") pod \"ovn-controller-ovs-9frwj\" (UID: 
\"b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb\") " pod="openstack/ovn-controller-ovs-9frwj" Oct 08 13:13:57 crc kubenswrapper[4735]: I1008 13:13:57.240632 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb-var-run\") pod \"ovn-controller-ovs-9frwj\" (UID: \"b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb\") " pod="openstack/ovn-controller-ovs-9frwj" Oct 08 13:13:57 crc kubenswrapper[4735]: I1008 13:13:57.240663 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb-var-log\") pod \"ovn-controller-ovs-9frwj\" (UID: \"b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb\") " pod="openstack/ovn-controller-ovs-9frwj" Oct 08 13:13:57 crc kubenswrapper[4735]: I1008 13:13:57.240773 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb-var-run\") pod \"ovn-controller-ovs-9frwj\" (UID: \"b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb\") " pod="openstack/ovn-controller-ovs-9frwj" Oct 08 13:13:57 crc kubenswrapper[4735]: I1008 13:13:57.240841 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb-var-log\") pod \"ovn-controller-ovs-9frwj\" (UID: \"b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb\") " pod="openstack/ovn-controller-ovs-9frwj" Oct 08 13:13:57 crc kubenswrapper[4735]: I1008 13:13:57.240893 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb-var-lib\") pod \"ovn-controller-ovs-9frwj\" (UID: \"b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb\") " pod="openstack/ovn-controller-ovs-9frwj" Oct 08 13:13:57 crc kubenswrapper[4735]: I1008 13:13:57.240894 4735 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb-etc-ovs\") pod \"ovn-controller-ovs-9frwj\" (UID: \"b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb\") " pod="openstack/ovn-controller-ovs-9frwj" Oct 08 13:13:57 crc kubenswrapper[4735]: I1008 13:13:57.243633 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb-scripts\") pod \"ovn-controller-ovs-9frwj\" (UID: \"b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb\") " pod="openstack/ovn-controller-ovs-9frwj" Oct 08 13:13:57 crc kubenswrapper[4735]: I1008 13:13:57.262152 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nq95g\" (UniqueName: \"kubernetes.io/projected/b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb-kube-api-access-nq95g\") pod \"ovn-controller-ovs-9frwj\" (UID: \"b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb\") " pod="openstack/ovn-controller-ovs-9frwj" Oct 08 13:13:57 crc kubenswrapper[4735]: I1008 13:13:57.281834 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-pbkkt" Oct 08 13:13:57 crc kubenswrapper[4735]: I1008 13:13:57.361413 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-9frwj" Oct 08 13:13:57 crc kubenswrapper[4735]: I1008 13:13:57.834065 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 08 13:13:57 crc kubenswrapper[4735]: I1008 13:13:57.835238 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 08 13:13:57 crc kubenswrapper[4735]: I1008 13:13:57.848562 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Oct 08 13:13:57 crc kubenswrapper[4735]: I1008 13:13:57.848839 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Oct 08 13:13:57 crc kubenswrapper[4735]: I1008 13:13:57.849832 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Oct 08 13:13:57 crc kubenswrapper[4735]: I1008 13:13:57.849972 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-d85cp" Oct 08 13:13:57 crc kubenswrapper[4735]: I1008 13:13:57.850155 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Oct 08 13:13:57 crc kubenswrapper[4735]: I1008 13:13:57.869761 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 08 13:13:57 crc kubenswrapper[4735]: I1008 13:13:57.952621 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"52f1329e-e0dc-4295-8dfc-92fae02065f2\") " pod="openstack/ovsdbserver-nb-0" Oct 08 13:13:57 crc kubenswrapper[4735]: I1008 13:13:57.952676 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/52f1329e-e0dc-4295-8dfc-92fae02065f2-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"52f1329e-e0dc-4295-8dfc-92fae02065f2\") " pod="openstack/ovsdbserver-nb-0" Oct 08 13:13:57 crc kubenswrapper[4735]: I1008 13:13:57.952697 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52f1329e-e0dc-4295-8dfc-92fae02065f2-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"52f1329e-e0dc-4295-8dfc-92fae02065f2\") " pod="openstack/ovsdbserver-nb-0" Oct 08 13:13:57 crc kubenswrapper[4735]: I1008 13:13:57.952717 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/52f1329e-e0dc-4295-8dfc-92fae02065f2-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"52f1329e-e0dc-4295-8dfc-92fae02065f2\") " pod="openstack/ovsdbserver-nb-0" Oct 08 13:13:57 crc kubenswrapper[4735]: I1008 13:13:57.952783 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/52f1329e-e0dc-4295-8dfc-92fae02065f2-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"52f1329e-e0dc-4295-8dfc-92fae02065f2\") " pod="openstack/ovsdbserver-nb-0" Oct 08 13:13:57 crc kubenswrapper[4735]: I1008 13:13:57.952817 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdv8k\" (UniqueName: \"kubernetes.io/projected/52f1329e-e0dc-4295-8dfc-92fae02065f2-kube-api-access-pdv8k\") pod \"ovsdbserver-nb-0\" (UID: \"52f1329e-e0dc-4295-8dfc-92fae02065f2\") " pod="openstack/ovsdbserver-nb-0" Oct 08 13:13:57 crc kubenswrapper[4735]: I1008 13:13:57.952854 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52f1329e-e0dc-4295-8dfc-92fae02065f2-config\") pod \"ovsdbserver-nb-0\" (UID: \"52f1329e-e0dc-4295-8dfc-92fae02065f2\") " pod="openstack/ovsdbserver-nb-0" Oct 08 13:13:57 crc kubenswrapper[4735]: I1008 13:13:57.952879 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/52f1329e-e0dc-4295-8dfc-92fae02065f2-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"52f1329e-e0dc-4295-8dfc-92fae02065f2\") " pod="openstack/ovsdbserver-nb-0" Oct 08 13:13:58 crc kubenswrapper[4735]: I1008 13:13:58.054610 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/52f1329e-e0dc-4295-8dfc-92fae02065f2-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"52f1329e-e0dc-4295-8dfc-92fae02065f2\") " pod="openstack/ovsdbserver-nb-0" Oct 08 13:13:58 crc kubenswrapper[4735]: I1008 13:13:58.055345 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/52f1329e-e0dc-4295-8dfc-92fae02065f2-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"52f1329e-e0dc-4295-8dfc-92fae02065f2\") " pod="openstack/ovsdbserver-nb-0" Oct 08 13:13:58 crc kubenswrapper[4735]: I1008 13:13:58.054981 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdv8k\" (UniqueName: \"kubernetes.io/projected/52f1329e-e0dc-4295-8dfc-92fae02065f2-kube-api-access-pdv8k\") pod \"ovsdbserver-nb-0\" (UID: \"52f1329e-e0dc-4295-8dfc-92fae02065f2\") " pod="openstack/ovsdbserver-nb-0" Oct 08 13:13:58 crc kubenswrapper[4735]: I1008 13:13:58.055628 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52f1329e-e0dc-4295-8dfc-92fae02065f2-config\") pod \"ovsdbserver-nb-0\" (UID: \"52f1329e-e0dc-4295-8dfc-92fae02065f2\") " pod="openstack/ovsdbserver-nb-0" Oct 08 13:13:58 crc kubenswrapper[4735]: I1008 13:13:58.055663 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/52f1329e-e0dc-4295-8dfc-92fae02065f2-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"52f1329e-e0dc-4295-8dfc-92fae02065f2\") " pod="openstack/ovsdbserver-nb-0" Oct 08 13:13:58 crc 
kubenswrapper[4735]: I1008 13:13:58.055735 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"52f1329e-e0dc-4295-8dfc-92fae02065f2\") " pod="openstack/ovsdbserver-nb-0" Oct 08 13:13:58 crc kubenswrapper[4735]: I1008 13:13:58.055757 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/52f1329e-e0dc-4295-8dfc-92fae02065f2-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"52f1329e-e0dc-4295-8dfc-92fae02065f2\") " pod="openstack/ovsdbserver-nb-0" Oct 08 13:13:58 crc kubenswrapper[4735]: I1008 13:13:58.055777 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52f1329e-e0dc-4295-8dfc-92fae02065f2-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"52f1329e-e0dc-4295-8dfc-92fae02065f2\") " pod="openstack/ovsdbserver-nb-0" Oct 08 13:13:58 crc kubenswrapper[4735]: I1008 13:13:58.055798 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/52f1329e-e0dc-4295-8dfc-92fae02065f2-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"52f1329e-e0dc-4295-8dfc-92fae02065f2\") " pod="openstack/ovsdbserver-nb-0" Oct 08 13:13:58 crc kubenswrapper[4735]: I1008 13:13:58.056751 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"52f1329e-e0dc-4295-8dfc-92fae02065f2\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/ovsdbserver-nb-0" Oct 08 13:13:58 crc kubenswrapper[4735]: I1008 13:13:58.056813 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/52f1329e-e0dc-4295-8dfc-92fae02065f2-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"52f1329e-e0dc-4295-8dfc-92fae02065f2\") " pod="openstack/ovsdbserver-nb-0" Oct 08 13:13:58 crc kubenswrapper[4735]: I1008 13:13:58.057593 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52f1329e-e0dc-4295-8dfc-92fae02065f2-config\") pod \"ovsdbserver-nb-0\" (UID: \"52f1329e-e0dc-4295-8dfc-92fae02065f2\") " pod="openstack/ovsdbserver-nb-0" Oct 08 13:13:58 crc kubenswrapper[4735]: I1008 13:13:58.062307 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52f1329e-e0dc-4295-8dfc-92fae02065f2-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"52f1329e-e0dc-4295-8dfc-92fae02065f2\") " pod="openstack/ovsdbserver-nb-0" Oct 08 13:13:58 crc kubenswrapper[4735]: I1008 13:13:58.064229 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/52f1329e-e0dc-4295-8dfc-92fae02065f2-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"52f1329e-e0dc-4295-8dfc-92fae02065f2\") " pod="openstack/ovsdbserver-nb-0" Oct 08 13:13:58 crc kubenswrapper[4735]: I1008 13:13:58.070653 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/52f1329e-e0dc-4295-8dfc-92fae02065f2-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"52f1329e-e0dc-4295-8dfc-92fae02065f2\") " pod="openstack/ovsdbserver-nb-0" Oct 08 13:13:58 crc kubenswrapper[4735]: I1008 13:13:58.075230 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdv8k\" (UniqueName: \"kubernetes.io/projected/52f1329e-e0dc-4295-8dfc-92fae02065f2-kube-api-access-pdv8k\") pod \"ovsdbserver-nb-0\" (UID: \"52f1329e-e0dc-4295-8dfc-92fae02065f2\") " 
pod="openstack/ovsdbserver-nb-0" Oct 08 13:13:58 crc kubenswrapper[4735]: I1008 13:13:58.094764 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"52f1329e-e0dc-4295-8dfc-92fae02065f2\") " pod="openstack/ovsdbserver-nb-0" Oct 08 13:13:58 crc kubenswrapper[4735]: I1008 13:13:58.156828 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 08 13:14:00 crc kubenswrapper[4735]: I1008 13:14:00.243892 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 08 13:14:00 crc kubenswrapper[4735]: I1008 13:14:00.246412 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 08 13:14:00 crc kubenswrapper[4735]: I1008 13:14:00.249777 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Oct 08 13:14:00 crc kubenswrapper[4735]: I1008 13:14:00.249855 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Oct 08 13:14:00 crc kubenswrapper[4735]: I1008 13:14:00.249984 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Oct 08 13:14:00 crc kubenswrapper[4735]: I1008 13:14:00.251011 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-7vpfr" Oct 08 13:14:00 crc kubenswrapper[4735]: I1008 13:14:00.257176 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 08 13:14:00 crc kubenswrapper[4735]: I1008 13:14:00.403965 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: 
\"647cfc89-264d-4616-ab81-64a7c9aa51c5\") " pod="openstack/ovsdbserver-sb-0" Oct 08 13:14:00 crc kubenswrapper[4735]: I1008 13:14:00.404390 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/647cfc89-264d-4616-ab81-64a7c9aa51c5-config\") pod \"ovsdbserver-sb-0\" (UID: \"647cfc89-264d-4616-ab81-64a7c9aa51c5\") " pod="openstack/ovsdbserver-sb-0" Oct 08 13:14:00 crc kubenswrapper[4735]: I1008 13:14:00.404419 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/647cfc89-264d-4616-ab81-64a7c9aa51c5-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"647cfc89-264d-4616-ab81-64a7c9aa51c5\") " pod="openstack/ovsdbserver-sb-0" Oct 08 13:14:00 crc kubenswrapper[4735]: I1008 13:14:00.404464 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/647cfc89-264d-4616-ab81-64a7c9aa51c5-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"647cfc89-264d-4616-ab81-64a7c9aa51c5\") " pod="openstack/ovsdbserver-sb-0" Oct 08 13:14:00 crc kubenswrapper[4735]: I1008 13:14:00.404524 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/647cfc89-264d-4616-ab81-64a7c9aa51c5-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"647cfc89-264d-4616-ab81-64a7c9aa51c5\") " pod="openstack/ovsdbserver-sb-0" Oct 08 13:14:00 crc kubenswrapper[4735]: I1008 13:14:00.404688 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d852b\" (UniqueName: \"kubernetes.io/projected/647cfc89-264d-4616-ab81-64a7c9aa51c5-kube-api-access-d852b\") pod \"ovsdbserver-sb-0\" (UID: \"647cfc89-264d-4616-ab81-64a7c9aa51c5\") " 
pod="openstack/ovsdbserver-sb-0" Oct 08 13:14:00 crc kubenswrapper[4735]: I1008 13:14:00.404798 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/647cfc89-264d-4616-ab81-64a7c9aa51c5-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"647cfc89-264d-4616-ab81-64a7c9aa51c5\") " pod="openstack/ovsdbserver-sb-0" Oct 08 13:14:00 crc kubenswrapper[4735]: I1008 13:14:00.404823 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/647cfc89-264d-4616-ab81-64a7c9aa51c5-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"647cfc89-264d-4616-ab81-64a7c9aa51c5\") " pod="openstack/ovsdbserver-sb-0" Oct 08 13:14:00 crc kubenswrapper[4735]: I1008 13:14:00.505979 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/647cfc89-264d-4616-ab81-64a7c9aa51c5-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"647cfc89-264d-4616-ab81-64a7c9aa51c5\") " pod="openstack/ovsdbserver-sb-0" Oct 08 13:14:00 crc kubenswrapper[4735]: I1008 13:14:00.506284 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/647cfc89-264d-4616-ab81-64a7c9aa51c5-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"647cfc89-264d-4616-ab81-64a7c9aa51c5\") " pod="openstack/ovsdbserver-sb-0" Oct 08 13:14:00 crc kubenswrapper[4735]: I1008 13:14:00.506405 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"647cfc89-264d-4616-ab81-64a7c9aa51c5\") " pod="openstack/ovsdbserver-sb-0" Oct 08 13:14:00 crc kubenswrapper[4735]: I1008 13:14:00.506582 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/647cfc89-264d-4616-ab81-64a7c9aa51c5-config\") pod \"ovsdbserver-sb-0\" (UID: \"647cfc89-264d-4616-ab81-64a7c9aa51c5\") " pod="openstack/ovsdbserver-sb-0" Oct 08 13:14:00 crc kubenswrapper[4735]: I1008 13:14:00.506707 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/647cfc89-264d-4616-ab81-64a7c9aa51c5-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"647cfc89-264d-4616-ab81-64a7c9aa51c5\") " pod="openstack/ovsdbserver-sb-0" Oct 08 13:14:00 crc kubenswrapper[4735]: I1008 13:14:00.506829 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/647cfc89-264d-4616-ab81-64a7c9aa51c5-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"647cfc89-264d-4616-ab81-64a7c9aa51c5\") " pod="openstack/ovsdbserver-sb-0" Oct 08 13:14:00 crc kubenswrapper[4735]: I1008 13:14:00.506938 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/647cfc89-264d-4616-ab81-64a7c9aa51c5-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"647cfc89-264d-4616-ab81-64a7c9aa51c5\") " pod="openstack/ovsdbserver-sb-0" Oct 08 13:14:00 crc kubenswrapper[4735]: I1008 13:14:00.507105 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d852b\" (UniqueName: \"kubernetes.io/projected/647cfc89-264d-4616-ab81-64a7c9aa51c5-kube-api-access-d852b\") pod \"ovsdbserver-sb-0\" (UID: \"647cfc89-264d-4616-ab81-64a7c9aa51c5\") " pod="openstack/ovsdbserver-sb-0" Oct 08 13:14:00 crc kubenswrapper[4735]: I1008 13:14:00.507318 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/647cfc89-264d-4616-ab81-64a7c9aa51c5-config\") pod \"ovsdbserver-sb-0\" (UID: 
\"647cfc89-264d-4616-ab81-64a7c9aa51c5\") " pod="openstack/ovsdbserver-sb-0" Oct 08 13:14:00 crc kubenswrapper[4735]: I1008 13:14:00.506715 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"647cfc89-264d-4616-ab81-64a7c9aa51c5\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/ovsdbserver-sb-0" Oct 08 13:14:00 crc kubenswrapper[4735]: I1008 13:14:00.507626 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/647cfc89-264d-4616-ab81-64a7c9aa51c5-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"647cfc89-264d-4616-ab81-64a7c9aa51c5\") " pod="openstack/ovsdbserver-sb-0" Oct 08 13:14:00 crc kubenswrapper[4735]: I1008 13:14:00.507996 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/647cfc89-264d-4616-ab81-64a7c9aa51c5-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"647cfc89-264d-4616-ab81-64a7c9aa51c5\") " pod="openstack/ovsdbserver-sb-0" Oct 08 13:14:00 crc kubenswrapper[4735]: I1008 13:14:00.510707 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/647cfc89-264d-4616-ab81-64a7c9aa51c5-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"647cfc89-264d-4616-ab81-64a7c9aa51c5\") " pod="openstack/ovsdbserver-sb-0" Oct 08 13:14:00 crc kubenswrapper[4735]: I1008 13:14:00.517397 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/647cfc89-264d-4616-ab81-64a7c9aa51c5-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"647cfc89-264d-4616-ab81-64a7c9aa51c5\") " pod="openstack/ovsdbserver-sb-0" Oct 08 13:14:00 crc kubenswrapper[4735]: I1008 13:14:00.517683 4735 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/647cfc89-264d-4616-ab81-64a7c9aa51c5-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"647cfc89-264d-4616-ab81-64a7c9aa51c5\") " pod="openstack/ovsdbserver-sb-0" Oct 08 13:14:00 crc kubenswrapper[4735]: I1008 13:14:00.527969 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d852b\" (UniqueName: \"kubernetes.io/projected/647cfc89-264d-4616-ab81-64a7c9aa51c5-kube-api-access-d852b\") pod \"ovsdbserver-sb-0\" (UID: \"647cfc89-264d-4616-ab81-64a7c9aa51c5\") " pod="openstack/ovsdbserver-sb-0" Oct 08 13:14:00 crc kubenswrapper[4735]: I1008 13:14:00.537037 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"647cfc89-264d-4616-ab81-64a7c9aa51c5\") " pod="openstack/ovsdbserver-sb-0" Oct 08 13:14:00 crc kubenswrapper[4735]: I1008 13:14:00.576766 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 08 13:14:01 crc kubenswrapper[4735]: E1008 13:14:01.814075 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:c4e71b2158fd939dad8b8e705273493051d3023273d23b279f2699dce6db33df" Oct 08 13:14:01 crc kubenswrapper[4735]: E1008 13:14:01.814489 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:c4e71b2158fd939dad8b8e705273493051d3023273d23b279f2699dce6db33df,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-67h52,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities
{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-7bfcb9d745-hxfqv_openstack(a1792d89-af88-4a9a-b874-e8e6e7655c04): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 08 13:14:01 crc kubenswrapper[4735]: E1008 13:14:01.816117 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-7bfcb9d745-hxfqv" podUID="a1792d89-af88-4a9a-b874-e8e6e7655c04" Oct 08 13:14:02 crc kubenswrapper[4735]: I1008 13:14:02.089021 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 08 13:14:07 crc kubenswrapper[4735]: W1008 13:14:07.141029 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6927451_d206_4948_a43e_5b4ee59dddcc.slice/crio-fe3d5c22dfc05728732f5d67d080c8fac6dee5ca6f6a923e86129ac6a4e105a0 WatchSource:0}: Error finding container fe3d5c22dfc05728732f5d67d080c8fac6dee5ca6f6a923e86129ac6a4e105a0: Status 404 returned error can't find the container with id fe3d5c22dfc05728732f5d67d080c8fac6dee5ca6f6a923e86129ac6a4e105a0 Oct 08 13:14:07 crc kubenswrapper[4735]: I1008 13:14:07.358271 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bfcb9d745-hxfqv" Oct 08 13:14:07 crc kubenswrapper[4735]: I1008 13:14:07.525136 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1792d89-af88-4a9a-b874-e8e6e7655c04-config\") pod \"a1792d89-af88-4a9a-b874-e8e6e7655c04\" (UID: \"a1792d89-af88-4a9a-b874-e8e6e7655c04\") " Oct 08 13:14:07 crc kubenswrapper[4735]: I1008 13:14:07.525263 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67h52\" (UniqueName: \"kubernetes.io/projected/a1792d89-af88-4a9a-b874-e8e6e7655c04-kube-api-access-67h52\") pod \"a1792d89-af88-4a9a-b874-e8e6e7655c04\" (UID: \"a1792d89-af88-4a9a-b874-e8e6e7655c04\") " Oct 08 13:14:07 crc kubenswrapper[4735]: I1008 13:14:07.525820 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1792d89-af88-4a9a-b874-e8e6e7655c04-config" (OuterVolumeSpecName: "config") pod "a1792d89-af88-4a9a-b874-e8e6e7655c04" (UID: "a1792d89-af88-4a9a-b874-e8e6e7655c04"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:14:07 crc kubenswrapper[4735]: I1008 13:14:07.549646 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1792d89-af88-4a9a-b874-e8e6e7655c04-kube-api-access-67h52" (OuterVolumeSpecName: "kube-api-access-67h52") pod "a1792d89-af88-4a9a-b874-e8e6e7655c04" (UID: "a1792d89-af88-4a9a-b874-e8e6e7655c04"). InnerVolumeSpecName "kube-api-access-67h52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:14:07 crc kubenswrapper[4735]: I1008 13:14:07.626974 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1792d89-af88-4a9a-b874-e8e6e7655c04-config\") on node \"crc\" DevicePath \"\"" Oct 08 13:14:07 crc kubenswrapper[4735]: I1008 13:14:07.627285 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67h52\" (UniqueName: \"kubernetes.io/projected/a1792d89-af88-4a9a-b874-e8e6e7655c04-kube-api-access-67h52\") on node \"crc\" DevicePath \"\"" Oct 08 13:14:07 crc kubenswrapper[4735]: I1008 13:14:07.822313 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 08 13:14:07 crc kubenswrapper[4735]: W1008 13:14:07.823610 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2868ea84_fef0_4728_9a62_e2ba97a9aae1.slice/crio-f8cf2dd4fa24e9e5269420c461c2aa5154e09910f6f203aae725440d3928dc0e WatchSource:0}: Error finding container f8cf2dd4fa24e9e5269420c461c2aa5154e09910f6f203aae725440d3928dc0e: Status 404 returned error can't find the container with id f8cf2dd4fa24e9e5269420c461c2aa5154e09910f6f203aae725440d3928dc0e Oct 08 13:14:07 crc kubenswrapper[4735]: W1008 13:14:07.825562 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2428d97_aa22_413f_ab58_5b077fdde95c.slice/crio-4c4be7c93ac7be1aaa6951deaf23df23d213e01d8665f74d96003e4f40f780f6 WatchSource:0}: Error finding container 4c4be7c93ac7be1aaa6951deaf23df23d213e01d8665f74d96003e4f40f780f6: Status 404 returned error can't find the container with id 4c4be7c93ac7be1aaa6951deaf23df23d213e01d8665f74d96003e4f40f780f6 Oct 08 13:14:07 crc kubenswrapper[4735]: I1008 13:14:07.834278 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 08 
13:14:07 crc kubenswrapper[4735]: I1008 13:14:07.927513 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 08 13:14:08 crc kubenswrapper[4735]: I1008 13:14:08.010659 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 08 13:14:08 crc kubenswrapper[4735]: W1008 13:14:08.021085 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podadd5914b_96e1_4d52_8648_d3429ea33b0d.slice/crio-f8cbee69405022076c2306e8dec490c25b0c31b087e1c6072aa140749ef931b3 WatchSource:0}: Error finding container f8cbee69405022076c2306e8dec490c25b0c31b087e1c6072aa140749ef931b3: Status 404 returned error can't find the container with id f8cbee69405022076c2306e8dec490c25b0c31b087e1c6072aa140749ef931b3 Oct 08 13:14:08 crc kubenswrapper[4735]: I1008 13:14:08.022446 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-pbkkt"] Oct 08 13:14:08 crc kubenswrapper[4735]: W1008 13:14:08.032732 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb3d3f8ec_fa4f_4d33_8a33_eac3edce38eb.slice/crio-f22c7151017feeaffeab02436b73af72e45d237482cf4e79a60f12d816e2c1a8 WatchSource:0}: Error finding container f22c7151017feeaffeab02436b73af72e45d237482cf4e79a60f12d816e2c1a8: Status 404 returned error can't find the container with id f22c7151017feeaffeab02436b73af72e45d237482cf4e79a60f12d816e2c1a8 Oct 08 13:14:08 crc kubenswrapper[4735]: I1008 13:14:08.036582 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-9frwj"] Oct 08 13:14:08 crc kubenswrapper[4735]: I1008 13:14:08.134654 4735 generic.go:334] "Generic (PLEG): container finished" podID="191f3e26-da78-4391-9f78-66d43c758300" containerID="ed9a0ec35414313bdd77fb53fa3b07a54a99e37a423c0797dc28dc3531a1a0b9" exitCode=0 Oct 08 13:14:08 crc kubenswrapper[4735]: 
I1008 13:14:08.134757 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-644597f84c-48kkw" event={"ID":"191f3e26-da78-4391-9f78-66d43c758300","Type":"ContainerDied","Data":"ed9a0ec35414313bdd77fb53fa3b07a54a99e37a423c0797dc28dc3531a1a0b9"} Oct 08 13:14:08 crc kubenswrapper[4735]: I1008 13:14:08.136580 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-pbkkt" event={"ID":"73b19a4f-4794-49fc-bfd9-727618f11e7c","Type":"ContainerStarted","Data":"6000cbfb80846d28ff664946af0c6e79f1d4bfc29b423770d8f89142dd6fe3d3"} Oct 08 13:14:08 crc kubenswrapper[4735]: I1008 13:14:08.145802 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"f6927451-d206-4948-a43e-5b4ee59dddcc","Type":"ContainerStarted","Data":"fe3d5c22dfc05728732f5d67d080c8fac6dee5ca6f6a923e86129ac6a4e105a0"} Oct 08 13:14:08 crc kubenswrapper[4735]: I1008 13:14:08.151011 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"2868ea84-fef0-4728-9a62-e2ba97a9aae1","Type":"ContainerStarted","Data":"f8cf2dd4fa24e9e5269420c461c2aa5154e09910f6f203aae725440d3928dc0e"} Oct 08 13:14:08 crc kubenswrapper[4735]: I1008 13:14:08.155114 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"f2428d97-aa22-413f-ab58-5b077fdde95c","Type":"ContainerStarted","Data":"4c4be7c93ac7be1aaa6951deaf23df23d213e01d8665f74d96003e4f40f780f6"} Oct 08 13:14:08 crc kubenswrapper[4735]: I1008 13:14:08.158161 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9frwj" event={"ID":"b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb","Type":"ContainerStarted","Data":"f22c7151017feeaffeab02436b73af72e45d237482cf4e79a60f12d816e2c1a8"} Oct 08 13:14:08 crc kubenswrapper[4735]: I1008 13:14:08.161784 4735 generic.go:334] "Generic (PLEG): container finished" podID="dbf29341-130d-4282-b4a2-31e745b9a6c1" 
containerID="90cead34c38543d5b210e9e02e51b48a4b6cbc765660613866b6ebe3ceb1ea92" exitCode=0 Oct 08 13:14:08 crc kubenswrapper[4735]: I1008 13:14:08.161856 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77597f887-vdnj6" event={"ID":"dbf29341-130d-4282-b4a2-31e745b9a6c1","Type":"ContainerDied","Data":"90cead34c38543d5b210e9e02e51b48a4b6cbc765660613866b6ebe3ceb1ea92"} Oct 08 13:14:08 crc kubenswrapper[4735]: I1008 13:14:08.168758 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"add5914b-96e1-4d52-8648-d3429ea33b0d","Type":"ContainerStarted","Data":"f8cbee69405022076c2306e8dec490c25b0c31b087e1c6072aa140749ef931b3"} Oct 08 13:14:08 crc kubenswrapper[4735]: I1008 13:14:08.171753 4735 generic.go:334] "Generic (PLEG): container finished" podID="978b94d0-ee60-4ff0-9721-4d24ac634aa4" containerID="047a38c482d06bdf87e6a9dc5d333ab61ff3745f089602a2281d59131e7862cb" exitCode=0 Oct 08 13:14:08 crc kubenswrapper[4735]: I1008 13:14:08.172093 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-758b79db4c-dx9nh" event={"ID":"978b94d0-ee60-4ff0-9721-4d24ac634aa4","Type":"ContainerDied","Data":"047a38c482d06bdf87e6a9dc5d333ab61ff3745f089602a2281d59131e7862cb"} Oct 08 13:14:08 crc kubenswrapper[4735]: I1008 13:14:08.174207 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"647cfc89-264d-4616-ab81-64a7c9aa51c5","Type":"ContainerStarted","Data":"61736668a98b5785470a3aacd4c66220bb840a6df0f0287741176be813cc958a"} Oct 08 13:14:08 crc kubenswrapper[4735]: I1008 13:14:08.175265 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bfcb9d745-hxfqv" event={"ID":"a1792d89-af88-4a9a-b874-e8e6e7655c04","Type":"ContainerDied","Data":"08892270b6261b9ab40dda7beb57a1d0fdb9540cb72e6a98ad4c65d77b8f0699"} Oct 08 13:14:08 crc kubenswrapper[4735]: I1008 13:14:08.175334 4735 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bfcb9d745-hxfqv" Oct 08 13:14:08 crc kubenswrapper[4735]: I1008 13:14:08.430724 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bfcb9d745-hxfqv"] Oct 08 13:14:08 crc kubenswrapper[4735]: I1008 13:14:08.442974 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7bfcb9d745-hxfqv"] Oct 08 13:14:08 crc kubenswrapper[4735]: E1008 13:14:08.474235 4735 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Oct 08 13:14:08 crc kubenswrapper[4735]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/191f3e26-da78-4391-9f78-66d43c758300/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Oct 08 13:14:08 crc kubenswrapper[4735]: > podSandboxID="8998e17d1790056024c38212245730a5bbc2323a51af687e5f84a2ff3342368e" Oct 08 13:14:08 crc kubenswrapper[4735]: E1008 13:14:08.474705 4735 kuberuntime_manager.go:1274] "Unhandled Error" err=< Oct 08 13:14:08 crc kubenswrapper[4735]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:c4e71b2158fd939dad8b8e705273493051d3023273d23b279f2699dce6db33df,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zdzrt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-644597f84c-48kkw_openstack(191f3e26-da78-4391-9f78-66d43c758300): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/191f3e26-da78-4391-9f78-66d43c758300/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Oct 08 13:14:08 crc kubenswrapper[4735]: > logger="UnhandledError" Oct 08 13:14:08 crc kubenswrapper[4735]: E1008 13:14:08.475866 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/191f3e26-da78-4391-9f78-66d43c758300/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-644597f84c-48kkw" podUID="191f3e26-da78-4391-9f78-66d43c758300" Oct 08 13:14:08 crc kubenswrapper[4735]: I1008 13:14:08.687835 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-758b79db4c-dx9nh" Oct 08 13:14:08 crc kubenswrapper[4735]: I1008 13:14:08.734746 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 08 13:14:08 crc kubenswrapper[4735]: I1008 13:14:08.845595 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/978b94d0-ee60-4ff0-9721-4d24ac634aa4-config\") pod \"978b94d0-ee60-4ff0-9721-4d24ac634aa4\" (UID: \"978b94d0-ee60-4ff0-9721-4d24ac634aa4\") " Oct 08 13:14:08 crc kubenswrapper[4735]: I1008 13:14:08.845788 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/978b94d0-ee60-4ff0-9721-4d24ac634aa4-dns-svc\") pod \"978b94d0-ee60-4ff0-9721-4d24ac634aa4\" (UID: \"978b94d0-ee60-4ff0-9721-4d24ac634aa4\") " Oct 08 13:14:08 crc kubenswrapper[4735]: I1008 13:14:08.845911 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dj7qw\" (UniqueName: \"kubernetes.io/projected/978b94d0-ee60-4ff0-9721-4d24ac634aa4-kube-api-access-dj7qw\") pod \"978b94d0-ee60-4ff0-9721-4d24ac634aa4\" (UID: \"978b94d0-ee60-4ff0-9721-4d24ac634aa4\") " Oct 08 13:14:08 crc kubenswrapper[4735]: I1008 13:14:08.851444 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/978b94d0-ee60-4ff0-9721-4d24ac634aa4-kube-api-access-dj7qw" (OuterVolumeSpecName: "kube-api-access-dj7qw") pod "978b94d0-ee60-4ff0-9721-4d24ac634aa4" (UID: "978b94d0-ee60-4ff0-9721-4d24ac634aa4"). InnerVolumeSpecName "kube-api-access-dj7qw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:14:08 crc kubenswrapper[4735]: I1008 13:14:08.863539 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/978b94d0-ee60-4ff0-9721-4d24ac634aa4-config" (OuterVolumeSpecName: "config") pod "978b94d0-ee60-4ff0-9721-4d24ac634aa4" (UID: "978b94d0-ee60-4ff0-9721-4d24ac634aa4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:14:08 crc kubenswrapper[4735]: I1008 13:14:08.866289 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/978b94d0-ee60-4ff0-9721-4d24ac634aa4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "978b94d0-ee60-4ff0-9721-4d24ac634aa4" (UID: "978b94d0-ee60-4ff0-9721-4d24ac634aa4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:14:08 crc kubenswrapper[4735]: I1008 13:14:08.905934 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1792d89-af88-4a9a-b874-e8e6e7655c04" path="/var/lib/kubelet/pods/a1792d89-af88-4a9a-b874-e8e6e7655c04/volumes" Oct 08 13:14:08 crc kubenswrapper[4735]: I1008 13:14:08.947379 4735 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/978b94d0-ee60-4ff0-9721-4d24ac634aa4-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 13:14:08 crc kubenswrapper[4735]: I1008 13:14:08.947409 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dj7qw\" (UniqueName: \"kubernetes.io/projected/978b94d0-ee60-4ff0-9721-4d24ac634aa4-kube-api-access-dj7qw\") on node \"crc\" DevicePath \"\"" Oct 08 13:14:08 crc kubenswrapper[4735]: I1008 13:14:08.947421 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/978b94d0-ee60-4ff0-9721-4d24ac634aa4-config\") on node \"crc\" DevicePath \"\"" Oct 08 13:14:09 crc kubenswrapper[4735]: I1008 13:14:09.221934 
4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-758b79db4c-dx9nh" event={"ID":"978b94d0-ee60-4ff0-9721-4d24ac634aa4","Type":"ContainerDied","Data":"e3e165e2a4f663b0912240fe60a2f8d61b24888b4009a023d158c0fe2c8ed640"} Oct 08 13:14:09 crc kubenswrapper[4735]: I1008 13:14:09.222011 4735 scope.go:117] "RemoveContainer" containerID="047a38c482d06bdf87e6a9dc5d333ab61ff3745f089602a2281d59131e7862cb" Oct 08 13:14:09 crc kubenswrapper[4735]: I1008 13:14:09.222231 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-758b79db4c-dx9nh" Oct 08 13:14:09 crc kubenswrapper[4735]: I1008 13:14:09.228628 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77597f887-vdnj6" event={"ID":"dbf29341-130d-4282-b4a2-31e745b9a6c1","Type":"ContainerStarted","Data":"de2b2b8feae86d9c4fe7cf5c3f2d7ca1bd188126b7f873e35cf70f7755f18338"} Oct 08 13:14:09 crc kubenswrapper[4735]: I1008 13:14:09.230059 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77597f887-vdnj6" Oct 08 13:14:09 crc kubenswrapper[4735]: I1008 13:14:09.230773 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"01ad2819-cfc3-4416-8a27-ecb65c2fa366","Type":"ContainerStarted","Data":"8b5489cc563b906768d49040d0dc9d8ee683e0a1a8c2ad6e11328dcf2b18d053"} Oct 08 13:14:09 crc kubenswrapper[4735]: I1008 13:14:09.238956 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"644f0099-ec9c-42af-bbe8-54fe5b603bb3","Type":"ContainerStarted","Data":"08e5f43c73b04b459407c1ed506e3e085061ed6f53fe97fad46fc15c5a943ee0"} Oct 08 13:14:09 crc kubenswrapper[4735]: I1008 13:14:09.261848 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-758b79db4c-dx9nh"] Oct 08 13:14:09 crc kubenswrapper[4735]: I1008 13:14:09.267297 4735 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/dnsmasq-dns-758b79db4c-dx9nh"] Oct 08 13:14:09 crc kubenswrapper[4735]: I1008 13:14:09.283274 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77597f887-vdnj6" podStartSLOduration=3.904275153 podStartE2EDuration="23.283253087s" podCreationTimestamp="2025-10-08 13:13:46 +0000 UTC" firstStartedPulling="2025-10-08 13:13:47.90226573 +0000 UTC m=+1077.710570274" lastFinishedPulling="2025-10-08 13:14:07.281243664 +0000 UTC m=+1097.089548208" observedRunningTime="2025-10-08 13:14:09.283213596 +0000 UTC m=+1099.091518150" watchObservedRunningTime="2025-10-08 13:14:09.283253087 +0000 UTC m=+1099.091557631" Oct 08 13:14:10 crc kubenswrapper[4735]: I1008 13:14:10.251574 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-644597f84c-48kkw" event={"ID":"191f3e26-da78-4391-9f78-66d43c758300","Type":"ContainerStarted","Data":"73687dbb1afae726f63785f4eedd0b7d3f36f962797b6ecf31c728cc3d14c0b8"} Oct 08 13:14:10 crc kubenswrapper[4735]: I1008 13:14:10.254152 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"52f1329e-e0dc-4295-8dfc-92fae02065f2","Type":"ContainerStarted","Data":"0bb524772fddde9f18801bfeaa6d2817ae97294471e32ba726b187a9ff636401"} Oct 08 13:14:10 crc kubenswrapper[4735]: I1008 13:14:10.272781 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-644597f84c-48kkw" podStartSLOduration=4.569093723 podStartE2EDuration="24.272754452s" podCreationTimestamp="2025-10-08 13:13:46 +0000 UTC" firstStartedPulling="2025-10-08 13:13:47.554744566 +0000 UTC m=+1077.363049110" lastFinishedPulling="2025-10-08 13:14:07.258405295 +0000 UTC m=+1097.066709839" observedRunningTime="2025-10-08 13:14:10.268836971 +0000 UTC m=+1100.077141535" watchObservedRunningTime="2025-10-08 13:14:10.272754452 +0000 UTC m=+1100.081058996" Oct 08 13:14:10 crc kubenswrapper[4735]: I1008 13:14:10.905931 4735 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="978b94d0-ee60-4ff0-9721-4d24ac634aa4" path="/var/lib/kubelet/pods/978b94d0-ee60-4ff0-9721-4d24ac634aa4/volumes" Oct 08 13:14:12 crc kubenswrapper[4735]: I1008 13:14:12.019776 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-644597f84c-48kkw" Oct 08 13:14:16 crc kubenswrapper[4735]: I1008 13:14:16.303584 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"f2428d97-aa22-413f-ab58-5b077fdde95c","Type":"ContainerStarted","Data":"57d3f0e874e75cf6c5615fcd3798a1da6cbf40f7346cfb72e851d90feb4d8746"} Oct 08 13:14:16 crc kubenswrapper[4735]: I1008 13:14:16.305292 4735 generic.go:334] "Generic (PLEG): container finished" podID="b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb" containerID="679f2e8556db1c2268b078a0be0be92aab3d54643a41d63dd4c9ad894c440cee" exitCode=0 Oct 08 13:14:16 crc kubenswrapper[4735]: I1008 13:14:16.305406 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9frwj" event={"ID":"b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb","Type":"ContainerDied","Data":"679f2e8556db1c2268b078a0be0be92aab3d54643a41d63dd4c9ad894c440cee"} Oct 08 13:14:16 crc kubenswrapper[4735]: I1008 13:14:16.307100 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-pbkkt" event={"ID":"73b19a4f-4794-49fc-bfd9-727618f11e7c","Type":"ContainerStarted","Data":"d30b6d8ff48994cf5b030b6578cb0bfd0a226d0320a05c48eaa37fda8c089985"} Oct 08 13:14:16 crc kubenswrapper[4735]: I1008 13:14:16.307335 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-pbkkt" Oct 08 13:14:16 crc kubenswrapper[4735]: I1008 13:14:16.309474 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" 
event={"ID":"f6927451-d206-4948-a43e-5b4ee59dddcc","Type":"ContainerStarted","Data":"9fd6286f631f71ba317c16f921eb7e0bc5df5010bec0980750f899b1396c8b6e"} Oct 08 13:14:16 crc kubenswrapper[4735]: I1008 13:14:16.310042 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Oct 08 13:14:16 crc kubenswrapper[4735]: I1008 13:14:16.312867 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"add5914b-96e1-4d52-8648-d3429ea33b0d","Type":"ContainerStarted","Data":"9ee54bf2aa7fa2d2788207107b556c20ed7c52eb424aeccd9ddf2b2c0d2f7eb9"} Oct 08 13:14:16 crc kubenswrapper[4735]: I1008 13:14:16.313237 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 08 13:14:16 crc kubenswrapper[4735]: I1008 13:14:16.315951 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"2868ea84-fef0-4728-9a62-e2ba97a9aae1","Type":"ContainerStarted","Data":"7a5a1a14e53c9fb322ae90dae5d7e280ecbb3b21e0fceb0275319ed5fb89b446"} Oct 08 13:14:16 crc kubenswrapper[4735]: I1008 13:14:16.318374 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"52f1329e-e0dc-4295-8dfc-92fae02065f2","Type":"ContainerStarted","Data":"57cbc0b90f3af7903b01552834d11c94781f0f718d8c5c7767e3069c71b6a736"} Oct 08 13:14:16 crc kubenswrapper[4735]: I1008 13:14:16.328887 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"647cfc89-264d-4616-ab81-64a7c9aa51c5","Type":"ContainerStarted","Data":"a1a74be40986832ce5bc77d779a7f7c5cc2c984405440c30e09b161c8492e341"} Oct 08 13:14:16 crc kubenswrapper[4735]: I1008 13:14:16.365921 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-pbkkt" podStartSLOduration=13.684971078 podStartE2EDuration="20.365903356s" podCreationTimestamp="2025-10-08 13:13:56 +0000 UTC" 
firstStartedPulling="2025-10-08 13:14:08.02761647 +0000 UTC m=+1097.835921014" lastFinishedPulling="2025-10-08 13:14:14.708548748 +0000 UTC m=+1104.516853292" observedRunningTime="2025-10-08 13:14:16.361966452 +0000 UTC m=+1106.170271006" watchObservedRunningTime="2025-10-08 13:14:16.365903356 +0000 UTC m=+1106.174207900" Oct 08 13:14:16 crc kubenswrapper[4735]: I1008 13:14:16.417416 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=16.137268046 podStartE2EDuration="23.417397625s" podCreationTimestamp="2025-10-08 13:13:53 +0000 UTC" firstStartedPulling="2025-10-08 13:14:08.023827492 +0000 UTC m=+1097.832132036" lastFinishedPulling="2025-10-08 13:14:15.303957061 +0000 UTC m=+1105.112261615" observedRunningTime="2025-10-08 13:14:16.403990051 +0000 UTC m=+1106.212294615" watchObservedRunningTime="2025-10-08 13:14:16.417397625 +0000 UTC m=+1106.225702169" Oct 08 13:14:16 crc kubenswrapper[4735]: I1008 13:14:16.430036 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=22.98841494 podStartE2EDuration="25.430013568s" podCreationTimestamp="2025-10-08 13:13:51 +0000 UTC" firstStartedPulling="2025-10-08 13:14:07.187896457 +0000 UTC m=+1096.996201001" lastFinishedPulling="2025-10-08 13:14:09.629495085 +0000 UTC m=+1099.437799629" observedRunningTime="2025-10-08 13:14:16.422191242 +0000 UTC m=+1106.230495796" watchObservedRunningTime="2025-10-08 13:14:16.430013568 +0000 UTC m=+1106.238318122" Oct 08 13:14:17 crc kubenswrapper[4735]: I1008 13:14:17.021626 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-644597f84c-48kkw" Oct 08 13:14:17 crc kubenswrapper[4735]: I1008 13:14:17.315707 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-77597f887-vdnj6" Oct 08 13:14:17 crc kubenswrapper[4735]: I1008 13:14:17.361598 4735 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-644597f84c-48kkw"] Oct 08 13:14:17 crc kubenswrapper[4735]: I1008 13:14:17.380610 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9frwj" event={"ID":"b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb","Type":"ContainerStarted","Data":"dba547c713f3dc3fb4b7cd134966f3012d70fd6781682f3b57fcf8290868f773"} Oct 08 13:14:17 crc kubenswrapper[4735]: I1008 13:14:17.380670 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9frwj" event={"ID":"b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb","Type":"ContainerStarted","Data":"0c32b0561e314aa74f9602cc47a55588e33447b929f0779af7d6ee25ee8ff883"} Oct 08 13:14:17 crc kubenswrapper[4735]: I1008 13:14:17.380690 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-9frwj" Oct 08 13:14:17 crc kubenswrapper[4735]: I1008 13:14:17.380904 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-9frwj" Oct 08 13:14:17 crc kubenswrapper[4735]: I1008 13:14:17.381077 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-644597f84c-48kkw" podUID="191f3e26-da78-4391-9f78-66d43c758300" containerName="dnsmasq-dns" containerID="cri-o://73687dbb1afae726f63785f4eedd0b7d3f36f962797b6ecf31c728cc3d14c0b8" gracePeriod=10 Oct 08 13:14:17 crc kubenswrapper[4735]: I1008 13:14:17.410022 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-9frwj" podStartSLOduration=13.809745485 podStartE2EDuration="20.410001111s" podCreationTimestamp="2025-10-08 13:13:57 +0000 UTC" firstStartedPulling="2025-10-08 13:14:08.039347552 +0000 UTC m=+1097.847652096" lastFinishedPulling="2025-10-08 13:14:14.639603178 +0000 UTC m=+1104.447907722" observedRunningTime="2025-10-08 13:14:17.408599414 +0000 UTC m=+1107.216903958" watchObservedRunningTime="2025-10-08 
13:14:17.410001111 +0000 UTC m=+1107.218305655" Oct 08 13:14:18 crc kubenswrapper[4735]: I1008 13:14:18.400507 4735 generic.go:334] "Generic (PLEG): container finished" podID="191f3e26-da78-4391-9f78-66d43c758300" containerID="73687dbb1afae726f63785f4eedd0b7d3f36f962797b6ecf31c728cc3d14c0b8" exitCode=0 Oct 08 13:14:18 crc kubenswrapper[4735]: I1008 13:14:18.403125 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-644597f84c-48kkw" event={"ID":"191f3e26-da78-4391-9f78-66d43c758300","Type":"ContainerDied","Data":"73687dbb1afae726f63785f4eedd0b7d3f36f962797b6ecf31c728cc3d14c0b8"} Oct 08 13:14:19 crc kubenswrapper[4735]: I1008 13:14:19.284860 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-644597f84c-48kkw" Oct 08 13:14:19 crc kubenswrapper[4735]: I1008 13:14:19.409847 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"52f1329e-e0dc-4295-8dfc-92fae02065f2","Type":"ContainerStarted","Data":"72400ada84317756c3507ee2deab0d0b9d034a28342458ff546cbf1cdfbc990b"} Oct 08 13:14:19 crc kubenswrapper[4735]: I1008 13:14:19.412293 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"647cfc89-264d-4616-ab81-64a7c9aa51c5","Type":"ContainerStarted","Data":"a762e323d1f3da2b5f91ff63a2787457f4dc57f2b638bc08715406fafaabc87e"} Oct 08 13:14:19 crc kubenswrapper[4735]: I1008 13:14:19.423629 4735 generic.go:334] "Generic (PLEG): container finished" podID="f2428d97-aa22-413f-ab58-5b077fdde95c" containerID="57d3f0e874e75cf6c5615fcd3798a1da6cbf40f7346cfb72e851d90feb4d8746" exitCode=0 Oct 08 13:14:19 crc kubenswrapper[4735]: I1008 13:14:19.423714 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"f2428d97-aa22-413f-ab58-5b077fdde95c","Type":"ContainerDied","Data":"57d3f0e874e75cf6c5615fcd3798a1da6cbf40f7346cfb72e851d90feb4d8746"} Oct 08 13:14:19 crc 
kubenswrapper[4735]: I1008 13:14:19.425771 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-644597f84c-48kkw" event={"ID":"191f3e26-da78-4391-9f78-66d43c758300","Type":"ContainerDied","Data":"8998e17d1790056024c38212245730a5bbc2323a51af687e5f84a2ff3342368e"} Oct 08 13:14:19 crc kubenswrapper[4735]: I1008 13:14:19.425816 4735 scope.go:117] "RemoveContainer" containerID="73687dbb1afae726f63785f4eedd0b7d3f36f962797b6ecf31c728cc3d14c0b8" Oct 08 13:14:19 crc kubenswrapper[4735]: I1008 13:14:19.425936 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-644597f84c-48kkw" Oct 08 13:14:19 crc kubenswrapper[4735]: I1008 13:14:19.427720 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=13.696875077 podStartE2EDuration="23.427712351s" podCreationTimestamp="2025-10-08 13:13:56 +0000 UTC" firstStartedPulling="2025-10-08 13:14:09.425740741 +0000 UTC m=+1099.234045285" lastFinishedPulling="2025-10-08 13:14:19.156578015 +0000 UTC m=+1108.964882559" observedRunningTime="2025-10-08 13:14:19.426690954 +0000 UTC m=+1109.234995498" watchObservedRunningTime="2025-10-08 13:14:19.427712351 +0000 UTC m=+1109.236016895" Oct 08 13:14:19 crc kubenswrapper[4735]: I1008 13:14:19.430498 4735 generic.go:334] "Generic (PLEG): container finished" podID="2868ea84-fef0-4728-9a62-e2ba97a9aae1" containerID="7a5a1a14e53c9fb322ae90dae5d7e280ecbb3b21e0fceb0275319ed5fb89b446" exitCode=0 Oct 08 13:14:19 crc kubenswrapper[4735]: I1008 13:14:19.431412 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"2868ea84-fef0-4728-9a62-e2ba97a9aae1","Type":"ContainerDied","Data":"7a5a1a14e53c9fb322ae90dae5d7e280ecbb3b21e0fceb0275319ed5fb89b446"} Oct 08 13:14:19 crc kubenswrapper[4735]: I1008 13:14:19.451499 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-zdzrt\" (UniqueName: \"kubernetes.io/projected/191f3e26-da78-4391-9f78-66d43c758300-kube-api-access-zdzrt\") pod \"191f3e26-da78-4391-9f78-66d43c758300\" (UID: \"191f3e26-da78-4391-9f78-66d43c758300\") " Oct 08 13:14:19 crc kubenswrapper[4735]: I1008 13:14:19.451583 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/191f3e26-da78-4391-9f78-66d43c758300-config\") pod \"191f3e26-da78-4391-9f78-66d43c758300\" (UID: \"191f3e26-da78-4391-9f78-66d43c758300\") " Oct 08 13:14:19 crc kubenswrapper[4735]: I1008 13:14:19.451715 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/191f3e26-da78-4391-9f78-66d43c758300-dns-svc\") pod \"191f3e26-da78-4391-9f78-66d43c758300\" (UID: \"191f3e26-da78-4391-9f78-66d43c758300\") " Oct 08 13:14:19 crc kubenswrapper[4735]: I1008 13:14:19.454342 4735 scope.go:117] "RemoveContainer" containerID="ed9a0ec35414313bdd77fb53fa3b07a54a99e37a423c0797dc28dc3531a1a0b9" Oct 08 13:14:19 crc kubenswrapper[4735]: I1008 13:14:19.458558 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/191f3e26-da78-4391-9f78-66d43c758300-kube-api-access-zdzrt" (OuterVolumeSpecName: "kube-api-access-zdzrt") pod "191f3e26-da78-4391-9f78-66d43c758300" (UID: "191f3e26-da78-4391-9f78-66d43c758300"). InnerVolumeSpecName "kube-api-access-zdzrt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:14:19 crc kubenswrapper[4735]: I1008 13:14:19.481752 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=9.321241607 podStartE2EDuration="20.481708616s" podCreationTimestamp="2025-10-08 13:13:59 +0000 UTC" firstStartedPulling="2025-10-08 13:14:07.980667099 +0000 UTC m=+1097.788971653" lastFinishedPulling="2025-10-08 13:14:19.141134118 +0000 UTC m=+1108.949438662" observedRunningTime="2025-10-08 13:14:19.478264865 +0000 UTC m=+1109.286569409" watchObservedRunningTime="2025-10-08 13:14:19.481708616 +0000 UTC m=+1109.290013160" Oct 08 13:14:19 crc kubenswrapper[4735]: I1008 13:14:19.504680 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/191f3e26-da78-4391-9f78-66d43c758300-config" (OuterVolumeSpecName: "config") pod "191f3e26-da78-4391-9f78-66d43c758300" (UID: "191f3e26-da78-4391-9f78-66d43c758300"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:14:19 crc kubenswrapper[4735]: I1008 13:14:19.515428 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/191f3e26-da78-4391-9f78-66d43c758300-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "191f3e26-da78-4391-9f78-66d43c758300" (UID: "191f3e26-da78-4391-9f78-66d43c758300"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:14:19 crc kubenswrapper[4735]: I1008 13:14:19.553812 4735 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/191f3e26-da78-4391-9f78-66d43c758300-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 13:14:19 crc kubenswrapper[4735]: I1008 13:14:19.553856 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdzrt\" (UniqueName: \"kubernetes.io/projected/191f3e26-da78-4391-9f78-66d43c758300-kube-api-access-zdzrt\") on node \"crc\" DevicePath \"\"" Oct 08 13:14:19 crc kubenswrapper[4735]: I1008 13:14:19.553877 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/191f3e26-da78-4391-9f78-66d43c758300-config\") on node \"crc\" DevicePath \"\"" Oct 08 13:14:19 crc kubenswrapper[4735]: I1008 13:14:19.752318 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-644597f84c-48kkw"] Oct 08 13:14:19 crc kubenswrapper[4735]: I1008 13:14:19.759943 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-644597f84c-48kkw"] Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.365701 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-xjdpv"] Oct 08 13:14:20 crc kubenswrapper[4735]: E1008 13:14:20.366067 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="191f3e26-da78-4391-9f78-66d43c758300" containerName="init" Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.366089 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="191f3e26-da78-4391-9f78-66d43c758300" containerName="init" Oct 08 13:14:20 crc kubenswrapper[4735]: E1008 13:14:20.366116 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="191f3e26-da78-4391-9f78-66d43c758300" containerName="dnsmasq-dns" Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.366124 4735 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="191f3e26-da78-4391-9f78-66d43c758300" containerName="dnsmasq-dns" Oct 08 13:14:20 crc kubenswrapper[4735]: E1008 13:14:20.366141 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="978b94d0-ee60-4ff0-9721-4d24ac634aa4" containerName="init" Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.366149 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="978b94d0-ee60-4ff0-9721-4d24ac634aa4" containerName="init" Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.366330 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="978b94d0-ee60-4ff0-9721-4d24ac634aa4" containerName="init" Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.366355 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="191f3e26-da78-4391-9f78-66d43c758300" containerName="dnsmasq-dns" Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.367014 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-xjdpv" Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.369018 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.382947 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-xjdpv"] Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.439844 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"f2428d97-aa22-413f-ab58-5b077fdde95c","Type":"ContainerStarted","Data":"d79545f041b4f7e6f0e27be8cc9df0ec5c12046d7e3a2b69869c06d06a4d0a52"} Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.445839 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"2868ea84-fef0-4728-9a62-e2ba97a9aae1","Type":"ContainerStarted","Data":"c0fd4ca14cbd2147c2ea9e79919ae0922f959bc15036da6d763f5b4768a6c66e"} Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.466267 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/3e294ddd-2861-4cfb-97b5-d4283cc37e4e-ovs-rundir\") pod \"ovn-controller-metrics-xjdpv\" (UID: \"3e294ddd-2861-4cfb-97b5-d4283cc37e4e\") " pod="openstack/ovn-controller-metrics-xjdpv" Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.466333 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/3e294ddd-2861-4cfb-97b5-d4283cc37e4e-ovn-rundir\") pod \"ovn-controller-metrics-xjdpv\" (UID: \"3e294ddd-2861-4cfb-97b5-d4283cc37e4e\") " pod="openstack/ovn-controller-metrics-xjdpv" Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.466358 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e294ddd-2861-4cfb-97b5-d4283cc37e4e-combined-ca-bundle\") pod \"ovn-controller-metrics-xjdpv\" (UID: \"3e294ddd-2861-4cfb-97b5-d4283cc37e4e\") " pod="openstack/ovn-controller-metrics-xjdpv" Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.466382 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqnsl\" (UniqueName: \"kubernetes.io/projected/3e294ddd-2861-4cfb-97b5-d4283cc37e4e-kube-api-access-tqnsl\") pod \"ovn-controller-metrics-xjdpv\" (UID: \"3e294ddd-2861-4cfb-97b5-d4283cc37e4e\") " pod="openstack/ovn-controller-metrics-xjdpv" Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.466407 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/3e294ddd-2861-4cfb-97b5-d4283cc37e4e-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-xjdpv\" (UID: \"3e294ddd-2861-4cfb-97b5-d4283cc37e4e\") " pod="openstack/ovn-controller-metrics-xjdpv" Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.466423 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e294ddd-2861-4cfb-97b5-d4283cc37e4e-config\") pod \"ovn-controller-metrics-xjdpv\" (UID: \"3e294ddd-2861-4cfb-97b5-d4283cc37e4e\") " pod="openstack/ovn-controller-metrics-xjdpv" Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.473426 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=24.495152309 podStartE2EDuration="31.473408187s" podCreationTimestamp="2025-10-08 13:13:49 +0000 UTC" firstStartedPulling="2025-10-08 13:14:07.834642024 +0000 UTC m=+1097.642946578" lastFinishedPulling="2025-10-08 13:14:14.812897912 +0000 UTC m=+1104.621202456" observedRunningTime="2025-10-08 13:14:20.467332036 +0000 UTC m=+1110.275636580" watchObservedRunningTime="2025-10-08 13:14:20.473408187 +0000 UTC m=+1110.281712731" Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.523190 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=24.167808064 podStartE2EDuration="31.52316594s" podCreationTimestamp="2025-10-08 13:13:49 +0000 UTC" firstStartedPulling="2025-10-08 13:14:07.828226159 +0000 UTC m=+1097.636530723" lastFinishedPulling="2025-10-08 13:14:15.183584045 +0000 UTC m=+1104.991888599" observedRunningTime="2025-10-08 13:14:20.493458276 +0000 UTC m=+1110.301762840" watchObservedRunningTime="2025-10-08 13:14:20.52316594 +0000 UTC m=+1110.331470484" Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.525715 4735 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-f6b595d95-msdx9"] Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.530448 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f6b595d95-msdx9" Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.533191 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.533193 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f6b595d95-msdx9"] Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.570363 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/3e294ddd-2861-4cfb-97b5-d4283cc37e4e-ovn-rundir\") pod \"ovn-controller-metrics-xjdpv\" (UID: \"3e294ddd-2861-4cfb-97b5-d4283cc37e4e\") " pod="openstack/ovn-controller-metrics-xjdpv" Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.570431 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e294ddd-2861-4cfb-97b5-d4283cc37e4e-combined-ca-bundle\") pod \"ovn-controller-metrics-xjdpv\" (UID: \"3e294ddd-2861-4cfb-97b5-d4283cc37e4e\") " pod="openstack/ovn-controller-metrics-xjdpv" Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.570450 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqnsl\" (UniqueName: \"kubernetes.io/projected/3e294ddd-2861-4cfb-97b5-d4283cc37e4e-kube-api-access-tqnsl\") pod \"ovn-controller-metrics-xjdpv\" (UID: \"3e294ddd-2861-4cfb-97b5-d4283cc37e4e\") " pod="openstack/ovn-controller-metrics-xjdpv" Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.570509 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/3e294ddd-2861-4cfb-97b5-d4283cc37e4e-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-xjdpv\" (UID: \"3e294ddd-2861-4cfb-97b5-d4283cc37e4e\") " pod="openstack/ovn-controller-metrics-xjdpv" Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.570533 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e294ddd-2861-4cfb-97b5-d4283cc37e4e-config\") pod \"ovn-controller-metrics-xjdpv\" (UID: \"3e294ddd-2861-4cfb-97b5-d4283cc37e4e\") " pod="openstack/ovn-controller-metrics-xjdpv" Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.571349 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/3e294ddd-2861-4cfb-97b5-d4283cc37e4e-ovs-rundir\") pod \"ovn-controller-metrics-xjdpv\" (UID: \"3e294ddd-2861-4cfb-97b5-d4283cc37e4e\") " pod="openstack/ovn-controller-metrics-xjdpv" Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.571823 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/3e294ddd-2861-4cfb-97b5-d4283cc37e4e-ovs-rundir\") pod \"ovn-controller-metrics-xjdpv\" (UID: \"3e294ddd-2861-4cfb-97b5-d4283cc37e4e\") " pod="openstack/ovn-controller-metrics-xjdpv" Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.576619 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/3e294ddd-2861-4cfb-97b5-d4283cc37e4e-ovn-rundir\") pod \"ovn-controller-metrics-xjdpv\" (UID: \"3e294ddd-2861-4cfb-97b5-d4283cc37e4e\") " pod="openstack/ovn-controller-metrics-xjdpv" Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.577162 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.579143 4735 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e294ddd-2861-4cfb-97b5-d4283cc37e4e-config\") pod \"ovn-controller-metrics-xjdpv\" (UID: \"3e294ddd-2861-4cfb-97b5-d4283cc37e4e\") " pod="openstack/ovn-controller-metrics-xjdpv" Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.581232 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e294ddd-2861-4cfb-97b5-d4283cc37e4e-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-xjdpv\" (UID: \"3e294ddd-2861-4cfb-97b5-d4283cc37e4e\") " pod="openstack/ovn-controller-metrics-xjdpv" Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.590751 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e294ddd-2861-4cfb-97b5-d4283cc37e4e-combined-ca-bundle\") pod \"ovn-controller-metrics-xjdpv\" (UID: \"3e294ddd-2861-4cfb-97b5-d4283cc37e4e\") " pod="openstack/ovn-controller-metrics-xjdpv" Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.602423 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqnsl\" (UniqueName: \"kubernetes.io/projected/3e294ddd-2861-4cfb-97b5-d4283cc37e4e-kube-api-access-tqnsl\") pod \"ovn-controller-metrics-xjdpv\" (UID: \"3e294ddd-2861-4cfb-97b5-d4283cc37e4e\") " pod="openstack/ovn-controller-metrics-xjdpv" Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.650539 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f6b595d95-msdx9"] Oct 08 13:14:20 crc kubenswrapper[4735]: E1008 13:14:20.651181 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-l9nrc ovsdbserver-nb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-f6b595d95-msdx9" podUID="61266427-4479-4527-95ef-6f2d04a29192" Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 
13:14:20.664461 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-dc9d58d7-g2qzr"] Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.665869 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-dc9d58d7-g2qzr" Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.670563 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.673044 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61266427-4479-4527-95ef-6f2d04a29192-config\") pod \"dnsmasq-dns-f6b595d95-msdx9\" (UID: \"61266427-4479-4527-95ef-6f2d04a29192\") " pod="openstack/dnsmasq-dns-f6b595d95-msdx9" Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.673088 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9nrc\" (UniqueName: \"kubernetes.io/projected/61266427-4479-4527-95ef-6f2d04a29192-kube-api-access-l9nrc\") pod \"dnsmasq-dns-f6b595d95-msdx9\" (UID: \"61266427-4479-4527-95ef-6f2d04a29192\") " pod="openstack/dnsmasq-dns-f6b595d95-msdx9" Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.673122 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/61266427-4479-4527-95ef-6f2d04a29192-dns-svc\") pod \"dnsmasq-dns-f6b595d95-msdx9\" (UID: \"61266427-4479-4527-95ef-6f2d04a29192\") " pod="openstack/dnsmasq-dns-f6b595d95-msdx9" Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.673170 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/61266427-4479-4527-95ef-6f2d04a29192-ovsdbserver-nb\") pod \"dnsmasq-dns-f6b595d95-msdx9\" (UID: 
\"61266427-4479-4527-95ef-6f2d04a29192\") " pod="openstack/dnsmasq-dns-f6b595d95-msdx9" Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.677927 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-dc9d58d7-g2qzr"] Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.681279 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-xjdpv" Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.774967 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmtjv\" (UniqueName: \"kubernetes.io/projected/aadfd0e2-e652-4aa4-850c-8fef23ae29ea-kube-api-access-zmtjv\") pod \"dnsmasq-dns-dc9d58d7-g2qzr\" (UID: \"aadfd0e2-e652-4aa4-850c-8fef23ae29ea\") " pod="openstack/dnsmasq-dns-dc9d58d7-g2qzr" Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.775031 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aadfd0e2-e652-4aa4-850c-8fef23ae29ea-ovsdbserver-sb\") pod \"dnsmasq-dns-dc9d58d7-g2qzr\" (UID: \"aadfd0e2-e652-4aa4-850c-8fef23ae29ea\") " pod="openstack/dnsmasq-dns-dc9d58d7-g2qzr" Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.775132 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aadfd0e2-e652-4aa4-850c-8fef23ae29ea-ovsdbserver-nb\") pod \"dnsmasq-dns-dc9d58d7-g2qzr\" (UID: \"aadfd0e2-e652-4aa4-850c-8fef23ae29ea\") " pod="openstack/dnsmasq-dns-dc9d58d7-g2qzr" Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.775169 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aadfd0e2-e652-4aa4-850c-8fef23ae29ea-dns-svc\") pod \"dnsmasq-dns-dc9d58d7-g2qzr\" (UID: 
\"aadfd0e2-e652-4aa4-850c-8fef23ae29ea\") " pod="openstack/dnsmasq-dns-dc9d58d7-g2qzr" Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.775214 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61266427-4479-4527-95ef-6f2d04a29192-config\") pod \"dnsmasq-dns-f6b595d95-msdx9\" (UID: \"61266427-4479-4527-95ef-6f2d04a29192\") " pod="openstack/dnsmasq-dns-f6b595d95-msdx9" Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.775265 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9nrc\" (UniqueName: \"kubernetes.io/projected/61266427-4479-4527-95ef-6f2d04a29192-kube-api-access-l9nrc\") pod \"dnsmasq-dns-f6b595d95-msdx9\" (UID: \"61266427-4479-4527-95ef-6f2d04a29192\") " pod="openstack/dnsmasq-dns-f6b595d95-msdx9" Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.775308 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/61266427-4479-4527-95ef-6f2d04a29192-dns-svc\") pod \"dnsmasq-dns-f6b595d95-msdx9\" (UID: \"61266427-4479-4527-95ef-6f2d04a29192\") " pod="openstack/dnsmasq-dns-f6b595d95-msdx9" Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.775341 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aadfd0e2-e652-4aa4-850c-8fef23ae29ea-config\") pod \"dnsmasq-dns-dc9d58d7-g2qzr\" (UID: \"aadfd0e2-e652-4aa4-850c-8fef23ae29ea\") " pod="openstack/dnsmasq-dns-dc9d58d7-g2qzr" Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.775386 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/61266427-4479-4527-95ef-6f2d04a29192-ovsdbserver-nb\") pod \"dnsmasq-dns-f6b595d95-msdx9\" (UID: \"61266427-4479-4527-95ef-6f2d04a29192\") " 
pod="openstack/dnsmasq-dns-f6b595d95-msdx9"
Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.776224 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/61266427-4479-4527-95ef-6f2d04a29192-ovsdbserver-nb\") pod \"dnsmasq-dns-f6b595d95-msdx9\" (UID: \"61266427-4479-4527-95ef-6f2d04a29192\") " pod="openstack/dnsmasq-dns-f6b595d95-msdx9"
Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.776237 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61266427-4479-4527-95ef-6f2d04a29192-config\") pod \"dnsmasq-dns-f6b595d95-msdx9\" (UID: \"61266427-4479-4527-95ef-6f2d04a29192\") " pod="openstack/dnsmasq-dns-f6b595d95-msdx9"
Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.776264 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/61266427-4479-4527-95ef-6f2d04a29192-dns-svc\") pod \"dnsmasq-dns-f6b595d95-msdx9\" (UID: \"61266427-4479-4527-95ef-6f2d04a29192\") " pod="openstack/dnsmasq-dns-f6b595d95-msdx9"
Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.801791 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9nrc\" (UniqueName: \"kubernetes.io/projected/61266427-4479-4527-95ef-6f2d04a29192-kube-api-access-l9nrc\") pod \"dnsmasq-dns-f6b595d95-msdx9\" (UID: \"61266427-4479-4527-95ef-6f2d04a29192\") " pod="openstack/dnsmasq-dns-f6b595d95-msdx9"
Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.876972 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmtjv\" (UniqueName: \"kubernetes.io/projected/aadfd0e2-e652-4aa4-850c-8fef23ae29ea-kube-api-access-zmtjv\") pod \"dnsmasq-dns-dc9d58d7-g2qzr\" (UID: \"aadfd0e2-e652-4aa4-850c-8fef23ae29ea\") " pod="openstack/dnsmasq-dns-dc9d58d7-g2qzr"
Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.877308 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aadfd0e2-e652-4aa4-850c-8fef23ae29ea-ovsdbserver-sb\") pod \"dnsmasq-dns-dc9d58d7-g2qzr\" (UID: \"aadfd0e2-e652-4aa4-850c-8fef23ae29ea\") " pod="openstack/dnsmasq-dns-dc9d58d7-g2qzr"
Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.877366 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aadfd0e2-e652-4aa4-850c-8fef23ae29ea-ovsdbserver-nb\") pod \"dnsmasq-dns-dc9d58d7-g2qzr\" (UID: \"aadfd0e2-e652-4aa4-850c-8fef23ae29ea\") " pod="openstack/dnsmasq-dns-dc9d58d7-g2qzr"
Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.877388 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aadfd0e2-e652-4aa4-850c-8fef23ae29ea-dns-svc\") pod \"dnsmasq-dns-dc9d58d7-g2qzr\" (UID: \"aadfd0e2-e652-4aa4-850c-8fef23ae29ea\") " pod="openstack/dnsmasq-dns-dc9d58d7-g2qzr"
Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.877457 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aadfd0e2-e652-4aa4-850c-8fef23ae29ea-config\") pod \"dnsmasq-dns-dc9d58d7-g2qzr\" (UID: \"aadfd0e2-e652-4aa4-850c-8fef23ae29ea\") " pod="openstack/dnsmasq-dns-dc9d58d7-g2qzr"
Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.878548 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aadfd0e2-e652-4aa4-850c-8fef23ae29ea-config\") pod \"dnsmasq-dns-dc9d58d7-g2qzr\" (UID: \"aadfd0e2-e652-4aa4-850c-8fef23ae29ea\") " pod="openstack/dnsmasq-dns-dc9d58d7-g2qzr"
Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.878698 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aadfd0e2-e652-4aa4-850c-8fef23ae29ea-ovsdbserver-nb\") pod \"dnsmasq-dns-dc9d58d7-g2qzr\" (UID: \"aadfd0e2-e652-4aa4-850c-8fef23ae29ea\") " pod="openstack/dnsmasq-dns-dc9d58d7-g2qzr"
Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.879226 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aadfd0e2-e652-4aa4-850c-8fef23ae29ea-dns-svc\") pod \"dnsmasq-dns-dc9d58d7-g2qzr\" (UID: \"aadfd0e2-e652-4aa4-850c-8fef23ae29ea\") " pod="openstack/dnsmasq-dns-dc9d58d7-g2qzr"
Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.879807 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aadfd0e2-e652-4aa4-850c-8fef23ae29ea-ovsdbserver-sb\") pod \"dnsmasq-dns-dc9d58d7-g2qzr\" (UID: \"aadfd0e2-e652-4aa4-850c-8fef23ae29ea\") " pod="openstack/dnsmasq-dns-dc9d58d7-g2qzr"
Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.891925 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmtjv\" (UniqueName: \"kubernetes.io/projected/aadfd0e2-e652-4aa4-850c-8fef23ae29ea-kube-api-access-zmtjv\") pod \"dnsmasq-dns-dc9d58d7-g2qzr\" (UID: \"aadfd0e2-e652-4aa4-850c-8fef23ae29ea\") " pod="openstack/dnsmasq-dns-dc9d58d7-g2qzr"
Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.915211 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="191f3e26-da78-4391-9f78-66d43c758300" path="/var/lib/kubelet/pods/191f3e26-da78-4391-9f78-66d43c758300/volumes"
Oct 08 13:14:20 crc kubenswrapper[4735]: I1008 13:14:20.985322 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-dc9d58d7-g2qzr"
Oct 08 13:14:21 crc kubenswrapper[4735]: I1008 13:14:21.170428 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-xjdpv"]
Oct 08 13:14:21 crc kubenswrapper[4735]: W1008 13:14:21.177807 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e294ddd_2861_4cfb_97b5_d4283cc37e4e.slice/crio-16ea78a34a39e88b5828bc6215331316866ec19e379f102c1f6c454c239b9353 WatchSource:0}: Error finding container 16ea78a34a39e88b5828bc6215331316866ec19e379f102c1f6c454c239b9353: Status 404 returned error can't find the container with id 16ea78a34a39e88b5828bc6215331316866ec19e379f102c1f6c454c239b9353
Oct 08 13:14:21 crc kubenswrapper[4735]: I1008 13:14:21.456019 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f6b595d95-msdx9"
Oct 08 13:14:21 crc kubenswrapper[4735]: I1008 13:14:21.456015 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-xjdpv" event={"ID":"3e294ddd-2861-4cfb-97b5-d4283cc37e4e","Type":"ContainerStarted","Data":"6876d906c666c75cba8b23303f73e98ddec735188d7cc7cbc4fd832d9f9fa680"}
Oct 08 13:14:21 crc kubenswrapper[4735]: I1008 13:14:21.456306 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-xjdpv" event={"ID":"3e294ddd-2861-4cfb-97b5-d4283cc37e4e","Type":"ContainerStarted","Data":"16ea78a34a39e88b5828bc6215331316866ec19e379f102c1f6c454c239b9353"}
Oct 08 13:14:21 crc kubenswrapper[4735]: I1008 13:14:21.467487 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f6b595d95-msdx9"
Oct 08 13:14:21 crc kubenswrapper[4735]: I1008 13:14:21.479232 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-xjdpv" podStartSLOduration=1.4792117710000001 podStartE2EDuration="1.479211771s" podCreationTimestamp="2025-10-08 13:14:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 13:14:21.470733877 +0000 UTC m=+1111.279038431" watchObservedRunningTime="2025-10-08 13:14:21.479211771 +0000 UTC m=+1111.287516315"
Oct 08 13:14:21 crc kubenswrapper[4735]: I1008 13:14:21.505423 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-dc9d58d7-g2qzr"]
Oct 08 13:14:21 crc kubenswrapper[4735]: W1008 13:14:21.512085 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaadfd0e2_e652_4aa4_850c_8fef23ae29ea.slice/crio-b02d8741a9d306b30a2c226205fbe2c59b1793d69f6ef2e91bc5ccc1f1c1245a WatchSource:0}: Error finding container b02d8741a9d306b30a2c226205fbe2c59b1793d69f6ef2e91bc5ccc1f1c1245a: Status 404 returned error can't find the container with id b02d8741a9d306b30a2c226205fbe2c59b1793d69f6ef2e91bc5ccc1f1c1245a
Oct 08 13:14:21 crc kubenswrapper[4735]: I1008 13:14:21.571308 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Oct 08 13:14:21 crc kubenswrapper[4735]: I1008 13:14:21.571365 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0"
Oct 08 13:14:21 crc kubenswrapper[4735]: I1008 13:14:21.576914 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Oct 08 13:14:21 crc kubenswrapper[4735]: I1008 13:14:21.586079 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/61266427-4479-4527-95ef-6f2d04a29192-ovsdbserver-nb\") pod \"61266427-4479-4527-95ef-6f2d04a29192\" (UID: \"61266427-4479-4527-95ef-6f2d04a29192\") "
Oct 08 13:14:21 crc kubenswrapper[4735]: I1008 13:14:21.586173 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/61266427-4479-4527-95ef-6f2d04a29192-dns-svc\") pod \"61266427-4479-4527-95ef-6f2d04a29192\" (UID: \"61266427-4479-4527-95ef-6f2d04a29192\") "
Oct 08 13:14:21 crc kubenswrapper[4735]: I1008 13:14:21.586213 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61266427-4479-4527-95ef-6f2d04a29192-config\") pod \"61266427-4479-4527-95ef-6f2d04a29192\" (UID: \"61266427-4479-4527-95ef-6f2d04a29192\") "
Oct 08 13:14:21 crc kubenswrapper[4735]: I1008 13:14:21.586300 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9nrc\" (UniqueName: \"kubernetes.io/projected/61266427-4479-4527-95ef-6f2d04a29192-kube-api-access-l9nrc\") pod \"61266427-4479-4527-95ef-6f2d04a29192\" (UID: \"61266427-4479-4527-95ef-6f2d04a29192\") "
Oct 08 13:14:21 crc kubenswrapper[4735]: I1008 13:14:21.588383 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61266427-4479-4527-95ef-6f2d04a29192-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "61266427-4479-4527-95ef-6f2d04a29192" (UID: "61266427-4479-4527-95ef-6f2d04a29192"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 13:14:21 crc kubenswrapper[4735]: I1008 13:14:21.588871 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61266427-4479-4527-95ef-6f2d04a29192-config" (OuterVolumeSpecName: "config") pod "61266427-4479-4527-95ef-6f2d04a29192" (UID: "61266427-4479-4527-95ef-6f2d04a29192"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 13:14:21 crc kubenswrapper[4735]: I1008 13:14:21.588940 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61266427-4479-4527-95ef-6f2d04a29192-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "61266427-4479-4527-95ef-6f2d04a29192" (UID: "61266427-4479-4527-95ef-6f2d04a29192"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 13:14:21 crc kubenswrapper[4735]: I1008 13:14:21.595253 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0"
Oct 08 13:14:21 crc kubenswrapper[4735]: I1008 13:14:21.595296 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0"
Oct 08 13:14:21 crc kubenswrapper[4735]: I1008 13:14:21.596123 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61266427-4479-4527-95ef-6f2d04a29192-kube-api-access-l9nrc" (OuterVolumeSpecName: "kube-api-access-l9nrc") pod "61266427-4479-4527-95ef-6f2d04a29192" (UID: "61266427-4479-4527-95ef-6f2d04a29192"). InnerVolumeSpecName "kube-api-access-l9nrc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 13:14:21 crc kubenswrapper[4735]: I1008 13:14:21.620783 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Oct 08 13:14:21 crc kubenswrapper[4735]: I1008 13:14:21.688183 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/61266427-4479-4527-95ef-6f2d04a29192-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 08 13:14:21 crc kubenswrapper[4735]: I1008 13:14:21.688214 4735 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/61266427-4479-4527-95ef-6f2d04a29192-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 08 13:14:21 crc kubenswrapper[4735]: I1008 13:14:21.688223 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61266427-4479-4527-95ef-6f2d04a29192-config\") on node \"crc\" DevicePath \"\""
Oct 08 13:14:21 crc kubenswrapper[4735]: I1008 13:14:21.688232 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9nrc\" (UniqueName: \"kubernetes.io/projected/61266427-4479-4527-95ef-6f2d04a29192-kube-api-access-l9nrc\") on node \"crc\" DevicePath \"\""
Oct 08 13:14:21 crc kubenswrapper[4735]: I1008 13:14:21.724047 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0"
Oct 08 13:14:22 crc kubenswrapper[4735]: I1008 13:14:22.157875 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0"
Oct 08 13:14:22 crc kubenswrapper[4735]: I1008 13:14:22.201777 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0"
Oct 08 13:14:22 crc kubenswrapper[4735]: I1008 13:14:22.465188 4735 generic.go:334] "Generic (PLEG): container finished" podID="aadfd0e2-e652-4aa4-850c-8fef23ae29ea" containerID="832fdda58551a87d708e18e2801a7e7dd965cc3e8e6a0fa0c5742480ef6f7ffb" exitCode=0
Oct 08 13:14:22 crc kubenswrapper[4735]: I1008 13:14:22.465250 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dc9d58d7-g2qzr" event={"ID":"aadfd0e2-e652-4aa4-850c-8fef23ae29ea","Type":"ContainerDied","Data":"832fdda58551a87d708e18e2801a7e7dd965cc3e8e6a0fa0c5742480ef6f7ffb"}
Oct 08 13:14:22 crc kubenswrapper[4735]: I1008 13:14:22.465322 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dc9d58d7-g2qzr" event={"ID":"aadfd0e2-e652-4aa4-850c-8fef23ae29ea","Type":"ContainerStarted","Data":"b02d8741a9d306b30a2c226205fbe2c59b1793d69f6ef2e91bc5ccc1f1c1245a"}
Oct 08 13:14:22 crc kubenswrapper[4735]: I1008 13:14:22.465842 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f6b595d95-msdx9"
Oct 08 13:14:22 crc kubenswrapper[4735]: I1008 13:14:22.465925 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0"
Oct 08 13:14:22 crc kubenswrapper[4735]: I1008 13:14:22.512937 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0"
Oct 08 13:14:22 crc kubenswrapper[4735]: I1008 13:14:22.551949 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Oct 08 13:14:22 crc kubenswrapper[4735]: I1008 13:14:22.682379 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f6b595d95-msdx9"]
Oct 08 13:14:22 crc kubenswrapper[4735]: I1008 13:14:22.691359 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f6b595d95-msdx9"]
Oct 08 13:14:22 crc kubenswrapper[4735]: I1008 13:14:22.905977 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61266427-4479-4527-95ef-6f2d04a29192" path="/var/lib/kubelet/pods/61266427-4479-4527-95ef-6f2d04a29192/volumes"
Oct 08 13:14:22 crc kubenswrapper[4735]: I1008 13:14:22.982449 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"]
Oct 08 13:14:22 crc kubenswrapper[4735]: I1008 13:14:22.984066 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Oct 08 13:14:22 crc kubenswrapper[4735]: I1008 13:14:22.986006 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts"
Oct 08 13:14:22 crc kubenswrapper[4735]: I1008 13:14:22.986036 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-lq6pz"
Oct 08 13:14:22 crc kubenswrapper[4735]: I1008 13:14:22.987882 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config"
Oct 08 13:14:22 crc kubenswrapper[4735]: I1008 13:14:22.994042 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs"
Oct 08 13:14:23 crc kubenswrapper[4735]: I1008 13:14:23.005031 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Oct 08 13:14:23 crc kubenswrapper[4735]: I1008 13:14:23.114373 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sshql\" (UniqueName: \"kubernetes.io/projected/e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6-kube-api-access-sshql\") pod \"ovn-northd-0\" (UID: \"e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6\") " pod="openstack/ovn-northd-0"
Oct 08 13:14:23 crc kubenswrapper[4735]: I1008 13:14:23.114448 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6\") " pod="openstack/ovn-northd-0"
Oct 08 13:14:23 crc kubenswrapper[4735]: I1008 13:14:23.114525 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6\") " pod="openstack/ovn-northd-0"
Oct 08 13:14:23 crc kubenswrapper[4735]: I1008 13:14:23.114554 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6-scripts\") pod \"ovn-northd-0\" (UID: \"e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6\") " pod="openstack/ovn-northd-0"
Oct 08 13:14:23 crc kubenswrapper[4735]: I1008 13:14:23.114644 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6\") " pod="openstack/ovn-northd-0"
Oct 08 13:14:23 crc kubenswrapper[4735]: I1008 13:14:23.114704 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6-config\") pod \"ovn-northd-0\" (UID: \"e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6\") " pod="openstack/ovn-northd-0"
Oct 08 13:14:23 crc kubenswrapper[4735]: I1008 13:14:23.114759 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6\") " pod="openstack/ovn-northd-0"
Oct 08 13:14:23 crc kubenswrapper[4735]: I1008 13:14:23.216570 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6\") " pod="openstack/ovn-northd-0"
Oct 08 13:14:23 crc kubenswrapper[4735]: I1008 13:14:23.216637 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6-config\") pod \"ovn-northd-0\" (UID: \"e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6\") " pod="openstack/ovn-northd-0"
Oct 08 13:14:23 crc kubenswrapper[4735]: I1008 13:14:23.216665 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6\") " pod="openstack/ovn-northd-0"
Oct 08 13:14:23 crc kubenswrapper[4735]: I1008 13:14:23.216747 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sshql\" (UniqueName: \"kubernetes.io/projected/e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6-kube-api-access-sshql\") pod \"ovn-northd-0\" (UID: \"e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6\") " pod="openstack/ovn-northd-0"
Oct 08 13:14:23 crc kubenswrapper[4735]: I1008 13:14:23.216772 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6\") " pod="openstack/ovn-northd-0"
Oct 08 13:14:23 crc kubenswrapper[4735]: I1008 13:14:23.216814 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6\") " pod="openstack/ovn-northd-0"
Oct 08 13:14:23 crc kubenswrapper[4735]: I1008 13:14:23.216836 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6-scripts\") pod \"ovn-northd-0\" (UID: \"e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6\") " pod="openstack/ovn-northd-0"
Oct 08 13:14:23 crc kubenswrapper[4735]: I1008 13:14:23.217327 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6\") " pod="openstack/ovn-northd-0"
Oct 08 13:14:23 crc kubenswrapper[4735]: I1008 13:14:23.218076 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6-scripts\") pod \"ovn-northd-0\" (UID: \"e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6\") " pod="openstack/ovn-northd-0"
Oct 08 13:14:23 crc kubenswrapper[4735]: I1008 13:14:23.218500 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6-config\") pod \"ovn-northd-0\" (UID: \"e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6\") " pod="openstack/ovn-northd-0"
Oct 08 13:14:23 crc kubenswrapper[4735]: I1008 13:14:23.221966 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6\") " pod="openstack/ovn-northd-0"
Oct 08 13:14:23 crc kubenswrapper[4735]: I1008 13:14:23.222563 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6\") " pod="openstack/ovn-northd-0"
Oct 08 13:14:23 crc kubenswrapper[4735]: I1008 13:14:23.222702 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6\") " pod="openstack/ovn-northd-0"
Oct 08 13:14:23 crc kubenswrapper[4735]: I1008 13:14:23.238153 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sshql\" (UniqueName: \"kubernetes.io/projected/e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6-kube-api-access-sshql\") pod \"ovn-northd-0\" (UID: \"e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6\") " pod="openstack/ovn-northd-0"
Oct 08 13:14:23 crc kubenswrapper[4735]: I1008 13:14:23.299159 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Oct 08 13:14:23 crc kubenswrapper[4735]: I1008 13:14:23.483157 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dc9d58d7-g2qzr" event={"ID":"aadfd0e2-e652-4aa4-850c-8fef23ae29ea","Type":"ContainerStarted","Data":"144c1ffa1d4d8d0644f0a8113485e5982a9c65223eeb58e5a54369b507c9ff98"}
Oct 08 13:14:23 crc kubenswrapper[4735]: I1008 13:14:23.483624 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-dc9d58d7-g2qzr"
Oct 08 13:14:23 crc kubenswrapper[4735]: I1008 13:14:23.529022 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Oct 08 13:14:23 crc kubenswrapper[4735]: I1008 13:14:23.587061 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-dc9d58d7-g2qzr" podStartSLOduration=3.587028458 podStartE2EDuration="3.587028458s" podCreationTimestamp="2025-10-08 13:14:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 13:14:23.510911009 +0000 UTC m=+1113.319215563" watchObservedRunningTime="2025-10-08 13:14:23.587028458 +0000 UTC m=+1113.395333052"
Oct 08 13:14:23 crc kubenswrapper[4735]: I1008 13:14:23.589585 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-dc9d58d7-g2qzr"]
Oct 08 13:14:23 crc kubenswrapper[4735]: I1008 13:14:23.679323 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b587f8db7-zv2bg"]
Oct 08 13:14:23 crc kubenswrapper[4735]: I1008 13:14:23.681641 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b587f8db7-zv2bg"
Oct 08 13:14:23 crc kubenswrapper[4735]: I1008 13:14:23.702536 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Oct 08 13:14:23 crc kubenswrapper[4735]: I1008 13:14:23.721119 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b587f8db7-zv2bg"]
Oct 08 13:14:23 crc kubenswrapper[4735]: I1008 13:14:23.830713 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b3f87a3-e916-4f02-9a1d-730a6bc5a71f-ovsdbserver-sb\") pod \"dnsmasq-dns-7b587f8db7-zv2bg\" (UID: \"3b3f87a3-e916-4f02-9a1d-730a6bc5a71f\") " pod="openstack/dnsmasq-dns-7b587f8db7-zv2bg"
Oct 08 13:14:23 crc kubenswrapper[4735]: I1008 13:14:23.830802 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b3f87a3-e916-4f02-9a1d-730a6bc5a71f-ovsdbserver-nb\") pod \"dnsmasq-dns-7b587f8db7-zv2bg\" (UID: \"3b3f87a3-e916-4f02-9a1d-730a6bc5a71f\") " pod="openstack/dnsmasq-dns-7b587f8db7-zv2bg"
Oct 08 13:14:23 crc kubenswrapper[4735]: I1008 13:14:23.830864 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b3f87a3-e916-4f02-9a1d-730a6bc5a71f-config\") pod \"dnsmasq-dns-7b587f8db7-zv2bg\" (UID: \"3b3f87a3-e916-4f02-9a1d-730a6bc5a71f\") " pod="openstack/dnsmasq-dns-7b587f8db7-zv2bg"
Oct 08 13:14:23 crc kubenswrapper[4735]: I1008 13:14:23.830939 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqh7z\" (UniqueName: \"kubernetes.io/projected/3b3f87a3-e916-4f02-9a1d-730a6bc5a71f-kube-api-access-lqh7z\") pod \"dnsmasq-dns-7b587f8db7-zv2bg\" (UID: \"3b3f87a3-e916-4f02-9a1d-730a6bc5a71f\") " pod="openstack/dnsmasq-dns-7b587f8db7-zv2bg"
Oct 08 13:14:23 crc kubenswrapper[4735]: I1008 13:14:23.831018 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b3f87a3-e916-4f02-9a1d-730a6bc5a71f-dns-svc\") pod \"dnsmasq-dns-7b587f8db7-zv2bg\" (UID: \"3b3f87a3-e916-4f02-9a1d-730a6bc5a71f\") " pod="openstack/dnsmasq-dns-7b587f8db7-zv2bg"
Oct 08 13:14:23 crc kubenswrapper[4735]: I1008 13:14:23.933417 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b3f87a3-e916-4f02-9a1d-730a6bc5a71f-dns-svc\") pod \"dnsmasq-dns-7b587f8db7-zv2bg\" (UID: \"3b3f87a3-e916-4f02-9a1d-730a6bc5a71f\") " pod="openstack/dnsmasq-dns-7b587f8db7-zv2bg"
Oct 08 13:14:23 crc kubenswrapper[4735]: I1008 13:14:23.933515 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b3f87a3-e916-4f02-9a1d-730a6bc5a71f-ovsdbserver-sb\") pod \"dnsmasq-dns-7b587f8db7-zv2bg\" (UID: \"3b3f87a3-e916-4f02-9a1d-730a6bc5a71f\") " pod="openstack/dnsmasq-dns-7b587f8db7-zv2bg"
Oct 08 13:14:23 crc kubenswrapper[4735]: I1008 13:14:23.933552 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b3f87a3-e916-4f02-9a1d-730a6bc5a71f-ovsdbserver-nb\") pod \"dnsmasq-dns-7b587f8db7-zv2bg\" (UID: \"3b3f87a3-e916-4f02-9a1d-730a6bc5a71f\") " pod="openstack/dnsmasq-dns-7b587f8db7-zv2bg"
Oct 08 13:14:23 crc kubenswrapper[4735]: I1008 13:14:23.933595 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b3f87a3-e916-4f02-9a1d-730a6bc5a71f-config\") pod \"dnsmasq-dns-7b587f8db7-zv2bg\" (UID: \"3b3f87a3-e916-4f02-9a1d-730a6bc5a71f\") " pod="openstack/dnsmasq-dns-7b587f8db7-zv2bg"
Oct 08 13:14:23 crc kubenswrapper[4735]: I1008 13:14:23.933627 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqh7z\" (UniqueName: \"kubernetes.io/projected/3b3f87a3-e916-4f02-9a1d-730a6bc5a71f-kube-api-access-lqh7z\") pod \"dnsmasq-dns-7b587f8db7-zv2bg\" (UID: \"3b3f87a3-e916-4f02-9a1d-730a6bc5a71f\") " pod="openstack/dnsmasq-dns-7b587f8db7-zv2bg"
Oct 08 13:14:23 crc kubenswrapper[4735]: I1008 13:14:23.934606 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b3f87a3-e916-4f02-9a1d-730a6bc5a71f-ovsdbserver-sb\") pod \"dnsmasq-dns-7b587f8db7-zv2bg\" (UID: \"3b3f87a3-e916-4f02-9a1d-730a6bc5a71f\") " pod="openstack/dnsmasq-dns-7b587f8db7-zv2bg"
Oct 08 13:14:23 crc kubenswrapper[4735]: I1008 13:14:23.934847 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b3f87a3-e916-4f02-9a1d-730a6bc5a71f-config\") pod \"dnsmasq-dns-7b587f8db7-zv2bg\" (UID: \"3b3f87a3-e916-4f02-9a1d-730a6bc5a71f\") " pod="openstack/dnsmasq-dns-7b587f8db7-zv2bg"
Oct 08 13:14:23 crc kubenswrapper[4735]: I1008 13:14:23.934868 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b3f87a3-e916-4f02-9a1d-730a6bc5a71f-ovsdbserver-nb\") pod \"dnsmasq-dns-7b587f8db7-zv2bg\" (UID: \"3b3f87a3-e916-4f02-9a1d-730a6bc5a71f\") " pod="openstack/dnsmasq-dns-7b587f8db7-zv2bg"
Oct 08 13:14:23 crc kubenswrapper[4735]: I1008 13:14:23.936846 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b3f87a3-e916-4f02-9a1d-730a6bc5a71f-dns-svc\") pod \"dnsmasq-dns-7b587f8db7-zv2bg\" (UID: \"3b3f87a3-e916-4f02-9a1d-730a6bc5a71f\") " pod="openstack/dnsmasq-dns-7b587f8db7-zv2bg"
Oct 08 13:14:23 crc kubenswrapper[4735]: I1008 13:14:23.976469 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqh7z\" (UniqueName: \"kubernetes.io/projected/3b3f87a3-e916-4f02-9a1d-730a6bc5a71f-kube-api-access-lqh7z\") pod \"dnsmasq-dns-7b587f8db7-zv2bg\" (UID: \"3b3f87a3-e916-4f02-9a1d-730a6bc5a71f\") " pod="openstack/dnsmasq-dns-7b587f8db7-zv2bg"
Oct 08 13:14:24 crc kubenswrapper[4735]: I1008 13:14:24.052175 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b587f8db7-zv2bg"
Oct 08 13:14:24 crc kubenswrapper[4735]: I1008 13:14:24.485970 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0"
Oct 08 13:14:24 crc kubenswrapper[4735]: I1008 13:14:24.503238 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6","Type":"ContainerStarted","Data":"aacf215356088f4b69f6cb6c70b074703759a1862de3e2ca53addb55f251d2c3"}
Oct 08 13:14:24 crc kubenswrapper[4735]: I1008 13:14:24.553599 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b587f8db7-zv2bg"]
Oct 08 13:14:24 crc kubenswrapper[4735]: I1008 13:14:24.591369 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0"
Oct 08 13:14:24 crc kubenswrapper[4735]: I1008 13:14:24.828804 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"]
Oct 08 13:14:24 crc kubenswrapper[4735]: I1008 13:14:24.841912 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Oct 08 13:14:24 crc kubenswrapper[4735]: I1008 13:14:24.846701 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Oct 08 13:14:24 crc kubenswrapper[4735]: I1008 13:14:24.849896 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files"
Oct 08 13:14:24 crc kubenswrapper[4735]: I1008 13:14:24.849899 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data"
Oct 08 13:14:24 crc kubenswrapper[4735]: I1008 13:14:24.849997 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf"
Oct 08 13:14:24 crc kubenswrapper[4735]: I1008 13:14:24.849908 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-644z8"
Oct 08 13:14:24 crc kubenswrapper[4735]: I1008 13:14:24.967179 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/dc4e9066-e538-4d60-9c5a-3beb689d67a4-cache\") pod \"swift-storage-0\" (UID: \"dc4e9066-e538-4d60-9c5a-3beb689d67a4\") " pod="openstack/swift-storage-0"
Oct 08 13:14:24 crc kubenswrapper[4735]: I1008 13:14:24.967300 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/dc4e9066-e538-4d60-9c5a-3beb689d67a4-lock\") pod \"swift-storage-0\" (UID: \"dc4e9066-e538-4d60-9c5a-3beb689d67a4\") " pod="openstack/swift-storage-0"
Oct 08 13:14:24 crc kubenswrapper[4735]: I1008 13:14:24.967339 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b767n\" (UniqueName: \"kubernetes.io/projected/dc4e9066-e538-4d60-9c5a-3beb689d67a4-kube-api-access-b767n\") pod \"swift-storage-0\" (UID: \"dc4e9066-e538-4d60-9c5a-3beb689d67a4\") " pod="openstack/swift-storage-0"
Oct 08 13:14:24 crc kubenswrapper[4735]: I1008 13:14:24.967374 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"dc4e9066-e538-4d60-9c5a-3beb689d67a4\") " pod="openstack/swift-storage-0"
Oct 08 13:14:24 crc kubenswrapper[4735]: I1008 13:14:24.967411 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/dc4e9066-e538-4d60-9c5a-3beb689d67a4-etc-swift\") pod \"swift-storage-0\" (UID: \"dc4e9066-e538-4d60-9c5a-3beb689d67a4\") " pod="openstack/swift-storage-0"
Oct 08 13:14:25 crc kubenswrapper[4735]: I1008 13:14:25.069786 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"dc4e9066-e538-4d60-9c5a-3beb689d67a4\") " pod="openstack/swift-storage-0"
Oct 08 13:14:25 crc kubenswrapper[4735]: I1008 13:14:25.069847 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/dc4e9066-e538-4d60-9c5a-3beb689d67a4-etc-swift\") pod \"swift-storage-0\" (UID: \"dc4e9066-e538-4d60-9c5a-3beb689d67a4\") " pod="openstack/swift-storage-0"
Oct 08 13:14:25 crc kubenswrapper[4735]: I1008 13:14:25.069952 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/dc4e9066-e538-4d60-9c5a-3beb689d67a4-cache\") pod \"swift-storage-0\" (UID: \"dc4e9066-e538-4d60-9c5a-3beb689d67a4\") " pod="openstack/swift-storage-0"
Oct 08 13:14:25 crc kubenswrapper[4735]: I1008 13:14:25.070041 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/dc4e9066-e538-4d60-9c5a-3beb689d67a4-lock\")
pod \"swift-storage-0\" (UID: \"dc4e9066-e538-4d60-9c5a-3beb689d67a4\") " pod="openstack/swift-storage-0" Oct 08 13:14:25 crc kubenswrapper[4735]: I1008 13:14:25.070083 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b767n\" (UniqueName: \"kubernetes.io/projected/dc4e9066-e538-4d60-9c5a-3beb689d67a4-kube-api-access-b767n\") pod \"swift-storage-0\" (UID: \"dc4e9066-e538-4d60-9c5a-3beb689d67a4\") " pod="openstack/swift-storage-0" Oct 08 13:14:25 crc kubenswrapper[4735]: I1008 13:14:25.071799 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"dc4e9066-e538-4d60-9c5a-3beb689d67a4\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/swift-storage-0" Oct 08 13:14:25 crc kubenswrapper[4735]: I1008 13:14:25.071946 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/dc4e9066-e538-4d60-9c5a-3beb689d67a4-cache\") pod \"swift-storage-0\" (UID: \"dc4e9066-e538-4d60-9c5a-3beb689d67a4\") " pod="openstack/swift-storage-0" Oct 08 13:14:25 crc kubenswrapper[4735]: E1008 13:14:25.072087 4735 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 08 13:14:25 crc kubenswrapper[4735]: E1008 13:14:25.072129 4735 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 08 13:14:25 crc kubenswrapper[4735]: E1008 13:14:25.072184 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dc4e9066-e538-4d60-9c5a-3beb689d67a4-etc-swift podName:dc4e9066-e538-4d60-9c5a-3beb689d67a4 nodeName:}" failed. No retries permitted until 2025-10-08 13:14:25.572165852 +0000 UTC m=+1115.380470396 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/dc4e9066-e538-4d60-9c5a-3beb689d67a4-etc-swift") pod "swift-storage-0" (UID: "dc4e9066-e538-4d60-9c5a-3beb689d67a4") : configmap "swift-ring-files" not found Oct 08 13:14:25 crc kubenswrapper[4735]: I1008 13:14:25.072683 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/dc4e9066-e538-4d60-9c5a-3beb689d67a4-lock\") pod \"swift-storage-0\" (UID: \"dc4e9066-e538-4d60-9c5a-3beb689d67a4\") " pod="openstack/swift-storage-0" Oct 08 13:14:25 crc kubenswrapper[4735]: I1008 13:14:25.093496 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b767n\" (UniqueName: \"kubernetes.io/projected/dc4e9066-e538-4d60-9c5a-3beb689d67a4-kube-api-access-b767n\") pod \"swift-storage-0\" (UID: \"dc4e9066-e538-4d60-9c5a-3beb689d67a4\") " pod="openstack/swift-storage-0" Oct 08 13:14:25 crc kubenswrapper[4735]: I1008 13:14:25.107728 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"dc4e9066-e538-4d60-9c5a-3beb689d67a4\") " pod="openstack/swift-storage-0" Oct 08 13:14:25 crc kubenswrapper[4735]: I1008 13:14:25.405186 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-zfmvf"] Oct 08 13:14:25 crc kubenswrapper[4735]: I1008 13:14:25.406818 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-zfmvf" Oct 08 13:14:25 crc kubenswrapper[4735]: I1008 13:14:25.411239 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 08 13:14:25 crc kubenswrapper[4735]: I1008 13:14:25.411610 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Oct 08 13:14:25 crc kubenswrapper[4735]: I1008 13:14:25.414154 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Oct 08 13:14:25 crc kubenswrapper[4735]: I1008 13:14:25.422680 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-zfmvf"] Oct 08 13:14:25 crc kubenswrapper[4735]: I1008 13:14:25.518533 4735 generic.go:334] "Generic (PLEG): container finished" podID="3b3f87a3-e916-4f02-9a1d-730a6bc5a71f" containerID="9728313bd36002d3cd4e2999764327234943fef3d025e7a717ec24bac3d7848d" exitCode=0 Oct 08 13:14:25 crc kubenswrapper[4735]: I1008 13:14:25.518597 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b587f8db7-zv2bg" event={"ID":"3b3f87a3-e916-4f02-9a1d-730a6bc5a71f","Type":"ContainerDied","Data":"9728313bd36002d3cd4e2999764327234943fef3d025e7a717ec24bac3d7848d"} Oct 08 13:14:25 crc kubenswrapper[4735]: I1008 13:14:25.518631 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b587f8db7-zv2bg" event={"ID":"3b3f87a3-e916-4f02-9a1d-730a6bc5a71f","Type":"ContainerStarted","Data":"1429be662ecde2743b263af090cb6ffcb17ea9de988d9c0389967d75f585729b"} Oct 08 13:14:25 crc kubenswrapper[4735]: I1008 13:14:25.521872 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6","Type":"ContainerStarted","Data":"af561faafdd478b081c4107185420b4e9d3e3ef31f1f6fc09c0c8a499ec07cbe"} Oct 08 13:14:25 crc kubenswrapper[4735]: I1008 13:14:25.522012 4735 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-dc9d58d7-g2qzr" podUID="aadfd0e2-e652-4aa4-850c-8fef23ae29ea" containerName="dnsmasq-dns" containerID="cri-o://144c1ffa1d4d8d0644f0a8113485e5982a9c65223eeb58e5a54369b507c9ff98" gracePeriod=10 Oct 08 13:14:25 crc kubenswrapper[4735]: I1008 13:14:25.578578 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v77wt\" (UniqueName: \"kubernetes.io/projected/7c9b6bd7-6443-42da-a807-ff8fd5abbc8a-kube-api-access-v77wt\") pod \"swift-ring-rebalance-zfmvf\" (UID: \"7c9b6bd7-6443-42da-a807-ff8fd5abbc8a\") " pod="openstack/swift-ring-rebalance-zfmvf" Oct 08 13:14:25 crc kubenswrapper[4735]: I1008 13:14:25.578624 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c9b6bd7-6443-42da-a807-ff8fd5abbc8a-combined-ca-bundle\") pod \"swift-ring-rebalance-zfmvf\" (UID: \"7c9b6bd7-6443-42da-a807-ff8fd5abbc8a\") " pod="openstack/swift-ring-rebalance-zfmvf" Oct 08 13:14:25 crc kubenswrapper[4735]: I1008 13:14:25.578719 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/dc4e9066-e538-4d60-9c5a-3beb689d67a4-etc-swift\") pod \"swift-storage-0\" (UID: \"dc4e9066-e538-4d60-9c5a-3beb689d67a4\") " pod="openstack/swift-storage-0" Oct 08 13:14:25 crc kubenswrapper[4735]: I1008 13:14:25.578760 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7c9b6bd7-6443-42da-a807-ff8fd5abbc8a-etc-swift\") pod \"swift-ring-rebalance-zfmvf\" (UID: \"7c9b6bd7-6443-42da-a807-ff8fd5abbc8a\") " pod="openstack/swift-ring-rebalance-zfmvf" Oct 08 13:14:25 crc kubenswrapper[4735]: I1008 13:14:25.578822 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7c9b6bd7-6443-42da-a807-ff8fd5abbc8a-ring-data-devices\") pod \"swift-ring-rebalance-zfmvf\" (UID: \"7c9b6bd7-6443-42da-a807-ff8fd5abbc8a\") " pod="openstack/swift-ring-rebalance-zfmvf" Oct 08 13:14:25 crc kubenswrapper[4735]: I1008 13:14:25.578844 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7c9b6bd7-6443-42da-a807-ff8fd5abbc8a-swiftconf\") pod \"swift-ring-rebalance-zfmvf\" (UID: \"7c9b6bd7-6443-42da-a807-ff8fd5abbc8a\") " pod="openstack/swift-ring-rebalance-zfmvf" Oct 08 13:14:25 crc kubenswrapper[4735]: I1008 13:14:25.578878 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7c9b6bd7-6443-42da-a807-ff8fd5abbc8a-scripts\") pod \"swift-ring-rebalance-zfmvf\" (UID: \"7c9b6bd7-6443-42da-a807-ff8fd5abbc8a\") " pod="openstack/swift-ring-rebalance-zfmvf" Oct 08 13:14:25 crc kubenswrapper[4735]: I1008 13:14:25.578902 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7c9b6bd7-6443-42da-a807-ff8fd5abbc8a-dispersionconf\") pod \"swift-ring-rebalance-zfmvf\" (UID: \"7c9b6bd7-6443-42da-a807-ff8fd5abbc8a\") " pod="openstack/swift-ring-rebalance-zfmvf" Oct 08 13:14:25 crc kubenswrapper[4735]: E1008 13:14:25.579053 4735 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 08 13:14:25 crc kubenswrapper[4735]: E1008 13:14:25.579069 4735 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 08 13:14:25 crc kubenswrapper[4735]: E1008 13:14:25.579106 4735 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/dc4e9066-e538-4d60-9c5a-3beb689d67a4-etc-swift podName:dc4e9066-e538-4d60-9c5a-3beb689d67a4 nodeName:}" failed. No retries permitted until 2025-10-08 13:14:26.579092141 +0000 UTC m=+1116.387396685 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/dc4e9066-e538-4d60-9c5a-3beb689d67a4-etc-swift") pod "swift-storage-0" (UID: "dc4e9066-e538-4d60-9c5a-3beb689d67a4") : configmap "swift-ring-files" not found Oct 08 13:14:25 crc kubenswrapper[4735]: I1008 13:14:25.680080 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7c9b6bd7-6443-42da-a807-ff8fd5abbc8a-etc-swift\") pod \"swift-ring-rebalance-zfmvf\" (UID: \"7c9b6bd7-6443-42da-a807-ff8fd5abbc8a\") " pod="openstack/swift-ring-rebalance-zfmvf" Oct 08 13:14:25 crc kubenswrapper[4735]: I1008 13:14:25.680221 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7c9b6bd7-6443-42da-a807-ff8fd5abbc8a-ring-data-devices\") pod \"swift-ring-rebalance-zfmvf\" (UID: \"7c9b6bd7-6443-42da-a807-ff8fd5abbc8a\") " pod="openstack/swift-ring-rebalance-zfmvf" Oct 08 13:14:25 crc kubenswrapper[4735]: I1008 13:14:25.680255 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7c9b6bd7-6443-42da-a807-ff8fd5abbc8a-swiftconf\") pod \"swift-ring-rebalance-zfmvf\" (UID: \"7c9b6bd7-6443-42da-a807-ff8fd5abbc8a\") " pod="openstack/swift-ring-rebalance-zfmvf" Oct 08 13:14:25 crc kubenswrapper[4735]: I1008 13:14:25.680305 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7c9b6bd7-6443-42da-a807-ff8fd5abbc8a-scripts\") pod \"swift-ring-rebalance-zfmvf\" (UID: \"7c9b6bd7-6443-42da-a807-ff8fd5abbc8a\") " 
pod="openstack/swift-ring-rebalance-zfmvf" Oct 08 13:14:25 crc kubenswrapper[4735]: I1008 13:14:25.680325 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7c9b6bd7-6443-42da-a807-ff8fd5abbc8a-dispersionconf\") pod \"swift-ring-rebalance-zfmvf\" (UID: \"7c9b6bd7-6443-42da-a807-ff8fd5abbc8a\") " pod="openstack/swift-ring-rebalance-zfmvf" Oct 08 13:14:25 crc kubenswrapper[4735]: I1008 13:14:25.680363 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v77wt\" (UniqueName: \"kubernetes.io/projected/7c9b6bd7-6443-42da-a807-ff8fd5abbc8a-kube-api-access-v77wt\") pod \"swift-ring-rebalance-zfmvf\" (UID: \"7c9b6bd7-6443-42da-a807-ff8fd5abbc8a\") " pod="openstack/swift-ring-rebalance-zfmvf" Oct 08 13:14:25 crc kubenswrapper[4735]: I1008 13:14:25.680392 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c9b6bd7-6443-42da-a807-ff8fd5abbc8a-combined-ca-bundle\") pod \"swift-ring-rebalance-zfmvf\" (UID: \"7c9b6bd7-6443-42da-a807-ff8fd5abbc8a\") " pod="openstack/swift-ring-rebalance-zfmvf" Oct 08 13:14:25 crc kubenswrapper[4735]: I1008 13:14:25.681610 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7c9b6bd7-6443-42da-a807-ff8fd5abbc8a-etc-swift\") pod \"swift-ring-rebalance-zfmvf\" (UID: \"7c9b6bd7-6443-42da-a807-ff8fd5abbc8a\") " pod="openstack/swift-ring-rebalance-zfmvf" Oct 08 13:14:25 crc kubenswrapper[4735]: I1008 13:14:25.682501 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7c9b6bd7-6443-42da-a807-ff8fd5abbc8a-ring-data-devices\") pod \"swift-ring-rebalance-zfmvf\" (UID: \"7c9b6bd7-6443-42da-a807-ff8fd5abbc8a\") " pod="openstack/swift-ring-rebalance-zfmvf" Oct 08 13:14:25 crc 
kubenswrapper[4735]: I1008 13:14:25.683395 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7c9b6bd7-6443-42da-a807-ff8fd5abbc8a-scripts\") pod \"swift-ring-rebalance-zfmvf\" (UID: \"7c9b6bd7-6443-42da-a807-ff8fd5abbc8a\") " pod="openstack/swift-ring-rebalance-zfmvf" Oct 08 13:14:25 crc kubenswrapper[4735]: I1008 13:14:25.686811 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7c9b6bd7-6443-42da-a807-ff8fd5abbc8a-dispersionconf\") pod \"swift-ring-rebalance-zfmvf\" (UID: \"7c9b6bd7-6443-42da-a807-ff8fd5abbc8a\") " pod="openstack/swift-ring-rebalance-zfmvf" Oct 08 13:14:25 crc kubenswrapper[4735]: I1008 13:14:25.686966 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7c9b6bd7-6443-42da-a807-ff8fd5abbc8a-swiftconf\") pod \"swift-ring-rebalance-zfmvf\" (UID: \"7c9b6bd7-6443-42da-a807-ff8fd5abbc8a\") " pod="openstack/swift-ring-rebalance-zfmvf" Oct 08 13:14:25 crc kubenswrapper[4735]: I1008 13:14:25.687525 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c9b6bd7-6443-42da-a807-ff8fd5abbc8a-combined-ca-bundle\") pod \"swift-ring-rebalance-zfmvf\" (UID: \"7c9b6bd7-6443-42da-a807-ff8fd5abbc8a\") " pod="openstack/swift-ring-rebalance-zfmvf" Oct 08 13:14:25 crc kubenswrapper[4735]: I1008 13:14:25.703505 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v77wt\" (UniqueName: \"kubernetes.io/projected/7c9b6bd7-6443-42da-a807-ff8fd5abbc8a-kube-api-access-v77wt\") pod \"swift-ring-rebalance-zfmvf\" (UID: \"7c9b6bd7-6443-42da-a807-ff8fd5abbc8a\") " pod="openstack/swift-ring-rebalance-zfmvf" Oct 08 13:14:25 crc kubenswrapper[4735]: I1008 13:14:25.726824 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-zfmvf" Oct 08 13:14:25 crc kubenswrapper[4735]: I1008 13:14:25.954897 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-dc9d58d7-g2qzr" Oct 08 13:14:26 crc kubenswrapper[4735]: I1008 13:14:26.086680 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aadfd0e2-e652-4aa4-850c-8fef23ae29ea-ovsdbserver-sb\") pod \"aadfd0e2-e652-4aa4-850c-8fef23ae29ea\" (UID: \"aadfd0e2-e652-4aa4-850c-8fef23ae29ea\") " Oct 08 13:14:26 crc kubenswrapper[4735]: I1008 13:14:26.086733 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aadfd0e2-e652-4aa4-850c-8fef23ae29ea-dns-svc\") pod \"aadfd0e2-e652-4aa4-850c-8fef23ae29ea\" (UID: \"aadfd0e2-e652-4aa4-850c-8fef23ae29ea\") " Oct 08 13:14:26 crc kubenswrapper[4735]: I1008 13:14:26.086807 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aadfd0e2-e652-4aa4-850c-8fef23ae29ea-config\") pod \"aadfd0e2-e652-4aa4-850c-8fef23ae29ea\" (UID: \"aadfd0e2-e652-4aa4-850c-8fef23ae29ea\") " Oct 08 13:14:26 crc kubenswrapper[4735]: I1008 13:14:26.087510 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmtjv\" (UniqueName: \"kubernetes.io/projected/aadfd0e2-e652-4aa4-850c-8fef23ae29ea-kube-api-access-zmtjv\") pod \"aadfd0e2-e652-4aa4-850c-8fef23ae29ea\" (UID: \"aadfd0e2-e652-4aa4-850c-8fef23ae29ea\") " Oct 08 13:14:26 crc kubenswrapper[4735]: I1008 13:14:26.087594 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aadfd0e2-e652-4aa4-850c-8fef23ae29ea-ovsdbserver-nb\") pod \"aadfd0e2-e652-4aa4-850c-8fef23ae29ea\" (UID: 
\"aadfd0e2-e652-4aa4-850c-8fef23ae29ea\") " Oct 08 13:14:26 crc kubenswrapper[4735]: I1008 13:14:26.091827 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aadfd0e2-e652-4aa4-850c-8fef23ae29ea-kube-api-access-zmtjv" (OuterVolumeSpecName: "kube-api-access-zmtjv") pod "aadfd0e2-e652-4aa4-850c-8fef23ae29ea" (UID: "aadfd0e2-e652-4aa4-850c-8fef23ae29ea"). InnerVolumeSpecName "kube-api-access-zmtjv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:14:26 crc kubenswrapper[4735]: I1008 13:14:26.127653 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aadfd0e2-e652-4aa4-850c-8fef23ae29ea-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "aadfd0e2-e652-4aa4-850c-8fef23ae29ea" (UID: "aadfd0e2-e652-4aa4-850c-8fef23ae29ea"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:14:26 crc kubenswrapper[4735]: I1008 13:14:26.131962 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aadfd0e2-e652-4aa4-850c-8fef23ae29ea-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "aadfd0e2-e652-4aa4-850c-8fef23ae29ea" (UID: "aadfd0e2-e652-4aa4-850c-8fef23ae29ea"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:14:26 crc kubenswrapper[4735]: I1008 13:14:26.133507 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aadfd0e2-e652-4aa4-850c-8fef23ae29ea-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "aadfd0e2-e652-4aa4-850c-8fef23ae29ea" (UID: "aadfd0e2-e652-4aa4-850c-8fef23ae29ea"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:14:26 crc kubenswrapper[4735]: I1008 13:14:26.144656 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aadfd0e2-e652-4aa4-850c-8fef23ae29ea-config" (OuterVolumeSpecName: "config") pod "aadfd0e2-e652-4aa4-850c-8fef23ae29ea" (UID: "aadfd0e2-e652-4aa4-850c-8fef23ae29ea"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:14:26 crc kubenswrapper[4735]: I1008 13:14:26.196170 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmtjv\" (UniqueName: \"kubernetes.io/projected/aadfd0e2-e652-4aa4-850c-8fef23ae29ea-kube-api-access-zmtjv\") on node \"crc\" DevicePath \"\"" Oct 08 13:14:26 crc kubenswrapper[4735]: I1008 13:14:26.196214 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aadfd0e2-e652-4aa4-850c-8fef23ae29ea-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 13:14:26 crc kubenswrapper[4735]: I1008 13:14:26.196227 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aadfd0e2-e652-4aa4-850c-8fef23ae29ea-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 13:14:26 crc kubenswrapper[4735]: I1008 13:14:26.196240 4735 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aadfd0e2-e652-4aa4-850c-8fef23ae29ea-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 13:14:26 crc kubenswrapper[4735]: I1008 13:14:26.196250 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aadfd0e2-e652-4aa4-850c-8fef23ae29ea-config\") on node \"crc\" DevicePath \"\"" Oct 08 13:14:26 crc kubenswrapper[4735]: W1008 13:14:26.255233 4735 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c9b6bd7_6443_42da_a807_ff8fd5abbc8a.slice/crio-4545f1140b84753631e47decbbba47446325cb67f8094c81dae7089ff4207e74 WatchSource:0}: Error finding container 4545f1140b84753631e47decbbba47446325cb67f8094c81dae7089ff4207e74: Status 404 returned error can't find the container with id 4545f1140b84753631e47decbbba47446325cb67f8094c81dae7089ff4207e74 Oct 08 13:14:26 crc kubenswrapper[4735]: I1008 13:14:26.262675 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-zfmvf"] Oct 08 13:14:26 crc kubenswrapper[4735]: I1008 13:14:26.530466 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b587f8db7-zv2bg" event={"ID":"3b3f87a3-e916-4f02-9a1d-730a6bc5a71f","Type":"ContainerStarted","Data":"d24591f84f2a215e0b89883b252d8bd0fa2853bfcd15f6e7b000372b02ec7e5d"} Oct 08 13:14:26 crc kubenswrapper[4735]: I1008 13:14:26.530645 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b587f8db7-zv2bg" Oct 08 13:14:26 crc kubenswrapper[4735]: I1008 13:14:26.532043 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6","Type":"ContainerStarted","Data":"14207a6e2a9596d194069fc196983081842e33d540bef5d9124e59a802e7d234"} Oct 08 13:14:26 crc kubenswrapper[4735]: I1008 13:14:26.532111 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Oct 08 13:14:26 crc kubenswrapper[4735]: I1008 13:14:26.533266 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-zfmvf" event={"ID":"7c9b6bd7-6443-42da-a807-ff8fd5abbc8a","Type":"ContainerStarted","Data":"4545f1140b84753631e47decbbba47446325cb67f8094c81dae7089ff4207e74"} Oct 08 13:14:26 crc kubenswrapper[4735]: I1008 13:14:26.534861 4735 generic.go:334] "Generic (PLEG): container finished" 
podID="aadfd0e2-e652-4aa4-850c-8fef23ae29ea" containerID="144c1ffa1d4d8d0644f0a8113485e5982a9c65223eeb58e5a54369b507c9ff98" exitCode=0 Oct 08 13:14:26 crc kubenswrapper[4735]: I1008 13:14:26.534892 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dc9d58d7-g2qzr" event={"ID":"aadfd0e2-e652-4aa4-850c-8fef23ae29ea","Type":"ContainerDied","Data":"144c1ffa1d4d8d0644f0a8113485e5982a9c65223eeb58e5a54369b507c9ff98"} Oct 08 13:14:26 crc kubenswrapper[4735]: I1008 13:14:26.534927 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dc9d58d7-g2qzr" event={"ID":"aadfd0e2-e652-4aa4-850c-8fef23ae29ea","Type":"ContainerDied","Data":"b02d8741a9d306b30a2c226205fbe2c59b1793d69f6ef2e91bc5ccc1f1c1245a"} Oct 08 13:14:26 crc kubenswrapper[4735]: I1008 13:14:26.534941 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-dc9d58d7-g2qzr" Oct 08 13:14:26 crc kubenswrapper[4735]: I1008 13:14:26.534950 4735 scope.go:117] "RemoveContainer" containerID="144c1ffa1d4d8d0644f0a8113485e5982a9c65223eeb58e5a54369b507c9ff98" Oct 08 13:14:26 crc kubenswrapper[4735]: I1008 13:14:26.556460 4735 scope.go:117] "RemoveContainer" containerID="832fdda58551a87d708e18e2801a7e7dd965cc3e8e6a0fa0c5742480ef6f7ffb" Oct 08 13:14:26 crc kubenswrapper[4735]: I1008 13:14:26.558065 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b587f8db7-zv2bg" podStartSLOduration=3.558051216 podStartE2EDuration="3.558051216s" podCreationTimestamp="2025-10-08 13:14:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 13:14:26.548362111 +0000 UTC m=+1116.356666655" watchObservedRunningTime="2025-10-08 13:14:26.558051216 +0000 UTC m=+1116.366355760" Oct 08 13:14:26 crc kubenswrapper[4735]: I1008 13:14:26.579848 4735 scope.go:117] "RemoveContainer" 
containerID="144c1ffa1d4d8d0644f0a8113485e5982a9c65223eeb58e5a54369b507c9ff98" Oct 08 13:14:26 crc kubenswrapper[4735]: E1008 13:14:26.580448 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"144c1ffa1d4d8d0644f0a8113485e5982a9c65223eeb58e5a54369b507c9ff98\": container with ID starting with 144c1ffa1d4d8d0644f0a8113485e5982a9c65223eeb58e5a54369b507c9ff98 not found: ID does not exist" containerID="144c1ffa1d4d8d0644f0a8113485e5982a9c65223eeb58e5a54369b507c9ff98" Oct 08 13:14:26 crc kubenswrapper[4735]: I1008 13:14:26.580614 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"144c1ffa1d4d8d0644f0a8113485e5982a9c65223eeb58e5a54369b507c9ff98"} err="failed to get container status \"144c1ffa1d4d8d0644f0a8113485e5982a9c65223eeb58e5a54369b507c9ff98\": rpc error: code = NotFound desc = could not find container \"144c1ffa1d4d8d0644f0a8113485e5982a9c65223eeb58e5a54369b507c9ff98\": container with ID starting with 144c1ffa1d4d8d0644f0a8113485e5982a9c65223eeb58e5a54369b507c9ff98 not found: ID does not exist" Oct 08 13:14:26 crc kubenswrapper[4735]: I1008 13:14:26.580717 4735 scope.go:117] "RemoveContainer" containerID="832fdda58551a87d708e18e2801a7e7dd965cc3e8e6a0fa0c5742480ef6f7ffb" Oct 08 13:14:26 crc kubenswrapper[4735]: E1008 13:14:26.581258 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"832fdda58551a87d708e18e2801a7e7dd965cc3e8e6a0fa0c5742480ef6f7ffb\": container with ID starting with 832fdda58551a87d708e18e2801a7e7dd965cc3e8e6a0fa0c5742480ef6f7ffb not found: ID does not exist" containerID="832fdda58551a87d708e18e2801a7e7dd965cc3e8e6a0fa0c5742480ef6f7ffb" Oct 08 13:14:26 crc kubenswrapper[4735]: I1008 13:14:26.581309 4735 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"832fdda58551a87d708e18e2801a7e7dd965cc3e8e6a0fa0c5742480ef6f7ffb"} err="failed to get container status \"832fdda58551a87d708e18e2801a7e7dd965cc3e8e6a0fa0c5742480ef6f7ffb\": rpc error: code = NotFound desc = could not find container \"832fdda58551a87d708e18e2801a7e7dd965cc3e8e6a0fa0c5742480ef6f7ffb\": container with ID starting with 832fdda58551a87d708e18e2801a7e7dd965cc3e8e6a0fa0c5742480ef6f7ffb not found: ID does not exist" Oct 08 13:14:26 crc kubenswrapper[4735]: I1008 13:14:26.586759 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=3.155579424 podStartE2EDuration="4.586736864s" podCreationTimestamp="2025-10-08 13:14:22 +0000 UTC" firstStartedPulling="2025-10-08 13:14:23.72957563 +0000 UTC m=+1113.537880174" lastFinishedPulling="2025-10-08 13:14:25.16073307 +0000 UTC m=+1114.969037614" observedRunningTime="2025-10-08 13:14:26.581024273 +0000 UTC m=+1116.389328817" watchObservedRunningTime="2025-10-08 13:14:26.586736864 +0000 UTC m=+1116.395041408" Oct 08 13:14:26 crc kubenswrapper[4735]: I1008 13:14:26.602239 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/dc4e9066-e538-4d60-9c5a-3beb689d67a4-etc-swift\") pod \"swift-storage-0\" (UID: \"dc4e9066-e538-4d60-9c5a-3beb689d67a4\") " pod="openstack/swift-storage-0" Oct 08 13:14:26 crc kubenswrapper[4735]: E1008 13:14:26.602460 4735 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 08 13:14:26 crc kubenswrapper[4735]: E1008 13:14:26.602912 4735 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 08 13:14:26 crc kubenswrapper[4735]: E1008 13:14:26.602970 4735 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/dc4e9066-e538-4d60-9c5a-3beb689d67a4-etc-swift podName:dc4e9066-e538-4d60-9c5a-3beb689d67a4 nodeName:}" failed. No retries permitted until 2025-10-08 13:14:28.602948401 +0000 UTC m=+1118.411252945 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/dc4e9066-e538-4d60-9c5a-3beb689d67a4-etc-swift") pod "swift-storage-0" (UID: "dc4e9066-e538-4d60-9c5a-3beb689d67a4") : configmap "swift-ring-files" not found Oct 08 13:14:26 crc kubenswrapper[4735]: I1008 13:14:26.605845 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-dc9d58d7-g2qzr"] Oct 08 13:14:26 crc kubenswrapper[4735]: I1008 13:14:26.623763 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-dc9d58d7-g2qzr"] Oct 08 13:14:26 crc kubenswrapper[4735]: I1008 13:14:26.905815 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aadfd0e2-e652-4aa4-850c-8fef23ae29ea" path="/var/lib/kubelet/pods/aadfd0e2-e652-4aa4-850c-8fef23ae29ea/volumes" Oct 08 13:14:27 crc kubenswrapper[4735]: I1008 13:14:27.764146 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Oct 08 13:14:27 crc kubenswrapper[4735]: I1008 13:14:27.817311 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Oct 08 13:14:28 crc kubenswrapper[4735]: I1008 13:14:28.662389 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/dc4e9066-e538-4d60-9c5a-3beb689d67a4-etc-swift\") pod \"swift-storage-0\" (UID: \"dc4e9066-e538-4d60-9c5a-3beb689d67a4\") " pod="openstack/swift-storage-0" Oct 08 13:14:28 crc kubenswrapper[4735]: E1008 13:14:28.662678 4735 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 08 13:14:28 crc 
kubenswrapper[4735]: E1008 13:14:28.662800 4735 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 08 13:14:28 crc kubenswrapper[4735]: E1008 13:14:28.662863 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dc4e9066-e538-4d60-9c5a-3beb689d67a4-etc-swift podName:dc4e9066-e538-4d60-9c5a-3beb689d67a4 nodeName:}" failed. No retries permitted until 2025-10-08 13:14:32.662843693 +0000 UTC m=+1122.471148307 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/dc4e9066-e538-4d60-9c5a-3beb689d67a4-etc-swift") pod "swift-storage-0" (UID: "dc4e9066-e538-4d60-9c5a-3beb689d67a4") : configmap "swift-ring-files" not found Oct 08 13:14:31 crc kubenswrapper[4735]: I1008 13:14:31.413402 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-4rmr9"] Oct 08 13:14:31 crc kubenswrapper[4735]: E1008 13:14:31.414171 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aadfd0e2-e652-4aa4-850c-8fef23ae29ea" containerName="init" Oct 08 13:14:31 crc kubenswrapper[4735]: I1008 13:14:31.414188 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="aadfd0e2-e652-4aa4-850c-8fef23ae29ea" containerName="init" Oct 08 13:14:31 crc kubenswrapper[4735]: E1008 13:14:31.414201 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aadfd0e2-e652-4aa4-850c-8fef23ae29ea" containerName="dnsmasq-dns" Oct 08 13:14:31 crc kubenswrapper[4735]: I1008 13:14:31.414209 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="aadfd0e2-e652-4aa4-850c-8fef23ae29ea" containerName="dnsmasq-dns" Oct 08 13:14:31 crc kubenswrapper[4735]: I1008 13:14:31.414447 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="aadfd0e2-e652-4aa4-850c-8fef23ae29ea" containerName="dnsmasq-dns" Oct 08 13:14:31 crc kubenswrapper[4735]: I1008 13:14:31.415090 
4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-4rmr9" Oct 08 13:14:31 crc kubenswrapper[4735]: I1008 13:14:31.422259 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-4rmr9"] Oct 08 13:14:31 crc kubenswrapper[4735]: I1008 13:14:31.519262 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48fqf\" (UniqueName: \"kubernetes.io/projected/e3a15a0b-b0a1-4781-be73-477726f4007c-kube-api-access-48fqf\") pod \"keystone-db-create-4rmr9\" (UID: \"e3a15a0b-b0a1-4781-be73-477726f4007c\") " pod="openstack/keystone-db-create-4rmr9" Oct 08 13:14:31 crc kubenswrapper[4735]: I1008 13:14:31.583143 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-zfmvf" event={"ID":"7c9b6bd7-6443-42da-a807-ff8fd5abbc8a","Type":"ContainerStarted","Data":"8ad590cd163cd0b833533bdba0e94515d2b83d9f6ef232c27857edbcdd0d8dc7"} Oct 08 13:14:31 crc kubenswrapper[4735]: I1008 13:14:31.610798 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-5vqkx"] Oct 08 13:14:31 crc kubenswrapper[4735]: I1008 13:14:31.612327 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-5vqkx" Oct 08 13:14:31 crc kubenswrapper[4735]: I1008 13:14:31.621842 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48fqf\" (UniqueName: \"kubernetes.io/projected/e3a15a0b-b0a1-4781-be73-477726f4007c-kube-api-access-48fqf\") pod \"keystone-db-create-4rmr9\" (UID: \"e3a15a0b-b0a1-4781-be73-477726f4007c\") " pod="openstack/keystone-db-create-4rmr9" Oct 08 13:14:31 crc kubenswrapper[4735]: I1008 13:14:31.630245 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-5vqkx"] Oct 08 13:14:31 crc kubenswrapper[4735]: I1008 13:14:31.630803 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-zfmvf" podStartSLOduration=1.6042349360000001 podStartE2EDuration="6.63077919s" podCreationTimestamp="2025-10-08 13:14:25 +0000 UTC" firstStartedPulling="2025-10-08 13:14:26.257290669 +0000 UTC m=+1116.065595213" lastFinishedPulling="2025-10-08 13:14:31.283834923 +0000 UTC m=+1121.092139467" observedRunningTime="2025-10-08 13:14:31.623114837 +0000 UTC m=+1121.431419381" watchObservedRunningTime="2025-10-08 13:14:31.63077919 +0000 UTC m=+1121.439083734" Oct 08 13:14:31 crc kubenswrapper[4735]: I1008 13:14:31.644986 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48fqf\" (UniqueName: \"kubernetes.io/projected/e3a15a0b-b0a1-4781-be73-477726f4007c-kube-api-access-48fqf\") pod \"keystone-db-create-4rmr9\" (UID: \"e3a15a0b-b0a1-4781-be73-477726f4007c\") " pod="openstack/keystone-db-create-4rmr9" Oct 08 13:14:31 crc kubenswrapper[4735]: I1008 13:14:31.723566 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdwm6\" (UniqueName: \"kubernetes.io/projected/2c371503-0a4b-4fdb-90c8-49bd2bef645e-kube-api-access-tdwm6\") pod \"placement-db-create-5vqkx\" (UID: 
\"2c371503-0a4b-4fdb-90c8-49bd2bef645e\") " pod="openstack/placement-db-create-5vqkx" Oct 08 13:14:31 crc kubenswrapper[4735]: I1008 13:14:31.745058 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-4rmr9" Oct 08 13:14:31 crc kubenswrapper[4735]: I1008 13:14:31.825650 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdwm6\" (UniqueName: \"kubernetes.io/projected/2c371503-0a4b-4fdb-90c8-49bd2bef645e-kube-api-access-tdwm6\") pod \"placement-db-create-5vqkx\" (UID: \"2c371503-0a4b-4fdb-90c8-49bd2bef645e\") " pod="openstack/placement-db-create-5vqkx" Oct 08 13:14:31 crc kubenswrapper[4735]: I1008 13:14:31.883634 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-krzkw"] Oct 08 13:14:31 crc kubenswrapper[4735]: I1008 13:14:31.884583 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdwm6\" (UniqueName: \"kubernetes.io/projected/2c371503-0a4b-4fdb-90c8-49bd2bef645e-kube-api-access-tdwm6\") pod \"placement-db-create-5vqkx\" (UID: \"2c371503-0a4b-4fdb-90c8-49bd2bef645e\") " pod="openstack/placement-db-create-5vqkx" Oct 08 13:14:31 crc kubenswrapper[4735]: I1008 13:14:31.885302 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-krzkw" Oct 08 13:14:31 crc kubenswrapper[4735]: I1008 13:14:31.904420 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-krzkw"] Oct 08 13:14:31 crc kubenswrapper[4735]: I1008 13:14:31.948197 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-5vqkx" Oct 08 13:14:32 crc kubenswrapper[4735]: I1008 13:14:32.029225 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9gbf\" (UniqueName: \"kubernetes.io/projected/6056bb74-3465-4e20-8b68-877132ab11fb-kube-api-access-z9gbf\") pod \"glance-db-create-krzkw\" (UID: \"6056bb74-3465-4e20-8b68-877132ab11fb\") " pod="openstack/glance-db-create-krzkw" Oct 08 13:14:32 crc kubenswrapper[4735]: I1008 13:14:32.130707 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9gbf\" (UniqueName: \"kubernetes.io/projected/6056bb74-3465-4e20-8b68-877132ab11fb-kube-api-access-z9gbf\") pod \"glance-db-create-krzkw\" (UID: \"6056bb74-3465-4e20-8b68-877132ab11fb\") " pod="openstack/glance-db-create-krzkw" Oct 08 13:14:32 crc kubenswrapper[4735]: I1008 13:14:32.148176 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9gbf\" (UniqueName: \"kubernetes.io/projected/6056bb74-3465-4e20-8b68-877132ab11fb-kube-api-access-z9gbf\") pod \"glance-db-create-krzkw\" (UID: \"6056bb74-3465-4e20-8b68-877132ab11fb\") " pod="openstack/glance-db-create-krzkw" Oct 08 13:14:32 crc kubenswrapper[4735]: I1008 13:14:32.205165 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-4rmr9"] Oct 08 13:14:32 crc kubenswrapper[4735]: W1008 13:14:32.207616 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3a15a0b_b0a1_4781_be73_477726f4007c.slice/crio-8126b688818ca04b5e1ab8def5de00cd6ba55a0ab4628380ef17aaa518881b5a WatchSource:0}: Error finding container 8126b688818ca04b5e1ab8def5de00cd6ba55a0ab4628380ef17aaa518881b5a: Status 404 returned error can't find the container with id 8126b688818ca04b5e1ab8def5de00cd6ba55a0ab4628380ef17aaa518881b5a Oct 08 13:14:32 crc kubenswrapper[4735]: I1008 
13:14:32.215459 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-krzkw" Oct 08 13:14:32 crc kubenswrapper[4735]: I1008 13:14:32.362203 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-5vqkx"] Oct 08 13:14:32 crc kubenswrapper[4735]: I1008 13:14:32.456843 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-krzkw"] Oct 08 13:14:32 crc kubenswrapper[4735]: W1008 13:14:32.459739 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6056bb74_3465_4e20_8b68_877132ab11fb.slice/crio-8b7231d62ec6ca1a2d9a51a08359483a50dc62206f73568952b20230c0537f9e WatchSource:0}: Error finding container 8b7231d62ec6ca1a2d9a51a08359483a50dc62206f73568952b20230c0537f9e: Status 404 returned error can't find the container with id 8b7231d62ec6ca1a2d9a51a08359483a50dc62206f73568952b20230c0537f9e Oct 08 13:14:32 crc kubenswrapper[4735]: I1008 13:14:32.592064 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-krzkw" event={"ID":"6056bb74-3465-4e20-8b68-877132ab11fb","Type":"ContainerStarted","Data":"8b7231d62ec6ca1a2d9a51a08359483a50dc62206f73568952b20230c0537f9e"} Oct 08 13:14:32 crc kubenswrapper[4735]: I1008 13:14:32.593727 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-5vqkx" event={"ID":"2c371503-0a4b-4fdb-90c8-49bd2bef645e","Type":"ContainerStarted","Data":"71e09fd134d97531c7a2bd2499f7d010024e832feb3907261babbd39cdcb2c04"} Oct 08 13:14:32 crc kubenswrapper[4735]: I1008 13:14:32.596234 4735 generic.go:334] "Generic (PLEG): container finished" podID="e3a15a0b-b0a1-4781-be73-477726f4007c" containerID="213b4bd5d68f294cd07d89000b52b41d82b1fab369579e70e28047513a003eba" exitCode=0 Oct 08 13:14:32 crc kubenswrapper[4735]: I1008 13:14:32.596310 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-db-create-4rmr9" event={"ID":"e3a15a0b-b0a1-4781-be73-477726f4007c","Type":"ContainerDied","Data":"213b4bd5d68f294cd07d89000b52b41d82b1fab369579e70e28047513a003eba"} Oct 08 13:14:32 crc kubenswrapper[4735]: I1008 13:14:32.596349 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-4rmr9" event={"ID":"e3a15a0b-b0a1-4781-be73-477726f4007c","Type":"ContainerStarted","Data":"8126b688818ca04b5e1ab8def5de00cd6ba55a0ab4628380ef17aaa518881b5a"} Oct 08 13:14:32 crc kubenswrapper[4735]: I1008 13:14:32.739592 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/dc4e9066-e538-4d60-9c5a-3beb689d67a4-etc-swift\") pod \"swift-storage-0\" (UID: \"dc4e9066-e538-4d60-9c5a-3beb689d67a4\") " pod="openstack/swift-storage-0" Oct 08 13:14:32 crc kubenswrapper[4735]: E1008 13:14:32.739762 4735 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 08 13:14:32 crc kubenswrapper[4735]: E1008 13:14:32.740058 4735 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 08 13:14:32 crc kubenswrapper[4735]: E1008 13:14:32.740166 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dc4e9066-e538-4d60-9c5a-3beb689d67a4-etc-swift podName:dc4e9066-e538-4d60-9c5a-3beb689d67a4 nodeName:}" failed. No retries permitted until 2025-10-08 13:14:40.740143527 +0000 UTC m=+1130.548448091 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/dc4e9066-e538-4d60-9c5a-3beb689d67a4-etc-swift") pod "swift-storage-0" (UID: "dc4e9066-e538-4d60-9c5a-3beb689d67a4") : configmap "swift-ring-files" not found Oct 08 13:14:33 crc kubenswrapper[4735]: I1008 13:14:33.623025 4735 generic.go:334] "Generic (PLEG): container finished" podID="2c371503-0a4b-4fdb-90c8-49bd2bef645e" containerID="33ec5f1c7804a57807089942addba2fa4640a3d316649a488561eedd5c819134" exitCode=0 Oct 08 13:14:33 crc kubenswrapper[4735]: I1008 13:14:33.623076 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-5vqkx" event={"ID":"2c371503-0a4b-4fdb-90c8-49bd2bef645e","Type":"ContainerDied","Data":"33ec5f1c7804a57807089942addba2fa4640a3d316649a488561eedd5c819134"} Oct 08 13:14:33 crc kubenswrapper[4735]: I1008 13:14:33.626519 4735 generic.go:334] "Generic (PLEG): container finished" podID="6056bb74-3465-4e20-8b68-877132ab11fb" containerID="2216655515a74b06489f85b13335788b90aae03c4ad2036f42b36ac4d920c85b" exitCode=0 Oct 08 13:14:33 crc kubenswrapper[4735]: I1008 13:14:33.626598 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-krzkw" event={"ID":"6056bb74-3465-4e20-8b68-877132ab11fb","Type":"ContainerDied","Data":"2216655515a74b06489f85b13335788b90aae03c4ad2036f42b36ac4d920c85b"} Oct 08 13:14:33 crc kubenswrapper[4735]: I1008 13:14:33.915703 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-4rmr9" Oct 08 13:14:33 crc kubenswrapper[4735]: I1008 13:14:33.961119 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48fqf\" (UniqueName: \"kubernetes.io/projected/e3a15a0b-b0a1-4781-be73-477726f4007c-kube-api-access-48fqf\") pod \"e3a15a0b-b0a1-4781-be73-477726f4007c\" (UID: \"e3a15a0b-b0a1-4781-be73-477726f4007c\") " Oct 08 13:14:33 crc kubenswrapper[4735]: I1008 13:14:33.971707 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3a15a0b-b0a1-4781-be73-477726f4007c-kube-api-access-48fqf" (OuterVolumeSpecName: "kube-api-access-48fqf") pod "e3a15a0b-b0a1-4781-be73-477726f4007c" (UID: "e3a15a0b-b0a1-4781-be73-477726f4007c"). InnerVolumeSpecName "kube-api-access-48fqf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:14:34 crc kubenswrapper[4735]: I1008 13:14:34.054389 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7b587f8db7-zv2bg" Oct 08 13:14:34 crc kubenswrapper[4735]: I1008 13:14:34.063343 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48fqf\" (UniqueName: \"kubernetes.io/projected/e3a15a0b-b0a1-4781-be73-477726f4007c-kube-api-access-48fqf\") on node \"crc\" DevicePath \"\"" Oct 08 13:14:34 crc kubenswrapper[4735]: I1008 13:14:34.104022 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77597f887-vdnj6"] Oct 08 13:14:34 crc kubenswrapper[4735]: I1008 13:14:34.104275 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77597f887-vdnj6" podUID="dbf29341-130d-4282-b4a2-31e745b9a6c1" containerName="dnsmasq-dns" containerID="cri-o://de2b2b8feae86d9c4fe7cf5c3f2d7ca1bd188126b7f873e35cf70f7755f18338" gracePeriod=10 Oct 08 13:14:34 crc kubenswrapper[4735]: I1008 13:14:34.588942 4735 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-77597f887-vdnj6" Oct 08 13:14:34 crc kubenswrapper[4735]: I1008 13:14:34.636222 4735 generic.go:334] "Generic (PLEG): container finished" podID="dbf29341-130d-4282-b4a2-31e745b9a6c1" containerID="de2b2b8feae86d9c4fe7cf5c3f2d7ca1bd188126b7f873e35cf70f7755f18338" exitCode=0 Oct 08 13:14:34 crc kubenswrapper[4735]: I1008 13:14:34.636299 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77597f887-vdnj6" event={"ID":"dbf29341-130d-4282-b4a2-31e745b9a6c1","Type":"ContainerDied","Data":"de2b2b8feae86d9c4fe7cf5c3f2d7ca1bd188126b7f873e35cf70f7755f18338"} Oct 08 13:14:34 crc kubenswrapper[4735]: I1008 13:14:34.636330 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77597f887-vdnj6" event={"ID":"dbf29341-130d-4282-b4a2-31e745b9a6c1","Type":"ContainerDied","Data":"161d99a3851bb0887a50470038609e001f1a71ebd91f66f1ac98ba8cbc5846f6"} Oct 08 13:14:34 crc kubenswrapper[4735]: I1008 13:14:34.636349 4735 scope.go:117] "RemoveContainer" containerID="de2b2b8feae86d9c4fe7cf5c3f2d7ca1bd188126b7f873e35cf70f7755f18338" Oct 08 13:14:34 crc kubenswrapper[4735]: I1008 13:14:34.636466 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77597f887-vdnj6" Oct 08 13:14:34 crc kubenswrapper[4735]: I1008 13:14:34.639342 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-4rmr9" Oct 08 13:14:34 crc kubenswrapper[4735]: I1008 13:14:34.639566 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-4rmr9" event={"ID":"e3a15a0b-b0a1-4781-be73-477726f4007c","Type":"ContainerDied","Data":"8126b688818ca04b5e1ab8def5de00cd6ba55a0ab4628380ef17aaa518881b5a"} Oct 08 13:14:34 crc kubenswrapper[4735]: I1008 13:14:34.639617 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8126b688818ca04b5e1ab8def5de00cd6ba55a0ab4628380ef17aaa518881b5a" Oct 08 13:14:34 crc kubenswrapper[4735]: I1008 13:14:34.661438 4735 scope.go:117] "RemoveContainer" containerID="90cead34c38543d5b210e9e02e51b48a4b6cbc765660613866b6ebe3ceb1ea92" Oct 08 13:14:34 crc kubenswrapper[4735]: I1008 13:14:34.674439 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbf29341-130d-4282-b4a2-31e745b9a6c1-config\") pod \"dbf29341-130d-4282-b4a2-31e745b9a6c1\" (UID: \"dbf29341-130d-4282-b4a2-31e745b9a6c1\") " Oct 08 13:14:34 crc kubenswrapper[4735]: I1008 13:14:34.674583 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dbf29341-130d-4282-b4a2-31e745b9a6c1-dns-svc\") pod \"dbf29341-130d-4282-b4a2-31e745b9a6c1\" (UID: \"dbf29341-130d-4282-b4a2-31e745b9a6c1\") " Oct 08 13:14:34 crc kubenswrapper[4735]: I1008 13:14:34.674681 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jk2b9\" (UniqueName: \"kubernetes.io/projected/dbf29341-130d-4282-b4a2-31e745b9a6c1-kube-api-access-jk2b9\") pod \"dbf29341-130d-4282-b4a2-31e745b9a6c1\" (UID: \"dbf29341-130d-4282-b4a2-31e745b9a6c1\") " Oct 08 13:14:34 crc kubenswrapper[4735]: I1008 13:14:34.681696 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/dbf29341-130d-4282-b4a2-31e745b9a6c1-kube-api-access-jk2b9" (OuterVolumeSpecName: "kube-api-access-jk2b9") pod "dbf29341-130d-4282-b4a2-31e745b9a6c1" (UID: "dbf29341-130d-4282-b4a2-31e745b9a6c1"). InnerVolumeSpecName "kube-api-access-jk2b9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:14:34 crc kubenswrapper[4735]: I1008 13:14:34.689650 4735 scope.go:117] "RemoveContainer" containerID="de2b2b8feae86d9c4fe7cf5c3f2d7ca1bd188126b7f873e35cf70f7755f18338" Oct 08 13:14:34 crc kubenswrapper[4735]: E1008 13:14:34.690188 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de2b2b8feae86d9c4fe7cf5c3f2d7ca1bd188126b7f873e35cf70f7755f18338\": container with ID starting with de2b2b8feae86d9c4fe7cf5c3f2d7ca1bd188126b7f873e35cf70f7755f18338 not found: ID does not exist" containerID="de2b2b8feae86d9c4fe7cf5c3f2d7ca1bd188126b7f873e35cf70f7755f18338" Oct 08 13:14:34 crc kubenswrapper[4735]: I1008 13:14:34.690231 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de2b2b8feae86d9c4fe7cf5c3f2d7ca1bd188126b7f873e35cf70f7755f18338"} err="failed to get container status \"de2b2b8feae86d9c4fe7cf5c3f2d7ca1bd188126b7f873e35cf70f7755f18338\": rpc error: code = NotFound desc = could not find container \"de2b2b8feae86d9c4fe7cf5c3f2d7ca1bd188126b7f873e35cf70f7755f18338\": container with ID starting with de2b2b8feae86d9c4fe7cf5c3f2d7ca1bd188126b7f873e35cf70f7755f18338 not found: ID does not exist" Oct 08 13:14:34 crc kubenswrapper[4735]: I1008 13:14:34.690255 4735 scope.go:117] "RemoveContainer" containerID="90cead34c38543d5b210e9e02e51b48a4b6cbc765660613866b6ebe3ceb1ea92" Oct 08 13:14:34 crc kubenswrapper[4735]: E1008 13:14:34.693707 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90cead34c38543d5b210e9e02e51b48a4b6cbc765660613866b6ebe3ceb1ea92\": 
container with ID starting with 90cead34c38543d5b210e9e02e51b48a4b6cbc765660613866b6ebe3ceb1ea92 not found: ID does not exist" containerID="90cead34c38543d5b210e9e02e51b48a4b6cbc765660613866b6ebe3ceb1ea92" Oct 08 13:14:34 crc kubenswrapper[4735]: I1008 13:14:34.693737 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90cead34c38543d5b210e9e02e51b48a4b6cbc765660613866b6ebe3ceb1ea92"} err="failed to get container status \"90cead34c38543d5b210e9e02e51b48a4b6cbc765660613866b6ebe3ceb1ea92\": rpc error: code = NotFound desc = could not find container \"90cead34c38543d5b210e9e02e51b48a4b6cbc765660613866b6ebe3ceb1ea92\": container with ID starting with 90cead34c38543d5b210e9e02e51b48a4b6cbc765660613866b6ebe3ceb1ea92 not found: ID does not exist" Oct 08 13:14:34 crc kubenswrapper[4735]: I1008 13:14:34.717291 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbf29341-130d-4282-b4a2-31e745b9a6c1-config" (OuterVolumeSpecName: "config") pod "dbf29341-130d-4282-b4a2-31e745b9a6c1" (UID: "dbf29341-130d-4282-b4a2-31e745b9a6c1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:14:34 crc kubenswrapper[4735]: I1008 13:14:34.720944 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbf29341-130d-4282-b4a2-31e745b9a6c1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dbf29341-130d-4282-b4a2-31e745b9a6c1" (UID: "dbf29341-130d-4282-b4a2-31e745b9a6c1"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:14:34 crc kubenswrapper[4735]: I1008 13:14:34.777806 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jk2b9\" (UniqueName: \"kubernetes.io/projected/dbf29341-130d-4282-b4a2-31e745b9a6c1-kube-api-access-jk2b9\") on node \"crc\" DevicePath \"\"" Oct 08 13:14:34 crc kubenswrapper[4735]: I1008 13:14:34.777851 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbf29341-130d-4282-b4a2-31e745b9a6c1-config\") on node \"crc\" DevicePath \"\"" Oct 08 13:14:34 crc kubenswrapper[4735]: I1008 13:14:34.777862 4735 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dbf29341-130d-4282-b4a2-31e745b9a6c1-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 13:14:35 crc kubenswrapper[4735]: I1008 13:14:35.016104 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-krzkw" Oct 08 13:14:35 crc kubenswrapper[4735]: I1008 13:14:35.031958 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77597f887-vdnj6"] Oct 08 13:14:35 crc kubenswrapper[4735]: I1008 13:14:35.032862 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-5vqkx" Oct 08 13:14:35 crc kubenswrapper[4735]: I1008 13:14:35.037909 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77597f887-vdnj6"] Oct 08 13:14:35 crc kubenswrapper[4735]: I1008 13:14:35.080963 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdwm6\" (UniqueName: \"kubernetes.io/projected/2c371503-0a4b-4fdb-90c8-49bd2bef645e-kube-api-access-tdwm6\") pod \"2c371503-0a4b-4fdb-90c8-49bd2bef645e\" (UID: \"2c371503-0a4b-4fdb-90c8-49bd2bef645e\") " Oct 08 13:14:35 crc kubenswrapper[4735]: I1008 13:14:35.081148 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9gbf\" (UniqueName: \"kubernetes.io/projected/6056bb74-3465-4e20-8b68-877132ab11fb-kube-api-access-z9gbf\") pod \"6056bb74-3465-4e20-8b68-877132ab11fb\" (UID: \"6056bb74-3465-4e20-8b68-877132ab11fb\") " Oct 08 13:14:35 crc kubenswrapper[4735]: I1008 13:14:35.084795 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c371503-0a4b-4fdb-90c8-49bd2bef645e-kube-api-access-tdwm6" (OuterVolumeSpecName: "kube-api-access-tdwm6") pod "2c371503-0a4b-4fdb-90c8-49bd2bef645e" (UID: "2c371503-0a4b-4fdb-90c8-49bd2bef645e"). InnerVolumeSpecName "kube-api-access-tdwm6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:14:35 crc kubenswrapper[4735]: I1008 13:14:35.092827 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6056bb74-3465-4e20-8b68-877132ab11fb-kube-api-access-z9gbf" (OuterVolumeSpecName: "kube-api-access-z9gbf") pod "6056bb74-3465-4e20-8b68-877132ab11fb" (UID: "6056bb74-3465-4e20-8b68-877132ab11fb"). InnerVolumeSpecName "kube-api-access-z9gbf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:14:35 crc kubenswrapper[4735]: E1008 13:14:35.134237 4735 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddbf29341_130d_4282_b4a2_31e745b9a6c1.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddbf29341_130d_4282_b4a2_31e745b9a6c1.slice/crio-161d99a3851bb0887a50470038609e001f1a71ebd91f66f1ac98ba8cbc5846f6\": RecentStats: unable to find data in memory cache]" Oct 08 13:14:35 crc kubenswrapper[4735]: I1008 13:14:35.182938 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9gbf\" (UniqueName: \"kubernetes.io/projected/6056bb74-3465-4e20-8b68-877132ab11fb-kube-api-access-z9gbf\") on node \"crc\" DevicePath \"\"" Oct 08 13:14:35 crc kubenswrapper[4735]: I1008 13:14:35.182973 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tdwm6\" (UniqueName: \"kubernetes.io/projected/2c371503-0a4b-4fdb-90c8-49bd2bef645e-kube-api-access-tdwm6\") on node \"crc\" DevicePath \"\"" Oct 08 13:14:35 crc kubenswrapper[4735]: I1008 13:14:35.652144 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-krzkw" event={"ID":"6056bb74-3465-4e20-8b68-877132ab11fb","Type":"ContainerDied","Data":"8b7231d62ec6ca1a2d9a51a08359483a50dc62206f73568952b20230c0537f9e"} Oct 08 13:14:35 crc kubenswrapper[4735]: I1008 13:14:35.652509 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b7231d62ec6ca1a2d9a51a08359483a50dc62206f73568952b20230c0537f9e" Oct 08 13:14:35 crc kubenswrapper[4735]: I1008 13:14:35.652184 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-krzkw" Oct 08 13:14:35 crc kubenswrapper[4735]: I1008 13:14:35.654212 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-5vqkx" event={"ID":"2c371503-0a4b-4fdb-90c8-49bd2bef645e","Type":"ContainerDied","Data":"71e09fd134d97531c7a2bd2499f7d010024e832feb3907261babbd39cdcb2c04"} Oct 08 13:14:35 crc kubenswrapper[4735]: I1008 13:14:35.654240 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71e09fd134d97531c7a2bd2499f7d010024e832feb3907261babbd39cdcb2c04" Oct 08 13:14:35 crc kubenswrapper[4735]: I1008 13:14:35.654314 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-5vqkx" Oct 08 13:14:36 crc kubenswrapper[4735]: I1008 13:14:36.904833 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbf29341-130d-4282-b4a2-31e745b9a6c1" path="/var/lib/kubelet/pods/dbf29341-130d-4282-b4a2-31e745b9a6c1/volumes" Oct 08 13:14:38 crc kubenswrapper[4735]: I1008 13:14:38.355532 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Oct 08 13:14:38 crc kubenswrapper[4735]: I1008 13:14:38.679133 4735 generic.go:334] "Generic (PLEG): container finished" podID="7c9b6bd7-6443-42da-a807-ff8fd5abbc8a" containerID="8ad590cd163cd0b833533bdba0e94515d2b83d9f6ef232c27857edbcdd0d8dc7" exitCode=0 Oct 08 13:14:38 crc kubenswrapper[4735]: I1008 13:14:38.679181 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-zfmvf" event={"ID":"7c9b6bd7-6443-42da-a807-ff8fd5abbc8a","Type":"ContainerDied","Data":"8ad590cd163cd0b833533bdba0e94515d2b83d9f6ef232c27857edbcdd0d8dc7"} Oct 08 13:14:39 crc kubenswrapper[4735]: I1008 13:14:39.988224 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-zfmvf" Oct 08 13:14:40 crc kubenswrapper[4735]: I1008 13:14:40.066401 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v77wt\" (UniqueName: \"kubernetes.io/projected/7c9b6bd7-6443-42da-a807-ff8fd5abbc8a-kube-api-access-v77wt\") pod \"7c9b6bd7-6443-42da-a807-ff8fd5abbc8a\" (UID: \"7c9b6bd7-6443-42da-a807-ff8fd5abbc8a\") " Oct 08 13:14:40 crc kubenswrapper[4735]: I1008 13:14:40.066522 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7c9b6bd7-6443-42da-a807-ff8fd5abbc8a-scripts\") pod \"7c9b6bd7-6443-42da-a807-ff8fd5abbc8a\" (UID: \"7c9b6bd7-6443-42da-a807-ff8fd5abbc8a\") " Oct 08 13:14:40 crc kubenswrapper[4735]: I1008 13:14:40.066632 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7c9b6bd7-6443-42da-a807-ff8fd5abbc8a-ring-data-devices\") pod \"7c9b6bd7-6443-42da-a807-ff8fd5abbc8a\" (UID: \"7c9b6bd7-6443-42da-a807-ff8fd5abbc8a\") " Oct 08 13:14:40 crc kubenswrapper[4735]: I1008 13:14:40.066702 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7c9b6bd7-6443-42da-a807-ff8fd5abbc8a-etc-swift\") pod \"7c9b6bd7-6443-42da-a807-ff8fd5abbc8a\" (UID: \"7c9b6bd7-6443-42da-a807-ff8fd5abbc8a\") " Oct 08 13:14:40 crc kubenswrapper[4735]: I1008 13:14:40.066775 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7c9b6bd7-6443-42da-a807-ff8fd5abbc8a-dispersionconf\") pod \"7c9b6bd7-6443-42da-a807-ff8fd5abbc8a\" (UID: \"7c9b6bd7-6443-42da-a807-ff8fd5abbc8a\") " Oct 08 13:14:40 crc kubenswrapper[4735]: I1008 13:14:40.066800 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/7c9b6bd7-6443-42da-a807-ff8fd5abbc8a-combined-ca-bundle\") pod \"7c9b6bd7-6443-42da-a807-ff8fd5abbc8a\" (UID: \"7c9b6bd7-6443-42da-a807-ff8fd5abbc8a\") " Oct 08 13:14:40 crc kubenswrapper[4735]: I1008 13:14:40.066824 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7c9b6bd7-6443-42da-a807-ff8fd5abbc8a-swiftconf\") pod \"7c9b6bd7-6443-42da-a807-ff8fd5abbc8a\" (UID: \"7c9b6bd7-6443-42da-a807-ff8fd5abbc8a\") " Oct 08 13:14:40 crc kubenswrapper[4735]: I1008 13:14:40.068025 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c9b6bd7-6443-42da-a807-ff8fd5abbc8a-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "7c9b6bd7-6443-42da-a807-ff8fd5abbc8a" (UID: "7c9b6bd7-6443-42da-a807-ff8fd5abbc8a"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:14:40 crc kubenswrapper[4735]: I1008 13:14:40.068251 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c9b6bd7-6443-42da-a807-ff8fd5abbc8a-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "7c9b6bd7-6443-42da-a807-ff8fd5abbc8a" (UID: "7c9b6bd7-6443-42da-a807-ff8fd5abbc8a"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:14:40 crc kubenswrapper[4735]: I1008 13:14:40.072784 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c9b6bd7-6443-42da-a807-ff8fd5abbc8a-kube-api-access-v77wt" (OuterVolumeSpecName: "kube-api-access-v77wt") pod "7c9b6bd7-6443-42da-a807-ff8fd5abbc8a" (UID: "7c9b6bd7-6443-42da-a807-ff8fd5abbc8a"). InnerVolumeSpecName "kube-api-access-v77wt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:14:40 crc kubenswrapper[4735]: I1008 13:14:40.075224 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c9b6bd7-6443-42da-a807-ff8fd5abbc8a-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "7c9b6bd7-6443-42da-a807-ff8fd5abbc8a" (UID: "7c9b6bd7-6443-42da-a807-ff8fd5abbc8a"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:14:40 crc kubenswrapper[4735]: I1008 13:14:40.088103 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c9b6bd7-6443-42da-a807-ff8fd5abbc8a-scripts" (OuterVolumeSpecName: "scripts") pod "7c9b6bd7-6443-42da-a807-ff8fd5abbc8a" (UID: "7c9b6bd7-6443-42da-a807-ff8fd5abbc8a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:14:40 crc kubenswrapper[4735]: I1008 13:14:40.089732 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c9b6bd7-6443-42da-a807-ff8fd5abbc8a-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "7c9b6bd7-6443-42da-a807-ff8fd5abbc8a" (UID: "7c9b6bd7-6443-42da-a807-ff8fd5abbc8a"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:14:40 crc kubenswrapper[4735]: I1008 13:14:40.090086 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c9b6bd7-6443-42da-a807-ff8fd5abbc8a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7c9b6bd7-6443-42da-a807-ff8fd5abbc8a" (UID: "7c9b6bd7-6443-42da-a807-ff8fd5abbc8a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:14:40 crc kubenswrapper[4735]: I1008 13:14:40.168902 4735 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7c9b6bd7-6443-42da-a807-ff8fd5abbc8a-dispersionconf\") on node \"crc\" DevicePath \"\"" Oct 08 13:14:40 crc kubenswrapper[4735]: I1008 13:14:40.169159 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c9b6bd7-6443-42da-a807-ff8fd5abbc8a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 13:14:40 crc kubenswrapper[4735]: I1008 13:14:40.169233 4735 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7c9b6bd7-6443-42da-a807-ff8fd5abbc8a-swiftconf\") on node \"crc\" DevicePath \"\"" Oct 08 13:14:40 crc kubenswrapper[4735]: I1008 13:14:40.169299 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v77wt\" (UniqueName: \"kubernetes.io/projected/7c9b6bd7-6443-42da-a807-ff8fd5abbc8a-kube-api-access-v77wt\") on node \"crc\" DevicePath \"\"" Oct 08 13:14:40 crc kubenswrapper[4735]: I1008 13:14:40.169363 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7c9b6bd7-6443-42da-a807-ff8fd5abbc8a-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 13:14:40 crc kubenswrapper[4735]: I1008 13:14:40.169426 4735 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7c9b6bd7-6443-42da-a807-ff8fd5abbc8a-ring-data-devices\") on node \"crc\" DevicePath \"\"" Oct 08 13:14:40 crc kubenswrapper[4735]: I1008 13:14:40.169507 4735 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7c9b6bd7-6443-42da-a807-ff8fd5abbc8a-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 08 13:14:40 crc kubenswrapper[4735]: I1008 13:14:40.702190 4735 generic.go:334] 
"Generic (PLEG): container finished" podID="644f0099-ec9c-42af-bbe8-54fe5b603bb3" containerID="08e5f43c73b04b459407c1ed506e3e085061ed6f53fe97fad46fc15c5a943ee0" exitCode=0 Oct 08 13:14:40 crc kubenswrapper[4735]: I1008 13:14:40.702347 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"644f0099-ec9c-42af-bbe8-54fe5b603bb3","Type":"ContainerDied","Data":"08e5f43c73b04b459407c1ed506e3e085061ed6f53fe97fad46fc15c5a943ee0"} Oct 08 13:14:40 crc kubenswrapper[4735]: I1008 13:14:40.704586 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-zfmvf" event={"ID":"7c9b6bd7-6443-42da-a807-ff8fd5abbc8a","Type":"ContainerDied","Data":"4545f1140b84753631e47decbbba47446325cb67f8094c81dae7089ff4207e74"} Oct 08 13:14:40 crc kubenswrapper[4735]: I1008 13:14:40.704630 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4545f1140b84753631e47decbbba47446325cb67f8094c81dae7089ff4207e74" Oct 08 13:14:40 crc kubenswrapper[4735]: I1008 13:14:40.704700 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-zfmvf" Oct 08 13:14:40 crc kubenswrapper[4735]: I1008 13:14:40.778419 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/dc4e9066-e538-4d60-9c5a-3beb689d67a4-etc-swift\") pod \"swift-storage-0\" (UID: \"dc4e9066-e538-4d60-9c5a-3beb689d67a4\") " pod="openstack/swift-storage-0" Oct 08 13:14:40 crc kubenswrapper[4735]: I1008 13:14:40.787815 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/dc4e9066-e538-4d60-9c5a-3beb689d67a4-etc-swift\") pod \"swift-storage-0\" (UID: \"dc4e9066-e538-4d60-9c5a-3beb689d67a4\") " pod="openstack/swift-storage-0" Oct 08 13:14:41 crc kubenswrapper[4735]: I1008 13:14:41.076665 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 08 13:14:41 crc kubenswrapper[4735]: I1008 13:14:41.455006 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-4fa4-account-create-k5c9z"] Oct 08 13:14:41 crc kubenswrapper[4735]: E1008 13:14:41.455685 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6056bb74-3465-4e20-8b68-877132ab11fb" containerName="mariadb-database-create" Oct 08 13:14:41 crc kubenswrapper[4735]: I1008 13:14:41.455707 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="6056bb74-3465-4e20-8b68-877132ab11fb" containerName="mariadb-database-create" Oct 08 13:14:41 crc kubenswrapper[4735]: E1008 13:14:41.455717 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c371503-0a4b-4fdb-90c8-49bd2bef645e" containerName="mariadb-database-create" Oct 08 13:14:41 crc kubenswrapper[4735]: I1008 13:14:41.455725 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c371503-0a4b-4fdb-90c8-49bd2bef645e" containerName="mariadb-database-create" Oct 08 13:14:41 crc kubenswrapper[4735]: E1008 13:14:41.455739 4735 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c9b6bd7-6443-42da-a807-ff8fd5abbc8a" containerName="swift-ring-rebalance" Oct 08 13:14:41 crc kubenswrapper[4735]: I1008 13:14:41.455746 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c9b6bd7-6443-42da-a807-ff8fd5abbc8a" containerName="swift-ring-rebalance" Oct 08 13:14:41 crc kubenswrapper[4735]: E1008 13:14:41.455766 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbf29341-130d-4282-b4a2-31e745b9a6c1" containerName="dnsmasq-dns" Oct 08 13:14:41 crc kubenswrapper[4735]: I1008 13:14:41.455772 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbf29341-130d-4282-b4a2-31e745b9a6c1" containerName="dnsmasq-dns" Oct 08 13:14:41 crc kubenswrapper[4735]: E1008 13:14:41.455786 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbf29341-130d-4282-b4a2-31e745b9a6c1" containerName="init" Oct 08 13:14:41 crc kubenswrapper[4735]: I1008 13:14:41.455792 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbf29341-130d-4282-b4a2-31e745b9a6c1" containerName="init" Oct 08 13:14:41 crc kubenswrapper[4735]: E1008 13:14:41.455804 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3a15a0b-b0a1-4781-be73-477726f4007c" containerName="mariadb-database-create" Oct 08 13:14:41 crc kubenswrapper[4735]: I1008 13:14:41.455811 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3a15a0b-b0a1-4781-be73-477726f4007c" containerName="mariadb-database-create" Oct 08 13:14:41 crc kubenswrapper[4735]: I1008 13:14:41.456001 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="6056bb74-3465-4e20-8b68-877132ab11fb" containerName="mariadb-database-create" Oct 08 13:14:41 crc kubenswrapper[4735]: I1008 13:14:41.456020 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbf29341-130d-4282-b4a2-31e745b9a6c1" containerName="dnsmasq-dns" Oct 08 13:14:41 crc kubenswrapper[4735]: I1008 13:14:41.456034 4735 
memory_manager.go:354] "RemoveStaleState removing state" podUID="2c371503-0a4b-4fdb-90c8-49bd2bef645e" containerName="mariadb-database-create" Oct 08 13:14:41 crc kubenswrapper[4735]: I1008 13:14:41.456043 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c9b6bd7-6443-42da-a807-ff8fd5abbc8a" containerName="swift-ring-rebalance" Oct 08 13:14:41 crc kubenswrapper[4735]: I1008 13:14:41.456053 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3a15a0b-b0a1-4781-be73-477726f4007c" containerName="mariadb-database-create" Oct 08 13:14:41 crc kubenswrapper[4735]: I1008 13:14:41.456677 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-4fa4-account-create-k5c9z" Oct 08 13:14:41 crc kubenswrapper[4735]: I1008 13:14:41.458453 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Oct 08 13:14:41 crc kubenswrapper[4735]: I1008 13:14:41.466442 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-4fa4-account-create-k5c9z"] Oct 08 13:14:41 crc kubenswrapper[4735]: I1008 13:14:41.490432 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppvw8\" (UniqueName: \"kubernetes.io/projected/579bb7b5-6f36-4ea3-a772-eb749195cf8b-kube-api-access-ppvw8\") pod \"keystone-4fa4-account-create-k5c9z\" (UID: \"579bb7b5-6f36-4ea3-a772-eb749195cf8b\") " pod="openstack/keystone-4fa4-account-create-k5c9z" Oct 08 13:14:41 crc kubenswrapper[4735]: I1008 13:14:41.592296 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppvw8\" (UniqueName: \"kubernetes.io/projected/579bb7b5-6f36-4ea3-a772-eb749195cf8b-kube-api-access-ppvw8\") pod \"keystone-4fa4-account-create-k5c9z\" (UID: \"579bb7b5-6f36-4ea3-a772-eb749195cf8b\") " pod="openstack/keystone-4fa4-account-create-k5c9z" Oct 08 13:14:41 crc kubenswrapper[4735]: I1008 13:14:41.595040 
4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 08 13:14:41 crc kubenswrapper[4735]: I1008 13:14:41.623652 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppvw8\" (UniqueName: \"kubernetes.io/projected/579bb7b5-6f36-4ea3-a772-eb749195cf8b-kube-api-access-ppvw8\") pod \"keystone-4fa4-account-create-k5c9z\" (UID: \"579bb7b5-6f36-4ea3-a772-eb749195cf8b\") " pod="openstack/keystone-4fa4-account-create-k5c9z" Oct 08 13:14:41 crc kubenswrapper[4735]: I1008 13:14:41.713758 4735 generic.go:334] "Generic (PLEG): container finished" podID="01ad2819-cfc3-4416-8a27-ecb65c2fa366" containerID="8b5489cc563b906768d49040d0dc9d8ee683e0a1a8c2ad6e11328dcf2b18d053" exitCode=0 Oct 08 13:14:41 crc kubenswrapper[4735]: I1008 13:14:41.713850 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"01ad2819-cfc3-4416-8a27-ecb65c2fa366","Type":"ContainerDied","Data":"8b5489cc563b906768d49040d0dc9d8ee683e0a1a8c2ad6e11328dcf2b18d053"} Oct 08 13:14:41 crc kubenswrapper[4735]: I1008 13:14:41.716768 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"644f0099-ec9c-42af-bbe8-54fe5b603bb3","Type":"ContainerStarted","Data":"898693adc5b91bf624ecec806c4b7f6fbc4f40eb308ffd7f57a26fe3a69b9356"} Oct 08 13:14:41 crc kubenswrapper[4735]: I1008 13:14:41.716920 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 08 13:14:41 crc kubenswrapper[4735]: I1008 13:14:41.717935 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dc4e9066-e538-4d60-9c5a-3beb689d67a4","Type":"ContainerStarted","Data":"79c4d8dddaa9cd10fa75a654df1c84c66f8a311d7a0d6eb6b62b00c0d47c8031"} Oct 08 13:14:41 crc kubenswrapper[4735]: I1008 13:14:41.754648 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-364a-account-create-wfnvv"] 
Oct 08 13:14:41 crc kubenswrapper[4735]: I1008 13:14:41.755894 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-364a-account-create-wfnvv" Oct 08 13:14:41 crc kubenswrapper[4735]: I1008 13:14:41.757554 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Oct 08 13:14:41 crc kubenswrapper[4735]: I1008 13:14:41.773119 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-4fa4-account-create-k5c9z" Oct 08 13:14:41 crc kubenswrapper[4735]: I1008 13:14:41.776412 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-364a-account-create-wfnvv"] Oct 08 13:14:41 crc kubenswrapper[4735]: I1008 13:14:41.777914 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.197313752 podStartE2EDuration="55.777903581s" podCreationTimestamp="2025-10-08 13:13:46 +0000 UTC" firstStartedPulling="2025-10-08 13:13:48.687543877 +0000 UTC m=+1078.495848421" lastFinishedPulling="2025-10-08 13:14:07.268133716 +0000 UTC m=+1097.076438250" observedRunningTime="2025-10-08 13:14:41.763044858 +0000 UTC m=+1131.571349403" watchObservedRunningTime="2025-10-08 13:14:41.777903581 +0000 UTC m=+1131.586208125" Oct 08 13:14:41 crc kubenswrapper[4735]: I1008 13:14:41.795058 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rc9s\" (UniqueName: \"kubernetes.io/projected/5bc2b0c9-48ad-4541-8f5e-71e6929a2aa8-kube-api-access-9rc9s\") pod \"placement-364a-account-create-wfnvv\" (UID: \"5bc2b0c9-48ad-4541-8f5e-71e6929a2aa8\") " pod="openstack/placement-364a-account-create-wfnvv" Oct 08 13:14:41 crc kubenswrapper[4735]: I1008 13:14:41.897569 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rc9s\" (UniqueName: 
\"kubernetes.io/projected/5bc2b0c9-48ad-4541-8f5e-71e6929a2aa8-kube-api-access-9rc9s\") pod \"placement-364a-account-create-wfnvv\" (UID: \"5bc2b0c9-48ad-4541-8f5e-71e6929a2aa8\") " pod="openstack/placement-364a-account-create-wfnvv" Oct 08 13:14:41 crc kubenswrapper[4735]: I1008 13:14:41.915528 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rc9s\" (UniqueName: \"kubernetes.io/projected/5bc2b0c9-48ad-4541-8f5e-71e6929a2aa8-kube-api-access-9rc9s\") pod \"placement-364a-account-create-wfnvv\" (UID: \"5bc2b0c9-48ad-4541-8f5e-71e6929a2aa8\") " pod="openstack/placement-364a-account-create-wfnvv" Oct 08 13:14:41 crc kubenswrapper[4735]: I1008 13:14:41.954102 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-55cc-account-create-cfvt9"] Oct 08 13:14:41 crc kubenswrapper[4735]: I1008 13:14:41.955392 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-55cc-account-create-cfvt9" Oct 08 13:14:41 crc kubenswrapper[4735]: I1008 13:14:41.962927 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Oct 08 13:14:41 crc kubenswrapper[4735]: I1008 13:14:41.967056 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-55cc-account-create-cfvt9"] Oct 08 13:14:42 crc kubenswrapper[4735]: I1008 13:14:42.000514 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnmxz\" (UniqueName: \"kubernetes.io/projected/66bc99ab-4fa6-447b-8dbb-b612ed9e0036-kube-api-access-dnmxz\") pod \"glance-55cc-account-create-cfvt9\" (UID: \"66bc99ab-4fa6-447b-8dbb-b612ed9e0036\") " pod="openstack/glance-55cc-account-create-cfvt9" Oct 08 13:14:42 crc kubenswrapper[4735]: I1008 13:14:42.050989 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-4fa4-account-create-k5c9z"] Oct 08 13:14:42 crc kubenswrapper[4735]: W1008 13:14:42.054026 4735 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod579bb7b5_6f36_4ea3_a772_eb749195cf8b.slice/crio-721cd8ba0652101fc8dcef91fd595dc8cdc1af6c062809a84ec636395cae12f9 WatchSource:0}: Error finding container 721cd8ba0652101fc8dcef91fd595dc8cdc1af6c062809a84ec636395cae12f9: Status 404 returned error can't find the container with id 721cd8ba0652101fc8dcef91fd595dc8cdc1af6c062809a84ec636395cae12f9 Oct 08 13:14:42 crc kubenswrapper[4735]: I1008 13:14:42.075079 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-364a-account-create-wfnvv" Oct 08 13:14:42 crc kubenswrapper[4735]: I1008 13:14:42.101874 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnmxz\" (UniqueName: \"kubernetes.io/projected/66bc99ab-4fa6-447b-8dbb-b612ed9e0036-kube-api-access-dnmxz\") pod \"glance-55cc-account-create-cfvt9\" (UID: \"66bc99ab-4fa6-447b-8dbb-b612ed9e0036\") " pod="openstack/glance-55cc-account-create-cfvt9" Oct 08 13:14:42 crc kubenswrapper[4735]: I1008 13:14:42.119114 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnmxz\" (UniqueName: \"kubernetes.io/projected/66bc99ab-4fa6-447b-8dbb-b612ed9e0036-kube-api-access-dnmxz\") pod \"glance-55cc-account-create-cfvt9\" (UID: \"66bc99ab-4fa6-447b-8dbb-b612ed9e0036\") " pod="openstack/glance-55cc-account-create-cfvt9" Oct 08 13:14:42 crc kubenswrapper[4735]: I1008 13:14:42.336409 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-55cc-account-create-cfvt9" Oct 08 13:14:42 crc kubenswrapper[4735]: I1008 13:14:42.537429 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-364a-account-create-wfnvv"] Oct 08 13:14:42 crc kubenswrapper[4735]: W1008 13:14:42.545662 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5bc2b0c9_48ad_4541_8f5e_71e6929a2aa8.slice/crio-d23145b42551e932b74e2478c722fcbfdca0835820c68f001b587a29de88745c WatchSource:0}: Error finding container d23145b42551e932b74e2478c722fcbfdca0835820c68f001b587a29de88745c: Status 404 returned error can't find the container with id d23145b42551e932b74e2478c722fcbfdca0835820c68f001b587a29de88745c Oct 08 13:14:42 crc kubenswrapper[4735]: I1008 13:14:42.727057 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-364a-account-create-wfnvv" event={"ID":"5bc2b0c9-48ad-4541-8f5e-71e6929a2aa8","Type":"ContainerStarted","Data":"d23145b42551e932b74e2478c722fcbfdca0835820c68f001b587a29de88745c"} Oct 08 13:14:42 crc kubenswrapper[4735]: I1008 13:14:42.729944 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"01ad2819-cfc3-4416-8a27-ecb65c2fa366","Type":"ContainerStarted","Data":"18cff9aa331d4910576ccc409a19da98fba4e35b7118162bd1347fc1d99cf4af"} Oct 08 13:14:42 crc kubenswrapper[4735]: I1008 13:14:42.730112 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 08 13:14:42 crc kubenswrapper[4735]: I1008 13:14:42.731921 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4fa4-account-create-k5c9z" event={"ID":"579bb7b5-6f36-4ea3-a772-eb749195cf8b","Type":"ContainerStarted","Data":"bf84946c5afe4f0e453aa213bd2807af5b2ca49428a3592a40e601015d6e8947"} Oct 08 13:14:42 crc kubenswrapper[4735]: I1008 13:14:42.731967 4735 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/keystone-4fa4-account-create-k5c9z" event={"ID":"579bb7b5-6f36-4ea3-a772-eb749195cf8b","Type":"ContainerStarted","Data":"721cd8ba0652101fc8dcef91fd595dc8cdc1af6c062809a84ec636395cae12f9"} Oct 08 13:14:42 crc kubenswrapper[4735]: I1008 13:14:42.758167 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=42.154183933 podStartE2EDuration="55.75814153s" podCreationTimestamp="2025-10-08 13:13:47 +0000 UTC" firstStartedPulling="2025-10-08 13:13:53.681424874 +0000 UTC m=+1083.489729428" lastFinishedPulling="2025-10-08 13:14:07.285382481 +0000 UTC m=+1097.093687025" observedRunningTime="2025-10-08 13:14:42.753112077 +0000 UTC m=+1132.561416651" watchObservedRunningTime="2025-10-08 13:14:42.75814153 +0000 UTC m=+1132.566446094" Oct 08 13:14:42 crc kubenswrapper[4735]: I1008 13:14:42.791411 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-55cc-account-create-cfvt9"] Oct 08 13:14:42 crc kubenswrapper[4735]: W1008 13:14:42.799993 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66bc99ab_4fa6_447b_8dbb_b612ed9e0036.slice/crio-21144c5c085ee0876fc1e6ecac15287e7317260ce23cd932d19ff72efc2a2885 WatchSource:0}: Error finding container 21144c5c085ee0876fc1e6ecac15287e7317260ce23cd932d19ff72efc2a2885: Status 404 returned error can't find the container with id 21144c5c085ee0876fc1e6ecac15287e7317260ce23cd932d19ff72efc2a2885 Oct 08 13:14:43 crc kubenswrapper[4735]: I1008 13:14:43.742220 4735 generic.go:334] "Generic (PLEG): container finished" podID="579bb7b5-6f36-4ea3-a772-eb749195cf8b" containerID="bf84946c5afe4f0e453aa213bd2807af5b2ca49428a3592a40e601015d6e8947" exitCode=0 Oct 08 13:14:43 crc kubenswrapper[4735]: I1008 13:14:43.742394 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4fa4-account-create-k5c9z" 
event={"ID":"579bb7b5-6f36-4ea3-a772-eb749195cf8b","Type":"ContainerDied","Data":"bf84946c5afe4f0e453aa213bd2807af5b2ca49428a3592a40e601015d6e8947"} Oct 08 13:14:43 crc kubenswrapper[4735]: I1008 13:14:43.744230 4735 generic.go:334] "Generic (PLEG): container finished" podID="66bc99ab-4fa6-447b-8dbb-b612ed9e0036" containerID="6d1206de05e8eb6d4246b908ca40c1044b37559d02183b92feb25fb16c4e9316" exitCode=0 Oct 08 13:14:43 crc kubenswrapper[4735]: I1008 13:14:43.744575 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-55cc-account-create-cfvt9" event={"ID":"66bc99ab-4fa6-447b-8dbb-b612ed9e0036","Type":"ContainerDied","Data":"6d1206de05e8eb6d4246b908ca40c1044b37559d02183b92feb25fb16c4e9316"} Oct 08 13:14:43 crc kubenswrapper[4735]: I1008 13:14:43.744746 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-55cc-account-create-cfvt9" event={"ID":"66bc99ab-4fa6-447b-8dbb-b612ed9e0036","Type":"ContainerStarted","Data":"21144c5c085ee0876fc1e6ecac15287e7317260ce23cd932d19ff72efc2a2885"} Oct 08 13:14:43 crc kubenswrapper[4735]: I1008 13:14:43.749547 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-364a-account-create-wfnvv" event={"ID":"5bc2b0c9-48ad-4541-8f5e-71e6929a2aa8","Type":"ContainerDied","Data":"bbca8dc5ac414db94afe3dc18ae340927ca57130f3e94f5435899ebab773806e"} Oct 08 13:14:43 crc kubenswrapper[4735]: I1008 13:14:43.749185 4735 generic.go:334] "Generic (PLEG): container finished" podID="5bc2b0c9-48ad-4541-8f5e-71e6929a2aa8" containerID="bbca8dc5ac414db94afe3dc18ae340927ca57130f3e94f5435899ebab773806e" exitCode=0 Oct 08 13:14:44 crc kubenswrapper[4735]: I1008 13:14:44.759034 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dc4e9066-e538-4d60-9c5a-3beb689d67a4","Type":"ContainerStarted","Data":"318bd8def163d2bed4ab842bc210c31f81ec6fcdd4d540b67f27fd1cef63ad4c"} Oct 08 13:14:44 crc kubenswrapper[4735]: I1008 13:14:44.759333 4735 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dc4e9066-e538-4d60-9c5a-3beb689d67a4","Type":"ContainerStarted","Data":"d853b3073957b18a6b510a37923cf71788139823a9c69e0f74f142983721b87d"} Oct 08 13:14:44 crc kubenswrapper[4735]: I1008 13:14:44.759346 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dc4e9066-e538-4d60-9c5a-3beb689d67a4","Type":"ContainerStarted","Data":"f07ef2651928239139875a266d70f61a17120a8e70c7d971632cdd042dda58f2"} Oct 08 13:14:44 crc kubenswrapper[4735]: I1008 13:14:44.759354 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dc4e9066-e538-4d60-9c5a-3beb689d67a4","Type":"ContainerStarted","Data":"fee8281668fd0cef347a2cc4b0d529170a8a54372656a674cada64e7758b6de6"} Oct 08 13:14:45 crc kubenswrapper[4735]: I1008 13:14:45.118450 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-364a-account-create-wfnvv" Oct 08 13:14:45 crc kubenswrapper[4735]: I1008 13:14:45.226447 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-4fa4-account-create-k5c9z" Oct 08 13:14:45 crc kubenswrapper[4735]: I1008 13:14:45.235701 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-55cc-account-create-cfvt9" Oct 08 13:14:45 crc kubenswrapper[4735]: I1008 13:14:45.254543 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rc9s\" (UniqueName: \"kubernetes.io/projected/5bc2b0c9-48ad-4541-8f5e-71e6929a2aa8-kube-api-access-9rc9s\") pod \"5bc2b0c9-48ad-4541-8f5e-71e6929a2aa8\" (UID: \"5bc2b0c9-48ad-4541-8f5e-71e6929a2aa8\") " Oct 08 13:14:45 crc kubenswrapper[4735]: I1008 13:14:45.262657 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bc2b0c9-48ad-4541-8f5e-71e6929a2aa8-kube-api-access-9rc9s" (OuterVolumeSpecName: "kube-api-access-9rc9s") pod "5bc2b0c9-48ad-4541-8f5e-71e6929a2aa8" (UID: "5bc2b0c9-48ad-4541-8f5e-71e6929a2aa8"). InnerVolumeSpecName "kube-api-access-9rc9s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:14:45 crc kubenswrapper[4735]: I1008 13:14:45.356001 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnmxz\" (UniqueName: \"kubernetes.io/projected/66bc99ab-4fa6-447b-8dbb-b612ed9e0036-kube-api-access-dnmxz\") pod \"66bc99ab-4fa6-447b-8dbb-b612ed9e0036\" (UID: \"66bc99ab-4fa6-447b-8dbb-b612ed9e0036\") " Oct 08 13:14:45 crc kubenswrapper[4735]: I1008 13:14:45.356108 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppvw8\" (UniqueName: \"kubernetes.io/projected/579bb7b5-6f36-4ea3-a772-eb749195cf8b-kube-api-access-ppvw8\") pod \"579bb7b5-6f36-4ea3-a772-eb749195cf8b\" (UID: \"579bb7b5-6f36-4ea3-a772-eb749195cf8b\") " Oct 08 13:14:45 crc kubenswrapper[4735]: I1008 13:14:45.356635 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rc9s\" (UniqueName: \"kubernetes.io/projected/5bc2b0c9-48ad-4541-8f5e-71e6929a2aa8-kube-api-access-9rc9s\") on node \"crc\" DevicePath \"\"" Oct 08 13:14:45 crc kubenswrapper[4735]: I1008 13:14:45.359501 4735 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66bc99ab-4fa6-447b-8dbb-b612ed9e0036-kube-api-access-dnmxz" (OuterVolumeSpecName: "kube-api-access-dnmxz") pod "66bc99ab-4fa6-447b-8dbb-b612ed9e0036" (UID: "66bc99ab-4fa6-447b-8dbb-b612ed9e0036"). InnerVolumeSpecName "kube-api-access-dnmxz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:14:45 crc kubenswrapper[4735]: I1008 13:14:45.359595 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/579bb7b5-6f36-4ea3-a772-eb749195cf8b-kube-api-access-ppvw8" (OuterVolumeSpecName: "kube-api-access-ppvw8") pod "579bb7b5-6f36-4ea3-a772-eb749195cf8b" (UID: "579bb7b5-6f36-4ea3-a772-eb749195cf8b"). InnerVolumeSpecName "kube-api-access-ppvw8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:14:45 crc kubenswrapper[4735]: I1008 13:14:45.458172 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnmxz\" (UniqueName: \"kubernetes.io/projected/66bc99ab-4fa6-447b-8dbb-b612ed9e0036-kube-api-access-dnmxz\") on node \"crc\" DevicePath \"\"" Oct 08 13:14:45 crc kubenswrapper[4735]: I1008 13:14:45.458217 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppvw8\" (UniqueName: \"kubernetes.io/projected/579bb7b5-6f36-4ea3-a772-eb749195cf8b-kube-api-access-ppvw8\") on node \"crc\" DevicePath \"\"" Oct 08 13:14:45 crc kubenswrapper[4735]: I1008 13:14:45.768751 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-364a-account-create-wfnvv" event={"ID":"5bc2b0c9-48ad-4541-8f5e-71e6929a2aa8","Type":"ContainerDied","Data":"d23145b42551e932b74e2478c722fcbfdca0835820c68f001b587a29de88745c"} Oct 08 13:14:45 crc kubenswrapper[4735]: I1008 13:14:45.768801 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d23145b42551e932b74e2478c722fcbfdca0835820c68f001b587a29de88745c" Oct 08 13:14:45 crc kubenswrapper[4735]: 
I1008 13:14:45.768772 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-364a-account-create-wfnvv"
Oct 08 13:14:45 crc kubenswrapper[4735]: I1008 13:14:45.770257 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-4fa4-account-create-k5c9z"
Oct 08 13:14:45 crc kubenswrapper[4735]: I1008 13:14:45.770252 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4fa4-account-create-k5c9z" event={"ID":"579bb7b5-6f36-4ea3-a772-eb749195cf8b","Type":"ContainerDied","Data":"721cd8ba0652101fc8dcef91fd595dc8cdc1af6c062809a84ec636395cae12f9"}
Oct 08 13:14:45 crc kubenswrapper[4735]: I1008 13:14:45.770336 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="721cd8ba0652101fc8dcef91fd595dc8cdc1af6c062809a84ec636395cae12f9"
Oct 08 13:14:45 crc kubenswrapper[4735]: I1008 13:14:45.771659 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-55cc-account-create-cfvt9" event={"ID":"66bc99ab-4fa6-447b-8dbb-b612ed9e0036","Type":"ContainerDied","Data":"21144c5c085ee0876fc1e6ecac15287e7317260ce23cd932d19ff72efc2a2885"}
Oct 08 13:14:45 crc kubenswrapper[4735]: I1008 13:14:45.771677 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21144c5c085ee0876fc1e6ecac15287e7317260ce23cd932d19ff72efc2a2885"
Oct 08 13:14:45 crc kubenswrapper[4735]: I1008 13:14:45.771697 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-55cc-account-create-cfvt9"
Oct 08 13:14:46 crc kubenswrapper[4735]: I1008 13:14:46.782116 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dc4e9066-e538-4d60-9c5a-3beb689d67a4","Type":"ContainerStarted","Data":"f439ff20d2384d40dd148b0bc28e8bfd91c3d44ddcc97d2ceffaba6872bf825c"}
Oct 08 13:14:46 crc kubenswrapper[4735]: I1008 13:14:46.782620 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dc4e9066-e538-4d60-9c5a-3beb689d67a4","Type":"ContainerStarted","Data":"1e668315fdf2d5cc1e4b746212c9eadc171a4f54451b10f098411736e896e124"}
Oct 08 13:14:46 crc kubenswrapper[4735]: I1008 13:14:46.782630 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dc4e9066-e538-4d60-9c5a-3beb689d67a4","Type":"ContainerStarted","Data":"0b38bdb53ab1820c96e395176b2856e8ca2eb0cf0636872f34d1914a860cf3a9"}
Oct 08 13:14:46 crc kubenswrapper[4735]: I1008 13:14:46.782639 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dc4e9066-e538-4d60-9c5a-3beb689d67a4","Type":"ContainerStarted","Data":"33f430e8921d6f23db0786f1f9627335fe796e31b2a0bd9605c6ebfc3decb0c4"}
Oct 08 13:14:47 crc kubenswrapper[4735]: I1008 13:14:47.087881 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-c8pnx"]
Oct 08 13:14:47 crc kubenswrapper[4735]: E1008 13:14:47.088269 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bc2b0c9-48ad-4541-8f5e-71e6929a2aa8" containerName="mariadb-account-create"
Oct 08 13:14:47 crc kubenswrapper[4735]: I1008 13:14:47.088287 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bc2b0c9-48ad-4541-8f5e-71e6929a2aa8" containerName="mariadb-account-create"
Oct 08 13:14:47 crc kubenswrapper[4735]: E1008 13:14:47.088297 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="579bb7b5-6f36-4ea3-a772-eb749195cf8b" containerName="mariadb-account-create"
Oct 08 13:14:47 crc kubenswrapper[4735]: I1008 13:14:47.088303 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="579bb7b5-6f36-4ea3-a772-eb749195cf8b" containerName="mariadb-account-create"
Oct 08 13:14:47 crc kubenswrapper[4735]: E1008 13:14:47.088328 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66bc99ab-4fa6-447b-8dbb-b612ed9e0036" containerName="mariadb-account-create"
Oct 08 13:14:47 crc kubenswrapper[4735]: I1008 13:14:47.088337 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="66bc99ab-4fa6-447b-8dbb-b612ed9e0036" containerName="mariadb-account-create"
Oct 08 13:14:47 crc kubenswrapper[4735]: I1008 13:14:47.088592 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bc2b0c9-48ad-4541-8f5e-71e6929a2aa8" containerName="mariadb-account-create"
Oct 08 13:14:47 crc kubenswrapper[4735]: I1008 13:14:47.088635 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="579bb7b5-6f36-4ea3-a772-eb749195cf8b" containerName="mariadb-account-create"
Oct 08 13:14:47 crc kubenswrapper[4735]: I1008 13:14:47.088654 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="66bc99ab-4fa6-447b-8dbb-b612ed9e0036" containerName="mariadb-account-create"
Oct 08 13:14:47 crc kubenswrapper[4735]: I1008 13:14:47.089293 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-c8pnx"
Oct 08 13:14:47 crc kubenswrapper[4735]: I1008 13:14:47.091367 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data"
Oct 08 13:14:47 crc kubenswrapper[4735]: I1008 13:14:47.091488 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-76gcn"
Oct 08 13:14:47 crc kubenswrapper[4735]: I1008 13:14:47.095141 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-c8pnx"]
Oct 08 13:14:47 crc kubenswrapper[4735]: I1008 13:14:47.283609 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbgdb\" (UniqueName: \"kubernetes.io/projected/8f8ee345-a4ac-49aa-9eed-32a7cffb2790-kube-api-access-rbgdb\") pod \"glance-db-sync-c8pnx\" (UID: \"8f8ee345-a4ac-49aa-9eed-32a7cffb2790\") " pod="openstack/glance-db-sync-c8pnx"
Oct 08 13:14:47 crc kubenswrapper[4735]: I1008 13:14:47.283741 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f8ee345-a4ac-49aa-9eed-32a7cffb2790-config-data\") pod \"glance-db-sync-c8pnx\" (UID: \"8f8ee345-a4ac-49aa-9eed-32a7cffb2790\") " pod="openstack/glance-db-sync-c8pnx"
Oct 08 13:14:47 crc kubenswrapper[4735]: I1008 13:14:47.283942 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8f8ee345-a4ac-49aa-9eed-32a7cffb2790-db-sync-config-data\") pod \"glance-db-sync-c8pnx\" (UID: \"8f8ee345-a4ac-49aa-9eed-32a7cffb2790\") " pod="openstack/glance-db-sync-c8pnx"
Oct 08 13:14:47 crc kubenswrapper[4735]: I1008 13:14:47.284108 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8ee345-a4ac-49aa-9eed-32a7cffb2790-combined-ca-bundle\") pod \"glance-db-sync-c8pnx\" (UID: \"8f8ee345-a4ac-49aa-9eed-32a7cffb2790\") " pod="openstack/glance-db-sync-c8pnx"
Oct 08 13:14:47 crc kubenswrapper[4735]: I1008 13:14:47.316446 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-pbkkt" podUID="73b19a4f-4794-49fc-bfd9-727618f11e7c" containerName="ovn-controller" probeResult="failure" output=<
Oct 08 13:14:47 crc kubenswrapper[4735]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Oct 08 13:14:47 crc kubenswrapper[4735]: >
Oct 08 13:14:47 crc kubenswrapper[4735]: I1008 13:14:47.387409 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8f8ee345-a4ac-49aa-9eed-32a7cffb2790-db-sync-config-data\") pod \"glance-db-sync-c8pnx\" (UID: \"8f8ee345-a4ac-49aa-9eed-32a7cffb2790\") " pod="openstack/glance-db-sync-c8pnx"
Oct 08 13:14:47 crc kubenswrapper[4735]: I1008 13:14:47.387536 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8ee345-a4ac-49aa-9eed-32a7cffb2790-combined-ca-bundle\") pod \"glance-db-sync-c8pnx\" (UID: \"8f8ee345-a4ac-49aa-9eed-32a7cffb2790\") " pod="openstack/glance-db-sync-c8pnx"
Oct 08 13:14:47 crc kubenswrapper[4735]: I1008 13:14:47.387603 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbgdb\" (UniqueName: \"kubernetes.io/projected/8f8ee345-a4ac-49aa-9eed-32a7cffb2790-kube-api-access-rbgdb\") pod \"glance-db-sync-c8pnx\" (UID: \"8f8ee345-a4ac-49aa-9eed-32a7cffb2790\") " pod="openstack/glance-db-sync-c8pnx"
Oct 08 13:14:47 crc kubenswrapper[4735]: I1008 13:14:47.387638 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f8ee345-a4ac-49aa-9eed-32a7cffb2790-config-data\") pod \"glance-db-sync-c8pnx\" (UID: \"8f8ee345-a4ac-49aa-9eed-32a7cffb2790\") " pod="openstack/glance-db-sync-c8pnx"
Oct 08 13:14:47 crc kubenswrapper[4735]: I1008 13:14:47.393219 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8f8ee345-a4ac-49aa-9eed-32a7cffb2790-db-sync-config-data\") pod \"glance-db-sync-c8pnx\" (UID: \"8f8ee345-a4ac-49aa-9eed-32a7cffb2790\") " pod="openstack/glance-db-sync-c8pnx"
Oct 08 13:14:47 crc kubenswrapper[4735]: I1008 13:14:47.393534 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8ee345-a4ac-49aa-9eed-32a7cffb2790-combined-ca-bundle\") pod \"glance-db-sync-c8pnx\" (UID: \"8f8ee345-a4ac-49aa-9eed-32a7cffb2790\") " pod="openstack/glance-db-sync-c8pnx"
Oct 08 13:14:47 crc kubenswrapper[4735]: I1008 13:14:47.394362 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f8ee345-a4ac-49aa-9eed-32a7cffb2790-config-data\") pod \"glance-db-sync-c8pnx\" (UID: \"8f8ee345-a4ac-49aa-9eed-32a7cffb2790\") " pod="openstack/glance-db-sync-c8pnx"
Oct 08 13:14:47 crc kubenswrapper[4735]: I1008 13:14:47.402103 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-9frwj"
Oct 08 13:14:47 crc kubenswrapper[4735]: I1008 13:14:47.408523 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbgdb\" (UniqueName: \"kubernetes.io/projected/8f8ee345-a4ac-49aa-9eed-32a7cffb2790-kube-api-access-rbgdb\") pod \"glance-db-sync-c8pnx\" (UID: \"8f8ee345-a4ac-49aa-9eed-32a7cffb2790\") " pod="openstack/glance-db-sync-c8pnx"
Oct 08 13:14:47 crc kubenswrapper[4735]: I1008 13:14:47.411834 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-9frwj"
Oct 08 13:14:47 crc kubenswrapper[4735]: I1008 13:14:47.638698 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-pbkkt-config-v6w7j"]
Oct 08 13:14:47 crc kubenswrapper[4735]: I1008 13:14:47.639967 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-pbkkt-config-v6w7j"
Oct 08 13:14:47 crc kubenswrapper[4735]: I1008 13:14:47.644924 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Oct 08 13:14:47 crc kubenswrapper[4735]: I1008 13:14:47.652363 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-pbkkt-config-v6w7j"]
Oct 08 13:14:47 crc kubenswrapper[4735]: I1008 13:14:47.705988 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-c8pnx"
Oct 08 13:14:47 crc kubenswrapper[4735]: I1008 13:14:47.794272 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d33bed52-756e-49da-8671-8cadf9dc3bcf-scripts\") pod \"ovn-controller-pbkkt-config-v6w7j\" (UID: \"d33bed52-756e-49da-8671-8cadf9dc3bcf\") " pod="openstack/ovn-controller-pbkkt-config-v6w7j"
Oct 08 13:14:47 crc kubenswrapper[4735]: I1008 13:14:47.794427 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d33bed52-756e-49da-8671-8cadf9dc3bcf-var-log-ovn\") pod \"ovn-controller-pbkkt-config-v6w7j\" (UID: \"d33bed52-756e-49da-8671-8cadf9dc3bcf\") " pod="openstack/ovn-controller-pbkkt-config-v6w7j"
Oct 08 13:14:47 crc kubenswrapper[4735]: I1008 13:14:47.794462 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d33bed52-756e-49da-8671-8cadf9dc3bcf-additional-scripts\") pod \"ovn-controller-pbkkt-config-v6w7j\" (UID: \"d33bed52-756e-49da-8671-8cadf9dc3bcf\") " pod="openstack/ovn-controller-pbkkt-config-v6w7j"
Oct 08 13:14:47 crc kubenswrapper[4735]: I1008 13:14:47.794564 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d33bed52-756e-49da-8671-8cadf9dc3bcf-var-run-ovn\") pod \"ovn-controller-pbkkt-config-v6w7j\" (UID: \"d33bed52-756e-49da-8671-8cadf9dc3bcf\") " pod="openstack/ovn-controller-pbkkt-config-v6w7j"
Oct 08 13:14:47 crc kubenswrapper[4735]: I1008 13:14:47.794595 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d33bed52-756e-49da-8671-8cadf9dc3bcf-var-run\") pod \"ovn-controller-pbkkt-config-v6w7j\" (UID: \"d33bed52-756e-49da-8671-8cadf9dc3bcf\") " pod="openstack/ovn-controller-pbkkt-config-v6w7j"
Oct 08 13:14:47 crc kubenswrapper[4735]: I1008 13:14:47.794672 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqjtk\" (UniqueName: \"kubernetes.io/projected/d33bed52-756e-49da-8671-8cadf9dc3bcf-kube-api-access-tqjtk\") pod \"ovn-controller-pbkkt-config-v6w7j\" (UID: \"d33bed52-756e-49da-8671-8cadf9dc3bcf\") " pod="openstack/ovn-controller-pbkkt-config-v6w7j"
Oct 08 13:14:47 crc kubenswrapper[4735]: I1008 13:14:47.895796 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d33bed52-756e-49da-8671-8cadf9dc3bcf-var-run-ovn\") pod \"ovn-controller-pbkkt-config-v6w7j\" (UID: \"d33bed52-756e-49da-8671-8cadf9dc3bcf\") " pod="openstack/ovn-controller-pbkkt-config-v6w7j"
Oct 08 13:14:47 crc kubenswrapper[4735]: I1008 13:14:47.895849 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d33bed52-756e-49da-8671-8cadf9dc3bcf-var-run\") pod \"ovn-controller-pbkkt-config-v6w7j\" (UID: \"d33bed52-756e-49da-8671-8cadf9dc3bcf\") " pod="openstack/ovn-controller-pbkkt-config-v6w7j"
Oct 08 13:14:47 crc kubenswrapper[4735]: I1008 13:14:47.895918 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqjtk\" (UniqueName: \"kubernetes.io/projected/d33bed52-756e-49da-8671-8cadf9dc3bcf-kube-api-access-tqjtk\") pod \"ovn-controller-pbkkt-config-v6w7j\" (UID: \"d33bed52-756e-49da-8671-8cadf9dc3bcf\") " pod="openstack/ovn-controller-pbkkt-config-v6w7j"
Oct 08 13:14:47 crc kubenswrapper[4735]: I1008 13:14:47.895946 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d33bed52-756e-49da-8671-8cadf9dc3bcf-scripts\") pod \"ovn-controller-pbkkt-config-v6w7j\" (UID: \"d33bed52-756e-49da-8671-8cadf9dc3bcf\") " pod="openstack/ovn-controller-pbkkt-config-v6w7j"
Oct 08 13:14:47 crc kubenswrapper[4735]: I1008 13:14:47.895972 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d33bed52-756e-49da-8671-8cadf9dc3bcf-var-log-ovn\") pod \"ovn-controller-pbkkt-config-v6w7j\" (UID: \"d33bed52-756e-49da-8671-8cadf9dc3bcf\") " pod="openstack/ovn-controller-pbkkt-config-v6w7j"
Oct 08 13:14:47 crc kubenswrapper[4735]: I1008 13:14:47.895993 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d33bed52-756e-49da-8671-8cadf9dc3bcf-additional-scripts\") pod \"ovn-controller-pbkkt-config-v6w7j\" (UID: \"d33bed52-756e-49da-8671-8cadf9dc3bcf\") " pod="openstack/ovn-controller-pbkkt-config-v6w7j"
Oct 08 13:14:47 crc kubenswrapper[4735]: I1008 13:14:47.896665 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d33bed52-756e-49da-8671-8cadf9dc3bcf-var-run\") pod \"ovn-controller-pbkkt-config-v6w7j\" (UID: \"d33bed52-756e-49da-8671-8cadf9dc3bcf\") " pod="openstack/ovn-controller-pbkkt-config-v6w7j"
Oct 08 13:14:47 crc kubenswrapper[4735]: I1008 13:14:47.896669 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d33bed52-756e-49da-8671-8cadf9dc3bcf-var-run-ovn\") pod \"ovn-controller-pbkkt-config-v6w7j\" (UID: \"d33bed52-756e-49da-8671-8cadf9dc3bcf\") " pod="openstack/ovn-controller-pbkkt-config-v6w7j"
Oct 08 13:14:47 crc kubenswrapper[4735]: I1008 13:14:47.896669 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d33bed52-756e-49da-8671-8cadf9dc3bcf-var-log-ovn\") pod \"ovn-controller-pbkkt-config-v6w7j\" (UID: \"d33bed52-756e-49da-8671-8cadf9dc3bcf\") " pod="openstack/ovn-controller-pbkkt-config-v6w7j"
Oct 08 13:14:47 crc kubenswrapper[4735]: I1008 13:14:47.897140 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d33bed52-756e-49da-8671-8cadf9dc3bcf-additional-scripts\") pod \"ovn-controller-pbkkt-config-v6w7j\" (UID: \"d33bed52-756e-49da-8671-8cadf9dc3bcf\") " pod="openstack/ovn-controller-pbkkt-config-v6w7j"
Oct 08 13:14:47 crc kubenswrapper[4735]: I1008 13:14:47.898941 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d33bed52-756e-49da-8671-8cadf9dc3bcf-scripts\") pod \"ovn-controller-pbkkt-config-v6w7j\" (UID: \"d33bed52-756e-49da-8671-8cadf9dc3bcf\") " pod="openstack/ovn-controller-pbkkt-config-v6w7j"
Oct 08 13:14:47 crc kubenswrapper[4735]: I1008 13:14:47.913546 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqjtk\" (UniqueName: \"kubernetes.io/projected/d33bed52-756e-49da-8671-8cadf9dc3bcf-kube-api-access-tqjtk\") pod \"ovn-controller-pbkkt-config-v6w7j\" (UID: \"d33bed52-756e-49da-8671-8cadf9dc3bcf\") " pod="openstack/ovn-controller-pbkkt-config-v6w7j"
Oct 08 13:14:47 crc kubenswrapper[4735]: I1008 13:14:47.971143 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-pbkkt-config-v6w7j"
Oct 08 13:14:48 crc kubenswrapper[4735]: I1008 13:14:48.232870 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-c8pnx"]
Oct 08 13:14:48 crc kubenswrapper[4735]: W1008 13:14:48.236917 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f8ee345_a4ac_49aa_9eed_32a7cffb2790.slice/crio-9dd15936a492e1181725f5900c2d068e1bcd017b408b4ece534ac96fea824457 WatchSource:0}: Error finding container 9dd15936a492e1181725f5900c2d068e1bcd017b408b4ece534ac96fea824457: Status 404 returned error can't find the container with id 9dd15936a492e1181725f5900c2d068e1bcd017b408b4ece534ac96fea824457
Oct 08 13:14:48 crc kubenswrapper[4735]: I1008 13:14:48.387591 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-pbkkt-config-v6w7j"]
Oct 08 13:14:48 crc kubenswrapper[4735]: I1008 13:14:48.799997 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-pbkkt-config-v6w7j" event={"ID":"d33bed52-756e-49da-8671-8cadf9dc3bcf","Type":"ContainerStarted","Data":"8adc48f174ec93d4876f8b45b5444d75106515b1187ae726ffff74b1f0ee0e9c"}
Oct 08 13:14:48 crc kubenswrapper[4735]: I1008 13:14:48.801058 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-c8pnx" event={"ID":"8f8ee345-a4ac-49aa-9eed-32a7cffb2790","Type":"ContainerStarted","Data":"9dd15936a492e1181725f5900c2d068e1bcd017b408b4ece534ac96fea824457"}
Oct 08 13:14:49 crc kubenswrapper[4735]: I1008 13:14:49.810307 4735 generic.go:334] "Generic (PLEG): container finished" podID="d33bed52-756e-49da-8671-8cadf9dc3bcf" containerID="18ecea8e5b37f608600ac08acead0be77a04c7cd2fcb23d381bcbe859074add1" exitCode=0
Oct 08 13:14:49 crc kubenswrapper[4735]: I1008 13:14:49.810356 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-pbkkt-config-v6w7j" event={"ID":"d33bed52-756e-49da-8671-8cadf9dc3bcf","Type":"ContainerDied","Data":"18ecea8e5b37f608600ac08acead0be77a04c7cd2fcb23d381bcbe859074add1"}
Oct 08 13:14:49 crc kubenswrapper[4735]: I1008 13:14:49.833495 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dc4e9066-e538-4d60-9c5a-3beb689d67a4","Type":"ContainerStarted","Data":"2f40ca43ae662cc47bd56ee824d503ccec2b5ae9bf9c20513e8ce20464d2dd97"}
Oct 08 13:14:49 crc kubenswrapper[4735]: I1008 13:14:49.833543 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dc4e9066-e538-4d60-9c5a-3beb689d67a4","Type":"ContainerStarted","Data":"309e0783fbd98cb5e4632f3d4d0e2f6a9b58f92ad92a2d4004eaa1a6d7fae30b"}
Oct 08 13:14:50 crc kubenswrapper[4735]: I1008 13:14:50.853995 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dc4e9066-e538-4d60-9c5a-3beb689d67a4","Type":"ContainerStarted","Data":"a8268e5dce5d4e32fa0bca1c883910824568df9f3952c6f7d9f188760642a5cd"}
Oct 08 13:14:50 crc kubenswrapper[4735]: I1008 13:14:50.854603 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dc4e9066-e538-4d60-9c5a-3beb689d67a4","Type":"ContainerStarted","Data":"dff90633bc3ad4ba06aede8cdacdcc6e496b31b0aade72c08da50cac8d87fc0b"}
Oct 08 13:14:50 crc kubenswrapper[4735]: I1008 13:14:50.854624 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dc4e9066-e538-4d60-9c5a-3beb689d67a4","Type":"ContainerStarted","Data":"4ee9082bf58fd24e367fd1da09c29e111946f4c4faa922b9579b326f978119c9"}
Oct 08 13:14:50 crc kubenswrapper[4735]: I1008 13:14:50.854636 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dc4e9066-e538-4d60-9c5a-3beb689d67a4","Type":"ContainerStarted","Data":"d5db6b982ae606222a36f5fcc188c4abda346b0f7655cfdfa4f6f7337c2397f4"}
Oct 08 13:14:50 crc kubenswrapper[4735]: I1008 13:14:50.854648 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dc4e9066-e538-4d60-9c5a-3beb689d67a4","Type":"ContainerStarted","Data":"237f4d6c051f7223b102aeaedaa673ab069d0f15f6817e4945c35de0b812fbff"}
Oct 08 13:14:50 crc kubenswrapper[4735]: I1008 13:14:50.913347 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=20.109799321 podStartE2EDuration="27.913318184s" podCreationTimestamp="2025-10-08 13:14:23 +0000 UTC" firstStartedPulling="2025-10-08 13:14:41.617689782 +0000 UTC m=+1131.425994326" lastFinishedPulling="2025-10-08 13:14:49.421208645 +0000 UTC m=+1139.229513189" observedRunningTime="2025-10-08 13:14:50.910741146 +0000 UTC m=+1140.719045690" watchObservedRunningTime="2025-10-08 13:14:50.913318184 +0000 UTC m=+1140.721622728"
Oct 08 13:14:51 crc kubenswrapper[4735]: I1008 13:14:51.170936 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-564965cbfc-8hd2q"]
Oct 08 13:14:51 crc kubenswrapper[4735]: I1008 13:14:51.172787 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-564965cbfc-8hd2q"
Oct 08 13:14:51 crc kubenswrapper[4735]: I1008 13:14:51.174590 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0"
Oct 08 13:14:51 crc kubenswrapper[4735]: I1008 13:14:51.187040 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-564965cbfc-8hd2q"]
Oct 08 13:14:51 crc kubenswrapper[4735]: I1008 13:14:51.220217 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-pbkkt-config-v6w7j"
Oct 08 13:14:51 crc kubenswrapper[4735]: I1008 13:14:51.363142 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d33bed52-756e-49da-8671-8cadf9dc3bcf-var-run\") pod \"d33bed52-756e-49da-8671-8cadf9dc3bcf\" (UID: \"d33bed52-756e-49da-8671-8cadf9dc3bcf\") "
Oct 08 13:14:51 crc kubenswrapper[4735]: I1008 13:14:51.363191 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d33bed52-756e-49da-8671-8cadf9dc3bcf-var-log-ovn\") pod \"d33bed52-756e-49da-8671-8cadf9dc3bcf\" (UID: \"d33bed52-756e-49da-8671-8cadf9dc3bcf\") "
Oct 08 13:14:51 crc kubenswrapper[4735]: I1008 13:14:51.363263 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d33bed52-756e-49da-8671-8cadf9dc3bcf-var-run" (OuterVolumeSpecName: "var-run") pod "d33bed52-756e-49da-8671-8cadf9dc3bcf" (UID: "d33bed52-756e-49da-8671-8cadf9dc3bcf"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 08 13:14:51 crc kubenswrapper[4735]: I1008 13:14:51.363289 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d33bed52-756e-49da-8671-8cadf9dc3bcf-var-run-ovn\") pod \"d33bed52-756e-49da-8671-8cadf9dc3bcf\" (UID: \"d33bed52-756e-49da-8671-8cadf9dc3bcf\") "
Oct 08 13:14:51 crc kubenswrapper[4735]: I1008 13:14:51.363307 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d33bed52-756e-49da-8671-8cadf9dc3bcf-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "d33bed52-756e-49da-8671-8cadf9dc3bcf" (UID: "d33bed52-756e-49da-8671-8cadf9dc3bcf"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 08 13:14:51 crc kubenswrapper[4735]: I1008 13:14:51.363333 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d33bed52-756e-49da-8671-8cadf9dc3bcf-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "d33bed52-756e-49da-8671-8cadf9dc3bcf" (UID: "d33bed52-756e-49da-8671-8cadf9dc3bcf"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 08 13:14:51 crc kubenswrapper[4735]: I1008 13:14:51.363353 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d33bed52-756e-49da-8671-8cadf9dc3bcf-additional-scripts\") pod \"d33bed52-756e-49da-8671-8cadf9dc3bcf\" (UID: \"d33bed52-756e-49da-8671-8cadf9dc3bcf\") "
Oct 08 13:14:51 crc kubenswrapper[4735]: I1008 13:14:51.363392 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqjtk\" (UniqueName: \"kubernetes.io/projected/d33bed52-756e-49da-8671-8cadf9dc3bcf-kube-api-access-tqjtk\") pod \"d33bed52-756e-49da-8671-8cadf9dc3bcf\" (UID: \"d33bed52-756e-49da-8671-8cadf9dc3bcf\") "
Oct 08 13:14:51 crc kubenswrapper[4735]: I1008 13:14:51.363456 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d33bed52-756e-49da-8671-8cadf9dc3bcf-scripts\") pod \"d33bed52-756e-49da-8671-8cadf9dc3bcf\" (UID: \"d33bed52-756e-49da-8671-8cadf9dc3bcf\") "
Oct 08 13:14:51 crc kubenswrapper[4735]: I1008 13:14:51.363705 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6h2tb\" (UniqueName: \"kubernetes.io/projected/d8270ba1-30fc-45a6-870e-89853cba5d2b-kube-api-access-6h2tb\") pod \"dnsmasq-dns-564965cbfc-8hd2q\" (UID: \"d8270ba1-30fc-45a6-870e-89853cba5d2b\") " pod="openstack/dnsmasq-dns-564965cbfc-8hd2q"
Oct 08 13:14:51 crc kubenswrapper[4735]: I1008 13:14:51.363851 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8270ba1-30fc-45a6-870e-89853cba5d2b-ovsdbserver-nb\") pod \"dnsmasq-dns-564965cbfc-8hd2q\" (UID: \"d8270ba1-30fc-45a6-870e-89853cba5d2b\") " pod="openstack/dnsmasq-dns-564965cbfc-8hd2q"
Oct 08 13:14:51 crc kubenswrapper[4735]: I1008 13:14:51.363900 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8270ba1-30fc-45a6-870e-89853cba5d2b-dns-svc\") pod \"dnsmasq-dns-564965cbfc-8hd2q\" (UID: \"d8270ba1-30fc-45a6-870e-89853cba5d2b\") " pod="openstack/dnsmasq-dns-564965cbfc-8hd2q"
Oct 08 13:14:51 crc kubenswrapper[4735]: I1008 13:14:51.363939 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8270ba1-30fc-45a6-870e-89853cba5d2b-config\") pod \"dnsmasq-dns-564965cbfc-8hd2q\" (UID: \"d8270ba1-30fc-45a6-870e-89853cba5d2b\") " pod="openstack/dnsmasq-dns-564965cbfc-8hd2q"
Oct 08 13:14:51 crc kubenswrapper[4735]: I1008 13:14:51.363959 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d8270ba1-30fc-45a6-870e-89853cba5d2b-dns-swift-storage-0\") pod \"dnsmasq-dns-564965cbfc-8hd2q\" (UID: \"d8270ba1-30fc-45a6-870e-89853cba5d2b\") " pod="openstack/dnsmasq-dns-564965cbfc-8hd2q"
Oct 08 13:14:51 crc kubenswrapper[4735]: I1008 13:14:51.364056 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8270ba1-30fc-45a6-870e-89853cba5d2b-ovsdbserver-sb\") pod \"dnsmasq-dns-564965cbfc-8hd2q\" (UID: \"d8270ba1-30fc-45a6-870e-89853cba5d2b\") " pod="openstack/dnsmasq-dns-564965cbfc-8hd2q"
Oct 08 13:14:51 crc kubenswrapper[4735]: I1008 13:14:51.364171 4735 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d33bed52-756e-49da-8671-8cadf9dc3bcf-var-run\") on node \"crc\" DevicePath \"\""
Oct 08 13:14:51 crc kubenswrapper[4735]: I1008 13:14:51.364200 4735 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d33bed52-756e-49da-8671-8cadf9dc3bcf-var-log-ovn\") on node \"crc\" DevicePath \"\""
Oct 08 13:14:51 crc kubenswrapper[4735]: I1008 13:14:51.364213 4735 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d33bed52-756e-49da-8671-8cadf9dc3bcf-var-run-ovn\") on node \"crc\" DevicePath \"\""
Oct 08 13:14:51 crc kubenswrapper[4735]: I1008 13:14:51.364066 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d33bed52-756e-49da-8671-8cadf9dc3bcf-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "d33bed52-756e-49da-8671-8cadf9dc3bcf" (UID: "d33bed52-756e-49da-8671-8cadf9dc3bcf"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 13:14:51 crc kubenswrapper[4735]: I1008 13:14:51.364826 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d33bed52-756e-49da-8671-8cadf9dc3bcf-scripts" (OuterVolumeSpecName: "scripts") pod "d33bed52-756e-49da-8671-8cadf9dc3bcf" (UID: "d33bed52-756e-49da-8671-8cadf9dc3bcf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 13:14:51 crc kubenswrapper[4735]: I1008 13:14:51.374908 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d33bed52-756e-49da-8671-8cadf9dc3bcf-kube-api-access-tqjtk" (OuterVolumeSpecName: "kube-api-access-tqjtk") pod "d33bed52-756e-49da-8671-8cadf9dc3bcf" (UID: "d33bed52-756e-49da-8671-8cadf9dc3bcf"). InnerVolumeSpecName "kube-api-access-tqjtk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 13:14:51 crc kubenswrapper[4735]: I1008 13:14:51.466025 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8270ba1-30fc-45a6-870e-89853cba5d2b-ovsdbserver-nb\") pod \"dnsmasq-dns-564965cbfc-8hd2q\" (UID: \"d8270ba1-30fc-45a6-870e-89853cba5d2b\") " pod="openstack/dnsmasq-dns-564965cbfc-8hd2q"
Oct 08 13:14:51 crc kubenswrapper[4735]: I1008 13:14:51.466806 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8270ba1-30fc-45a6-870e-89853cba5d2b-dns-svc\") pod \"dnsmasq-dns-564965cbfc-8hd2q\" (UID: \"d8270ba1-30fc-45a6-870e-89853cba5d2b\") " pod="openstack/dnsmasq-dns-564965cbfc-8hd2q"
Oct 08 13:14:51 crc kubenswrapper[4735]: I1008 13:14:51.466899 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8270ba1-30fc-45a6-870e-89853cba5d2b-config\") pod \"dnsmasq-dns-564965cbfc-8hd2q\" (UID: \"d8270ba1-30fc-45a6-870e-89853cba5d2b\") " pod="openstack/dnsmasq-dns-564965cbfc-8hd2q"
Oct 08 13:14:51 crc kubenswrapper[4735]: I1008 13:14:51.466945 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d8270ba1-30fc-45a6-870e-89853cba5d2b-dns-swift-storage-0\") pod \"dnsmasq-dns-564965cbfc-8hd2q\" (UID: \"d8270ba1-30fc-45a6-870e-89853cba5d2b\") " pod="openstack/dnsmasq-dns-564965cbfc-8hd2q"
Oct 08 13:14:51 crc kubenswrapper[4735]: I1008 13:14:51.466969 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8270ba1-30fc-45a6-870e-89853cba5d2b-ovsdbserver-nb\") pod \"dnsmasq-dns-564965cbfc-8hd2q\" (UID: \"d8270ba1-30fc-45a6-870e-89853cba5d2b\") " pod="openstack/dnsmasq-dns-564965cbfc-8hd2q"
Oct 08 13:14:51 crc kubenswrapper[4735]: I1008 13:14:51.466983 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8270ba1-30fc-45a6-870e-89853cba5d2b-ovsdbserver-sb\") pod \"dnsmasq-dns-564965cbfc-8hd2q\" (UID: \"d8270ba1-30fc-45a6-870e-89853cba5d2b\") " pod="openstack/dnsmasq-dns-564965cbfc-8hd2q"
Oct 08 13:14:51 crc kubenswrapper[4735]: I1008 13:14:51.467063 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6h2tb\" (UniqueName: \"kubernetes.io/projected/d8270ba1-30fc-45a6-870e-89853cba5d2b-kube-api-access-6h2tb\") pod \"dnsmasq-dns-564965cbfc-8hd2q\" (UID: \"d8270ba1-30fc-45a6-870e-89853cba5d2b\") " pod="openstack/dnsmasq-dns-564965cbfc-8hd2q"
Oct 08 13:14:51 crc kubenswrapper[4735]: I1008 13:14:51.467212 4735 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d33bed52-756e-49da-8671-8cadf9dc3bcf-additional-scripts\") on node \"crc\" DevicePath \"\""
Oct 08 13:14:51 crc kubenswrapper[4735]: I1008 13:14:51.467227 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqjtk\" (UniqueName: \"kubernetes.io/projected/d33bed52-756e-49da-8671-8cadf9dc3bcf-kube-api-access-tqjtk\") on node \"crc\" DevicePath \"\""
Oct 08 13:14:51 crc kubenswrapper[4735]: I1008 13:14:51.467240 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d33bed52-756e-49da-8671-8cadf9dc3bcf-scripts\") on node \"crc\" DevicePath \"\""
Oct 08 13:14:51 crc kubenswrapper[4735]: I1008 13:14:51.467599 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8270ba1-30fc-45a6-870e-89853cba5d2b-config\") pod \"dnsmasq-dns-564965cbfc-8hd2q\" (UID: \"d8270ba1-30fc-45a6-870e-89853cba5d2b\") " pod="openstack/dnsmasq-dns-564965cbfc-8hd2q"
Oct 08 13:14:51 crc kubenswrapper[4735]: I1008 13:14:51.467726 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8270ba1-30fc-45a6-870e-89853cba5d2b-dns-svc\") pod \"dnsmasq-dns-564965cbfc-8hd2q\" (UID: \"d8270ba1-30fc-45a6-870e-89853cba5d2b\") " pod="openstack/dnsmasq-dns-564965cbfc-8hd2q"
Oct 08 13:14:51 crc kubenswrapper[4735]: I1008 13:14:51.467986 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8270ba1-30fc-45a6-870e-89853cba5d2b-ovsdbserver-sb\") pod \"dnsmasq-dns-564965cbfc-8hd2q\" (UID: \"d8270ba1-30fc-45a6-870e-89853cba5d2b\") " pod="openstack/dnsmasq-dns-564965cbfc-8hd2q"
Oct 08 13:14:51 crc kubenswrapper[4735]: I1008 13:14:51.469551 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d8270ba1-30fc-45a6-870e-89853cba5d2b-dns-swift-storage-0\") pod \"dnsmasq-dns-564965cbfc-8hd2q\" (UID: \"d8270ba1-30fc-45a6-870e-89853cba5d2b\") " pod="openstack/dnsmasq-dns-564965cbfc-8hd2q"
Oct 08 13:14:51 crc kubenswrapper[4735]: I1008 13:14:51.487939 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6h2tb\" (UniqueName: \"kubernetes.io/projected/d8270ba1-30fc-45a6-870e-89853cba5d2b-kube-api-access-6h2tb\") pod \"dnsmasq-dns-564965cbfc-8hd2q\" (UID: \"d8270ba1-30fc-45a6-870e-89853cba5d2b\") " pod="openstack/dnsmasq-dns-564965cbfc-8hd2q"
Oct 08 13:14:51 crc kubenswrapper[4735]: I1008 13:14:51.537757 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-564965cbfc-8hd2q"
Oct 08 13:14:51 crc kubenswrapper[4735]: I1008 13:14:51.865920 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-pbkkt-config-v6w7j"
Oct 08 13:14:51 crc kubenswrapper[4735]: I1008 13:14:51.865926 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-pbkkt-config-v6w7j" event={"ID":"d33bed52-756e-49da-8671-8cadf9dc3bcf","Type":"ContainerDied","Data":"8adc48f174ec93d4876f8b45b5444d75106515b1187ae726ffff74b1f0ee0e9c"}
Oct 08 13:14:51 crc kubenswrapper[4735]: I1008 13:14:51.866188 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8adc48f174ec93d4876f8b45b5444d75106515b1187ae726ffff74b1f0ee0e9c"
Oct 08 13:14:52 crc kubenswrapper[4735]: I1008 13:14:52.053698 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-564965cbfc-8hd2q"]
Oct 08 13:14:52 crc kubenswrapper[4735]: I1008 13:14:52.319938 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-pbkkt"
Oct 08 13:14:52 crc kubenswrapper[4735]: I1008 13:14:52.328061 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-pbkkt-config-v6w7j"]
Oct 08 13:14:52 crc kubenswrapper[4735]: I1008 13:14:52.338190 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-pbkkt-config-v6w7j"]
Oct 08 13:14:52 crc kubenswrapper[4735]: I1008 13:14:52.877379 4735 generic.go:334] "Generic (PLEG): container finished" podID="d8270ba1-30fc-45a6-870e-89853cba5d2b" containerID="fa2a11192c5a09a015b00a65b1706e57cb4fbbb0c858687f1e8afe97b2876333" exitCode=0
Oct 08 13:14:52 crc kubenswrapper[4735]: I1008 13:14:52.877421 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-564965cbfc-8hd2q" event={"ID":"d8270ba1-30fc-45a6-870e-89853cba5d2b","Type":"ContainerDied","Data":"fa2a11192c5a09a015b00a65b1706e57cb4fbbb0c858687f1e8afe97b2876333"}
Oct 08 13:14:52 crc kubenswrapper[4735]: I1008 13:14:52.877451 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="openstack/dnsmasq-dns-564965cbfc-8hd2q" event={"ID":"d8270ba1-30fc-45a6-870e-89853cba5d2b","Type":"ContainerStarted","Data":"3c48685af7d86a087352794a30a5733d8bed8149767d8276145ac3a0704a09f9"} Oct 08 13:14:52 crc kubenswrapper[4735]: I1008 13:14:52.914928 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d33bed52-756e-49da-8671-8cadf9dc3bcf" path="/var/lib/kubelet/pods/d33bed52-756e-49da-8671-8cadf9dc3bcf/volumes" Oct 08 13:14:53 crc kubenswrapper[4735]: I1008 13:14:53.887181 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-564965cbfc-8hd2q" event={"ID":"d8270ba1-30fc-45a6-870e-89853cba5d2b","Type":"ContainerStarted","Data":"c3c5ec8d7979557bacbe981c514dbadad3da58c4514bd9e7e7bf178bcc773902"} Oct 08 13:14:53 crc kubenswrapper[4735]: I1008 13:14:53.887580 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-564965cbfc-8hd2q" Oct 08 13:14:53 crc kubenswrapper[4735]: I1008 13:14:53.906690 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-564965cbfc-8hd2q" podStartSLOduration=2.9066770010000003 podStartE2EDuration="2.906677001s" podCreationTimestamp="2025-10-08 13:14:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 13:14:53.904816472 +0000 UTC m=+1143.713121016" watchObservedRunningTime="2025-10-08 13:14:53.906677001 +0000 UTC m=+1143.714981545" Oct 08 13:14:58 crc kubenswrapper[4735]: I1008 13:14:58.205709 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 08 13:14:58 crc kubenswrapper[4735]: I1008 13:14:58.491273 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-qcsss"] Oct 08 13:14:58 crc kubenswrapper[4735]: E1008 13:14:58.491721 4735 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d33bed52-756e-49da-8671-8cadf9dc3bcf" containerName="ovn-config" Oct 08 13:14:58 crc kubenswrapper[4735]: I1008 13:14:58.491744 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="d33bed52-756e-49da-8671-8cadf9dc3bcf" containerName="ovn-config" Oct 08 13:14:58 crc kubenswrapper[4735]: I1008 13:14:58.491960 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="d33bed52-756e-49da-8671-8cadf9dc3bcf" containerName="ovn-config" Oct 08 13:14:58 crc kubenswrapper[4735]: I1008 13:14:58.494512 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-qcsss" Oct 08 13:14:58 crc kubenswrapper[4735]: I1008 13:14:58.504939 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-qcsss"] Oct 08 13:14:58 crc kubenswrapper[4735]: I1008 13:14:58.554161 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 08 13:14:58 crc kubenswrapper[4735]: I1008 13:14:58.592616 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8tmd\" (UniqueName: \"kubernetes.io/projected/0b3632f1-e8ac-49f8-8cfe-c54bcf7b86bc-kube-api-access-v8tmd\") pod \"cinder-db-create-qcsss\" (UID: \"0b3632f1-e8ac-49f8-8cfe-c54bcf7b86bc\") " pod="openstack/cinder-db-create-qcsss" Oct 08 13:14:58 crc kubenswrapper[4735]: I1008 13:14:58.596582 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-pk6wn"] Oct 08 13:14:58 crc kubenswrapper[4735]: I1008 13:14:58.597792 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-pk6wn" Oct 08 13:14:58 crc kubenswrapper[4735]: I1008 13:14:58.618366 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-pk6wn"] Oct 08 13:14:58 crc kubenswrapper[4735]: I1008 13:14:58.694001 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hfp5\" (UniqueName: \"kubernetes.io/projected/0e985c68-b9b2-4bb0-bf30-85abd2ea3c69-kube-api-access-4hfp5\") pod \"barbican-db-create-pk6wn\" (UID: \"0e985c68-b9b2-4bb0-bf30-85abd2ea3c69\") " pod="openstack/barbican-db-create-pk6wn" Oct 08 13:14:58 crc kubenswrapper[4735]: I1008 13:14:58.694303 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8tmd\" (UniqueName: \"kubernetes.io/projected/0b3632f1-e8ac-49f8-8cfe-c54bcf7b86bc-kube-api-access-v8tmd\") pod \"cinder-db-create-qcsss\" (UID: \"0b3632f1-e8ac-49f8-8cfe-c54bcf7b86bc\") " pod="openstack/cinder-db-create-qcsss" Oct 08 13:14:58 crc kubenswrapper[4735]: I1008 13:14:58.734384 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8tmd\" (UniqueName: \"kubernetes.io/projected/0b3632f1-e8ac-49f8-8cfe-c54bcf7b86bc-kube-api-access-v8tmd\") pod \"cinder-db-create-qcsss\" (UID: \"0b3632f1-e8ac-49f8-8cfe-c54bcf7b86bc\") " pod="openstack/cinder-db-create-qcsss" Oct 08 13:14:58 crc kubenswrapper[4735]: I1008 13:14:58.793762 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-bl6gn"] Oct 08 13:14:58 crc kubenswrapper[4735]: I1008 13:14:58.795107 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-bl6gn" Oct 08 13:14:58 crc kubenswrapper[4735]: I1008 13:14:58.795518 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hfp5\" (UniqueName: \"kubernetes.io/projected/0e985c68-b9b2-4bb0-bf30-85abd2ea3c69-kube-api-access-4hfp5\") pod \"barbican-db-create-pk6wn\" (UID: \"0e985c68-b9b2-4bb0-bf30-85abd2ea3c69\") " pod="openstack/barbican-db-create-pk6wn" Oct 08 13:14:58 crc kubenswrapper[4735]: I1008 13:14:58.809903 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-bl6gn"] Oct 08 13:14:58 crc kubenswrapper[4735]: I1008 13:14:58.827925 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-qcsss" Oct 08 13:14:58 crc kubenswrapper[4735]: I1008 13:14:58.837414 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hfp5\" (UniqueName: \"kubernetes.io/projected/0e985c68-b9b2-4bb0-bf30-85abd2ea3c69-kube-api-access-4hfp5\") pod \"barbican-db-create-pk6wn\" (UID: \"0e985c68-b9b2-4bb0-bf30-85abd2ea3c69\") " pod="openstack/barbican-db-create-pk6wn" Oct 08 13:14:58 crc kubenswrapper[4735]: I1008 13:14:58.897217 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j79qq\" (UniqueName: \"kubernetes.io/projected/0d8f3c97-c358-4c3f-93fb-063e28ea1ccd-kube-api-access-j79qq\") pod \"neutron-db-create-bl6gn\" (UID: \"0d8f3c97-c358-4c3f-93fb-063e28ea1ccd\") " pod="openstack/neutron-db-create-bl6gn" Oct 08 13:14:58 crc kubenswrapper[4735]: I1008 13:14:58.918375 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-pk6wn" Oct 08 13:14:58 crc kubenswrapper[4735]: I1008 13:14:58.940202 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-8ht8g"] Oct 08 13:14:58 crc kubenswrapper[4735]: I1008 13:14:58.941229 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-8ht8g" Oct 08 13:14:58 crc kubenswrapper[4735]: I1008 13:14:58.943560 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 08 13:14:58 crc kubenswrapper[4735]: I1008 13:14:58.944306 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 08 13:14:58 crc kubenswrapper[4735]: I1008 13:14:58.944491 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 08 13:14:58 crc kubenswrapper[4735]: I1008 13:14:58.944673 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-zpn2p" Oct 08 13:14:58 crc kubenswrapper[4735]: I1008 13:14:58.959714 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-8ht8g"] Oct 08 13:14:59 crc kubenswrapper[4735]: I1008 13:14:59.001583 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j79qq\" (UniqueName: \"kubernetes.io/projected/0d8f3c97-c358-4c3f-93fb-063e28ea1ccd-kube-api-access-j79qq\") pod \"neutron-db-create-bl6gn\" (UID: \"0d8f3c97-c358-4c3f-93fb-063e28ea1ccd\") " pod="openstack/neutron-db-create-bl6gn" Oct 08 13:14:59 crc kubenswrapper[4735]: I1008 13:14:59.042188 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j79qq\" (UniqueName: \"kubernetes.io/projected/0d8f3c97-c358-4c3f-93fb-063e28ea1ccd-kube-api-access-j79qq\") pod \"neutron-db-create-bl6gn\" (UID: \"0d8f3c97-c358-4c3f-93fb-063e28ea1ccd\") " pod="openstack/neutron-db-create-bl6gn" Oct 08 
13:14:59 crc kubenswrapper[4735]: I1008 13:14:59.102886 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b264ac2b-df16-4ddb-94ee-9d346b374ec7-config-data\") pod \"keystone-db-sync-8ht8g\" (UID: \"b264ac2b-df16-4ddb-94ee-9d346b374ec7\") " pod="openstack/keystone-db-sync-8ht8g" Oct 08 13:14:59 crc kubenswrapper[4735]: I1008 13:14:59.103225 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b264ac2b-df16-4ddb-94ee-9d346b374ec7-combined-ca-bundle\") pod \"keystone-db-sync-8ht8g\" (UID: \"b264ac2b-df16-4ddb-94ee-9d346b374ec7\") " pod="openstack/keystone-db-sync-8ht8g" Oct 08 13:14:59 crc kubenswrapper[4735]: I1008 13:14:59.103363 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jb7nb\" (UniqueName: \"kubernetes.io/projected/b264ac2b-df16-4ddb-94ee-9d346b374ec7-kube-api-access-jb7nb\") pod \"keystone-db-sync-8ht8g\" (UID: \"b264ac2b-df16-4ddb-94ee-9d346b374ec7\") " pod="openstack/keystone-db-sync-8ht8g" Oct 08 13:14:59 crc kubenswrapper[4735]: I1008 13:14:59.117521 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-bl6gn" Oct 08 13:14:59 crc kubenswrapper[4735]: I1008 13:14:59.204810 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jb7nb\" (UniqueName: \"kubernetes.io/projected/b264ac2b-df16-4ddb-94ee-9d346b374ec7-kube-api-access-jb7nb\") pod \"keystone-db-sync-8ht8g\" (UID: \"b264ac2b-df16-4ddb-94ee-9d346b374ec7\") " pod="openstack/keystone-db-sync-8ht8g" Oct 08 13:14:59 crc kubenswrapper[4735]: I1008 13:14:59.204906 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b264ac2b-df16-4ddb-94ee-9d346b374ec7-config-data\") pod \"keystone-db-sync-8ht8g\" (UID: \"b264ac2b-df16-4ddb-94ee-9d346b374ec7\") " pod="openstack/keystone-db-sync-8ht8g" Oct 08 13:14:59 crc kubenswrapper[4735]: I1008 13:14:59.204938 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b264ac2b-df16-4ddb-94ee-9d346b374ec7-combined-ca-bundle\") pod \"keystone-db-sync-8ht8g\" (UID: \"b264ac2b-df16-4ddb-94ee-9d346b374ec7\") " pod="openstack/keystone-db-sync-8ht8g" Oct 08 13:14:59 crc kubenswrapper[4735]: I1008 13:14:59.208606 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b264ac2b-df16-4ddb-94ee-9d346b374ec7-config-data\") pod \"keystone-db-sync-8ht8g\" (UID: \"b264ac2b-df16-4ddb-94ee-9d346b374ec7\") " pod="openstack/keystone-db-sync-8ht8g" Oct 08 13:14:59 crc kubenswrapper[4735]: I1008 13:14:59.218098 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b264ac2b-df16-4ddb-94ee-9d346b374ec7-combined-ca-bundle\") pod \"keystone-db-sync-8ht8g\" (UID: \"b264ac2b-df16-4ddb-94ee-9d346b374ec7\") " pod="openstack/keystone-db-sync-8ht8g" Oct 08 13:14:59 crc kubenswrapper[4735]: I1008 
13:14:59.221216 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jb7nb\" (UniqueName: \"kubernetes.io/projected/b264ac2b-df16-4ddb-94ee-9d346b374ec7-kube-api-access-jb7nb\") pod \"keystone-db-sync-8ht8g\" (UID: \"b264ac2b-df16-4ddb-94ee-9d346b374ec7\") " pod="openstack/keystone-db-sync-8ht8g" Oct 08 13:14:59 crc kubenswrapper[4735]: I1008 13:14:59.262293 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-8ht8g" Oct 08 13:15:00 crc kubenswrapper[4735]: I1008 13:15:00.138077 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332155-s58r2"] Oct 08 13:15:00 crc kubenswrapper[4735]: I1008 13:15:00.139793 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332155-s58r2" Oct 08 13:15:00 crc kubenswrapper[4735]: I1008 13:15:00.141622 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 08 13:15:00 crc kubenswrapper[4735]: I1008 13:15:00.143588 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 08 13:15:00 crc kubenswrapper[4735]: I1008 13:15:00.163339 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332155-s58r2"] Oct 08 13:15:00 crc kubenswrapper[4735]: I1008 13:15:00.222227 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9dd65ea9-bdfb-4d1d-b09e-816693069b07-secret-volume\") pod \"collect-profiles-29332155-s58r2\" (UID: \"9dd65ea9-bdfb-4d1d-b09e-816693069b07\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332155-s58r2" Oct 08 13:15:00 crc kubenswrapper[4735]: I1008 
13:15:00.222313 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnz4q\" (UniqueName: \"kubernetes.io/projected/9dd65ea9-bdfb-4d1d-b09e-816693069b07-kube-api-access-jnz4q\") pod \"collect-profiles-29332155-s58r2\" (UID: \"9dd65ea9-bdfb-4d1d-b09e-816693069b07\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332155-s58r2" Oct 08 13:15:00 crc kubenswrapper[4735]: I1008 13:15:00.222373 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9dd65ea9-bdfb-4d1d-b09e-816693069b07-config-volume\") pod \"collect-profiles-29332155-s58r2\" (UID: \"9dd65ea9-bdfb-4d1d-b09e-816693069b07\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332155-s58r2" Oct 08 13:15:00 crc kubenswrapper[4735]: I1008 13:15:00.324083 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9dd65ea9-bdfb-4d1d-b09e-816693069b07-secret-volume\") pod \"collect-profiles-29332155-s58r2\" (UID: \"9dd65ea9-bdfb-4d1d-b09e-816693069b07\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332155-s58r2" Oct 08 13:15:00 crc kubenswrapper[4735]: I1008 13:15:00.324159 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnz4q\" (UniqueName: \"kubernetes.io/projected/9dd65ea9-bdfb-4d1d-b09e-816693069b07-kube-api-access-jnz4q\") pod \"collect-profiles-29332155-s58r2\" (UID: \"9dd65ea9-bdfb-4d1d-b09e-816693069b07\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332155-s58r2" Oct 08 13:15:00 crc kubenswrapper[4735]: I1008 13:15:00.324203 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9dd65ea9-bdfb-4d1d-b09e-816693069b07-config-volume\") pod \"collect-profiles-29332155-s58r2\" 
(UID: \"9dd65ea9-bdfb-4d1d-b09e-816693069b07\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332155-s58r2" Oct 08 13:15:00 crc kubenswrapper[4735]: I1008 13:15:00.325094 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9dd65ea9-bdfb-4d1d-b09e-816693069b07-config-volume\") pod \"collect-profiles-29332155-s58r2\" (UID: \"9dd65ea9-bdfb-4d1d-b09e-816693069b07\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332155-s58r2" Oct 08 13:15:00 crc kubenswrapper[4735]: I1008 13:15:00.328142 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9dd65ea9-bdfb-4d1d-b09e-816693069b07-secret-volume\") pod \"collect-profiles-29332155-s58r2\" (UID: \"9dd65ea9-bdfb-4d1d-b09e-816693069b07\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332155-s58r2" Oct 08 13:15:00 crc kubenswrapper[4735]: I1008 13:15:00.363522 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnz4q\" (UniqueName: \"kubernetes.io/projected/9dd65ea9-bdfb-4d1d-b09e-816693069b07-kube-api-access-jnz4q\") pod \"collect-profiles-29332155-s58r2\" (UID: \"9dd65ea9-bdfb-4d1d-b09e-816693069b07\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332155-s58r2" Oct 08 13:15:00 crc kubenswrapper[4735]: I1008 13:15:00.458510 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332155-s58r2" Oct 08 13:15:01 crc kubenswrapper[4735]: I1008 13:15:01.539681 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-564965cbfc-8hd2q" Oct 08 13:15:01 crc kubenswrapper[4735]: I1008 13:15:01.605424 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b587f8db7-zv2bg"] Oct 08 13:15:01 crc kubenswrapper[4735]: I1008 13:15:01.606462 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7b587f8db7-zv2bg" podUID="3b3f87a3-e916-4f02-9a1d-730a6bc5a71f" containerName="dnsmasq-dns" containerID="cri-o://d24591f84f2a215e0b89883b252d8bd0fa2853bfcd15f6e7b000372b02ec7e5d" gracePeriod=10 Oct 08 13:15:03 crc kubenswrapper[4735]: I1008 13:15:03.968797 4735 generic.go:334] "Generic (PLEG): container finished" podID="3b3f87a3-e916-4f02-9a1d-730a6bc5a71f" containerID="d24591f84f2a215e0b89883b252d8bd0fa2853bfcd15f6e7b000372b02ec7e5d" exitCode=0 Oct 08 13:15:03 crc kubenswrapper[4735]: I1008 13:15:03.968830 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b587f8db7-zv2bg" event={"ID":"3b3f87a3-e916-4f02-9a1d-730a6bc5a71f","Type":"ContainerDied","Data":"d24591f84f2a215e0b89883b252d8bd0fa2853bfcd15f6e7b000372b02ec7e5d"} Oct 08 13:15:04 crc kubenswrapper[4735]: I1008 13:15:04.053277 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7b587f8db7-zv2bg" podUID="3b3f87a3-e916-4f02-9a1d-730a6bc5a71f" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.116:5353: connect: connection refused" Oct 08 13:15:05 crc kubenswrapper[4735]: I1008 13:15:05.005945 4735 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","poddbf29341-130d-4282-b4a2-31e745b9a6c1"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort 
poddbf29341-130d-4282-b4a2-31e745b9a6c1] : Timed out while waiting for systemd to remove kubepods-besteffort-poddbf29341_130d_4282_b4a2_31e745b9a6c1.slice" Oct 08 13:15:07 crc kubenswrapper[4735]: E1008 13:15:07.509550 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api@sha256:9ebf424d4107275a2e3f21f7a18ef257ff2f97c1298109ac7c802a5a4f4794f2" Oct 08 13:15:07 crc kubenswrapper[4735]: E1008 13:15:07.510031 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api@sha256:9ebf424d4107275a2e3f21f7a18ef257ff2f97c1298109ac7c802a5a4f4794f2,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rbgdb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,Re
adinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-c8pnx_openstack(8f8ee345-a4ac-49aa-9eed-32a7cffb2790): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 08 13:15:07 crc kubenswrapper[4735]: E1008 13:15:07.512007 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-c8pnx" podUID="8f8ee345-a4ac-49aa-9eed-32a7cffb2790" Oct 08 13:15:07 crc kubenswrapper[4735]: I1008 13:15:07.739180 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b587f8db7-zv2bg" Oct 08 13:15:07 crc kubenswrapper[4735]: I1008 13:15:07.855449 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b3f87a3-e916-4f02-9a1d-730a6bc5a71f-config\") pod \"3b3f87a3-e916-4f02-9a1d-730a6bc5a71f\" (UID: \"3b3f87a3-e916-4f02-9a1d-730a6bc5a71f\") " Oct 08 13:15:07 crc kubenswrapper[4735]: I1008 13:15:07.855753 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqh7z\" (UniqueName: \"kubernetes.io/projected/3b3f87a3-e916-4f02-9a1d-730a6bc5a71f-kube-api-access-lqh7z\") pod \"3b3f87a3-e916-4f02-9a1d-730a6bc5a71f\" (UID: \"3b3f87a3-e916-4f02-9a1d-730a6bc5a71f\") " Oct 08 13:15:07 crc kubenswrapper[4735]: I1008 13:15:07.855841 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b3f87a3-e916-4f02-9a1d-730a6bc5a71f-dns-svc\") pod \"3b3f87a3-e916-4f02-9a1d-730a6bc5a71f\" (UID: \"3b3f87a3-e916-4f02-9a1d-730a6bc5a71f\") " Oct 08 13:15:07 crc kubenswrapper[4735]: I1008 13:15:07.855945 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b3f87a3-e916-4f02-9a1d-730a6bc5a71f-ovsdbserver-sb\") pod \"3b3f87a3-e916-4f02-9a1d-730a6bc5a71f\" (UID: \"3b3f87a3-e916-4f02-9a1d-730a6bc5a71f\") " Oct 08 13:15:07 crc kubenswrapper[4735]: I1008 13:15:07.856158 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b3f87a3-e916-4f02-9a1d-730a6bc5a71f-ovsdbserver-nb\") pod \"3b3f87a3-e916-4f02-9a1d-730a6bc5a71f\" (UID: \"3b3f87a3-e916-4f02-9a1d-730a6bc5a71f\") " Oct 08 13:15:07 crc kubenswrapper[4735]: I1008 13:15:07.862043 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/3b3f87a3-e916-4f02-9a1d-730a6bc5a71f-kube-api-access-lqh7z" (OuterVolumeSpecName: "kube-api-access-lqh7z") pod "3b3f87a3-e916-4f02-9a1d-730a6bc5a71f" (UID: "3b3f87a3-e916-4f02-9a1d-730a6bc5a71f"). InnerVolumeSpecName "kube-api-access-lqh7z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:15:07 crc kubenswrapper[4735]: I1008 13:15:07.903379 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b3f87a3-e916-4f02-9a1d-730a6bc5a71f-config" (OuterVolumeSpecName: "config") pod "3b3f87a3-e916-4f02-9a1d-730a6bc5a71f" (UID: "3b3f87a3-e916-4f02-9a1d-730a6bc5a71f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:15:07 crc kubenswrapper[4735]: I1008 13:15:07.905647 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b3f87a3-e916-4f02-9a1d-730a6bc5a71f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3b3f87a3-e916-4f02-9a1d-730a6bc5a71f" (UID: "3b3f87a3-e916-4f02-9a1d-730a6bc5a71f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:15:07 crc kubenswrapper[4735]: I1008 13:15:07.906096 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b3f87a3-e916-4f02-9a1d-730a6bc5a71f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3b3f87a3-e916-4f02-9a1d-730a6bc5a71f" (UID: "3b3f87a3-e916-4f02-9a1d-730a6bc5a71f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:15:07 crc kubenswrapper[4735]: I1008 13:15:07.916195 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b3f87a3-e916-4f02-9a1d-730a6bc5a71f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3b3f87a3-e916-4f02-9a1d-730a6bc5a71f" (UID: "3b3f87a3-e916-4f02-9a1d-730a6bc5a71f"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:15:07 crc kubenswrapper[4735]: I1008 13:15:07.958280 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b3f87a3-e916-4f02-9a1d-730a6bc5a71f-config\") on node \"crc\" DevicePath \"\"" Oct 08 13:15:07 crc kubenswrapper[4735]: I1008 13:15:07.958311 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqh7z\" (UniqueName: \"kubernetes.io/projected/3b3f87a3-e916-4f02-9a1d-730a6bc5a71f-kube-api-access-lqh7z\") on node \"crc\" DevicePath \"\"" Oct 08 13:15:07 crc kubenswrapper[4735]: I1008 13:15:07.958322 4735 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b3f87a3-e916-4f02-9a1d-730a6bc5a71f-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 13:15:07 crc kubenswrapper[4735]: I1008 13:15:07.958331 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b3f87a3-e916-4f02-9a1d-730a6bc5a71f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 13:15:07 crc kubenswrapper[4735]: I1008 13:15:07.958340 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b3f87a3-e916-4f02-9a1d-730a6bc5a71f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 13:15:08 crc kubenswrapper[4735]: I1008 13:15:08.000959 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b587f8db7-zv2bg" event={"ID":"3b3f87a3-e916-4f02-9a1d-730a6bc5a71f","Type":"ContainerDied","Data":"1429be662ecde2743b263af090cb6ffcb17ea9de988d9c0389967d75f585729b"} Oct 08 13:15:08 crc kubenswrapper[4735]: I1008 13:15:08.000987 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b587f8db7-zv2bg" Oct 08 13:15:08 crc kubenswrapper[4735]: I1008 13:15:08.001021 4735 scope.go:117] "RemoveContainer" containerID="d24591f84f2a215e0b89883b252d8bd0fa2853bfcd15f6e7b000372b02ec7e5d" Oct 08 13:15:08 crc kubenswrapper[4735]: E1008 13:15:08.002529 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api@sha256:9ebf424d4107275a2e3f21f7a18ef257ff2f97c1298109ac7c802a5a4f4794f2\\\"\"" pod="openstack/glance-db-sync-c8pnx" podUID="8f8ee345-a4ac-49aa-9eed-32a7cffb2790" Oct 08 13:15:08 crc kubenswrapper[4735]: I1008 13:15:08.028761 4735 scope.go:117] "RemoveContainer" containerID="9728313bd36002d3cd4e2999764327234943fef3d025e7a717ec24bac3d7848d" Oct 08 13:15:08 crc kubenswrapper[4735]: I1008 13:15:08.056745 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b587f8db7-zv2bg"] Oct 08 13:15:08 crc kubenswrapper[4735]: I1008 13:15:08.063504 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b587f8db7-zv2bg"] Oct 08 13:15:08 crc kubenswrapper[4735]: W1008 13:15:08.073195 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e985c68_b9b2_4bb0_bf30_85abd2ea3c69.slice/crio-00d38d09950fcc257d1f5a0f77cf52591d427541fae59b35e285d6de105f3133 WatchSource:0}: Error finding container 00d38d09950fcc257d1f5a0f77cf52591d427541fae59b35e285d6de105f3133: Status 404 returned error can't find the container with id 00d38d09950fcc257d1f5a0f77cf52591d427541fae59b35e285d6de105f3133 Oct 08 13:15:08 crc kubenswrapper[4735]: I1008 13:15:08.073220 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-pk6wn"] Oct 08 13:15:08 crc kubenswrapper[4735]: I1008 13:15:08.080040 4735 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/keystone-db-sync-8ht8g"] Oct 08 13:15:08 crc kubenswrapper[4735]: I1008 13:15:08.086830 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-qcsss"] Oct 08 13:15:08 crc kubenswrapper[4735]: I1008 13:15:08.179624 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-bl6gn"] Oct 08 13:15:08 crc kubenswrapper[4735]: W1008 13:15:08.183126 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9dd65ea9_bdfb_4d1d_b09e_816693069b07.slice/crio-59c34e115af69de6924bdcb649f405ddcfed9cdaae068fa2915218dee6c57931 WatchSource:0}: Error finding container 59c34e115af69de6924bdcb649f405ddcfed9cdaae068fa2915218dee6c57931: Status 404 returned error can't find the container with id 59c34e115af69de6924bdcb649f405ddcfed9cdaae068fa2915218dee6c57931 Oct 08 13:15:08 crc kubenswrapper[4735]: I1008 13:15:08.188721 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332155-s58r2"] Oct 08 13:15:08 crc kubenswrapper[4735]: I1008 13:15:08.905389 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b3f87a3-e916-4f02-9a1d-730a6bc5a71f" path="/var/lib/kubelet/pods/3b3f87a3-e916-4f02-9a1d-730a6bc5a71f/volumes" Oct 08 13:15:09 crc kubenswrapper[4735]: I1008 13:15:09.009648 4735 generic.go:334] "Generic (PLEG): container finished" podID="9dd65ea9-bdfb-4d1d-b09e-816693069b07" containerID="272e4b3b7ef5916e8d77733e1bc56a06e768dc5c9767d56714b089a41fe76265" exitCode=0 Oct 08 13:15:09 crc kubenswrapper[4735]: I1008 13:15:09.009727 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332155-s58r2" event={"ID":"9dd65ea9-bdfb-4d1d-b09e-816693069b07","Type":"ContainerDied","Data":"272e4b3b7ef5916e8d77733e1bc56a06e768dc5c9767d56714b089a41fe76265"} Oct 08 13:15:09 crc 
kubenswrapper[4735]: I1008 13:15:09.009757 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332155-s58r2" event={"ID":"9dd65ea9-bdfb-4d1d-b09e-816693069b07","Type":"ContainerStarted","Data":"59c34e115af69de6924bdcb649f405ddcfed9cdaae068fa2915218dee6c57931"} Oct 08 13:15:09 crc kubenswrapper[4735]: I1008 13:15:09.011028 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-8ht8g" event={"ID":"b264ac2b-df16-4ddb-94ee-9d346b374ec7","Type":"ContainerStarted","Data":"1787573f6f57f40961ef7c365ed90f0e381f0f021c219bc651cae9af05a61c2f"} Oct 08 13:15:09 crc kubenswrapper[4735]: I1008 13:15:09.013626 4735 generic.go:334] "Generic (PLEG): container finished" podID="0d8f3c97-c358-4c3f-93fb-063e28ea1ccd" containerID="331ca35792c751b799fe82a15c0ba601de01652826733a0f1bc7a6370b61c05c" exitCode=0 Oct 08 13:15:09 crc kubenswrapper[4735]: I1008 13:15:09.013695 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-bl6gn" event={"ID":"0d8f3c97-c358-4c3f-93fb-063e28ea1ccd","Type":"ContainerDied","Data":"331ca35792c751b799fe82a15c0ba601de01652826733a0f1bc7a6370b61c05c"} Oct 08 13:15:09 crc kubenswrapper[4735]: I1008 13:15:09.013752 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-bl6gn" event={"ID":"0d8f3c97-c358-4c3f-93fb-063e28ea1ccd","Type":"ContainerStarted","Data":"31196cf7533fc2dbe2f2b010bcfd917da439841c2eb46eea79c9146b5e1d002c"} Oct 08 13:15:09 crc kubenswrapper[4735]: I1008 13:15:09.015350 4735 generic.go:334] "Generic (PLEG): container finished" podID="0b3632f1-e8ac-49f8-8cfe-c54bcf7b86bc" containerID="d44500aa9959aa2af8745d6e474b68bd1fe99246472ccf2f9cfe86cbabc06b44" exitCode=0 Oct 08 13:15:09 crc kubenswrapper[4735]: I1008 13:15:09.015458 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-qcsss" 
event={"ID":"0b3632f1-e8ac-49f8-8cfe-c54bcf7b86bc","Type":"ContainerDied","Data":"d44500aa9959aa2af8745d6e474b68bd1fe99246472ccf2f9cfe86cbabc06b44"} Oct 08 13:15:09 crc kubenswrapper[4735]: I1008 13:15:09.015549 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-qcsss" event={"ID":"0b3632f1-e8ac-49f8-8cfe-c54bcf7b86bc","Type":"ContainerStarted","Data":"fc8fd2c16797123238c7c38a8d706f8a278cb3283e07b57795b8ac0cd02e1173"} Oct 08 13:15:09 crc kubenswrapper[4735]: I1008 13:15:09.017299 4735 generic.go:334] "Generic (PLEG): container finished" podID="0e985c68-b9b2-4bb0-bf30-85abd2ea3c69" containerID="d8c14d818fd4ac6e52acfc43062590356f1e962dabe5c30592a2eb3027552ffb" exitCode=0 Oct 08 13:15:09 crc kubenswrapper[4735]: I1008 13:15:09.017348 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-pk6wn" event={"ID":"0e985c68-b9b2-4bb0-bf30-85abd2ea3c69","Type":"ContainerDied","Data":"d8c14d818fd4ac6e52acfc43062590356f1e962dabe5c30592a2eb3027552ffb"} Oct 08 13:15:09 crc kubenswrapper[4735]: I1008 13:15:09.017394 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-pk6wn" event={"ID":"0e985c68-b9b2-4bb0-bf30-85abd2ea3c69","Type":"ContainerStarted","Data":"00d38d09950fcc257d1f5a0f77cf52591d427541fae59b35e285d6de105f3133"} Oct 08 13:15:13 crc kubenswrapper[4735]: I1008 13:15:12.602974 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-pk6wn" Oct 08 13:15:13 crc kubenswrapper[4735]: I1008 13:15:12.609287 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-qcsss" Oct 08 13:15:13 crc kubenswrapper[4735]: I1008 13:15:12.622504 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332155-s58r2" Oct 08 13:15:13 crc kubenswrapper[4735]: I1008 13:15:12.653849 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-bl6gn" Oct 08 13:15:13 crc kubenswrapper[4735]: I1008 13:15:12.752009 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j79qq\" (UniqueName: \"kubernetes.io/projected/0d8f3c97-c358-4c3f-93fb-063e28ea1ccd-kube-api-access-j79qq\") pod \"0d8f3c97-c358-4c3f-93fb-063e28ea1ccd\" (UID: \"0d8f3c97-c358-4c3f-93fb-063e28ea1ccd\") " Oct 08 13:15:13 crc kubenswrapper[4735]: I1008 13:15:12.752533 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9dd65ea9-bdfb-4d1d-b09e-816693069b07-config-volume\") pod \"9dd65ea9-bdfb-4d1d-b09e-816693069b07\" (UID: \"9dd65ea9-bdfb-4d1d-b09e-816693069b07\") " Oct 08 13:15:13 crc kubenswrapper[4735]: I1008 13:15:12.752598 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hfp5\" (UniqueName: \"kubernetes.io/projected/0e985c68-b9b2-4bb0-bf30-85abd2ea3c69-kube-api-access-4hfp5\") pod \"0e985c68-b9b2-4bb0-bf30-85abd2ea3c69\" (UID: \"0e985c68-b9b2-4bb0-bf30-85abd2ea3c69\") " Oct 08 13:15:13 crc kubenswrapper[4735]: I1008 13:15:12.752631 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9dd65ea9-bdfb-4d1d-b09e-816693069b07-secret-volume\") pod \"9dd65ea9-bdfb-4d1d-b09e-816693069b07\" (UID: \"9dd65ea9-bdfb-4d1d-b09e-816693069b07\") " Oct 08 13:15:13 crc kubenswrapper[4735]: I1008 13:15:12.752657 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnz4q\" (UniqueName: \"kubernetes.io/projected/9dd65ea9-bdfb-4d1d-b09e-816693069b07-kube-api-access-jnz4q\") pod 
\"9dd65ea9-bdfb-4d1d-b09e-816693069b07\" (UID: \"9dd65ea9-bdfb-4d1d-b09e-816693069b07\") " Oct 08 13:15:13 crc kubenswrapper[4735]: I1008 13:15:12.752688 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8tmd\" (UniqueName: \"kubernetes.io/projected/0b3632f1-e8ac-49f8-8cfe-c54bcf7b86bc-kube-api-access-v8tmd\") pod \"0b3632f1-e8ac-49f8-8cfe-c54bcf7b86bc\" (UID: \"0b3632f1-e8ac-49f8-8cfe-c54bcf7b86bc\") " Oct 08 13:15:13 crc kubenswrapper[4735]: I1008 13:15:12.753046 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9dd65ea9-bdfb-4d1d-b09e-816693069b07-config-volume" (OuterVolumeSpecName: "config-volume") pod "9dd65ea9-bdfb-4d1d-b09e-816693069b07" (UID: "9dd65ea9-bdfb-4d1d-b09e-816693069b07"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:15:13 crc kubenswrapper[4735]: I1008 13:15:12.756661 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b3632f1-e8ac-49f8-8cfe-c54bcf7b86bc-kube-api-access-v8tmd" (OuterVolumeSpecName: "kube-api-access-v8tmd") pod "0b3632f1-e8ac-49f8-8cfe-c54bcf7b86bc" (UID: "0b3632f1-e8ac-49f8-8cfe-c54bcf7b86bc"). InnerVolumeSpecName "kube-api-access-v8tmd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:15:13 crc kubenswrapper[4735]: I1008 13:15:12.757168 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9dd65ea9-bdfb-4d1d-b09e-816693069b07-kube-api-access-jnz4q" (OuterVolumeSpecName: "kube-api-access-jnz4q") pod "9dd65ea9-bdfb-4d1d-b09e-816693069b07" (UID: "9dd65ea9-bdfb-4d1d-b09e-816693069b07"). InnerVolumeSpecName "kube-api-access-jnz4q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:15:13 crc kubenswrapper[4735]: I1008 13:15:12.757278 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dd65ea9-bdfb-4d1d-b09e-816693069b07-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9dd65ea9-bdfb-4d1d-b09e-816693069b07" (UID: "9dd65ea9-bdfb-4d1d-b09e-816693069b07"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:15:13 crc kubenswrapper[4735]: I1008 13:15:12.757297 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d8f3c97-c358-4c3f-93fb-063e28ea1ccd-kube-api-access-j79qq" (OuterVolumeSpecName: "kube-api-access-j79qq") pod "0d8f3c97-c358-4c3f-93fb-063e28ea1ccd" (UID: "0d8f3c97-c358-4c3f-93fb-063e28ea1ccd"). InnerVolumeSpecName "kube-api-access-j79qq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:15:13 crc kubenswrapper[4735]: I1008 13:15:12.757364 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e985c68-b9b2-4bb0-bf30-85abd2ea3c69-kube-api-access-4hfp5" (OuterVolumeSpecName: "kube-api-access-4hfp5") pod "0e985c68-b9b2-4bb0-bf30-85abd2ea3c69" (UID: "0e985c68-b9b2-4bb0-bf30-85abd2ea3c69"). InnerVolumeSpecName "kube-api-access-4hfp5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:15:13 crc kubenswrapper[4735]: I1008 13:15:12.854332 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j79qq\" (UniqueName: \"kubernetes.io/projected/0d8f3c97-c358-4c3f-93fb-063e28ea1ccd-kube-api-access-j79qq\") on node \"crc\" DevicePath \"\"" Oct 08 13:15:13 crc kubenswrapper[4735]: I1008 13:15:12.854367 4735 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9dd65ea9-bdfb-4d1d-b09e-816693069b07-config-volume\") on node \"crc\" DevicePath \"\"" Oct 08 13:15:13 crc kubenswrapper[4735]: I1008 13:15:12.854377 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hfp5\" (UniqueName: \"kubernetes.io/projected/0e985c68-b9b2-4bb0-bf30-85abd2ea3c69-kube-api-access-4hfp5\") on node \"crc\" DevicePath \"\"" Oct 08 13:15:13 crc kubenswrapper[4735]: I1008 13:15:12.854386 4735 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9dd65ea9-bdfb-4d1d-b09e-816693069b07-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 08 13:15:13 crc kubenswrapper[4735]: I1008 13:15:12.854395 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnz4q\" (UniqueName: \"kubernetes.io/projected/9dd65ea9-bdfb-4d1d-b09e-816693069b07-kube-api-access-jnz4q\") on node \"crc\" DevicePath \"\"" Oct 08 13:15:13 crc kubenswrapper[4735]: I1008 13:15:12.854404 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8tmd\" (UniqueName: \"kubernetes.io/projected/0b3632f1-e8ac-49f8-8cfe-c54bcf7b86bc-kube-api-access-v8tmd\") on node \"crc\" DevicePath \"\"" Oct 08 13:15:13 crc kubenswrapper[4735]: I1008 13:15:13.051964 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-qcsss" 
event={"ID":"0b3632f1-e8ac-49f8-8cfe-c54bcf7b86bc","Type":"ContainerDied","Data":"fc8fd2c16797123238c7c38a8d706f8a278cb3283e07b57795b8ac0cd02e1173"} Oct 08 13:15:13 crc kubenswrapper[4735]: I1008 13:15:13.051997 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc8fd2c16797123238c7c38a8d706f8a278cb3283e07b57795b8ac0cd02e1173" Oct 08 13:15:13 crc kubenswrapper[4735]: I1008 13:15:13.052065 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-qcsss" Oct 08 13:15:13 crc kubenswrapper[4735]: I1008 13:15:13.054059 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-pk6wn" event={"ID":"0e985c68-b9b2-4bb0-bf30-85abd2ea3c69","Type":"ContainerDied","Data":"00d38d09950fcc257d1f5a0f77cf52591d427541fae59b35e285d6de105f3133"} Oct 08 13:15:13 crc kubenswrapper[4735]: I1008 13:15:13.054081 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00d38d09950fcc257d1f5a0f77cf52591d427541fae59b35e285d6de105f3133" Oct 08 13:15:13 crc kubenswrapper[4735]: I1008 13:15:13.054128 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-pk6wn" Oct 08 13:15:13 crc kubenswrapper[4735]: I1008 13:15:13.056077 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332155-s58r2" event={"ID":"9dd65ea9-bdfb-4d1d-b09e-816693069b07","Type":"ContainerDied","Data":"59c34e115af69de6924bdcb649f405ddcfed9cdaae068fa2915218dee6c57931"} Oct 08 13:15:13 crc kubenswrapper[4735]: I1008 13:15:13.056119 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="59c34e115af69de6924bdcb649f405ddcfed9cdaae068fa2915218dee6c57931" Oct 08 13:15:13 crc kubenswrapper[4735]: I1008 13:15:13.056167 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332155-s58r2" Oct 08 13:15:13 crc kubenswrapper[4735]: I1008 13:15:13.058396 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-8ht8g" event={"ID":"b264ac2b-df16-4ddb-94ee-9d346b374ec7","Type":"ContainerStarted","Data":"bba8d83bf71746e7c9b712ac5f41c4a3c7f8f79408257bfec81776b07434b105"} Oct 08 13:15:13 crc kubenswrapper[4735]: I1008 13:15:13.060585 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-bl6gn" event={"ID":"0d8f3c97-c358-4c3f-93fb-063e28ea1ccd","Type":"ContainerDied","Data":"31196cf7533fc2dbe2f2b010bcfd917da439841c2eb46eea79c9146b5e1d002c"} Oct 08 13:15:13 crc kubenswrapper[4735]: I1008 13:15:13.060610 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31196cf7533fc2dbe2f2b010bcfd917da439841c2eb46eea79c9146b5e1d002c" Oct 08 13:15:13 crc kubenswrapper[4735]: I1008 13:15:13.060640 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-bl6gn" Oct 08 13:15:13 crc kubenswrapper[4735]: I1008 13:15:13.087546 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-8ht8g" podStartSLOduration=10.711406811 podStartE2EDuration="15.08752727s" podCreationTimestamp="2025-10-08 13:14:58 +0000 UTC" firstStartedPulling="2025-10-08 13:15:08.083243213 +0000 UTC m=+1157.891547757" lastFinishedPulling="2025-10-08 13:15:12.459363672 +0000 UTC m=+1162.267668216" observedRunningTime="2025-10-08 13:15:13.081918172 +0000 UTC m=+1162.890222736" watchObservedRunningTime="2025-10-08 13:15:13.08752727 +0000 UTC m=+1162.895831824" Oct 08 13:15:18 crc kubenswrapper[4735]: I1008 13:15:18.832570 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-936c-account-create-rt7rb"] Oct 08 13:15:18 crc kubenswrapper[4735]: E1008 13:15:18.833345 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d8f3c97-c358-4c3f-93fb-063e28ea1ccd" containerName="mariadb-database-create" Oct 08 13:15:18 crc kubenswrapper[4735]: I1008 13:15:18.833356 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d8f3c97-c358-4c3f-93fb-063e28ea1ccd" containerName="mariadb-database-create" Oct 08 13:15:18 crc kubenswrapper[4735]: E1008 13:15:18.833369 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b3632f1-e8ac-49f8-8cfe-c54bcf7b86bc" containerName="mariadb-database-create" Oct 08 13:15:18 crc kubenswrapper[4735]: I1008 13:15:18.833376 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b3632f1-e8ac-49f8-8cfe-c54bcf7b86bc" containerName="mariadb-database-create" Oct 08 13:15:18 crc kubenswrapper[4735]: E1008 13:15:18.833399 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e985c68-b9b2-4bb0-bf30-85abd2ea3c69" containerName="mariadb-database-create" Oct 08 13:15:18 crc kubenswrapper[4735]: I1008 13:15:18.833406 4735 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="0e985c68-b9b2-4bb0-bf30-85abd2ea3c69" containerName="mariadb-database-create" Oct 08 13:15:18 crc kubenswrapper[4735]: E1008 13:15:18.833413 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dd65ea9-bdfb-4d1d-b09e-816693069b07" containerName="collect-profiles" Oct 08 13:15:18 crc kubenswrapper[4735]: I1008 13:15:18.833419 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dd65ea9-bdfb-4d1d-b09e-816693069b07" containerName="collect-profiles" Oct 08 13:15:18 crc kubenswrapper[4735]: E1008 13:15:18.833430 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b3f87a3-e916-4f02-9a1d-730a6bc5a71f" containerName="init" Oct 08 13:15:18 crc kubenswrapper[4735]: I1008 13:15:18.833435 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b3f87a3-e916-4f02-9a1d-730a6bc5a71f" containerName="init" Oct 08 13:15:18 crc kubenswrapper[4735]: E1008 13:15:18.833445 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b3f87a3-e916-4f02-9a1d-730a6bc5a71f" containerName="dnsmasq-dns" Oct 08 13:15:18 crc kubenswrapper[4735]: I1008 13:15:18.833451 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b3f87a3-e916-4f02-9a1d-730a6bc5a71f" containerName="dnsmasq-dns" Oct 08 13:15:18 crc kubenswrapper[4735]: I1008 13:15:18.833648 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b3f87a3-e916-4f02-9a1d-730a6bc5a71f" containerName="dnsmasq-dns" Oct 08 13:15:18 crc kubenswrapper[4735]: I1008 13:15:18.833665 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="9dd65ea9-bdfb-4d1d-b09e-816693069b07" containerName="collect-profiles" Oct 08 13:15:18 crc kubenswrapper[4735]: I1008 13:15:18.833685 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b3632f1-e8ac-49f8-8cfe-c54bcf7b86bc" containerName="mariadb-database-create" Oct 08 13:15:18 crc kubenswrapper[4735]: I1008 13:15:18.833698 4735 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="0e985c68-b9b2-4bb0-bf30-85abd2ea3c69" containerName="mariadb-database-create" Oct 08 13:15:18 crc kubenswrapper[4735]: I1008 13:15:18.833711 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d8f3c97-c358-4c3f-93fb-063e28ea1ccd" containerName="mariadb-database-create" Oct 08 13:15:18 crc kubenswrapper[4735]: I1008 13:15:18.834278 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-936c-account-create-rt7rb" Oct 08 13:15:18 crc kubenswrapper[4735]: I1008 13:15:18.836268 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 08 13:15:18 crc kubenswrapper[4735]: I1008 13:15:18.847221 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-936c-account-create-rt7rb"] Oct 08 13:15:18 crc kubenswrapper[4735]: I1008 13:15:18.949193 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bj4t2\" (UniqueName: \"kubernetes.io/projected/08919c9d-5826-4fdd-b4ea-e3d5ef927cc2-kube-api-access-bj4t2\") pod \"neutron-936c-account-create-rt7rb\" (UID: \"08919c9d-5826-4fdd-b4ea-e3d5ef927cc2\") " pod="openstack/neutron-936c-account-create-rt7rb" Oct 08 13:15:19 crc kubenswrapper[4735]: I1008 13:15:19.050185 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bj4t2\" (UniqueName: \"kubernetes.io/projected/08919c9d-5826-4fdd-b4ea-e3d5ef927cc2-kube-api-access-bj4t2\") pod \"neutron-936c-account-create-rt7rb\" (UID: \"08919c9d-5826-4fdd-b4ea-e3d5ef927cc2\") " pod="openstack/neutron-936c-account-create-rt7rb" Oct 08 13:15:19 crc kubenswrapper[4735]: I1008 13:15:19.073362 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bj4t2\" (UniqueName: \"kubernetes.io/projected/08919c9d-5826-4fdd-b4ea-e3d5ef927cc2-kube-api-access-bj4t2\") pod \"neutron-936c-account-create-rt7rb\" (UID: 
\"08919c9d-5826-4fdd-b4ea-e3d5ef927cc2\") " pod="openstack/neutron-936c-account-create-rt7rb" Oct 08 13:15:19 crc kubenswrapper[4735]: I1008 13:15:19.152059 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-936c-account-create-rt7rb" Oct 08 13:15:19 crc kubenswrapper[4735]: W1008 13:15:19.561277 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08919c9d_5826_4fdd_b4ea_e3d5ef927cc2.slice/crio-e8e1b6b63169ac64b1c5b47b7d1b6af42f3114c6258afe7e98c1425277e27011 WatchSource:0}: Error finding container e8e1b6b63169ac64b1c5b47b7d1b6af42f3114c6258afe7e98c1425277e27011: Status 404 returned error can't find the container with id e8e1b6b63169ac64b1c5b47b7d1b6af42f3114c6258afe7e98c1425277e27011 Oct 08 13:15:19 crc kubenswrapper[4735]: I1008 13:15:19.565139 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-936c-account-create-rt7rb"] Oct 08 13:15:20 crc kubenswrapper[4735]: I1008 13:15:20.115731 4735 generic.go:334] "Generic (PLEG): container finished" podID="b264ac2b-df16-4ddb-94ee-9d346b374ec7" containerID="bba8d83bf71746e7c9b712ac5f41c4a3c7f8f79408257bfec81776b07434b105" exitCode=0 Oct 08 13:15:20 crc kubenswrapper[4735]: I1008 13:15:20.115779 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-8ht8g" event={"ID":"b264ac2b-df16-4ddb-94ee-9d346b374ec7","Type":"ContainerDied","Data":"bba8d83bf71746e7c9b712ac5f41c4a3c7f8f79408257bfec81776b07434b105"} Oct 08 13:15:20 crc kubenswrapper[4735]: I1008 13:15:20.118271 4735 generic.go:334] "Generic (PLEG): container finished" podID="08919c9d-5826-4fdd-b4ea-e3d5ef927cc2" containerID="ad46567e2456e5a5fb2080fb61bdaa993642c8e9b3b19d2bc7ea50152d6ea11f" exitCode=0 Oct 08 13:15:20 crc kubenswrapper[4735]: I1008 13:15:20.118311 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-936c-account-create-rt7rb" 
event={"ID":"08919c9d-5826-4fdd-b4ea-e3d5ef927cc2","Type":"ContainerDied","Data":"ad46567e2456e5a5fb2080fb61bdaa993642c8e9b3b19d2bc7ea50152d6ea11f"} Oct 08 13:15:20 crc kubenswrapper[4735]: I1008 13:15:20.118331 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-936c-account-create-rt7rb" event={"ID":"08919c9d-5826-4fdd-b4ea-e3d5ef927cc2","Type":"ContainerStarted","Data":"e8e1b6b63169ac64b1c5b47b7d1b6af42f3114c6258afe7e98c1425277e27011"} Oct 08 13:15:21 crc kubenswrapper[4735]: I1008 13:15:21.461591 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-8ht8g" Oct 08 13:15:21 crc kubenswrapper[4735]: I1008 13:15:21.592165 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b264ac2b-df16-4ddb-94ee-9d346b374ec7-combined-ca-bundle\") pod \"b264ac2b-df16-4ddb-94ee-9d346b374ec7\" (UID: \"b264ac2b-df16-4ddb-94ee-9d346b374ec7\") " Oct 08 13:15:21 crc kubenswrapper[4735]: I1008 13:15:21.592650 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b264ac2b-df16-4ddb-94ee-9d346b374ec7-config-data\") pod \"b264ac2b-df16-4ddb-94ee-9d346b374ec7\" (UID: \"b264ac2b-df16-4ddb-94ee-9d346b374ec7\") " Oct 08 13:15:21 crc kubenswrapper[4735]: I1008 13:15:21.592708 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jb7nb\" (UniqueName: \"kubernetes.io/projected/b264ac2b-df16-4ddb-94ee-9d346b374ec7-kube-api-access-jb7nb\") pod \"b264ac2b-df16-4ddb-94ee-9d346b374ec7\" (UID: \"b264ac2b-df16-4ddb-94ee-9d346b374ec7\") " Oct 08 13:15:21 crc kubenswrapper[4735]: I1008 13:15:21.597779 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b264ac2b-df16-4ddb-94ee-9d346b374ec7-kube-api-access-jb7nb" (OuterVolumeSpecName: "kube-api-access-jb7nb") 
pod "b264ac2b-df16-4ddb-94ee-9d346b374ec7" (UID: "b264ac2b-df16-4ddb-94ee-9d346b374ec7"). InnerVolumeSpecName "kube-api-access-jb7nb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:15:21 crc kubenswrapper[4735]: I1008 13:15:21.626105 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b264ac2b-df16-4ddb-94ee-9d346b374ec7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b264ac2b-df16-4ddb-94ee-9d346b374ec7" (UID: "b264ac2b-df16-4ddb-94ee-9d346b374ec7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:15:21 crc kubenswrapper[4735]: I1008 13:15:21.666714 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b264ac2b-df16-4ddb-94ee-9d346b374ec7-config-data" (OuterVolumeSpecName: "config-data") pod "b264ac2b-df16-4ddb-94ee-9d346b374ec7" (UID: "b264ac2b-df16-4ddb-94ee-9d346b374ec7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:15:21 crc kubenswrapper[4735]: I1008 13:15:21.672684 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-936c-account-create-rt7rb" Oct 08 13:15:21 crc kubenswrapper[4735]: I1008 13:15:21.704485 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b264ac2b-df16-4ddb-94ee-9d346b374ec7-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 13:15:21 crc kubenswrapper[4735]: I1008 13:15:21.704513 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jb7nb\" (UniqueName: \"kubernetes.io/projected/b264ac2b-df16-4ddb-94ee-9d346b374ec7-kube-api-access-jb7nb\") on node \"crc\" DevicePath \"\"" Oct 08 13:15:21 crc kubenswrapper[4735]: I1008 13:15:21.704522 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b264ac2b-df16-4ddb-94ee-9d346b374ec7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 13:15:21 crc kubenswrapper[4735]: I1008 13:15:21.805421 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bj4t2\" (UniqueName: \"kubernetes.io/projected/08919c9d-5826-4fdd-b4ea-e3d5ef927cc2-kube-api-access-bj4t2\") pod \"08919c9d-5826-4fdd-b4ea-e3d5ef927cc2\" (UID: \"08919c9d-5826-4fdd-b4ea-e3d5ef927cc2\") " Oct 08 13:15:21 crc kubenswrapper[4735]: I1008 13:15:21.808602 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08919c9d-5826-4fdd-b4ea-e3d5ef927cc2-kube-api-access-bj4t2" (OuterVolumeSpecName: "kube-api-access-bj4t2") pod "08919c9d-5826-4fdd-b4ea-e3d5ef927cc2" (UID: "08919c9d-5826-4fdd-b4ea-e3d5ef927cc2"). InnerVolumeSpecName "kube-api-access-bj4t2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:15:21 crc kubenswrapper[4735]: I1008 13:15:21.907758 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bj4t2\" (UniqueName: \"kubernetes.io/projected/08919c9d-5826-4fdd-b4ea-e3d5ef927cc2-kube-api-access-bj4t2\") on node \"crc\" DevicePath \"\"" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.134974 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-936c-account-create-rt7rb" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.134966 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-936c-account-create-rt7rb" event={"ID":"08919c9d-5826-4fdd-b4ea-e3d5ef927cc2","Type":"ContainerDied","Data":"e8e1b6b63169ac64b1c5b47b7d1b6af42f3114c6258afe7e98c1425277e27011"} Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.135345 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8e1b6b63169ac64b1c5b47b7d1b6af42f3114c6258afe7e98c1425277e27011" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.136328 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-8ht8g" event={"ID":"b264ac2b-df16-4ddb-94ee-9d346b374ec7","Type":"ContainerDied","Data":"1787573f6f57f40961ef7c365ed90f0e381f0f021c219bc651cae9af05a61c2f"} Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.136370 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1787573f6f57f40961ef7c365ed90f0e381f0f021c219bc651cae9af05a61c2f" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.136394 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-8ht8g" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.407376 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6877b6c9cc-x6xff"] Oct 08 13:15:22 crc kubenswrapper[4735]: E1008 13:15:22.408260 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b264ac2b-df16-4ddb-94ee-9d346b374ec7" containerName="keystone-db-sync" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.408285 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="b264ac2b-df16-4ddb-94ee-9d346b374ec7" containerName="keystone-db-sync" Oct 08 13:15:22 crc kubenswrapper[4735]: E1008 13:15:22.408306 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08919c9d-5826-4fdd-b4ea-e3d5ef927cc2" containerName="mariadb-account-create" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.408332 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="08919c9d-5826-4fdd-b4ea-e3d5ef927cc2" containerName="mariadb-account-create" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.408691 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="08919c9d-5826-4fdd-b4ea-e3d5ef927cc2" containerName="mariadb-account-create" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.408709 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="b264ac2b-df16-4ddb-94ee-9d346b374ec7" containerName="keystone-db-sync" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.410167 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6877b6c9cc-x6xff" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.442404 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6877b6c9cc-x6xff"] Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.478398 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-dhcjz"] Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.482643 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-dhcjz" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.489661 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-zpn2p" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.489904 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.490157 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.490293 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.513625 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-dhcjz"] Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.522891 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/634d4dd4-e1e7-41f0-a708-83a796081e57-dns-swift-storage-0\") pod \"dnsmasq-dns-6877b6c9cc-x6xff\" (UID: \"634d4dd4-e1e7-41f0-a708-83a796081e57\") " pod="openstack/dnsmasq-dns-6877b6c9cc-x6xff" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.522989 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/634d4dd4-e1e7-41f0-a708-83a796081e57-dns-svc\") pod \"dnsmasq-dns-6877b6c9cc-x6xff\" (UID: \"634d4dd4-e1e7-41f0-a708-83a796081e57\") " pod="openstack/dnsmasq-dns-6877b6c9cc-x6xff" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.523019 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/634d4dd4-e1e7-41f0-a708-83a796081e57-ovsdbserver-sb\") pod \"dnsmasq-dns-6877b6c9cc-x6xff\" (UID: \"634d4dd4-e1e7-41f0-a708-83a796081e57\") " pod="openstack/dnsmasq-dns-6877b6c9cc-x6xff" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.523097 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/634d4dd4-e1e7-41f0-a708-83a796081e57-ovsdbserver-nb\") pod \"dnsmasq-dns-6877b6c9cc-x6xff\" (UID: \"634d4dd4-e1e7-41f0-a708-83a796081e57\") " pod="openstack/dnsmasq-dns-6877b6c9cc-x6xff" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.523147 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9f4m\" (UniqueName: \"kubernetes.io/projected/634d4dd4-e1e7-41f0-a708-83a796081e57-kube-api-access-f9f4m\") pod \"dnsmasq-dns-6877b6c9cc-x6xff\" (UID: \"634d4dd4-e1e7-41f0-a708-83a796081e57\") " pod="openstack/dnsmasq-dns-6877b6c9cc-x6xff" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.523214 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/634d4dd4-e1e7-41f0-a708-83a796081e57-config\") pod \"dnsmasq-dns-6877b6c9cc-x6xff\" (UID: \"634d4dd4-e1e7-41f0-a708-83a796081e57\") " pod="openstack/dnsmasq-dns-6877b6c9cc-x6xff" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.627530 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/a1e5dc90-6530-424d-8a2c-c22fe13c981d-scripts\") pod \"keystone-bootstrap-dhcjz\" (UID: \"a1e5dc90-6530-424d-8a2c-c22fe13c981d\") " pod="openstack/keystone-bootstrap-dhcjz" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.627592 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1e5dc90-6530-424d-8a2c-c22fe13c981d-config-data\") pod \"keystone-bootstrap-dhcjz\" (UID: \"a1e5dc90-6530-424d-8a2c-c22fe13c981d\") " pod="openstack/keystone-bootstrap-dhcjz" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.627614 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44qd6\" (UniqueName: \"kubernetes.io/projected/a1e5dc90-6530-424d-8a2c-c22fe13c981d-kube-api-access-44qd6\") pod \"keystone-bootstrap-dhcjz\" (UID: \"a1e5dc90-6530-424d-8a2c-c22fe13c981d\") " pod="openstack/keystone-bootstrap-dhcjz" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.627698 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/634d4dd4-e1e7-41f0-a708-83a796081e57-ovsdbserver-nb\") pod \"dnsmasq-dns-6877b6c9cc-x6xff\" (UID: \"634d4dd4-e1e7-41f0-a708-83a796081e57\") " pod="openstack/dnsmasq-dns-6877b6c9cc-x6xff" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.627761 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9f4m\" (UniqueName: \"kubernetes.io/projected/634d4dd4-e1e7-41f0-a708-83a796081e57-kube-api-access-f9f4m\") pod \"dnsmasq-dns-6877b6c9cc-x6xff\" (UID: \"634d4dd4-e1e7-41f0-a708-83a796081e57\") " pod="openstack/dnsmasq-dns-6877b6c9cc-x6xff" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.627811 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/634d4dd4-e1e7-41f0-a708-83a796081e57-config\") pod \"dnsmasq-dns-6877b6c9cc-x6xff\" (UID: \"634d4dd4-e1e7-41f0-a708-83a796081e57\") " pod="openstack/dnsmasq-dns-6877b6c9cc-x6xff" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.627907 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1e5dc90-6530-424d-8a2c-c22fe13c981d-combined-ca-bundle\") pod \"keystone-bootstrap-dhcjz\" (UID: \"a1e5dc90-6530-424d-8a2c-c22fe13c981d\") " pod="openstack/keystone-bootstrap-dhcjz" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.627938 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a1e5dc90-6530-424d-8a2c-c22fe13c981d-credential-keys\") pod \"keystone-bootstrap-dhcjz\" (UID: \"a1e5dc90-6530-424d-8a2c-c22fe13c981d\") " pod="openstack/keystone-bootstrap-dhcjz" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.628034 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a1e5dc90-6530-424d-8a2c-c22fe13c981d-fernet-keys\") pod \"keystone-bootstrap-dhcjz\" (UID: \"a1e5dc90-6530-424d-8a2c-c22fe13c981d\") " pod="openstack/keystone-bootstrap-dhcjz" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.628065 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/634d4dd4-e1e7-41f0-a708-83a796081e57-dns-swift-storage-0\") pod \"dnsmasq-dns-6877b6c9cc-x6xff\" (UID: \"634d4dd4-e1e7-41f0-a708-83a796081e57\") " pod="openstack/dnsmasq-dns-6877b6c9cc-x6xff" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.628140 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/634d4dd4-e1e7-41f0-a708-83a796081e57-dns-svc\") pod \"dnsmasq-dns-6877b6c9cc-x6xff\" (UID: \"634d4dd4-e1e7-41f0-a708-83a796081e57\") " pod="openstack/dnsmasq-dns-6877b6c9cc-x6xff" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.628177 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/634d4dd4-e1e7-41f0-a708-83a796081e57-ovsdbserver-sb\") pod \"dnsmasq-dns-6877b6c9cc-x6xff\" (UID: \"634d4dd4-e1e7-41f0-a708-83a796081e57\") " pod="openstack/dnsmasq-dns-6877b6c9cc-x6xff" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.628947 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/634d4dd4-e1e7-41f0-a708-83a796081e57-dns-swift-storage-0\") pod \"dnsmasq-dns-6877b6c9cc-x6xff\" (UID: \"634d4dd4-e1e7-41f0-a708-83a796081e57\") " pod="openstack/dnsmasq-dns-6877b6c9cc-x6xff" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.629142 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/634d4dd4-e1e7-41f0-a708-83a796081e57-config\") pod \"dnsmasq-dns-6877b6c9cc-x6xff\" (UID: \"634d4dd4-e1e7-41f0-a708-83a796081e57\") " pod="openstack/dnsmasq-dns-6877b6c9cc-x6xff" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.630925 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/634d4dd4-e1e7-41f0-a708-83a796081e57-dns-svc\") pod \"dnsmasq-dns-6877b6c9cc-x6xff\" (UID: \"634d4dd4-e1e7-41f0-a708-83a796081e57\") " pod="openstack/dnsmasq-dns-6877b6c9cc-x6xff" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.631410 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/634d4dd4-e1e7-41f0-a708-83a796081e57-ovsdbserver-nb\") pod 
\"dnsmasq-dns-6877b6c9cc-x6xff\" (UID: \"634d4dd4-e1e7-41f0-a708-83a796081e57\") " pod="openstack/dnsmasq-dns-6877b6c9cc-x6xff" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.634314 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/634d4dd4-e1e7-41f0-a708-83a796081e57-ovsdbserver-sb\") pod \"dnsmasq-dns-6877b6c9cc-x6xff\" (UID: \"634d4dd4-e1e7-41f0-a708-83a796081e57\") " pod="openstack/dnsmasq-dns-6877b6c9cc-x6xff" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.665987 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.667875 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.672886 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.673190 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.674766 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9f4m\" (UniqueName: \"kubernetes.io/projected/634d4dd4-e1e7-41f0-a708-83a796081e57-kube-api-access-f9f4m\") pod \"dnsmasq-dns-6877b6c9cc-x6xff\" (UID: \"634d4dd4-e1e7-41f0-a708-83a796081e57\") " pod="openstack/dnsmasq-dns-6877b6c9cc-x6xff" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.712373 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.732458 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1e5dc90-6530-424d-8a2c-c22fe13c981d-combined-ca-bundle\") pod \"keystone-bootstrap-dhcjz\" (UID: 
\"a1e5dc90-6530-424d-8a2c-c22fe13c981d\") " pod="openstack/keystone-bootstrap-dhcjz" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.732534 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a1e5dc90-6530-424d-8a2c-c22fe13c981d-credential-keys\") pod \"keystone-bootstrap-dhcjz\" (UID: \"a1e5dc90-6530-424d-8a2c-c22fe13c981d\") " pod="openstack/keystone-bootstrap-dhcjz" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.732629 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a1e5dc90-6530-424d-8a2c-c22fe13c981d-fernet-keys\") pod \"keystone-bootstrap-dhcjz\" (UID: \"a1e5dc90-6530-424d-8a2c-c22fe13c981d\") " pod="openstack/keystone-bootstrap-dhcjz" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.732681 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1e5dc90-6530-424d-8a2c-c22fe13c981d-scripts\") pod \"keystone-bootstrap-dhcjz\" (UID: \"a1e5dc90-6530-424d-8a2c-c22fe13c981d\") " pod="openstack/keystone-bootstrap-dhcjz" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.732708 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1e5dc90-6530-424d-8a2c-c22fe13c981d-config-data\") pod \"keystone-bootstrap-dhcjz\" (UID: \"a1e5dc90-6530-424d-8a2c-c22fe13c981d\") " pod="openstack/keystone-bootstrap-dhcjz" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.732728 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44qd6\" (UniqueName: \"kubernetes.io/projected/a1e5dc90-6530-424d-8a2c-c22fe13c981d-kube-api-access-44qd6\") pod \"keystone-bootstrap-dhcjz\" (UID: \"a1e5dc90-6530-424d-8a2c-c22fe13c981d\") " pod="openstack/keystone-bootstrap-dhcjz" Oct 08 13:15:22 crc 
kubenswrapper[4735]: I1008 13:15:22.756906 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6877b6c9cc-x6xff" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.757947 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1e5dc90-6530-424d-8a2c-c22fe13c981d-scripts\") pod \"keystone-bootstrap-dhcjz\" (UID: \"a1e5dc90-6530-424d-8a2c-c22fe13c981d\") " pod="openstack/keystone-bootstrap-dhcjz" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.759053 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a1e5dc90-6530-424d-8a2c-c22fe13c981d-fernet-keys\") pod \"keystone-bootstrap-dhcjz\" (UID: \"a1e5dc90-6530-424d-8a2c-c22fe13c981d\") " pod="openstack/keystone-bootstrap-dhcjz" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.759256 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1e5dc90-6530-424d-8a2c-c22fe13c981d-combined-ca-bundle\") pod \"keystone-bootstrap-dhcjz\" (UID: \"a1e5dc90-6530-424d-8a2c-c22fe13c981d\") " pod="openstack/keystone-bootstrap-dhcjz" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.794222 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1e5dc90-6530-424d-8a2c-c22fe13c981d-config-data\") pod \"keystone-bootstrap-dhcjz\" (UID: \"a1e5dc90-6530-424d-8a2c-c22fe13c981d\") " pod="openstack/keystone-bootstrap-dhcjz" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.798524 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44qd6\" (UniqueName: \"kubernetes.io/projected/a1e5dc90-6530-424d-8a2c-c22fe13c981d-kube-api-access-44qd6\") pod \"keystone-bootstrap-dhcjz\" (UID: \"a1e5dc90-6530-424d-8a2c-c22fe13c981d\") " 
pod="openstack/keystone-bootstrap-dhcjz" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.830839 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-fz5qt"] Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.849109 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-fz5qt" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.851202 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/328b1aa0-7e48-4aa0-b456-e1beefb886c0-config-data\") pod \"ceilometer-0\" (UID: \"328b1aa0-7e48-4aa0-b456-e1beefb886c0\") " pod="openstack/ceilometer-0" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.851282 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dx6hf\" (UniqueName: \"kubernetes.io/projected/328b1aa0-7e48-4aa0-b456-e1beefb886c0-kube-api-access-dx6hf\") pod \"ceilometer-0\" (UID: \"328b1aa0-7e48-4aa0-b456-e1beefb886c0\") " pod="openstack/ceilometer-0" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.851440 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/328b1aa0-7e48-4aa0-b456-e1beefb886c0-log-httpd\") pod \"ceilometer-0\" (UID: \"328b1aa0-7e48-4aa0-b456-e1beefb886c0\") " pod="openstack/ceilometer-0" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.851659 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/328b1aa0-7e48-4aa0-b456-e1beefb886c0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"328b1aa0-7e48-4aa0-b456-e1beefb886c0\") " pod="openstack/ceilometer-0" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.851707 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/328b1aa0-7e48-4aa0-b456-e1beefb886c0-scripts\") pod \"ceilometer-0\" (UID: \"328b1aa0-7e48-4aa0-b456-e1beefb886c0\") " pod="openstack/ceilometer-0" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.851733 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/328b1aa0-7e48-4aa0-b456-e1beefb886c0-run-httpd\") pod \"ceilometer-0\" (UID: \"328b1aa0-7e48-4aa0-b456-e1beefb886c0\") " pod="openstack/ceilometer-0" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.851800 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/328b1aa0-7e48-4aa0-b456-e1beefb886c0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"328b1aa0-7e48-4aa0-b456-e1beefb886c0\") " pod="openstack/ceilometer-0" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.858334 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.858674 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.859287 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-g2lts" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.864045 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a1e5dc90-6530-424d-8a2c-c22fe13c981d-credential-keys\") pod \"keystone-bootstrap-dhcjz\" (UID: \"a1e5dc90-6530-424d-8a2c-c22fe13c981d\") " pod="openstack/keystone-bootstrap-dhcjz" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.893619 4735 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/dnsmasq-dns-6877b6c9cc-x6xff"] Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.935897 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-fz5qt"] Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.939184 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7d96c67b5-zkqh2"] Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.941096 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d96c67b5-zkqh2" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.948251 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d96c67b5-zkqh2"] Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.961230 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb6559ca-82f8-4271-bb6b-47f154089579-config-data\") pod \"placement-db-sync-fz5qt\" (UID: \"cb6559ca-82f8-4271-bb6b-47f154089579\") " pod="openstack/placement-db-sync-fz5qt" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.961295 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/328b1aa0-7e48-4aa0-b456-e1beefb886c0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"328b1aa0-7e48-4aa0-b456-e1beefb886c0\") " pod="openstack/ceilometer-0" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.961322 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb6559ca-82f8-4271-bb6b-47f154089579-logs\") pod \"placement-db-sync-fz5qt\" (UID: \"cb6559ca-82f8-4271-bb6b-47f154089579\") " pod="openstack/placement-db-sync-fz5qt" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.961338 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/328b1aa0-7e48-4aa0-b456-e1beefb886c0-scripts\") pod \"ceilometer-0\" (UID: \"328b1aa0-7e48-4aa0-b456-e1beefb886c0\") " pod="openstack/ceilometer-0" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.961355 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb6559ca-82f8-4271-bb6b-47f154089579-combined-ca-bundle\") pod \"placement-db-sync-fz5qt\" (UID: \"cb6559ca-82f8-4271-bb6b-47f154089579\") " pod="openstack/placement-db-sync-fz5qt" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.961371 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/328b1aa0-7e48-4aa0-b456-e1beefb886c0-run-httpd\") pod \"ceilometer-0\" (UID: \"328b1aa0-7e48-4aa0-b456-e1beefb886c0\") " pod="openstack/ceilometer-0" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.961403 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsf7k\" (UniqueName: \"kubernetes.io/projected/cb6559ca-82f8-4271-bb6b-47f154089579-kube-api-access-vsf7k\") pod \"placement-db-sync-fz5qt\" (UID: \"cb6559ca-82f8-4271-bb6b-47f154089579\") " pod="openstack/placement-db-sync-fz5qt" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.961439 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/328b1aa0-7e48-4aa0-b456-e1beefb886c0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"328b1aa0-7e48-4aa0-b456-e1beefb886c0\") " pod="openstack/ceilometer-0" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.961580 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/328b1aa0-7e48-4aa0-b456-e1beefb886c0-config-data\") pod \"ceilometer-0\" (UID: 
\"328b1aa0-7e48-4aa0-b456-e1beefb886c0\") " pod="openstack/ceilometer-0" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.961607 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dx6hf\" (UniqueName: \"kubernetes.io/projected/328b1aa0-7e48-4aa0-b456-e1beefb886c0-kube-api-access-dx6hf\") pod \"ceilometer-0\" (UID: \"328b1aa0-7e48-4aa0-b456-e1beefb886c0\") " pod="openstack/ceilometer-0" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.961664 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/328b1aa0-7e48-4aa0-b456-e1beefb886c0-log-httpd\") pod \"ceilometer-0\" (UID: \"328b1aa0-7e48-4aa0-b456-e1beefb886c0\") " pod="openstack/ceilometer-0" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.961708 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb6559ca-82f8-4271-bb6b-47f154089579-scripts\") pod \"placement-db-sync-fz5qt\" (UID: \"cb6559ca-82f8-4271-bb6b-47f154089579\") " pod="openstack/placement-db-sync-fz5qt" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.973104 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/328b1aa0-7e48-4aa0-b456-e1beefb886c0-log-httpd\") pod \"ceilometer-0\" (UID: \"328b1aa0-7e48-4aa0-b456-e1beefb886c0\") " pod="openstack/ceilometer-0" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.973643 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/328b1aa0-7e48-4aa0-b456-e1beefb886c0-run-httpd\") pod \"ceilometer-0\" (UID: \"328b1aa0-7e48-4aa0-b456-e1beefb886c0\") " pod="openstack/ceilometer-0" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.978556 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/328b1aa0-7e48-4aa0-b456-e1beefb886c0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"328b1aa0-7e48-4aa0-b456-e1beefb886c0\") " pod="openstack/ceilometer-0" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.979341 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/328b1aa0-7e48-4aa0-b456-e1beefb886c0-scripts\") pod \"ceilometer-0\" (UID: \"328b1aa0-7e48-4aa0-b456-e1beefb886c0\") " pod="openstack/ceilometer-0" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.987523 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/328b1aa0-7e48-4aa0-b456-e1beefb886c0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"328b1aa0-7e48-4aa0-b456-e1beefb886c0\") " pod="openstack/ceilometer-0" Oct 08 13:15:22 crc kubenswrapper[4735]: I1008 13:15:22.992466 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/328b1aa0-7e48-4aa0-b456-e1beefb886c0-config-data\") pod \"ceilometer-0\" (UID: \"328b1aa0-7e48-4aa0-b456-e1beefb886c0\") " pod="openstack/ceilometer-0" Oct 08 13:15:23 crc kubenswrapper[4735]: I1008 13:15:23.013128 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dx6hf\" (UniqueName: \"kubernetes.io/projected/328b1aa0-7e48-4aa0-b456-e1beefb886c0-kube-api-access-dx6hf\") pod \"ceilometer-0\" (UID: \"328b1aa0-7e48-4aa0-b456-e1beefb886c0\") " pod="openstack/ceilometer-0" Oct 08 13:15:23 crc kubenswrapper[4735]: I1008 13:15:23.062984 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b5564411-c04a-4da8-a5c9-c38815d6aa2a-dns-swift-storage-0\") pod \"dnsmasq-dns-7d96c67b5-zkqh2\" (UID: \"b5564411-c04a-4da8-a5c9-c38815d6aa2a\") " 
pod="openstack/dnsmasq-dns-7d96c67b5-zkqh2" Oct 08 13:15:23 crc kubenswrapper[4735]: I1008 13:15:23.063054 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb6559ca-82f8-4271-bb6b-47f154089579-combined-ca-bundle\") pod \"placement-db-sync-fz5qt\" (UID: \"cb6559ca-82f8-4271-bb6b-47f154089579\") " pod="openstack/placement-db-sync-fz5qt" Oct 08 13:15:23 crc kubenswrapper[4735]: I1008 13:15:23.063088 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsf7k\" (UniqueName: \"kubernetes.io/projected/cb6559ca-82f8-4271-bb6b-47f154089579-kube-api-access-vsf7k\") pod \"placement-db-sync-fz5qt\" (UID: \"cb6559ca-82f8-4271-bb6b-47f154089579\") " pod="openstack/placement-db-sync-fz5qt" Oct 08 13:15:23 crc kubenswrapper[4735]: I1008 13:15:23.063138 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5564411-c04a-4da8-a5c9-c38815d6aa2a-dns-svc\") pod \"dnsmasq-dns-7d96c67b5-zkqh2\" (UID: \"b5564411-c04a-4da8-a5c9-c38815d6aa2a\") " pod="openstack/dnsmasq-dns-7d96c67b5-zkqh2" Oct 08 13:15:23 crc kubenswrapper[4735]: I1008 13:15:23.063181 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8l5rs\" (UniqueName: \"kubernetes.io/projected/b5564411-c04a-4da8-a5c9-c38815d6aa2a-kube-api-access-8l5rs\") pod \"dnsmasq-dns-7d96c67b5-zkqh2\" (UID: \"b5564411-c04a-4da8-a5c9-c38815d6aa2a\") " pod="openstack/dnsmasq-dns-7d96c67b5-zkqh2" Oct 08 13:15:23 crc kubenswrapper[4735]: I1008 13:15:23.063244 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5564411-c04a-4da8-a5c9-c38815d6aa2a-config\") pod \"dnsmasq-dns-7d96c67b5-zkqh2\" (UID: \"b5564411-c04a-4da8-a5c9-c38815d6aa2a\") " 
pod="openstack/dnsmasq-dns-7d96c67b5-zkqh2" Oct 08 13:15:23 crc kubenswrapper[4735]: I1008 13:15:23.063265 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5564411-c04a-4da8-a5c9-c38815d6aa2a-ovsdbserver-nb\") pod \"dnsmasq-dns-7d96c67b5-zkqh2\" (UID: \"b5564411-c04a-4da8-a5c9-c38815d6aa2a\") " pod="openstack/dnsmasq-dns-7d96c67b5-zkqh2" Oct 08 13:15:23 crc kubenswrapper[4735]: I1008 13:15:23.063340 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb6559ca-82f8-4271-bb6b-47f154089579-scripts\") pod \"placement-db-sync-fz5qt\" (UID: \"cb6559ca-82f8-4271-bb6b-47f154089579\") " pod="openstack/placement-db-sync-fz5qt" Oct 08 13:15:23 crc kubenswrapper[4735]: I1008 13:15:23.063382 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5564411-c04a-4da8-a5c9-c38815d6aa2a-ovsdbserver-sb\") pod \"dnsmasq-dns-7d96c67b5-zkqh2\" (UID: \"b5564411-c04a-4da8-a5c9-c38815d6aa2a\") " pod="openstack/dnsmasq-dns-7d96c67b5-zkqh2" Oct 08 13:15:23 crc kubenswrapper[4735]: I1008 13:15:23.063418 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb6559ca-82f8-4271-bb6b-47f154089579-config-data\") pod \"placement-db-sync-fz5qt\" (UID: \"cb6559ca-82f8-4271-bb6b-47f154089579\") " pod="openstack/placement-db-sync-fz5qt" Oct 08 13:15:23 crc kubenswrapper[4735]: I1008 13:15:23.063456 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb6559ca-82f8-4271-bb6b-47f154089579-logs\") pod \"placement-db-sync-fz5qt\" (UID: \"cb6559ca-82f8-4271-bb6b-47f154089579\") " pod="openstack/placement-db-sync-fz5qt" Oct 08 13:15:23 crc kubenswrapper[4735]: I1008 
13:15:23.063945 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb6559ca-82f8-4271-bb6b-47f154089579-logs\") pod \"placement-db-sync-fz5qt\" (UID: \"cb6559ca-82f8-4271-bb6b-47f154089579\") " pod="openstack/placement-db-sync-fz5qt" Oct 08 13:15:23 crc kubenswrapper[4735]: I1008 13:15:23.068794 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb6559ca-82f8-4271-bb6b-47f154089579-combined-ca-bundle\") pod \"placement-db-sync-fz5qt\" (UID: \"cb6559ca-82f8-4271-bb6b-47f154089579\") " pod="openstack/placement-db-sync-fz5qt" Oct 08 13:15:23 crc kubenswrapper[4735]: I1008 13:15:23.068849 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb6559ca-82f8-4271-bb6b-47f154089579-scripts\") pod \"placement-db-sync-fz5qt\" (UID: \"cb6559ca-82f8-4271-bb6b-47f154089579\") " pod="openstack/placement-db-sync-fz5qt" Oct 08 13:15:23 crc kubenswrapper[4735]: I1008 13:15:23.070219 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb6559ca-82f8-4271-bb6b-47f154089579-config-data\") pod \"placement-db-sync-fz5qt\" (UID: \"cb6559ca-82f8-4271-bb6b-47f154089579\") " pod="openstack/placement-db-sync-fz5qt" Oct 08 13:15:23 crc kubenswrapper[4735]: I1008 13:15:23.084723 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsf7k\" (UniqueName: \"kubernetes.io/projected/cb6559ca-82f8-4271-bb6b-47f154089579-kube-api-access-vsf7k\") pod \"placement-db-sync-fz5qt\" (UID: \"cb6559ca-82f8-4271-bb6b-47f154089579\") " pod="openstack/placement-db-sync-fz5qt" Oct 08 13:15:23 crc kubenswrapper[4735]: I1008 13:15:23.123863 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-dhcjz" Oct 08 13:15:23 crc kubenswrapper[4735]: I1008 13:15:23.151136 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-c8pnx" event={"ID":"8f8ee345-a4ac-49aa-9eed-32a7cffb2790","Type":"ContainerStarted","Data":"ba1c205cb24ae21c686b973dc27c7541bf68b8cd19fdea9dbc395f5136814148"} Oct 08 13:15:23 crc kubenswrapper[4735]: I1008 13:15:23.195233 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b5564411-c04a-4da8-a5c9-c38815d6aa2a-dns-swift-storage-0\") pod \"dnsmasq-dns-7d96c67b5-zkqh2\" (UID: \"b5564411-c04a-4da8-a5c9-c38815d6aa2a\") " pod="openstack/dnsmasq-dns-7d96c67b5-zkqh2" Oct 08 13:15:23 crc kubenswrapper[4735]: I1008 13:15:23.195322 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5564411-c04a-4da8-a5c9-c38815d6aa2a-dns-svc\") pod \"dnsmasq-dns-7d96c67b5-zkqh2\" (UID: \"b5564411-c04a-4da8-a5c9-c38815d6aa2a\") " pod="openstack/dnsmasq-dns-7d96c67b5-zkqh2" Oct 08 13:15:23 crc kubenswrapper[4735]: I1008 13:15:23.195369 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8l5rs\" (UniqueName: \"kubernetes.io/projected/b5564411-c04a-4da8-a5c9-c38815d6aa2a-kube-api-access-8l5rs\") pod \"dnsmasq-dns-7d96c67b5-zkqh2\" (UID: \"b5564411-c04a-4da8-a5c9-c38815d6aa2a\") " pod="openstack/dnsmasq-dns-7d96c67b5-zkqh2" Oct 08 13:15:23 crc kubenswrapper[4735]: I1008 13:15:23.195431 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5564411-c04a-4da8-a5c9-c38815d6aa2a-config\") pod \"dnsmasq-dns-7d96c67b5-zkqh2\" (UID: \"b5564411-c04a-4da8-a5c9-c38815d6aa2a\") " pod="openstack/dnsmasq-dns-7d96c67b5-zkqh2" Oct 08 13:15:23 crc kubenswrapper[4735]: I1008 13:15:23.195457 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5564411-c04a-4da8-a5c9-c38815d6aa2a-ovsdbserver-nb\") pod \"dnsmasq-dns-7d96c67b5-zkqh2\" (UID: \"b5564411-c04a-4da8-a5c9-c38815d6aa2a\") " pod="openstack/dnsmasq-dns-7d96c67b5-zkqh2" Oct 08 13:15:23 crc kubenswrapper[4735]: I1008 13:15:23.195549 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5564411-c04a-4da8-a5c9-c38815d6aa2a-ovsdbserver-sb\") pod \"dnsmasq-dns-7d96c67b5-zkqh2\" (UID: \"b5564411-c04a-4da8-a5c9-c38815d6aa2a\") " pod="openstack/dnsmasq-dns-7d96c67b5-zkqh2" Oct 08 13:15:23 crc kubenswrapper[4735]: I1008 13:15:23.196293 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b5564411-c04a-4da8-a5c9-c38815d6aa2a-dns-swift-storage-0\") pod \"dnsmasq-dns-7d96c67b5-zkqh2\" (UID: \"b5564411-c04a-4da8-a5c9-c38815d6aa2a\") " pod="openstack/dnsmasq-dns-7d96c67b5-zkqh2" Oct 08 13:15:23 crc kubenswrapper[4735]: I1008 13:15:23.196375 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5564411-c04a-4da8-a5c9-c38815d6aa2a-ovsdbserver-sb\") pod \"dnsmasq-dns-7d96c67b5-zkqh2\" (UID: \"b5564411-c04a-4da8-a5c9-c38815d6aa2a\") " pod="openstack/dnsmasq-dns-7d96c67b5-zkqh2" Oct 08 13:15:23 crc kubenswrapper[4735]: I1008 13:15:23.196528 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5564411-c04a-4da8-a5c9-c38815d6aa2a-dns-svc\") pod \"dnsmasq-dns-7d96c67b5-zkqh2\" (UID: \"b5564411-c04a-4da8-a5c9-c38815d6aa2a\") " pod="openstack/dnsmasq-dns-7d96c67b5-zkqh2" Oct 08 13:15:23 crc kubenswrapper[4735]: I1008 13:15:23.196516 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/glance-db-sync-c8pnx" podStartSLOduration=2.876629705 podStartE2EDuration="36.19650288s" podCreationTimestamp="2025-10-08 13:14:47 +0000 UTC" firstStartedPulling="2025-10-08 13:14:48.239589251 +0000 UTC m=+1138.047893795" lastFinishedPulling="2025-10-08 13:15:21.559462426 +0000 UTC m=+1171.367766970" observedRunningTime="2025-10-08 13:15:23.194540189 +0000 UTC m=+1173.002844733" watchObservedRunningTime="2025-10-08 13:15:23.19650288 +0000 UTC m=+1173.004807524" Oct 08 13:15:23 crc kubenswrapper[4735]: I1008 13:15:23.196998 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5564411-c04a-4da8-a5c9-c38815d6aa2a-config\") pod \"dnsmasq-dns-7d96c67b5-zkqh2\" (UID: \"b5564411-c04a-4da8-a5c9-c38815d6aa2a\") " pod="openstack/dnsmasq-dns-7d96c67b5-zkqh2" Oct 08 13:15:23 crc kubenswrapper[4735]: I1008 13:15:23.197230 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5564411-c04a-4da8-a5c9-c38815d6aa2a-ovsdbserver-nb\") pod \"dnsmasq-dns-7d96c67b5-zkqh2\" (UID: \"b5564411-c04a-4da8-a5c9-c38815d6aa2a\") " pod="openstack/dnsmasq-dns-7d96c67b5-zkqh2" Oct 08 13:15:23 crc kubenswrapper[4735]: I1008 13:15:23.211167 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 13:15:23 crc kubenswrapper[4735]: I1008 13:15:23.227924 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8l5rs\" (UniqueName: \"kubernetes.io/projected/b5564411-c04a-4da8-a5c9-c38815d6aa2a-kube-api-access-8l5rs\") pod \"dnsmasq-dns-7d96c67b5-zkqh2\" (UID: \"b5564411-c04a-4da8-a5c9-c38815d6aa2a\") " pod="openstack/dnsmasq-dns-7d96c67b5-zkqh2" Oct 08 13:15:23 crc kubenswrapper[4735]: I1008 13:15:23.233267 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-fz5qt" Oct 08 13:15:23 crc kubenswrapper[4735]: I1008 13:15:23.235097 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6877b6c9cc-x6xff"] Oct 08 13:15:23 crc kubenswrapper[4735]: I1008 13:15:23.290889 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d96c67b5-zkqh2" Oct 08 13:15:23 crc kubenswrapper[4735]: I1008 13:15:23.621619 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-dhcjz"] Oct 08 13:15:23 crc kubenswrapper[4735]: I1008 13:15:23.802434 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 13:15:23 crc kubenswrapper[4735]: I1008 13:15:23.952508 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-fz5qt"] Oct 08 13:15:23 crc kubenswrapper[4735]: W1008 13:15:23.953066 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb6559ca_82f8_4271_bb6b_47f154089579.slice/crio-b2955b0873661a5db4e8de33d9cbc50ca6c6ac2c438fec5ec5651c6b7e6db36c WatchSource:0}: Error finding container b2955b0873661a5db4e8de33d9cbc50ca6c6ac2c438fec5ec5651c6b7e6db36c: Status 404 returned error can't find the container with id b2955b0873661a5db4e8de33d9cbc50ca6c6ac2c438fec5ec5651c6b7e6db36c Oct 08 13:15:23 crc kubenswrapper[4735]: I1008 13:15:23.961144 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d96c67b5-zkqh2"] Oct 08 13:15:23 crc kubenswrapper[4735]: W1008 13:15:23.969272 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5564411_c04a_4da8_a5c9_c38815d6aa2a.slice/crio-2a9229a37653c84063c02f1db9471b68ec39bec63bd6498a655b3e280ed6e9a7 WatchSource:0}: Error finding container 2a9229a37653c84063c02f1db9471b68ec39bec63bd6498a655b3e280ed6e9a7: 
Status 404 returned error can't find the container with id 2a9229a37653c84063c02f1db9471b68ec39bec63bd6498a655b3e280ed6e9a7 Oct 08 13:15:24 crc kubenswrapper[4735]: I1008 13:15:24.159521 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d96c67b5-zkqh2" event={"ID":"b5564411-c04a-4da8-a5c9-c38815d6aa2a","Type":"ContainerStarted","Data":"2a9229a37653c84063c02f1db9471b68ec39bec63bd6498a655b3e280ed6e9a7"} Oct 08 13:15:24 crc kubenswrapper[4735]: I1008 13:15:24.161408 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-dhcjz" event={"ID":"a1e5dc90-6530-424d-8a2c-c22fe13c981d","Type":"ContainerStarted","Data":"0dbfa050b749ea1a57d90f160065f4738b25279a7a23eb2307f4e32ca745f837"} Oct 08 13:15:24 crc kubenswrapper[4735]: I1008 13:15:24.161456 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-dhcjz" event={"ID":"a1e5dc90-6530-424d-8a2c-c22fe13c981d","Type":"ContainerStarted","Data":"b82a65edd9039d3e17bff79eefda512585bf361d80bdc2f856fae928a144d2ed"} Oct 08 13:15:24 crc kubenswrapper[4735]: I1008 13:15:24.162955 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"328b1aa0-7e48-4aa0-b456-e1beefb886c0","Type":"ContainerStarted","Data":"4c1144ef5da3c6ecf4fbad63c94e2fc020cf67499ac798af52f7307bc75bb778"} Oct 08 13:15:24 crc kubenswrapper[4735]: I1008 13:15:24.164629 4735 generic.go:334] "Generic (PLEG): container finished" podID="634d4dd4-e1e7-41f0-a708-83a796081e57" containerID="eeec3588532d6648d75ffce24cd3f23fc5408af1a09e989e23b4deaa7d7c1b21" exitCode=0 Oct 08 13:15:24 crc kubenswrapper[4735]: I1008 13:15:24.164720 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6877b6c9cc-x6xff" event={"ID":"634d4dd4-e1e7-41f0-a708-83a796081e57","Type":"ContainerDied","Data":"eeec3588532d6648d75ffce24cd3f23fc5408af1a09e989e23b4deaa7d7c1b21"} Oct 08 13:15:24 crc kubenswrapper[4735]: I1008 13:15:24.164767 4735 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6877b6c9cc-x6xff" event={"ID":"634d4dd4-e1e7-41f0-a708-83a796081e57","Type":"ContainerStarted","Data":"3405ae5dd53392f1ac60dac37f0db8bc4ab07c252bfb2d202d596e9d81330484"} Oct 08 13:15:24 crc kubenswrapper[4735]: I1008 13:15:24.165774 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-fz5qt" event={"ID":"cb6559ca-82f8-4271-bb6b-47f154089579","Type":"ContainerStarted","Data":"b2955b0873661a5db4e8de33d9cbc50ca6c6ac2c438fec5ec5651c6b7e6db36c"} Oct 08 13:15:24 crc kubenswrapper[4735]: I1008 13:15:24.201815 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-d25cz"] Oct 08 13:15:24 crc kubenswrapper[4735]: I1008 13:15:24.203179 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-d25cz" Oct 08 13:15:24 crc kubenswrapper[4735]: I1008 13:15:24.215699 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 08 13:15:24 crc kubenswrapper[4735]: I1008 13:15:24.215726 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-v64wm" Oct 08 13:15:24 crc kubenswrapper[4735]: I1008 13:15:24.215852 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 08 13:15:24 crc kubenswrapper[4735]: I1008 13:15:24.258604 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-d25cz"] Oct 08 13:15:24 crc kubenswrapper[4735]: I1008 13:15:24.266580 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-dhcjz" podStartSLOduration=2.266553873 podStartE2EDuration="2.266553873s" podCreationTimestamp="2025-10-08 13:15:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 13:15:24.227938719 
+0000 UTC m=+1174.036243263" watchObservedRunningTime="2025-10-08 13:15:24.266553873 +0000 UTC m=+1174.074858427" Oct 08 13:15:24 crc kubenswrapper[4735]: I1008 13:15:24.325004 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fe64ebc-f229-4573-a67c-44c0d573c336-combined-ca-bundle\") pod \"neutron-db-sync-d25cz\" (UID: \"8fe64ebc-f229-4573-a67c-44c0d573c336\") " pod="openstack/neutron-db-sync-d25cz" Oct 08 13:15:24 crc kubenswrapper[4735]: I1008 13:15:24.325146 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22wqv\" (UniqueName: \"kubernetes.io/projected/8fe64ebc-f229-4573-a67c-44c0d573c336-kube-api-access-22wqv\") pod \"neutron-db-sync-d25cz\" (UID: \"8fe64ebc-f229-4573-a67c-44c0d573c336\") " pod="openstack/neutron-db-sync-d25cz" Oct 08 13:15:24 crc kubenswrapper[4735]: I1008 13:15:24.325332 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8fe64ebc-f229-4573-a67c-44c0d573c336-config\") pod \"neutron-db-sync-d25cz\" (UID: \"8fe64ebc-f229-4573-a67c-44c0d573c336\") " pod="openstack/neutron-db-sync-d25cz" Oct 08 13:15:24 crc kubenswrapper[4735]: I1008 13:15:24.427021 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8fe64ebc-f229-4573-a67c-44c0d573c336-config\") pod \"neutron-db-sync-d25cz\" (UID: \"8fe64ebc-f229-4573-a67c-44c0d573c336\") " pod="openstack/neutron-db-sync-d25cz" Oct 08 13:15:24 crc kubenswrapper[4735]: I1008 13:15:24.427108 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fe64ebc-f229-4573-a67c-44c0d573c336-combined-ca-bundle\") pod \"neutron-db-sync-d25cz\" (UID: \"8fe64ebc-f229-4573-a67c-44c0d573c336\") " 
pod="openstack/neutron-db-sync-d25cz" Oct 08 13:15:24 crc kubenswrapper[4735]: I1008 13:15:24.427234 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22wqv\" (UniqueName: \"kubernetes.io/projected/8fe64ebc-f229-4573-a67c-44c0d573c336-kube-api-access-22wqv\") pod \"neutron-db-sync-d25cz\" (UID: \"8fe64ebc-f229-4573-a67c-44c0d573c336\") " pod="openstack/neutron-db-sync-d25cz" Oct 08 13:15:24 crc kubenswrapper[4735]: I1008 13:15:24.440387 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/8fe64ebc-f229-4573-a67c-44c0d573c336-config\") pod \"neutron-db-sync-d25cz\" (UID: \"8fe64ebc-f229-4573-a67c-44c0d573c336\") " pod="openstack/neutron-db-sync-d25cz" Oct 08 13:15:24 crc kubenswrapper[4735]: I1008 13:15:24.440434 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fe64ebc-f229-4573-a67c-44c0d573c336-combined-ca-bundle\") pod \"neutron-db-sync-d25cz\" (UID: \"8fe64ebc-f229-4573-a67c-44c0d573c336\") " pod="openstack/neutron-db-sync-d25cz" Oct 08 13:15:24 crc kubenswrapper[4735]: I1008 13:15:24.456213 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22wqv\" (UniqueName: \"kubernetes.io/projected/8fe64ebc-f229-4573-a67c-44c0d573c336-kube-api-access-22wqv\") pod \"neutron-db-sync-d25cz\" (UID: \"8fe64ebc-f229-4573-a67c-44c0d573c336\") " pod="openstack/neutron-db-sync-d25cz" Oct 08 13:15:24 crc kubenswrapper[4735]: I1008 13:15:24.544453 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-d25cz" Oct 08 13:15:24 crc kubenswrapper[4735]: I1008 13:15:24.738795 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6877b6c9cc-x6xff" Oct 08 13:15:24 crc kubenswrapper[4735]: I1008 13:15:24.837531 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/634d4dd4-e1e7-41f0-a708-83a796081e57-ovsdbserver-sb\") pod \"634d4dd4-e1e7-41f0-a708-83a796081e57\" (UID: \"634d4dd4-e1e7-41f0-a708-83a796081e57\") " Oct 08 13:15:24 crc kubenswrapper[4735]: I1008 13:15:24.837627 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/634d4dd4-e1e7-41f0-a708-83a796081e57-ovsdbserver-nb\") pod \"634d4dd4-e1e7-41f0-a708-83a796081e57\" (UID: \"634d4dd4-e1e7-41f0-a708-83a796081e57\") " Oct 08 13:15:24 crc kubenswrapper[4735]: I1008 13:15:24.837772 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/634d4dd4-e1e7-41f0-a708-83a796081e57-dns-swift-storage-0\") pod \"634d4dd4-e1e7-41f0-a708-83a796081e57\" (UID: \"634d4dd4-e1e7-41f0-a708-83a796081e57\") " Oct 08 13:15:24 crc kubenswrapper[4735]: I1008 13:15:24.837824 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/634d4dd4-e1e7-41f0-a708-83a796081e57-config\") pod \"634d4dd4-e1e7-41f0-a708-83a796081e57\" (UID: \"634d4dd4-e1e7-41f0-a708-83a796081e57\") " Oct 08 13:15:24 crc kubenswrapper[4735]: I1008 13:15:24.837873 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/634d4dd4-e1e7-41f0-a708-83a796081e57-dns-svc\") pod \"634d4dd4-e1e7-41f0-a708-83a796081e57\" (UID: \"634d4dd4-e1e7-41f0-a708-83a796081e57\") " Oct 08 13:15:24 crc kubenswrapper[4735]: I1008 13:15:24.837898 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9f4m\" 
(UniqueName: \"kubernetes.io/projected/634d4dd4-e1e7-41f0-a708-83a796081e57-kube-api-access-f9f4m\") pod \"634d4dd4-e1e7-41f0-a708-83a796081e57\" (UID: \"634d4dd4-e1e7-41f0-a708-83a796081e57\") " Oct 08 13:15:24 crc kubenswrapper[4735]: I1008 13:15:24.850428 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/634d4dd4-e1e7-41f0-a708-83a796081e57-kube-api-access-f9f4m" (OuterVolumeSpecName: "kube-api-access-f9f4m") pod "634d4dd4-e1e7-41f0-a708-83a796081e57" (UID: "634d4dd4-e1e7-41f0-a708-83a796081e57"). InnerVolumeSpecName "kube-api-access-f9f4m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:15:24 crc kubenswrapper[4735]: I1008 13:15:24.876319 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/634d4dd4-e1e7-41f0-a708-83a796081e57-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "634d4dd4-e1e7-41f0-a708-83a796081e57" (UID: "634d4dd4-e1e7-41f0-a708-83a796081e57"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:15:24 crc kubenswrapper[4735]: I1008 13:15:24.877163 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/634d4dd4-e1e7-41f0-a708-83a796081e57-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "634d4dd4-e1e7-41f0-a708-83a796081e57" (UID: "634d4dd4-e1e7-41f0-a708-83a796081e57"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:15:24 crc kubenswrapper[4735]: I1008 13:15:24.881353 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/634d4dd4-e1e7-41f0-a708-83a796081e57-config" (OuterVolumeSpecName: "config") pod "634d4dd4-e1e7-41f0-a708-83a796081e57" (UID: "634d4dd4-e1e7-41f0-a708-83a796081e57"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:15:24 crc kubenswrapper[4735]: I1008 13:15:24.892069 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/634d4dd4-e1e7-41f0-a708-83a796081e57-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "634d4dd4-e1e7-41f0-a708-83a796081e57" (UID: "634d4dd4-e1e7-41f0-a708-83a796081e57"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:15:24 crc kubenswrapper[4735]: I1008 13:15:24.910707 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/634d4dd4-e1e7-41f0-a708-83a796081e57-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "634d4dd4-e1e7-41f0-a708-83a796081e57" (UID: "634d4dd4-e1e7-41f0-a708-83a796081e57"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:15:24 crc kubenswrapper[4735]: I1008 13:15:24.940382 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/634d4dd4-e1e7-41f0-a708-83a796081e57-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 13:15:24 crc kubenswrapper[4735]: I1008 13:15:24.940430 4735 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/634d4dd4-e1e7-41f0-a708-83a796081e57-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 08 13:15:24 crc kubenswrapper[4735]: I1008 13:15:24.940447 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/634d4dd4-e1e7-41f0-a708-83a796081e57-config\") on node \"crc\" DevicePath \"\"" Oct 08 13:15:24 crc kubenswrapper[4735]: I1008 13:15:24.940459 4735 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/634d4dd4-e1e7-41f0-a708-83a796081e57-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 
13:15:24 crc kubenswrapper[4735]: I1008 13:15:24.940470 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9f4m\" (UniqueName: \"kubernetes.io/projected/634d4dd4-e1e7-41f0-a708-83a796081e57-kube-api-access-f9f4m\") on node \"crc\" DevicePath \"\"" Oct 08 13:15:24 crc kubenswrapper[4735]: I1008 13:15:24.940510 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/634d4dd4-e1e7-41f0-a708-83a796081e57-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 13:15:25 crc kubenswrapper[4735]: I1008 13:15:25.093091 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-d25cz"] Oct 08 13:15:25 crc kubenswrapper[4735]: I1008 13:15:25.126988 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 13:15:25 crc kubenswrapper[4735]: I1008 13:15:25.197146 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6877b6c9cc-x6xff" event={"ID":"634d4dd4-e1e7-41f0-a708-83a796081e57","Type":"ContainerDied","Data":"3405ae5dd53392f1ac60dac37f0db8bc4ab07c252bfb2d202d596e9d81330484"} Oct 08 13:15:25 crc kubenswrapper[4735]: I1008 13:15:25.197225 4735 scope.go:117] "RemoveContainer" containerID="eeec3588532d6648d75ffce24cd3f23fc5408af1a09e989e23b4deaa7d7c1b21" Oct 08 13:15:25 crc kubenswrapper[4735]: I1008 13:15:25.197296 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6877b6c9cc-x6xff" Oct 08 13:15:25 crc kubenswrapper[4735]: I1008 13:15:25.208420 4735 generic.go:334] "Generic (PLEG): container finished" podID="b5564411-c04a-4da8-a5c9-c38815d6aa2a" containerID="5a7fce3cae6877cf55183bdd9bfde95b53f673308a2059a10a5ec3c9c39f7bcd" exitCode=0 Oct 08 13:15:25 crc kubenswrapper[4735]: I1008 13:15:25.208550 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d96c67b5-zkqh2" event={"ID":"b5564411-c04a-4da8-a5c9-c38815d6aa2a","Type":"ContainerDied","Data":"5a7fce3cae6877cf55183bdd9bfde95b53f673308a2059a10a5ec3c9c39f7bcd"} Oct 08 13:15:25 crc kubenswrapper[4735]: I1008 13:15:25.216850 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-d25cz" event={"ID":"8fe64ebc-f229-4573-a67c-44c0d573c336","Type":"ContainerStarted","Data":"e0e3b7e4addda32f7902e1bfbedd0797d23b3bc63f43104df6d9b33448f04ce2"} Oct 08 13:15:25 crc kubenswrapper[4735]: I1008 13:15:25.247855 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6877b6c9cc-x6xff"] Oct 08 13:15:25 crc kubenswrapper[4735]: I1008 13:15:25.254322 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6877b6c9cc-x6xff"] Oct 08 13:15:26 crc kubenswrapper[4735]: I1008 13:15:26.247729 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d96c67b5-zkqh2" event={"ID":"b5564411-c04a-4da8-a5c9-c38815d6aa2a","Type":"ContainerStarted","Data":"6e2c3827a8918513321c7ce8e2aafc90f1e5c3061537f80389aa59b275166990"} Oct 08 13:15:26 crc kubenswrapper[4735]: I1008 13:15:26.249568 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7d96c67b5-zkqh2" Oct 08 13:15:26 crc kubenswrapper[4735]: I1008 13:15:26.255767 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-d25cz" 
event={"ID":"8fe64ebc-f229-4573-a67c-44c0d573c336","Type":"ContainerStarted","Data":"273e0d1ce3c1b0a4083ae611ee6898246e1579f6035cac87c7041c620acb2e9a"} Oct 08 13:15:26 crc kubenswrapper[4735]: I1008 13:15:26.268059 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7d96c67b5-zkqh2" podStartSLOduration=4.268042172 podStartE2EDuration="4.268042172s" podCreationTimestamp="2025-10-08 13:15:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 13:15:26.267080507 +0000 UTC m=+1176.075385051" watchObservedRunningTime="2025-10-08 13:15:26.268042172 +0000 UTC m=+1176.076346736" Oct 08 13:15:26 crc kubenswrapper[4735]: I1008 13:15:26.299230 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-d25cz" podStartSLOduration=2.299200154 podStartE2EDuration="2.299200154s" podCreationTimestamp="2025-10-08 13:15:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 13:15:26.294289418 +0000 UTC m=+1176.102593982" watchObservedRunningTime="2025-10-08 13:15:26.299200154 +0000 UTC m=+1176.107504698" Oct 08 13:15:26 crc kubenswrapper[4735]: I1008 13:15:26.920453 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="634d4dd4-e1e7-41f0-a708-83a796081e57" path="/var/lib/kubelet/pods/634d4dd4-e1e7-41f0-a708-83a796081e57/volumes" Oct 08 13:15:28 crc kubenswrapper[4735]: I1008 13:15:28.607574 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-c85c-account-create-sjbsn"] Oct 08 13:15:28 crc kubenswrapper[4735]: E1008 13:15:28.608387 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="634d4dd4-e1e7-41f0-a708-83a796081e57" containerName="init" Oct 08 13:15:28 crc kubenswrapper[4735]: I1008 13:15:28.608407 4735 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="634d4dd4-e1e7-41f0-a708-83a796081e57" containerName="init" Oct 08 13:15:28 crc kubenswrapper[4735]: I1008 13:15:28.608657 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="634d4dd4-e1e7-41f0-a708-83a796081e57" containerName="init" Oct 08 13:15:28 crc kubenswrapper[4735]: I1008 13:15:28.609347 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-c85c-account-create-sjbsn" Oct 08 13:15:28 crc kubenswrapper[4735]: I1008 13:15:28.611448 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Oct 08 13:15:28 crc kubenswrapper[4735]: I1008 13:15:28.616741 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-c85c-account-create-sjbsn"] Oct 08 13:15:28 crc kubenswrapper[4735]: I1008 13:15:28.709983 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-4fb6-account-create-wgj75"] Oct 08 13:15:28 crc kubenswrapper[4735]: I1008 13:15:28.712310 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-4fb6-account-create-wgj75"
Oct 08 13:15:28 crc kubenswrapper[4735]: I1008 13:15:28.714793 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Oct 08 13:15:28 crc kubenswrapper[4735]: I1008 13:15:28.718234 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxn4t\" (UniqueName: \"kubernetes.io/projected/20a67a90-368f-48f4-809b-6db81964983e-kube-api-access-hxn4t\") pod \"barbican-c85c-account-create-sjbsn\" (UID: \"20a67a90-368f-48f4-809b-6db81964983e\") " pod="openstack/barbican-c85c-account-create-sjbsn"
Oct 08 13:15:28 crc kubenswrapper[4735]: I1008 13:15:28.724401 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-4fb6-account-create-wgj75"]
Oct 08 13:15:28 crc kubenswrapper[4735]: I1008 13:15:28.819552 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxn4t\" (UniqueName: \"kubernetes.io/projected/20a67a90-368f-48f4-809b-6db81964983e-kube-api-access-hxn4t\") pod \"barbican-c85c-account-create-sjbsn\" (UID: \"20a67a90-368f-48f4-809b-6db81964983e\") " pod="openstack/barbican-c85c-account-create-sjbsn"
Oct 08 13:15:28 crc kubenswrapper[4735]: I1008 13:15:28.819606 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv86w\" (UniqueName: \"kubernetes.io/projected/df3ac5b8-1365-49a1-8a4e-0fa11bd0cd83-kube-api-access-hv86w\") pod \"cinder-4fb6-account-create-wgj75\" (UID: \"df3ac5b8-1365-49a1-8a4e-0fa11bd0cd83\") " pod="openstack/cinder-4fb6-account-create-wgj75"
Oct 08 13:15:28 crc kubenswrapper[4735]: I1008 13:15:28.840176 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxn4t\" (UniqueName: \"kubernetes.io/projected/20a67a90-368f-48f4-809b-6db81964983e-kube-api-access-hxn4t\") pod \"barbican-c85c-account-create-sjbsn\" (UID: \"20a67a90-368f-48f4-809b-6db81964983e\") " pod="openstack/barbican-c85c-account-create-sjbsn"
Oct 08 13:15:28 crc kubenswrapper[4735]: I1008 13:15:28.921409 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hv86w\" (UniqueName: \"kubernetes.io/projected/df3ac5b8-1365-49a1-8a4e-0fa11bd0cd83-kube-api-access-hv86w\") pod \"cinder-4fb6-account-create-wgj75\" (UID: \"df3ac5b8-1365-49a1-8a4e-0fa11bd0cd83\") " pod="openstack/cinder-4fb6-account-create-wgj75"
Oct 08 13:15:28 crc kubenswrapper[4735]: I1008 13:15:28.938241 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv86w\" (UniqueName: \"kubernetes.io/projected/df3ac5b8-1365-49a1-8a4e-0fa11bd0cd83-kube-api-access-hv86w\") pod \"cinder-4fb6-account-create-wgj75\" (UID: \"df3ac5b8-1365-49a1-8a4e-0fa11bd0cd83\") " pod="openstack/cinder-4fb6-account-create-wgj75"
Oct 08 13:15:28 crc kubenswrapper[4735]: I1008 13:15:28.940831 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-c85c-account-create-sjbsn"
Oct 08 13:15:29 crc kubenswrapper[4735]: I1008 13:15:29.031706 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-4fb6-account-create-wgj75"
Oct 08 13:15:29 crc kubenswrapper[4735]: I1008 13:15:29.289177 4735 generic.go:334] "Generic (PLEG): container finished" podID="a1e5dc90-6530-424d-8a2c-c22fe13c981d" containerID="0dbfa050b749ea1a57d90f160065f4738b25279a7a23eb2307f4e32ca745f837" exitCode=0
Oct 08 13:15:29 crc kubenswrapper[4735]: I1008 13:15:29.289229 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-dhcjz" event={"ID":"a1e5dc90-6530-424d-8a2c-c22fe13c981d","Type":"ContainerDied","Data":"0dbfa050b749ea1a57d90f160065f4738b25279a7a23eb2307f4e32ca745f837"}
Oct 08 13:15:31 crc kubenswrapper[4735]: I1008 13:15:31.327006 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-dhcjz" event={"ID":"a1e5dc90-6530-424d-8a2c-c22fe13c981d","Type":"ContainerDied","Data":"b82a65edd9039d3e17bff79eefda512585bf361d80bdc2f856fae928a144d2ed"}
Oct 08 13:15:31 crc kubenswrapper[4735]: I1008 13:15:31.327634 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b82a65edd9039d3e17bff79eefda512585bf361d80bdc2f856fae928a144d2ed"
Oct 08 13:15:31 crc kubenswrapper[4735]: I1008 13:15:31.580204 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-dhcjz"
Oct 08 13:15:31 crc kubenswrapper[4735]: I1008 13:15:31.683584 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1e5dc90-6530-424d-8a2c-c22fe13c981d-combined-ca-bundle\") pod \"a1e5dc90-6530-424d-8a2c-c22fe13c981d\" (UID: \"a1e5dc90-6530-424d-8a2c-c22fe13c981d\") "
Oct 08 13:15:31 crc kubenswrapper[4735]: I1008 13:15:31.683647 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1e5dc90-6530-424d-8a2c-c22fe13c981d-config-data\") pod \"a1e5dc90-6530-424d-8a2c-c22fe13c981d\" (UID: \"a1e5dc90-6530-424d-8a2c-c22fe13c981d\") "
Oct 08 13:15:31 crc kubenswrapper[4735]: I1008 13:15:31.683684 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a1e5dc90-6530-424d-8a2c-c22fe13c981d-fernet-keys\") pod \"a1e5dc90-6530-424d-8a2c-c22fe13c981d\" (UID: \"a1e5dc90-6530-424d-8a2c-c22fe13c981d\") "
Oct 08 13:15:31 crc kubenswrapper[4735]: I1008 13:15:31.683718 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a1e5dc90-6530-424d-8a2c-c22fe13c981d-credential-keys\") pod \"a1e5dc90-6530-424d-8a2c-c22fe13c981d\" (UID: \"a1e5dc90-6530-424d-8a2c-c22fe13c981d\") "
Oct 08 13:15:31 crc kubenswrapper[4735]: I1008 13:15:31.683778 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1e5dc90-6530-424d-8a2c-c22fe13c981d-scripts\") pod \"a1e5dc90-6530-424d-8a2c-c22fe13c981d\" (UID: \"a1e5dc90-6530-424d-8a2c-c22fe13c981d\") "
Oct 08 13:15:31 crc kubenswrapper[4735]: I1008 13:15:31.683796 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44qd6\" (UniqueName: \"kubernetes.io/projected/a1e5dc90-6530-424d-8a2c-c22fe13c981d-kube-api-access-44qd6\") pod \"a1e5dc90-6530-424d-8a2c-c22fe13c981d\" (UID: \"a1e5dc90-6530-424d-8a2c-c22fe13c981d\") "
Oct 08 13:15:31 crc kubenswrapper[4735]: I1008 13:15:31.689266 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1e5dc90-6530-424d-8a2c-c22fe13c981d-scripts" (OuterVolumeSpecName: "scripts") pod "a1e5dc90-6530-424d-8a2c-c22fe13c981d" (UID: "a1e5dc90-6530-424d-8a2c-c22fe13c981d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 13:15:31 crc kubenswrapper[4735]: I1008 13:15:31.689308 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1e5dc90-6530-424d-8a2c-c22fe13c981d-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "a1e5dc90-6530-424d-8a2c-c22fe13c981d" (UID: "a1e5dc90-6530-424d-8a2c-c22fe13c981d"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 13:15:31 crc kubenswrapper[4735]: I1008 13:15:31.690352 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1e5dc90-6530-424d-8a2c-c22fe13c981d-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "a1e5dc90-6530-424d-8a2c-c22fe13c981d" (UID: "a1e5dc90-6530-424d-8a2c-c22fe13c981d"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 13:15:31 crc kubenswrapper[4735]: I1008 13:15:31.691038 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1e5dc90-6530-424d-8a2c-c22fe13c981d-kube-api-access-44qd6" (OuterVolumeSpecName: "kube-api-access-44qd6") pod "a1e5dc90-6530-424d-8a2c-c22fe13c981d" (UID: "a1e5dc90-6530-424d-8a2c-c22fe13c981d"). InnerVolumeSpecName "kube-api-access-44qd6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 13:15:31 crc kubenswrapper[4735]: I1008 13:15:31.715144 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1e5dc90-6530-424d-8a2c-c22fe13c981d-config-data" (OuterVolumeSpecName: "config-data") pod "a1e5dc90-6530-424d-8a2c-c22fe13c981d" (UID: "a1e5dc90-6530-424d-8a2c-c22fe13c981d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 13:15:31 crc kubenswrapper[4735]: I1008 13:15:31.721424 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1e5dc90-6530-424d-8a2c-c22fe13c981d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a1e5dc90-6530-424d-8a2c-c22fe13c981d" (UID: "a1e5dc90-6530-424d-8a2c-c22fe13c981d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 13:15:31 crc kubenswrapper[4735]: I1008 13:15:31.767167 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-4fb6-account-create-wgj75"]
Oct 08 13:15:31 crc kubenswrapper[4735]: I1008 13:15:31.786374 4735 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a1e5dc90-6530-424d-8a2c-c22fe13c981d-credential-keys\") on node \"crc\" DevicePath \"\""
Oct 08 13:15:31 crc kubenswrapper[4735]: I1008 13:15:31.786409 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1e5dc90-6530-424d-8a2c-c22fe13c981d-scripts\") on node \"crc\" DevicePath \"\""
Oct 08 13:15:31 crc kubenswrapper[4735]: I1008 13:15:31.786421 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44qd6\" (UniqueName: \"kubernetes.io/projected/a1e5dc90-6530-424d-8a2c-c22fe13c981d-kube-api-access-44qd6\") on node \"crc\" DevicePath \"\""
Oct 08 13:15:31 crc kubenswrapper[4735]: I1008 13:15:31.786435 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1e5dc90-6530-424d-8a2c-c22fe13c981d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 08 13:15:31 crc kubenswrapper[4735]: I1008 13:15:31.786448 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1e5dc90-6530-424d-8a2c-c22fe13c981d-config-data\") on node \"crc\" DevicePath \"\""
Oct 08 13:15:31 crc kubenswrapper[4735]: I1008 13:15:31.786458 4735 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a1e5dc90-6530-424d-8a2c-c22fe13c981d-fernet-keys\") on node \"crc\" DevicePath \"\""
Oct 08 13:15:31 crc kubenswrapper[4735]: W1008 13:15:31.870353 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20a67a90_368f_48f4_809b_6db81964983e.slice/crio-01588d5563a7b5c31cce3f2324b9dc01016b4174cdaec0c1e9ea31089263e15b WatchSource:0}: Error finding container 01588d5563a7b5c31cce3f2324b9dc01016b4174cdaec0c1e9ea31089263e15b: Status 404 returned error can't find the container with id 01588d5563a7b5c31cce3f2324b9dc01016b4174cdaec0c1e9ea31089263e15b
Oct 08 13:15:31 crc kubenswrapper[4735]: I1008 13:15:31.874247 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-c85c-account-create-sjbsn"]
Oct 08 13:15:32 crc kubenswrapper[4735]: I1008 13:15:32.336924 4735 generic.go:334] "Generic (PLEG): container finished" podID="20a67a90-368f-48f4-809b-6db81964983e" containerID="b4a38bdd8c366bbada82e4050eef0f1ba2e019fa6882c001d2e9d49612679698" exitCode=0
Oct 08 13:15:32 crc kubenswrapper[4735]: I1008 13:15:32.337025 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-c85c-account-create-sjbsn" event={"ID":"20a67a90-368f-48f4-809b-6db81964983e","Type":"ContainerDied","Data":"b4a38bdd8c366bbada82e4050eef0f1ba2e019fa6882c001d2e9d49612679698"}
Oct 08 13:15:32 crc kubenswrapper[4735]: I1008 13:15:32.337058 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-c85c-account-create-sjbsn" event={"ID":"20a67a90-368f-48f4-809b-6db81964983e","Type":"ContainerStarted","Data":"01588d5563a7b5c31cce3f2324b9dc01016b4174cdaec0c1e9ea31089263e15b"}
Oct 08 13:15:32 crc kubenswrapper[4735]: I1008 13:15:32.338628 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"328b1aa0-7e48-4aa0-b456-e1beefb886c0","Type":"ContainerStarted","Data":"69f2b92d74478d36b5fa99d63766ae94ce188248b20f46c69cafe2ab02838b84"}
Oct 08 13:15:32 crc kubenswrapper[4735]: I1008 13:15:32.340110 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-fz5qt" event={"ID":"cb6559ca-82f8-4271-bb6b-47f154089579","Type":"ContainerStarted","Data":"9394c36d6bc15b7aa1942d29aa09ff824a385998e4098ce00e974bd466bdecfa"}
Oct 08 13:15:32 crc kubenswrapper[4735]: I1008 13:15:32.341465 4735 generic.go:334] "Generic (PLEG): container finished" podID="df3ac5b8-1365-49a1-8a4e-0fa11bd0cd83" containerID="dfdd55aeb2694674bcacaeeb6387ca20d7fdd7bb448499e8470ad6f6af7c1dbc" exitCode=0
Oct 08 13:15:32 crc kubenswrapper[4735]: I1008 13:15:32.341500 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-4fb6-account-create-wgj75" event={"ID":"df3ac5b8-1365-49a1-8a4e-0fa11bd0cd83","Type":"ContainerDied","Data":"dfdd55aeb2694674bcacaeeb6387ca20d7fdd7bb448499e8470ad6f6af7c1dbc"}
Oct 08 13:15:32 crc kubenswrapper[4735]: I1008 13:15:32.341531 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-4fb6-account-create-wgj75" event={"ID":"df3ac5b8-1365-49a1-8a4e-0fa11bd0cd83","Type":"ContainerStarted","Data":"432e4f86960680e446c24d1e8ec6c3c306e5e4528c9964e87e39dfbc5f534242"}
Oct 08 13:15:32 crc kubenswrapper[4735]: I1008 13:15:32.341571 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-dhcjz"
Oct 08 13:15:32 crc kubenswrapper[4735]: I1008 13:15:32.401529 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-fz5qt" podStartSLOduration=3.039825785 podStartE2EDuration="10.401508748s" podCreationTimestamp="2025-10-08 13:15:22 +0000 UTC" firstStartedPulling="2025-10-08 13:15:23.956297535 +0000 UTC m=+1173.764602079" lastFinishedPulling="2025-10-08 13:15:31.317980498 +0000 UTC m=+1181.126285042" observedRunningTime="2025-10-08 13:15:32.396037427 +0000 UTC m=+1182.204341981" watchObservedRunningTime="2025-10-08 13:15:32.401508748 +0000 UTC m=+1182.209813302"
Oct 08 13:15:32 crc kubenswrapper[4735]: I1008 13:15:32.720742 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-dhcjz"]
Oct 08 13:15:32 crc kubenswrapper[4735]: I1008 13:15:32.729613 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-dhcjz"]
Oct 08 13:15:32 crc kubenswrapper[4735]: I1008 13:15:32.812582 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-6cx9z"]
Oct 08 13:15:32 crc kubenswrapper[4735]: E1008 13:15:32.812998 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1e5dc90-6530-424d-8a2c-c22fe13c981d" containerName="keystone-bootstrap"
Oct 08 13:15:32 crc kubenswrapper[4735]: I1008 13:15:32.813013 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1e5dc90-6530-424d-8a2c-c22fe13c981d" containerName="keystone-bootstrap"
Oct 08 13:15:32 crc kubenswrapper[4735]: I1008 13:15:32.813238 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1e5dc90-6530-424d-8a2c-c22fe13c981d" containerName="keystone-bootstrap"
Oct 08 13:15:32 crc kubenswrapper[4735]: I1008 13:15:32.813937 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-6cx9z"
Oct 08 13:15:32 crc kubenswrapper[4735]: I1008 13:15:32.817155 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Oct 08 13:15:32 crc kubenswrapper[4735]: I1008 13:15:32.817383 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-zpn2p"
Oct 08 13:15:32 crc kubenswrapper[4735]: I1008 13:15:32.817507 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Oct 08 13:15:32 crc kubenswrapper[4735]: I1008 13:15:32.818204 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Oct 08 13:15:32 crc kubenswrapper[4735]: I1008 13:15:32.825968 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-6cx9z"]
Oct 08 13:15:32 crc kubenswrapper[4735]: I1008 13:15:32.906504 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1e5dc90-6530-424d-8a2c-c22fe13c981d" path="/var/lib/kubelet/pods/a1e5dc90-6530-424d-8a2c-c22fe13c981d/volumes"
Oct 08 13:15:32 crc kubenswrapper[4735]: I1008 13:15:32.906647 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4-config-data\") pod \"keystone-bootstrap-6cx9z\" (UID: \"9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4\") " pod="openstack/keystone-bootstrap-6cx9z"
Oct 08 13:15:32 crc kubenswrapper[4735]: I1008 13:15:32.906691 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4-scripts\") pod \"keystone-bootstrap-6cx9z\" (UID: \"9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4\") " pod="openstack/keystone-bootstrap-6cx9z"
Oct 08 13:15:32 crc kubenswrapper[4735]: I1008 13:15:32.906780 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4-fernet-keys\") pod \"keystone-bootstrap-6cx9z\" (UID: \"9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4\") " pod="openstack/keystone-bootstrap-6cx9z"
Oct 08 13:15:32 crc kubenswrapper[4735]: I1008 13:15:32.906808 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4-credential-keys\") pod \"keystone-bootstrap-6cx9z\" (UID: \"9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4\") " pod="openstack/keystone-bootstrap-6cx9z"
Oct 08 13:15:32 crc kubenswrapper[4735]: I1008 13:15:32.906859 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4-combined-ca-bundle\") pod \"keystone-bootstrap-6cx9z\" (UID: \"9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4\") " pod="openstack/keystone-bootstrap-6cx9z"
Oct 08 13:15:32 crc kubenswrapper[4735]: I1008 13:15:32.906907 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h25jj\" (UniqueName: \"kubernetes.io/projected/9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4-kube-api-access-h25jj\") pod \"keystone-bootstrap-6cx9z\" (UID: \"9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4\") " pod="openstack/keystone-bootstrap-6cx9z"
Oct 08 13:15:33 crc kubenswrapper[4735]: I1008 13:15:33.007912 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h25jj\" (UniqueName: \"kubernetes.io/projected/9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4-kube-api-access-h25jj\") pod \"keystone-bootstrap-6cx9z\" (UID: \"9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4\") " pod="openstack/keystone-bootstrap-6cx9z"
Oct 08 13:15:33 crc kubenswrapper[4735]: I1008 13:15:33.008040 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4-config-data\") pod \"keystone-bootstrap-6cx9z\" (UID: \"9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4\") " pod="openstack/keystone-bootstrap-6cx9z"
Oct 08 13:15:33 crc kubenswrapper[4735]: I1008 13:15:33.008060 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4-scripts\") pod \"keystone-bootstrap-6cx9z\" (UID: \"9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4\") " pod="openstack/keystone-bootstrap-6cx9z"
Oct 08 13:15:33 crc kubenswrapper[4735]: I1008 13:15:33.008122 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4-fernet-keys\") pod \"keystone-bootstrap-6cx9z\" (UID: \"9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4\") " pod="openstack/keystone-bootstrap-6cx9z"
Oct 08 13:15:33 crc kubenswrapper[4735]: I1008 13:15:33.008138 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4-credential-keys\") pod \"keystone-bootstrap-6cx9z\" (UID: \"9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4\") " pod="openstack/keystone-bootstrap-6cx9z"
Oct 08 13:15:33 crc kubenswrapper[4735]: I1008 13:15:33.008223 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4-combined-ca-bundle\") pod \"keystone-bootstrap-6cx9z\" (UID: \"9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4\") " pod="openstack/keystone-bootstrap-6cx9z"
Oct 08 13:15:33 crc kubenswrapper[4735]: I1008 13:15:33.015984 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4-fernet-keys\") pod \"keystone-bootstrap-6cx9z\" (UID: \"9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4\") " pod="openstack/keystone-bootstrap-6cx9z"
Oct 08 13:15:33 crc kubenswrapper[4735]: I1008 13:15:33.017196 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4-combined-ca-bundle\") pod \"keystone-bootstrap-6cx9z\" (UID: \"9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4\") " pod="openstack/keystone-bootstrap-6cx9z"
Oct 08 13:15:33 crc kubenswrapper[4735]: I1008 13:15:33.023115 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4-scripts\") pod \"keystone-bootstrap-6cx9z\" (UID: \"9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4\") " pod="openstack/keystone-bootstrap-6cx9z"
Oct 08 13:15:33 crc kubenswrapper[4735]: I1008 13:15:33.024866 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4-config-data\") pod \"keystone-bootstrap-6cx9z\" (UID: \"9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4\") " pod="openstack/keystone-bootstrap-6cx9z"
Oct 08 13:15:33 crc kubenswrapper[4735]: I1008 13:15:33.025125 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4-credential-keys\") pod \"keystone-bootstrap-6cx9z\" (UID: \"9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4\") " pod="openstack/keystone-bootstrap-6cx9z"
Oct 08 13:15:33 crc kubenswrapper[4735]: I1008 13:15:33.027834 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h25jj\" (UniqueName: \"kubernetes.io/projected/9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4-kube-api-access-h25jj\") pod \"keystone-bootstrap-6cx9z\" (UID: \"9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4\") " pod="openstack/keystone-bootstrap-6cx9z"
Oct 08 13:15:33 crc kubenswrapper[4735]: I1008 13:15:33.134766 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-6cx9z"
Oct 08 13:15:33 crc kubenswrapper[4735]: I1008 13:15:33.292670 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7d96c67b5-zkqh2"
Oct 08 13:15:33 crc kubenswrapper[4735]: I1008 13:15:33.373774 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-564965cbfc-8hd2q"]
Oct 08 13:15:33 crc kubenswrapper[4735]: I1008 13:15:33.374052 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-564965cbfc-8hd2q" podUID="d8270ba1-30fc-45a6-870e-89853cba5d2b" containerName="dnsmasq-dns" containerID="cri-o://c3c5ec8d7979557bacbe981c514dbadad3da58c4514bd9e7e7bf178bcc773902" gracePeriod=10
Oct 08 13:15:33 crc kubenswrapper[4735]: I1008 13:15:33.929763 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-4fb6-account-create-wgj75"
Oct 08 13:15:33 crc kubenswrapper[4735]: I1008 13:15:33.947630 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-c85c-account-create-sjbsn"
Oct 08 13:15:34 crc kubenswrapper[4735]: I1008 13:15:34.029187 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxn4t\" (UniqueName: \"kubernetes.io/projected/20a67a90-368f-48f4-809b-6db81964983e-kube-api-access-hxn4t\") pod \"20a67a90-368f-48f4-809b-6db81964983e\" (UID: \"20a67a90-368f-48f4-809b-6db81964983e\") "
Oct 08 13:15:34 crc kubenswrapper[4735]: I1008 13:15:34.029443 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hv86w\" (UniqueName: \"kubernetes.io/projected/df3ac5b8-1365-49a1-8a4e-0fa11bd0cd83-kube-api-access-hv86w\") pod \"df3ac5b8-1365-49a1-8a4e-0fa11bd0cd83\" (UID: \"df3ac5b8-1365-49a1-8a4e-0fa11bd0cd83\") "
Oct 08 13:15:34 crc kubenswrapper[4735]: I1008 13:15:34.035990 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20a67a90-368f-48f4-809b-6db81964983e-kube-api-access-hxn4t" (OuterVolumeSpecName: "kube-api-access-hxn4t") pod "20a67a90-368f-48f4-809b-6db81964983e" (UID: "20a67a90-368f-48f4-809b-6db81964983e"). InnerVolumeSpecName "kube-api-access-hxn4t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 13:15:34 crc kubenswrapper[4735]: I1008 13:15:34.036758 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df3ac5b8-1365-49a1-8a4e-0fa11bd0cd83-kube-api-access-hv86w" (OuterVolumeSpecName: "kube-api-access-hv86w") pod "df3ac5b8-1365-49a1-8a4e-0fa11bd0cd83" (UID: "df3ac5b8-1365-49a1-8a4e-0fa11bd0cd83"). InnerVolumeSpecName "kube-api-access-hv86w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 13:15:34 crc kubenswrapper[4735]: I1008 13:15:34.134691 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxn4t\" (UniqueName: \"kubernetes.io/projected/20a67a90-368f-48f4-809b-6db81964983e-kube-api-access-hxn4t\") on node \"crc\" DevicePath \"\""
Oct 08 13:15:34 crc kubenswrapper[4735]: I1008 13:15:34.134722 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hv86w\" (UniqueName: \"kubernetes.io/projected/df3ac5b8-1365-49a1-8a4e-0fa11bd0cd83-kube-api-access-hv86w\") on node \"crc\" DevicePath \"\""
Oct 08 13:15:34 crc kubenswrapper[4735]: I1008 13:15:34.281864 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-6cx9z"]
Oct 08 13:15:34 crc kubenswrapper[4735]: W1008 13:15:34.289304 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b3f41c8_2dbe_4c57_8068_3db0ae0e44c4.slice/crio-fbaeda81a36b97c14ad781cc9fee6a778addabeb82b1d1845286875f724eb895 WatchSource:0}: Error finding container fbaeda81a36b97c14ad781cc9fee6a778addabeb82b1d1845286875f724eb895: Status 404 returned error can't find the container with id fbaeda81a36b97c14ad781cc9fee6a778addabeb82b1d1845286875f724eb895
Oct 08 13:15:34 crc kubenswrapper[4735]: I1008 13:15:34.323326 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-564965cbfc-8hd2q"
Oct 08 13:15:34 crc kubenswrapper[4735]: I1008 13:15:34.369794 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-4fb6-account-create-wgj75" event={"ID":"df3ac5b8-1365-49a1-8a4e-0fa11bd0cd83","Type":"ContainerDied","Data":"432e4f86960680e446c24d1e8ec6c3c306e5e4528c9964e87e39dfbc5f534242"}
Oct 08 13:15:34 crc kubenswrapper[4735]: I1008 13:15:34.369837 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="432e4f86960680e446c24d1e8ec6c3c306e5e4528c9964e87e39dfbc5f534242"
Oct 08 13:15:34 crc kubenswrapper[4735]: I1008 13:15:34.369907 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-4fb6-account-create-wgj75"
Oct 08 13:15:34 crc kubenswrapper[4735]: I1008 13:15:34.372460 4735 generic.go:334] "Generic (PLEG): container finished" podID="d8270ba1-30fc-45a6-870e-89853cba5d2b" containerID="c3c5ec8d7979557bacbe981c514dbadad3da58c4514bd9e7e7bf178bcc773902" exitCode=0
Oct 08 13:15:34 crc kubenswrapper[4735]: I1008 13:15:34.372560 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-564965cbfc-8hd2q" event={"ID":"d8270ba1-30fc-45a6-870e-89853cba5d2b","Type":"ContainerDied","Data":"c3c5ec8d7979557bacbe981c514dbadad3da58c4514bd9e7e7bf178bcc773902"}
Oct 08 13:15:34 crc kubenswrapper[4735]: I1008 13:15:34.372593 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-564965cbfc-8hd2q" event={"ID":"d8270ba1-30fc-45a6-870e-89853cba5d2b","Type":"ContainerDied","Data":"3c48685af7d86a087352794a30a5733d8bed8149767d8276145ac3a0704a09f9"}
Oct 08 13:15:34 crc kubenswrapper[4735]: I1008 13:15:34.372614 4735 scope.go:117] "RemoveContainer" containerID="c3c5ec8d7979557bacbe981c514dbadad3da58c4514bd9e7e7bf178bcc773902"
Oct 08 13:15:34 crc kubenswrapper[4735]: I1008 13:15:34.372563 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-564965cbfc-8hd2q"
Oct 08 13:15:34 crc kubenswrapper[4735]: I1008 13:15:34.376530 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-c85c-account-create-sjbsn"
Oct 08 13:15:34 crc kubenswrapper[4735]: I1008 13:15:34.376761 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-c85c-account-create-sjbsn" event={"ID":"20a67a90-368f-48f4-809b-6db81964983e","Type":"ContainerDied","Data":"01588d5563a7b5c31cce3f2324b9dc01016b4174cdaec0c1e9ea31089263e15b"}
Oct 08 13:15:34 crc kubenswrapper[4735]: I1008 13:15:34.376872 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01588d5563a7b5c31cce3f2324b9dc01016b4174cdaec0c1e9ea31089263e15b"
Oct 08 13:15:34 crc kubenswrapper[4735]: I1008 13:15:34.378515 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6cx9z" event={"ID":"9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4","Type":"ContainerStarted","Data":"fbaeda81a36b97c14ad781cc9fee6a778addabeb82b1d1845286875f724eb895"}
Oct 08 13:15:34 crc kubenswrapper[4735]: I1008 13:15:34.407377 4735 scope.go:117] "RemoveContainer" containerID="fa2a11192c5a09a015b00a65b1706e57cb4fbbb0c858687f1e8afe97b2876333"
Oct 08 13:15:34 crc kubenswrapper[4735]: I1008 13:15:34.439182 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8270ba1-30fc-45a6-870e-89853cba5d2b-ovsdbserver-nb\") pod \"d8270ba1-30fc-45a6-870e-89853cba5d2b\" (UID: \"d8270ba1-30fc-45a6-870e-89853cba5d2b\") "
Oct 08 13:15:34 crc kubenswrapper[4735]: I1008 13:15:34.439255 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d8270ba1-30fc-45a6-870e-89853cba5d2b-dns-swift-storage-0\") pod \"d8270ba1-30fc-45a6-870e-89853cba5d2b\" (UID: \"d8270ba1-30fc-45a6-870e-89853cba5d2b\") "
Oct 08 13:15:34 crc kubenswrapper[4735]: I1008 13:15:34.439300 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6h2tb\" (UniqueName: \"kubernetes.io/projected/d8270ba1-30fc-45a6-870e-89853cba5d2b-kube-api-access-6h2tb\") pod \"d8270ba1-30fc-45a6-870e-89853cba5d2b\" (UID: \"d8270ba1-30fc-45a6-870e-89853cba5d2b\") "
Oct 08 13:15:34 crc kubenswrapper[4735]: I1008 13:15:34.439353 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8270ba1-30fc-45a6-870e-89853cba5d2b-dns-svc\") pod \"d8270ba1-30fc-45a6-870e-89853cba5d2b\" (UID: \"d8270ba1-30fc-45a6-870e-89853cba5d2b\") "
Oct 08 13:15:34 crc kubenswrapper[4735]: I1008 13:15:34.439393 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8270ba1-30fc-45a6-870e-89853cba5d2b-ovsdbserver-sb\") pod \"d8270ba1-30fc-45a6-870e-89853cba5d2b\" (UID: \"d8270ba1-30fc-45a6-870e-89853cba5d2b\") "
Oct 08 13:15:34 crc kubenswrapper[4735]: I1008 13:15:34.439422 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8270ba1-30fc-45a6-870e-89853cba5d2b-config\") pod \"d8270ba1-30fc-45a6-870e-89853cba5d2b\" (UID: \"d8270ba1-30fc-45a6-870e-89853cba5d2b\") "
Oct 08 13:15:34 crc kubenswrapper[4735]: I1008 13:15:34.454828 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8270ba1-30fc-45a6-870e-89853cba5d2b-kube-api-access-6h2tb" (OuterVolumeSpecName: "kube-api-access-6h2tb") pod "d8270ba1-30fc-45a6-870e-89853cba5d2b" (UID: "d8270ba1-30fc-45a6-870e-89853cba5d2b"). InnerVolumeSpecName "kube-api-access-6h2tb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 13:15:34 crc kubenswrapper[4735]: I1008 13:15:34.460157 4735 scope.go:117] "RemoveContainer" containerID="c3c5ec8d7979557bacbe981c514dbadad3da58c4514bd9e7e7bf178bcc773902"
Oct 08 13:15:34 crc kubenswrapper[4735]: E1008 13:15:34.460841 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3c5ec8d7979557bacbe981c514dbadad3da58c4514bd9e7e7bf178bcc773902\": container with ID starting with c3c5ec8d7979557bacbe981c514dbadad3da58c4514bd9e7e7bf178bcc773902 not found: ID does not exist" containerID="c3c5ec8d7979557bacbe981c514dbadad3da58c4514bd9e7e7bf178bcc773902"
Oct 08 13:15:34 crc kubenswrapper[4735]: I1008 13:15:34.460875 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3c5ec8d7979557bacbe981c514dbadad3da58c4514bd9e7e7bf178bcc773902"} err="failed to get container status \"c3c5ec8d7979557bacbe981c514dbadad3da58c4514bd9e7e7bf178bcc773902\": rpc error: code = NotFound desc = could not find container \"c3c5ec8d7979557bacbe981c514dbadad3da58c4514bd9e7e7bf178bcc773902\": container with ID starting with c3c5ec8d7979557bacbe981c514dbadad3da58c4514bd9e7e7bf178bcc773902 not found: ID does not exist"
Oct 08 13:15:34 crc kubenswrapper[4735]: I1008 13:15:34.460897 4735 scope.go:117] "RemoveContainer" containerID="fa2a11192c5a09a015b00a65b1706e57cb4fbbb0c858687f1e8afe97b2876333"
Oct 08 13:15:34 crc kubenswrapper[4735]: E1008 13:15:34.461390 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa2a11192c5a09a015b00a65b1706e57cb4fbbb0c858687f1e8afe97b2876333\": container with ID starting with fa2a11192c5a09a015b00a65b1706e57cb4fbbb0c858687f1e8afe97b2876333 not found: ID does not exist" containerID="fa2a11192c5a09a015b00a65b1706e57cb4fbbb0c858687f1e8afe97b2876333"
Oct 08 13:15:34 crc kubenswrapper[4735]: I1008 13:15:34.461443 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa2a11192c5a09a015b00a65b1706e57cb4fbbb0c858687f1e8afe97b2876333"} err="failed to get container status \"fa2a11192c5a09a015b00a65b1706e57cb4fbbb0c858687f1e8afe97b2876333\": rpc error: code = NotFound desc = could not find container \"fa2a11192c5a09a015b00a65b1706e57cb4fbbb0c858687f1e8afe97b2876333\": container with ID starting with fa2a11192c5a09a015b00a65b1706e57cb4fbbb0c858687f1e8afe97b2876333 not found: ID does not exist"
Oct 08 13:15:34 crc kubenswrapper[4735]: I1008 13:15:34.531710 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8270ba1-30fc-45a6-870e-89853cba5d2b-config" (OuterVolumeSpecName: "config") pod "d8270ba1-30fc-45a6-870e-89853cba5d2b" (UID: "d8270ba1-30fc-45a6-870e-89853cba5d2b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 13:15:34 crc kubenswrapper[4735]: I1008 13:15:34.533384 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8270ba1-30fc-45a6-870e-89853cba5d2b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d8270ba1-30fc-45a6-870e-89853cba5d2b" (UID: "d8270ba1-30fc-45a6-870e-89853cba5d2b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 13:15:34 crc kubenswrapper[4735]: I1008 13:15:34.535439 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8270ba1-30fc-45a6-870e-89853cba5d2b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d8270ba1-30fc-45a6-870e-89853cba5d2b" (UID: "d8270ba1-30fc-45a6-870e-89853cba5d2b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 13:15:34 crc kubenswrapper[4735]: I1008 13:15:34.535751 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8270ba1-30fc-45a6-870e-89853cba5d2b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d8270ba1-30fc-45a6-870e-89853cba5d2b" (UID: "d8270ba1-30fc-45a6-870e-89853cba5d2b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 13:15:34 crc kubenswrapper[4735]: I1008 13:15:34.541312 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8270ba1-30fc-45a6-870e-89853cba5d2b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 08 13:15:34 crc kubenswrapper[4735]: I1008 13:15:34.541352 4735 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d8270ba1-30fc-45a6-870e-89853cba5d2b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Oct 08 13:15:34 crc kubenswrapper[4735]: I1008 13:15:34.541362 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6h2tb\" (UniqueName: \"kubernetes.io/projected/d8270ba1-30fc-45a6-870e-89853cba5d2b-kube-api-access-6h2tb\") on node \"crc\" DevicePath \"\""
Oct 08 13:15:34 crc kubenswrapper[4735]: I1008 13:15:34.541372 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8270ba1-30fc-45a6-870e-89853cba5d2b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 08 13:15:34 crc kubenswrapper[4735]: I1008 13:15:34.541381 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8270ba1-30fc-45a6-870e-89853cba5d2b-config\") on node \"crc\" DevicePath \"\""
Oct 08 13:15:34 crc kubenswrapper[4735]: I1008 13:15:34.544243 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/d8270ba1-30fc-45a6-870e-89853cba5d2b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d8270ba1-30fc-45a6-870e-89853cba5d2b" (UID: "d8270ba1-30fc-45a6-870e-89853cba5d2b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:15:34 crc kubenswrapper[4735]: I1008 13:15:34.643196 4735 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8270ba1-30fc-45a6-870e-89853cba5d2b-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 13:15:34 crc kubenswrapper[4735]: I1008 13:15:34.708200 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-564965cbfc-8hd2q"] Oct 08 13:15:34 crc kubenswrapper[4735]: I1008 13:15:34.720597 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-564965cbfc-8hd2q"] Oct 08 13:15:34 crc kubenswrapper[4735]: I1008 13:15:34.906610 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8270ba1-30fc-45a6-870e-89853cba5d2b" path="/var/lib/kubelet/pods/d8270ba1-30fc-45a6-870e-89853cba5d2b/volumes" Oct 08 13:15:35 crc kubenswrapper[4735]: I1008 13:15:35.394766 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6cx9z" event={"ID":"9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4","Type":"ContainerStarted","Data":"a1c40defa3c248eddcb567968509785acce19b04b60e353a43058057f878389b"} Oct 08 13:15:35 crc kubenswrapper[4735]: I1008 13:15:35.398055 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"328b1aa0-7e48-4aa0-b456-e1beefb886c0","Type":"ContainerStarted","Data":"fd3f1e1a91c4a5ce714042c3fb2b942b6db4ce9bbbba4c7bb7f373507469b9ab"} Oct 08 13:15:35 crc kubenswrapper[4735]: I1008 13:15:35.399680 4735 generic.go:334] "Generic (PLEG): container finished" podID="cb6559ca-82f8-4271-bb6b-47f154089579" containerID="9394c36d6bc15b7aa1942d29aa09ff824a385998e4098ce00e974bd466bdecfa" exitCode=0 Oct 08 13:15:35 crc 
kubenswrapper[4735]: I1008 13:15:35.399765 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-fz5qt" event={"ID":"cb6559ca-82f8-4271-bb6b-47f154089579","Type":"ContainerDied","Data":"9394c36d6bc15b7aa1942d29aa09ff824a385998e4098ce00e974bd466bdecfa"} Oct 08 13:15:35 crc kubenswrapper[4735]: I1008 13:15:35.415290 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-6cx9z" podStartSLOduration=3.415273962 podStartE2EDuration="3.415273962s" podCreationTimestamp="2025-10-08 13:15:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 13:15:35.409959145 +0000 UTC m=+1185.218263689" watchObservedRunningTime="2025-10-08 13:15:35.415273962 +0000 UTC m=+1185.223578506" Oct 08 13:15:36 crc kubenswrapper[4735]: I1008 13:15:36.784926 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-fz5qt" Oct 08 13:15:36 crc kubenswrapper[4735]: I1008 13:15:36.887701 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb6559ca-82f8-4271-bb6b-47f154089579-combined-ca-bundle\") pod \"cb6559ca-82f8-4271-bb6b-47f154089579\" (UID: \"cb6559ca-82f8-4271-bb6b-47f154089579\") " Oct 08 13:15:36 crc kubenswrapper[4735]: I1008 13:15:36.887775 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb6559ca-82f8-4271-bb6b-47f154089579-logs\") pod \"cb6559ca-82f8-4271-bb6b-47f154089579\" (UID: \"cb6559ca-82f8-4271-bb6b-47f154089579\") " Oct 08 13:15:36 crc kubenswrapper[4735]: I1008 13:15:36.887818 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb6559ca-82f8-4271-bb6b-47f154089579-config-data\") pod 
\"cb6559ca-82f8-4271-bb6b-47f154089579\" (UID: \"cb6559ca-82f8-4271-bb6b-47f154089579\") " Oct 08 13:15:36 crc kubenswrapper[4735]: I1008 13:15:36.887848 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb6559ca-82f8-4271-bb6b-47f154089579-scripts\") pod \"cb6559ca-82f8-4271-bb6b-47f154089579\" (UID: \"cb6559ca-82f8-4271-bb6b-47f154089579\") " Oct 08 13:15:36 crc kubenswrapper[4735]: I1008 13:15:36.887923 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vsf7k\" (UniqueName: \"kubernetes.io/projected/cb6559ca-82f8-4271-bb6b-47f154089579-kube-api-access-vsf7k\") pod \"cb6559ca-82f8-4271-bb6b-47f154089579\" (UID: \"cb6559ca-82f8-4271-bb6b-47f154089579\") " Oct 08 13:15:36 crc kubenswrapper[4735]: I1008 13:15:36.888116 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb6559ca-82f8-4271-bb6b-47f154089579-logs" (OuterVolumeSpecName: "logs") pod "cb6559ca-82f8-4271-bb6b-47f154089579" (UID: "cb6559ca-82f8-4271-bb6b-47f154089579"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:15:36 crc kubenswrapper[4735]: I1008 13:15:36.888355 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb6559ca-82f8-4271-bb6b-47f154089579-logs\") on node \"crc\" DevicePath \"\"" Oct 08 13:15:36 crc kubenswrapper[4735]: I1008 13:15:36.901797 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb6559ca-82f8-4271-bb6b-47f154089579-kube-api-access-vsf7k" (OuterVolumeSpecName: "kube-api-access-vsf7k") pod "cb6559ca-82f8-4271-bb6b-47f154089579" (UID: "cb6559ca-82f8-4271-bb6b-47f154089579"). InnerVolumeSpecName "kube-api-access-vsf7k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:15:36 crc kubenswrapper[4735]: I1008 13:15:36.901829 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb6559ca-82f8-4271-bb6b-47f154089579-scripts" (OuterVolumeSpecName: "scripts") pod "cb6559ca-82f8-4271-bb6b-47f154089579" (UID: "cb6559ca-82f8-4271-bb6b-47f154089579"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:15:36 crc kubenswrapper[4735]: I1008 13:15:36.917710 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb6559ca-82f8-4271-bb6b-47f154089579-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cb6559ca-82f8-4271-bb6b-47f154089579" (UID: "cb6559ca-82f8-4271-bb6b-47f154089579"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:15:36 crc kubenswrapper[4735]: I1008 13:15:36.919799 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb6559ca-82f8-4271-bb6b-47f154089579-config-data" (OuterVolumeSpecName: "config-data") pod "cb6559ca-82f8-4271-bb6b-47f154089579" (UID: "cb6559ca-82f8-4271-bb6b-47f154089579"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:15:36 crc kubenswrapper[4735]: I1008 13:15:36.990498 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb6559ca-82f8-4271-bb6b-47f154089579-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 13:15:36 crc kubenswrapper[4735]: I1008 13:15:36.990548 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb6559ca-82f8-4271-bb6b-47f154089579-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 13:15:36 crc kubenswrapper[4735]: I1008 13:15:36.990562 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb6559ca-82f8-4271-bb6b-47f154089579-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 13:15:36 crc kubenswrapper[4735]: I1008 13:15:36.990576 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vsf7k\" (UniqueName: \"kubernetes.io/projected/cb6559ca-82f8-4271-bb6b-47f154089579-kube-api-access-vsf7k\") on node \"crc\" DevicePath \"\"" Oct 08 13:15:37 crc kubenswrapper[4735]: I1008 13:15:37.426049 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-fz5qt" event={"ID":"cb6559ca-82f8-4271-bb6b-47f154089579","Type":"ContainerDied","Data":"b2955b0873661a5db4e8de33d9cbc50ca6c6ac2c438fec5ec5651c6b7e6db36c"} Oct 08 13:15:37 crc kubenswrapper[4735]: I1008 13:15:37.426094 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2955b0873661a5db4e8de33d9cbc50ca6c6ac2c438fec5ec5651c6b7e6db36c" Oct 08 13:15:37 crc kubenswrapper[4735]: I1008 13:15:37.426105 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-fz5qt" Oct 08 13:15:37 crc kubenswrapper[4735]: I1008 13:15:37.545994 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-8555f7d884-2km9p"] Oct 08 13:15:37 crc kubenswrapper[4735]: E1008 13:15:37.546360 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb6559ca-82f8-4271-bb6b-47f154089579" containerName="placement-db-sync" Oct 08 13:15:37 crc kubenswrapper[4735]: I1008 13:15:37.546375 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb6559ca-82f8-4271-bb6b-47f154089579" containerName="placement-db-sync" Oct 08 13:15:37 crc kubenswrapper[4735]: E1008 13:15:37.546388 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df3ac5b8-1365-49a1-8a4e-0fa11bd0cd83" containerName="mariadb-account-create" Oct 08 13:15:37 crc kubenswrapper[4735]: I1008 13:15:37.546394 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="df3ac5b8-1365-49a1-8a4e-0fa11bd0cd83" containerName="mariadb-account-create" Oct 08 13:15:37 crc kubenswrapper[4735]: E1008 13:15:37.546409 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8270ba1-30fc-45a6-870e-89853cba5d2b" containerName="dnsmasq-dns" Oct 08 13:15:37 crc kubenswrapper[4735]: I1008 13:15:37.546416 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8270ba1-30fc-45a6-870e-89853cba5d2b" containerName="dnsmasq-dns" Oct 08 13:15:37 crc kubenswrapper[4735]: E1008 13:15:37.546431 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20a67a90-368f-48f4-809b-6db81964983e" containerName="mariadb-account-create" Oct 08 13:15:37 crc kubenswrapper[4735]: I1008 13:15:37.546438 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="20a67a90-368f-48f4-809b-6db81964983e" containerName="mariadb-account-create" Oct 08 13:15:37 crc kubenswrapper[4735]: E1008 13:15:37.546455 4735 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d8270ba1-30fc-45a6-870e-89853cba5d2b" containerName="init" Oct 08 13:15:37 crc kubenswrapper[4735]: I1008 13:15:37.546462 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8270ba1-30fc-45a6-870e-89853cba5d2b" containerName="init" Oct 08 13:15:37 crc kubenswrapper[4735]: I1008 13:15:37.546662 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="20a67a90-368f-48f4-809b-6db81964983e" containerName="mariadb-account-create" Oct 08 13:15:37 crc kubenswrapper[4735]: I1008 13:15:37.546678 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="df3ac5b8-1365-49a1-8a4e-0fa11bd0cd83" containerName="mariadb-account-create" Oct 08 13:15:37 crc kubenswrapper[4735]: I1008 13:15:37.546694 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8270ba1-30fc-45a6-870e-89853cba5d2b" containerName="dnsmasq-dns" Oct 08 13:15:37 crc kubenswrapper[4735]: I1008 13:15:37.546704 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb6559ca-82f8-4271-bb6b-47f154089579" containerName="placement-db-sync" Oct 08 13:15:37 crc kubenswrapper[4735]: I1008 13:15:37.547630 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-8555f7d884-2km9p" Oct 08 13:15:37 crc kubenswrapper[4735]: I1008 13:15:37.551797 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Oct 08 13:15:37 crc kubenswrapper[4735]: I1008 13:15:37.551839 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-g2lts" Oct 08 13:15:37 crc kubenswrapper[4735]: I1008 13:15:37.551857 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 08 13:15:37 crc kubenswrapper[4735]: I1008 13:15:37.551857 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Oct 08 13:15:37 crc kubenswrapper[4735]: I1008 13:15:37.552463 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 08 13:15:37 crc kubenswrapper[4735]: I1008 13:15:37.590956 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-8555f7d884-2km9p"] Oct 08 13:15:37 crc kubenswrapper[4735]: I1008 13:15:37.703721 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/70450186-9a74-4ba9-92f9-3f3a9626d0a0-public-tls-certs\") pod \"placement-8555f7d884-2km9p\" (UID: \"70450186-9a74-4ba9-92f9-3f3a9626d0a0\") " pod="openstack/placement-8555f7d884-2km9p" Oct 08 13:15:37 crc kubenswrapper[4735]: I1008 13:15:37.704080 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/70450186-9a74-4ba9-92f9-3f3a9626d0a0-internal-tls-certs\") pod \"placement-8555f7d884-2km9p\" (UID: \"70450186-9a74-4ba9-92f9-3f3a9626d0a0\") " pod="openstack/placement-8555f7d884-2km9p" Oct 08 13:15:37 crc kubenswrapper[4735]: I1008 13:15:37.704424 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdtgs\" (UniqueName: \"kubernetes.io/projected/70450186-9a74-4ba9-92f9-3f3a9626d0a0-kube-api-access-gdtgs\") pod \"placement-8555f7d884-2km9p\" (UID: \"70450186-9a74-4ba9-92f9-3f3a9626d0a0\") " pod="openstack/placement-8555f7d884-2km9p" Oct 08 13:15:37 crc kubenswrapper[4735]: I1008 13:15:37.704552 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70450186-9a74-4ba9-92f9-3f3a9626d0a0-config-data\") pod \"placement-8555f7d884-2km9p\" (UID: \"70450186-9a74-4ba9-92f9-3f3a9626d0a0\") " pod="openstack/placement-8555f7d884-2km9p" Oct 08 13:15:37 crc kubenswrapper[4735]: I1008 13:15:37.704701 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70450186-9a74-4ba9-92f9-3f3a9626d0a0-scripts\") pod \"placement-8555f7d884-2km9p\" (UID: \"70450186-9a74-4ba9-92f9-3f3a9626d0a0\") " pod="openstack/placement-8555f7d884-2km9p" Oct 08 13:15:37 crc kubenswrapper[4735]: I1008 13:15:37.704848 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70450186-9a74-4ba9-92f9-3f3a9626d0a0-combined-ca-bundle\") pod \"placement-8555f7d884-2km9p\" (UID: \"70450186-9a74-4ba9-92f9-3f3a9626d0a0\") " pod="openstack/placement-8555f7d884-2km9p" Oct 08 13:15:37 crc kubenswrapper[4735]: I1008 13:15:37.704967 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70450186-9a74-4ba9-92f9-3f3a9626d0a0-logs\") pod \"placement-8555f7d884-2km9p\" (UID: \"70450186-9a74-4ba9-92f9-3f3a9626d0a0\") " pod="openstack/placement-8555f7d884-2km9p" Oct 08 13:15:37 crc kubenswrapper[4735]: I1008 13:15:37.806586 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70450186-9a74-4ba9-92f9-3f3a9626d0a0-scripts\") pod \"placement-8555f7d884-2km9p\" (UID: \"70450186-9a74-4ba9-92f9-3f3a9626d0a0\") " pod="openstack/placement-8555f7d884-2km9p" Oct 08 13:15:37 crc kubenswrapper[4735]: I1008 13:15:37.806642 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70450186-9a74-4ba9-92f9-3f3a9626d0a0-combined-ca-bundle\") pod \"placement-8555f7d884-2km9p\" (UID: \"70450186-9a74-4ba9-92f9-3f3a9626d0a0\") " pod="openstack/placement-8555f7d884-2km9p" Oct 08 13:15:37 crc kubenswrapper[4735]: I1008 13:15:37.806700 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70450186-9a74-4ba9-92f9-3f3a9626d0a0-logs\") pod \"placement-8555f7d884-2km9p\" (UID: \"70450186-9a74-4ba9-92f9-3f3a9626d0a0\") " pod="openstack/placement-8555f7d884-2km9p" Oct 08 13:15:37 crc kubenswrapper[4735]: I1008 13:15:37.806737 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/70450186-9a74-4ba9-92f9-3f3a9626d0a0-public-tls-certs\") pod \"placement-8555f7d884-2km9p\" (UID: \"70450186-9a74-4ba9-92f9-3f3a9626d0a0\") " pod="openstack/placement-8555f7d884-2km9p" Oct 08 13:15:37 crc kubenswrapper[4735]: I1008 13:15:37.806787 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/70450186-9a74-4ba9-92f9-3f3a9626d0a0-internal-tls-certs\") pod \"placement-8555f7d884-2km9p\" (UID: \"70450186-9a74-4ba9-92f9-3f3a9626d0a0\") " pod="openstack/placement-8555f7d884-2km9p" Oct 08 13:15:37 crc kubenswrapper[4735]: I1008 13:15:37.806835 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdtgs\" (UniqueName: 
\"kubernetes.io/projected/70450186-9a74-4ba9-92f9-3f3a9626d0a0-kube-api-access-gdtgs\") pod \"placement-8555f7d884-2km9p\" (UID: \"70450186-9a74-4ba9-92f9-3f3a9626d0a0\") " pod="openstack/placement-8555f7d884-2km9p" Oct 08 13:15:37 crc kubenswrapper[4735]: I1008 13:15:37.806872 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70450186-9a74-4ba9-92f9-3f3a9626d0a0-config-data\") pod \"placement-8555f7d884-2km9p\" (UID: \"70450186-9a74-4ba9-92f9-3f3a9626d0a0\") " pod="openstack/placement-8555f7d884-2km9p" Oct 08 13:15:37 crc kubenswrapper[4735]: I1008 13:15:37.810053 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70450186-9a74-4ba9-92f9-3f3a9626d0a0-logs\") pod \"placement-8555f7d884-2km9p\" (UID: \"70450186-9a74-4ba9-92f9-3f3a9626d0a0\") " pod="openstack/placement-8555f7d884-2km9p" Oct 08 13:15:37 crc kubenswrapper[4735]: I1008 13:15:37.812870 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70450186-9a74-4ba9-92f9-3f3a9626d0a0-scripts\") pod \"placement-8555f7d884-2km9p\" (UID: \"70450186-9a74-4ba9-92f9-3f3a9626d0a0\") " pod="openstack/placement-8555f7d884-2km9p" Oct 08 13:15:37 crc kubenswrapper[4735]: I1008 13:15:37.813112 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70450186-9a74-4ba9-92f9-3f3a9626d0a0-combined-ca-bundle\") pod \"placement-8555f7d884-2km9p\" (UID: \"70450186-9a74-4ba9-92f9-3f3a9626d0a0\") " pod="openstack/placement-8555f7d884-2km9p" Oct 08 13:15:37 crc kubenswrapper[4735]: I1008 13:15:37.813588 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/70450186-9a74-4ba9-92f9-3f3a9626d0a0-internal-tls-certs\") pod \"placement-8555f7d884-2km9p\" (UID: 
\"70450186-9a74-4ba9-92f9-3f3a9626d0a0\") " pod="openstack/placement-8555f7d884-2km9p" Oct 08 13:15:37 crc kubenswrapper[4735]: I1008 13:15:37.813656 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/70450186-9a74-4ba9-92f9-3f3a9626d0a0-public-tls-certs\") pod \"placement-8555f7d884-2km9p\" (UID: \"70450186-9a74-4ba9-92f9-3f3a9626d0a0\") " pod="openstack/placement-8555f7d884-2km9p" Oct 08 13:15:37 crc kubenswrapper[4735]: I1008 13:15:37.826093 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdtgs\" (UniqueName: \"kubernetes.io/projected/70450186-9a74-4ba9-92f9-3f3a9626d0a0-kube-api-access-gdtgs\") pod \"placement-8555f7d884-2km9p\" (UID: \"70450186-9a74-4ba9-92f9-3f3a9626d0a0\") " pod="openstack/placement-8555f7d884-2km9p" Oct 08 13:15:37 crc kubenswrapper[4735]: I1008 13:15:37.836491 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70450186-9a74-4ba9-92f9-3f3a9626d0a0-config-data\") pod \"placement-8555f7d884-2km9p\" (UID: \"70450186-9a74-4ba9-92f9-3f3a9626d0a0\") " pod="openstack/placement-8555f7d884-2km9p" Oct 08 13:15:37 crc kubenswrapper[4735]: I1008 13:15:37.873595 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-8555f7d884-2km9p" Oct 08 13:15:38 crc kubenswrapper[4735]: I1008 13:15:38.439262 4735 generic.go:334] "Generic (PLEG): container finished" podID="9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4" containerID="a1c40defa3c248eddcb567968509785acce19b04b60e353a43058057f878389b" exitCode=0 Oct 08 13:15:38 crc kubenswrapper[4735]: I1008 13:15:38.439320 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6cx9z" event={"ID":"9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4","Type":"ContainerDied","Data":"a1c40defa3c248eddcb567968509785acce19b04b60e353a43058057f878389b"} Oct 08 13:15:38 crc kubenswrapper[4735]: I1008 13:15:38.943227 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-cdrp4"] Oct 08 13:15:38 crc kubenswrapper[4735]: I1008 13:15:38.945321 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-cdrp4" Oct 08 13:15:38 crc kubenswrapper[4735]: I1008 13:15:38.948138 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-j2pkq" Oct 08 13:15:38 crc kubenswrapper[4735]: I1008 13:15:38.948406 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 08 13:15:38 crc kubenswrapper[4735]: I1008 13:15:38.954497 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-cdrp4"] Oct 08 13:15:39 crc kubenswrapper[4735]: I1008 13:15:39.033040 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b87k8\" (UniqueName: \"kubernetes.io/projected/e0c034b6-95c9-42d6-8490-0d759b602d82-kube-api-access-b87k8\") pod \"barbican-db-sync-cdrp4\" (UID: \"e0c034b6-95c9-42d6-8490-0d759b602d82\") " pod="openstack/barbican-db-sync-cdrp4" Oct 08 13:15:39 crc kubenswrapper[4735]: I1008 13:15:39.033187 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0c034b6-95c9-42d6-8490-0d759b602d82-combined-ca-bundle\") pod \"barbican-db-sync-cdrp4\" (UID: \"e0c034b6-95c9-42d6-8490-0d759b602d82\") " pod="openstack/barbican-db-sync-cdrp4" Oct 08 13:15:39 crc kubenswrapper[4735]: I1008 13:15:39.033251 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e0c034b6-95c9-42d6-8490-0d759b602d82-db-sync-config-data\") pod \"barbican-db-sync-cdrp4\" (UID: \"e0c034b6-95c9-42d6-8490-0d759b602d82\") " pod="openstack/barbican-db-sync-cdrp4" Oct 08 13:15:39 crc kubenswrapper[4735]: I1008 13:15:39.047603 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-nplcf"] Oct 08 13:15:39 crc kubenswrapper[4735]: I1008 13:15:39.048831 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-nplcf" Oct 08 13:15:39 crc kubenswrapper[4735]: I1008 13:15:39.050541 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 08 13:15:39 crc kubenswrapper[4735]: I1008 13:15:39.050642 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-vzr64" Oct 08 13:15:39 crc kubenswrapper[4735]: I1008 13:15:39.051908 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 08 13:15:39 crc kubenswrapper[4735]: I1008 13:15:39.056641 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-nplcf"] Oct 08 13:15:39 crc kubenswrapper[4735]: I1008 13:15:39.135159 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0c034b6-95c9-42d6-8490-0d759b602d82-combined-ca-bundle\") pod \"barbican-db-sync-cdrp4\" (UID: 
\"e0c034b6-95c9-42d6-8490-0d759b602d82\") " pod="openstack/barbican-db-sync-cdrp4" Oct 08 13:15:39 crc kubenswrapper[4735]: I1008 13:15:39.135207 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2dd43efe-433d-4590-bcf9-3304cd6e32fb-scripts\") pod \"cinder-db-sync-nplcf\" (UID: \"2dd43efe-433d-4590-bcf9-3304cd6e32fb\") " pod="openstack/cinder-db-sync-nplcf" Oct 08 13:15:39 crc kubenswrapper[4735]: I1008 13:15:39.135234 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2dd43efe-433d-4590-bcf9-3304cd6e32fb-db-sync-config-data\") pod \"cinder-db-sync-nplcf\" (UID: \"2dd43efe-433d-4590-bcf9-3304cd6e32fb\") " pod="openstack/cinder-db-sync-nplcf" Oct 08 13:15:39 crc kubenswrapper[4735]: I1008 13:15:39.135281 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2dd43efe-433d-4590-bcf9-3304cd6e32fb-etc-machine-id\") pod \"cinder-db-sync-nplcf\" (UID: \"2dd43efe-433d-4590-bcf9-3304cd6e32fb\") " pod="openstack/cinder-db-sync-nplcf" Oct 08 13:15:39 crc kubenswrapper[4735]: I1008 13:15:39.135395 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mvxm\" (UniqueName: \"kubernetes.io/projected/2dd43efe-433d-4590-bcf9-3304cd6e32fb-kube-api-access-6mvxm\") pod \"cinder-db-sync-nplcf\" (UID: \"2dd43efe-433d-4590-bcf9-3304cd6e32fb\") " pod="openstack/cinder-db-sync-nplcf" Oct 08 13:15:39 crc kubenswrapper[4735]: I1008 13:15:39.135453 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e0c034b6-95c9-42d6-8490-0d759b602d82-db-sync-config-data\") pod \"barbican-db-sync-cdrp4\" (UID: \"e0c034b6-95c9-42d6-8490-0d759b602d82\") " 
pod="openstack/barbican-db-sync-cdrp4" Oct 08 13:15:39 crc kubenswrapper[4735]: I1008 13:15:39.135533 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dd43efe-433d-4590-bcf9-3304cd6e32fb-config-data\") pod \"cinder-db-sync-nplcf\" (UID: \"2dd43efe-433d-4590-bcf9-3304cd6e32fb\") " pod="openstack/cinder-db-sync-nplcf" Oct 08 13:15:39 crc kubenswrapper[4735]: I1008 13:15:39.135610 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b87k8\" (UniqueName: \"kubernetes.io/projected/e0c034b6-95c9-42d6-8490-0d759b602d82-kube-api-access-b87k8\") pod \"barbican-db-sync-cdrp4\" (UID: \"e0c034b6-95c9-42d6-8490-0d759b602d82\") " pod="openstack/barbican-db-sync-cdrp4" Oct 08 13:15:39 crc kubenswrapper[4735]: I1008 13:15:39.135647 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dd43efe-433d-4590-bcf9-3304cd6e32fb-combined-ca-bundle\") pod \"cinder-db-sync-nplcf\" (UID: \"2dd43efe-433d-4590-bcf9-3304cd6e32fb\") " pod="openstack/cinder-db-sync-nplcf" Oct 08 13:15:39 crc kubenswrapper[4735]: I1008 13:15:39.139741 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e0c034b6-95c9-42d6-8490-0d759b602d82-db-sync-config-data\") pod \"barbican-db-sync-cdrp4\" (UID: \"e0c034b6-95c9-42d6-8490-0d759b602d82\") " pod="openstack/barbican-db-sync-cdrp4" Oct 08 13:15:39 crc kubenswrapper[4735]: I1008 13:15:39.145143 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0c034b6-95c9-42d6-8490-0d759b602d82-combined-ca-bundle\") pod \"barbican-db-sync-cdrp4\" (UID: \"e0c034b6-95c9-42d6-8490-0d759b602d82\") " pod="openstack/barbican-db-sync-cdrp4" Oct 08 13:15:39 crc 
kubenswrapper[4735]: I1008 13:15:39.159318 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b87k8\" (UniqueName: \"kubernetes.io/projected/e0c034b6-95c9-42d6-8490-0d759b602d82-kube-api-access-b87k8\") pod \"barbican-db-sync-cdrp4\" (UID: \"e0c034b6-95c9-42d6-8490-0d759b602d82\") " pod="openstack/barbican-db-sync-cdrp4" Oct 08 13:15:39 crc kubenswrapper[4735]: I1008 13:15:39.237174 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2dd43efe-433d-4590-bcf9-3304cd6e32fb-scripts\") pod \"cinder-db-sync-nplcf\" (UID: \"2dd43efe-433d-4590-bcf9-3304cd6e32fb\") " pod="openstack/cinder-db-sync-nplcf" Oct 08 13:15:39 crc kubenswrapper[4735]: I1008 13:15:39.237216 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2dd43efe-433d-4590-bcf9-3304cd6e32fb-db-sync-config-data\") pod \"cinder-db-sync-nplcf\" (UID: \"2dd43efe-433d-4590-bcf9-3304cd6e32fb\") " pod="openstack/cinder-db-sync-nplcf" Oct 08 13:15:39 crc kubenswrapper[4735]: I1008 13:15:39.237252 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2dd43efe-433d-4590-bcf9-3304cd6e32fb-etc-machine-id\") pod \"cinder-db-sync-nplcf\" (UID: \"2dd43efe-433d-4590-bcf9-3304cd6e32fb\") " pod="openstack/cinder-db-sync-nplcf" Oct 08 13:15:39 crc kubenswrapper[4735]: I1008 13:15:39.237272 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mvxm\" (UniqueName: \"kubernetes.io/projected/2dd43efe-433d-4590-bcf9-3304cd6e32fb-kube-api-access-6mvxm\") pod \"cinder-db-sync-nplcf\" (UID: \"2dd43efe-433d-4590-bcf9-3304cd6e32fb\") " pod="openstack/cinder-db-sync-nplcf" Oct 08 13:15:39 crc kubenswrapper[4735]: I1008 13:15:39.237306 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dd43efe-433d-4590-bcf9-3304cd6e32fb-config-data\") pod \"cinder-db-sync-nplcf\" (UID: \"2dd43efe-433d-4590-bcf9-3304cd6e32fb\") " pod="openstack/cinder-db-sync-nplcf" Oct 08 13:15:39 crc kubenswrapper[4735]: I1008 13:15:39.237346 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dd43efe-433d-4590-bcf9-3304cd6e32fb-combined-ca-bundle\") pod \"cinder-db-sync-nplcf\" (UID: \"2dd43efe-433d-4590-bcf9-3304cd6e32fb\") " pod="openstack/cinder-db-sync-nplcf" Oct 08 13:15:39 crc kubenswrapper[4735]: I1008 13:15:39.237459 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2dd43efe-433d-4590-bcf9-3304cd6e32fb-etc-machine-id\") pod \"cinder-db-sync-nplcf\" (UID: \"2dd43efe-433d-4590-bcf9-3304cd6e32fb\") " pod="openstack/cinder-db-sync-nplcf" Oct 08 13:15:39 crc kubenswrapper[4735]: I1008 13:15:39.240892 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2dd43efe-433d-4590-bcf9-3304cd6e32fb-db-sync-config-data\") pod \"cinder-db-sync-nplcf\" (UID: \"2dd43efe-433d-4590-bcf9-3304cd6e32fb\") " pod="openstack/cinder-db-sync-nplcf" Oct 08 13:15:39 crc kubenswrapper[4735]: I1008 13:15:39.240953 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2dd43efe-433d-4590-bcf9-3304cd6e32fb-scripts\") pod \"cinder-db-sync-nplcf\" (UID: \"2dd43efe-433d-4590-bcf9-3304cd6e32fb\") " pod="openstack/cinder-db-sync-nplcf" Oct 08 13:15:39 crc kubenswrapper[4735]: I1008 13:15:39.241046 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dd43efe-433d-4590-bcf9-3304cd6e32fb-combined-ca-bundle\") pod \"cinder-db-sync-nplcf\" (UID: 
\"2dd43efe-433d-4590-bcf9-3304cd6e32fb\") " pod="openstack/cinder-db-sync-nplcf" Oct 08 13:15:39 crc kubenswrapper[4735]: I1008 13:15:39.242994 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dd43efe-433d-4590-bcf9-3304cd6e32fb-config-data\") pod \"cinder-db-sync-nplcf\" (UID: \"2dd43efe-433d-4590-bcf9-3304cd6e32fb\") " pod="openstack/cinder-db-sync-nplcf" Oct 08 13:15:39 crc kubenswrapper[4735]: I1008 13:15:39.257647 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mvxm\" (UniqueName: \"kubernetes.io/projected/2dd43efe-433d-4590-bcf9-3304cd6e32fb-kube-api-access-6mvxm\") pod \"cinder-db-sync-nplcf\" (UID: \"2dd43efe-433d-4590-bcf9-3304cd6e32fb\") " pod="openstack/cinder-db-sync-nplcf" Oct 08 13:15:39 crc kubenswrapper[4735]: I1008 13:15:39.278034 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-cdrp4" Oct 08 13:15:39 crc kubenswrapper[4735]: I1008 13:15:39.395309 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-nplcf" Oct 08 13:15:39 crc kubenswrapper[4735]: I1008 13:15:39.452133 4735 generic.go:334] "Generic (PLEG): container finished" podID="8f8ee345-a4ac-49aa-9eed-32a7cffb2790" containerID="ba1c205cb24ae21c686b973dc27c7541bf68b8cd19fdea9dbc395f5136814148" exitCode=0 Oct 08 13:15:39 crc kubenswrapper[4735]: I1008 13:15:39.452351 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-c8pnx" event={"ID":"8f8ee345-a4ac-49aa-9eed-32a7cffb2790","Type":"ContainerDied","Data":"ba1c205cb24ae21c686b973dc27c7541bf68b8cd19fdea9dbc395f5136814148"} Oct 08 13:15:41 crc kubenswrapper[4735]: I1008 13:15:41.610266 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-6cx9z" Oct 08 13:15:41 crc kubenswrapper[4735]: I1008 13:15:41.620651 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-c8pnx" Oct 08 13:15:41 crc kubenswrapper[4735]: I1008 13:15:41.699998 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4-config-data\") pod \"9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4\" (UID: \"9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4\") " Oct 08 13:15:41 crc kubenswrapper[4735]: I1008 13:15:41.700088 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8ee345-a4ac-49aa-9eed-32a7cffb2790-combined-ca-bundle\") pod \"8f8ee345-a4ac-49aa-9eed-32a7cffb2790\" (UID: \"8f8ee345-a4ac-49aa-9eed-32a7cffb2790\") " Oct 08 13:15:41 crc kubenswrapper[4735]: I1008 13:15:41.700127 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4-combined-ca-bundle\") pod \"9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4\" (UID: \"9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4\") " Oct 08 13:15:41 crc kubenswrapper[4735]: I1008 13:15:41.700239 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f8ee345-a4ac-49aa-9eed-32a7cffb2790-config-data\") pod \"8f8ee345-a4ac-49aa-9eed-32a7cffb2790\" (UID: \"8f8ee345-a4ac-49aa-9eed-32a7cffb2790\") " Oct 08 13:15:41 crc kubenswrapper[4735]: I1008 13:15:41.700274 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbgdb\" (UniqueName: \"kubernetes.io/projected/8f8ee345-a4ac-49aa-9eed-32a7cffb2790-kube-api-access-rbgdb\") pod \"8f8ee345-a4ac-49aa-9eed-32a7cffb2790\" (UID: 
\"8f8ee345-a4ac-49aa-9eed-32a7cffb2790\") " Oct 08 13:15:41 crc kubenswrapper[4735]: I1008 13:15:41.700308 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h25jj\" (UniqueName: \"kubernetes.io/projected/9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4-kube-api-access-h25jj\") pod \"9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4\" (UID: \"9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4\") " Oct 08 13:15:41 crc kubenswrapper[4735]: I1008 13:15:41.700337 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4-fernet-keys\") pod \"9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4\" (UID: \"9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4\") " Oct 08 13:15:41 crc kubenswrapper[4735]: I1008 13:15:41.700376 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8f8ee345-a4ac-49aa-9eed-32a7cffb2790-db-sync-config-data\") pod \"8f8ee345-a4ac-49aa-9eed-32a7cffb2790\" (UID: \"8f8ee345-a4ac-49aa-9eed-32a7cffb2790\") " Oct 08 13:15:41 crc kubenswrapper[4735]: I1008 13:15:41.700427 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4-scripts\") pod \"9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4\" (UID: \"9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4\") " Oct 08 13:15:41 crc kubenswrapper[4735]: I1008 13:15:41.700498 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4-credential-keys\") pod \"9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4\" (UID: \"9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4\") " Oct 08 13:15:41 crc kubenswrapper[4735]: I1008 13:15:41.723704 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4" (UID: "9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:15:41 crc kubenswrapper[4735]: I1008 13:15:41.728880 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f8ee345-a4ac-49aa-9eed-32a7cffb2790-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "8f8ee345-a4ac-49aa-9eed-32a7cffb2790" (UID: "8f8ee345-a4ac-49aa-9eed-32a7cffb2790"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:15:41 crc kubenswrapper[4735]: I1008 13:15:41.728928 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4-scripts" (OuterVolumeSpecName: "scripts") pod "9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4" (UID: "9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:15:41 crc kubenswrapper[4735]: I1008 13:15:41.729033 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4" (UID: "9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:15:41 crc kubenswrapper[4735]: I1008 13:15:41.729693 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f8ee345-a4ac-49aa-9eed-32a7cffb2790-kube-api-access-rbgdb" (OuterVolumeSpecName: "kube-api-access-rbgdb") pod "8f8ee345-a4ac-49aa-9eed-32a7cffb2790" (UID: "8f8ee345-a4ac-49aa-9eed-32a7cffb2790"). InnerVolumeSpecName "kube-api-access-rbgdb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:15:41 crc kubenswrapper[4735]: I1008 13:15:41.730252 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4-kube-api-access-h25jj" (OuterVolumeSpecName: "kube-api-access-h25jj") pod "9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4" (UID: "9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4"). InnerVolumeSpecName "kube-api-access-h25jj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:15:41 crc kubenswrapper[4735]: I1008 13:15:41.779675 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4-config-data" (OuterVolumeSpecName: "config-data") pod "9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4" (UID: "9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:15:41 crc kubenswrapper[4735]: I1008 13:15:41.788581 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4" (UID: "9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:15:41 crc kubenswrapper[4735]: I1008 13:15:41.790463 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f8ee345-a4ac-49aa-9eed-32a7cffb2790-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8f8ee345-a4ac-49aa-9eed-32a7cffb2790" (UID: "8f8ee345-a4ac-49aa-9eed-32a7cffb2790"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:15:41 crc kubenswrapper[4735]: I1008 13:15:41.804755 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 13:15:41 crc kubenswrapper[4735]: I1008 13:15:41.804775 4735 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 08 13:15:41 crc kubenswrapper[4735]: I1008 13:15:41.804786 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 13:15:41 crc kubenswrapper[4735]: I1008 13:15:41.804796 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8ee345-a4ac-49aa-9eed-32a7cffb2790-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 13:15:41 crc kubenswrapper[4735]: I1008 13:15:41.804807 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 13:15:41 crc kubenswrapper[4735]: I1008 13:15:41.804816 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbgdb\" (UniqueName: \"kubernetes.io/projected/8f8ee345-a4ac-49aa-9eed-32a7cffb2790-kube-api-access-rbgdb\") on node \"crc\" DevicePath \"\"" Oct 08 13:15:41 crc kubenswrapper[4735]: I1008 13:15:41.804827 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h25jj\" (UniqueName: \"kubernetes.io/projected/9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4-kube-api-access-h25jj\") on node \"crc\" DevicePath \"\"" Oct 08 13:15:41 crc kubenswrapper[4735]: I1008 
13:15:41.804835 4735 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 08 13:15:41 crc kubenswrapper[4735]: I1008 13:15:41.804843 4735 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8f8ee345-a4ac-49aa-9eed-32a7cffb2790-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 13:15:41 crc kubenswrapper[4735]: I1008 13:15:41.849450 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f8ee345-a4ac-49aa-9eed-32a7cffb2790-config-data" (OuterVolumeSpecName: "config-data") pod "8f8ee345-a4ac-49aa-9eed-32a7cffb2790" (UID: "8f8ee345-a4ac-49aa-9eed-32a7cffb2790"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:15:41 crc kubenswrapper[4735]: I1008 13:15:41.910974 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f8ee345-a4ac-49aa-9eed-32a7cffb2790-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 13:15:42 crc kubenswrapper[4735]: I1008 13:15:42.239091 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-cdrp4"] Oct 08 13:15:42 crc kubenswrapper[4735]: W1008 13:15:42.245133 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0c034b6_95c9_42d6_8490_0d759b602d82.slice/crio-3dd7bcb415d4a88aba406ca03ec913ecfaf336f1a109b0f3d2dd3ab35335a9f4 WatchSource:0}: Error finding container 3dd7bcb415d4a88aba406ca03ec913ecfaf336f1a109b0f3d2dd3ab35335a9f4: Status 404 returned error can't find the container with id 3dd7bcb415d4a88aba406ca03ec913ecfaf336f1a109b0f3d2dd3ab35335a9f4 Oct 08 13:15:42 crc kubenswrapper[4735]: I1008 13:15:42.328852 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/cinder-db-sync-nplcf"] Oct 08 13:15:42 crc kubenswrapper[4735]: I1008 13:15:42.337257 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-8555f7d884-2km9p"] Oct 08 13:15:42 crc kubenswrapper[4735]: I1008 13:15:42.475887 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6cx9z" event={"ID":"9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4","Type":"ContainerDied","Data":"fbaeda81a36b97c14ad781cc9fee6a778addabeb82b1d1845286875f724eb895"} Oct 08 13:15:42 crc kubenswrapper[4735]: I1008 13:15:42.475927 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fbaeda81a36b97c14ad781cc9fee6a778addabeb82b1d1845286875f724eb895" Oct 08 13:15:42 crc kubenswrapper[4735]: I1008 13:15:42.476448 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-6cx9z" Oct 08 13:15:42 crc kubenswrapper[4735]: I1008 13:15:42.477053 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8555f7d884-2km9p" event={"ID":"70450186-9a74-4ba9-92f9-3f3a9626d0a0","Type":"ContainerStarted","Data":"4f54928f9a5c55b90c172f1c3b77d119b793f451d107666f7ef41d0a9de1c915"} Oct 08 13:15:42 crc kubenswrapper[4735]: I1008 13:15:42.481162 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"328b1aa0-7e48-4aa0-b456-e1beefb886c0","Type":"ContainerStarted","Data":"d4d4c45fc2bb4c55e870e6a341ec3bfb743e73b6533a1f68df2d34f7ccb1516b"} Oct 08 13:15:42 crc kubenswrapper[4735]: I1008 13:15:42.482282 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-nplcf" event={"ID":"2dd43efe-433d-4590-bcf9-3304cd6e32fb","Type":"ContainerStarted","Data":"4fb97d60ce38a371f044ac6c75456387da39b07afc5a66da10e756201fbd36a4"} Oct 08 13:15:42 crc kubenswrapper[4735]: I1008 13:15:42.483834 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-c8pnx" 
event={"ID":"8f8ee345-a4ac-49aa-9eed-32a7cffb2790","Type":"ContainerDied","Data":"9dd15936a492e1181725f5900c2d068e1bcd017b408b4ece534ac96fea824457"} Oct 08 13:15:42 crc kubenswrapper[4735]: I1008 13:15:42.483857 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9dd15936a492e1181725f5900c2d068e1bcd017b408b4ece534ac96fea824457" Oct 08 13:15:42 crc kubenswrapper[4735]: I1008 13:15:42.483906 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-c8pnx" Oct 08 13:15:42 crc kubenswrapper[4735]: I1008 13:15:42.486646 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-cdrp4" event={"ID":"e0c034b6-95c9-42d6-8490-0d759b602d82","Type":"ContainerStarted","Data":"3dd7bcb415d4a88aba406ca03ec913ecfaf336f1a109b0f3d2dd3ab35335a9f4"} Oct 08 13:15:42 crc kubenswrapper[4735]: I1008 13:15:42.735088 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-77f649d57f-mlpp4"] Oct 08 13:15:42 crc kubenswrapper[4735]: E1008 13:15:42.735725 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4" containerName="keystone-bootstrap" Oct 08 13:15:42 crc kubenswrapper[4735]: I1008 13:15:42.735737 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4" containerName="keystone-bootstrap" Oct 08 13:15:42 crc kubenswrapper[4735]: E1008 13:15:42.735768 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f8ee345-a4ac-49aa-9eed-32a7cffb2790" containerName="glance-db-sync" Oct 08 13:15:42 crc kubenswrapper[4735]: I1008 13:15:42.735774 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f8ee345-a4ac-49aa-9eed-32a7cffb2790" containerName="glance-db-sync" Oct 08 13:15:42 crc kubenswrapper[4735]: I1008 13:15:42.735963 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4" 
containerName="keystone-bootstrap" Oct 08 13:15:42 crc kubenswrapper[4735]: I1008 13:15:42.735980 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f8ee345-a4ac-49aa-9eed-32a7cffb2790" containerName="glance-db-sync" Oct 08 13:15:42 crc kubenswrapper[4735]: I1008 13:15:42.736685 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-77f649d57f-mlpp4" Oct 08 13:15:42 crc kubenswrapper[4735]: I1008 13:15:42.742075 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 08 13:15:42 crc kubenswrapper[4735]: I1008 13:15:42.742253 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Oct 08 13:15:42 crc kubenswrapper[4735]: I1008 13:15:42.742344 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-zpn2p" Oct 08 13:15:42 crc kubenswrapper[4735]: I1008 13:15:42.742532 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 08 13:15:42 crc kubenswrapper[4735]: I1008 13:15:42.742615 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Oct 08 13:15:42 crc kubenswrapper[4735]: I1008 13:15:42.742876 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 08 13:15:42 crc kubenswrapper[4735]: I1008 13:15:42.762457 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-77f649d57f-mlpp4"] Oct 08 13:15:42 crc kubenswrapper[4735]: I1008 13:15:42.829993 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/477f295c-5e96-4f47-8caa-f7d36b074cdd-credential-keys\") pod \"keystone-77f649d57f-mlpp4\" (UID: \"477f295c-5e96-4f47-8caa-f7d36b074cdd\") " pod="openstack/keystone-77f649d57f-mlpp4" Oct 08 13:15:42 crc 
kubenswrapper[4735]: I1008 13:15:42.830057 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/477f295c-5e96-4f47-8caa-f7d36b074cdd-internal-tls-certs\") pod \"keystone-77f649d57f-mlpp4\" (UID: \"477f295c-5e96-4f47-8caa-f7d36b074cdd\") " pod="openstack/keystone-77f649d57f-mlpp4" Oct 08 13:15:42 crc kubenswrapper[4735]: I1008 13:15:42.830090 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/477f295c-5e96-4f47-8caa-f7d36b074cdd-combined-ca-bundle\") pod \"keystone-77f649d57f-mlpp4\" (UID: \"477f295c-5e96-4f47-8caa-f7d36b074cdd\") " pod="openstack/keystone-77f649d57f-mlpp4" Oct 08 13:15:42 crc kubenswrapper[4735]: I1008 13:15:42.830121 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/477f295c-5e96-4f47-8caa-f7d36b074cdd-fernet-keys\") pod \"keystone-77f649d57f-mlpp4\" (UID: \"477f295c-5e96-4f47-8caa-f7d36b074cdd\") " pod="openstack/keystone-77f649d57f-mlpp4" Oct 08 13:15:42 crc kubenswrapper[4735]: I1008 13:15:42.830157 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/477f295c-5e96-4f47-8caa-f7d36b074cdd-scripts\") pod \"keystone-77f649d57f-mlpp4\" (UID: \"477f295c-5e96-4f47-8caa-f7d36b074cdd\") " pod="openstack/keystone-77f649d57f-mlpp4" Oct 08 13:15:42 crc kubenswrapper[4735]: I1008 13:15:42.830188 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdz6w\" (UniqueName: \"kubernetes.io/projected/477f295c-5e96-4f47-8caa-f7d36b074cdd-kube-api-access-tdz6w\") pod \"keystone-77f649d57f-mlpp4\" (UID: \"477f295c-5e96-4f47-8caa-f7d36b074cdd\") " pod="openstack/keystone-77f649d57f-mlpp4" Oct 08 
13:15:42 crc kubenswrapper[4735]: I1008 13:15:42.830230 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/477f295c-5e96-4f47-8caa-f7d36b074cdd-public-tls-certs\") pod \"keystone-77f649d57f-mlpp4\" (UID: \"477f295c-5e96-4f47-8caa-f7d36b074cdd\") " pod="openstack/keystone-77f649d57f-mlpp4" Oct 08 13:15:42 crc kubenswrapper[4735]: I1008 13:15:42.830248 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/477f295c-5e96-4f47-8caa-f7d36b074cdd-config-data\") pod \"keystone-77f649d57f-mlpp4\" (UID: \"477f295c-5e96-4f47-8caa-f7d36b074cdd\") " pod="openstack/keystone-77f649d57f-mlpp4" Oct 08 13:15:42 crc kubenswrapper[4735]: I1008 13:15:42.931674 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/477f295c-5e96-4f47-8caa-f7d36b074cdd-credential-keys\") pod \"keystone-77f649d57f-mlpp4\" (UID: \"477f295c-5e96-4f47-8caa-f7d36b074cdd\") " pod="openstack/keystone-77f649d57f-mlpp4" Oct 08 13:15:42 crc kubenswrapper[4735]: I1008 13:15:42.932004 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/477f295c-5e96-4f47-8caa-f7d36b074cdd-internal-tls-certs\") pod \"keystone-77f649d57f-mlpp4\" (UID: \"477f295c-5e96-4f47-8caa-f7d36b074cdd\") " pod="openstack/keystone-77f649d57f-mlpp4" Oct 08 13:15:42 crc kubenswrapper[4735]: I1008 13:15:42.932056 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/477f295c-5e96-4f47-8caa-f7d36b074cdd-combined-ca-bundle\") pod \"keystone-77f649d57f-mlpp4\" (UID: \"477f295c-5e96-4f47-8caa-f7d36b074cdd\") " pod="openstack/keystone-77f649d57f-mlpp4" Oct 08 13:15:42 crc kubenswrapper[4735]: I1008 
13:15:42.932088 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/477f295c-5e96-4f47-8caa-f7d36b074cdd-fernet-keys\") pod \"keystone-77f649d57f-mlpp4\" (UID: \"477f295c-5e96-4f47-8caa-f7d36b074cdd\") " pod="openstack/keystone-77f649d57f-mlpp4" Oct 08 13:15:42 crc kubenswrapper[4735]: I1008 13:15:42.932113 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/477f295c-5e96-4f47-8caa-f7d36b074cdd-scripts\") pod \"keystone-77f649d57f-mlpp4\" (UID: \"477f295c-5e96-4f47-8caa-f7d36b074cdd\") " pod="openstack/keystone-77f649d57f-mlpp4" Oct 08 13:15:42 crc kubenswrapper[4735]: I1008 13:15:42.932132 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdz6w\" (UniqueName: \"kubernetes.io/projected/477f295c-5e96-4f47-8caa-f7d36b074cdd-kube-api-access-tdz6w\") pod \"keystone-77f649d57f-mlpp4\" (UID: \"477f295c-5e96-4f47-8caa-f7d36b074cdd\") " pod="openstack/keystone-77f649d57f-mlpp4" Oct 08 13:15:42 crc kubenswrapper[4735]: I1008 13:15:42.932149 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/477f295c-5e96-4f47-8caa-f7d36b074cdd-public-tls-certs\") pod \"keystone-77f649d57f-mlpp4\" (UID: \"477f295c-5e96-4f47-8caa-f7d36b074cdd\") " pod="openstack/keystone-77f649d57f-mlpp4" Oct 08 13:15:42 crc kubenswrapper[4735]: I1008 13:15:42.932171 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/477f295c-5e96-4f47-8caa-f7d36b074cdd-config-data\") pod \"keystone-77f649d57f-mlpp4\" (UID: \"477f295c-5e96-4f47-8caa-f7d36b074cdd\") " pod="openstack/keystone-77f649d57f-mlpp4" Oct 08 13:15:42 crc kubenswrapper[4735]: I1008 13:15:42.943432 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/477f295c-5e96-4f47-8caa-f7d36b074cdd-internal-tls-certs\") pod \"keystone-77f649d57f-mlpp4\" (UID: \"477f295c-5e96-4f47-8caa-f7d36b074cdd\") " pod="openstack/keystone-77f649d57f-mlpp4" Oct 08 13:15:42 crc kubenswrapper[4735]: I1008 13:15:42.944372 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/477f295c-5e96-4f47-8caa-f7d36b074cdd-config-data\") pod \"keystone-77f649d57f-mlpp4\" (UID: \"477f295c-5e96-4f47-8caa-f7d36b074cdd\") " pod="openstack/keystone-77f649d57f-mlpp4" Oct 08 13:15:42 crc kubenswrapper[4735]: I1008 13:15:42.944777 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/477f295c-5e96-4f47-8caa-f7d36b074cdd-credential-keys\") pod \"keystone-77f649d57f-mlpp4\" (UID: \"477f295c-5e96-4f47-8caa-f7d36b074cdd\") " pod="openstack/keystone-77f649d57f-mlpp4" Oct 08 13:15:42 crc kubenswrapper[4735]: I1008 13:15:42.946659 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/477f295c-5e96-4f47-8caa-f7d36b074cdd-combined-ca-bundle\") pod \"keystone-77f649d57f-mlpp4\" (UID: \"477f295c-5e96-4f47-8caa-f7d36b074cdd\") " pod="openstack/keystone-77f649d57f-mlpp4" Oct 08 13:15:42 crc kubenswrapper[4735]: I1008 13:15:42.953081 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/477f295c-5e96-4f47-8caa-f7d36b074cdd-scripts\") pod \"keystone-77f649d57f-mlpp4\" (UID: \"477f295c-5e96-4f47-8caa-f7d36b074cdd\") " pod="openstack/keystone-77f649d57f-mlpp4" Oct 08 13:15:42 crc kubenswrapper[4735]: I1008 13:15:42.959625 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/477f295c-5e96-4f47-8caa-f7d36b074cdd-public-tls-certs\") pod 
\"keystone-77f649d57f-mlpp4\" (UID: \"477f295c-5e96-4f47-8caa-f7d36b074cdd\") " pod="openstack/keystone-77f649d57f-mlpp4" Oct 08 13:15:42 crc kubenswrapper[4735]: I1008 13:15:42.960314 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/477f295c-5e96-4f47-8caa-f7d36b074cdd-fernet-keys\") pod \"keystone-77f649d57f-mlpp4\" (UID: \"477f295c-5e96-4f47-8caa-f7d36b074cdd\") " pod="openstack/keystone-77f649d57f-mlpp4" Oct 08 13:15:42 crc kubenswrapper[4735]: I1008 13:15:42.975170 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdz6w\" (UniqueName: \"kubernetes.io/projected/477f295c-5e96-4f47-8caa-f7d36b074cdd-kube-api-access-tdz6w\") pod \"keystone-77f649d57f-mlpp4\" (UID: \"477f295c-5e96-4f47-8caa-f7d36b074cdd\") " pod="openstack/keystone-77f649d57f-mlpp4" Oct 08 13:15:43 crc kubenswrapper[4735]: I1008 13:15:43.076004 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-77f649d57f-mlpp4" Oct 08 13:15:43 crc kubenswrapper[4735]: I1008 13:15:43.092012 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5dc68bd5-89cn6"] Oct 08 13:15:43 crc kubenswrapper[4735]: I1008 13:15:43.100772 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5dc68bd5-89cn6" Oct 08 13:15:43 crc kubenswrapper[4735]: I1008 13:15:43.122266 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5dc68bd5-89cn6"] Oct 08 13:15:43 crc kubenswrapper[4735]: I1008 13:15:43.240715 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7-dns-svc\") pod \"dnsmasq-dns-5dc68bd5-89cn6\" (UID: \"d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7\") " pod="openstack/dnsmasq-dns-5dc68bd5-89cn6" Oct 08 13:15:43 crc kubenswrapper[4735]: I1008 13:15:43.241184 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7-ovsdbserver-nb\") pod \"dnsmasq-dns-5dc68bd5-89cn6\" (UID: \"d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7\") " pod="openstack/dnsmasq-dns-5dc68bd5-89cn6" Oct 08 13:15:43 crc kubenswrapper[4735]: I1008 13:15:43.241350 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7-ovsdbserver-sb\") pod \"dnsmasq-dns-5dc68bd5-89cn6\" (UID: \"d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7\") " pod="openstack/dnsmasq-dns-5dc68bd5-89cn6" Oct 08 13:15:43 crc kubenswrapper[4735]: I1008 13:15:43.241491 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7-config\") pod \"dnsmasq-dns-5dc68bd5-89cn6\" (UID: \"d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7\") " pod="openstack/dnsmasq-dns-5dc68bd5-89cn6" Oct 08 13:15:43 crc kubenswrapper[4735]: I1008 13:15:43.241618 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7-dns-swift-storage-0\") pod \"dnsmasq-dns-5dc68bd5-89cn6\" (UID: \"d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7\") " pod="openstack/dnsmasq-dns-5dc68bd5-89cn6" Oct 08 13:15:43 crc kubenswrapper[4735]: I1008 13:15:43.241801 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmrth\" (UniqueName: \"kubernetes.io/projected/d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7-kube-api-access-wmrth\") pod \"dnsmasq-dns-5dc68bd5-89cn6\" (UID: \"d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7\") " pod="openstack/dnsmasq-dns-5dc68bd5-89cn6" Oct 08 13:15:43 crc kubenswrapper[4735]: I1008 13:15:43.344546 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmrth\" (UniqueName: \"kubernetes.io/projected/d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7-kube-api-access-wmrth\") pod \"dnsmasq-dns-5dc68bd5-89cn6\" (UID: \"d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7\") " pod="openstack/dnsmasq-dns-5dc68bd5-89cn6" Oct 08 13:15:43 crc kubenswrapper[4735]: I1008 13:15:43.344629 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7-dns-svc\") pod \"dnsmasq-dns-5dc68bd5-89cn6\" (UID: \"d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7\") " pod="openstack/dnsmasq-dns-5dc68bd5-89cn6" Oct 08 13:15:43 crc kubenswrapper[4735]: I1008 13:15:43.344665 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7-ovsdbserver-nb\") pod \"dnsmasq-dns-5dc68bd5-89cn6\" (UID: \"d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7\") " pod="openstack/dnsmasq-dns-5dc68bd5-89cn6" Oct 08 13:15:43 crc kubenswrapper[4735]: I1008 13:15:43.344723 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7-ovsdbserver-sb\") pod \"dnsmasq-dns-5dc68bd5-89cn6\" (UID: \"d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7\") " pod="openstack/dnsmasq-dns-5dc68bd5-89cn6" Oct 08 13:15:43 crc kubenswrapper[4735]: I1008 13:15:43.344757 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7-config\") pod \"dnsmasq-dns-5dc68bd5-89cn6\" (UID: \"d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7\") " pod="openstack/dnsmasq-dns-5dc68bd5-89cn6" Oct 08 13:15:43 crc kubenswrapper[4735]: I1008 13:15:43.344802 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7-dns-swift-storage-0\") pod \"dnsmasq-dns-5dc68bd5-89cn6\" (UID: \"d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7\") " pod="openstack/dnsmasq-dns-5dc68bd5-89cn6" Oct 08 13:15:43 crc kubenswrapper[4735]: I1008 13:15:43.345985 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7-dns-swift-storage-0\") pod \"dnsmasq-dns-5dc68bd5-89cn6\" (UID: \"d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7\") " pod="openstack/dnsmasq-dns-5dc68bd5-89cn6" Oct 08 13:15:43 crc kubenswrapper[4735]: I1008 13:15:43.346644 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7-ovsdbserver-sb\") pod \"dnsmasq-dns-5dc68bd5-89cn6\" (UID: \"d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7\") " pod="openstack/dnsmasq-dns-5dc68bd5-89cn6" Oct 08 13:15:43 crc kubenswrapper[4735]: I1008 13:15:43.347331 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7-ovsdbserver-nb\") pod 
\"dnsmasq-dns-5dc68bd5-89cn6\" (UID: \"d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7\") " pod="openstack/dnsmasq-dns-5dc68bd5-89cn6" Oct 08 13:15:43 crc kubenswrapper[4735]: I1008 13:15:43.348159 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7-dns-svc\") pod \"dnsmasq-dns-5dc68bd5-89cn6\" (UID: \"d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7\") " pod="openstack/dnsmasq-dns-5dc68bd5-89cn6" Oct 08 13:15:43 crc kubenswrapper[4735]: I1008 13:15:43.348468 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7-config\") pod \"dnsmasq-dns-5dc68bd5-89cn6\" (UID: \"d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7\") " pod="openstack/dnsmasq-dns-5dc68bd5-89cn6" Oct 08 13:15:43 crc kubenswrapper[4735]: I1008 13:15:43.375249 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmrth\" (UniqueName: \"kubernetes.io/projected/d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7-kube-api-access-wmrth\") pod \"dnsmasq-dns-5dc68bd5-89cn6\" (UID: \"d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7\") " pod="openstack/dnsmasq-dns-5dc68bd5-89cn6" Oct 08 13:15:43 crc kubenswrapper[4735]: I1008 13:15:43.437313 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5dc68bd5-89cn6" Oct 08 13:15:43 crc kubenswrapper[4735]: I1008 13:15:43.519296 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8555f7d884-2km9p" event={"ID":"70450186-9a74-4ba9-92f9-3f3a9626d0a0","Type":"ContainerStarted","Data":"74e1faa845324974182d87cae1bcb7ea97976a1ed4553d25c3e02f8144d91a7f"} Oct 08 13:15:43 crc kubenswrapper[4735]: I1008 13:15:43.519347 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8555f7d884-2km9p" event={"ID":"70450186-9a74-4ba9-92f9-3f3a9626d0a0","Type":"ContainerStarted","Data":"c6fb989e32326dc473a47f896284b872f9d86ea0f3de39c8b6cff9255ec1e38f"} Oct 08 13:15:43 crc kubenswrapper[4735]: I1008 13:15:43.520901 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-8555f7d884-2km9p" Oct 08 13:15:43 crc kubenswrapper[4735]: I1008 13:15:43.520935 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-8555f7d884-2km9p" Oct 08 13:15:43 crc kubenswrapper[4735]: I1008 13:15:43.610324 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-8555f7d884-2km9p" podStartSLOduration=6.610298393 podStartE2EDuration="6.610298393s" podCreationTimestamp="2025-10-08 13:15:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 13:15:43.594468596 +0000 UTC m=+1193.402773140" watchObservedRunningTime="2025-10-08 13:15:43.610298393 +0000 UTC m=+1193.418602937" Oct 08 13:15:43 crc kubenswrapper[4735]: I1008 13:15:43.941170 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-77f649d57f-mlpp4"] Oct 08 13:15:44 crc kubenswrapper[4735]: I1008 13:15:44.006857 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 13:15:44 crc kubenswrapper[4735]: I1008 13:15:44.011856 
4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 13:15:44 crc kubenswrapper[4735]: I1008 13:15:44.013854 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-76gcn" Oct 08 13:15:44 crc kubenswrapper[4735]: I1008 13:15:44.014367 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 08 13:15:44 crc kubenswrapper[4735]: I1008 13:15:44.014606 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 08 13:15:44 crc kubenswrapper[4735]: I1008 13:15:44.025952 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 13:15:44 crc kubenswrapper[4735]: I1008 13:15:44.083460 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d-scripts\") pod \"glance-default-external-api-0\" (UID: \"9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d\") " pod="openstack/glance-default-external-api-0" Oct 08 13:15:44 crc kubenswrapper[4735]: I1008 13:15:44.083568 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d\") " pod="openstack/glance-default-external-api-0" Oct 08 13:15:44 crc kubenswrapper[4735]: I1008 13:15:44.083599 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d\") " 
pod="openstack/glance-default-external-api-0" Oct 08 13:15:44 crc kubenswrapper[4735]: I1008 13:15:44.083630 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d-config-data\") pod \"glance-default-external-api-0\" (UID: \"9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d\") " pod="openstack/glance-default-external-api-0" Oct 08 13:15:44 crc kubenswrapper[4735]: I1008 13:15:44.083657 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsszl\" (UniqueName: \"kubernetes.io/projected/9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d-kube-api-access-fsszl\") pod \"glance-default-external-api-0\" (UID: \"9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d\") " pod="openstack/glance-default-external-api-0" Oct 08 13:15:44 crc kubenswrapper[4735]: I1008 13:15:44.083676 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d-logs\") pod \"glance-default-external-api-0\" (UID: \"9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d\") " pod="openstack/glance-default-external-api-0" Oct 08 13:15:44 crc kubenswrapper[4735]: I1008 13:15:44.083728 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d\") " pod="openstack/glance-default-external-api-0" Oct 08 13:15:44 crc kubenswrapper[4735]: I1008 13:15:44.131878 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5dc68bd5-89cn6"] Oct 08 13:15:44 crc kubenswrapper[4735]: I1008 13:15:44.184928 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d\") " pod="openstack/glance-default-external-api-0" Oct 08 13:15:44 crc kubenswrapper[4735]: I1008 13:15:44.185254 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d-scripts\") pod \"glance-default-external-api-0\" (UID: \"9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d\") " pod="openstack/glance-default-external-api-0" Oct 08 13:15:44 crc kubenswrapper[4735]: I1008 13:15:44.185319 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d\") " pod="openstack/glance-default-external-api-0" Oct 08 13:15:44 crc kubenswrapper[4735]: I1008 13:15:44.185353 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d\") " pod="openstack/glance-default-external-api-0" Oct 08 13:15:44 crc kubenswrapper[4735]: I1008 13:15:44.185354 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Oct 08 13:15:44 crc kubenswrapper[4735]: I1008 13:15:44.185386 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d-config-data\") pod 
\"glance-default-external-api-0\" (UID: \"9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d\") " pod="openstack/glance-default-external-api-0" Oct 08 13:15:44 crc kubenswrapper[4735]: I1008 13:15:44.185419 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsszl\" (UniqueName: \"kubernetes.io/projected/9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d-kube-api-access-fsszl\") pod \"glance-default-external-api-0\" (UID: \"9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d\") " pod="openstack/glance-default-external-api-0" Oct 08 13:15:44 crc kubenswrapper[4735]: I1008 13:15:44.185440 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d-logs\") pod \"glance-default-external-api-0\" (UID: \"9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d\") " pod="openstack/glance-default-external-api-0" Oct 08 13:15:44 crc kubenswrapper[4735]: I1008 13:15:44.185855 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d-logs\") pod \"glance-default-external-api-0\" (UID: \"9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d\") " pod="openstack/glance-default-external-api-0" Oct 08 13:15:44 crc kubenswrapper[4735]: I1008 13:15:44.187079 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d\") " pod="openstack/glance-default-external-api-0" Oct 08 13:15:44 crc kubenswrapper[4735]: I1008 13:15:44.191372 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d-scripts\") pod \"glance-default-external-api-0\" (UID: \"9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d\") " 
pod="openstack/glance-default-external-api-0" Oct 08 13:15:44 crc kubenswrapper[4735]: I1008 13:15:44.204153 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d\") " pod="openstack/glance-default-external-api-0" Oct 08 13:15:44 crc kubenswrapper[4735]: I1008 13:15:44.206593 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d-config-data\") pod \"glance-default-external-api-0\" (UID: \"9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d\") " pod="openstack/glance-default-external-api-0" Oct 08 13:15:44 crc kubenswrapper[4735]: I1008 13:15:44.208949 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsszl\" (UniqueName: \"kubernetes.io/projected/9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d-kube-api-access-fsszl\") pod \"glance-default-external-api-0\" (UID: \"9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d\") " pod="openstack/glance-default-external-api-0" Oct 08 13:15:44 crc kubenswrapper[4735]: I1008 13:15:44.223867 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d\") " pod="openstack/glance-default-external-api-0" Oct 08 13:15:44 crc kubenswrapper[4735]: I1008 13:15:44.372717 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 13:15:44 crc kubenswrapper[4735]: I1008 13:15:44.374542 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 08 13:15:44 crc kubenswrapper[4735]: I1008 13:15:44.378744 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 08 13:15:44 crc kubenswrapper[4735]: I1008 13:15:44.397944 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 13:15:44 crc kubenswrapper[4735]: I1008 13:15:44.414948 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 13:15:44 crc kubenswrapper[4735]: I1008 13:15:44.490394 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed035481-64c3-43d2-9b59-ca6ac63225cb-logs\") pod \"glance-default-internal-api-0\" (UID: \"ed035481-64c3-43d2-9b59-ca6ac63225cb\") " pod="openstack/glance-default-internal-api-0" Oct 08 13:15:44 crc kubenswrapper[4735]: I1008 13:15:44.490520 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ed035481-64c3-43d2-9b59-ca6ac63225cb-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ed035481-64c3-43d2-9b59-ca6ac63225cb\") " pod="openstack/glance-default-internal-api-0" Oct 08 13:15:44 crc kubenswrapper[4735]: I1008 13:15:44.490571 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed035481-64c3-43d2-9b59-ca6ac63225cb-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ed035481-64c3-43d2-9b59-ca6ac63225cb\") " pod="openstack/glance-default-internal-api-0" Oct 08 13:15:44 crc kubenswrapper[4735]: I1008 13:15:44.490616 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ed035481-64c3-43d2-9b59-ca6ac63225cb-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ed035481-64c3-43d2-9b59-ca6ac63225cb\") " pod="openstack/glance-default-internal-api-0" Oct 08 13:15:44 crc kubenswrapper[4735]: I1008 13:15:44.490685 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"ed035481-64c3-43d2-9b59-ca6ac63225cb\") " pod="openstack/glance-default-internal-api-0" Oct 08 13:15:44 crc kubenswrapper[4735]: I1008 13:15:44.490739 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-562l7\" (UniqueName: \"kubernetes.io/projected/ed035481-64c3-43d2-9b59-ca6ac63225cb-kube-api-access-562l7\") pod \"glance-default-internal-api-0\" (UID: \"ed035481-64c3-43d2-9b59-ca6ac63225cb\") " pod="openstack/glance-default-internal-api-0" Oct 08 13:15:44 crc kubenswrapper[4735]: I1008 13:15:44.490766 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed035481-64c3-43d2-9b59-ca6ac63225cb-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ed035481-64c3-43d2-9b59-ca6ac63225cb\") " pod="openstack/glance-default-internal-api-0" Oct 08 13:15:44 crc kubenswrapper[4735]: I1008 13:15:44.532688 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-77f649d57f-mlpp4" event={"ID":"477f295c-5e96-4f47-8caa-f7d36b074cdd","Type":"ContainerStarted","Data":"682901b4afd22fdea54fc528f8f21909ffe8ed3d9dfc12a8092cefe00176a51f"} Oct 08 13:15:44 crc kubenswrapper[4735]: I1008 13:15:44.533009 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-77f649d57f-mlpp4" 
event={"ID":"477f295c-5e96-4f47-8caa-f7d36b074cdd","Type":"ContainerStarted","Data":"62d5911bcbac7b32b1863f013daed4d8870e040d748ea093af68ec1c437e8771"} Oct 08 13:15:44 crc kubenswrapper[4735]: I1008 13:15:44.534032 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-77f649d57f-mlpp4" Oct 08 13:15:44 crc kubenswrapper[4735]: I1008 13:15:44.536165 4735 generic.go:334] "Generic (PLEG): container finished" podID="d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7" containerID="f2d2917edfcdae896262be4cd1de7fce5374bd3c8b3e62fb0b67457014bbb1fa" exitCode=0 Oct 08 13:15:44 crc kubenswrapper[4735]: I1008 13:15:44.537188 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dc68bd5-89cn6" event={"ID":"d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7","Type":"ContainerDied","Data":"f2d2917edfcdae896262be4cd1de7fce5374bd3c8b3e62fb0b67457014bbb1fa"} Oct 08 13:15:44 crc kubenswrapper[4735]: I1008 13:15:44.537227 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dc68bd5-89cn6" event={"ID":"d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7","Type":"ContainerStarted","Data":"eb7dd8c8243482cba024f1e98fde2bf4eff726e393bd94dfd14c1c82abf76ed7"} Oct 08 13:15:44 crc kubenswrapper[4735]: I1008 13:15:44.560507 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-77f649d57f-mlpp4" podStartSLOduration=2.56046177 podStartE2EDuration="2.56046177s" podCreationTimestamp="2025-10-08 13:15:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 13:15:44.553946682 +0000 UTC m=+1194.362251226" watchObservedRunningTime="2025-10-08 13:15:44.56046177 +0000 UTC m=+1194.368766314" Oct 08 13:15:44 crc kubenswrapper[4735]: I1008 13:15:44.591801 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"ed035481-64c3-43d2-9b59-ca6ac63225cb\") " pod="openstack/glance-default-internal-api-0" Oct 08 13:15:44 crc kubenswrapper[4735]: I1008 13:15:44.591885 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-562l7\" (UniqueName: \"kubernetes.io/projected/ed035481-64c3-43d2-9b59-ca6ac63225cb-kube-api-access-562l7\") pod \"glance-default-internal-api-0\" (UID: \"ed035481-64c3-43d2-9b59-ca6ac63225cb\") " pod="openstack/glance-default-internal-api-0" Oct 08 13:15:44 crc kubenswrapper[4735]: I1008 13:15:44.591905 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed035481-64c3-43d2-9b59-ca6ac63225cb-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ed035481-64c3-43d2-9b59-ca6ac63225cb\") " pod="openstack/glance-default-internal-api-0" Oct 08 13:15:44 crc kubenswrapper[4735]: I1008 13:15:44.592033 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed035481-64c3-43d2-9b59-ca6ac63225cb-logs\") pod \"glance-default-internal-api-0\" (UID: \"ed035481-64c3-43d2-9b59-ca6ac63225cb\") " pod="openstack/glance-default-internal-api-0" Oct 08 13:15:44 crc kubenswrapper[4735]: I1008 13:15:44.592066 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ed035481-64c3-43d2-9b59-ca6ac63225cb-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ed035481-64c3-43d2-9b59-ca6ac63225cb\") " pod="openstack/glance-default-internal-api-0" Oct 08 13:15:44 crc kubenswrapper[4735]: I1008 13:15:44.592100 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed035481-64c3-43d2-9b59-ca6ac63225cb-scripts\") pod 
\"glance-default-internal-api-0\" (UID: \"ed035481-64c3-43d2-9b59-ca6ac63225cb\") " pod="openstack/glance-default-internal-api-0" Oct 08 13:15:44 crc kubenswrapper[4735]: I1008 13:15:44.592166 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed035481-64c3-43d2-9b59-ca6ac63225cb-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ed035481-64c3-43d2-9b59-ca6ac63225cb\") " pod="openstack/glance-default-internal-api-0" Oct 08 13:15:44 crc kubenswrapper[4735]: I1008 13:15:44.593821 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"ed035481-64c3-43d2-9b59-ca6ac63225cb\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0" Oct 08 13:15:44 crc kubenswrapper[4735]: I1008 13:15:44.602258 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed035481-64c3-43d2-9b59-ca6ac63225cb-logs\") pod \"glance-default-internal-api-0\" (UID: \"ed035481-64c3-43d2-9b59-ca6ac63225cb\") " pod="openstack/glance-default-internal-api-0" Oct 08 13:15:44 crc kubenswrapper[4735]: I1008 13:15:44.602556 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ed035481-64c3-43d2-9b59-ca6ac63225cb-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ed035481-64c3-43d2-9b59-ca6ac63225cb\") " pod="openstack/glance-default-internal-api-0" Oct 08 13:15:44 crc kubenswrapper[4735]: I1008 13:15:44.609298 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed035481-64c3-43d2-9b59-ca6ac63225cb-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ed035481-64c3-43d2-9b59-ca6ac63225cb\") " 
pod="openstack/glance-default-internal-api-0" Oct 08 13:15:44 crc kubenswrapper[4735]: I1008 13:15:44.613643 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed035481-64c3-43d2-9b59-ca6ac63225cb-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ed035481-64c3-43d2-9b59-ca6ac63225cb\") " pod="openstack/glance-default-internal-api-0" Oct 08 13:15:44 crc kubenswrapper[4735]: I1008 13:15:44.631676 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-562l7\" (UniqueName: \"kubernetes.io/projected/ed035481-64c3-43d2-9b59-ca6ac63225cb-kube-api-access-562l7\") pod \"glance-default-internal-api-0\" (UID: \"ed035481-64c3-43d2-9b59-ca6ac63225cb\") " pod="openstack/glance-default-internal-api-0" Oct 08 13:15:44 crc kubenswrapper[4735]: I1008 13:15:44.633726 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed035481-64c3-43d2-9b59-ca6ac63225cb-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ed035481-64c3-43d2-9b59-ca6ac63225cb\") " pod="openstack/glance-default-internal-api-0" Oct 08 13:15:44 crc kubenswrapper[4735]: I1008 13:15:44.642728 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"ed035481-64c3-43d2-9b59-ca6ac63225cb\") " pod="openstack/glance-default-internal-api-0" Oct 08 13:15:44 crc kubenswrapper[4735]: I1008 13:15:44.708659 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 08 13:15:44 crc kubenswrapper[4735]: I1008 13:15:44.935445 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 13:15:45 crc kubenswrapper[4735]: I1008 13:15:45.406992 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 13:15:45 crc kubenswrapper[4735]: W1008 13:15:45.417688 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded035481_64c3_43d2_9b59_ca6ac63225cb.slice/crio-454086dcbbeeb9452aac056649cfeaa9c93e1cc5c2fbeb2e40a37164030f1e9e WatchSource:0}: Error finding container 454086dcbbeeb9452aac056649cfeaa9c93e1cc5c2fbeb2e40a37164030f1e9e: Status 404 returned error can't find the container with id 454086dcbbeeb9452aac056649cfeaa9c93e1cc5c2fbeb2e40a37164030f1e9e Oct 08 13:15:45 crc kubenswrapper[4735]: I1008 13:15:45.545776 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ed035481-64c3-43d2-9b59-ca6ac63225cb","Type":"ContainerStarted","Data":"454086dcbbeeb9452aac056649cfeaa9c93e1cc5c2fbeb2e40a37164030f1e9e"} Oct 08 13:15:45 crc kubenswrapper[4735]: I1008 13:15:45.549146 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dc68bd5-89cn6" event={"ID":"d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7","Type":"ContainerStarted","Data":"e359b1b1bb8b10520bccf2003e64ee5c5fe849c5b9fc17e005ca7f7931793bd8"} Oct 08 13:15:45 crc kubenswrapper[4735]: I1008 13:15:45.550207 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5dc68bd5-89cn6" Oct 08 13:15:45 crc kubenswrapper[4735]: I1008 13:15:45.553857 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d","Type":"ContainerStarted","Data":"229f98a08dc34641a66dbc3886fce4429e2cd06f0f13162a9d479d9b06d6cebf"} Oct 08 13:15:45 crc kubenswrapper[4735]: I1008 13:15:45.571792 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5dc68bd5-89cn6" podStartSLOduration=2.571776532 podStartE2EDuration="2.571776532s" podCreationTimestamp="2025-10-08 13:15:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 13:15:45.56901205 +0000 UTC m=+1195.377316594" watchObservedRunningTime="2025-10-08 13:15:45.571776532 +0000 UTC m=+1195.380081076" Oct 08 13:15:45 crc kubenswrapper[4735]: I1008 13:15:45.955510 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 13:15:46 crc kubenswrapper[4735]: I1008 13:15:46.102042 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 13:15:46 crc kubenswrapper[4735]: I1008 13:15:46.576152 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d","Type":"ContainerStarted","Data":"929ea1e86d9777b07fcc6f4ac8dfd86bc08e1d420de2d79e937417ebc392a88f"} Oct 08 13:15:46 crc kubenswrapper[4735]: I1008 13:15:46.580762 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ed035481-64c3-43d2-9b59-ca6ac63225cb","Type":"ContainerStarted","Data":"4a282d2d7ef52bfbc5340fdb20f7f9aa86f104a1a4a7d26b15f4b3061e87eed4"} Oct 08 13:15:47 crc kubenswrapper[4735]: I1008 13:15:47.593953 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ed035481-64c3-43d2-9b59-ca6ac63225cb","Type":"ContainerStarted","Data":"27689a69ad6db802af2e5dda026f9e8ee8a19078825d065876abe75532b99058"} 
Oct 08 13:15:47 crc kubenswrapper[4735]: I1008 13:15:47.597704 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d","Type":"ContainerStarted","Data":"d6ce858accfe43225f333527fea19184da4919604a0bee88ecb0d7209a53fefe"} Oct 08 13:15:47 crc kubenswrapper[4735]: I1008 13:15:47.597752 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d" containerName="glance-log" containerID="cri-o://929ea1e86d9777b07fcc6f4ac8dfd86bc08e1d420de2d79e937417ebc392a88f" gracePeriod=30 Oct 08 13:15:47 crc kubenswrapper[4735]: I1008 13:15:47.597826 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d" containerName="glance-httpd" containerID="cri-o://d6ce858accfe43225f333527fea19184da4919604a0bee88ecb0d7209a53fefe" gracePeriod=30 Oct 08 13:15:47 crc kubenswrapper[4735]: I1008 13:15:47.626368 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.626348156 podStartE2EDuration="5.626348156s" podCreationTimestamp="2025-10-08 13:15:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 13:15:47.623241026 +0000 UTC m=+1197.431545570" watchObservedRunningTime="2025-10-08 13:15:47.626348156 +0000 UTC m=+1197.434652700" Oct 08 13:15:48 crc kubenswrapper[4735]: I1008 13:15:48.626869 4735 generic.go:334] "Generic (PLEG): container finished" podID="9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d" containerID="d6ce858accfe43225f333527fea19184da4919604a0bee88ecb0d7209a53fefe" exitCode=0 Oct 08 13:15:48 crc kubenswrapper[4735]: I1008 13:15:48.627288 4735 generic.go:334] "Generic (PLEG): container finished" 
podID="9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d" containerID="929ea1e86d9777b07fcc6f4ac8dfd86bc08e1d420de2d79e937417ebc392a88f" exitCode=143 Oct 08 13:15:48 crc kubenswrapper[4735]: I1008 13:15:48.626950 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d","Type":"ContainerDied","Data":"d6ce858accfe43225f333527fea19184da4919604a0bee88ecb0d7209a53fefe"} Oct 08 13:15:48 crc kubenswrapper[4735]: I1008 13:15:48.627353 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d","Type":"ContainerDied","Data":"929ea1e86d9777b07fcc6f4ac8dfd86bc08e1d420de2d79e937417ebc392a88f"} Oct 08 13:15:48 crc kubenswrapper[4735]: I1008 13:15:48.630428 4735 generic.go:334] "Generic (PLEG): container finished" podID="8fe64ebc-f229-4573-a67c-44c0d573c336" containerID="273e0d1ce3c1b0a4083ae611ee6898246e1579f6035cac87c7041c620acb2e9a" exitCode=0 Oct 08 13:15:48 crc kubenswrapper[4735]: I1008 13:15:48.630516 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-d25cz" event={"ID":"8fe64ebc-f229-4573-a67c-44c0d573c336","Type":"ContainerDied","Data":"273e0d1ce3c1b0a4083ae611ee6898246e1579f6035cac87c7041c620acb2e9a"} Oct 08 13:15:48 crc kubenswrapper[4735]: I1008 13:15:48.630612 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="ed035481-64c3-43d2-9b59-ca6ac63225cb" containerName="glance-log" containerID="cri-o://4a282d2d7ef52bfbc5340fdb20f7f9aa86f104a1a4a7d26b15f4b3061e87eed4" gracePeriod=30 Oct 08 13:15:48 crc kubenswrapper[4735]: I1008 13:15:48.630733 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="ed035481-64c3-43d2-9b59-ca6ac63225cb" containerName="glance-httpd" 
containerID="cri-o://27689a69ad6db802af2e5dda026f9e8ee8a19078825d065876abe75532b99058" gracePeriod=30 Oct 08 13:15:48 crc kubenswrapper[4735]: I1008 13:15:48.686176 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.686157707 podStartE2EDuration="5.686157707s" podCreationTimestamp="2025-10-08 13:15:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 13:15:48.676302283 +0000 UTC m=+1198.484606827" watchObservedRunningTime="2025-10-08 13:15:48.686157707 +0000 UTC m=+1198.494462251" Oct 08 13:15:49 crc kubenswrapper[4735]: I1008 13:15:49.640729 4735 generic.go:334] "Generic (PLEG): container finished" podID="ed035481-64c3-43d2-9b59-ca6ac63225cb" containerID="27689a69ad6db802af2e5dda026f9e8ee8a19078825d065876abe75532b99058" exitCode=0 Oct 08 13:15:49 crc kubenswrapper[4735]: I1008 13:15:49.640772 4735 generic.go:334] "Generic (PLEG): container finished" podID="ed035481-64c3-43d2-9b59-ca6ac63225cb" containerID="4a282d2d7ef52bfbc5340fdb20f7f9aa86f104a1a4a7d26b15f4b3061e87eed4" exitCode=143 Oct 08 13:15:49 crc kubenswrapper[4735]: I1008 13:15:49.640811 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ed035481-64c3-43d2-9b59-ca6ac63225cb","Type":"ContainerDied","Data":"27689a69ad6db802af2e5dda026f9e8ee8a19078825d065876abe75532b99058"} Oct 08 13:15:49 crc kubenswrapper[4735]: I1008 13:15:49.640885 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ed035481-64c3-43d2-9b59-ca6ac63225cb","Type":"ContainerDied","Data":"4a282d2d7ef52bfbc5340fdb20f7f9aa86f104a1a4a7d26b15f4b3061e87eed4"} Oct 08 13:15:52 crc kubenswrapper[4735]: I1008 13:15:52.920012 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-d25cz" Oct 08 13:15:53 crc kubenswrapper[4735]: I1008 13:15:53.003640 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fe64ebc-f229-4573-a67c-44c0d573c336-combined-ca-bundle\") pod \"8fe64ebc-f229-4573-a67c-44c0d573c336\" (UID: \"8fe64ebc-f229-4573-a67c-44c0d573c336\") " Oct 08 13:15:53 crc kubenswrapper[4735]: I1008 13:15:53.003708 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8fe64ebc-f229-4573-a67c-44c0d573c336-config\") pod \"8fe64ebc-f229-4573-a67c-44c0d573c336\" (UID: \"8fe64ebc-f229-4573-a67c-44c0d573c336\") " Oct 08 13:15:53 crc kubenswrapper[4735]: I1008 13:15:53.003924 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22wqv\" (UniqueName: \"kubernetes.io/projected/8fe64ebc-f229-4573-a67c-44c0d573c336-kube-api-access-22wqv\") pod \"8fe64ebc-f229-4573-a67c-44c0d573c336\" (UID: \"8fe64ebc-f229-4573-a67c-44c0d573c336\") " Oct 08 13:15:53 crc kubenswrapper[4735]: I1008 13:15:53.030646 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fe64ebc-f229-4573-a67c-44c0d573c336-kube-api-access-22wqv" (OuterVolumeSpecName: "kube-api-access-22wqv") pod "8fe64ebc-f229-4573-a67c-44c0d573c336" (UID: "8fe64ebc-f229-4573-a67c-44c0d573c336"). InnerVolumeSpecName "kube-api-access-22wqv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:15:53 crc kubenswrapper[4735]: I1008 13:15:53.080689 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fe64ebc-f229-4573-a67c-44c0d573c336-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8fe64ebc-f229-4573-a67c-44c0d573c336" (UID: "8fe64ebc-f229-4573-a67c-44c0d573c336"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:15:53 crc kubenswrapper[4735]: I1008 13:15:53.101622 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fe64ebc-f229-4573-a67c-44c0d573c336-config" (OuterVolumeSpecName: "config") pod "8fe64ebc-f229-4573-a67c-44c0d573c336" (UID: "8fe64ebc-f229-4573-a67c-44c0d573c336"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:15:53 crc kubenswrapper[4735]: I1008 13:15:53.139507 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22wqv\" (UniqueName: \"kubernetes.io/projected/8fe64ebc-f229-4573-a67c-44c0d573c336-kube-api-access-22wqv\") on node \"crc\" DevicePath \"\"" Oct 08 13:15:53 crc kubenswrapper[4735]: I1008 13:15:53.139540 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fe64ebc-f229-4573-a67c-44c0d573c336-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 13:15:53 crc kubenswrapper[4735]: I1008 13:15:53.139553 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/8fe64ebc-f229-4573-a67c-44c0d573c336-config\") on node \"crc\" DevicePath \"\"" Oct 08 13:15:53 crc kubenswrapper[4735]: I1008 13:15:53.453537 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5dc68bd5-89cn6" Oct 08 13:15:53 crc kubenswrapper[4735]: I1008 13:15:53.508834 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d96c67b5-zkqh2"] Oct 08 13:15:53 crc kubenswrapper[4735]: I1008 13:15:53.509074 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7d96c67b5-zkqh2" podUID="b5564411-c04a-4da8-a5c9-c38815d6aa2a" containerName="dnsmasq-dns" containerID="cri-o://6e2c3827a8918513321c7ce8e2aafc90f1e5c3061537f80389aa59b275166990" gracePeriod=10 Oct 08 13:15:53 crc 
kubenswrapper[4735]: I1008 13:15:53.683995 4735 generic.go:334] "Generic (PLEG): container finished" podID="b5564411-c04a-4da8-a5c9-c38815d6aa2a" containerID="6e2c3827a8918513321c7ce8e2aafc90f1e5c3061537f80389aa59b275166990" exitCode=0 Oct 08 13:15:53 crc kubenswrapper[4735]: I1008 13:15:53.684052 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d96c67b5-zkqh2" event={"ID":"b5564411-c04a-4da8-a5c9-c38815d6aa2a","Type":"ContainerDied","Data":"6e2c3827a8918513321c7ce8e2aafc90f1e5c3061537f80389aa59b275166990"} Oct 08 13:15:53 crc kubenswrapper[4735]: I1008 13:15:53.686919 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-d25cz" event={"ID":"8fe64ebc-f229-4573-a67c-44c0d573c336","Type":"ContainerDied","Data":"e0e3b7e4addda32f7902e1bfbedd0797d23b3bc63f43104df6d9b33448f04ce2"} Oct 08 13:15:53 crc kubenswrapper[4735]: I1008 13:15:53.686965 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0e3b7e4addda32f7902e1bfbedd0797d23b3bc63f43104df6d9b33448f04ce2" Oct 08 13:15:53 crc kubenswrapper[4735]: I1008 13:15:53.687029 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-d25cz" Oct 08 13:15:54 crc kubenswrapper[4735]: I1008 13:15:54.097098 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67b55c5465-m72x9"] Oct 08 13:15:54 crc kubenswrapper[4735]: E1008 13:15:54.097949 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fe64ebc-f229-4573-a67c-44c0d573c336" containerName="neutron-db-sync" Oct 08 13:15:54 crc kubenswrapper[4735]: I1008 13:15:54.097969 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fe64ebc-f229-4573-a67c-44c0d573c336" containerName="neutron-db-sync" Oct 08 13:15:54 crc kubenswrapper[4735]: I1008 13:15:54.098210 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fe64ebc-f229-4573-a67c-44c0d573c336" containerName="neutron-db-sync" Oct 08 13:15:54 crc kubenswrapper[4735]: I1008 13:15:54.099379 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67b55c5465-m72x9" Oct 08 13:15:54 crc kubenswrapper[4735]: I1008 13:15:54.113948 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67b55c5465-m72x9"] Oct 08 13:15:54 crc kubenswrapper[4735]: I1008 13:15:54.194233 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-698444467b-bvdzb"] Oct 08 13:15:54 crc kubenswrapper[4735]: I1008 13:15:54.196252 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-698444467b-bvdzb" Oct 08 13:15:54 crc kubenswrapper[4735]: I1008 13:15:54.202494 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Oct 08 13:15:54 crc kubenswrapper[4735]: I1008 13:15:54.204984 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 08 13:15:54 crc kubenswrapper[4735]: I1008 13:15:54.205079 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-v64wm" Oct 08 13:15:54 crc kubenswrapper[4735]: I1008 13:15:54.205545 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 08 13:15:54 crc kubenswrapper[4735]: I1008 13:15:54.209301 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-698444467b-bvdzb"] Oct 08 13:15:54 crc kubenswrapper[4735]: I1008 13:15:54.268576 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwfc2\" (UniqueName: \"kubernetes.io/projected/aadb7bac-54f6-4e90-a8e8-88e45ca68a04-kube-api-access-nwfc2\") pod \"dnsmasq-dns-67b55c5465-m72x9\" (UID: \"aadb7bac-54f6-4e90-a8e8-88e45ca68a04\") " pod="openstack/dnsmasq-dns-67b55c5465-m72x9" Oct 08 13:15:54 crc kubenswrapper[4735]: I1008 13:15:54.268721 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aadb7bac-54f6-4e90-a8e8-88e45ca68a04-ovsdbserver-sb\") pod \"dnsmasq-dns-67b55c5465-m72x9\" (UID: \"aadb7bac-54f6-4e90-a8e8-88e45ca68a04\") " pod="openstack/dnsmasq-dns-67b55c5465-m72x9" Oct 08 13:15:54 crc kubenswrapper[4735]: I1008 13:15:54.268778 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aadb7bac-54f6-4e90-a8e8-88e45ca68a04-config\") pod 
\"dnsmasq-dns-67b55c5465-m72x9\" (UID: \"aadb7bac-54f6-4e90-a8e8-88e45ca68a04\") " pod="openstack/dnsmasq-dns-67b55c5465-m72x9" Oct 08 13:15:54 crc kubenswrapper[4735]: I1008 13:15:54.268801 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aadb7bac-54f6-4e90-a8e8-88e45ca68a04-dns-svc\") pod \"dnsmasq-dns-67b55c5465-m72x9\" (UID: \"aadb7bac-54f6-4e90-a8e8-88e45ca68a04\") " pod="openstack/dnsmasq-dns-67b55c5465-m72x9" Oct 08 13:15:54 crc kubenswrapper[4735]: I1008 13:15:54.268937 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jz8xl\" (UniqueName: \"kubernetes.io/projected/daeb6ee4-aafc-4251-9512-43c8cde46c02-kube-api-access-jz8xl\") pod \"neutron-698444467b-bvdzb\" (UID: \"daeb6ee4-aafc-4251-9512-43c8cde46c02\") " pod="openstack/neutron-698444467b-bvdzb" Oct 08 13:15:54 crc kubenswrapper[4735]: I1008 13:15:54.269026 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aadb7bac-54f6-4e90-a8e8-88e45ca68a04-ovsdbserver-nb\") pod \"dnsmasq-dns-67b55c5465-m72x9\" (UID: \"aadb7bac-54f6-4e90-a8e8-88e45ca68a04\") " pod="openstack/dnsmasq-dns-67b55c5465-m72x9" Oct 08 13:15:54 crc kubenswrapper[4735]: I1008 13:15:54.269099 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aadb7bac-54f6-4e90-a8e8-88e45ca68a04-dns-swift-storage-0\") pod \"dnsmasq-dns-67b55c5465-m72x9\" (UID: \"aadb7bac-54f6-4e90-a8e8-88e45ca68a04\") " pod="openstack/dnsmasq-dns-67b55c5465-m72x9" Oct 08 13:15:54 crc kubenswrapper[4735]: I1008 13:15:54.269177 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/daeb6ee4-aafc-4251-9512-43c8cde46c02-ovndb-tls-certs\") pod \"neutron-698444467b-bvdzb\" (UID: \"daeb6ee4-aafc-4251-9512-43c8cde46c02\") " pod="openstack/neutron-698444467b-bvdzb" Oct 08 13:15:54 crc kubenswrapper[4735]: I1008 13:15:54.269201 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/daeb6ee4-aafc-4251-9512-43c8cde46c02-config\") pod \"neutron-698444467b-bvdzb\" (UID: \"daeb6ee4-aafc-4251-9512-43c8cde46c02\") " pod="openstack/neutron-698444467b-bvdzb" Oct 08 13:15:54 crc kubenswrapper[4735]: I1008 13:15:54.269231 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/daeb6ee4-aafc-4251-9512-43c8cde46c02-httpd-config\") pod \"neutron-698444467b-bvdzb\" (UID: \"daeb6ee4-aafc-4251-9512-43c8cde46c02\") " pod="openstack/neutron-698444467b-bvdzb" Oct 08 13:15:54 crc kubenswrapper[4735]: I1008 13:15:54.269270 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daeb6ee4-aafc-4251-9512-43c8cde46c02-combined-ca-bundle\") pod \"neutron-698444467b-bvdzb\" (UID: \"daeb6ee4-aafc-4251-9512-43c8cde46c02\") " pod="openstack/neutron-698444467b-bvdzb" Oct 08 13:15:54 crc kubenswrapper[4735]: I1008 13:15:54.370932 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/daeb6ee4-aafc-4251-9512-43c8cde46c02-ovndb-tls-certs\") pod \"neutron-698444467b-bvdzb\" (UID: \"daeb6ee4-aafc-4251-9512-43c8cde46c02\") " pod="openstack/neutron-698444467b-bvdzb" Oct 08 13:15:54 crc kubenswrapper[4735]: I1008 13:15:54.370998 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/daeb6ee4-aafc-4251-9512-43c8cde46c02-config\") 
pod \"neutron-698444467b-bvdzb\" (UID: \"daeb6ee4-aafc-4251-9512-43c8cde46c02\") " pod="openstack/neutron-698444467b-bvdzb" Oct 08 13:15:54 crc kubenswrapper[4735]: I1008 13:15:54.371069 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/daeb6ee4-aafc-4251-9512-43c8cde46c02-httpd-config\") pod \"neutron-698444467b-bvdzb\" (UID: \"daeb6ee4-aafc-4251-9512-43c8cde46c02\") " pod="openstack/neutron-698444467b-bvdzb" Oct 08 13:15:54 crc kubenswrapper[4735]: I1008 13:15:54.371106 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daeb6ee4-aafc-4251-9512-43c8cde46c02-combined-ca-bundle\") pod \"neutron-698444467b-bvdzb\" (UID: \"daeb6ee4-aafc-4251-9512-43c8cde46c02\") " pod="openstack/neutron-698444467b-bvdzb" Oct 08 13:15:54 crc kubenswrapper[4735]: I1008 13:15:54.371151 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwfc2\" (UniqueName: \"kubernetes.io/projected/aadb7bac-54f6-4e90-a8e8-88e45ca68a04-kube-api-access-nwfc2\") pod \"dnsmasq-dns-67b55c5465-m72x9\" (UID: \"aadb7bac-54f6-4e90-a8e8-88e45ca68a04\") " pod="openstack/dnsmasq-dns-67b55c5465-m72x9" Oct 08 13:15:54 crc kubenswrapper[4735]: I1008 13:15:54.371217 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aadb7bac-54f6-4e90-a8e8-88e45ca68a04-ovsdbserver-sb\") pod \"dnsmasq-dns-67b55c5465-m72x9\" (UID: \"aadb7bac-54f6-4e90-a8e8-88e45ca68a04\") " pod="openstack/dnsmasq-dns-67b55c5465-m72x9" Oct 08 13:15:54 crc kubenswrapper[4735]: I1008 13:15:54.371269 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aadb7bac-54f6-4e90-a8e8-88e45ca68a04-config\") pod \"dnsmasq-dns-67b55c5465-m72x9\" (UID: 
\"aadb7bac-54f6-4e90-a8e8-88e45ca68a04\") " pod="openstack/dnsmasq-dns-67b55c5465-m72x9" Oct 08 13:15:54 crc kubenswrapper[4735]: I1008 13:15:54.371293 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aadb7bac-54f6-4e90-a8e8-88e45ca68a04-dns-svc\") pod \"dnsmasq-dns-67b55c5465-m72x9\" (UID: \"aadb7bac-54f6-4e90-a8e8-88e45ca68a04\") " pod="openstack/dnsmasq-dns-67b55c5465-m72x9" Oct 08 13:15:54 crc kubenswrapper[4735]: I1008 13:15:54.371335 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jz8xl\" (UniqueName: \"kubernetes.io/projected/daeb6ee4-aafc-4251-9512-43c8cde46c02-kube-api-access-jz8xl\") pod \"neutron-698444467b-bvdzb\" (UID: \"daeb6ee4-aafc-4251-9512-43c8cde46c02\") " pod="openstack/neutron-698444467b-bvdzb" Oct 08 13:15:54 crc kubenswrapper[4735]: I1008 13:15:54.371375 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aadb7bac-54f6-4e90-a8e8-88e45ca68a04-ovsdbserver-nb\") pod \"dnsmasq-dns-67b55c5465-m72x9\" (UID: \"aadb7bac-54f6-4e90-a8e8-88e45ca68a04\") " pod="openstack/dnsmasq-dns-67b55c5465-m72x9" Oct 08 13:15:54 crc kubenswrapper[4735]: I1008 13:15:54.371409 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aadb7bac-54f6-4e90-a8e8-88e45ca68a04-dns-swift-storage-0\") pod \"dnsmasq-dns-67b55c5465-m72x9\" (UID: \"aadb7bac-54f6-4e90-a8e8-88e45ca68a04\") " pod="openstack/dnsmasq-dns-67b55c5465-m72x9" Oct 08 13:15:54 crc kubenswrapper[4735]: I1008 13:15:54.372574 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aadb7bac-54f6-4e90-a8e8-88e45ca68a04-dns-swift-storage-0\") pod \"dnsmasq-dns-67b55c5465-m72x9\" (UID: \"aadb7bac-54f6-4e90-a8e8-88e45ca68a04\") " 
pod="openstack/dnsmasq-dns-67b55c5465-m72x9" Oct 08 13:15:54 crc kubenswrapper[4735]: I1008 13:15:54.372602 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aadb7bac-54f6-4e90-a8e8-88e45ca68a04-dns-svc\") pod \"dnsmasq-dns-67b55c5465-m72x9\" (UID: \"aadb7bac-54f6-4e90-a8e8-88e45ca68a04\") " pod="openstack/dnsmasq-dns-67b55c5465-m72x9" Oct 08 13:15:54 crc kubenswrapper[4735]: I1008 13:15:54.372746 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aadb7bac-54f6-4e90-a8e8-88e45ca68a04-ovsdbserver-nb\") pod \"dnsmasq-dns-67b55c5465-m72x9\" (UID: \"aadb7bac-54f6-4e90-a8e8-88e45ca68a04\") " pod="openstack/dnsmasq-dns-67b55c5465-m72x9" Oct 08 13:15:54 crc kubenswrapper[4735]: I1008 13:15:54.372769 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aadb7bac-54f6-4e90-a8e8-88e45ca68a04-config\") pod \"dnsmasq-dns-67b55c5465-m72x9\" (UID: \"aadb7bac-54f6-4e90-a8e8-88e45ca68a04\") " pod="openstack/dnsmasq-dns-67b55c5465-m72x9" Oct 08 13:15:54 crc kubenswrapper[4735]: I1008 13:15:54.373617 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aadb7bac-54f6-4e90-a8e8-88e45ca68a04-ovsdbserver-sb\") pod \"dnsmasq-dns-67b55c5465-m72x9\" (UID: \"aadb7bac-54f6-4e90-a8e8-88e45ca68a04\") " pod="openstack/dnsmasq-dns-67b55c5465-m72x9" Oct 08 13:15:54 crc kubenswrapper[4735]: I1008 13:15:54.378349 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/daeb6ee4-aafc-4251-9512-43c8cde46c02-ovndb-tls-certs\") pod \"neutron-698444467b-bvdzb\" (UID: \"daeb6ee4-aafc-4251-9512-43c8cde46c02\") " pod="openstack/neutron-698444467b-bvdzb" Oct 08 13:15:54 crc kubenswrapper[4735]: I1008 13:15:54.391833 4735 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daeb6ee4-aafc-4251-9512-43c8cde46c02-combined-ca-bundle\") pod \"neutron-698444467b-bvdzb\" (UID: \"daeb6ee4-aafc-4251-9512-43c8cde46c02\") " pod="openstack/neutron-698444467b-bvdzb" Oct 08 13:15:54 crc kubenswrapper[4735]: I1008 13:15:54.396049 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jz8xl\" (UniqueName: \"kubernetes.io/projected/daeb6ee4-aafc-4251-9512-43c8cde46c02-kube-api-access-jz8xl\") pod \"neutron-698444467b-bvdzb\" (UID: \"daeb6ee4-aafc-4251-9512-43c8cde46c02\") " pod="openstack/neutron-698444467b-bvdzb" Oct 08 13:15:54 crc kubenswrapper[4735]: I1008 13:15:54.396173 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/daeb6ee4-aafc-4251-9512-43c8cde46c02-config\") pod \"neutron-698444467b-bvdzb\" (UID: \"daeb6ee4-aafc-4251-9512-43c8cde46c02\") " pod="openstack/neutron-698444467b-bvdzb" Oct 08 13:15:54 crc kubenswrapper[4735]: I1008 13:15:54.396419 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/daeb6ee4-aafc-4251-9512-43c8cde46c02-httpd-config\") pod \"neutron-698444467b-bvdzb\" (UID: \"daeb6ee4-aafc-4251-9512-43c8cde46c02\") " pod="openstack/neutron-698444467b-bvdzb" Oct 08 13:15:54 crc kubenswrapper[4735]: I1008 13:15:54.400382 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwfc2\" (UniqueName: \"kubernetes.io/projected/aadb7bac-54f6-4e90-a8e8-88e45ca68a04-kube-api-access-nwfc2\") pod \"dnsmasq-dns-67b55c5465-m72x9\" (UID: \"aadb7bac-54f6-4e90-a8e8-88e45ca68a04\") " pod="openstack/dnsmasq-dns-67b55c5465-m72x9" Oct 08 13:15:54 crc kubenswrapper[4735]: I1008 13:15:54.429586 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67b55c5465-m72x9" Oct 08 13:15:54 crc kubenswrapper[4735]: I1008 13:15:54.524109 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-698444467b-bvdzb" Oct 08 13:15:54 crc kubenswrapper[4735]: I1008 13:15:54.878296 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 13:15:54 crc kubenswrapper[4735]: I1008 13:15:54.878359 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 13:15:56 crc kubenswrapper[4735]: I1008 13:15:56.913630 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5574db6b6f-94l7x"] Oct 08 13:15:56 crc kubenswrapper[4735]: I1008 13:15:56.915324 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5574db6b6f-94l7x" Oct 08 13:15:56 crc kubenswrapper[4735]: I1008 13:15:56.918664 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Oct 08 13:15:56 crc kubenswrapper[4735]: I1008 13:15:56.919054 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Oct 08 13:15:56 crc kubenswrapper[4735]: I1008 13:15:56.924649 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5574db6b6f-94l7x"] Oct 08 13:15:57 crc kubenswrapper[4735]: I1008 13:15:57.022178 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/74ecad81-3dfd-4f36-be79-699b6bd30e90-httpd-config\") pod \"neutron-5574db6b6f-94l7x\" (UID: \"74ecad81-3dfd-4f36-be79-699b6bd30e90\") " pod="openstack/neutron-5574db6b6f-94l7x" Oct 08 13:15:57 crc kubenswrapper[4735]: I1008 13:15:57.022299 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/74ecad81-3dfd-4f36-be79-699b6bd30e90-config\") pod \"neutron-5574db6b6f-94l7x\" (UID: \"74ecad81-3dfd-4f36-be79-699b6bd30e90\") " pod="openstack/neutron-5574db6b6f-94l7x" Oct 08 13:15:57 crc kubenswrapper[4735]: I1008 13:15:57.022331 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/74ecad81-3dfd-4f36-be79-699b6bd30e90-public-tls-certs\") pod \"neutron-5574db6b6f-94l7x\" (UID: \"74ecad81-3dfd-4f36-be79-699b6bd30e90\") " pod="openstack/neutron-5574db6b6f-94l7x" Oct 08 13:15:57 crc kubenswrapper[4735]: I1008 13:15:57.022356 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjfqt\" (UniqueName: 
\"kubernetes.io/projected/74ecad81-3dfd-4f36-be79-699b6bd30e90-kube-api-access-rjfqt\") pod \"neutron-5574db6b6f-94l7x\" (UID: \"74ecad81-3dfd-4f36-be79-699b6bd30e90\") " pod="openstack/neutron-5574db6b6f-94l7x" Oct 08 13:15:57 crc kubenswrapper[4735]: I1008 13:15:57.022384 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74ecad81-3dfd-4f36-be79-699b6bd30e90-combined-ca-bundle\") pod \"neutron-5574db6b6f-94l7x\" (UID: \"74ecad81-3dfd-4f36-be79-699b6bd30e90\") " pod="openstack/neutron-5574db6b6f-94l7x" Oct 08 13:15:57 crc kubenswrapper[4735]: I1008 13:15:57.022418 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/74ecad81-3dfd-4f36-be79-699b6bd30e90-internal-tls-certs\") pod \"neutron-5574db6b6f-94l7x\" (UID: \"74ecad81-3dfd-4f36-be79-699b6bd30e90\") " pod="openstack/neutron-5574db6b6f-94l7x" Oct 08 13:15:57 crc kubenswrapper[4735]: I1008 13:15:57.022453 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/74ecad81-3dfd-4f36-be79-699b6bd30e90-ovndb-tls-certs\") pod \"neutron-5574db6b6f-94l7x\" (UID: \"74ecad81-3dfd-4f36-be79-699b6bd30e90\") " pod="openstack/neutron-5574db6b6f-94l7x" Oct 08 13:15:57 crc kubenswrapper[4735]: I1008 13:15:57.123695 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjfqt\" (UniqueName: \"kubernetes.io/projected/74ecad81-3dfd-4f36-be79-699b6bd30e90-kube-api-access-rjfqt\") pod \"neutron-5574db6b6f-94l7x\" (UID: \"74ecad81-3dfd-4f36-be79-699b6bd30e90\") " pod="openstack/neutron-5574db6b6f-94l7x" Oct 08 13:15:57 crc kubenswrapper[4735]: I1008 13:15:57.124236 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/74ecad81-3dfd-4f36-be79-699b6bd30e90-combined-ca-bundle\") pod \"neutron-5574db6b6f-94l7x\" (UID: \"74ecad81-3dfd-4f36-be79-699b6bd30e90\") " pod="openstack/neutron-5574db6b6f-94l7x" Oct 08 13:15:57 crc kubenswrapper[4735]: I1008 13:15:57.124265 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/74ecad81-3dfd-4f36-be79-699b6bd30e90-internal-tls-certs\") pod \"neutron-5574db6b6f-94l7x\" (UID: \"74ecad81-3dfd-4f36-be79-699b6bd30e90\") " pod="openstack/neutron-5574db6b6f-94l7x" Oct 08 13:15:57 crc kubenswrapper[4735]: I1008 13:15:57.124289 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/74ecad81-3dfd-4f36-be79-699b6bd30e90-ovndb-tls-certs\") pod \"neutron-5574db6b6f-94l7x\" (UID: \"74ecad81-3dfd-4f36-be79-699b6bd30e90\") " pod="openstack/neutron-5574db6b6f-94l7x" Oct 08 13:15:57 crc kubenswrapper[4735]: I1008 13:15:57.124390 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/74ecad81-3dfd-4f36-be79-699b6bd30e90-httpd-config\") pod \"neutron-5574db6b6f-94l7x\" (UID: \"74ecad81-3dfd-4f36-be79-699b6bd30e90\") " pod="openstack/neutron-5574db6b6f-94l7x" Oct 08 13:15:57 crc kubenswrapper[4735]: I1008 13:15:57.124425 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/74ecad81-3dfd-4f36-be79-699b6bd30e90-config\") pod \"neutron-5574db6b6f-94l7x\" (UID: \"74ecad81-3dfd-4f36-be79-699b6bd30e90\") " pod="openstack/neutron-5574db6b6f-94l7x" Oct 08 13:15:57 crc kubenswrapper[4735]: I1008 13:15:57.124445 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/74ecad81-3dfd-4f36-be79-699b6bd30e90-public-tls-certs\") pod \"neutron-5574db6b6f-94l7x\" 
(UID: \"74ecad81-3dfd-4f36-be79-699b6bd30e90\") " pod="openstack/neutron-5574db6b6f-94l7x" Oct 08 13:15:57 crc kubenswrapper[4735]: I1008 13:15:57.133539 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74ecad81-3dfd-4f36-be79-699b6bd30e90-combined-ca-bundle\") pod \"neutron-5574db6b6f-94l7x\" (UID: \"74ecad81-3dfd-4f36-be79-699b6bd30e90\") " pod="openstack/neutron-5574db6b6f-94l7x" Oct 08 13:15:57 crc kubenswrapper[4735]: I1008 13:15:57.133597 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/74ecad81-3dfd-4f36-be79-699b6bd30e90-config\") pod \"neutron-5574db6b6f-94l7x\" (UID: \"74ecad81-3dfd-4f36-be79-699b6bd30e90\") " pod="openstack/neutron-5574db6b6f-94l7x" Oct 08 13:15:57 crc kubenswrapper[4735]: I1008 13:15:57.133681 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/74ecad81-3dfd-4f36-be79-699b6bd30e90-ovndb-tls-certs\") pod \"neutron-5574db6b6f-94l7x\" (UID: \"74ecad81-3dfd-4f36-be79-699b6bd30e90\") " pod="openstack/neutron-5574db6b6f-94l7x" Oct 08 13:15:57 crc kubenswrapper[4735]: I1008 13:15:57.134001 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/74ecad81-3dfd-4f36-be79-699b6bd30e90-internal-tls-certs\") pod \"neutron-5574db6b6f-94l7x\" (UID: \"74ecad81-3dfd-4f36-be79-699b6bd30e90\") " pod="openstack/neutron-5574db6b6f-94l7x" Oct 08 13:15:57 crc kubenswrapper[4735]: I1008 13:15:57.134167 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/74ecad81-3dfd-4f36-be79-699b6bd30e90-httpd-config\") pod \"neutron-5574db6b6f-94l7x\" (UID: \"74ecad81-3dfd-4f36-be79-699b6bd30e90\") " pod="openstack/neutron-5574db6b6f-94l7x" Oct 08 13:15:57 crc kubenswrapper[4735]: I1008 
13:15:57.148093 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/74ecad81-3dfd-4f36-be79-699b6bd30e90-public-tls-certs\") pod \"neutron-5574db6b6f-94l7x\" (UID: \"74ecad81-3dfd-4f36-be79-699b6bd30e90\") " pod="openstack/neutron-5574db6b6f-94l7x" Oct 08 13:15:57 crc kubenswrapper[4735]: I1008 13:15:57.154010 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjfqt\" (UniqueName: \"kubernetes.io/projected/74ecad81-3dfd-4f36-be79-699b6bd30e90-kube-api-access-rjfqt\") pod \"neutron-5574db6b6f-94l7x\" (UID: \"74ecad81-3dfd-4f36-be79-699b6bd30e90\") " pod="openstack/neutron-5574db6b6f-94l7x" Oct 08 13:15:57 crc kubenswrapper[4735]: I1008 13:15:57.246957 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5574db6b6f-94l7x" Oct 08 13:15:58 crc kubenswrapper[4735]: I1008 13:15:58.292135 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7d96c67b5-zkqh2" podUID="b5564411-c04a-4da8-a5c9-c38815d6aa2a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.138:5353: connect: connection refused" Oct 08 13:16:02 crc kubenswrapper[4735]: E1008 13:16:02.083949 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/ubi9/httpd-24@sha256:e91d58021b54c46883595ff66be65882de54abdb3be2ca53c4162b20d18b5f48" Oct 08 13:16:02 crc kubenswrapper[4735]: E1008 13:16:02.084516 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:proxy-httpd,Image:registry.redhat.io/ubi9/httpd-24@sha256:e91d58021b54c46883595ff66be65882de54abdb3be2ca53c4162b20d18b5f48,Command:[/usr/sbin/httpd],Args:[-DFOREGROUND],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:proxy-httpd,HostPort:0,ContainerPort:3000,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf/httpd.conf,SubPath:httpd.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf.d/ssl.conf,SubPath:ssl.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:run-httpd,ReadOnly:false,MountPath:/run/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:log-httpd,ReadOnly:false,MountPath:/var/log/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dx6hf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(328b1aa0-7e48-4aa0-b456-e1beefb886c0): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 08 13:16:02 crc kubenswrapper[4735]: E1008 13:16:02.085991 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"proxy-httpd\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="328b1aa0-7e48-4aa0-b456-e1beefb886c0" Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.197880 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.203440 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.346895 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ed035481-64c3-43d2-9b59-ca6ac63225cb-httpd-run\") pod \"ed035481-64c3-43d2-9b59-ca6ac63225cb\" (UID: \"ed035481-64c3-43d2-9b59-ca6ac63225cb\") " Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.346946 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d-httpd-run\") pod \"9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d\" (UID: \"9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d\") " Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.346987 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d-combined-ca-bundle\") pod \"9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d\" (UID: \"9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d\") " Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.347020 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d\" (UID: \"9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d\") " Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.347039 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ed035481-64c3-43d2-9b59-ca6ac63225cb\" (UID: \"ed035481-64c3-43d2-9b59-ca6ac63225cb\") " Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.347066 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d-config-data\") pod \"9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d\" (UID: \"9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d\") " Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.347095 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed035481-64c3-43d2-9b59-ca6ac63225cb-logs\") pod \"ed035481-64c3-43d2-9b59-ca6ac63225cb\" (UID: \"ed035481-64c3-43d2-9b59-ca6ac63225cb\") " Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.347185 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed035481-64c3-43d2-9b59-ca6ac63225cb-combined-ca-bundle\") pod \"ed035481-64c3-43d2-9b59-ca6ac63225cb\" (UID: \"ed035481-64c3-43d2-9b59-ca6ac63225cb\") " Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.347221 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-562l7\" (UniqueName: \"kubernetes.io/projected/ed035481-64c3-43d2-9b59-ca6ac63225cb-kube-api-access-562l7\") pod \"ed035481-64c3-43d2-9b59-ca6ac63225cb\" (UID: \"ed035481-64c3-43d2-9b59-ca6ac63225cb\") " Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.347272 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fsszl\" (UniqueName: \"kubernetes.io/projected/9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d-kube-api-access-fsszl\") pod \"9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d\" (UID: \"9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d\") " Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.347304 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed035481-64c3-43d2-9b59-ca6ac63225cb-config-data\") pod \"ed035481-64c3-43d2-9b59-ca6ac63225cb\" (UID: \"ed035481-64c3-43d2-9b59-ca6ac63225cb\") " Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 
13:16:02.347336 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d-scripts\") pod \"9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d\" (UID: \"9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d\") " Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.347422 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d-logs\") pod \"9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d\" (UID: \"9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d\") " Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.347426 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed035481-64c3-43d2-9b59-ca6ac63225cb-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ed035481-64c3-43d2-9b59-ca6ac63225cb" (UID: "ed035481-64c3-43d2-9b59-ca6ac63225cb"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.347454 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed035481-64c3-43d2-9b59-ca6ac63225cb-scripts\") pod \"ed035481-64c3-43d2-9b59-ca6ac63225cb\" (UID: \"ed035481-64c3-43d2-9b59-ca6ac63225cb\") " Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.347460 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d" (UID: "9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.347901 4735 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ed035481-64c3-43d2-9b59-ca6ac63225cb-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.347926 4735 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.348663 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d-logs" (OuterVolumeSpecName: "logs") pod "9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d" (UID: "9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.349512 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed035481-64c3-43d2-9b59-ca6ac63225cb-logs" (OuterVolumeSpecName: "logs") pod "ed035481-64c3-43d2-9b59-ca6ac63225cb" (UID: "ed035481-64c3-43d2-9b59-ca6ac63225cb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.352400 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d-scripts" (OuterVolumeSpecName: "scripts") pod "9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d" (UID: "9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.352855 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "ed035481-64c3-43d2-9b59-ca6ac63225cb" (UID: "ed035481-64c3-43d2-9b59-ca6ac63225cb"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.353557 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed035481-64c3-43d2-9b59-ca6ac63225cb-scripts" (OuterVolumeSpecName: "scripts") pod "ed035481-64c3-43d2-9b59-ca6ac63225cb" (UID: "ed035481-64c3-43d2-9b59-ca6ac63225cb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.353717 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d-kube-api-access-fsszl" (OuterVolumeSpecName: "kube-api-access-fsszl") pod "9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d" (UID: "9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d"). InnerVolumeSpecName "kube-api-access-fsszl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.359238 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d" (UID: "9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d"). InnerVolumeSpecName "local-storage10-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.373892 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed035481-64c3-43d2-9b59-ca6ac63225cb-kube-api-access-562l7" (OuterVolumeSpecName: "kube-api-access-562l7") pod "ed035481-64c3-43d2-9b59-ca6ac63225cb" (UID: "ed035481-64c3-43d2-9b59-ca6ac63225cb"). InnerVolumeSpecName "kube-api-access-562l7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.397680 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed035481-64c3-43d2-9b59-ca6ac63225cb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ed035481-64c3-43d2-9b59-ca6ac63225cb" (UID: "ed035481-64c3-43d2-9b59-ca6ac63225cb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.400579 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d" (UID: "9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.428844 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d-config-data" (OuterVolumeSpecName: "config-data") pod "9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d" (UID: "9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.436141 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed035481-64c3-43d2-9b59-ca6ac63225cb-config-data" (OuterVolumeSpecName: "config-data") pod "ed035481-64c3-43d2-9b59-ca6ac63225cb" (UID: "ed035481-64c3-43d2-9b59-ca6ac63225cb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.449200 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-562l7\" (UniqueName: \"kubernetes.io/projected/ed035481-64c3-43d2-9b59-ca6ac63225cb-kube-api-access-562l7\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.449241 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fsszl\" (UniqueName: \"kubernetes.io/projected/9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d-kube-api-access-fsszl\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.449257 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed035481-64c3-43d2-9b59-ca6ac63225cb-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.449270 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.449282 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d-logs\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.449294 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/ed035481-64c3-43d2-9b59-ca6ac63225cb-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.449304 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.449341 4735 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.449359 4735 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.449372 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.449383 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed035481-64c3-43d2-9b59-ca6ac63225cb-logs\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.449393 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed035481-64c3-43d2-9b59-ca6ac63225cb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.468084 4735 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.470515 4735 operation_generator.go:917] 
UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.550838 4735 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.550875 4735 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.761064 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d","Type":"ContainerDied","Data":"229f98a08dc34641a66dbc3886fce4429e2cd06f0f13162a9d479d9b06d6cebf"} Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.761097 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.761151 4735 scope.go:117] "RemoveContainer" containerID="d6ce858accfe43225f333527fea19184da4919604a0bee88ecb0d7209a53fefe" Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.764685 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ed035481-64c3-43d2-9b59-ca6ac63225cb","Type":"ContainerDied","Data":"454086dcbbeeb9452aac056649cfeaa9c93e1cc5c2fbeb2e40a37164030f1e9e"} Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.764726 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.764821 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="328b1aa0-7e48-4aa0-b456-e1beefb886c0" containerName="ceilometer-central-agent" containerID="cri-o://69f2b92d74478d36b5fa99d63766ae94ce188248b20f46c69cafe2ab02838b84" gracePeriod=30 Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.764860 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="328b1aa0-7e48-4aa0-b456-e1beefb886c0" containerName="sg-core" containerID="cri-o://d4d4c45fc2bb4c55e870e6a341ec3bfb743e73b6533a1f68df2d34f7ccb1516b" gracePeriod=30 Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.764862 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="328b1aa0-7e48-4aa0-b456-e1beefb886c0" containerName="ceilometer-notification-agent" containerID="cri-o://fd3f1e1a91c4a5ce714042c3fb2b942b6db4ce9bbbba4c7bb7f373507469b9ab" gracePeriod=30 Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.820173 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.831587 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.842722 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.853278 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.865371 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 13:16:02 crc kubenswrapper[4735]: E1008 
13:16:02.865842 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d" containerName="glance-httpd" Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.865865 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d" containerName="glance-httpd" Oct 08 13:16:02 crc kubenswrapper[4735]: E1008 13:16:02.865886 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d" containerName="glance-log" Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.865894 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d" containerName="glance-log" Oct 08 13:16:02 crc kubenswrapper[4735]: E1008 13:16:02.865982 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed035481-64c3-43d2-9b59-ca6ac63225cb" containerName="glance-httpd" Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.865991 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed035481-64c3-43d2-9b59-ca6ac63225cb" containerName="glance-httpd" Oct 08 13:16:02 crc kubenswrapper[4735]: E1008 13:16:02.866008 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed035481-64c3-43d2-9b59-ca6ac63225cb" containerName="glance-log" Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.866016 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed035481-64c3-43d2-9b59-ca6ac63225cb" containerName="glance-log" Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.866252 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d" containerName="glance-log" Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.866282 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed035481-64c3-43d2-9b59-ca6ac63225cb" containerName="glance-log" Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.866293 4735 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="ed035481-64c3-43d2-9b59-ca6ac63225cb" containerName="glance-httpd" Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.866311 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d" containerName="glance-httpd" Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.882330 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.886891 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.886891 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.887219 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-76gcn" Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.886943 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.891985 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.956752 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d" path="/var/lib/kubelet/pods/9e3d9e4f-f6a7-4ab9-922c-0ca10c21284d/volumes" Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.957545 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed035481-64c3-43d2-9b59-ca6ac63225cb" path="/var/lib/kubelet/pods/ed035481-64c3-43d2-9b59-ca6ac63225cb/volumes" Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.957839 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-nlqmw\" (UniqueName: \"kubernetes.io/projected/2ee831ac-1cdb-4271-b7a3-054313f1d20b-kube-api-access-nlqmw\") pod \"glance-default-internal-api-0\" (UID: \"2ee831ac-1cdb-4271-b7a3-054313f1d20b\") " pod="openstack/glance-default-internal-api-0" Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.957908 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ee831ac-1cdb-4271-b7a3-054313f1d20b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2ee831ac-1cdb-4271-b7a3-054313f1d20b\") " pod="openstack/glance-default-internal-api-0" Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.957938 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ee831ac-1cdb-4271-b7a3-054313f1d20b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2ee831ac-1cdb-4271-b7a3-054313f1d20b\") " pod="openstack/glance-default-internal-api-0" Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.957964 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"2ee831ac-1cdb-4271-b7a3-054313f1d20b\") " pod="openstack/glance-default-internal-api-0" Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.958011 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2ee831ac-1cdb-4271-b7a3-054313f1d20b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2ee831ac-1cdb-4271-b7a3-054313f1d20b\") " pod="openstack/glance-default-internal-api-0" Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.958057 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ee831ac-1cdb-4271-b7a3-054313f1d20b-logs\") pod \"glance-default-internal-api-0\" (UID: \"2ee831ac-1cdb-4271-b7a3-054313f1d20b\") " pod="openstack/glance-default-internal-api-0" Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.958103 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ee831ac-1cdb-4271-b7a3-054313f1d20b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2ee831ac-1cdb-4271-b7a3-054313f1d20b\") " pod="openstack/glance-default-internal-api-0" Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.958155 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ee831ac-1cdb-4271-b7a3-054313f1d20b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2ee831ac-1cdb-4271-b7a3-054313f1d20b\") " pod="openstack/glance-default-internal-api-0" Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.959105 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.961178 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.963575 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.963871 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 08 13:16:02 crc kubenswrapper[4735]: I1008 13:16:02.970014 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.060138 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ee831ac-1cdb-4271-b7a3-054313f1d20b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2ee831ac-1cdb-4271-b7a3-054313f1d20b\") " pod="openstack/glance-default-internal-api-0" Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.060218 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f7ws\" (UniqueName: \"kubernetes.io/projected/af8d6fa0-e75f-40fe-92e9-284df31eae95-kube-api-access-9f7ws\") pod \"glance-default-external-api-0\" (UID: \"af8d6fa0-e75f-40fe-92e9-284df31eae95\") " pod="openstack/glance-default-external-api-0" Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.060255 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"af8d6fa0-e75f-40fe-92e9-284df31eae95\") " pod="openstack/glance-default-external-api-0" Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.060288 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/af8d6fa0-e75f-40fe-92e9-284df31eae95-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"af8d6fa0-e75f-40fe-92e9-284df31eae95\") " pod="openstack/glance-default-external-api-0" Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.060449 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ee831ac-1cdb-4271-b7a3-054313f1d20b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2ee831ac-1cdb-4271-b7a3-054313f1d20b\") " pod="openstack/glance-default-internal-api-0" Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.060535 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/af8d6fa0-e75f-40fe-92e9-284df31eae95-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"af8d6fa0-e75f-40fe-92e9-284df31eae95\") " pod="openstack/glance-default-external-api-0" Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.060680 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlqmw\" (UniqueName: \"kubernetes.io/projected/2ee831ac-1cdb-4271-b7a3-054313f1d20b-kube-api-access-nlqmw\") pod \"glance-default-internal-api-0\" (UID: \"2ee831ac-1cdb-4271-b7a3-054313f1d20b\") " pod="openstack/glance-default-internal-api-0" Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.060720 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af8d6fa0-e75f-40fe-92e9-284df31eae95-config-data\") pod \"glance-default-external-api-0\" (UID: \"af8d6fa0-e75f-40fe-92e9-284df31eae95\") " pod="openstack/glance-default-external-api-0" Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.060749 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/2ee831ac-1cdb-4271-b7a3-054313f1d20b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2ee831ac-1cdb-4271-b7a3-054313f1d20b\") " pod="openstack/glance-default-internal-api-0" Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.060771 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ee831ac-1cdb-4271-b7a3-054313f1d20b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2ee831ac-1cdb-4271-b7a3-054313f1d20b\") " pod="openstack/glance-default-internal-api-0" Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.060795 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"2ee831ac-1cdb-4271-b7a3-054313f1d20b\") " pod="openstack/glance-default-internal-api-0" Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.060877 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2ee831ac-1cdb-4271-b7a3-054313f1d20b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2ee831ac-1cdb-4271-b7a3-054313f1d20b\") " pod="openstack/glance-default-internal-api-0" Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.060899 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af8d6fa0-e75f-40fe-92e9-284df31eae95-scripts\") pod \"glance-default-external-api-0\" (UID: \"af8d6fa0-e75f-40fe-92e9-284df31eae95\") " pod="openstack/glance-default-external-api-0" Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.061070 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/af8d6fa0-e75f-40fe-92e9-284df31eae95-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"af8d6fa0-e75f-40fe-92e9-284df31eae95\") " pod="openstack/glance-default-external-api-0" Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.061259 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ee831ac-1cdb-4271-b7a3-054313f1d20b-logs\") pod \"glance-default-internal-api-0\" (UID: \"2ee831ac-1cdb-4271-b7a3-054313f1d20b\") " pod="openstack/glance-default-internal-api-0" Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.061305 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af8d6fa0-e75f-40fe-92e9-284df31eae95-logs\") pod \"glance-default-external-api-0\" (UID: \"af8d6fa0-e75f-40fe-92e9-284df31eae95\") " pod="openstack/glance-default-external-api-0" Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.061348 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"2ee831ac-1cdb-4271-b7a3-054313f1d20b\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0" Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.061792 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2ee831ac-1cdb-4271-b7a3-054313f1d20b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2ee831ac-1cdb-4271-b7a3-054313f1d20b\") " pod="openstack/glance-default-internal-api-0" Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.062394 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ee831ac-1cdb-4271-b7a3-054313f1d20b-logs\") pod 
\"glance-default-internal-api-0\" (UID: \"2ee831ac-1cdb-4271-b7a3-054313f1d20b\") " pod="openstack/glance-default-internal-api-0" Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.065266 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ee831ac-1cdb-4271-b7a3-054313f1d20b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2ee831ac-1cdb-4271-b7a3-054313f1d20b\") " pod="openstack/glance-default-internal-api-0" Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.065981 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ee831ac-1cdb-4271-b7a3-054313f1d20b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2ee831ac-1cdb-4271-b7a3-054313f1d20b\") " pod="openstack/glance-default-internal-api-0" Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.066525 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ee831ac-1cdb-4271-b7a3-054313f1d20b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2ee831ac-1cdb-4271-b7a3-054313f1d20b\") " pod="openstack/glance-default-internal-api-0" Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.066818 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ee831ac-1cdb-4271-b7a3-054313f1d20b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2ee831ac-1cdb-4271-b7a3-054313f1d20b\") " pod="openstack/glance-default-internal-api-0" Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.080286 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlqmw\" (UniqueName: \"kubernetes.io/projected/2ee831ac-1cdb-4271-b7a3-054313f1d20b-kube-api-access-nlqmw\") pod \"glance-default-internal-api-0\" (UID: \"2ee831ac-1cdb-4271-b7a3-054313f1d20b\") " 
pod="openstack/glance-default-internal-api-0" Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.101726 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"2ee831ac-1cdb-4271-b7a3-054313f1d20b\") " pod="openstack/glance-default-internal-api-0" Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.163606 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af8d6fa0-e75f-40fe-92e9-284df31eae95-config-data\") pod \"glance-default-external-api-0\" (UID: \"af8d6fa0-e75f-40fe-92e9-284df31eae95\") " pod="openstack/glance-default-external-api-0" Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.163675 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af8d6fa0-e75f-40fe-92e9-284df31eae95-scripts\") pod \"glance-default-external-api-0\" (UID: \"af8d6fa0-e75f-40fe-92e9-284df31eae95\") " pod="openstack/glance-default-external-api-0" Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.163717 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af8d6fa0-e75f-40fe-92e9-284df31eae95-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"af8d6fa0-e75f-40fe-92e9-284df31eae95\") " pod="openstack/glance-default-external-api-0" Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.163738 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af8d6fa0-e75f-40fe-92e9-284df31eae95-logs\") pod \"glance-default-external-api-0\" (UID: \"af8d6fa0-e75f-40fe-92e9-284df31eae95\") " pod="openstack/glance-default-external-api-0" Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.163784 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9f7ws\" (UniqueName: \"kubernetes.io/projected/af8d6fa0-e75f-40fe-92e9-284df31eae95-kube-api-access-9f7ws\") pod \"glance-default-external-api-0\" (UID: \"af8d6fa0-e75f-40fe-92e9-284df31eae95\") " pod="openstack/glance-default-external-api-0" Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.163804 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"af8d6fa0-e75f-40fe-92e9-284df31eae95\") " pod="openstack/glance-default-external-api-0" Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.163826 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/af8d6fa0-e75f-40fe-92e9-284df31eae95-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"af8d6fa0-e75f-40fe-92e9-284df31eae95\") " pod="openstack/glance-default-external-api-0" Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.163853 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/af8d6fa0-e75f-40fe-92e9-284df31eae95-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"af8d6fa0-e75f-40fe-92e9-284df31eae95\") " pod="openstack/glance-default-external-api-0" Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.164149 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"af8d6fa0-e75f-40fe-92e9-284df31eae95\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.164356 4735 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af8d6fa0-e75f-40fe-92e9-284df31eae95-logs\") pod \"glance-default-external-api-0\" (UID: \"af8d6fa0-e75f-40fe-92e9-284df31eae95\") " pod="openstack/glance-default-external-api-0" Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.166380 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/af8d6fa0-e75f-40fe-92e9-284df31eae95-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"af8d6fa0-e75f-40fe-92e9-284df31eae95\") " pod="openstack/glance-default-external-api-0" Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.168805 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/af8d6fa0-e75f-40fe-92e9-284df31eae95-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"af8d6fa0-e75f-40fe-92e9-284df31eae95\") " pod="openstack/glance-default-external-api-0" Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.169178 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af8d6fa0-e75f-40fe-92e9-284df31eae95-config-data\") pod \"glance-default-external-api-0\" (UID: \"af8d6fa0-e75f-40fe-92e9-284df31eae95\") " pod="openstack/glance-default-external-api-0" Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.169523 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af8d6fa0-e75f-40fe-92e9-284df31eae95-scripts\") pod \"glance-default-external-api-0\" (UID: \"af8d6fa0-e75f-40fe-92e9-284df31eae95\") " pod="openstack/glance-default-external-api-0" Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.181330 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af8d6fa0-e75f-40fe-92e9-284df31eae95-combined-ca-bundle\") 
pod \"glance-default-external-api-0\" (UID: \"af8d6fa0-e75f-40fe-92e9-284df31eae95\") " pod="openstack/glance-default-external-api-0" Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.184394 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9f7ws\" (UniqueName: \"kubernetes.io/projected/af8d6fa0-e75f-40fe-92e9-284df31eae95-kube-api-access-9f7ws\") pod \"glance-default-external-api-0\" (UID: \"af8d6fa0-e75f-40fe-92e9-284df31eae95\") " pod="openstack/glance-default-external-api-0" Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.199521 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"af8d6fa0-e75f-40fe-92e9-284df31eae95\") " pod="openstack/glance-default-external-api-0" Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.225545 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.284133 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 13:16:03 crc kubenswrapper[4735]: E1008 13:16:03.564866 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:85c75d60e1bd2f8a9ea0a2bb21a8df64c0a6f7b504cc1a05a355981d4b90e92f" Oct 08 13:16:03 crc kubenswrapper[4735]: E1008 13:16:03.565169 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:85c75d60e1bd2f8a9ea0a2bb21a8df64c0a6f7b504cc1a05a355981d4b90e92f,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},Volum
eMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6mvxm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-nplcf_openstack(2dd43efe-433d-4590-bcf9-3304cd6e32fb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 08 13:16:03 crc kubenswrapper[4735]: E1008 13:16:03.566746 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-nplcf" podUID="2dd43efe-433d-4590-bcf9-3304cd6e32fb" Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.616016 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d96c67b5-zkqh2" Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.643778 4735 scope.go:117] "RemoveContainer" containerID="929ea1e86d9777b07fcc6f4ac8dfd86bc08e1d420de2d79e937417ebc392a88f" Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.673367 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b5564411-c04a-4da8-a5c9-c38815d6aa2a-dns-swift-storage-0\") pod \"b5564411-c04a-4da8-a5c9-c38815d6aa2a\" (UID: \"b5564411-c04a-4da8-a5c9-c38815d6aa2a\") " Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.673414 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5564411-c04a-4da8-a5c9-c38815d6aa2a-dns-svc\") pod \"b5564411-c04a-4da8-a5c9-c38815d6aa2a\" (UID: \"b5564411-c04a-4da8-a5c9-c38815d6aa2a\") " Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.673462 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5564411-c04a-4da8-a5c9-c38815d6aa2a-ovsdbserver-nb\") pod \"b5564411-c04a-4da8-a5c9-c38815d6aa2a\" (UID: \"b5564411-c04a-4da8-a5c9-c38815d6aa2a\") " Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.673519 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8l5rs\" (UniqueName: \"kubernetes.io/projected/b5564411-c04a-4da8-a5c9-c38815d6aa2a-kube-api-access-8l5rs\") pod \"b5564411-c04a-4da8-a5c9-c38815d6aa2a\" (UID: \"b5564411-c04a-4da8-a5c9-c38815d6aa2a\") " Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.673685 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5564411-c04a-4da8-a5c9-c38815d6aa2a-config\") pod \"b5564411-c04a-4da8-a5c9-c38815d6aa2a\" (UID: 
\"b5564411-c04a-4da8-a5c9-c38815d6aa2a\") " Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.673725 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5564411-c04a-4da8-a5c9-c38815d6aa2a-ovsdbserver-sb\") pod \"b5564411-c04a-4da8-a5c9-c38815d6aa2a\" (UID: \"b5564411-c04a-4da8-a5c9-c38815d6aa2a\") " Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.693679 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5564411-c04a-4da8-a5c9-c38815d6aa2a-kube-api-access-8l5rs" (OuterVolumeSpecName: "kube-api-access-8l5rs") pod "b5564411-c04a-4da8-a5c9-c38815d6aa2a" (UID: "b5564411-c04a-4da8-a5c9-c38815d6aa2a"). InnerVolumeSpecName "kube-api-access-8l5rs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.697337 4735 scope.go:117] "RemoveContainer" containerID="27689a69ad6db802af2e5dda026f9e8ee8a19078825d065876abe75532b99058" Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.739996 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5564411-c04a-4da8-a5c9-c38815d6aa2a-config" (OuterVolumeSpecName: "config") pod "b5564411-c04a-4da8-a5c9-c38815d6aa2a" (UID: "b5564411-c04a-4da8-a5c9-c38815d6aa2a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.751584 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5564411-c04a-4da8-a5c9-c38815d6aa2a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b5564411-c04a-4da8-a5c9-c38815d6aa2a" (UID: "b5564411-c04a-4da8-a5c9-c38815d6aa2a"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.776320 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5564411-c04a-4da8-a5c9-c38815d6aa2a-config\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.776353 4735 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5564411-c04a-4da8-a5c9-c38815d6aa2a-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.776365 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8l5rs\" (UniqueName: \"kubernetes.io/projected/b5564411-c04a-4da8-a5c9-c38815d6aa2a-kube-api-access-8l5rs\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.781485 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5564411-c04a-4da8-a5c9-c38815d6aa2a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b5564411-c04a-4da8-a5c9-c38815d6aa2a" (UID: "b5564411-c04a-4da8-a5c9-c38815d6aa2a"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.801438 4735 generic.go:334] "Generic (PLEG): container finished" podID="328b1aa0-7e48-4aa0-b456-e1beefb886c0" containerID="d4d4c45fc2bb4c55e870e6a341ec3bfb743e73b6533a1f68df2d34f7ccb1516b" exitCode=2 Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.801512 4735 generic.go:334] "Generic (PLEG): container finished" podID="328b1aa0-7e48-4aa0-b456-e1beefb886c0" containerID="69f2b92d74478d36b5fa99d63766ae94ce188248b20f46c69cafe2ab02838b84" exitCode=0 Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.801563 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"328b1aa0-7e48-4aa0-b456-e1beefb886c0","Type":"ContainerDied","Data":"d4d4c45fc2bb4c55e870e6a341ec3bfb743e73b6533a1f68df2d34f7ccb1516b"} Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.801597 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"328b1aa0-7e48-4aa0-b456-e1beefb886c0","Type":"ContainerDied","Data":"69f2b92d74478d36b5fa99d63766ae94ce188248b20f46c69cafe2ab02838b84"} Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.802635 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5564411-c04a-4da8-a5c9-c38815d6aa2a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b5564411-c04a-4da8-a5c9-c38815d6aa2a" (UID: "b5564411-c04a-4da8-a5c9-c38815d6aa2a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.806129 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d96c67b5-zkqh2" Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.806145 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d96c67b5-zkqh2" event={"ID":"b5564411-c04a-4da8-a5c9-c38815d6aa2a","Type":"ContainerDied","Data":"2a9229a37653c84063c02f1db9471b68ec39bec63bd6498a655b3e280ed6e9a7"} Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.810616 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5564411-c04a-4da8-a5c9-c38815d6aa2a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b5564411-c04a-4da8-a5c9-c38815d6aa2a" (UID: "b5564411-c04a-4da8-a5c9-c38815d6aa2a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.818848 4735 scope.go:117] "RemoveContainer" containerID="4a282d2d7ef52bfbc5340fdb20f7f9aa86f104a1a4a7d26b15f4b3061e87eed4" Oct 08 13:16:03 crc kubenswrapper[4735]: E1008 13:16:03.831685 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:85c75d60e1bd2f8a9ea0a2bb21a8df64c0a6f7b504cc1a05a355981d4b90e92f\\\"\"" pod="openstack/cinder-db-sync-nplcf" podUID="2dd43efe-433d-4590-bcf9-3304cd6e32fb" Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.867594 4735 scope.go:117] "RemoveContainer" containerID="6e2c3827a8918513321c7ce8e2aafc90f1e5c3061537f80389aa59b275166990" Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.880670 4735 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b5564411-c04a-4da8-a5c9-c38815d6aa2a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.880696 4735 
reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5564411-c04a-4da8-a5c9-c38815d6aa2a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.880706 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5564411-c04a-4da8-a5c9-c38815d6aa2a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:03 crc kubenswrapper[4735]: I1008 13:16:03.900842 4735 scope.go:117] "RemoveContainer" containerID="5a7fce3cae6877cf55183bdd9bfde95b53f673308a2059a10a5ec3c9c39f7bcd" Oct 08 13:16:04 crc kubenswrapper[4735]: I1008 13:16:04.168067 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d96c67b5-zkqh2"] Oct 08 13:16:04 crc kubenswrapper[4735]: I1008 13:16:04.177645 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7d96c67b5-zkqh2"] Oct 08 13:16:04 crc kubenswrapper[4735]: I1008 13:16:04.323027 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 13:16:04 crc kubenswrapper[4735]: I1008 13:16:04.390853 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/328b1aa0-7e48-4aa0-b456-e1beefb886c0-config-data\") pod \"328b1aa0-7e48-4aa0-b456-e1beefb886c0\" (UID: \"328b1aa0-7e48-4aa0-b456-e1beefb886c0\") " Oct 08 13:16:04 crc kubenswrapper[4735]: I1008 13:16:04.390934 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/328b1aa0-7e48-4aa0-b456-e1beefb886c0-log-httpd\") pod \"328b1aa0-7e48-4aa0-b456-e1beefb886c0\" (UID: \"328b1aa0-7e48-4aa0-b456-e1beefb886c0\") " Oct 08 13:16:04 crc kubenswrapper[4735]: I1008 13:16:04.391164 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/328b1aa0-7e48-4aa0-b456-e1beefb886c0-combined-ca-bundle\") pod \"328b1aa0-7e48-4aa0-b456-e1beefb886c0\" (UID: \"328b1aa0-7e48-4aa0-b456-e1beefb886c0\") " Oct 08 13:16:04 crc kubenswrapper[4735]: I1008 13:16:04.391242 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/328b1aa0-7e48-4aa0-b456-e1beefb886c0-run-httpd\") pod \"328b1aa0-7e48-4aa0-b456-e1beefb886c0\" (UID: \"328b1aa0-7e48-4aa0-b456-e1beefb886c0\") " Oct 08 13:16:04 crc kubenswrapper[4735]: I1008 13:16:04.391314 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dx6hf\" (UniqueName: \"kubernetes.io/projected/328b1aa0-7e48-4aa0-b456-e1beefb886c0-kube-api-access-dx6hf\") pod \"328b1aa0-7e48-4aa0-b456-e1beefb886c0\" (UID: \"328b1aa0-7e48-4aa0-b456-e1beefb886c0\") " Oct 08 13:16:04 crc kubenswrapper[4735]: I1008 13:16:04.391356 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/328b1aa0-7e48-4aa0-b456-e1beefb886c0-scripts\") pod \"328b1aa0-7e48-4aa0-b456-e1beefb886c0\" (UID: \"328b1aa0-7e48-4aa0-b456-e1beefb886c0\") " Oct 08 13:16:04 crc kubenswrapper[4735]: I1008 13:16:04.391406 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/328b1aa0-7e48-4aa0-b456-e1beefb886c0-sg-core-conf-yaml\") pod \"328b1aa0-7e48-4aa0-b456-e1beefb886c0\" (UID: \"328b1aa0-7e48-4aa0-b456-e1beefb886c0\") " Oct 08 13:16:04 crc kubenswrapper[4735]: I1008 13:16:04.391625 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/328b1aa0-7e48-4aa0-b456-e1beefb886c0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "328b1aa0-7e48-4aa0-b456-e1beefb886c0" (UID: "328b1aa0-7e48-4aa0-b456-e1beefb886c0"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:16:04 crc kubenswrapper[4735]: I1008 13:16:04.392044 4735 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/328b1aa0-7e48-4aa0-b456-e1beefb886c0-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:04 crc kubenswrapper[4735]: I1008 13:16:04.397653 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/328b1aa0-7e48-4aa0-b456-e1beefb886c0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "328b1aa0-7e48-4aa0-b456-e1beefb886c0" (UID: "328b1aa0-7e48-4aa0-b456-e1beefb886c0"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:16:04 crc kubenswrapper[4735]: I1008 13:16:04.397827 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/328b1aa0-7e48-4aa0-b456-e1beefb886c0-kube-api-access-dx6hf" (OuterVolumeSpecName: "kube-api-access-dx6hf") pod "328b1aa0-7e48-4aa0-b456-e1beefb886c0" (UID: "328b1aa0-7e48-4aa0-b456-e1beefb886c0"). 
InnerVolumeSpecName "kube-api-access-dx6hf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:16:04 crc kubenswrapper[4735]: I1008 13:16:04.401777 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/328b1aa0-7e48-4aa0-b456-e1beefb886c0-scripts" (OuterVolumeSpecName: "scripts") pod "328b1aa0-7e48-4aa0-b456-e1beefb886c0" (UID: "328b1aa0-7e48-4aa0-b456-e1beefb886c0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:16:04 crc kubenswrapper[4735]: I1008 13:16:04.427381 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67b55c5465-m72x9"] Oct 08 13:16:04 crc kubenswrapper[4735]: I1008 13:16:04.431413 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/328b1aa0-7e48-4aa0-b456-e1beefb886c0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "328b1aa0-7e48-4aa0-b456-e1beefb886c0" (UID: "328b1aa0-7e48-4aa0-b456-e1beefb886c0"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:16:04 crc kubenswrapper[4735]: I1008 13:16:04.475179 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/328b1aa0-7e48-4aa0-b456-e1beefb886c0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "328b1aa0-7e48-4aa0-b456-e1beefb886c0" (UID: "328b1aa0-7e48-4aa0-b456-e1beefb886c0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:16:04 crc kubenswrapper[4735]: I1008 13:16:04.476527 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/328b1aa0-7e48-4aa0-b456-e1beefb886c0-config-data" (OuterVolumeSpecName: "config-data") pod "328b1aa0-7e48-4aa0-b456-e1beefb886c0" (UID: "328b1aa0-7e48-4aa0-b456-e1beefb886c0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:16:04 crc kubenswrapper[4735]: I1008 13:16:04.493433 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/328b1aa0-7e48-4aa0-b456-e1beefb886c0-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:04 crc kubenswrapper[4735]: I1008 13:16:04.493470 4735 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/328b1aa0-7e48-4aa0-b456-e1beefb886c0-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:04 crc kubenswrapper[4735]: I1008 13:16:04.493501 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/328b1aa0-7e48-4aa0-b456-e1beefb886c0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:04 crc kubenswrapper[4735]: I1008 13:16:04.493517 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dx6hf\" (UniqueName: \"kubernetes.io/projected/328b1aa0-7e48-4aa0-b456-e1beefb886c0-kube-api-access-dx6hf\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:04 crc kubenswrapper[4735]: I1008 13:16:04.493529 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/328b1aa0-7e48-4aa0-b456-e1beefb886c0-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:04 crc kubenswrapper[4735]: I1008 13:16:04.493541 4735 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/328b1aa0-7e48-4aa0-b456-e1beefb886c0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:04 crc kubenswrapper[4735]: I1008 13:16:04.499372 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-698444467b-bvdzb"] Oct 08 13:16:04 crc kubenswrapper[4735]: I1008 13:16:04.595977 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 13:16:04 crc 
kubenswrapper[4735]: W1008 13:16:04.599164 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf8d6fa0_e75f_40fe_92e9_284df31eae95.slice/crio-6bfb11afcb1a3a7eedc99e0e4bd2b112e915ca0b0433433cb81444f9ad9a1290 WatchSource:0}: Error finding container 6bfb11afcb1a3a7eedc99e0e4bd2b112e915ca0b0433433cb81444f9ad9a1290: Status 404 returned error can't find the container with id 6bfb11afcb1a3a7eedc99e0e4bd2b112e915ca0b0433433cb81444f9ad9a1290 Oct 08 13:16:04 crc kubenswrapper[4735]: I1008 13:16:04.684650 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 13:16:04 crc kubenswrapper[4735]: I1008 13:16:04.839677 4735 generic.go:334] "Generic (PLEG): container finished" podID="328b1aa0-7e48-4aa0-b456-e1beefb886c0" containerID="fd3f1e1a91c4a5ce714042c3fb2b942b6db4ce9bbbba4c7bb7f373507469b9ab" exitCode=0 Oct 08 13:16:04 crc kubenswrapper[4735]: I1008 13:16:04.839738 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"328b1aa0-7e48-4aa0-b456-e1beefb886c0","Type":"ContainerDied","Data":"fd3f1e1a91c4a5ce714042c3fb2b942b6db4ce9bbbba4c7bb7f373507469b9ab"} Oct 08 13:16:04 crc kubenswrapper[4735]: I1008 13:16:04.839768 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"328b1aa0-7e48-4aa0-b456-e1beefb886c0","Type":"ContainerDied","Data":"4c1144ef5da3c6ecf4fbad63c94e2fc020cf67499ac798af52f7307bc75bb778"} Oct 08 13:16:04 crc kubenswrapper[4735]: I1008 13:16:04.839787 4735 scope.go:117] "RemoveContainer" containerID="d4d4c45fc2bb4c55e870e6a341ec3bfb743e73b6533a1f68df2d34f7ccb1516b" Oct 08 13:16:04 crc kubenswrapper[4735]: I1008 13:16:04.839880 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 13:16:04 crc kubenswrapper[4735]: I1008 13:16:04.846567 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"af8d6fa0-e75f-40fe-92e9-284df31eae95","Type":"ContainerStarted","Data":"6bfb11afcb1a3a7eedc99e0e4bd2b112e915ca0b0433433cb81444f9ad9a1290"} Oct 08 13:16:04 crc kubenswrapper[4735]: I1008 13:16:04.851204 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-cdrp4" event={"ID":"e0c034b6-95c9-42d6-8490-0d759b602d82","Type":"ContainerStarted","Data":"d9517d4b5ff0275fae51dae0a8e59801f758f590db88101f461635ef9b151716"} Oct 08 13:16:04 crc kubenswrapper[4735]: I1008 13:16:04.857817 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2ee831ac-1cdb-4271-b7a3-054313f1d20b","Type":"ContainerStarted","Data":"e0d263fc6c060d2f656a4bc3db832b593de8e7b3f636cea6a05c47b50de3814f"} Oct 08 13:16:04 crc kubenswrapper[4735]: I1008 13:16:04.863827 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-698444467b-bvdzb" event={"ID":"daeb6ee4-aafc-4251-9512-43c8cde46c02","Type":"ContainerStarted","Data":"86f7b8c9a6abce4931658ab4a5b12ea9d8b88db248e3e2300a16662f799db233"} Oct 08 13:16:04 crc kubenswrapper[4735]: I1008 13:16:04.863890 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-698444467b-bvdzb" event={"ID":"daeb6ee4-aafc-4251-9512-43c8cde46c02","Type":"ContainerStarted","Data":"35833a782344be4576d1fad4630969d8b73b2598b4a07567ddcb025163cab13e"} Oct 08 13:16:04 crc kubenswrapper[4735]: I1008 13:16:04.878548 4735 generic.go:334] "Generic (PLEG): container finished" podID="aadb7bac-54f6-4e90-a8e8-88e45ca68a04" containerID="5ab9ec518954edab9f6272ed8666c548dfe17bfe86b964287bfe210eb7bb6df3" exitCode=0 Oct 08 13:16:04 crc kubenswrapper[4735]: I1008 13:16:04.878664 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-67b55c5465-m72x9" event={"ID":"aadb7bac-54f6-4e90-a8e8-88e45ca68a04","Type":"ContainerDied","Data":"5ab9ec518954edab9f6272ed8666c548dfe17bfe86b964287bfe210eb7bb6df3"} Oct 08 13:16:04 crc kubenswrapper[4735]: I1008 13:16:04.878696 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b55c5465-m72x9" event={"ID":"aadb7bac-54f6-4e90-a8e8-88e45ca68a04","Type":"ContainerStarted","Data":"55a9b7de4e65576291708bddabcb9fcdbb47c61aa2bdc145d1b3b157fc57d554"} Oct 08 13:16:04 crc kubenswrapper[4735]: I1008 13:16:04.879652 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-cdrp4" podStartSLOduration=7.044056532 podStartE2EDuration="26.879636277s" podCreationTimestamp="2025-10-08 13:15:38 +0000 UTC" firstStartedPulling="2025-10-08 13:15:42.248062925 +0000 UTC m=+1192.056367469" lastFinishedPulling="2025-10-08 13:16:02.08364266 +0000 UTC m=+1211.891947214" observedRunningTime="2025-10-08 13:16:04.875972252 +0000 UTC m=+1214.684276806" watchObservedRunningTime="2025-10-08 13:16:04.879636277 +0000 UTC m=+1214.687940821" Oct 08 13:16:04 crc kubenswrapper[4735]: I1008 13:16:04.899355 4735 scope.go:117] "RemoveContainer" containerID="fd3f1e1a91c4a5ce714042c3fb2b942b6db4ce9bbbba4c7bb7f373507469b9ab" Oct 08 13:16:04 crc kubenswrapper[4735]: I1008 13:16:04.926525 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5564411-c04a-4da8-a5c9-c38815d6aa2a" path="/var/lib/kubelet/pods/b5564411-c04a-4da8-a5c9-c38815d6aa2a/volumes" Oct 08 13:16:04 crc kubenswrapper[4735]: I1008 13:16:04.930777 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 13:16:04 crc kubenswrapper[4735]: I1008 13:16:04.955040 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 08 13:16:04 crc kubenswrapper[4735]: I1008 13:16:04.960912 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] 
Oct 08 13:16:04 crc kubenswrapper[4735]: E1008 13:16:04.961332 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="328b1aa0-7e48-4aa0-b456-e1beefb886c0" containerName="ceilometer-central-agent" Oct 08 13:16:04 crc kubenswrapper[4735]: I1008 13:16:04.961351 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="328b1aa0-7e48-4aa0-b456-e1beefb886c0" containerName="ceilometer-central-agent" Oct 08 13:16:04 crc kubenswrapper[4735]: E1008 13:16:04.961380 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5564411-c04a-4da8-a5c9-c38815d6aa2a" containerName="init" Oct 08 13:16:04 crc kubenswrapper[4735]: I1008 13:16:04.961386 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5564411-c04a-4da8-a5c9-c38815d6aa2a" containerName="init" Oct 08 13:16:04 crc kubenswrapper[4735]: E1008 13:16:04.961401 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5564411-c04a-4da8-a5c9-c38815d6aa2a" containerName="dnsmasq-dns" Oct 08 13:16:04 crc kubenswrapper[4735]: I1008 13:16:04.961407 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5564411-c04a-4da8-a5c9-c38815d6aa2a" containerName="dnsmasq-dns" Oct 08 13:16:04 crc kubenswrapper[4735]: E1008 13:16:04.961417 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="328b1aa0-7e48-4aa0-b456-e1beefb886c0" containerName="sg-core" Oct 08 13:16:04 crc kubenswrapper[4735]: I1008 13:16:04.961423 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="328b1aa0-7e48-4aa0-b456-e1beefb886c0" containerName="sg-core" Oct 08 13:16:04 crc kubenswrapper[4735]: E1008 13:16:04.961430 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="328b1aa0-7e48-4aa0-b456-e1beefb886c0" containerName="ceilometer-notification-agent" Oct 08 13:16:04 crc kubenswrapper[4735]: I1008 13:16:04.961436 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="328b1aa0-7e48-4aa0-b456-e1beefb886c0" containerName="ceilometer-notification-agent" Oct 08 
13:16:04 crc kubenswrapper[4735]: I1008 13:16:04.961634 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5564411-c04a-4da8-a5c9-c38815d6aa2a" containerName="dnsmasq-dns" Oct 08 13:16:04 crc kubenswrapper[4735]: I1008 13:16:04.961649 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="328b1aa0-7e48-4aa0-b456-e1beefb886c0" containerName="sg-core" Oct 08 13:16:04 crc kubenswrapper[4735]: I1008 13:16:04.961656 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="328b1aa0-7e48-4aa0-b456-e1beefb886c0" containerName="ceilometer-notification-agent" Oct 08 13:16:04 crc kubenswrapper[4735]: I1008 13:16:04.961668 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="328b1aa0-7e48-4aa0-b456-e1beefb886c0" containerName="ceilometer-central-agent" Oct 08 13:16:04 crc kubenswrapper[4735]: I1008 13:16:04.963418 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 13:16:04 crc kubenswrapper[4735]: I1008 13:16:04.963793 4735 scope.go:117] "RemoveContainer" containerID="69f2b92d74478d36b5fa99d63766ae94ce188248b20f46c69cafe2ab02838b84" Oct 08 13:16:04 crc kubenswrapper[4735]: I1008 13:16:04.966308 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 08 13:16:04 crc kubenswrapper[4735]: I1008 13:16:04.966936 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 08 13:16:04 crc kubenswrapper[4735]: I1008 13:16:04.976993 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 13:16:05 crc kubenswrapper[4735]: I1008 13:16:05.023502 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee46d28d-61ac-41d0-95da-9a66bf7c4ed7-scripts\") pod \"ceilometer-0\" (UID: \"ee46d28d-61ac-41d0-95da-9a66bf7c4ed7\") " pod="openstack/ceilometer-0" 
Oct 08 13:16:05 crc kubenswrapper[4735]: I1008 13:16:05.023561 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee46d28d-61ac-41d0-95da-9a66bf7c4ed7-config-data\") pod \"ceilometer-0\" (UID: \"ee46d28d-61ac-41d0-95da-9a66bf7c4ed7\") " pod="openstack/ceilometer-0" Oct 08 13:16:05 crc kubenswrapper[4735]: I1008 13:16:05.023612 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ee46d28d-61ac-41d0-95da-9a66bf7c4ed7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ee46d28d-61ac-41d0-95da-9a66bf7c4ed7\") " pod="openstack/ceilometer-0" Oct 08 13:16:05 crc kubenswrapper[4735]: I1008 13:16:05.023660 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee46d28d-61ac-41d0-95da-9a66bf7c4ed7-log-httpd\") pod \"ceilometer-0\" (UID: \"ee46d28d-61ac-41d0-95da-9a66bf7c4ed7\") " pod="openstack/ceilometer-0" Oct 08 13:16:05 crc kubenswrapper[4735]: I1008 13:16:05.023763 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee46d28d-61ac-41d0-95da-9a66bf7c4ed7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ee46d28d-61ac-41d0-95da-9a66bf7c4ed7\") " pod="openstack/ceilometer-0" Oct 08 13:16:05 crc kubenswrapper[4735]: I1008 13:16:05.023800 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee46d28d-61ac-41d0-95da-9a66bf7c4ed7-run-httpd\") pod \"ceilometer-0\" (UID: \"ee46d28d-61ac-41d0-95da-9a66bf7c4ed7\") " pod="openstack/ceilometer-0" Oct 08 13:16:05 crc kubenswrapper[4735]: I1008 13:16:05.023837 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gwp6\" (UniqueName: \"kubernetes.io/projected/ee46d28d-61ac-41d0-95da-9a66bf7c4ed7-kube-api-access-5gwp6\") pod \"ceilometer-0\" (UID: \"ee46d28d-61ac-41d0-95da-9a66bf7c4ed7\") " pod="openstack/ceilometer-0" Oct 08 13:16:05 crc kubenswrapper[4735]: I1008 13:16:05.041494 4735 scope.go:117] "RemoveContainer" containerID="d4d4c45fc2bb4c55e870e6a341ec3bfb743e73b6533a1f68df2d34f7ccb1516b" Oct 08 13:16:05 crc kubenswrapper[4735]: E1008 13:16:05.043845 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4d4c45fc2bb4c55e870e6a341ec3bfb743e73b6533a1f68df2d34f7ccb1516b\": container with ID starting with d4d4c45fc2bb4c55e870e6a341ec3bfb743e73b6533a1f68df2d34f7ccb1516b not found: ID does not exist" containerID="d4d4c45fc2bb4c55e870e6a341ec3bfb743e73b6533a1f68df2d34f7ccb1516b" Oct 08 13:16:05 crc kubenswrapper[4735]: I1008 13:16:05.043891 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4d4c45fc2bb4c55e870e6a341ec3bfb743e73b6533a1f68df2d34f7ccb1516b"} err="failed to get container status \"d4d4c45fc2bb4c55e870e6a341ec3bfb743e73b6533a1f68df2d34f7ccb1516b\": rpc error: code = NotFound desc = could not find container \"d4d4c45fc2bb4c55e870e6a341ec3bfb743e73b6533a1f68df2d34f7ccb1516b\": container with ID starting with d4d4c45fc2bb4c55e870e6a341ec3bfb743e73b6533a1f68df2d34f7ccb1516b not found: ID does not exist" Oct 08 13:16:05 crc kubenswrapper[4735]: I1008 13:16:05.043922 4735 scope.go:117] "RemoveContainer" containerID="fd3f1e1a91c4a5ce714042c3fb2b942b6db4ce9bbbba4c7bb7f373507469b9ab" Oct 08 13:16:05 crc kubenswrapper[4735]: E1008 13:16:05.044524 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd3f1e1a91c4a5ce714042c3fb2b942b6db4ce9bbbba4c7bb7f373507469b9ab\": container with ID starting 
with fd3f1e1a91c4a5ce714042c3fb2b942b6db4ce9bbbba4c7bb7f373507469b9ab not found: ID does not exist" containerID="fd3f1e1a91c4a5ce714042c3fb2b942b6db4ce9bbbba4c7bb7f373507469b9ab" Oct 08 13:16:05 crc kubenswrapper[4735]: I1008 13:16:05.044545 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd3f1e1a91c4a5ce714042c3fb2b942b6db4ce9bbbba4c7bb7f373507469b9ab"} err="failed to get container status \"fd3f1e1a91c4a5ce714042c3fb2b942b6db4ce9bbbba4c7bb7f373507469b9ab\": rpc error: code = NotFound desc = could not find container \"fd3f1e1a91c4a5ce714042c3fb2b942b6db4ce9bbbba4c7bb7f373507469b9ab\": container with ID starting with fd3f1e1a91c4a5ce714042c3fb2b942b6db4ce9bbbba4c7bb7f373507469b9ab not found: ID does not exist" Oct 08 13:16:05 crc kubenswrapper[4735]: I1008 13:16:05.044557 4735 scope.go:117] "RemoveContainer" containerID="69f2b92d74478d36b5fa99d63766ae94ce188248b20f46c69cafe2ab02838b84" Oct 08 13:16:05 crc kubenswrapper[4735]: E1008 13:16:05.045153 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69f2b92d74478d36b5fa99d63766ae94ce188248b20f46c69cafe2ab02838b84\": container with ID starting with 69f2b92d74478d36b5fa99d63766ae94ce188248b20f46c69cafe2ab02838b84 not found: ID does not exist" containerID="69f2b92d74478d36b5fa99d63766ae94ce188248b20f46c69cafe2ab02838b84" Oct 08 13:16:05 crc kubenswrapper[4735]: I1008 13:16:05.045175 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69f2b92d74478d36b5fa99d63766ae94ce188248b20f46c69cafe2ab02838b84"} err="failed to get container status \"69f2b92d74478d36b5fa99d63766ae94ce188248b20f46c69cafe2ab02838b84\": rpc error: code = NotFound desc = could not find container \"69f2b92d74478d36b5fa99d63766ae94ce188248b20f46c69cafe2ab02838b84\": container with ID starting with 69f2b92d74478d36b5fa99d63766ae94ce188248b20f46c69cafe2ab02838b84 not found: ID does 
not exist" Oct 08 13:16:05 crc kubenswrapper[4735]: I1008 13:16:05.125487 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gwp6\" (UniqueName: \"kubernetes.io/projected/ee46d28d-61ac-41d0-95da-9a66bf7c4ed7-kube-api-access-5gwp6\") pod \"ceilometer-0\" (UID: \"ee46d28d-61ac-41d0-95da-9a66bf7c4ed7\") " pod="openstack/ceilometer-0" Oct 08 13:16:05 crc kubenswrapper[4735]: I1008 13:16:05.125557 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee46d28d-61ac-41d0-95da-9a66bf7c4ed7-scripts\") pod \"ceilometer-0\" (UID: \"ee46d28d-61ac-41d0-95da-9a66bf7c4ed7\") " pod="openstack/ceilometer-0" Oct 08 13:16:05 crc kubenswrapper[4735]: I1008 13:16:05.125575 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee46d28d-61ac-41d0-95da-9a66bf7c4ed7-config-data\") pod \"ceilometer-0\" (UID: \"ee46d28d-61ac-41d0-95da-9a66bf7c4ed7\") " pod="openstack/ceilometer-0" Oct 08 13:16:05 crc kubenswrapper[4735]: I1008 13:16:05.125614 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ee46d28d-61ac-41d0-95da-9a66bf7c4ed7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ee46d28d-61ac-41d0-95da-9a66bf7c4ed7\") " pod="openstack/ceilometer-0" Oct 08 13:16:05 crc kubenswrapper[4735]: I1008 13:16:05.125651 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee46d28d-61ac-41d0-95da-9a66bf7c4ed7-log-httpd\") pod \"ceilometer-0\" (UID: \"ee46d28d-61ac-41d0-95da-9a66bf7c4ed7\") " pod="openstack/ceilometer-0" Oct 08 13:16:05 crc kubenswrapper[4735]: I1008 13:16:05.125741 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ee46d28d-61ac-41d0-95da-9a66bf7c4ed7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ee46d28d-61ac-41d0-95da-9a66bf7c4ed7\") " pod="openstack/ceilometer-0" Oct 08 13:16:05 crc kubenswrapper[4735]: I1008 13:16:05.125769 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee46d28d-61ac-41d0-95da-9a66bf7c4ed7-run-httpd\") pod \"ceilometer-0\" (UID: \"ee46d28d-61ac-41d0-95da-9a66bf7c4ed7\") " pod="openstack/ceilometer-0" Oct 08 13:16:05 crc kubenswrapper[4735]: I1008 13:16:05.126409 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee46d28d-61ac-41d0-95da-9a66bf7c4ed7-run-httpd\") pod \"ceilometer-0\" (UID: \"ee46d28d-61ac-41d0-95da-9a66bf7c4ed7\") " pod="openstack/ceilometer-0" Oct 08 13:16:05 crc kubenswrapper[4735]: I1008 13:16:05.126674 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee46d28d-61ac-41d0-95da-9a66bf7c4ed7-log-httpd\") pod \"ceilometer-0\" (UID: \"ee46d28d-61ac-41d0-95da-9a66bf7c4ed7\") " pod="openstack/ceilometer-0" Oct 08 13:16:05 crc kubenswrapper[4735]: I1008 13:16:05.130178 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee46d28d-61ac-41d0-95da-9a66bf7c4ed7-config-data\") pod \"ceilometer-0\" (UID: \"ee46d28d-61ac-41d0-95da-9a66bf7c4ed7\") " pod="openstack/ceilometer-0" Oct 08 13:16:05 crc kubenswrapper[4735]: I1008 13:16:05.132133 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee46d28d-61ac-41d0-95da-9a66bf7c4ed7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ee46d28d-61ac-41d0-95da-9a66bf7c4ed7\") " pod="openstack/ceilometer-0" Oct 08 13:16:05 crc kubenswrapper[4735]: I1008 13:16:05.132874 4735 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ee46d28d-61ac-41d0-95da-9a66bf7c4ed7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ee46d28d-61ac-41d0-95da-9a66bf7c4ed7\") " pod="openstack/ceilometer-0" Oct 08 13:16:05 crc kubenswrapper[4735]: I1008 13:16:05.133927 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee46d28d-61ac-41d0-95da-9a66bf7c4ed7-scripts\") pod \"ceilometer-0\" (UID: \"ee46d28d-61ac-41d0-95da-9a66bf7c4ed7\") " pod="openstack/ceilometer-0" Oct 08 13:16:05 crc kubenswrapper[4735]: I1008 13:16:05.148325 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gwp6\" (UniqueName: \"kubernetes.io/projected/ee46d28d-61ac-41d0-95da-9a66bf7c4ed7-kube-api-access-5gwp6\") pod \"ceilometer-0\" (UID: \"ee46d28d-61ac-41d0-95da-9a66bf7c4ed7\") " pod="openstack/ceilometer-0" Oct 08 13:16:05 crc kubenswrapper[4735]: I1008 13:16:05.298551 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 13:16:05 crc kubenswrapper[4735]: I1008 13:16:05.361089 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5574db6b6f-94l7x"] Oct 08 13:16:05 crc kubenswrapper[4735]: I1008 13:16:05.844582 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 13:16:05 crc kubenswrapper[4735]: W1008 13:16:05.854860 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee46d28d_61ac_41d0_95da_9a66bf7c4ed7.slice/crio-91c3850af88b3796508c3f24ce83d82adaf9553bb6e8ec39c3b8f7a46acdd12a WatchSource:0}: Error finding container 91c3850af88b3796508c3f24ce83d82adaf9553bb6e8ec39c3b8f7a46acdd12a: Status 404 returned error can't find the container with id 91c3850af88b3796508c3f24ce83d82adaf9553bb6e8ec39c3b8f7a46acdd12a Oct 08 13:16:05 crc kubenswrapper[4735]: I1008 13:16:05.898676 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-698444467b-bvdzb" event={"ID":"daeb6ee4-aafc-4251-9512-43c8cde46c02","Type":"ContainerStarted","Data":"0c8e3771261147ed06296c96e0fdece8db2d40b87b023020e5b7ce1f0cc1dca0"} Oct 08 13:16:05 crc kubenswrapper[4735]: I1008 13:16:05.898906 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-698444467b-bvdzb" Oct 08 13:16:05 crc kubenswrapper[4735]: I1008 13:16:05.902061 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b55c5465-m72x9" event={"ID":"aadb7bac-54f6-4e90-a8e8-88e45ca68a04","Type":"ContainerStarted","Data":"55a78dcda62e87186336df3a82eeb49e1ae7c759d1eb5940fda668f3fe9a214d"} Oct 08 13:16:05 crc kubenswrapper[4735]: I1008 13:16:05.902414 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67b55c5465-m72x9" Oct 08 13:16:05 crc kubenswrapper[4735]: I1008 13:16:05.904740 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"ee46d28d-61ac-41d0-95da-9a66bf7c4ed7","Type":"ContainerStarted","Data":"91c3850af88b3796508c3f24ce83d82adaf9553bb6e8ec39c3b8f7a46acdd12a"} Oct 08 13:16:05 crc kubenswrapper[4735]: I1008 13:16:05.911042 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"af8d6fa0-e75f-40fe-92e9-284df31eae95","Type":"ContainerStarted","Data":"4cdd9cd0992efec7fcd98ab9b6975f92c9411e342b5181c59ac77e76ba43b22d"} Oct 08 13:16:05 crc kubenswrapper[4735]: I1008 13:16:05.912846 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5574db6b6f-94l7x" event={"ID":"74ecad81-3dfd-4f36-be79-699b6bd30e90","Type":"ContainerStarted","Data":"3ab3e7a6b765a6de193c1b1dea7e1331e5f6be760afeb37f0a1e52c4bc595d44"} Oct 08 13:16:05 crc kubenswrapper[4735]: I1008 13:16:05.916137 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2ee831ac-1cdb-4271-b7a3-054313f1d20b","Type":"ContainerStarted","Data":"5bf07e71f83ec42daf31838c9792cb61bff2c3545d23d5b759abc92cdc49cd03"} Oct 08 13:16:05 crc kubenswrapper[4735]: I1008 13:16:05.930419 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-698444467b-bvdzb" podStartSLOduration=11.930391914 podStartE2EDuration="11.930391914s" podCreationTimestamp="2025-10-08 13:15:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 13:16:05.920795657 +0000 UTC m=+1215.729100231" watchObservedRunningTime="2025-10-08 13:16:05.930391914 +0000 UTC m=+1215.738696458" Oct 08 13:16:05 crc kubenswrapper[4735]: I1008 13:16:05.961845 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67b55c5465-m72x9" podStartSLOduration=11.961827163 podStartE2EDuration="11.961827163s" podCreationTimestamp="2025-10-08 13:15:54 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 13:16:05.960179221 +0000 UTC m=+1215.768483795" watchObservedRunningTime="2025-10-08 13:16:05.961827163 +0000 UTC m=+1215.770131707" Oct 08 13:16:06 crc kubenswrapper[4735]: I1008 13:16:06.906508 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="328b1aa0-7e48-4aa0-b456-e1beefb886c0" path="/var/lib/kubelet/pods/328b1aa0-7e48-4aa0-b456-e1beefb886c0/volumes" Oct 08 13:16:06 crc kubenswrapper[4735]: I1008 13:16:06.927454 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5574db6b6f-94l7x" event={"ID":"74ecad81-3dfd-4f36-be79-699b6bd30e90","Type":"ContainerStarted","Data":"640b5b25fef4ad0714f6e184d4c2f8a8cea54a0bf1b5cd1e9e4357dfb3d2f543"} Oct 08 13:16:06 crc kubenswrapper[4735]: I1008 13:16:06.927542 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5574db6b6f-94l7x" event={"ID":"74ecad81-3dfd-4f36-be79-699b6bd30e90","Type":"ContainerStarted","Data":"b2ed1987ab45001110bf36a72fc3eb614ca1c3f8a88b1bff605a3d0de79aba7d"} Oct 08 13:16:06 crc kubenswrapper[4735]: I1008 13:16:06.927589 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5574db6b6f-94l7x" Oct 08 13:16:06 crc kubenswrapper[4735]: I1008 13:16:06.929760 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2ee831ac-1cdb-4271-b7a3-054313f1d20b","Type":"ContainerStarted","Data":"195a6ffbbaaa312e7578490fa0e19d1ab01f295407b49eb14d400842de28a9ad"} Oct 08 13:16:06 crc kubenswrapper[4735]: I1008 13:16:06.931335 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee46d28d-61ac-41d0-95da-9a66bf7c4ed7","Type":"ContainerStarted","Data":"22c487685c2fc6697154b1f57342d68c199c65dfda42b1c9b316dbf6547b7d38"} Oct 08 13:16:06 crc kubenswrapper[4735]: I1008 13:16:06.933053 4735 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"af8d6fa0-e75f-40fe-92e9-284df31eae95","Type":"ContainerStarted","Data":"1d0e5a0f7e8bf01a768ba2a2637fc04c6305e74a200d36d9b6c9989a59e6e6ed"} Oct 08 13:16:06 crc kubenswrapper[4735]: I1008 13:16:06.957674 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5574db6b6f-94l7x" podStartSLOduration=10.957651826 podStartE2EDuration="10.957651826s" podCreationTimestamp="2025-10-08 13:15:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 13:16:06.953827057 +0000 UTC m=+1216.762131601" watchObservedRunningTime="2025-10-08 13:16:06.957651826 +0000 UTC m=+1216.765956370" Oct 08 13:16:06 crc kubenswrapper[4735]: I1008 13:16:06.997863 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.997836941 podStartE2EDuration="4.997836941s" podCreationTimestamp="2025-10-08 13:16:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 13:16:06.983676506 +0000 UTC m=+1216.791981040" watchObservedRunningTime="2025-10-08 13:16:06.997836941 +0000 UTC m=+1216.806141485" Oct 08 13:16:07 crc kubenswrapper[4735]: I1008 13:16:07.015067 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.015042544 podStartE2EDuration="5.015042544s" podCreationTimestamp="2025-10-08 13:16:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 13:16:07.01217612 +0000 UTC m=+1216.820480674" watchObservedRunningTime="2025-10-08 13:16:07.015042544 +0000 UTC m=+1216.823347088" Oct 08 13:16:07 crc kubenswrapper[4735]: 
I1008 13:16:07.946772 4735 generic.go:334] "Generic (PLEG): container finished" podID="e0c034b6-95c9-42d6-8490-0d759b602d82" containerID="d9517d4b5ff0275fae51dae0a8e59801f758f590db88101f461635ef9b151716" exitCode=0 Oct 08 13:16:07 crc kubenswrapper[4735]: I1008 13:16:07.946890 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-cdrp4" event={"ID":"e0c034b6-95c9-42d6-8490-0d759b602d82","Type":"ContainerDied","Data":"d9517d4b5ff0275fae51dae0a8e59801f758f590db88101f461635ef9b151716"} Oct 08 13:16:07 crc kubenswrapper[4735]: I1008 13:16:07.949654 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee46d28d-61ac-41d0-95da-9a66bf7c4ed7","Type":"ContainerStarted","Data":"c4630f113a494d80dafdc30d9af150e66a89aa014788f6211dc33731b3b851a2"} Oct 08 13:16:08 crc kubenswrapper[4735]: I1008 13:16:08.291899 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7d96c67b5-zkqh2" podUID="b5564411-c04a-4da8-a5c9-c38815d6aa2a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.138:5353: i/o timeout" Oct 08 13:16:08 crc kubenswrapper[4735]: I1008 13:16:08.963740 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee46d28d-61ac-41d0-95da-9a66bf7c4ed7","Type":"ContainerStarted","Data":"7694171eb43247b8cd15f87f146618756899c9babaf01cc45fc5d4d7c78cea6a"} Oct 08 13:16:09 crc kubenswrapper[4735]: I1008 13:16:09.019963 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-8555f7d884-2km9p" Oct 08 13:16:09 crc kubenswrapper[4735]: I1008 13:16:09.079063 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-8555f7d884-2km9p" Oct 08 13:16:09 crc kubenswrapper[4735]: I1008 13:16:09.411854 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-cdrp4" Oct 08 13:16:09 crc kubenswrapper[4735]: I1008 13:16:09.522673 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b87k8\" (UniqueName: \"kubernetes.io/projected/e0c034b6-95c9-42d6-8490-0d759b602d82-kube-api-access-b87k8\") pod \"e0c034b6-95c9-42d6-8490-0d759b602d82\" (UID: \"e0c034b6-95c9-42d6-8490-0d759b602d82\") " Oct 08 13:16:09 crc kubenswrapper[4735]: I1008 13:16:09.522968 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0c034b6-95c9-42d6-8490-0d759b602d82-combined-ca-bundle\") pod \"e0c034b6-95c9-42d6-8490-0d759b602d82\" (UID: \"e0c034b6-95c9-42d6-8490-0d759b602d82\") " Oct 08 13:16:09 crc kubenswrapper[4735]: I1008 13:16:09.523087 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e0c034b6-95c9-42d6-8490-0d759b602d82-db-sync-config-data\") pod \"e0c034b6-95c9-42d6-8490-0d759b602d82\" (UID: \"e0c034b6-95c9-42d6-8490-0d759b602d82\") " Oct 08 13:16:09 crc kubenswrapper[4735]: I1008 13:16:09.530855 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0c034b6-95c9-42d6-8490-0d759b602d82-kube-api-access-b87k8" (OuterVolumeSpecName: "kube-api-access-b87k8") pod "e0c034b6-95c9-42d6-8490-0d759b602d82" (UID: "e0c034b6-95c9-42d6-8490-0d759b602d82"). InnerVolumeSpecName "kube-api-access-b87k8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:16:09 crc kubenswrapper[4735]: I1008 13:16:09.536598 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0c034b6-95c9-42d6-8490-0d759b602d82-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "e0c034b6-95c9-42d6-8490-0d759b602d82" (UID: "e0c034b6-95c9-42d6-8490-0d759b602d82"). 
InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:16:09 crc kubenswrapper[4735]: I1008 13:16:09.553791 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0c034b6-95c9-42d6-8490-0d759b602d82-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e0c034b6-95c9-42d6-8490-0d759b602d82" (UID: "e0c034b6-95c9-42d6-8490-0d759b602d82"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:16:09 crc kubenswrapper[4735]: I1008 13:16:09.625709 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b87k8\" (UniqueName: \"kubernetes.io/projected/e0c034b6-95c9-42d6-8490-0d759b602d82-kube-api-access-b87k8\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:09 crc kubenswrapper[4735]: I1008 13:16:09.625768 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0c034b6-95c9-42d6-8490-0d759b602d82-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:09 crc kubenswrapper[4735]: I1008 13:16:09.625781 4735 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e0c034b6-95c9-42d6-8490-0d759b602d82-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:09 crc kubenswrapper[4735]: I1008 13:16:09.973725 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-cdrp4" Oct 08 13:16:09 crc kubenswrapper[4735]: I1008 13:16:09.976281 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-cdrp4" event={"ID":"e0c034b6-95c9-42d6-8490-0d759b602d82","Type":"ContainerDied","Data":"3dd7bcb415d4a88aba406ca03ec913ecfaf336f1a109b0f3d2dd3ab35335a9f4"} Oct 08 13:16:09 crc kubenswrapper[4735]: I1008 13:16:09.976313 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3dd7bcb415d4a88aba406ca03ec913ecfaf336f1a109b0f3d2dd3ab35335a9f4" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.205419 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-6cc55c9557-96kwz"] Oct 08 13:16:10 crc kubenswrapper[4735]: E1008 13:16:10.205944 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0c034b6-95c9-42d6-8490-0d759b602d82" containerName="barbican-db-sync" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.205969 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0c034b6-95c9-42d6-8490-0d759b602d82" containerName="barbican-db-sync" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.206192 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0c034b6-95c9-42d6-8490-0d759b602d82" containerName="barbican-db-sync" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.207442 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6cc55c9557-96kwz" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.217132 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.217413 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.217580 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-j2pkq" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.227830 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6cc55c9557-96kwz"] Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.271694 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-69dd69556d-pvx5r"] Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.273288 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-69dd69556d-pvx5r" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.278619 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b111975-076c-4c7f-bcbd-ecef0b62936f-config-data\") pod \"barbican-keystone-listener-69dd69556d-pvx5r\" (UID: \"9b111975-076c-4c7f-bcbd-ecef0b62936f\") " pod="openstack/barbican-keystone-listener-69dd69556d-pvx5r" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.278986 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ada58119-3f96-4962-b214-fb2662c3d935-config-data\") pod \"barbican-worker-6cc55c9557-96kwz\" (UID: \"ada58119-3f96-4962-b214-fb2662c3d935\") " pod="openstack/barbican-worker-6cc55c9557-96kwz" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.279048 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ada58119-3f96-4962-b214-fb2662c3d935-logs\") pod \"barbican-worker-6cc55c9557-96kwz\" (UID: \"ada58119-3f96-4962-b214-fb2662c3d935\") " pod="openstack/barbican-worker-6cc55c9557-96kwz" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.279099 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ada58119-3f96-4962-b214-fb2662c3d935-combined-ca-bundle\") pod \"barbican-worker-6cc55c9557-96kwz\" (UID: \"ada58119-3f96-4962-b214-fb2662c3d935\") " pod="openstack/barbican-worker-6cc55c9557-96kwz" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.279154 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xftvs\" (UniqueName: 
\"kubernetes.io/projected/ada58119-3f96-4962-b214-fb2662c3d935-kube-api-access-xftvs\") pod \"barbican-worker-6cc55c9557-96kwz\" (UID: \"ada58119-3f96-4962-b214-fb2662c3d935\") " pod="openstack/barbican-worker-6cc55c9557-96kwz" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.279182 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfj9r\" (UniqueName: \"kubernetes.io/projected/9b111975-076c-4c7f-bcbd-ecef0b62936f-kube-api-access-kfj9r\") pod \"barbican-keystone-listener-69dd69556d-pvx5r\" (UID: \"9b111975-076c-4c7f-bcbd-ecef0b62936f\") " pod="openstack/barbican-keystone-listener-69dd69556d-pvx5r" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.279205 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b111975-076c-4c7f-bcbd-ecef0b62936f-logs\") pod \"barbican-keystone-listener-69dd69556d-pvx5r\" (UID: \"9b111975-076c-4c7f-bcbd-ecef0b62936f\") " pod="openstack/barbican-keystone-listener-69dd69556d-pvx5r" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.279223 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9b111975-076c-4c7f-bcbd-ecef0b62936f-config-data-custom\") pod \"barbican-keystone-listener-69dd69556d-pvx5r\" (UID: \"9b111975-076c-4c7f-bcbd-ecef0b62936f\") " pod="openstack/barbican-keystone-listener-69dd69556d-pvx5r" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.279262 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b111975-076c-4c7f-bcbd-ecef0b62936f-combined-ca-bundle\") pod \"barbican-keystone-listener-69dd69556d-pvx5r\" (UID: \"9b111975-076c-4c7f-bcbd-ecef0b62936f\") " pod="openstack/barbican-keystone-listener-69dd69556d-pvx5r" Oct 08 13:16:10 
crc kubenswrapper[4735]: I1008 13:16:10.279310 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ada58119-3f96-4962-b214-fb2662c3d935-config-data-custom\") pod \"barbican-worker-6cc55c9557-96kwz\" (UID: \"ada58119-3f96-4962-b214-fb2662c3d935\") " pod="openstack/barbican-worker-6cc55c9557-96kwz" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.281040 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.292512 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-69dd69556d-pvx5r"] Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.353586 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67b55c5465-m72x9"] Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.353872 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-67b55c5465-m72x9" podUID="aadb7bac-54f6-4e90-a8e8-88e45ca68a04" containerName="dnsmasq-dns" containerID="cri-o://55a78dcda62e87186336df3a82eeb49e1ae7c759d1eb5940fda668f3fe9a214d" gracePeriod=10 Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.363665 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-67b55c5465-m72x9" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.384455 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ada58119-3f96-4962-b214-fb2662c3d935-logs\") pod \"barbican-worker-6cc55c9557-96kwz\" (UID: \"ada58119-3f96-4962-b214-fb2662c3d935\") " pod="openstack/barbican-worker-6cc55c9557-96kwz" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.384582 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ada58119-3f96-4962-b214-fb2662c3d935-combined-ca-bundle\") pod \"barbican-worker-6cc55c9557-96kwz\" (UID: \"ada58119-3f96-4962-b214-fb2662c3d935\") " pod="openstack/barbican-worker-6cc55c9557-96kwz" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.384671 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xftvs\" (UniqueName: \"kubernetes.io/projected/ada58119-3f96-4962-b214-fb2662c3d935-kube-api-access-xftvs\") pod \"barbican-worker-6cc55c9557-96kwz\" (UID: \"ada58119-3f96-4962-b214-fb2662c3d935\") " pod="openstack/barbican-worker-6cc55c9557-96kwz" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.384701 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfj9r\" (UniqueName: \"kubernetes.io/projected/9b111975-076c-4c7f-bcbd-ecef0b62936f-kube-api-access-kfj9r\") pod \"barbican-keystone-listener-69dd69556d-pvx5r\" (UID: \"9b111975-076c-4c7f-bcbd-ecef0b62936f\") " pod="openstack/barbican-keystone-listener-69dd69556d-pvx5r" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.384727 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b111975-076c-4c7f-bcbd-ecef0b62936f-logs\") pod \"barbican-keystone-listener-69dd69556d-pvx5r\" (UID: \"9b111975-076c-4c7f-bcbd-ecef0b62936f\") " pod="openstack/barbican-keystone-listener-69dd69556d-pvx5r" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.384756 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9b111975-076c-4c7f-bcbd-ecef0b62936f-config-data-custom\") pod \"barbican-keystone-listener-69dd69556d-pvx5r\" (UID: \"9b111975-076c-4c7f-bcbd-ecef0b62936f\") " pod="openstack/barbican-keystone-listener-69dd69556d-pvx5r" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.384831 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b111975-076c-4c7f-bcbd-ecef0b62936f-combined-ca-bundle\") pod \"barbican-keystone-listener-69dd69556d-pvx5r\" (UID: \"9b111975-076c-4c7f-bcbd-ecef0b62936f\") " pod="openstack/barbican-keystone-listener-69dd69556d-pvx5r" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.384900 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ada58119-3f96-4962-b214-fb2662c3d935-config-data-custom\") pod \"barbican-worker-6cc55c9557-96kwz\" (UID: \"ada58119-3f96-4962-b214-fb2662c3d935\") " pod="openstack/barbican-worker-6cc55c9557-96kwz" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.385000 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b111975-076c-4c7f-bcbd-ecef0b62936f-config-data\") pod \"barbican-keystone-listener-69dd69556d-pvx5r\" (UID: \"9b111975-076c-4c7f-bcbd-ecef0b62936f\") " pod="openstack/barbican-keystone-listener-69dd69556d-pvx5r" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.385074 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ada58119-3f96-4962-b214-fb2662c3d935-config-data\") pod \"barbican-worker-6cc55c9557-96kwz\" (UID: \"ada58119-3f96-4962-b214-fb2662c3d935\") " pod="openstack/barbican-worker-6cc55c9557-96kwz" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.388424 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c78787df7-zjd4g"] Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.390256 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ada58119-3f96-4962-b214-fb2662c3d935-logs\") pod \"barbican-worker-6cc55c9557-96kwz\" (UID: 
\"ada58119-3f96-4962-b214-fb2662c3d935\") " pod="openstack/barbican-worker-6cc55c9557-96kwz" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.390307 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b111975-076c-4c7f-bcbd-ecef0b62936f-logs\") pod \"barbican-keystone-listener-69dd69556d-pvx5r\" (UID: \"9b111975-076c-4c7f-bcbd-ecef0b62936f\") " pod="openstack/barbican-keystone-listener-69dd69556d-pvx5r" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.390417 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c78787df7-zjd4g" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.397445 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9b111975-076c-4c7f-bcbd-ecef0b62936f-config-data-custom\") pod \"barbican-keystone-listener-69dd69556d-pvx5r\" (UID: \"9b111975-076c-4c7f-bcbd-ecef0b62936f\") " pod="openstack/barbican-keystone-listener-69dd69556d-pvx5r" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.399021 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b111975-076c-4c7f-bcbd-ecef0b62936f-combined-ca-bundle\") pod \"barbican-keystone-listener-69dd69556d-pvx5r\" (UID: \"9b111975-076c-4c7f-bcbd-ecef0b62936f\") " pod="openstack/barbican-keystone-listener-69dd69556d-pvx5r" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.399165 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ada58119-3f96-4962-b214-fb2662c3d935-config-data-custom\") pod \"barbican-worker-6cc55c9557-96kwz\" (UID: \"ada58119-3f96-4962-b214-fb2662c3d935\") " pod="openstack/barbican-worker-6cc55c9557-96kwz" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.405502 4735 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ada58119-3f96-4962-b214-fb2662c3d935-config-data\") pod \"barbican-worker-6cc55c9557-96kwz\" (UID: \"ada58119-3f96-4962-b214-fb2662c3d935\") " pod="openstack/barbican-worker-6cc55c9557-96kwz" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.406401 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7c7cdb8d88-4pp88"] Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.408889 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ada58119-3f96-4962-b214-fb2662c3d935-combined-ca-bundle\") pod \"barbican-worker-6cc55c9557-96kwz\" (UID: \"ada58119-3f96-4962-b214-fb2662c3d935\") " pod="openstack/barbican-worker-6cc55c9557-96kwz" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.415753 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7c7cdb8d88-4pp88" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.418525 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7c7cdb8d88-4pp88"] Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.425664 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b111975-076c-4c7f-bcbd-ecef0b62936f-config-data\") pod \"barbican-keystone-listener-69dd69556d-pvx5r\" (UID: \"9b111975-076c-4c7f-bcbd-ecef0b62936f\") " pod="openstack/barbican-keystone-listener-69dd69556d-pvx5r" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.425795 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.430056 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xftvs\" (UniqueName: 
\"kubernetes.io/projected/ada58119-3f96-4962-b214-fb2662c3d935-kube-api-access-xftvs\") pod \"barbican-worker-6cc55c9557-96kwz\" (UID: \"ada58119-3f96-4962-b214-fb2662c3d935\") " pod="openstack/barbican-worker-6cc55c9557-96kwz" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.431427 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfj9r\" (UniqueName: \"kubernetes.io/projected/9b111975-076c-4c7f-bcbd-ecef0b62936f-kube-api-access-kfj9r\") pod \"barbican-keystone-listener-69dd69556d-pvx5r\" (UID: \"9b111975-076c-4c7f-bcbd-ecef0b62936f\") " pod="openstack/barbican-keystone-listener-69dd69556d-pvx5r" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.432787 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c78787df7-zjd4g"] Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.499022 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/28f754cf-0740-4da5-bec9-3e238e248795-ovsdbserver-sb\") pod \"dnsmasq-dns-5c78787df7-zjd4g\" (UID: \"28f754cf-0740-4da5-bec9-3e238e248795\") " pod="openstack/dnsmasq-dns-5c78787df7-zjd4g" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.499072 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4p22\" (UniqueName: \"kubernetes.io/projected/28f754cf-0740-4da5-bec9-3e238e248795-kube-api-access-g4p22\") pod \"dnsmasq-dns-5c78787df7-zjd4g\" (UID: \"28f754cf-0740-4da5-bec9-3e238e248795\") " pod="openstack/dnsmasq-dns-5c78787df7-zjd4g" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.499111 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/28f754cf-0740-4da5-bec9-3e238e248795-dns-svc\") pod \"dnsmasq-dns-5c78787df7-zjd4g\" (UID: \"28f754cf-0740-4da5-bec9-3e238e248795\") " 
pod="openstack/dnsmasq-dns-5c78787df7-zjd4g" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.499156 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/28f754cf-0740-4da5-bec9-3e238e248795-ovsdbserver-nb\") pod \"dnsmasq-dns-5c78787df7-zjd4g\" (UID: \"28f754cf-0740-4da5-bec9-3e238e248795\") " pod="openstack/dnsmasq-dns-5c78787df7-zjd4g" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.499186 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28f754cf-0740-4da5-bec9-3e238e248795-config\") pod \"dnsmasq-dns-5c78787df7-zjd4g\" (UID: \"28f754cf-0740-4da5-bec9-3e238e248795\") " pod="openstack/dnsmasq-dns-5c78787df7-zjd4g" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.499225 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/28f754cf-0740-4da5-bec9-3e238e248795-dns-swift-storage-0\") pod \"dnsmasq-dns-5c78787df7-zjd4g\" (UID: \"28f754cf-0740-4da5-bec9-3e238e248795\") " pod="openstack/dnsmasq-dns-5c78787df7-zjd4g" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.602050 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/28f754cf-0740-4da5-bec9-3e238e248795-dns-svc\") pod \"dnsmasq-dns-5c78787df7-zjd4g\" (UID: \"28f754cf-0740-4da5-bec9-3e238e248795\") " pod="openstack/dnsmasq-dns-5c78787df7-zjd4g" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.602204 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khxfc\" (UniqueName: \"kubernetes.io/projected/4aa1710f-6005-4ce1-901f-d71442736e68-kube-api-access-khxfc\") pod \"barbican-api-7c7cdb8d88-4pp88\" (UID: 
\"4aa1710f-6005-4ce1-901f-d71442736e68\") " pod="openstack/barbican-api-7c7cdb8d88-4pp88" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.602259 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/28f754cf-0740-4da5-bec9-3e238e248795-ovsdbserver-nb\") pod \"dnsmasq-dns-5c78787df7-zjd4g\" (UID: \"28f754cf-0740-4da5-bec9-3e238e248795\") " pod="openstack/dnsmasq-dns-5c78787df7-zjd4g" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.602310 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28f754cf-0740-4da5-bec9-3e238e248795-config\") pod \"dnsmasq-dns-5c78787df7-zjd4g\" (UID: \"28f754cf-0740-4da5-bec9-3e238e248795\") " pod="openstack/dnsmasq-dns-5c78787df7-zjd4g" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.602424 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4aa1710f-6005-4ce1-901f-d71442736e68-config-data\") pod \"barbican-api-7c7cdb8d88-4pp88\" (UID: \"4aa1710f-6005-4ce1-901f-d71442736e68\") " pod="openstack/barbican-api-7c7cdb8d88-4pp88" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.602498 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/28f754cf-0740-4da5-bec9-3e238e248795-dns-swift-storage-0\") pod \"dnsmasq-dns-5c78787df7-zjd4g\" (UID: \"28f754cf-0740-4da5-bec9-3e238e248795\") " pod="openstack/dnsmasq-dns-5c78787df7-zjd4g" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.602604 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4aa1710f-6005-4ce1-901f-d71442736e68-combined-ca-bundle\") pod \"barbican-api-7c7cdb8d88-4pp88\" (UID: 
\"4aa1710f-6005-4ce1-901f-d71442736e68\") " pod="openstack/barbican-api-7c7cdb8d88-4pp88" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.602675 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4aa1710f-6005-4ce1-901f-d71442736e68-logs\") pod \"barbican-api-7c7cdb8d88-4pp88\" (UID: \"4aa1710f-6005-4ce1-901f-d71442736e68\") " pod="openstack/barbican-api-7c7cdb8d88-4pp88" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.602710 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/28f754cf-0740-4da5-bec9-3e238e248795-ovsdbserver-sb\") pod \"dnsmasq-dns-5c78787df7-zjd4g\" (UID: \"28f754cf-0740-4da5-bec9-3e238e248795\") " pod="openstack/dnsmasq-dns-5c78787df7-zjd4g" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.602760 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4p22\" (UniqueName: \"kubernetes.io/projected/28f754cf-0740-4da5-bec9-3e238e248795-kube-api-access-g4p22\") pod \"dnsmasq-dns-5c78787df7-zjd4g\" (UID: \"28f754cf-0740-4da5-bec9-3e238e248795\") " pod="openstack/dnsmasq-dns-5c78787df7-zjd4g" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.602791 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4aa1710f-6005-4ce1-901f-d71442736e68-config-data-custom\") pod \"barbican-api-7c7cdb8d88-4pp88\" (UID: \"4aa1710f-6005-4ce1-901f-d71442736e68\") " pod="openstack/barbican-api-7c7cdb8d88-4pp88" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.607182 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6cc55c9557-96kwz" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.611268 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/28f754cf-0740-4da5-bec9-3e238e248795-dns-svc\") pod \"dnsmasq-dns-5c78787df7-zjd4g\" (UID: \"28f754cf-0740-4da5-bec9-3e238e248795\") " pod="openstack/dnsmasq-dns-5c78787df7-zjd4g" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.612395 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28f754cf-0740-4da5-bec9-3e238e248795-config\") pod \"dnsmasq-dns-5c78787df7-zjd4g\" (UID: \"28f754cf-0740-4da5-bec9-3e238e248795\") " pod="openstack/dnsmasq-dns-5c78787df7-zjd4g" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.615162 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/28f754cf-0740-4da5-bec9-3e238e248795-dns-swift-storage-0\") pod \"dnsmasq-dns-5c78787df7-zjd4g\" (UID: \"28f754cf-0740-4da5-bec9-3e238e248795\") " pod="openstack/dnsmasq-dns-5c78787df7-zjd4g" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.619310 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/28f754cf-0740-4da5-bec9-3e238e248795-ovsdbserver-nb\") pod \"dnsmasq-dns-5c78787df7-zjd4g\" (UID: \"28f754cf-0740-4da5-bec9-3e238e248795\") " pod="openstack/dnsmasq-dns-5c78787df7-zjd4g" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.621484 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/28f754cf-0740-4da5-bec9-3e238e248795-ovsdbserver-sb\") pod \"dnsmasq-dns-5c78787df7-zjd4g\" (UID: \"28f754cf-0740-4da5-bec9-3e238e248795\") " pod="openstack/dnsmasq-dns-5c78787df7-zjd4g" Oct 08 13:16:10 crc kubenswrapper[4735]: 
I1008 13:16:10.636935 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-69dd69556d-pvx5r" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.640449 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4p22\" (UniqueName: \"kubernetes.io/projected/28f754cf-0740-4da5-bec9-3e238e248795-kube-api-access-g4p22\") pod \"dnsmasq-dns-5c78787df7-zjd4g\" (UID: \"28f754cf-0740-4da5-bec9-3e238e248795\") " pod="openstack/dnsmasq-dns-5c78787df7-zjd4g" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.704631 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khxfc\" (UniqueName: \"kubernetes.io/projected/4aa1710f-6005-4ce1-901f-d71442736e68-kube-api-access-khxfc\") pod \"barbican-api-7c7cdb8d88-4pp88\" (UID: \"4aa1710f-6005-4ce1-901f-d71442736e68\") " pod="openstack/barbican-api-7c7cdb8d88-4pp88" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.705053 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4aa1710f-6005-4ce1-901f-d71442736e68-config-data\") pod \"barbican-api-7c7cdb8d88-4pp88\" (UID: \"4aa1710f-6005-4ce1-901f-d71442736e68\") " pod="openstack/barbican-api-7c7cdb8d88-4pp88" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.705131 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4aa1710f-6005-4ce1-901f-d71442736e68-combined-ca-bundle\") pod \"barbican-api-7c7cdb8d88-4pp88\" (UID: \"4aa1710f-6005-4ce1-901f-d71442736e68\") " pod="openstack/barbican-api-7c7cdb8d88-4pp88" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.705165 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4aa1710f-6005-4ce1-901f-d71442736e68-logs\") pod 
\"barbican-api-7c7cdb8d88-4pp88\" (UID: \"4aa1710f-6005-4ce1-901f-d71442736e68\") " pod="openstack/barbican-api-7c7cdb8d88-4pp88" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.705196 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4aa1710f-6005-4ce1-901f-d71442736e68-config-data-custom\") pod \"barbican-api-7c7cdb8d88-4pp88\" (UID: \"4aa1710f-6005-4ce1-901f-d71442736e68\") " pod="openstack/barbican-api-7c7cdb8d88-4pp88" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.709995 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4aa1710f-6005-4ce1-901f-d71442736e68-logs\") pod \"barbican-api-7c7cdb8d88-4pp88\" (UID: \"4aa1710f-6005-4ce1-901f-d71442736e68\") " pod="openstack/barbican-api-7c7cdb8d88-4pp88" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.716371 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4aa1710f-6005-4ce1-901f-d71442736e68-combined-ca-bundle\") pod \"barbican-api-7c7cdb8d88-4pp88\" (UID: \"4aa1710f-6005-4ce1-901f-d71442736e68\") " pod="openstack/barbican-api-7c7cdb8d88-4pp88" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.720768 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4aa1710f-6005-4ce1-901f-d71442736e68-config-data-custom\") pod \"barbican-api-7c7cdb8d88-4pp88\" (UID: \"4aa1710f-6005-4ce1-901f-d71442736e68\") " pod="openstack/barbican-api-7c7cdb8d88-4pp88" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.721759 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4aa1710f-6005-4ce1-901f-d71442736e68-config-data\") pod \"barbican-api-7c7cdb8d88-4pp88\" (UID: \"4aa1710f-6005-4ce1-901f-d71442736e68\") " 
pod="openstack/barbican-api-7c7cdb8d88-4pp88" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.728816 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khxfc\" (UniqueName: \"kubernetes.io/projected/4aa1710f-6005-4ce1-901f-d71442736e68-kube-api-access-khxfc\") pod \"barbican-api-7c7cdb8d88-4pp88\" (UID: \"4aa1710f-6005-4ce1-901f-d71442736e68\") " pod="openstack/barbican-api-7c7cdb8d88-4pp88" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.757056 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c78787df7-zjd4g" Oct 08 13:16:10 crc kubenswrapper[4735]: I1008 13:16:10.767713 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7c7cdb8d88-4pp88" Oct 08 13:16:11 crc kubenswrapper[4735]: I1008 13:16:11.071819 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee46d28d-61ac-41d0-95da-9a66bf7c4ed7","Type":"ContainerStarted","Data":"965f48a9d18436349fa306565e27d670df85717967b4f8a617abda501c600caf"} Oct 08 13:16:11 crc kubenswrapper[4735]: I1008 13:16:11.074538 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 08 13:16:11 crc kubenswrapper[4735]: I1008 13:16:11.083655 4735 generic.go:334] "Generic (PLEG): container finished" podID="aadb7bac-54f6-4e90-a8e8-88e45ca68a04" containerID="55a78dcda62e87186336df3a82eeb49e1ae7c759d1eb5940fda668f3fe9a214d" exitCode=0 Oct 08 13:16:11 crc kubenswrapper[4735]: I1008 13:16:11.083932 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b55c5465-m72x9" event={"ID":"aadb7bac-54f6-4e90-a8e8-88e45ca68a04","Type":"ContainerDied","Data":"55a78dcda62e87186336df3a82eeb49e1ae7c759d1eb5940fda668f3fe9a214d"} Oct 08 13:16:11 crc kubenswrapper[4735]: I1008 13:16:11.106467 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ceilometer-0" podStartSLOduration=2.865400709 podStartE2EDuration="7.106447036s" podCreationTimestamp="2025-10-08 13:16:04 +0000 UTC" firstStartedPulling="2025-10-08 13:16:05.869228199 +0000 UTC m=+1215.677532743" lastFinishedPulling="2025-10-08 13:16:10.110274526 +0000 UTC m=+1219.918579070" observedRunningTime="2025-10-08 13:16:11.097606879 +0000 UTC m=+1220.905911423" watchObservedRunningTime="2025-10-08 13:16:11.106447036 +0000 UTC m=+1220.914751580" Oct 08 13:16:11 crc kubenswrapper[4735]: I1008 13:16:11.107328 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67b55c5465-m72x9" Oct 08 13:16:11 crc kubenswrapper[4735]: I1008 13:16:11.220994 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aadb7bac-54f6-4e90-a8e8-88e45ca68a04-ovsdbserver-nb\") pod \"aadb7bac-54f6-4e90-a8e8-88e45ca68a04\" (UID: \"aadb7bac-54f6-4e90-a8e8-88e45ca68a04\") " Oct 08 13:16:11 crc kubenswrapper[4735]: I1008 13:16:11.221219 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aadb7bac-54f6-4e90-a8e8-88e45ca68a04-ovsdbserver-sb\") pod \"aadb7bac-54f6-4e90-a8e8-88e45ca68a04\" (UID: \"aadb7bac-54f6-4e90-a8e8-88e45ca68a04\") " Oct 08 13:16:11 crc kubenswrapper[4735]: I1008 13:16:11.221259 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aadb7bac-54f6-4e90-a8e8-88e45ca68a04-config\") pod \"aadb7bac-54f6-4e90-a8e8-88e45ca68a04\" (UID: \"aadb7bac-54f6-4e90-a8e8-88e45ca68a04\") " Oct 08 13:16:11 crc kubenswrapper[4735]: I1008 13:16:11.222145 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aadb7bac-54f6-4e90-a8e8-88e45ca68a04-dns-svc\") pod \"aadb7bac-54f6-4e90-a8e8-88e45ca68a04\" 
(UID: \"aadb7bac-54f6-4e90-a8e8-88e45ca68a04\") " Oct 08 13:16:11 crc kubenswrapper[4735]: I1008 13:16:11.222224 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aadb7bac-54f6-4e90-a8e8-88e45ca68a04-dns-swift-storage-0\") pod \"aadb7bac-54f6-4e90-a8e8-88e45ca68a04\" (UID: \"aadb7bac-54f6-4e90-a8e8-88e45ca68a04\") " Oct 08 13:16:11 crc kubenswrapper[4735]: I1008 13:16:11.222281 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwfc2\" (UniqueName: \"kubernetes.io/projected/aadb7bac-54f6-4e90-a8e8-88e45ca68a04-kube-api-access-nwfc2\") pod \"aadb7bac-54f6-4e90-a8e8-88e45ca68a04\" (UID: \"aadb7bac-54f6-4e90-a8e8-88e45ca68a04\") " Oct 08 13:16:11 crc kubenswrapper[4735]: I1008 13:16:11.275024 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aadb7bac-54f6-4e90-a8e8-88e45ca68a04-kube-api-access-nwfc2" (OuterVolumeSpecName: "kube-api-access-nwfc2") pod "aadb7bac-54f6-4e90-a8e8-88e45ca68a04" (UID: "aadb7bac-54f6-4e90-a8e8-88e45ca68a04"). InnerVolumeSpecName "kube-api-access-nwfc2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:16:11 crc kubenswrapper[4735]: I1008 13:16:11.321196 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6cc55c9557-96kwz"] Oct 08 13:16:11 crc kubenswrapper[4735]: I1008 13:16:11.323525 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aadb7bac-54f6-4e90-a8e8-88e45ca68a04-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "aadb7bac-54f6-4e90-a8e8-88e45ca68a04" (UID: "aadb7bac-54f6-4e90-a8e8-88e45ca68a04"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:16:11 crc kubenswrapper[4735]: I1008 13:16:11.337465 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwfc2\" (UniqueName: \"kubernetes.io/projected/aadb7bac-54f6-4e90-a8e8-88e45ca68a04-kube-api-access-nwfc2\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:11 crc kubenswrapper[4735]: I1008 13:16:11.337525 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aadb7bac-54f6-4e90-a8e8-88e45ca68a04-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:11 crc kubenswrapper[4735]: I1008 13:16:11.339340 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aadb7bac-54f6-4e90-a8e8-88e45ca68a04-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "aadb7bac-54f6-4e90-a8e8-88e45ca68a04" (UID: "aadb7bac-54f6-4e90-a8e8-88e45ca68a04"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:16:11 crc kubenswrapper[4735]: I1008 13:16:11.348222 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aadb7bac-54f6-4e90-a8e8-88e45ca68a04-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "aadb7bac-54f6-4e90-a8e8-88e45ca68a04" (UID: "aadb7bac-54f6-4e90-a8e8-88e45ca68a04"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:16:11 crc kubenswrapper[4735]: I1008 13:16:11.357111 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aadb7bac-54f6-4e90-a8e8-88e45ca68a04-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "aadb7bac-54f6-4e90-a8e8-88e45ca68a04" (UID: "aadb7bac-54f6-4e90-a8e8-88e45ca68a04"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:16:11 crc kubenswrapper[4735]: I1008 13:16:11.370118 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aadb7bac-54f6-4e90-a8e8-88e45ca68a04-config" (OuterVolumeSpecName: "config") pod "aadb7bac-54f6-4e90-a8e8-88e45ca68a04" (UID: "aadb7bac-54f6-4e90-a8e8-88e45ca68a04"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:16:11 crc kubenswrapper[4735]: I1008 13:16:11.439451 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aadb7bac-54f6-4e90-a8e8-88e45ca68a04-config\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:11 crc kubenswrapper[4735]: I1008 13:16:11.439494 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aadb7bac-54f6-4e90-a8e8-88e45ca68a04-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:11 crc kubenswrapper[4735]: I1008 13:16:11.439538 4735 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aadb7bac-54f6-4e90-a8e8-88e45ca68a04-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:11 crc kubenswrapper[4735]: I1008 13:16:11.439551 4735 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aadb7bac-54f6-4e90-a8e8-88e45ca68a04-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:11 crc kubenswrapper[4735]: W1008 13:16:11.443070 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b111975_076c_4c7f_bcbd_ecef0b62936f.slice/crio-f112d9a58052e441123ebba81af5d1f30ab0a8da3f60511d56153347d6a0720c WatchSource:0}: Error finding container f112d9a58052e441123ebba81af5d1f30ab0a8da3f60511d56153347d6a0720c: Status 404 returned error can't find the container with 
id f112d9a58052e441123ebba81af5d1f30ab0a8da3f60511d56153347d6a0720c Oct 08 13:16:11 crc kubenswrapper[4735]: I1008 13:16:11.448569 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-69dd69556d-pvx5r"] Oct 08 13:16:11 crc kubenswrapper[4735]: I1008 13:16:11.459289 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c78787df7-zjd4g"] Oct 08 13:16:11 crc kubenswrapper[4735]: I1008 13:16:11.571904 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7c7cdb8d88-4pp88"] Oct 08 13:16:12 crc kubenswrapper[4735]: I1008 13:16:12.095699 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6cc55c9557-96kwz" event={"ID":"ada58119-3f96-4962-b214-fb2662c3d935","Type":"ContainerStarted","Data":"71f900631d93d0716e11248ae62d2fde6fcc02fcdbcce25da12c4a4c029488c5"} Oct 08 13:16:12 crc kubenswrapper[4735]: I1008 13:16:12.101373 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7c7cdb8d88-4pp88" event={"ID":"4aa1710f-6005-4ce1-901f-d71442736e68","Type":"ContainerStarted","Data":"847162f47f9b5a78983f8926a56887b9861cbdfc9236643123b3318b3f0e9f0a"} Oct 08 13:16:12 crc kubenswrapper[4735]: I1008 13:16:12.101411 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7c7cdb8d88-4pp88" event={"ID":"4aa1710f-6005-4ce1-901f-d71442736e68","Type":"ContainerStarted","Data":"95c4126a635fa6e5ab4529c2917a5b400b1d70f3715113ca49b64185a64f61d2"} Oct 08 13:16:12 crc kubenswrapper[4735]: I1008 13:16:12.101543 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7c7cdb8d88-4pp88" Oct 08 13:16:12 crc kubenswrapper[4735]: I1008 13:16:12.101826 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7c7cdb8d88-4pp88" Oct 08 13:16:12 crc kubenswrapper[4735]: I1008 13:16:12.104933 4735 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/barbican-keystone-listener-69dd69556d-pvx5r" event={"ID":"9b111975-076c-4c7f-bcbd-ecef0b62936f","Type":"ContainerStarted","Data":"f112d9a58052e441123ebba81af5d1f30ab0a8da3f60511d56153347d6a0720c"} Oct 08 13:16:12 crc kubenswrapper[4735]: I1008 13:16:12.123772 4735 generic.go:334] "Generic (PLEG): container finished" podID="28f754cf-0740-4da5-bec9-3e238e248795" containerID="f98cc139ed1e0eae464b412deb3f650748a9d91223711fd0c2ffa1a03a5ba1f8" exitCode=0 Oct 08 13:16:12 crc kubenswrapper[4735]: I1008 13:16:12.123839 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c78787df7-zjd4g" event={"ID":"28f754cf-0740-4da5-bec9-3e238e248795","Type":"ContainerDied","Data":"f98cc139ed1e0eae464b412deb3f650748a9d91223711fd0c2ffa1a03a5ba1f8"} Oct 08 13:16:12 crc kubenswrapper[4735]: I1008 13:16:12.123903 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c78787df7-zjd4g" event={"ID":"28f754cf-0740-4da5-bec9-3e238e248795","Type":"ContainerStarted","Data":"b59b543267d73d9e9ff69aecdae0c761eaefee8c907420700bd5324b04dc9e55"} Oct 08 13:16:12 crc kubenswrapper[4735]: I1008 13:16:12.126719 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7c7cdb8d88-4pp88" podStartSLOduration=2.126697238 podStartE2EDuration="2.126697238s" podCreationTimestamp="2025-10-08 13:16:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 13:16:12.122653814 +0000 UTC m=+1221.930958358" watchObservedRunningTime="2025-10-08 13:16:12.126697238 +0000 UTC m=+1221.935001792" Oct 08 13:16:12 crc kubenswrapper[4735]: I1008 13:16:12.128656 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67b55c5465-m72x9" Oct 08 13:16:12 crc kubenswrapper[4735]: I1008 13:16:12.128773 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b55c5465-m72x9" event={"ID":"aadb7bac-54f6-4e90-a8e8-88e45ca68a04","Type":"ContainerDied","Data":"55a9b7de4e65576291708bddabcb9fcdbb47c61aa2bdc145d1b3b157fc57d554"} Oct 08 13:16:12 crc kubenswrapper[4735]: I1008 13:16:12.128832 4735 scope.go:117] "RemoveContainer" containerID="55a78dcda62e87186336df3a82eeb49e1ae7c759d1eb5940fda668f3fe9a214d" Oct 08 13:16:12 crc kubenswrapper[4735]: I1008 13:16:12.170984 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67b55c5465-m72x9"] Oct 08 13:16:12 crc kubenswrapper[4735]: I1008 13:16:12.181869 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67b55c5465-m72x9"] Oct 08 13:16:12 crc kubenswrapper[4735]: I1008 13:16:12.906704 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aadb7bac-54f6-4e90-a8e8-88e45ca68a04" path="/var/lib/kubelet/pods/aadb7bac-54f6-4e90-a8e8-88e45ca68a04/volumes" Oct 08 13:16:13 crc kubenswrapper[4735]: I1008 13:16:13.139127 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7c7cdb8d88-4pp88" event={"ID":"4aa1710f-6005-4ce1-901f-d71442736e68","Type":"ContainerStarted","Data":"33f936d2332ce1aa95d93d854bd325a0ca6212966a7597aabdad32083ac2b2dd"} Oct 08 13:16:13 crc kubenswrapper[4735]: I1008 13:16:13.227065 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 08 13:16:13 crc kubenswrapper[4735]: I1008 13:16:13.227132 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 08 13:16:13 crc kubenswrapper[4735]: I1008 13:16:13.262228 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/glance-default-internal-api-0" Oct 08 13:16:13 crc kubenswrapper[4735]: I1008 13:16:13.269872 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 08 13:16:13 crc kubenswrapper[4735]: I1008 13:16:13.285229 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 08 13:16:13 crc kubenswrapper[4735]: I1008 13:16:13.285289 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 08 13:16:13 crc kubenswrapper[4735]: I1008 13:16:13.341194 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 08 13:16:13 crc kubenswrapper[4735]: I1008 13:16:13.348261 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 08 13:16:13 crc kubenswrapper[4735]: I1008 13:16:13.820414 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6dd5468f46-ccf7d"] Oct 08 13:16:13 crc kubenswrapper[4735]: E1008 13:16:13.821187 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aadb7bac-54f6-4e90-a8e8-88e45ca68a04" containerName="dnsmasq-dns" Oct 08 13:16:13 crc kubenswrapper[4735]: I1008 13:16:13.821209 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="aadb7bac-54f6-4e90-a8e8-88e45ca68a04" containerName="dnsmasq-dns" Oct 08 13:16:13 crc kubenswrapper[4735]: E1008 13:16:13.821223 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aadb7bac-54f6-4e90-a8e8-88e45ca68a04" containerName="init" Oct 08 13:16:13 crc kubenswrapper[4735]: I1008 13:16:13.821230 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="aadb7bac-54f6-4e90-a8e8-88e45ca68a04" containerName="init" Oct 08 13:16:13 crc kubenswrapper[4735]: I1008 13:16:13.821463 4735 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="aadb7bac-54f6-4e90-a8e8-88e45ca68a04" containerName="dnsmasq-dns" Oct 08 13:16:13 crc kubenswrapper[4735]: I1008 13:16:13.822500 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6dd5468f46-ccf7d" Oct 08 13:16:13 crc kubenswrapper[4735]: I1008 13:16:13.833556 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Oct 08 13:16:13 crc kubenswrapper[4735]: I1008 13:16:13.833867 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Oct 08 13:16:13 crc kubenswrapper[4735]: I1008 13:16:13.834774 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6dd5468f46-ccf7d"] Oct 08 13:16:13 crc kubenswrapper[4735]: I1008 13:16:13.991510 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca78485a-0626-420e-aa59-a9a4680df83c-config-data\") pod \"barbican-api-6dd5468f46-ccf7d\" (UID: \"ca78485a-0626-420e-aa59-a9a4680df83c\") " pod="openstack/barbican-api-6dd5468f46-ccf7d" Oct 08 13:16:13 crc kubenswrapper[4735]: I1008 13:16:13.991580 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ca78485a-0626-420e-aa59-a9a4680df83c-config-data-custom\") pod \"barbican-api-6dd5468f46-ccf7d\" (UID: \"ca78485a-0626-420e-aa59-a9a4680df83c\") " pod="openstack/barbican-api-6dd5468f46-ccf7d" Oct 08 13:16:13 crc kubenswrapper[4735]: I1008 13:16:13.991610 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7jr6\" (UniqueName: \"kubernetes.io/projected/ca78485a-0626-420e-aa59-a9a4680df83c-kube-api-access-k7jr6\") pod \"barbican-api-6dd5468f46-ccf7d\" (UID: \"ca78485a-0626-420e-aa59-a9a4680df83c\") " 
pod="openstack/barbican-api-6dd5468f46-ccf7d" Oct 08 13:16:13 crc kubenswrapper[4735]: I1008 13:16:13.991910 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca78485a-0626-420e-aa59-a9a4680df83c-logs\") pod \"barbican-api-6dd5468f46-ccf7d\" (UID: \"ca78485a-0626-420e-aa59-a9a4680df83c\") " pod="openstack/barbican-api-6dd5468f46-ccf7d" Oct 08 13:16:13 crc kubenswrapper[4735]: I1008 13:16:13.992222 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca78485a-0626-420e-aa59-a9a4680df83c-internal-tls-certs\") pod \"barbican-api-6dd5468f46-ccf7d\" (UID: \"ca78485a-0626-420e-aa59-a9a4680df83c\") " pod="openstack/barbican-api-6dd5468f46-ccf7d" Oct 08 13:16:13 crc kubenswrapper[4735]: I1008 13:16:13.992395 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca78485a-0626-420e-aa59-a9a4680df83c-combined-ca-bundle\") pod \"barbican-api-6dd5468f46-ccf7d\" (UID: \"ca78485a-0626-420e-aa59-a9a4680df83c\") " pod="openstack/barbican-api-6dd5468f46-ccf7d" Oct 08 13:16:13 crc kubenswrapper[4735]: I1008 13:16:13.992460 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca78485a-0626-420e-aa59-a9a4680df83c-public-tls-certs\") pod \"barbican-api-6dd5468f46-ccf7d\" (UID: \"ca78485a-0626-420e-aa59-a9a4680df83c\") " pod="openstack/barbican-api-6dd5468f46-ccf7d" Oct 08 13:16:14 crc kubenswrapper[4735]: I1008 13:16:14.094113 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca78485a-0626-420e-aa59-a9a4680df83c-combined-ca-bundle\") pod \"barbican-api-6dd5468f46-ccf7d\" (UID: 
\"ca78485a-0626-420e-aa59-a9a4680df83c\") " pod="openstack/barbican-api-6dd5468f46-ccf7d" Oct 08 13:16:14 crc kubenswrapper[4735]: I1008 13:16:14.094182 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca78485a-0626-420e-aa59-a9a4680df83c-public-tls-certs\") pod \"barbican-api-6dd5468f46-ccf7d\" (UID: \"ca78485a-0626-420e-aa59-a9a4680df83c\") " pod="openstack/barbican-api-6dd5468f46-ccf7d" Oct 08 13:16:14 crc kubenswrapper[4735]: I1008 13:16:14.094211 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca78485a-0626-420e-aa59-a9a4680df83c-config-data\") pod \"barbican-api-6dd5468f46-ccf7d\" (UID: \"ca78485a-0626-420e-aa59-a9a4680df83c\") " pod="openstack/barbican-api-6dd5468f46-ccf7d" Oct 08 13:16:14 crc kubenswrapper[4735]: I1008 13:16:14.094232 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ca78485a-0626-420e-aa59-a9a4680df83c-config-data-custom\") pod \"barbican-api-6dd5468f46-ccf7d\" (UID: \"ca78485a-0626-420e-aa59-a9a4680df83c\") " pod="openstack/barbican-api-6dd5468f46-ccf7d" Oct 08 13:16:14 crc kubenswrapper[4735]: I1008 13:16:14.094251 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7jr6\" (UniqueName: \"kubernetes.io/projected/ca78485a-0626-420e-aa59-a9a4680df83c-kube-api-access-k7jr6\") pod \"barbican-api-6dd5468f46-ccf7d\" (UID: \"ca78485a-0626-420e-aa59-a9a4680df83c\") " pod="openstack/barbican-api-6dd5468f46-ccf7d" Oct 08 13:16:14 crc kubenswrapper[4735]: I1008 13:16:14.094304 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca78485a-0626-420e-aa59-a9a4680df83c-logs\") pod \"barbican-api-6dd5468f46-ccf7d\" (UID: \"ca78485a-0626-420e-aa59-a9a4680df83c\") " 
pod="openstack/barbican-api-6dd5468f46-ccf7d" Oct 08 13:16:14 crc kubenswrapper[4735]: I1008 13:16:14.094864 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca78485a-0626-420e-aa59-a9a4680df83c-logs\") pod \"barbican-api-6dd5468f46-ccf7d\" (UID: \"ca78485a-0626-420e-aa59-a9a4680df83c\") " pod="openstack/barbican-api-6dd5468f46-ccf7d" Oct 08 13:16:14 crc kubenswrapper[4735]: I1008 13:16:14.094912 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca78485a-0626-420e-aa59-a9a4680df83c-internal-tls-certs\") pod \"barbican-api-6dd5468f46-ccf7d\" (UID: \"ca78485a-0626-420e-aa59-a9a4680df83c\") " pod="openstack/barbican-api-6dd5468f46-ccf7d" Oct 08 13:16:14 crc kubenswrapper[4735]: I1008 13:16:14.100977 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca78485a-0626-420e-aa59-a9a4680df83c-config-data\") pod \"barbican-api-6dd5468f46-ccf7d\" (UID: \"ca78485a-0626-420e-aa59-a9a4680df83c\") " pod="openstack/barbican-api-6dd5468f46-ccf7d" Oct 08 13:16:14 crc kubenswrapper[4735]: I1008 13:16:14.103868 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca78485a-0626-420e-aa59-a9a4680df83c-public-tls-certs\") pod \"barbican-api-6dd5468f46-ccf7d\" (UID: \"ca78485a-0626-420e-aa59-a9a4680df83c\") " pod="openstack/barbican-api-6dd5468f46-ccf7d" Oct 08 13:16:14 crc kubenswrapper[4735]: I1008 13:16:14.105117 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ca78485a-0626-420e-aa59-a9a4680df83c-config-data-custom\") pod \"barbican-api-6dd5468f46-ccf7d\" (UID: \"ca78485a-0626-420e-aa59-a9a4680df83c\") " pod="openstack/barbican-api-6dd5468f46-ccf7d" Oct 08 13:16:14 crc kubenswrapper[4735]: I1008 
13:16:14.108306 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca78485a-0626-420e-aa59-a9a4680df83c-combined-ca-bundle\") pod \"barbican-api-6dd5468f46-ccf7d\" (UID: \"ca78485a-0626-420e-aa59-a9a4680df83c\") " pod="openstack/barbican-api-6dd5468f46-ccf7d" Oct 08 13:16:14 crc kubenswrapper[4735]: I1008 13:16:14.112659 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca78485a-0626-420e-aa59-a9a4680df83c-internal-tls-certs\") pod \"barbican-api-6dd5468f46-ccf7d\" (UID: \"ca78485a-0626-420e-aa59-a9a4680df83c\") " pod="openstack/barbican-api-6dd5468f46-ccf7d" Oct 08 13:16:14 crc kubenswrapper[4735]: I1008 13:16:14.113005 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7jr6\" (UniqueName: \"kubernetes.io/projected/ca78485a-0626-420e-aa59-a9a4680df83c-kube-api-access-k7jr6\") pod \"barbican-api-6dd5468f46-ccf7d\" (UID: \"ca78485a-0626-420e-aa59-a9a4680df83c\") " pod="openstack/barbican-api-6dd5468f46-ccf7d" Oct 08 13:16:14 crc kubenswrapper[4735]: I1008 13:16:14.146716 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6dd5468f46-ccf7d" Oct 08 13:16:14 crc kubenswrapper[4735]: I1008 13:16:14.147970 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 08 13:16:14 crc kubenswrapper[4735]: I1008 13:16:14.147999 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 08 13:16:14 crc kubenswrapper[4735]: I1008 13:16:14.148009 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 08 13:16:14 crc kubenswrapper[4735]: I1008 13:16:14.148037 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 08 13:16:15 crc kubenswrapper[4735]: I1008 13:16:15.213055 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-77f649d57f-mlpp4" Oct 08 13:16:15 crc kubenswrapper[4735]: I1008 13:16:15.876776 4735 scope.go:117] "RemoveContainer" containerID="5ab9ec518954edab9f6272ed8666c548dfe17bfe86b964287bfe210eb7bb6df3" Oct 08 13:16:16 crc kubenswrapper[4735]: I1008 13:16:16.182799 4735 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 08 13:16:16 crc kubenswrapper[4735]: I1008 13:16:16.183112 4735 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 08 13:16:16 crc kubenswrapper[4735]: I1008 13:16:16.265461 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6dd5468f46-ccf7d"] Oct 08 13:16:16 crc kubenswrapper[4735]: I1008 13:16:16.635103 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 08 13:16:16 crc kubenswrapper[4735]: I1008 13:16:16.640267 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 08 13:16:16 crc kubenswrapper[4735]: 
I1008 13:16:16.640365 4735 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 08 13:16:16 crc kubenswrapper[4735]: I1008 13:16:16.644075 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 08 13:16:16 crc kubenswrapper[4735]: I1008 13:16:16.781776 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 08 13:16:17 crc kubenswrapper[4735]: I1008 13:16:17.195910 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c78787df7-zjd4g" event={"ID":"28f754cf-0740-4da5-bec9-3e238e248795","Type":"ContainerStarted","Data":"028118fa7025989d5931676c283baf519dd6471cda2508c96df5cd4fc5d1b256"} Oct 08 13:16:17 crc kubenswrapper[4735]: I1008 13:16:17.198642 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c78787df7-zjd4g" Oct 08 13:16:17 crc kubenswrapper[4735]: I1008 13:16:17.200810 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6dd5468f46-ccf7d" event={"ID":"ca78485a-0626-420e-aa59-a9a4680df83c","Type":"ContainerStarted","Data":"54ae09e0758dbced8f136de91ec860924f9d0a4a1aa8e35c4a62a56d25f7b734"} Oct 08 13:16:17 crc kubenswrapper[4735]: I1008 13:16:17.201135 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6dd5468f46-ccf7d" event={"ID":"ca78485a-0626-420e-aa59-a9a4680df83c","Type":"ContainerStarted","Data":"719ed4d6e6ca3be4b585ba42337fe3c6bf926775f7cb265506efe244531bac65"} Oct 08 13:16:17 crc kubenswrapper[4735]: I1008 13:16:17.201150 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6dd5468f46-ccf7d" event={"ID":"ca78485a-0626-420e-aa59-a9a4680df83c","Type":"ContainerStarted","Data":"d1a52799ef805cf10d4e03ac9720bbab4f9edf303ad6ce305398173baa3512d9"} Oct 08 13:16:17 crc kubenswrapper[4735]: I1008 13:16:17.202041 4735 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6dd5468f46-ccf7d" Oct 08 13:16:17 crc kubenswrapper[4735]: I1008 13:16:17.202077 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6dd5468f46-ccf7d" Oct 08 13:16:17 crc kubenswrapper[4735]: I1008 13:16:17.206125 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6cc55c9557-96kwz" event={"ID":"ada58119-3f96-4962-b214-fb2662c3d935","Type":"ContainerStarted","Data":"6d64df6cc879792395fce1a8a219ef400d1f7cfe418d3ee5312fec24e31fc592"} Oct 08 13:16:17 crc kubenswrapper[4735]: I1008 13:16:17.206218 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6cc55c9557-96kwz" event={"ID":"ada58119-3f96-4962-b214-fb2662c3d935","Type":"ContainerStarted","Data":"7d7c87d703a723cb263ee5030a259e665d68c85efbd777b9a5b704b5476ab749"} Oct 08 13:16:17 crc kubenswrapper[4735]: I1008 13:16:17.208590 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-69dd69556d-pvx5r" event={"ID":"9b111975-076c-4c7f-bcbd-ecef0b62936f","Type":"ContainerStarted","Data":"9f5626323fe1058e9c1fda7509e11efd3640670cee71f3ea2717d062bae876cf"} Oct 08 13:16:17 crc kubenswrapper[4735]: I1008 13:16:17.208660 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-69dd69556d-pvx5r" event={"ID":"9b111975-076c-4c7f-bcbd-ecef0b62936f","Type":"ContainerStarted","Data":"d5b88355cbab2ef2c4641f67a06c04f010a0ffe1ebf2da8598d0977f7187066a"} Oct 08 13:16:17 crc kubenswrapper[4735]: I1008 13:16:17.237838 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c78787df7-zjd4g" podStartSLOduration=7.237816229 podStartE2EDuration="7.237816229s" podCreationTimestamp="2025-10-08 13:16:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-08 13:16:17.229026003 +0000 UTC m=+1227.037330547" watchObservedRunningTime="2025-10-08 13:16:17.237816229 +0000 UTC m=+1227.046120773" Oct 08 13:16:17 crc kubenswrapper[4735]: I1008 13:16:17.254870 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-6cc55c9557-96kwz" podStartSLOduration=2.6925047170000003 podStartE2EDuration="7.254853748s" podCreationTimestamp="2025-10-08 13:16:10 +0000 UTC" firstStartedPulling="2025-10-08 13:16:11.329172051 +0000 UTC m=+1221.137476595" lastFinishedPulling="2025-10-08 13:16:15.891521082 +0000 UTC m=+1225.699825626" observedRunningTime="2025-10-08 13:16:17.25106313 +0000 UTC m=+1227.059367664" watchObservedRunningTime="2025-10-08 13:16:17.254853748 +0000 UTC m=+1227.063158292" Oct 08 13:16:17 crc kubenswrapper[4735]: I1008 13:16:17.274350 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6dd5468f46-ccf7d" podStartSLOduration=4.274325889 podStartE2EDuration="4.274325889s" podCreationTimestamp="2025-10-08 13:16:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 13:16:17.271531207 +0000 UTC m=+1227.079835751" watchObservedRunningTime="2025-10-08 13:16:17.274325889 +0000 UTC m=+1227.082630433" Oct 08 13:16:17 crc kubenswrapper[4735]: I1008 13:16:17.317967 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-69dd69556d-pvx5r" podStartSLOduration=2.851988594 podStartE2EDuration="7.317940952s" podCreationTimestamp="2025-10-08 13:16:10 +0000 UTC" firstStartedPulling="2025-10-08 13:16:11.446094262 +0000 UTC m=+1221.254398816" lastFinishedPulling="2025-10-08 13:16:15.91204663 +0000 UTC m=+1225.720351174" observedRunningTime="2025-10-08 13:16:17.2964973 +0000 UTC m=+1227.104801844" watchObservedRunningTime="2025-10-08 13:16:17.317940952 +0000 UTC 
m=+1227.126245496" Oct 08 13:16:18 crc kubenswrapper[4735]: I1008 13:16:18.150132 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 08 13:16:18 crc kubenswrapper[4735]: I1008 13:16:18.151626 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 08 13:16:18 crc kubenswrapper[4735]: I1008 13:16:18.154762 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-nqvr9" Oct 08 13:16:18 crc kubenswrapper[4735]: I1008 13:16:18.155026 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Oct 08 13:16:18 crc kubenswrapper[4735]: I1008 13:16:18.155859 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Oct 08 13:16:18 crc kubenswrapper[4735]: I1008 13:16:18.164237 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 08 13:16:18 crc kubenswrapper[4735]: I1008 13:16:18.312916 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2cql\" (UniqueName: \"kubernetes.io/projected/6dab1c85-43fb-4104-a23d-d8e1780c1280-kube-api-access-p2cql\") pod \"openstackclient\" (UID: \"6dab1c85-43fb-4104-a23d-d8e1780c1280\") " pod="openstack/openstackclient" Oct 08 13:16:18 crc kubenswrapper[4735]: I1008 13:16:18.313364 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6dab1c85-43fb-4104-a23d-d8e1780c1280-openstack-config\") pod \"openstackclient\" (UID: \"6dab1c85-43fb-4104-a23d-d8e1780c1280\") " pod="openstack/openstackclient" Oct 08 13:16:18 crc kubenswrapper[4735]: I1008 13:16:18.313392 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6dab1c85-43fb-4104-a23d-d8e1780c1280-combined-ca-bundle\") pod \"openstackclient\" (UID: \"6dab1c85-43fb-4104-a23d-d8e1780c1280\") " pod="openstack/openstackclient" Oct 08 13:16:18 crc kubenswrapper[4735]: I1008 13:16:18.314618 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6dab1c85-43fb-4104-a23d-d8e1780c1280-openstack-config-secret\") pod \"openstackclient\" (UID: \"6dab1c85-43fb-4104-a23d-d8e1780c1280\") " pod="openstack/openstackclient" Oct 08 13:16:18 crc kubenswrapper[4735]: I1008 13:16:18.409368 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Oct 08 13:16:18 crc kubenswrapper[4735]: E1008 13:16:18.410234 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle kube-api-access-p2cql openstack-config openstack-config-secret], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/openstackclient" podUID="6dab1c85-43fb-4104-a23d-d8e1780c1280" Oct 08 13:16:18 crc kubenswrapper[4735]: I1008 13:16:18.416018 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2cql\" (UniqueName: \"kubernetes.io/projected/6dab1c85-43fb-4104-a23d-d8e1780c1280-kube-api-access-p2cql\") pod \"openstackclient\" (UID: \"6dab1c85-43fb-4104-a23d-d8e1780c1280\") " pod="openstack/openstackclient" Oct 08 13:16:18 crc kubenswrapper[4735]: I1008 13:16:18.416077 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6dab1c85-43fb-4104-a23d-d8e1780c1280-openstack-config\") pod \"openstackclient\" (UID: \"6dab1c85-43fb-4104-a23d-d8e1780c1280\") " pod="openstack/openstackclient" Oct 08 13:16:18 crc kubenswrapper[4735]: I1008 13:16:18.416095 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dab1c85-43fb-4104-a23d-d8e1780c1280-combined-ca-bundle\") pod \"openstackclient\" (UID: \"6dab1c85-43fb-4104-a23d-d8e1780c1280\") " pod="openstack/openstackclient" Oct 08 13:16:18 crc kubenswrapper[4735]: I1008 13:16:18.416215 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6dab1c85-43fb-4104-a23d-d8e1780c1280-openstack-config-secret\") pod \"openstackclient\" (UID: \"6dab1c85-43fb-4104-a23d-d8e1780c1280\") " pod="openstack/openstackclient" Oct 08 13:16:18 crc kubenswrapper[4735]: I1008 13:16:18.418069 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6dab1c85-43fb-4104-a23d-d8e1780c1280-openstack-config\") pod \"openstackclient\" (UID: \"6dab1c85-43fb-4104-a23d-d8e1780c1280\") " pod="openstack/openstackclient" Oct 08 13:16:18 crc kubenswrapper[4735]: I1008 13:16:18.421077 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6dab1c85-43fb-4104-a23d-d8e1780c1280-openstack-config-secret\") pod \"openstackclient\" (UID: \"6dab1c85-43fb-4104-a23d-d8e1780c1280\") " pod="openstack/openstackclient" Oct 08 13:16:18 crc kubenswrapper[4735]: E1008 13:16:18.422807 4735 projected.go:194] Error preparing data for projected volume kube-api-access-p2cql for pod openstack/openstackclient: failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: User "system:node:crc" cannot create resource "serviceaccounts/token" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Oct 08 13:16:18 crc kubenswrapper[4735]: E1008 13:16:18.422862 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6dab1c85-43fb-4104-a23d-d8e1780c1280-kube-api-access-p2cql 
podName:6dab1c85-43fb-4104-a23d-d8e1780c1280 nodeName:}" failed. No retries permitted until 2025-10-08 13:16:18.922846953 +0000 UTC m=+1228.731151497 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-p2cql" (UniqueName: "kubernetes.io/projected/6dab1c85-43fb-4104-a23d-d8e1780c1280-kube-api-access-p2cql") pod "openstackclient" (UID: "6dab1c85-43fb-4104-a23d-d8e1780c1280") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: User "system:node:crc" cannot create resource "serviceaccounts/token" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Oct 08 13:16:18 crc kubenswrapper[4735]: I1008 13:16:18.427680 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dab1c85-43fb-4104-a23d-d8e1780c1280-combined-ca-bundle\") pod \"openstackclient\" (UID: \"6dab1c85-43fb-4104-a23d-d8e1780c1280\") " pod="openstack/openstackclient" Oct 08 13:16:18 crc kubenswrapper[4735]: I1008 13:16:18.450573 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Oct 08 13:16:18 crc kubenswrapper[4735]: I1008 13:16:18.474579 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 08 13:16:18 crc kubenswrapper[4735]: I1008 13:16:18.475936 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 08 13:16:18 crc kubenswrapper[4735]: I1008 13:16:18.482946 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 08 13:16:18 crc kubenswrapper[4735]: I1008 13:16:18.619934 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27af208c-711b-4423-bae1-91b55e84e14a-combined-ca-bundle\") pod \"openstackclient\" (UID: \"27af208c-711b-4423-bae1-91b55e84e14a\") " pod="openstack/openstackclient" Oct 08 13:16:18 crc kubenswrapper[4735]: I1008 13:16:18.619997 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m48xz\" (UniqueName: \"kubernetes.io/projected/27af208c-711b-4423-bae1-91b55e84e14a-kube-api-access-m48xz\") pod \"openstackclient\" (UID: \"27af208c-711b-4423-bae1-91b55e84e14a\") " pod="openstack/openstackclient" Oct 08 13:16:18 crc kubenswrapper[4735]: I1008 13:16:18.620055 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/27af208c-711b-4423-bae1-91b55e84e14a-openstack-config-secret\") pod \"openstackclient\" (UID: \"27af208c-711b-4423-bae1-91b55e84e14a\") " pod="openstack/openstackclient" Oct 08 13:16:18 crc kubenswrapper[4735]: I1008 13:16:18.620341 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/27af208c-711b-4423-bae1-91b55e84e14a-openstack-config\") pod \"openstackclient\" (UID: \"27af208c-711b-4423-bae1-91b55e84e14a\") " pod="openstack/openstackclient" Oct 08 13:16:18 crc kubenswrapper[4735]: I1008 13:16:18.722753 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/27af208c-711b-4423-bae1-91b55e84e14a-combined-ca-bundle\") pod \"openstackclient\" (UID: \"27af208c-711b-4423-bae1-91b55e84e14a\") " pod="openstack/openstackclient" Oct 08 13:16:18 crc kubenswrapper[4735]: I1008 13:16:18.722801 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m48xz\" (UniqueName: \"kubernetes.io/projected/27af208c-711b-4423-bae1-91b55e84e14a-kube-api-access-m48xz\") pod \"openstackclient\" (UID: \"27af208c-711b-4423-bae1-91b55e84e14a\") " pod="openstack/openstackclient" Oct 08 13:16:18 crc kubenswrapper[4735]: I1008 13:16:18.722843 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/27af208c-711b-4423-bae1-91b55e84e14a-openstack-config-secret\") pod \"openstackclient\" (UID: \"27af208c-711b-4423-bae1-91b55e84e14a\") " pod="openstack/openstackclient" Oct 08 13:16:18 crc kubenswrapper[4735]: I1008 13:16:18.722935 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/27af208c-711b-4423-bae1-91b55e84e14a-openstack-config\") pod \"openstackclient\" (UID: \"27af208c-711b-4423-bae1-91b55e84e14a\") " pod="openstack/openstackclient" Oct 08 13:16:18 crc kubenswrapper[4735]: I1008 13:16:18.723777 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/27af208c-711b-4423-bae1-91b55e84e14a-openstack-config\") pod \"openstackclient\" (UID: \"27af208c-711b-4423-bae1-91b55e84e14a\") " pod="openstack/openstackclient" Oct 08 13:16:18 crc kubenswrapper[4735]: I1008 13:16:18.730100 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/27af208c-711b-4423-bae1-91b55e84e14a-openstack-config-secret\") pod \"openstackclient\" (UID: 
\"27af208c-711b-4423-bae1-91b55e84e14a\") " pod="openstack/openstackclient" Oct 08 13:16:18 crc kubenswrapper[4735]: I1008 13:16:18.730329 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27af208c-711b-4423-bae1-91b55e84e14a-combined-ca-bundle\") pod \"openstackclient\" (UID: \"27af208c-711b-4423-bae1-91b55e84e14a\") " pod="openstack/openstackclient" Oct 08 13:16:18 crc kubenswrapper[4735]: I1008 13:16:18.743136 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m48xz\" (UniqueName: \"kubernetes.io/projected/27af208c-711b-4423-bae1-91b55e84e14a-kube-api-access-m48xz\") pod \"openstackclient\" (UID: \"27af208c-711b-4423-bae1-91b55e84e14a\") " pod="openstack/openstackclient" Oct 08 13:16:18 crc kubenswrapper[4735]: I1008 13:16:18.827228 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 08 13:16:18 crc kubenswrapper[4735]: I1008 13:16:18.926117 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2cql\" (UniqueName: \"kubernetes.io/projected/6dab1c85-43fb-4104-a23d-d8e1780c1280-kube-api-access-p2cql\") pod \"openstackclient\" (UID: \"6dab1c85-43fb-4104-a23d-d8e1780c1280\") " pod="openstack/openstackclient" Oct 08 13:16:18 crc kubenswrapper[4735]: E1008 13:16:18.928295 4735 projected.go:194] Error preparing data for projected volume kube-api-access-p2cql for pod openstack/openstackclient: failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (6dab1c85-43fb-4104-a23d-d8e1780c1280) does not match the UID in record. 
The object might have been deleted and then recreated Oct 08 13:16:18 crc kubenswrapper[4735]: E1008 13:16:18.928363 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6dab1c85-43fb-4104-a23d-d8e1780c1280-kube-api-access-p2cql podName:6dab1c85-43fb-4104-a23d-d8e1780c1280 nodeName:}" failed. No retries permitted until 2025-10-08 13:16:19.928343219 +0000 UTC m=+1229.736647763 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-p2cql" (UniqueName: "kubernetes.io/projected/6dab1c85-43fb-4104-a23d-d8e1780c1280-kube-api-access-p2cql") pod "openstackclient" (UID: "6dab1c85-43fb-4104-a23d-d8e1780c1280") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (6dab1c85-43fb-4104-a23d-d8e1780c1280) does not match the UID in record. The object might have been deleted and then recreated Oct 08 13:16:19 crc kubenswrapper[4735]: I1008 13:16:19.244042 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-nplcf" event={"ID":"2dd43efe-433d-4590-bcf9-3304cd6e32fb","Type":"ContainerStarted","Data":"f878591b24e1383c6d3748a96ead59b9731b8220a95ec68982db1a4c5a813e03"} Oct 08 13:16:19 crc kubenswrapper[4735]: I1008 13:16:19.244978 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 08 13:16:19 crc kubenswrapper[4735]: I1008 13:16:19.271081 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 08 13:16:19 crc kubenswrapper[4735]: I1008 13:16:19.274282 4735 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="6dab1c85-43fb-4104-a23d-d8e1780c1280" podUID="27af208c-711b-4423-bae1-91b55e84e14a" Oct 08 13:16:19 crc kubenswrapper[4735]: I1008 13:16:19.410407 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-nplcf" podStartSLOduration=5.197867362 podStartE2EDuration="40.410383222s" podCreationTimestamp="2025-10-08 13:15:39 +0000 UTC" firstStartedPulling="2025-10-08 13:15:42.337997441 +0000 UTC m=+1192.146301975" lastFinishedPulling="2025-10-08 13:16:17.550513281 +0000 UTC m=+1227.358817835" observedRunningTime="2025-10-08 13:16:19.26546783 +0000 UTC m=+1229.073772394" watchObservedRunningTime="2025-10-08 13:16:19.410383222 +0000 UTC m=+1229.218687766" Oct 08 13:16:19 crc kubenswrapper[4735]: I1008 13:16:19.413229 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 08 13:16:19 crc kubenswrapper[4735]: W1008 13:16:19.413433 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod27af208c_711b_4423_bae1_91b55e84e14a.slice/crio-081498e0cce4352294d229255c4bfb3faaf46a3d19140d05fd19a09a618b159b WatchSource:0}: Error finding container 081498e0cce4352294d229255c4bfb3faaf46a3d19140d05fd19a09a618b159b: Status 404 returned error can't find the container with id 081498e0cce4352294d229255c4bfb3faaf46a3d19140d05fd19a09a618b159b Oct 08 13:16:19 crc kubenswrapper[4735]: I1008 13:16:19.442856 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6dab1c85-43fb-4104-a23d-d8e1780c1280-openstack-config-secret\") pod \"6dab1c85-43fb-4104-a23d-d8e1780c1280\" (UID: 
\"6dab1c85-43fb-4104-a23d-d8e1780c1280\") " Oct 08 13:16:19 crc kubenswrapper[4735]: I1008 13:16:19.442956 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6dab1c85-43fb-4104-a23d-d8e1780c1280-openstack-config\") pod \"6dab1c85-43fb-4104-a23d-d8e1780c1280\" (UID: \"6dab1c85-43fb-4104-a23d-d8e1780c1280\") " Oct 08 13:16:19 crc kubenswrapper[4735]: I1008 13:16:19.443043 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dab1c85-43fb-4104-a23d-d8e1780c1280-combined-ca-bundle\") pod \"6dab1c85-43fb-4104-a23d-d8e1780c1280\" (UID: \"6dab1c85-43fb-4104-a23d-d8e1780c1280\") " Oct 08 13:16:19 crc kubenswrapper[4735]: I1008 13:16:19.443703 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2cql\" (UniqueName: \"kubernetes.io/projected/6dab1c85-43fb-4104-a23d-d8e1780c1280-kube-api-access-p2cql\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:19 crc kubenswrapper[4735]: I1008 13:16:19.444151 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6dab1c85-43fb-4104-a23d-d8e1780c1280-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "6dab1c85-43fb-4104-a23d-d8e1780c1280" (UID: "6dab1c85-43fb-4104-a23d-d8e1780c1280"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:16:19 crc kubenswrapper[4735]: I1008 13:16:19.447860 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dab1c85-43fb-4104-a23d-d8e1780c1280-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "6dab1c85-43fb-4104-a23d-d8e1780c1280" (UID: "6dab1c85-43fb-4104-a23d-d8e1780c1280"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:16:19 crc kubenswrapper[4735]: I1008 13:16:19.448421 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dab1c85-43fb-4104-a23d-d8e1780c1280-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6dab1c85-43fb-4104-a23d-d8e1780c1280" (UID: "6dab1c85-43fb-4104-a23d-d8e1780c1280"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:16:19 crc kubenswrapper[4735]: I1008 13:16:19.545623 4735 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6dab1c85-43fb-4104-a23d-d8e1780c1280-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:19 crc kubenswrapper[4735]: I1008 13:16:19.545931 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dab1c85-43fb-4104-a23d-d8e1780c1280-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:19 crc kubenswrapper[4735]: I1008 13:16:19.545944 4735 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6dab1c85-43fb-4104-a23d-d8e1780c1280-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:20 crc kubenswrapper[4735]: I1008 13:16:20.271692 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 08 13:16:20 crc kubenswrapper[4735]: I1008 13:16:20.274665 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"27af208c-711b-4423-bae1-91b55e84e14a","Type":"ContainerStarted","Data":"081498e0cce4352294d229255c4bfb3faaf46a3d19140d05fd19a09a618b159b"} Oct 08 13:16:20 crc kubenswrapper[4735]: I1008 13:16:20.289184 4735 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="6dab1c85-43fb-4104-a23d-d8e1780c1280" podUID="27af208c-711b-4423-bae1-91b55e84e14a" Oct 08 13:16:20 crc kubenswrapper[4735]: I1008 13:16:20.924999 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6dab1c85-43fb-4104-a23d-d8e1780c1280" path="/var/lib/kubelet/pods/6dab1c85-43fb-4104-a23d-d8e1780c1280/volumes" Oct 08 13:16:22 crc kubenswrapper[4735]: I1008 13:16:22.445493 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7c7cdb8d88-4pp88" Oct 08 13:16:22 crc kubenswrapper[4735]: I1008 13:16:22.483305 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7c7cdb8d88-4pp88" Oct 08 13:16:23 crc kubenswrapper[4735]: I1008 13:16:23.716385 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-675db45d77-s6pvt"] Oct 08 13:16:23 crc kubenswrapper[4735]: I1008 13:16:23.719791 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-675db45d77-s6pvt" Oct 08 13:16:23 crc kubenswrapper[4735]: I1008 13:16:23.725801 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Oct 08 13:16:23 crc kubenswrapper[4735]: I1008 13:16:23.726210 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 08 13:16:23 crc kubenswrapper[4735]: I1008 13:16:23.726378 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Oct 08 13:16:23 crc kubenswrapper[4735]: I1008 13:16:23.737065 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-675db45d77-s6pvt"] Oct 08 13:16:23 crc kubenswrapper[4735]: I1008 13:16:23.835587 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jthzs\" (UniqueName: \"kubernetes.io/projected/5516bd19-016a-4d52-84c9-b3801e1841ca-kube-api-access-jthzs\") pod \"swift-proxy-675db45d77-s6pvt\" (UID: \"5516bd19-016a-4d52-84c9-b3801e1841ca\") " pod="openstack/swift-proxy-675db45d77-s6pvt" Oct 08 13:16:23 crc kubenswrapper[4735]: I1008 13:16:23.835647 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5516bd19-016a-4d52-84c9-b3801e1841ca-log-httpd\") pod \"swift-proxy-675db45d77-s6pvt\" (UID: \"5516bd19-016a-4d52-84c9-b3801e1841ca\") " pod="openstack/swift-proxy-675db45d77-s6pvt" Oct 08 13:16:23 crc kubenswrapper[4735]: I1008 13:16:23.835677 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5516bd19-016a-4d52-84c9-b3801e1841ca-public-tls-certs\") pod \"swift-proxy-675db45d77-s6pvt\" (UID: \"5516bd19-016a-4d52-84c9-b3801e1841ca\") " pod="openstack/swift-proxy-675db45d77-s6pvt" Oct 08 13:16:23 crc 
kubenswrapper[4735]: I1008 13:16:23.835956 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5516bd19-016a-4d52-84c9-b3801e1841ca-config-data\") pod \"swift-proxy-675db45d77-s6pvt\" (UID: \"5516bd19-016a-4d52-84c9-b3801e1841ca\") " pod="openstack/swift-proxy-675db45d77-s6pvt" Oct 08 13:16:23 crc kubenswrapper[4735]: I1008 13:16:23.836012 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5516bd19-016a-4d52-84c9-b3801e1841ca-internal-tls-certs\") pod \"swift-proxy-675db45d77-s6pvt\" (UID: \"5516bd19-016a-4d52-84c9-b3801e1841ca\") " pod="openstack/swift-proxy-675db45d77-s6pvt" Oct 08 13:16:23 crc kubenswrapper[4735]: I1008 13:16:23.836251 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5516bd19-016a-4d52-84c9-b3801e1841ca-etc-swift\") pod \"swift-proxy-675db45d77-s6pvt\" (UID: \"5516bd19-016a-4d52-84c9-b3801e1841ca\") " pod="openstack/swift-proxy-675db45d77-s6pvt" Oct 08 13:16:23 crc kubenswrapper[4735]: I1008 13:16:23.836314 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5516bd19-016a-4d52-84c9-b3801e1841ca-run-httpd\") pod \"swift-proxy-675db45d77-s6pvt\" (UID: \"5516bd19-016a-4d52-84c9-b3801e1841ca\") " pod="openstack/swift-proxy-675db45d77-s6pvt" Oct 08 13:16:23 crc kubenswrapper[4735]: I1008 13:16:23.836389 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5516bd19-016a-4d52-84c9-b3801e1841ca-combined-ca-bundle\") pod \"swift-proxy-675db45d77-s6pvt\" (UID: \"5516bd19-016a-4d52-84c9-b3801e1841ca\") " 
pod="openstack/swift-proxy-675db45d77-s6pvt" Oct 08 13:16:23 crc kubenswrapper[4735]: I1008 13:16:23.938198 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5516bd19-016a-4d52-84c9-b3801e1841ca-config-data\") pod \"swift-proxy-675db45d77-s6pvt\" (UID: \"5516bd19-016a-4d52-84c9-b3801e1841ca\") " pod="openstack/swift-proxy-675db45d77-s6pvt" Oct 08 13:16:23 crc kubenswrapper[4735]: I1008 13:16:23.938249 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5516bd19-016a-4d52-84c9-b3801e1841ca-internal-tls-certs\") pod \"swift-proxy-675db45d77-s6pvt\" (UID: \"5516bd19-016a-4d52-84c9-b3801e1841ca\") " pod="openstack/swift-proxy-675db45d77-s6pvt" Oct 08 13:16:23 crc kubenswrapper[4735]: I1008 13:16:23.938298 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5516bd19-016a-4d52-84c9-b3801e1841ca-etc-swift\") pod \"swift-proxy-675db45d77-s6pvt\" (UID: \"5516bd19-016a-4d52-84c9-b3801e1841ca\") " pod="openstack/swift-proxy-675db45d77-s6pvt" Oct 08 13:16:23 crc kubenswrapper[4735]: I1008 13:16:23.938319 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5516bd19-016a-4d52-84c9-b3801e1841ca-run-httpd\") pod \"swift-proxy-675db45d77-s6pvt\" (UID: \"5516bd19-016a-4d52-84c9-b3801e1841ca\") " pod="openstack/swift-proxy-675db45d77-s6pvt" Oct 08 13:16:23 crc kubenswrapper[4735]: I1008 13:16:23.938366 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5516bd19-016a-4d52-84c9-b3801e1841ca-combined-ca-bundle\") pod \"swift-proxy-675db45d77-s6pvt\" (UID: \"5516bd19-016a-4d52-84c9-b3801e1841ca\") " pod="openstack/swift-proxy-675db45d77-s6pvt" Oct 08 13:16:23 crc 
kubenswrapper[4735]: I1008 13:16:23.939399 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5516bd19-016a-4d52-84c9-b3801e1841ca-run-httpd\") pod \"swift-proxy-675db45d77-s6pvt\" (UID: \"5516bd19-016a-4d52-84c9-b3801e1841ca\") " pod="openstack/swift-proxy-675db45d77-s6pvt" Oct 08 13:16:23 crc kubenswrapper[4735]: I1008 13:16:23.939438 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5516bd19-016a-4d52-84c9-b3801e1841ca-log-httpd\") pod \"swift-proxy-675db45d77-s6pvt\" (UID: \"5516bd19-016a-4d52-84c9-b3801e1841ca\") " pod="openstack/swift-proxy-675db45d77-s6pvt" Oct 08 13:16:23 crc kubenswrapper[4735]: I1008 13:16:23.939471 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jthzs\" (UniqueName: \"kubernetes.io/projected/5516bd19-016a-4d52-84c9-b3801e1841ca-kube-api-access-jthzs\") pod \"swift-proxy-675db45d77-s6pvt\" (UID: \"5516bd19-016a-4d52-84c9-b3801e1841ca\") " pod="openstack/swift-proxy-675db45d77-s6pvt" Oct 08 13:16:23 crc kubenswrapper[4735]: I1008 13:16:23.939805 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5516bd19-016a-4d52-84c9-b3801e1841ca-public-tls-certs\") pod \"swift-proxy-675db45d77-s6pvt\" (UID: \"5516bd19-016a-4d52-84c9-b3801e1841ca\") " pod="openstack/swift-proxy-675db45d77-s6pvt" Oct 08 13:16:23 crc kubenswrapper[4735]: I1008 13:16:23.944726 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5516bd19-016a-4d52-84c9-b3801e1841ca-log-httpd\") pod \"swift-proxy-675db45d77-s6pvt\" (UID: \"5516bd19-016a-4d52-84c9-b3801e1841ca\") " pod="openstack/swift-proxy-675db45d77-s6pvt" Oct 08 13:16:23 crc kubenswrapper[4735]: I1008 13:16:23.945846 4735 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5516bd19-016a-4d52-84c9-b3801e1841ca-combined-ca-bundle\") pod \"swift-proxy-675db45d77-s6pvt\" (UID: \"5516bd19-016a-4d52-84c9-b3801e1841ca\") " pod="openstack/swift-proxy-675db45d77-s6pvt" Oct 08 13:16:23 crc kubenswrapper[4735]: I1008 13:16:23.945953 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5516bd19-016a-4d52-84c9-b3801e1841ca-internal-tls-certs\") pod \"swift-proxy-675db45d77-s6pvt\" (UID: \"5516bd19-016a-4d52-84c9-b3801e1841ca\") " pod="openstack/swift-proxy-675db45d77-s6pvt" Oct 08 13:16:23 crc kubenswrapper[4735]: I1008 13:16:23.946151 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5516bd19-016a-4d52-84c9-b3801e1841ca-config-data\") pod \"swift-proxy-675db45d77-s6pvt\" (UID: \"5516bd19-016a-4d52-84c9-b3801e1841ca\") " pod="openstack/swift-proxy-675db45d77-s6pvt" Oct 08 13:16:23 crc kubenswrapper[4735]: I1008 13:16:23.946880 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5516bd19-016a-4d52-84c9-b3801e1841ca-public-tls-certs\") pod \"swift-proxy-675db45d77-s6pvt\" (UID: \"5516bd19-016a-4d52-84c9-b3801e1841ca\") " pod="openstack/swift-proxy-675db45d77-s6pvt" Oct 08 13:16:23 crc kubenswrapper[4735]: I1008 13:16:23.948711 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5516bd19-016a-4d52-84c9-b3801e1841ca-etc-swift\") pod \"swift-proxy-675db45d77-s6pvt\" (UID: \"5516bd19-016a-4d52-84c9-b3801e1841ca\") " pod="openstack/swift-proxy-675db45d77-s6pvt" Oct 08 13:16:23 crc kubenswrapper[4735]: I1008 13:16:23.954993 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jthzs\" (UniqueName: 
\"kubernetes.io/projected/5516bd19-016a-4d52-84c9-b3801e1841ca-kube-api-access-jthzs\") pod \"swift-proxy-675db45d77-s6pvt\" (UID: \"5516bd19-016a-4d52-84c9-b3801e1841ca\") " pod="openstack/swift-proxy-675db45d77-s6pvt" Oct 08 13:16:24 crc kubenswrapper[4735]: I1008 13:16:24.044960 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-675db45d77-s6pvt" Oct 08 13:16:24 crc kubenswrapper[4735]: I1008 13:16:24.345125 4735 generic.go:334] "Generic (PLEG): container finished" podID="2dd43efe-433d-4590-bcf9-3304cd6e32fb" containerID="f878591b24e1383c6d3748a96ead59b9731b8220a95ec68982db1a4c5a813e03" exitCode=0 Oct 08 13:16:24 crc kubenswrapper[4735]: I1008 13:16:24.345470 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-nplcf" event={"ID":"2dd43efe-433d-4590-bcf9-3304cd6e32fb","Type":"ContainerDied","Data":"f878591b24e1383c6d3748a96ead59b9731b8220a95ec68982db1a4c5a813e03"} Oct 08 13:16:24 crc kubenswrapper[4735]: I1008 13:16:24.535238 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-698444467b-bvdzb" Oct 08 13:16:24 crc kubenswrapper[4735]: I1008 13:16:24.611203 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 13:16:24 crc kubenswrapper[4735]: I1008 13:16:24.611647 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ee46d28d-61ac-41d0-95da-9a66bf7c4ed7" containerName="ceilometer-central-agent" containerID="cri-o://22c487685c2fc6697154b1f57342d68c199c65dfda42b1c9b316dbf6547b7d38" gracePeriod=30 Oct 08 13:16:24 crc kubenswrapper[4735]: I1008 13:16:24.612148 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ee46d28d-61ac-41d0-95da-9a66bf7c4ed7" containerName="proxy-httpd" containerID="cri-o://965f48a9d18436349fa306565e27d670df85717967b4f8a617abda501c600caf" gracePeriod=30 Oct 08 
13:16:24 crc kubenswrapper[4735]: I1008 13:16:24.612212 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ee46d28d-61ac-41d0-95da-9a66bf7c4ed7" containerName="sg-core" containerID="cri-o://7694171eb43247b8cd15f87f146618756899c9babaf01cc45fc5d4d7c78cea6a" gracePeriod=30 Oct 08 13:16:24 crc kubenswrapper[4735]: I1008 13:16:24.612278 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ee46d28d-61ac-41d0-95da-9a66bf7c4ed7" containerName="ceilometer-notification-agent" containerID="cri-o://c4630f113a494d80dafdc30d9af150e66a89aa014788f6211dc33731b3b851a2" gracePeriod=30 Oct 08 13:16:24 crc kubenswrapper[4735]: I1008 13:16:24.620805 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 08 13:16:24 crc kubenswrapper[4735]: I1008 13:16:24.752617 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-675db45d77-s6pvt"] Oct 08 13:16:24 crc kubenswrapper[4735]: I1008 13:16:24.880103 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 13:16:24 crc kubenswrapper[4735]: I1008 13:16:24.880439 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 13:16:25 crc kubenswrapper[4735]: I1008 13:16:25.361035 4735 generic.go:334] "Generic (PLEG): container finished" podID="ee46d28d-61ac-41d0-95da-9a66bf7c4ed7" 
containerID="965f48a9d18436349fa306565e27d670df85717967b4f8a617abda501c600caf" exitCode=0 Oct 08 13:16:25 crc kubenswrapper[4735]: I1008 13:16:25.361068 4735 generic.go:334] "Generic (PLEG): container finished" podID="ee46d28d-61ac-41d0-95da-9a66bf7c4ed7" containerID="7694171eb43247b8cd15f87f146618756899c9babaf01cc45fc5d4d7c78cea6a" exitCode=2 Oct 08 13:16:25 crc kubenswrapper[4735]: I1008 13:16:25.361076 4735 generic.go:334] "Generic (PLEG): container finished" podID="ee46d28d-61ac-41d0-95da-9a66bf7c4ed7" containerID="22c487685c2fc6697154b1f57342d68c199c65dfda42b1c9b316dbf6547b7d38" exitCode=0 Oct 08 13:16:25 crc kubenswrapper[4735]: I1008 13:16:25.361117 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee46d28d-61ac-41d0-95da-9a66bf7c4ed7","Type":"ContainerDied","Data":"965f48a9d18436349fa306565e27d670df85717967b4f8a617abda501c600caf"} Oct 08 13:16:25 crc kubenswrapper[4735]: I1008 13:16:25.361169 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee46d28d-61ac-41d0-95da-9a66bf7c4ed7","Type":"ContainerDied","Data":"7694171eb43247b8cd15f87f146618756899c9babaf01cc45fc5d4d7c78cea6a"} Oct 08 13:16:25 crc kubenswrapper[4735]: I1008 13:16:25.361183 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee46d28d-61ac-41d0-95da-9a66bf7c4ed7","Type":"ContainerDied","Data":"22c487685c2fc6697154b1f57342d68c199c65dfda42b1c9b316dbf6547b7d38"} Oct 08 13:16:25 crc kubenswrapper[4735]: I1008 13:16:25.758630 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c78787df7-zjd4g" Oct 08 13:16:25 crc kubenswrapper[4735]: I1008 13:16:25.827027 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5dc68bd5-89cn6"] Oct 08 13:16:25 crc kubenswrapper[4735]: I1008 13:16:25.827260 4735 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/dnsmasq-dns-5dc68bd5-89cn6" podUID="d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7" containerName="dnsmasq-dns" containerID="cri-o://e359b1b1bb8b10520bccf2003e64ee5c5fe849c5b9fc17e005ca7f7931793bd8" gracePeriod=10 Oct 08 13:16:26 crc kubenswrapper[4735]: I1008 13:16:26.153596 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6dd5468f46-ccf7d" Oct 08 13:16:26 crc kubenswrapper[4735]: I1008 13:16:26.382917 4735 generic.go:334] "Generic (PLEG): container finished" podID="ee46d28d-61ac-41d0-95da-9a66bf7c4ed7" containerID="c4630f113a494d80dafdc30d9af150e66a89aa014788f6211dc33731b3b851a2" exitCode=0 Oct 08 13:16:26 crc kubenswrapper[4735]: I1008 13:16:26.382950 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee46d28d-61ac-41d0-95da-9a66bf7c4ed7","Type":"ContainerDied","Data":"c4630f113a494d80dafdc30d9af150e66a89aa014788f6211dc33731b3b851a2"} Oct 08 13:16:26 crc kubenswrapper[4735]: I1008 13:16:26.388256 4735 generic.go:334] "Generic (PLEG): container finished" podID="d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7" containerID="e359b1b1bb8b10520bccf2003e64ee5c5fe849c5b9fc17e005ca7f7931793bd8" exitCode=0 Oct 08 13:16:26 crc kubenswrapper[4735]: I1008 13:16:26.388299 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dc68bd5-89cn6" event={"ID":"d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7","Type":"ContainerDied","Data":"e359b1b1bb8b10520bccf2003e64ee5c5fe849c5b9fc17e005ca7f7931793bd8"} Oct 08 13:16:26 crc kubenswrapper[4735]: I1008 13:16:26.427194 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6dd5468f46-ccf7d" Oct 08 13:16:26 crc kubenswrapper[4735]: I1008 13:16:26.492547 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7c7cdb8d88-4pp88"] Oct 08 13:16:26 crc kubenswrapper[4735]: I1008 13:16:26.492822 4735 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/barbican-api-7c7cdb8d88-4pp88" podUID="4aa1710f-6005-4ce1-901f-d71442736e68" containerName="barbican-api-log" containerID="cri-o://847162f47f9b5a78983f8926a56887b9861cbdfc9236643123b3318b3f0e9f0a" gracePeriod=30 Oct 08 13:16:26 crc kubenswrapper[4735]: I1008 13:16:26.492935 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7c7cdb8d88-4pp88" podUID="4aa1710f-6005-4ce1-901f-d71442736e68" containerName="barbican-api" containerID="cri-o://33f936d2332ce1aa95d93d854bd325a0ca6212966a7597aabdad32083ac2b2dd" gracePeriod=30 Oct 08 13:16:27 crc kubenswrapper[4735]: I1008 13:16:27.293315 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5574db6b6f-94l7x" Oct 08 13:16:27 crc kubenswrapper[4735]: I1008 13:16:27.400344 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-698444467b-bvdzb"] Oct 08 13:16:27 crc kubenswrapper[4735]: I1008 13:16:27.400590 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-698444467b-bvdzb" podUID="daeb6ee4-aafc-4251-9512-43c8cde46c02" containerName="neutron-api" containerID="cri-o://86f7b8c9a6abce4931658ab4a5b12ea9d8b88db248e3e2300a16662f799db233" gracePeriod=30 Oct 08 13:16:27 crc kubenswrapper[4735]: I1008 13:16:27.401000 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-698444467b-bvdzb" podUID="daeb6ee4-aafc-4251-9512-43c8cde46c02" containerName="neutron-httpd" containerID="cri-o://0c8e3771261147ed06296c96e0fdece8db2d40b87b023020e5b7ce1f0cc1dca0" gracePeriod=30 Oct 08 13:16:27 crc kubenswrapper[4735]: I1008 13:16:27.454993 4735 generic.go:334] "Generic (PLEG): container finished" podID="4aa1710f-6005-4ce1-901f-d71442736e68" containerID="847162f47f9b5a78983f8926a56887b9861cbdfc9236643123b3318b3f0e9f0a" exitCode=143 Oct 08 13:16:27 crc kubenswrapper[4735]: I1008 13:16:27.455083 4735 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/barbican-api-7c7cdb8d88-4pp88" event={"ID":"4aa1710f-6005-4ce1-901f-d71442736e68","Type":"ContainerDied","Data":"847162f47f9b5a78983f8926a56887b9861cbdfc9236643123b3318b3f0e9f0a"} Oct 08 13:16:28 crc kubenswrapper[4735]: I1008 13:16:28.438799 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5dc68bd5-89cn6" podUID="d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.147:5353: connect: connection refused" Oct 08 13:16:28 crc kubenswrapper[4735]: I1008 13:16:28.468004 4735 generic.go:334] "Generic (PLEG): container finished" podID="daeb6ee4-aafc-4251-9512-43c8cde46c02" containerID="0c8e3771261147ed06296c96e0fdece8db2d40b87b023020e5b7ce1f0cc1dca0" exitCode=0 Oct 08 13:16:28 crc kubenswrapper[4735]: I1008 13:16:28.468049 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-698444467b-bvdzb" event={"ID":"daeb6ee4-aafc-4251-9512-43c8cde46c02","Type":"ContainerDied","Data":"0c8e3771261147ed06296c96e0fdece8db2d40b87b023020e5b7ce1f0cc1dca0"} Oct 08 13:16:29 crc kubenswrapper[4735]: I1008 13:16:29.780776 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 13:16:29 crc kubenswrapper[4735]: I1008 13:16:29.781358 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="af8d6fa0-e75f-40fe-92e9-284df31eae95" containerName="glance-log" containerID="cri-o://4cdd9cd0992efec7fcd98ab9b6975f92c9411e342b5181c59ac77e76ba43b22d" gracePeriod=30 Oct 08 13:16:29 crc kubenswrapper[4735]: I1008 13:16:29.781414 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="af8d6fa0-e75f-40fe-92e9-284df31eae95" containerName="glance-httpd" containerID="cri-o://1d0e5a0f7e8bf01a768ba2a2637fc04c6305e74a200d36d9b6c9989a59e6e6ed" gracePeriod=30 Oct 08 
13:16:30 crc kubenswrapper[4735]: I1008 13:16:30.487400 4735 generic.go:334] "Generic (PLEG): container finished" podID="4aa1710f-6005-4ce1-901f-d71442736e68" containerID="33f936d2332ce1aa95d93d854bd325a0ca6212966a7597aabdad32083ac2b2dd" exitCode=0 Oct 08 13:16:30 crc kubenswrapper[4735]: I1008 13:16:30.487556 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7c7cdb8d88-4pp88" event={"ID":"4aa1710f-6005-4ce1-901f-d71442736e68","Type":"ContainerDied","Data":"33f936d2332ce1aa95d93d854bd325a0ca6212966a7597aabdad32083ac2b2dd"} Oct 08 13:16:30 crc kubenswrapper[4735]: I1008 13:16:30.491658 4735 generic.go:334] "Generic (PLEG): container finished" podID="af8d6fa0-e75f-40fe-92e9-284df31eae95" containerID="4cdd9cd0992efec7fcd98ab9b6975f92c9411e342b5181c59ac77e76ba43b22d" exitCode=143 Oct 08 13:16:30 crc kubenswrapper[4735]: I1008 13:16:30.491690 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"af8d6fa0-e75f-40fe-92e9-284df31eae95","Type":"ContainerDied","Data":"4cdd9cd0992efec7fcd98ab9b6975f92c9411e342b5181c59ac77e76ba43b22d"} Oct 08 13:16:30 crc kubenswrapper[4735]: I1008 13:16:30.650678 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-cqm49"] Oct 08 13:16:30 crc kubenswrapper[4735]: I1008 13:16:30.653329 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-cqm49" Oct 08 13:16:30 crc kubenswrapper[4735]: I1008 13:16:30.687468 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-cqm49"] Oct 08 13:16:30 crc kubenswrapper[4735]: I1008 13:16:30.689624 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8lgf\" (UniqueName: \"kubernetes.io/projected/85383db5-f6cc-433b-b8d9-7b5397aeadd4-kube-api-access-w8lgf\") pod \"nova-api-db-create-cqm49\" (UID: \"85383db5-f6cc-433b-b8d9-7b5397aeadd4\") " pod="openstack/nova-api-db-create-cqm49" Oct 08 13:16:30 crc kubenswrapper[4735]: I1008 13:16:30.783416 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7c7cdb8d88-4pp88" podUID="4aa1710f-6005-4ce1-901f-d71442736e68" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.159:9311/healthcheck\": dial tcp 10.217.0.159:9311: connect: connection refused" Oct 08 13:16:30 crc kubenswrapper[4735]: I1008 13:16:30.783708 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7c7cdb8d88-4pp88" podUID="4aa1710f-6005-4ce1-901f-d71442736e68" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.159:9311/healthcheck\": dial tcp 10.217.0.159:9311: connect: connection refused" Oct 08 13:16:30 crc kubenswrapper[4735]: I1008 13:16:30.799293 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8lgf\" (UniqueName: \"kubernetes.io/projected/85383db5-f6cc-433b-b8d9-7b5397aeadd4-kube-api-access-w8lgf\") pod \"nova-api-db-create-cqm49\" (UID: \"85383db5-f6cc-433b-b8d9-7b5397aeadd4\") " pod="openstack/nova-api-db-create-cqm49" Oct 08 13:16:30 crc kubenswrapper[4735]: I1008 13:16:30.824292 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8lgf\" (UniqueName: 
\"kubernetes.io/projected/85383db5-f6cc-433b-b8d9-7b5397aeadd4-kube-api-access-w8lgf\") pod \"nova-api-db-create-cqm49\" (UID: \"85383db5-f6cc-433b-b8d9-7b5397aeadd4\") " pod="openstack/nova-api-db-create-cqm49" Oct 08 13:16:30 crc kubenswrapper[4735]: I1008 13:16:30.865684 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-49lbk"] Oct 08 13:16:30 crc kubenswrapper[4735]: I1008 13:16:30.868664 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-49lbk" Oct 08 13:16:30 crc kubenswrapper[4735]: I1008 13:16:30.885082 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-49lbk"] Oct 08 13:16:30 crc kubenswrapper[4735]: I1008 13:16:30.904995 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5hhc\" (UniqueName: \"kubernetes.io/projected/e203cdd5-6f7f-4e2e-9be1-3d467c6005f1-kube-api-access-j5hhc\") pod \"nova-cell0-db-create-49lbk\" (UID: \"e203cdd5-6f7f-4e2e-9be1-3d467c6005f1\") " pod="openstack/nova-cell0-db-create-49lbk" Oct 08 13:16:30 crc kubenswrapper[4735]: I1008 13:16:30.936607 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 13:16:30 crc kubenswrapper[4735]: I1008 13:16:30.938349 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="2ee831ac-1cdb-4271-b7a3-054313f1d20b" containerName="glance-log" containerID="cri-o://5bf07e71f83ec42daf31838c9792cb61bff2c3545d23d5b759abc92cdc49cd03" gracePeriod=30 Oct 08 13:16:30 crc kubenswrapper[4735]: I1008 13:16:30.938569 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="2ee831ac-1cdb-4271-b7a3-054313f1d20b" containerName="glance-httpd" 
containerID="cri-o://195a6ffbbaaa312e7578490fa0e19d1ab01f295407b49eb14d400842de28a9ad" gracePeriod=30 Oct 08 13:16:30 crc kubenswrapper[4735]: I1008 13:16:30.961899 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-9h9hg"] Oct 08 13:16:30 crc kubenswrapper[4735]: I1008 13:16:30.963092 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-9h9hg" Oct 08 13:16:31 crc kubenswrapper[4735]: I1008 13:16:30.993618 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-9h9hg"] Oct 08 13:16:31 crc kubenswrapper[4735]: I1008 13:16:31.007557 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9c2r\" (UniqueName: \"kubernetes.io/projected/b5d95702-991f-4f45-94e3-45650c900b1a-kube-api-access-h9c2r\") pod \"nova-cell1-db-create-9h9hg\" (UID: \"b5d95702-991f-4f45-94e3-45650c900b1a\") " pod="openstack/nova-cell1-db-create-9h9hg" Oct 08 13:16:31 crc kubenswrapper[4735]: I1008 13:16:31.007692 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5hhc\" (UniqueName: \"kubernetes.io/projected/e203cdd5-6f7f-4e2e-9be1-3d467c6005f1-kube-api-access-j5hhc\") pod \"nova-cell0-db-create-49lbk\" (UID: \"e203cdd5-6f7f-4e2e-9be1-3d467c6005f1\") " pod="openstack/nova-cell0-db-create-49lbk" Oct 08 13:16:31 crc kubenswrapper[4735]: I1008 13:16:31.063882 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5hhc\" (UniqueName: \"kubernetes.io/projected/e203cdd5-6f7f-4e2e-9be1-3d467c6005f1-kube-api-access-j5hhc\") pod \"nova-cell0-db-create-49lbk\" (UID: \"e203cdd5-6f7f-4e2e-9be1-3d467c6005f1\") " pod="openstack/nova-cell0-db-create-49lbk" Oct 08 13:16:31 crc kubenswrapper[4735]: I1008 13:16:31.085932 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-cqm49" Oct 08 13:16:31 crc kubenswrapper[4735]: I1008 13:16:31.114714 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9c2r\" (UniqueName: \"kubernetes.io/projected/b5d95702-991f-4f45-94e3-45650c900b1a-kube-api-access-h9c2r\") pod \"nova-cell1-db-create-9h9hg\" (UID: \"b5d95702-991f-4f45-94e3-45650c900b1a\") " pod="openstack/nova-cell1-db-create-9h9hg" Oct 08 13:16:31 crc kubenswrapper[4735]: I1008 13:16:31.150113 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9c2r\" (UniqueName: \"kubernetes.io/projected/b5d95702-991f-4f45-94e3-45650c900b1a-kube-api-access-h9c2r\") pod \"nova-cell1-db-create-9h9hg\" (UID: \"b5d95702-991f-4f45-94e3-45650c900b1a\") " pod="openstack/nova-cell1-db-create-9h9hg" Oct 08 13:16:31 crc kubenswrapper[4735]: I1008 13:16:31.229074 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-49lbk" Oct 08 13:16:31 crc kubenswrapper[4735]: I1008 13:16:31.330667 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-9h9hg" Oct 08 13:16:31 crc kubenswrapper[4735]: I1008 13:16:31.510723 4735 generic.go:334] "Generic (PLEG): container finished" podID="2ee831ac-1cdb-4271-b7a3-054313f1d20b" containerID="5bf07e71f83ec42daf31838c9792cb61bff2c3545d23d5b759abc92cdc49cd03" exitCode=143 Oct 08 13:16:31 crc kubenswrapper[4735]: I1008 13:16:31.510773 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2ee831ac-1cdb-4271-b7a3-054313f1d20b","Type":"ContainerDied","Data":"5bf07e71f83ec42daf31838c9792cb61bff2c3545d23d5b759abc92cdc49cd03"} Oct 08 13:16:32 crc kubenswrapper[4735]: I1008 13:16:32.400715 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-nplcf" Oct 08 13:16:32 crc kubenswrapper[4735]: I1008 13:16:32.444868 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2dd43efe-433d-4590-bcf9-3304cd6e32fb-db-sync-config-data\") pod \"2dd43efe-433d-4590-bcf9-3304cd6e32fb\" (UID: \"2dd43efe-433d-4590-bcf9-3304cd6e32fb\") " Oct 08 13:16:32 crc kubenswrapper[4735]: I1008 13:16:32.444954 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dd43efe-433d-4590-bcf9-3304cd6e32fb-combined-ca-bundle\") pod \"2dd43efe-433d-4590-bcf9-3304cd6e32fb\" (UID: \"2dd43efe-433d-4590-bcf9-3304cd6e32fb\") " Oct 08 13:16:32 crc kubenswrapper[4735]: I1008 13:16:32.445054 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mvxm\" (UniqueName: \"kubernetes.io/projected/2dd43efe-433d-4590-bcf9-3304cd6e32fb-kube-api-access-6mvxm\") pod \"2dd43efe-433d-4590-bcf9-3304cd6e32fb\" (UID: \"2dd43efe-433d-4590-bcf9-3304cd6e32fb\") " Oct 08 13:16:32 crc kubenswrapper[4735]: I1008 13:16:32.445116 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2dd43efe-433d-4590-bcf9-3304cd6e32fb-scripts\") pod \"2dd43efe-433d-4590-bcf9-3304cd6e32fb\" (UID: \"2dd43efe-433d-4590-bcf9-3304cd6e32fb\") " Oct 08 13:16:32 crc kubenswrapper[4735]: I1008 13:16:32.445158 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dd43efe-433d-4590-bcf9-3304cd6e32fb-config-data\") pod \"2dd43efe-433d-4590-bcf9-3304cd6e32fb\" (UID: \"2dd43efe-433d-4590-bcf9-3304cd6e32fb\") " Oct 08 13:16:32 crc kubenswrapper[4735]: I1008 13:16:32.445409 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/2dd43efe-433d-4590-bcf9-3304cd6e32fb-etc-machine-id\") pod \"2dd43efe-433d-4590-bcf9-3304cd6e32fb\" (UID: \"2dd43efe-433d-4590-bcf9-3304cd6e32fb\") " Oct 08 13:16:32 crc kubenswrapper[4735]: I1008 13:16:32.445863 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2dd43efe-433d-4590-bcf9-3304cd6e32fb-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "2dd43efe-433d-4590-bcf9-3304cd6e32fb" (UID: "2dd43efe-433d-4590-bcf9-3304cd6e32fb"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 13:16:32 crc kubenswrapper[4735]: I1008 13:16:32.463694 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dd43efe-433d-4590-bcf9-3304cd6e32fb-scripts" (OuterVolumeSpecName: "scripts") pod "2dd43efe-433d-4590-bcf9-3304cd6e32fb" (UID: "2dd43efe-433d-4590-bcf9-3304cd6e32fb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:16:32 crc kubenswrapper[4735]: I1008 13:16:32.467671 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dd43efe-433d-4590-bcf9-3304cd6e32fb-kube-api-access-6mvxm" (OuterVolumeSpecName: "kube-api-access-6mvxm") pod "2dd43efe-433d-4590-bcf9-3304cd6e32fb" (UID: "2dd43efe-433d-4590-bcf9-3304cd6e32fb"). InnerVolumeSpecName "kube-api-access-6mvxm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:16:32 crc kubenswrapper[4735]: I1008 13:16:32.470488 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dd43efe-433d-4590-bcf9-3304cd6e32fb-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "2dd43efe-433d-4590-bcf9-3304cd6e32fb" (UID: "2dd43efe-433d-4590-bcf9-3304cd6e32fb"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:16:32 crc kubenswrapper[4735]: I1008 13:16:32.535657 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dd43efe-433d-4590-bcf9-3304cd6e32fb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2dd43efe-433d-4590-bcf9-3304cd6e32fb" (UID: "2dd43efe-433d-4590-bcf9-3304cd6e32fb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:16:32 crc kubenswrapper[4735]: I1008 13:16:32.549087 4735 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2dd43efe-433d-4590-bcf9-3304cd6e32fb-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:32 crc kubenswrapper[4735]: I1008 13:16:32.549126 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dd43efe-433d-4590-bcf9-3304cd6e32fb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:32 crc kubenswrapper[4735]: I1008 13:16:32.549141 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mvxm\" (UniqueName: \"kubernetes.io/projected/2dd43efe-433d-4590-bcf9-3304cd6e32fb-kube-api-access-6mvxm\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:32 crc kubenswrapper[4735]: I1008 13:16:32.549156 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2dd43efe-433d-4590-bcf9-3304cd6e32fb-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:32 crc kubenswrapper[4735]: I1008 13:16:32.549167 4735 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2dd43efe-433d-4590-bcf9-3304cd6e32fb-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:32 crc kubenswrapper[4735]: I1008 13:16:32.609341 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/2dd43efe-433d-4590-bcf9-3304cd6e32fb-config-data" (OuterVolumeSpecName: "config-data") pod "2dd43efe-433d-4590-bcf9-3304cd6e32fb" (UID: "2dd43efe-433d-4590-bcf9-3304cd6e32fb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:16:32 crc kubenswrapper[4735]: I1008 13:16:32.650526 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-675db45d77-s6pvt" event={"ID":"5516bd19-016a-4d52-84c9-b3801e1841ca","Type":"ContainerStarted","Data":"17a3af001ccddc82de589584693b2639a7495ab66189f505abc7e05707f5a7d2"} Oct 08 13:16:32 crc kubenswrapper[4735]: I1008 13:16:32.651543 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dd43efe-433d-4590-bcf9-3304cd6e32fb-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:32 crc kubenswrapper[4735]: I1008 13:16:32.659403 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-nplcf" event={"ID":"2dd43efe-433d-4590-bcf9-3304cd6e32fb","Type":"ContainerDied","Data":"4fb97d60ce38a371f044ac6c75456387da39b07afc5a66da10e756201fbd36a4"} Oct 08 13:16:32 crc kubenswrapper[4735]: I1008 13:16:32.659448 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4fb97d60ce38a371f044ac6c75456387da39b07afc5a66da10e756201fbd36a4" Oct 08 13:16:32 crc kubenswrapper[4735]: I1008 13:16:32.659531 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-nplcf" Oct 08 13:16:32 crc kubenswrapper[4735]: I1008 13:16:32.757414 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 08 13:16:32 crc kubenswrapper[4735]: I1008 13:16:32.757720 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="add5914b-96e1-4d52-8648-d3429ea33b0d" containerName="kube-state-metrics" containerID="cri-o://9ee54bf2aa7fa2d2788207107b556c20ed7c52eb424aeccd9ddf2b2c0d2f7eb9" gracePeriod=30 Oct 08 13:16:32 crc kubenswrapper[4735]: I1008 13:16:32.829222 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 13:16:32 crc kubenswrapper[4735]: I1008 13:16:32.959056 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ee46d28d-61ac-41d0-95da-9a66bf7c4ed7-sg-core-conf-yaml\") pod \"ee46d28d-61ac-41d0-95da-9a66bf7c4ed7\" (UID: \"ee46d28d-61ac-41d0-95da-9a66bf7c4ed7\") " Oct 08 13:16:32 crc kubenswrapper[4735]: I1008 13:16:32.959361 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee46d28d-61ac-41d0-95da-9a66bf7c4ed7-run-httpd\") pod \"ee46d28d-61ac-41d0-95da-9a66bf7c4ed7\" (UID: \"ee46d28d-61ac-41d0-95da-9a66bf7c4ed7\") " Oct 08 13:16:32 crc kubenswrapper[4735]: I1008 13:16:32.959508 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gwp6\" (UniqueName: \"kubernetes.io/projected/ee46d28d-61ac-41d0-95da-9a66bf7c4ed7-kube-api-access-5gwp6\") pod \"ee46d28d-61ac-41d0-95da-9a66bf7c4ed7\" (UID: \"ee46d28d-61ac-41d0-95da-9a66bf7c4ed7\") " Oct 08 13:16:32 crc kubenswrapper[4735]: I1008 13:16:32.959762 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/ee46d28d-61ac-41d0-95da-9a66bf7c4ed7-combined-ca-bundle\") pod \"ee46d28d-61ac-41d0-95da-9a66bf7c4ed7\" (UID: \"ee46d28d-61ac-41d0-95da-9a66bf7c4ed7\") " Oct 08 13:16:32 crc kubenswrapper[4735]: I1008 13:16:32.959905 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee46d28d-61ac-41d0-95da-9a66bf7c4ed7-config-data\") pod \"ee46d28d-61ac-41d0-95da-9a66bf7c4ed7\" (UID: \"ee46d28d-61ac-41d0-95da-9a66bf7c4ed7\") " Oct 08 13:16:32 crc kubenswrapper[4735]: I1008 13:16:32.959933 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee46d28d-61ac-41d0-95da-9a66bf7c4ed7-log-httpd\") pod \"ee46d28d-61ac-41d0-95da-9a66bf7c4ed7\" (UID: \"ee46d28d-61ac-41d0-95da-9a66bf7c4ed7\") " Oct 08 13:16:32 crc kubenswrapper[4735]: I1008 13:16:32.960052 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee46d28d-61ac-41d0-95da-9a66bf7c4ed7-scripts\") pod \"ee46d28d-61ac-41d0-95da-9a66bf7c4ed7\" (UID: \"ee46d28d-61ac-41d0-95da-9a66bf7c4ed7\") " Oct 08 13:16:32 crc kubenswrapper[4735]: I1008 13:16:32.961734 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee46d28d-61ac-41d0-95da-9a66bf7c4ed7-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ee46d28d-61ac-41d0-95da-9a66bf7c4ed7" (UID: "ee46d28d-61ac-41d0-95da-9a66bf7c4ed7"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:16:32 crc kubenswrapper[4735]: I1008 13:16:32.963632 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee46d28d-61ac-41d0-95da-9a66bf7c4ed7-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ee46d28d-61ac-41d0-95da-9a66bf7c4ed7" (UID: "ee46d28d-61ac-41d0-95da-9a66bf7c4ed7"). 
InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:16:32 crc kubenswrapper[4735]: I1008 13:16:32.967503 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee46d28d-61ac-41d0-95da-9a66bf7c4ed7-kube-api-access-5gwp6" (OuterVolumeSpecName: "kube-api-access-5gwp6") pod "ee46d28d-61ac-41d0-95da-9a66bf7c4ed7" (UID: "ee46d28d-61ac-41d0-95da-9a66bf7c4ed7"). InnerVolumeSpecName "kube-api-access-5gwp6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:16:32 crc kubenswrapper[4735]: I1008 13:16:32.969329 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee46d28d-61ac-41d0-95da-9a66bf7c4ed7-scripts" (OuterVolumeSpecName: "scripts") pod "ee46d28d-61ac-41d0-95da-9a66bf7c4ed7" (UID: "ee46d28d-61ac-41d0-95da-9a66bf7c4ed7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:16:32 crc kubenswrapper[4735]: I1008 13:16:32.974109 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7c7cdb8d88-4pp88" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.062347 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4aa1710f-6005-4ce1-901f-d71442736e68-config-data-custom\") pod \"4aa1710f-6005-4ce1-901f-d71442736e68\" (UID: \"4aa1710f-6005-4ce1-901f-d71442736e68\") " Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.062450 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4aa1710f-6005-4ce1-901f-d71442736e68-logs\") pod \"4aa1710f-6005-4ce1-901f-d71442736e68\" (UID: \"4aa1710f-6005-4ce1-901f-d71442736e68\") " Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.062502 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4aa1710f-6005-4ce1-901f-d71442736e68-config-data\") pod \"4aa1710f-6005-4ce1-901f-d71442736e68\" (UID: \"4aa1710f-6005-4ce1-901f-d71442736e68\") " Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.062646 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4aa1710f-6005-4ce1-901f-d71442736e68-combined-ca-bundle\") pod \"4aa1710f-6005-4ce1-901f-d71442736e68\" (UID: \"4aa1710f-6005-4ce1-901f-d71442736e68\") " Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.062695 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khxfc\" (UniqueName: \"kubernetes.io/projected/4aa1710f-6005-4ce1-901f-d71442736e68-kube-api-access-khxfc\") pod \"4aa1710f-6005-4ce1-901f-d71442736e68\" (UID: \"4aa1710f-6005-4ce1-901f-d71442736e68\") " Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.064110 4735 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/ee46d28d-61ac-41d0-95da-9a66bf7c4ed7-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.064137 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee46d28d-61ac-41d0-95da-9a66bf7c4ed7-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.064152 4735 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee46d28d-61ac-41d0-95da-9a66bf7c4ed7-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.064162 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gwp6\" (UniqueName: \"kubernetes.io/projected/ee46d28d-61ac-41d0-95da-9a66bf7c4ed7-kube-api-access-5gwp6\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.064715 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4aa1710f-6005-4ce1-901f-d71442736e68-logs" (OuterVolumeSpecName: "logs") pod "4aa1710f-6005-4ce1-901f-d71442736e68" (UID: "4aa1710f-6005-4ce1-901f-d71442736e68"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.074504 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5dc68bd5-89cn6" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.075429 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4aa1710f-6005-4ce1-901f-d71442736e68-kube-api-access-khxfc" (OuterVolumeSpecName: "kube-api-access-khxfc") pod "4aa1710f-6005-4ce1-901f-d71442736e68" (UID: "4aa1710f-6005-4ce1-901f-d71442736e68"). InnerVolumeSpecName "kube-api-access-khxfc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.101451 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4aa1710f-6005-4ce1-901f-d71442736e68-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "4aa1710f-6005-4ce1-901f-d71442736e68" (UID: "4aa1710f-6005-4ce1-901f-d71442736e68"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.169176 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7-ovsdbserver-nb\") pod \"d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7\" (UID: \"d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7\") " Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.169281 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7-ovsdbserver-sb\") pod \"d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7\" (UID: \"d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7\") " Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.169337 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7-config\") pod \"d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7\" (UID: \"d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7\") " Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.169391 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7-dns-swift-storage-0\") pod \"d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7\" (UID: \"d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7\") " Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.169420 4735 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmrth\" (UniqueName: \"kubernetes.io/projected/d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7-kube-api-access-wmrth\") pod \"d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7\" (UID: \"d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7\") " Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.169550 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7-dns-svc\") pod \"d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7\" (UID: \"d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7\") " Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.170048 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khxfc\" (UniqueName: \"kubernetes.io/projected/4aa1710f-6005-4ce1-901f-d71442736e68-kube-api-access-khxfc\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.170070 4735 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4aa1710f-6005-4ce1-901f-d71442736e68-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.170082 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4aa1710f-6005-4ce1-901f-d71442736e68-logs\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.226425 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7-kube-api-access-wmrth" (OuterVolumeSpecName: "kube-api-access-wmrth") pod "d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7" (UID: "d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7"). InnerVolumeSpecName "kube-api-access-wmrth". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.237064 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-cqm49"] Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.266992 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-49lbk"] Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.272865 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmrth\" (UniqueName: \"kubernetes.io/projected/d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7-kube-api-access-wmrth\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.303924 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="af8d6fa0-e75f-40fe-92e9-284df31eae95" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.154:9292/healthcheck\": dial tcp 10.217.0.154:9292: connect: connection refused" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.304263 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="af8d6fa0-e75f-40fe-92e9-284df31eae95" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.154:9292/healthcheck\": dial tcp 10.217.0.154:9292: connect: connection refused" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.360328 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee46d28d-61ac-41d0-95da-9a66bf7c4ed7-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ee46d28d-61ac-41d0-95da-9a66bf7c4ed7" (UID: "ee46d28d-61ac-41d0-95da-9a66bf7c4ed7"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.382907 4735 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ee46d28d-61ac-41d0-95da-9a66bf7c4ed7-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.387309 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4aa1710f-6005-4ce1-901f-d71442736e68-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4aa1710f-6005-4ce1-901f-d71442736e68" (UID: "4aa1710f-6005-4ce1-901f-d71442736e68"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.485375 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4aa1710f-6005-4ce1-901f-d71442736e68-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.553813 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4aa1710f-6005-4ce1-901f-d71442736e68-config-data" (OuterVolumeSpecName: "config-data") pod "4aa1710f-6005-4ce1-901f-d71442736e68" (UID: "4aa1710f-6005-4ce1-901f-d71442736e68"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.595885 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4aa1710f-6005-4ce1-901f-d71442736e68-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.667410 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-9h9hg"] Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.738542 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 08 13:16:33 crc kubenswrapper[4735]: E1008 13:16:33.739261 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee46d28d-61ac-41d0-95da-9a66bf7c4ed7" containerName="sg-core" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.739276 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee46d28d-61ac-41d0-95da-9a66bf7c4ed7" containerName="sg-core" Oct 08 13:16:33 crc kubenswrapper[4735]: E1008 13:16:33.739304 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4aa1710f-6005-4ce1-901f-d71442736e68" containerName="barbican-api" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.739312 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aa1710f-6005-4ce1-901f-d71442736e68" containerName="barbican-api" Oct 08 13:16:33 crc kubenswrapper[4735]: E1008 13:16:33.739336 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dd43efe-433d-4590-bcf9-3304cd6e32fb" containerName="cinder-db-sync" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.739344 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dd43efe-433d-4590-bcf9-3304cd6e32fb" containerName="cinder-db-sync" Oct 08 13:16:33 crc kubenswrapper[4735]: E1008 13:16:33.739362 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4aa1710f-6005-4ce1-901f-d71442736e68" containerName="barbican-api-log" 
Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.739372 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aa1710f-6005-4ce1-901f-d71442736e68" containerName="barbican-api-log" Oct 08 13:16:33 crc kubenswrapper[4735]: E1008 13:16:33.739384 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee46d28d-61ac-41d0-95da-9a66bf7c4ed7" containerName="ceilometer-central-agent" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.739392 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee46d28d-61ac-41d0-95da-9a66bf7c4ed7" containerName="ceilometer-central-agent" Oct 08 13:16:33 crc kubenswrapper[4735]: E1008 13:16:33.739402 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee46d28d-61ac-41d0-95da-9a66bf7c4ed7" containerName="ceilometer-notification-agent" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.739409 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee46d28d-61ac-41d0-95da-9a66bf7c4ed7" containerName="ceilometer-notification-agent" Oct 08 13:16:33 crc kubenswrapper[4735]: E1008 13:16:33.739428 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee46d28d-61ac-41d0-95da-9a66bf7c4ed7" containerName="proxy-httpd" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.739455 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee46d28d-61ac-41d0-95da-9a66bf7c4ed7" containerName="proxy-httpd" Oct 08 13:16:33 crc kubenswrapper[4735]: E1008 13:16:33.739464 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7" containerName="init" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.739485 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7" containerName="init" Oct 08 13:16:33 crc kubenswrapper[4735]: E1008 13:16:33.739494 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7" containerName="dnsmasq-dns" 
Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.739502 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7" containerName="dnsmasq-dns" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.739703 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee46d28d-61ac-41d0-95da-9a66bf7c4ed7" containerName="ceilometer-notification-agent" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.739723 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="4aa1710f-6005-4ce1-901f-d71442736e68" containerName="barbican-api-log" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.739731 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee46d28d-61ac-41d0-95da-9a66bf7c4ed7" containerName="sg-core" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.739740 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dd43efe-433d-4590-bcf9-3304cd6e32fb" containerName="cinder-db-sync" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.739747 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee46d28d-61ac-41d0-95da-9a66bf7c4ed7" containerName="proxy-httpd" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.739756 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7" containerName="dnsmasq-dns" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.739774 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="4aa1710f-6005-4ce1-901f-d71442736e68" containerName="barbican-api" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.739786 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee46d28d-61ac-41d0-95da-9a66bf7c4ed7" containerName="ceilometer-central-agent" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.740990 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.760850 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.783658 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.783837 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-vzr64" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.783996 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.787851 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.798910 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84bd785c49-4rn4d"] Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.800536 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84bd785c49-4rn4d" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.822049 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee\") " pod="openstack/cinder-scheduler-0" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.822377 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee-config-data\") pod \"cinder-scheduler-0\" (UID: \"8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee\") " pod="openstack/cinder-scheduler-0" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.822513 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee\") " pod="openstack/cinder-scheduler-0" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.822543 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee\") " pod="openstack/cinder-scheduler-0" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.822679 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx7lw\" (UniqueName: \"kubernetes.io/projected/8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee-kube-api-access-xx7lw\") pod \"cinder-scheduler-0\" (UID: \"8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee\") " 
pod="openstack/cinder-scheduler-0" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.822713 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee-scripts\") pod \"cinder-scheduler-0\" (UID: \"8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee\") " pod="openstack/cinder-scheduler-0" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.823010 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84bd785c49-4rn4d"] Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.835267 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee46d28d-61ac-41d0-95da-9a66bf7c4ed7","Type":"ContainerDied","Data":"91c3850af88b3796508c3f24ce83d82adaf9553bb6e8ec39c3b8f7a46acdd12a"} Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.835321 4735 scope.go:117] "RemoveContainer" containerID="965f48a9d18436349fa306565e27d670df85717967b4f8a617abda501c600caf" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.835499 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.862041 4735 generic.go:334] "Generic (PLEG): container finished" podID="add5914b-96e1-4d52-8648-d3429ea33b0d" containerID="9ee54bf2aa7fa2d2788207107b556c20ed7c52eb424aeccd9ddf2b2c0d2f7eb9" exitCode=2 Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.862144 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"add5914b-96e1-4d52-8648-d3429ea33b0d","Type":"ContainerDied","Data":"9ee54bf2aa7fa2d2788207107b556c20ed7c52eb424aeccd9ddf2b2c0d2f7eb9"} Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.894596 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7c7cdb8d88-4pp88" event={"ID":"4aa1710f-6005-4ce1-901f-d71442736e68","Type":"ContainerDied","Data":"95c4126a635fa6e5ab4529c2917a5b400b1d70f3715113ca49b64185a64f61d2"} Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.894739 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7c7cdb8d88-4pp88" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.926194 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6819ca75-acc5-4a08-93d0-c58685908257-dns-swift-storage-0\") pod \"dnsmasq-dns-84bd785c49-4rn4d\" (UID: \"6819ca75-acc5-4a08-93d0-c58685908257\") " pod="openstack/dnsmasq-dns-84bd785c49-4rn4d" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.926256 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6819ca75-acc5-4a08-93d0-c58685908257-ovsdbserver-nb\") pod \"dnsmasq-dns-84bd785c49-4rn4d\" (UID: \"6819ca75-acc5-4a08-93d0-c58685908257\") " pod="openstack/dnsmasq-dns-84bd785c49-4rn4d" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.926311 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6819ca75-acc5-4a08-93d0-c58685908257-dns-svc\") pod \"dnsmasq-dns-84bd785c49-4rn4d\" (UID: \"6819ca75-acc5-4a08-93d0-c58685908257\") " pod="openstack/dnsmasq-dns-84bd785c49-4rn4d" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.926361 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee\") " pod="openstack/cinder-scheduler-0" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.926386 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsghf\" (UniqueName: \"kubernetes.io/projected/6819ca75-acc5-4a08-93d0-c58685908257-kube-api-access-wsghf\") pod \"dnsmasq-dns-84bd785c49-4rn4d\" (UID: 
\"6819ca75-acc5-4a08-93d0-c58685908257\") " pod="openstack/dnsmasq-dns-84bd785c49-4rn4d" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.926547 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6819ca75-acc5-4a08-93d0-c58685908257-config\") pod \"dnsmasq-dns-84bd785c49-4rn4d\" (UID: \"6819ca75-acc5-4a08-93d0-c58685908257\") " pod="openstack/dnsmasq-dns-84bd785c49-4rn4d" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.926585 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee-config-data\") pod \"cinder-scheduler-0\" (UID: \"8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee\") " pod="openstack/cinder-scheduler-0" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.926619 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee\") " pod="openstack/cinder-scheduler-0" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.926643 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee\") " pod="openstack/cinder-scheduler-0" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.926669 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xx7lw\" (UniqueName: \"kubernetes.io/projected/8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee-kube-api-access-xx7lw\") pod \"cinder-scheduler-0\" (UID: \"8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee\") " pod="openstack/cinder-scheduler-0" Oct 08 13:16:33 crc 
kubenswrapper[4735]: I1008 13:16:33.926701 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee-scripts\") pod \"cinder-scheduler-0\" (UID: \"8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee\") " pod="openstack/cinder-scheduler-0" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.926726 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6819ca75-acc5-4a08-93d0-c58685908257-ovsdbserver-sb\") pod \"dnsmasq-dns-84bd785c49-4rn4d\" (UID: \"6819ca75-acc5-4a08-93d0-c58685908257\") " pod="openstack/dnsmasq-dns-84bd785c49-4rn4d" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.929165 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee\") " pod="openstack/cinder-scheduler-0" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.936399 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-675db45d77-s6pvt" event={"ID":"5516bd19-016a-4d52-84c9-b3801e1841ca","Type":"ContainerStarted","Data":"c41fba7730a1ea0efa3f14c62882c27ae55360beb38b9829009593be86e45cb8"} Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.937515 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-675db45d77-s6pvt" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.937548 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-675db45d77-s6pvt" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.976212 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7-ovsdbserver-sb" 
(OuterVolumeSpecName: "ovsdbserver-sb") pod "d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7" (UID: "d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.988175 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.989933 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.993890 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.994040 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"27af208c-711b-4423-bae1-91b55e84e14a","Type":"ContainerStarted","Data":"ce728a0dd44fff56f85bc83d0ecfb0de2a06e0305143dfb8dc7aa43bf3d0dbe6"} Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.995790 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee\") " pod="openstack/cinder-scheduler-0" Oct 08 13:16:33 crc kubenswrapper[4735]: I1008 13:16:33.996094 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx7lw\" (UniqueName: \"kubernetes.io/projected/8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee-kube-api-access-xx7lw\") pod \"cinder-scheduler-0\" (UID: \"8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee\") " pod="openstack/cinder-scheduler-0" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.011142 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee-scripts\") pod 
\"cinder-scheduler-0\" (UID: \"8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee\") " pod="openstack/cinder-scheduler-0" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.013581 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee-config-data\") pod \"cinder-scheduler-0\" (UID: \"8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee\") " pod="openstack/cinder-scheduler-0" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.018187 4735 generic.go:334] "Generic (PLEG): container finished" podID="af8d6fa0-e75f-40fe-92e9-284df31eae95" containerID="1d0e5a0f7e8bf01a768ba2a2637fc04c6305e74a200d36d9b6c9989a59e6e6ed" exitCode=0 Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.018238 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"af8d6fa0-e75f-40fe-92e9-284df31eae95","Type":"ContainerDied","Data":"1d0e5a0f7e8bf01a768ba2a2637fc04c6305e74a200d36d9b6c9989a59e6e6ed"} Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.018529 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee\") " pod="openstack/cinder-scheduler-0" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.019569 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-cqm49" event={"ID":"85383db5-f6cc-433b-b8d9-7b5397aeadd4","Type":"ContainerStarted","Data":"d43d2eca9e2d5078166ef4561d8ac155c46687584345cade59878d27243a3812"} Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.024377 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.027726 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/swift-proxy-675db45d77-s6pvt" podStartSLOduration=11.027707576 podStartE2EDuration="11.027707576s" podCreationTimestamp="2025-10-08 13:16:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 13:16:33.993404797 +0000 UTC m=+1243.801709351" watchObservedRunningTime="2025-10-08 13:16:34.027707576 +0000 UTC m=+1243.836012120" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.033437 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-49lbk" event={"ID":"e203cdd5-6f7f-4e2e-9be1-3d467c6005f1","Type":"ContainerStarted","Data":"fa3be51aaaf46b6750350bb7c0e6d7625cc9ff357eb51fd7a75eeeea7f3fb3a6"} Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.034752 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsghf\" (UniqueName: \"kubernetes.io/projected/6819ca75-acc5-4a08-93d0-c58685908257-kube-api-access-wsghf\") pod \"dnsmasq-dns-84bd785c49-4rn4d\" (UID: \"6819ca75-acc5-4a08-93d0-c58685908257\") " pod="openstack/dnsmasq-dns-84bd785c49-4rn4d" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.034862 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6819ca75-acc5-4a08-93d0-c58685908257-config\") pod \"dnsmasq-dns-84bd785c49-4rn4d\" (UID: \"6819ca75-acc5-4a08-93d0-c58685908257\") " pod="openstack/dnsmasq-dns-84bd785c49-4rn4d" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.034913 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6819ca75-acc5-4a08-93d0-c58685908257-ovsdbserver-sb\") pod \"dnsmasq-dns-84bd785c49-4rn4d\" (UID: \"6819ca75-acc5-4a08-93d0-c58685908257\") " pod="openstack/dnsmasq-dns-84bd785c49-4rn4d" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.034990 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6819ca75-acc5-4a08-93d0-c58685908257-dns-swift-storage-0\") pod \"dnsmasq-dns-84bd785c49-4rn4d\" (UID: \"6819ca75-acc5-4a08-93d0-c58685908257\") " pod="openstack/dnsmasq-dns-84bd785c49-4rn4d" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.035032 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6819ca75-acc5-4a08-93d0-c58685908257-ovsdbserver-nb\") pod \"dnsmasq-dns-84bd785c49-4rn4d\" (UID: \"6819ca75-acc5-4a08-93d0-c58685908257\") " pod="openstack/dnsmasq-dns-84bd785c49-4rn4d" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.035099 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6819ca75-acc5-4a08-93d0-c58685908257-dns-svc\") pod \"dnsmasq-dns-84bd785c49-4rn4d\" (UID: \"6819ca75-acc5-4a08-93d0-c58685908257\") " pod="openstack/dnsmasq-dns-84bd785c49-4rn4d" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.035153 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.036202 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6819ca75-acc5-4a08-93d0-c58685908257-config\") pod \"dnsmasq-dns-84bd785c49-4rn4d\" (UID: \"6819ca75-acc5-4a08-93d0-c58685908257\") " pod="openstack/dnsmasq-dns-84bd785c49-4rn4d" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.038184 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6819ca75-acc5-4a08-93d0-c58685908257-ovsdbserver-sb\") pod 
\"dnsmasq-dns-84bd785c49-4rn4d\" (UID: \"6819ca75-acc5-4a08-93d0-c58685908257\") " pod="openstack/dnsmasq-dns-84bd785c49-4rn4d" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.038632 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6819ca75-acc5-4a08-93d0-c58685908257-dns-svc\") pod \"dnsmasq-dns-84bd785c49-4rn4d\" (UID: \"6819ca75-acc5-4a08-93d0-c58685908257\") " pod="openstack/dnsmasq-dns-84bd785c49-4rn4d" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.038847 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6819ca75-acc5-4a08-93d0-c58685908257-ovsdbserver-nb\") pod \"dnsmasq-dns-84bd785c49-4rn4d\" (UID: \"6819ca75-acc5-4a08-93d0-c58685908257\") " pod="openstack/dnsmasq-dns-84bd785c49-4rn4d" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.047720 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6819ca75-acc5-4a08-93d0-c58685908257-dns-swift-storage-0\") pod \"dnsmasq-dns-84bd785c49-4rn4d\" (UID: \"6819ca75-acc5-4a08-93d0-c58685908257\") " pod="openstack/dnsmasq-dns-84bd785c49-4rn4d" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.051557 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7-config" (OuterVolumeSpecName: "config") pod "d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7" (UID: "d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.062147 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dc68bd5-89cn6" event={"ID":"d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7","Type":"ContainerDied","Data":"eb7dd8c8243482cba024f1e98fde2bf4eff726e393bd94dfd14c1c82abf76ed7"} Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.062245 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5dc68bd5-89cn6" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.062247 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.014016457 podStartE2EDuration="16.06223494s" podCreationTimestamp="2025-10-08 13:16:18 +0000 UTC" firstStartedPulling="2025-10-08 13:16:19.415784101 +0000 UTC m=+1229.224088645" lastFinishedPulling="2025-10-08 13:16:32.464002584 +0000 UTC m=+1242.272307128" observedRunningTime="2025-10-08 13:16:34.033525786 +0000 UTC m=+1243.841830330" watchObservedRunningTime="2025-10-08 13:16:34.06223494 +0000 UTC m=+1243.870539484" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.100114 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsghf\" (UniqueName: \"kubernetes.io/projected/6819ca75-acc5-4a08-93d0-c58685908257-kube-api-access-wsghf\") pod \"dnsmasq-dns-84bd785c49-4rn4d\" (UID: \"6819ca75-acc5-4a08-93d0-c58685908257\") " pod="openstack/dnsmasq-dns-84bd785c49-4rn4d" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.100557 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7" (UID: "d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.110820 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7" (UID: "d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.119374 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee46d28d-61ac-41d0-95da-9a66bf7c4ed7-config-data" (OuterVolumeSpecName: "config-data") pod "ee46d28d-61ac-41d0-95da-9a66bf7c4ed7" (UID: "ee46d28d-61ac-41d0-95da-9a66bf7c4ed7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.126100 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee46d28d-61ac-41d0-95da-9a66bf7c4ed7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ee46d28d-61ac-41d0-95da-9a66bf7c4ed7" (UID: "ee46d28d-61ac-41d0-95da-9a66bf7c4ed7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.132668 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7" (UID: "d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.144811 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1670833a-54b6-4088-bcf1-db1daecd4fb4-config-data-custom\") pod \"cinder-api-0\" (UID: \"1670833a-54b6-4088-bcf1-db1daecd4fb4\") " pod="openstack/cinder-api-0" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.144931 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1670833a-54b6-4088-bcf1-db1daecd4fb4-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1670833a-54b6-4088-bcf1-db1daecd4fb4\") " pod="openstack/cinder-api-0" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.145140 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1670833a-54b6-4088-bcf1-db1daecd4fb4-logs\") pod \"cinder-api-0\" (UID: \"1670833a-54b6-4088-bcf1-db1daecd4fb4\") " pod="openstack/cinder-api-0" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.145171 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1670833a-54b6-4088-bcf1-db1daecd4fb4-config-data\") pod \"cinder-api-0\" (UID: \"1670833a-54b6-4088-bcf1-db1daecd4fb4\") " pod="openstack/cinder-api-0" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.145209 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1670833a-54b6-4088-bcf1-db1daecd4fb4-scripts\") pod \"cinder-api-0\" (UID: \"1670833a-54b6-4088-bcf1-db1daecd4fb4\") " pod="openstack/cinder-api-0" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.145232 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7ww8\" (UniqueName: \"kubernetes.io/projected/1670833a-54b6-4088-bcf1-db1daecd4fb4-kube-api-access-b7ww8\") pod \"cinder-api-0\" (UID: \"1670833a-54b6-4088-bcf1-db1daecd4fb4\") " pod="openstack/cinder-api-0" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.145267 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1670833a-54b6-4088-bcf1-db1daecd4fb4-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1670833a-54b6-4088-bcf1-db1daecd4fb4\") " pod="openstack/cinder-api-0" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.145375 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.145391 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7-config\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.145406 4735 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.145418 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee46d28d-61ac-41d0-95da-9a66bf7c4ed7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.145429 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee46d28d-61ac-41d0-95da-9a66bf7c4ed7-config-data\") on node \"crc\" DevicePath 
\"\"" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.145442 4735 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.246502 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1670833a-54b6-4088-bcf1-db1daecd4fb4-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1670833a-54b6-4088-bcf1-db1daecd4fb4\") " pod="openstack/cinder-api-0" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.246593 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1670833a-54b6-4088-bcf1-db1daecd4fb4-config-data-custom\") pod \"cinder-api-0\" (UID: \"1670833a-54b6-4088-bcf1-db1daecd4fb4\") " pod="openstack/cinder-api-0" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.246629 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1670833a-54b6-4088-bcf1-db1daecd4fb4-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1670833a-54b6-4088-bcf1-db1daecd4fb4\") " pod="openstack/cinder-api-0" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.246691 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1670833a-54b6-4088-bcf1-db1daecd4fb4-logs\") pod \"cinder-api-0\" (UID: \"1670833a-54b6-4088-bcf1-db1daecd4fb4\") " pod="openstack/cinder-api-0" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.246708 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1670833a-54b6-4088-bcf1-db1daecd4fb4-config-data\") pod \"cinder-api-0\" (UID: \"1670833a-54b6-4088-bcf1-db1daecd4fb4\") " 
pod="openstack/cinder-api-0" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.246728 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1670833a-54b6-4088-bcf1-db1daecd4fb4-scripts\") pod \"cinder-api-0\" (UID: \"1670833a-54b6-4088-bcf1-db1daecd4fb4\") " pod="openstack/cinder-api-0" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.246745 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7ww8\" (UniqueName: \"kubernetes.io/projected/1670833a-54b6-4088-bcf1-db1daecd4fb4-kube-api-access-b7ww8\") pod \"cinder-api-0\" (UID: \"1670833a-54b6-4088-bcf1-db1daecd4fb4\") " pod="openstack/cinder-api-0" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.247045 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1670833a-54b6-4088-bcf1-db1daecd4fb4-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1670833a-54b6-4088-bcf1-db1daecd4fb4\") " pod="openstack/cinder-api-0" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.248128 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1670833a-54b6-4088-bcf1-db1daecd4fb4-logs\") pod \"cinder-api-0\" (UID: \"1670833a-54b6-4088-bcf1-db1daecd4fb4\") " pod="openstack/cinder-api-0" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.254389 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1670833a-54b6-4088-bcf1-db1daecd4fb4-config-data-custom\") pod \"cinder-api-0\" (UID: \"1670833a-54b6-4088-bcf1-db1daecd4fb4\") " pod="openstack/cinder-api-0" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.254961 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1670833a-54b6-4088-bcf1-db1daecd4fb4-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1670833a-54b6-4088-bcf1-db1daecd4fb4\") " pod="openstack/cinder-api-0" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.257215 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1670833a-54b6-4088-bcf1-db1daecd4fb4-scripts\") pod \"cinder-api-0\" (UID: \"1670833a-54b6-4088-bcf1-db1daecd4fb4\") " pod="openstack/cinder-api-0" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.276809 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7ww8\" (UniqueName: \"kubernetes.io/projected/1670833a-54b6-4088-bcf1-db1daecd4fb4-kube-api-access-b7ww8\") pod \"cinder-api-0\" (UID: \"1670833a-54b6-4088-bcf1-db1daecd4fb4\") " pod="openstack/cinder-api-0" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.277811 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1670833a-54b6-4088-bcf1-db1daecd4fb4-config-data\") pod \"cinder-api-0\" (UID: \"1670833a-54b6-4088-bcf1-db1daecd4fb4\") " pod="openstack/cinder-api-0" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.374709 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="2ee831ac-1cdb-4271-b7a3-054313f1d20b" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.153:9292/healthcheck\": read tcp 10.217.0.2:51492->10.217.0.153:9292: read: connection reset by peer" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.374766 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="2ee831ac-1cdb-4271-b7a3-054313f1d20b" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.153:9292/healthcheck\": read tcp 10.217.0.2:51498->10.217.0.153:9292: read: connection reset 
by peer" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.393099 4735 scope.go:117] "RemoveContainer" containerID="7694171eb43247b8cd15f87f146618756899c9babaf01cc45fc5d4d7c78cea6a" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.505333 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.598997 4735 scope.go:117] "RemoveContainer" containerID="c4630f113a494d80dafdc30d9af150e66a89aa014788f6211dc33731b3b851a2" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.627326 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.637562 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.639884 4735 scope.go:117] "RemoveContainer" containerID="22c487685c2fc6697154b1f57342d68c199c65dfda42b1c9b316dbf6547b7d38" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.646769 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5dc68bd5-89cn6"] Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.653873 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x86hz\" (UniqueName: \"kubernetes.io/projected/add5914b-96e1-4d52-8648-d3429ea33b0d-kube-api-access-x86hz\") pod \"add5914b-96e1-4d52-8648-d3429ea33b0d\" (UID: \"add5914b-96e1-4d52-8648-d3429ea33b0d\") " Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.661975 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5dc68bd5-89cn6"] Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.668315 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/add5914b-96e1-4d52-8648-d3429ea33b0d-kube-api-access-x86hz" (OuterVolumeSpecName: "kube-api-access-x86hz") pod 
"add5914b-96e1-4d52-8648-d3429ea33b0d" (UID: "add5914b-96e1-4d52-8648-d3429ea33b0d"). InnerVolumeSpecName "kube-api-access-x86hz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.670501 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.674046 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7c7cdb8d88-4pp88"] Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.689613 4735 scope.go:117] "RemoveContainer" containerID="33f936d2332ce1aa95d93d854bd325a0ca6212966a7597aabdad32083ac2b2dd" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.714936 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-7c7cdb8d88-4pp88"] Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.730625 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 08 13:16:34 crc kubenswrapper[4735]: E1008 13:16:34.731148 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="add5914b-96e1-4d52-8648-d3429ea33b0d" containerName="kube-state-metrics" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.731173 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="add5914b-96e1-4d52-8648-d3429ea33b0d" containerName="kube-state-metrics" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.733152 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="add5914b-96e1-4d52-8648-d3429ea33b0d" containerName="kube-state-metrics" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.763243 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x86hz\" (UniqueName: \"kubernetes.io/projected/add5914b-96e1-4d52-8648-d3429ea33b0d-kube-api-access-x86hz\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.766837 4735 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.766965 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.773182 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.773546 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.817729 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84bd785c49-4rn4d" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.855484 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.867602 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.868359 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8wbw\" (UniqueName: \"kubernetes.io/projected/b56f0fbb-2ce4-46f1-b766-a793f8ac780a-kube-api-access-l8wbw\") pod \"ceilometer-0\" (UID: \"b56f0fbb-2ce4-46f1-b766-a793f8ac780a\") " pod="openstack/ceilometer-0" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.868403 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b56f0fbb-2ce4-46f1-b766-a793f8ac780a-scripts\") pod \"ceilometer-0\" (UID: \"b56f0fbb-2ce4-46f1-b766-a793f8ac780a\") " pod="openstack/ceilometer-0" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.868526 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b56f0fbb-2ce4-46f1-b766-a793f8ac780a-run-httpd\") pod \"ceilometer-0\" (UID: \"b56f0fbb-2ce4-46f1-b766-a793f8ac780a\") " pod="openstack/ceilometer-0" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.868554 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b56f0fbb-2ce4-46f1-b766-a793f8ac780a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b56f0fbb-2ce4-46f1-b766-a793f8ac780a\") " pod="openstack/ceilometer-0" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.868589 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b56f0fbb-2ce4-46f1-b766-a793f8ac780a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b56f0fbb-2ce4-46f1-b766-a793f8ac780a\") " pod="openstack/ceilometer-0" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.868623 4735 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b56f0fbb-2ce4-46f1-b766-a793f8ac780a-config-data\") pod \"ceilometer-0\" (UID: \"b56f0fbb-2ce4-46f1-b766-a793f8ac780a\") " pod="openstack/ceilometer-0" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.868654 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b56f0fbb-2ce4-46f1-b766-a793f8ac780a-log-httpd\") pod \"ceilometer-0\" (UID: \"b56f0fbb-2ce4-46f1-b766-a793f8ac780a\") " pod="openstack/ceilometer-0" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.929112 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4aa1710f-6005-4ce1-901f-d71442736e68" path="/var/lib/kubelet/pods/4aa1710f-6005-4ce1-901f-d71442736e68/volumes" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.929771 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7" path="/var/lib/kubelet/pods/d7781b92-9f3c-4c5b-ba4f-a6d7f64f85a7/volumes" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.930336 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee46d28d-61ac-41d0-95da-9a66bf7c4ed7" path="/var/lib/kubelet/pods/ee46d28d-61ac-41d0-95da-9a66bf7c4ed7/volumes" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.964911 4735 scope.go:117] "RemoveContainer" containerID="847162f47f9b5a78983f8926a56887b9861cbdfc9236643123b3318b3f0e9f0a" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.969334 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/af8d6fa0-e75f-40fe-92e9-284df31eae95-httpd-run\") pod \"af8d6fa0-e75f-40fe-92e9-284df31eae95\" (UID: \"af8d6fa0-e75f-40fe-92e9-284df31eae95\") " Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 
13:16:34.969383 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af8d6fa0-e75f-40fe-92e9-284df31eae95-scripts\") pod \"af8d6fa0-e75f-40fe-92e9-284df31eae95\" (UID: \"af8d6fa0-e75f-40fe-92e9-284df31eae95\") " Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.969426 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"af8d6fa0-e75f-40fe-92e9-284df31eae95\" (UID: \"af8d6fa0-e75f-40fe-92e9-284df31eae95\") " Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.969455 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af8d6fa0-e75f-40fe-92e9-284df31eae95-combined-ca-bundle\") pod \"af8d6fa0-e75f-40fe-92e9-284df31eae95\" (UID: \"af8d6fa0-e75f-40fe-92e9-284df31eae95\") " Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.969499 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/af8d6fa0-e75f-40fe-92e9-284df31eae95-public-tls-certs\") pod \"af8d6fa0-e75f-40fe-92e9-284df31eae95\" (UID: \"af8d6fa0-e75f-40fe-92e9-284df31eae95\") " Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.969530 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af8d6fa0-e75f-40fe-92e9-284df31eae95-logs\") pod \"af8d6fa0-e75f-40fe-92e9-284df31eae95\" (UID: \"af8d6fa0-e75f-40fe-92e9-284df31eae95\") " Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.969657 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9f7ws\" (UniqueName: \"kubernetes.io/projected/af8d6fa0-e75f-40fe-92e9-284df31eae95-kube-api-access-9f7ws\") pod \"af8d6fa0-e75f-40fe-92e9-284df31eae95\" (UID: 
\"af8d6fa0-e75f-40fe-92e9-284df31eae95\") " Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.969698 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af8d6fa0-e75f-40fe-92e9-284df31eae95-config-data\") pod \"af8d6fa0-e75f-40fe-92e9-284df31eae95\" (UID: \"af8d6fa0-e75f-40fe-92e9-284df31eae95\") " Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.969900 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b56f0fbb-2ce4-46f1-b766-a793f8ac780a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b56f0fbb-2ce4-46f1-b766-a793f8ac780a\") " pod="openstack/ceilometer-0" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.969959 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b56f0fbb-2ce4-46f1-b766-a793f8ac780a-config-data\") pod \"ceilometer-0\" (UID: \"b56f0fbb-2ce4-46f1-b766-a793f8ac780a\") " pod="openstack/ceilometer-0" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.970006 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b56f0fbb-2ce4-46f1-b766-a793f8ac780a-log-httpd\") pod \"ceilometer-0\" (UID: \"b56f0fbb-2ce4-46f1-b766-a793f8ac780a\") " pod="openstack/ceilometer-0" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.970094 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8wbw\" (UniqueName: \"kubernetes.io/projected/b56f0fbb-2ce4-46f1-b766-a793f8ac780a-kube-api-access-l8wbw\") pod \"ceilometer-0\" (UID: \"b56f0fbb-2ce4-46f1-b766-a793f8ac780a\") " pod="openstack/ceilometer-0" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.970129 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/b56f0fbb-2ce4-46f1-b766-a793f8ac780a-scripts\") pod \"ceilometer-0\" (UID: \"b56f0fbb-2ce4-46f1-b766-a793f8ac780a\") " pod="openstack/ceilometer-0" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.970161 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b56f0fbb-2ce4-46f1-b766-a793f8ac780a-run-httpd\") pod \"ceilometer-0\" (UID: \"b56f0fbb-2ce4-46f1-b766-a793f8ac780a\") " pod="openstack/ceilometer-0" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.970191 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b56f0fbb-2ce4-46f1-b766-a793f8ac780a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b56f0fbb-2ce4-46f1-b766-a793f8ac780a\") " pod="openstack/ceilometer-0" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.971134 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af8d6fa0-e75f-40fe-92e9-284df31eae95-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "af8d6fa0-e75f-40fe-92e9-284df31eae95" (UID: "af8d6fa0-e75f-40fe-92e9-284df31eae95"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.971771 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b56f0fbb-2ce4-46f1-b766-a793f8ac780a-log-httpd\") pod \"ceilometer-0\" (UID: \"b56f0fbb-2ce4-46f1-b766-a793f8ac780a\") " pod="openstack/ceilometer-0" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.974546 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b56f0fbb-2ce4-46f1-b766-a793f8ac780a-run-httpd\") pod \"ceilometer-0\" (UID: \"b56f0fbb-2ce4-46f1-b766-a793f8ac780a\") " pod="openstack/ceilometer-0" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.975169 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af8d6fa0-e75f-40fe-92e9-284df31eae95-logs" (OuterVolumeSpecName: "logs") pod "af8d6fa0-e75f-40fe-92e9-284df31eae95" (UID: "af8d6fa0-e75f-40fe-92e9-284df31eae95"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.980694 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b56f0fbb-2ce4-46f1-b766-a793f8ac780a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b56f0fbb-2ce4-46f1-b766-a793f8ac780a\") " pod="openstack/ceilometer-0" Oct 08 13:16:34 crc kubenswrapper[4735]: I1008 13:16:34.987458 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "af8d6fa0-e75f-40fe-92e9-284df31eae95" (UID: "af8d6fa0-e75f-40fe-92e9-284df31eae95"). InnerVolumeSpecName "local-storage10-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.004856 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b56f0fbb-2ce4-46f1-b766-a793f8ac780a-config-data\") pod \"ceilometer-0\" (UID: \"b56f0fbb-2ce4-46f1-b766-a793f8ac780a\") " pod="openstack/ceilometer-0" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.006189 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b56f0fbb-2ce4-46f1-b766-a793f8ac780a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b56f0fbb-2ce4-46f1-b766-a793f8ac780a\") " pod="openstack/ceilometer-0" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.006282 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af8d6fa0-e75f-40fe-92e9-284df31eae95-scripts" (OuterVolumeSpecName: "scripts") pod "af8d6fa0-e75f-40fe-92e9-284df31eae95" (UID: "af8d6fa0-e75f-40fe-92e9-284df31eae95"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.006409 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af8d6fa0-e75f-40fe-92e9-284df31eae95-kube-api-access-9f7ws" (OuterVolumeSpecName: "kube-api-access-9f7ws") pod "af8d6fa0-e75f-40fe-92e9-284df31eae95" (UID: "af8d6fa0-e75f-40fe-92e9-284df31eae95"). InnerVolumeSpecName "kube-api-access-9f7ws". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.012326 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b56f0fbb-2ce4-46f1-b766-a793f8ac780a-scripts\") pod \"ceilometer-0\" (UID: \"b56f0fbb-2ce4-46f1-b766-a793f8ac780a\") " pod="openstack/ceilometer-0" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.025960 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8wbw\" (UniqueName: \"kubernetes.io/projected/b56f0fbb-2ce4-46f1-b766-a793f8ac780a-kube-api-access-l8wbw\") pod \"ceilometer-0\" (UID: \"b56f0fbb-2ce4-46f1-b766-a793f8ac780a\") " pod="openstack/ceilometer-0" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.028374 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-698444467b-bvdzb" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.034704 4735 scope.go:117] "RemoveContainer" containerID="e359b1b1bb8b10520bccf2003e64ee5c5fe849c5b9fc17e005ca7f7931793bd8" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.047363 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af8d6fa0-e75f-40fe-92e9-284df31eae95-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "af8d6fa0-e75f-40fe-92e9-284df31eae95" (UID: "af8d6fa0-e75f-40fe-92e9-284df31eae95"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.079542 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9f7ws\" (UniqueName: \"kubernetes.io/projected/af8d6fa0-e75f-40fe-92e9-284df31eae95-kube-api-access-9f7ws\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.079570 4735 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/af8d6fa0-e75f-40fe-92e9-284df31eae95-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.079582 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af8d6fa0-e75f-40fe-92e9-284df31eae95-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.079612 4735 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.079624 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af8d6fa0-e75f-40fe-92e9-284df31eae95-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.079634 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af8d6fa0-e75f-40fe-92e9-284df31eae95-logs\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.113542 4735 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.114185 4735 generic.go:334] "Generic (PLEG): container finished" 
podID="e203cdd5-6f7f-4e2e-9be1-3d467c6005f1" containerID="6868ac5109c85f569ae6a98b468fee4259afd8e4d4e83f57546cf9baa70e2595" exitCode=0 Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.114244 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-49lbk" event={"ID":"e203cdd5-6f7f-4e2e-9be1-3d467c6005f1","Type":"ContainerDied","Data":"6868ac5109c85f569ae6a98b468fee4259afd8e4d4e83f57546cf9baa70e2595"} Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.121974 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-675db45d77-s6pvt" event={"ID":"5516bd19-016a-4d52-84c9-b3801e1841ca","Type":"ContainerStarted","Data":"eb379c44e7f01a76229af5fa811391aa9b991614e3a1cc0d30c5cec76be16709"} Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.124489 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"af8d6fa0-e75f-40fe-92e9-284df31eae95","Type":"ContainerDied","Data":"6bfb11afcb1a3a7eedc99e0e4bd2b112e915ca0b0433433cb81444f9ad9a1290"} Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.128691 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.155524 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af8d6fa0-e75f-40fe-92e9-284df31eae95-config-data" (OuterVolumeSpecName: "config-data") pod "af8d6fa0-e75f-40fe-92e9-284df31eae95" (UID: "af8d6fa0-e75f-40fe-92e9-284df31eae95"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.156493 4735 generic.go:334] "Generic (PLEG): container finished" podID="2ee831ac-1cdb-4271-b7a3-054313f1d20b" containerID="195a6ffbbaaa312e7578490fa0e19d1ab01f295407b49eb14d400842de28a9ad" exitCode=0 Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.156605 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2ee831ac-1cdb-4271-b7a3-054313f1d20b","Type":"ContainerDied","Data":"195a6ffbbaaa312e7578490fa0e19d1ab01f295407b49eb14d400842de28a9ad"} Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.160639 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-cqm49" event={"ID":"85383db5-f6cc-433b-b8d9-7b5397aeadd4","Type":"ContainerStarted","Data":"ec5c9443ed714cead8c92fc82a4368ded3008d6ad808a146f6e44d78449d3dce"} Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.164329 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.164333 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"add5914b-96e1-4d52-8648-d3429ea33b0d","Type":"ContainerDied","Data":"f8cbee69405022076c2306e8dec490c25b0c31b087e1c6072aa140749ef931b3"} Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.171308 4735 generic.go:334] "Generic (PLEG): container finished" podID="b5d95702-991f-4f45-94e3-45650c900b1a" containerID="64dfaa064ce1bff941b3220dbad48d2a7c10dcdf642bb16be664f91279669cb6" exitCode=0 Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.171541 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-9h9hg" event={"ID":"b5d95702-991f-4f45-94e3-45650c900b1a","Type":"ContainerDied","Data":"64dfaa064ce1bff941b3220dbad48d2a7c10dcdf642bb16be664f91279669cb6"} Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.171620 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-9h9hg" event={"ID":"b5d95702-991f-4f45-94e3-45650c900b1a","Type":"ContainerStarted","Data":"fe695692fc60a4d01063aaec8110f1596cf1dc4c08afd1eff56975e64f23d430"} Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.177798 4735 generic.go:334] "Generic (PLEG): container finished" podID="daeb6ee4-aafc-4251-9512-43c8cde46c02" containerID="86f7b8c9a6abce4931658ab4a5b12ea9d8b88db248e3e2300a16662f799db233" exitCode=0 Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.177898 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-698444467b-bvdzb" event={"ID":"daeb6ee4-aafc-4251-9512-43c8cde46c02","Type":"ContainerDied","Data":"86f7b8c9a6abce4931658ab4a5b12ea9d8b88db248e3e2300a16662f799db233"} Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.177927 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-698444467b-bvdzb" 
event={"ID":"daeb6ee4-aafc-4251-9512-43c8cde46c02","Type":"ContainerDied","Data":"35833a782344be4576d1fad4630969d8b73b2598b4a07567ddcb025163cab13e"} Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.178019 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-698444467b-bvdzb" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.180416 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jz8xl\" (UniqueName: \"kubernetes.io/projected/daeb6ee4-aafc-4251-9512-43c8cde46c02-kube-api-access-jz8xl\") pod \"daeb6ee4-aafc-4251-9512-43c8cde46c02\" (UID: \"daeb6ee4-aafc-4251-9512-43c8cde46c02\") " Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.180465 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/daeb6ee4-aafc-4251-9512-43c8cde46c02-config\") pod \"daeb6ee4-aafc-4251-9512-43c8cde46c02\" (UID: \"daeb6ee4-aafc-4251-9512-43c8cde46c02\") " Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.180575 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/daeb6ee4-aafc-4251-9512-43c8cde46c02-httpd-config\") pod \"daeb6ee4-aafc-4251-9512-43c8cde46c02\" (UID: \"daeb6ee4-aafc-4251-9512-43c8cde46c02\") " Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.180605 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daeb6ee4-aafc-4251-9512-43c8cde46c02-combined-ca-bundle\") pod \"daeb6ee4-aafc-4251-9512-43c8cde46c02\" (UID: \"daeb6ee4-aafc-4251-9512-43c8cde46c02\") " Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.180707 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/daeb6ee4-aafc-4251-9512-43c8cde46c02-ovndb-tls-certs\") pod \"daeb6ee4-aafc-4251-9512-43c8cde46c02\" (UID: \"daeb6ee4-aafc-4251-9512-43c8cde46c02\") " Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.190209 4735 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.190252 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af8d6fa0-e75f-40fe-92e9-284df31eae95-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.197779 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/daeb6ee4-aafc-4251-9512-43c8cde46c02-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "daeb6ee4-aafc-4251-9512-43c8cde46c02" (UID: "daeb6ee4-aafc-4251-9512-43c8cde46c02"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.208994 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.228669 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af8d6fa0-e75f-40fe-92e9-284df31eae95-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "af8d6fa0-e75f-40fe-92e9-284df31eae95" (UID: "af8d6fa0-e75f-40fe-92e9-284df31eae95"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.233659 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/daeb6ee4-aafc-4251-9512-43c8cde46c02-kube-api-access-jz8xl" (OuterVolumeSpecName: "kube-api-access-jz8xl") pod "daeb6ee4-aafc-4251-9512-43c8cde46c02" (UID: "daeb6ee4-aafc-4251-9512-43c8cde46c02"). InnerVolumeSpecName "kube-api-access-jz8xl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.244247 4735 scope.go:117] "RemoveContainer" containerID="f2d2917edfcdae896262be4cd1de7fce5374bd3c8b3e62fb0b67457014bbb1fa" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.294027 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jz8xl\" (UniqueName: \"kubernetes.io/projected/daeb6ee4-aafc-4251-9512-43c8cde46c02-kube-api-access-jz8xl\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.296326 4735 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/af8d6fa0-e75f-40fe-92e9-284df31eae95-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.296401 4735 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/daeb6ee4-aafc-4251-9512-43c8cde46c02-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.294844 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/daeb6ee4-aafc-4251-9512-43c8cde46c02-config" (OuterVolumeSpecName: "config") pod "daeb6ee4-aafc-4251-9512-43c8cde46c02" (UID: "daeb6ee4-aafc-4251-9512-43c8cde46c02"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.349293 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/daeb6ee4-aafc-4251-9512-43c8cde46c02-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "daeb6ee4-aafc-4251-9512-43c8cde46c02" (UID: "daeb6ee4-aafc-4251-9512-43c8cde46c02"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.399834 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/daeb6ee4-aafc-4251-9512-43c8cde46c02-config\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.399867 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daeb6ee4-aafc-4251-9512-43c8cde46c02-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.449577 4735 scope.go:117] "RemoveContainer" containerID="1d0e5a0f7e8bf01a768ba2a2637fc04c6305e74a200d36d9b6c9989a59e6e6ed" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.485610 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/daeb6ee4-aafc-4251-9512-43c8cde46c02-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "daeb6ee4-aafc-4251-9512-43c8cde46c02" (UID: "daeb6ee4-aafc-4251-9512-43c8cde46c02"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.501186 4735 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/daeb6ee4-aafc-4251-9512-43c8cde46c02-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.518115 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.521808 4735 scope.go:117] "RemoveContainer" containerID="4cdd9cd0992efec7fcd98ab9b6975f92c9411e342b5181c59ac77e76ba43b22d" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.571923 4735 scope.go:117] "RemoveContainer" containerID="9ee54bf2aa7fa2d2788207107b556c20ed7c52eb424aeccd9ddf2b2c0d2f7eb9" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.572355 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.585581 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.596116 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.603576 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2ee831ac-1cdb-4271-b7a3-054313f1d20b-httpd-run\") pod \"2ee831ac-1cdb-4271-b7a3-054313f1d20b\" (UID: \"2ee831ac-1cdb-4271-b7a3-054313f1d20b\") " Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.603676 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"2ee831ac-1cdb-4271-b7a3-054313f1d20b\" (UID: 
\"2ee831ac-1cdb-4271-b7a3-054313f1d20b\") " Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.603731 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ee831ac-1cdb-4271-b7a3-054313f1d20b-scripts\") pod \"2ee831ac-1cdb-4271-b7a3-054313f1d20b\" (UID: \"2ee831ac-1cdb-4271-b7a3-054313f1d20b\") " Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.603774 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ee831ac-1cdb-4271-b7a3-054313f1d20b-internal-tls-certs\") pod \"2ee831ac-1cdb-4271-b7a3-054313f1d20b\" (UID: \"2ee831ac-1cdb-4271-b7a3-054313f1d20b\") " Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.603843 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ee831ac-1cdb-4271-b7a3-054313f1d20b-logs\") pod \"2ee831ac-1cdb-4271-b7a3-054313f1d20b\" (UID: \"2ee831ac-1cdb-4271-b7a3-054313f1d20b\") " Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.603979 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlqmw\" (UniqueName: \"kubernetes.io/projected/2ee831ac-1cdb-4271-b7a3-054313f1d20b-kube-api-access-nlqmw\") pod \"2ee831ac-1cdb-4271-b7a3-054313f1d20b\" (UID: \"2ee831ac-1cdb-4271-b7a3-054313f1d20b\") " Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.607547 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.608817 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ee831ac-1cdb-4271-b7a3-054313f1d20b-logs" (OuterVolumeSpecName: "logs") pod "2ee831ac-1cdb-4271-b7a3-054313f1d20b" (UID: "2ee831ac-1cdb-4271-b7a3-054313f1d20b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.608934 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ee831ac-1cdb-4271-b7a3-054313f1d20b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "2ee831ac-1cdb-4271-b7a3-054313f1d20b" (UID: "2ee831ac-1cdb-4271-b7a3-054313f1d20b"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.613630 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ee831ac-1cdb-4271-b7a3-054313f1d20b-config-data\") pod \"2ee831ac-1cdb-4271-b7a3-054313f1d20b\" (UID: \"2ee831ac-1cdb-4271-b7a3-054313f1d20b\") " Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.613730 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ee831ac-1cdb-4271-b7a3-054313f1d20b-combined-ca-bundle\") pod \"2ee831ac-1cdb-4271-b7a3-054313f1d20b\" (UID: \"2ee831ac-1cdb-4271-b7a3-054313f1d20b\") " Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.614611 4735 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2ee831ac-1cdb-4271-b7a3-054313f1d20b-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.614630 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ee831ac-1cdb-4271-b7a3-054313f1d20b-logs\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.620927 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "2ee831ac-1cdb-4271-b7a3-054313f1d20b" (UID: 
"2ee831ac-1cdb-4271-b7a3-054313f1d20b"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.623176 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ee831ac-1cdb-4271-b7a3-054313f1d20b-scripts" (OuterVolumeSpecName: "scripts") pod "2ee831ac-1cdb-4271-b7a3-054313f1d20b" (UID: "2ee831ac-1cdb-4271-b7a3-054313f1d20b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.627830 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ee831ac-1cdb-4271-b7a3-054313f1d20b-kube-api-access-nlqmw" (OuterVolumeSpecName: "kube-api-access-nlqmw") pod "2ee831ac-1cdb-4271-b7a3-054313f1d20b" (UID: "2ee831ac-1cdb-4271-b7a3-054313f1d20b"). InnerVolumeSpecName "kube-api-access-nlqmw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.658548 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 08 13:16:35 crc kubenswrapper[4735]: E1008 13:16:35.659244 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="daeb6ee4-aafc-4251-9512-43c8cde46c02" containerName="neutron-httpd" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.659259 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="daeb6ee4-aafc-4251-9512-43c8cde46c02" containerName="neutron-httpd" Oct 08 13:16:35 crc kubenswrapper[4735]: E1008 13:16:35.659276 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af8d6fa0-e75f-40fe-92e9-284df31eae95" containerName="glance-httpd" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.659283 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="af8d6fa0-e75f-40fe-92e9-284df31eae95" containerName="glance-httpd" Oct 08 13:16:35 crc kubenswrapper[4735]: 
E1008 13:16:35.659301 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="daeb6ee4-aafc-4251-9512-43c8cde46c02" containerName="neutron-api" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.659308 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="daeb6ee4-aafc-4251-9512-43c8cde46c02" containerName="neutron-api" Oct 08 13:16:35 crc kubenswrapper[4735]: E1008 13:16:35.659319 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ee831ac-1cdb-4271-b7a3-054313f1d20b" containerName="glance-httpd" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.659328 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ee831ac-1cdb-4271-b7a3-054313f1d20b" containerName="glance-httpd" Oct 08 13:16:35 crc kubenswrapper[4735]: E1008 13:16:35.659363 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af8d6fa0-e75f-40fe-92e9-284df31eae95" containerName="glance-log" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.659371 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="af8d6fa0-e75f-40fe-92e9-284df31eae95" containerName="glance-log" Oct 08 13:16:35 crc kubenswrapper[4735]: E1008 13:16:35.659378 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ee831ac-1cdb-4271-b7a3-054313f1d20b" containerName="glance-log" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.659384 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ee831ac-1cdb-4271-b7a3-054313f1d20b" containerName="glance-log" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.659609 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ee831ac-1cdb-4271-b7a3-054313f1d20b" containerName="glance-httpd" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.659634 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="daeb6ee4-aafc-4251-9512-43c8cde46c02" containerName="neutron-httpd" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.659652 4735 
memory_manager.go:354] "RemoveStaleState removing state" podUID="af8d6fa0-e75f-40fe-92e9-284df31eae95" containerName="glance-httpd" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.659660 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="daeb6ee4-aafc-4251-9512-43c8cde46c02" containerName="neutron-api" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.659668 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ee831ac-1cdb-4271-b7a3-054313f1d20b" containerName="glance-log" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.659678 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="af8d6fa0-e75f-40fe-92e9-284df31eae95" containerName="glance-log" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.660400 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.669940 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.670270 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.683030 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.691675 4735 scope.go:117] "RemoveContainer" containerID="0c8e3771261147ed06296c96e0fdece8db2d40b87b023020e5b7ce1f0cc1dca0" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.692732 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.720844 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlqmw\" (UniqueName: \"kubernetes.io/projected/2ee831ac-1cdb-4271-b7a3-054313f1d20b-kube-api-access-nlqmw\") on node 
\"crc\" DevicePath \"\"" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.720898 4735 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.720909 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ee831ac-1cdb-4271-b7a3-054313f1d20b-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.722781 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.724369 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.724611 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ee831ac-1cdb-4271-b7a3-054313f1d20b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2ee831ac-1cdb-4271-b7a3-054313f1d20b" (UID: "2ee831ac-1cdb-4271-b7a3-054313f1d20b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.737732 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.738013 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.770658 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.787622 4735 scope.go:117] "RemoveContainer" containerID="86f7b8c9a6abce4931658ab4a5b12ea9d8b88db248e3e2300a16662f799db233" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.823260 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c0cb7948-9a3d-40fa-b4e7-bb6b765af52f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c0cb7948-9a3d-40fa-b4e7-bb6b765af52f\") " pod="openstack/glance-default-external-api-0" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.823311 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/b8e2103e-dd2c-4e18-b66b-0555a1c8f43a-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"b8e2103e-dd2c-4e18-b66b-0555a1c8f43a\") " pod="openstack/kube-state-metrics-0" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.823329 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0cb7948-9a3d-40fa-b4e7-bb6b765af52f-scripts\") pod \"glance-default-external-api-0\" (UID: \"c0cb7948-9a3d-40fa-b4e7-bb6b765af52f\") " pod="openstack/glance-default-external-api-0" Oct 08 13:16:35 crc 
kubenswrapper[4735]: I1008 13:16:35.823356 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0cb7948-9a3d-40fa-b4e7-bb6b765af52f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c0cb7948-9a3d-40fa-b4e7-bb6b765af52f\") " pod="openstack/glance-default-external-api-0" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.823377 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"c0cb7948-9a3d-40fa-b4e7-bb6b765af52f\") " pod="openstack/glance-default-external-api-0" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.823407 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmgjq\" (UniqueName: \"kubernetes.io/projected/b8e2103e-dd2c-4e18-b66b-0555a1c8f43a-kube-api-access-dmgjq\") pod \"kube-state-metrics-0\" (UID: \"b8e2103e-dd2c-4e18-b66b-0555a1c8f43a\") " pod="openstack/kube-state-metrics-0" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.823425 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8e2103e-dd2c-4e18-b66b-0555a1c8f43a-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"b8e2103e-dd2c-4e18-b66b-0555a1c8f43a\") " pod="openstack/kube-state-metrics-0" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.823459 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0cb7948-9a3d-40fa-b4e7-bb6b765af52f-logs\") pod \"glance-default-external-api-0\" (UID: \"c0cb7948-9a3d-40fa-b4e7-bb6b765af52f\") " pod="openstack/glance-default-external-api-0" Oct 08 13:16:35 crc 
kubenswrapper[4735]: I1008 13:16:35.823492 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0cb7948-9a3d-40fa-b4e7-bb6b765af52f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c0cb7948-9a3d-40fa-b4e7-bb6b765af52f\") " pod="openstack/glance-default-external-api-0" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.823516 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0cb7948-9a3d-40fa-b4e7-bb6b765af52f-config-data\") pod \"glance-default-external-api-0\" (UID: \"c0cb7948-9a3d-40fa-b4e7-bb6b765af52f\") " pod="openstack/glance-default-external-api-0" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.823553 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbkv6\" (UniqueName: \"kubernetes.io/projected/c0cb7948-9a3d-40fa-b4e7-bb6b765af52f-kube-api-access-tbkv6\") pod \"glance-default-external-api-0\" (UID: \"c0cb7948-9a3d-40fa-b4e7-bb6b765af52f\") " pod="openstack/glance-default-external-api-0" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.823574 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8e2103e-dd2c-4e18-b66b-0555a1c8f43a-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"b8e2103e-dd2c-4e18-b66b-0555a1c8f43a\") " pod="openstack/kube-state-metrics-0" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.834843 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ee831ac-1cdb-4271-b7a3-054313f1d20b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.837366 4735 
operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.927860 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ee831ac-1cdb-4271-b7a3-054313f1d20b-config-data" (OuterVolumeSpecName: "config-data") pod "2ee831ac-1cdb-4271-b7a3-054313f1d20b" (UID: "2ee831ac-1cdb-4271-b7a3-054313f1d20b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.935918 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ee831ac-1cdb-4271-b7a3-054313f1d20b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2ee831ac-1cdb-4271-b7a3-054313f1d20b" (UID: "2ee831ac-1cdb-4271-b7a3-054313f1d20b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.937063 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0cb7948-9a3d-40fa-b4e7-bb6b765af52f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c0cb7948-9a3d-40fa-b4e7-bb6b765af52f\") " pod="openstack/glance-default-external-api-0" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.937109 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"c0cb7948-9a3d-40fa-b4e7-bb6b765af52f\") " pod="openstack/glance-default-external-api-0" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.937158 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmgjq\" (UniqueName: 
\"kubernetes.io/projected/b8e2103e-dd2c-4e18-b66b-0555a1c8f43a-kube-api-access-dmgjq\") pod \"kube-state-metrics-0\" (UID: \"b8e2103e-dd2c-4e18-b66b-0555a1c8f43a\") " pod="openstack/kube-state-metrics-0" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.937182 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8e2103e-dd2c-4e18-b66b-0555a1c8f43a-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"b8e2103e-dd2c-4e18-b66b-0555a1c8f43a\") " pod="openstack/kube-state-metrics-0" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.937235 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0cb7948-9a3d-40fa-b4e7-bb6b765af52f-logs\") pod \"glance-default-external-api-0\" (UID: \"c0cb7948-9a3d-40fa-b4e7-bb6b765af52f\") " pod="openstack/glance-default-external-api-0" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.937262 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0cb7948-9a3d-40fa-b4e7-bb6b765af52f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c0cb7948-9a3d-40fa-b4e7-bb6b765af52f\") " pod="openstack/glance-default-external-api-0" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.937297 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0cb7948-9a3d-40fa-b4e7-bb6b765af52f-config-data\") pod \"glance-default-external-api-0\" (UID: \"c0cb7948-9a3d-40fa-b4e7-bb6b765af52f\") " pod="openstack/glance-default-external-api-0" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.937357 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbkv6\" (UniqueName: \"kubernetes.io/projected/c0cb7948-9a3d-40fa-b4e7-bb6b765af52f-kube-api-access-tbkv6\") 
pod \"glance-default-external-api-0\" (UID: \"c0cb7948-9a3d-40fa-b4e7-bb6b765af52f\") " pod="openstack/glance-default-external-api-0" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.937388 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8e2103e-dd2c-4e18-b66b-0555a1c8f43a-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"b8e2103e-dd2c-4e18-b66b-0555a1c8f43a\") " pod="openstack/kube-state-metrics-0" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.937438 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c0cb7948-9a3d-40fa-b4e7-bb6b765af52f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c0cb7948-9a3d-40fa-b4e7-bb6b765af52f\") " pod="openstack/glance-default-external-api-0" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.939996 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0cb7948-9a3d-40fa-b4e7-bb6b765af52f-logs\") pod \"glance-default-external-api-0\" (UID: \"c0cb7948-9a3d-40fa-b4e7-bb6b765af52f\") " pod="openstack/glance-default-external-api-0" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.943939 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"c0cb7948-9a3d-40fa-b4e7-bb6b765af52f\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.943982 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-698444467b-bvdzb"] Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.948156 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/b8e2103e-dd2c-4e18-b66b-0555a1c8f43a-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"b8e2103e-dd2c-4e18-b66b-0555a1c8f43a\") " pod="openstack/kube-state-metrics-0" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.948219 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0cb7948-9a3d-40fa-b4e7-bb6b765af52f-scripts\") pod \"glance-default-external-api-0\" (UID: \"c0cb7948-9a3d-40fa-b4e7-bb6b765af52f\") " pod="openstack/glance-default-external-api-0" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.948366 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ee831ac-1cdb-4271-b7a3-054313f1d20b-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.948383 4735 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.948393 4735 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ee831ac-1cdb-4271-b7a3-054313f1d20b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.953290 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0cb7948-9a3d-40fa-b4e7-bb6b765af52f-config-data\") pod \"glance-default-external-api-0\" (UID: \"c0cb7948-9a3d-40fa-b4e7-bb6b765af52f\") " pod="openstack/glance-default-external-api-0" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.953704 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/c0cb7948-9a3d-40fa-b4e7-bb6b765af52f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c0cb7948-9a3d-40fa-b4e7-bb6b765af52f\") " pod="openstack/glance-default-external-api-0" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.953780 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-698444467b-bvdzb"] Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.958923 4735 scope.go:117] "RemoveContainer" containerID="0c8e3771261147ed06296c96e0fdece8db2d40b87b023020e5b7ce1f0cc1dca0" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.967311 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0cb7948-9a3d-40fa-b4e7-bb6b765af52f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c0cb7948-9a3d-40fa-b4e7-bb6b765af52f\") " pod="openstack/glance-default-external-api-0" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.967509 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8e2103e-dd2c-4e18-b66b-0555a1c8f43a-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"b8e2103e-dd2c-4e18-b66b-0555a1c8f43a\") " pod="openstack/kube-state-metrics-0" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.969420 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmgjq\" (UniqueName: \"kubernetes.io/projected/b8e2103e-dd2c-4e18-b66b-0555a1c8f43a-kube-api-access-dmgjq\") pod \"kube-state-metrics-0\" (UID: \"b8e2103e-dd2c-4e18-b66b-0555a1c8f43a\") " pod="openstack/kube-state-metrics-0" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.969443 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8e2103e-dd2c-4e18-b66b-0555a1c8f43a-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: 
\"b8e2103e-dd2c-4e18-b66b-0555a1c8f43a\") " pod="openstack/kube-state-metrics-0" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.969453 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/b8e2103e-dd2c-4e18-b66b-0555a1c8f43a-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"b8e2103e-dd2c-4e18-b66b-0555a1c8f43a\") " pod="openstack/kube-state-metrics-0" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.972571 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0cb7948-9a3d-40fa-b4e7-bb6b765af52f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c0cb7948-9a3d-40fa-b4e7-bb6b765af52f\") " pod="openstack/glance-default-external-api-0" Oct 08 13:16:35 crc kubenswrapper[4735]: E1008 13:16:35.978723 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c8e3771261147ed06296c96e0fdece8db2d40b87b023020e5b7ce1f0cc1dca0\": container with ID starting with 0c8e3771261147ed06296c96e0fdece8db2d40b87b023020e5b7ce1f0cc1dca0 not found: ID does not exist" containerID="0c8e3771261147ed06296c96e0fdece8db2d40b87b023020e5b7ce1f0cc1dca0" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.978792 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c8e3771261147ed06296c96e0fdece8db2d40b87b023020e5b7ce1f0cc1dca0"} err="failed to get container status \"0c8e3771261147ed06296c96e0fdece8db2d40b87b023020e5b7ce1f0cc1dca0\": rpc error: code = NotFound desc = could not find container \"0c8e3771261147ed06296c96e0fdece8db2d40b87b023020e5b7ce1f0cc1dca0\": container with ID starting with 0c8e3771261147ed06296c96e0fdece8db2d40b87b023020e5b7ce1f0cc1dca0 not found: ID does not exist" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.978827 4735 scope.go:117] 
"RemoveContainer" containerID="86f7b8c9a6abce4931658ab4a5b12ea9d8b88db248e3e2300a16662f799db233" Oct 08 13:16:35 crc kubenswrapper[4735]: I1008 13:16:35.979886 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbkv6\" (UniqueName: \"kubernetes.io/projected/c0cb7948-9a3d-40fa-b4e7-bb6b765af52f-kube-api-access-tbkv6\") pod \"glance-default-external-api-0\" (UID: \"c0cb7948-9a3d-40fa-b4e7-bb6b765af52f\") " pod="openstack/glance-default-external-api-0" Oct 08 13:16:36 crc kubenswrapper[4735]: I1008 13:16:35.999315 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0cb7948-9a3d-40fa-b4e7-bb6b765af52f-scripts\") pod \"glance-default-external-api-0\" (UID: \"c0cb7948-9a3d-40fa-b4e7-bb6b765af52f\") " pod="openstack/glance-default-external-api-0" Oct 08 13:16:36 crc kubenswrapper[4735]: E1008 13:16:35.999488 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86f7b8c9a6abce4931658ab4a5b12ea9d8b88db248e3e2300a16662f799db233\": container with ID starting with 86f7b8c9a6abce4931658ab4a5b12ea9d8b88db248e3e2300a16662f799db233 not found: ID does not exist" containerID="86f7b8c9a6abce4931658ab4a5b12ea9d8b88db248e3e2300a16662f799db233" Oct 08 13:16:36 crc kubenswrapper[4735]: I1008 13:16:35.999523 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86f7b8c9a6abce4931658ab4a5b12ea9d8b88db248e3e2300a16662f799db233"} err="failed to get container status \"86f7b8c9a6abce4931658ab4a5b12ea9d8b88db248e3e2300a16662f799db233\": rpc error: code = NotFound desc = could not find container \"86f7b8c9a6abce4931658ab4a5b12ea9d8b88db248e3e2300a16662f799db233\": container with ID starting with 86f7b8c9a6abce4931658ab4a5b12ea9d8b88db248e3e2300a16662f799db233 not found: ID does not exist" Oct 08 13:16:36 crc kubenswrapper[4735]: I1008 13:16:36.016156 4735 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 08 13:16:36 crc kubenswrapper[4735]: I1008 13:16:36.060741 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84bd785c49-4rn4d"] Oct 08 13:16:36 crc kubenswrapper[4735]: I1008 13:16:36.108838 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"c0cb7948-9a3d-40fa-b4e7-bb6b765af52f\") " pod="openstack/glance-default-external-api-0" Oct 08 13:16:36 crc kubenswrapper[4735]: I1008 13:16:36.149675 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 08 13:16:36 crc kubenswrapper[4735]: I1008 13:16:36.184239 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 13:16:36 crc kubenswrapper[4735]: I1008 13:16:36.209813 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b56f0fbb-2ce4-46f1-b766-a793f8ac780a","Type":"ContainerStarted","Data":"c0819f78ea091dd751dd408a292aa10571cf496ead726edef0c413d6d57c804c"} Oct 08 13:16:36 crc kubenswrapper[4735]: I1008 13:16:36.213467 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee","Type":"ContainerStarted","Data":"bccb983981b1f26098aa949485a54cdd121dc8f152de2d2e5562f744c198bc4b"} Oct 08 13:16:36 crc kubenswrapper[4735]: I1008 13:16:36.302951 4735 generic.go:334] "Generic (PLEG): container finished" podID="85383db5-f6cc-433b-b8d9-7b5397aeadd4" containerID="ec5c9443ed714cead8c92fc82a4368ded3008d6ad808a146f6e44d78449d3dce" exitCode=0 Oct 08 13:16:36 crc kubenswrapper[4735]: I1008 13:16:36.303050 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-cqm49" 
event={"ID":"85383db5-f6cc-433b-b8d9-7b5397aeadd4","Type":"ContainerDied","Data":"ec5c9443ed714cead8c92fc82a4368ded3008d6ad808a146f6e44d78449d3dce"} Oct 08 13:16:36 crc kubenswrapper[4735]: I1008 13:16:36.334328 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1670833a-54b6-4088-bcf1-db1daecd4fb4","Type":"ContainerStarted","Data":"3345e5386e2a0e69fb51f10003ace24cdc8229d7a948c9822abdc39ae7c186ab"} Oct 08 13:16:36 crc kubenswrapper[4735]: I1008 13:16:36.347489 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2ee831ac-1cdb-4271-b7a3-054313f1d20b","Type":"ContainerDied","Data":"e0d263fc6c060d2f656a4bc3db832b593de8e7b3f636cea6a05c47b50de3814f"} Oct 08 13:16:36 crc kubenswrapper[4735]: I1008 13:16:36.347551 4735 scope.go:117] "RemoveContainer" containerID="195a6ffbbaaa312e7578490fa0e19d1ab01f295407b49eb14d400842de28a9ad" Oct 08 13:16:36 crc kubenswrapper[4735]: I1008 13:16:36.347720 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 08 13:16:36 crc kubenswrapper[4735]: I1008 13:16:36.377645 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84bd785c49-4rn4d" event={"ID":"6819ca75-acc5-4a08-93d0-c58685908257","Type":"ContainerStarted","Data":"02297bd6a3ddb02680c567ce4c77f24e12e80b85d8400e43bcbf7c8d5591a531"} Oct 08 13:16:36 crc kubenswrapper[4735]: I1008 13:16:36.387941 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 13:16:36 crc kubenswrapper[4735]: I1008 13:16:36.465698 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 13:16:36 crc kubenswrapper[4735]: I1008 13:16:36.489052 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 13:16:36 crc kubenswrapper[4735]: I1008 13:16:36.519575 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 13:16:36 crc kubenswrapper[4735]: I1008 13:16:36.522330 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 08 13:16:36 crc kubenswrapper[4735]: I1008 13:16:36.529681 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 08 13:16:36 crc kubenswrapper[4735]: I1008 13:16:36.529877 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 08 13:16:36 crc kubenswrapper[4735]: I1008 13:16:36.533957 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 13:16:36 crc kubenswrapper[4735]: I1008 13:16:36.603617 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8c473e5-0132-4ee7-a14e-ada0780d358c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b8c473e5-0132-4ee7-a14e-ada0780d358c\") " pod="openstack/glance-default-internal-api-0" Oct 08 13:16:36 crc kubenswrapper[4735]: I1008 13:16:36.603675 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8c473e5-0132-4ee7-a14e-ada0780d358c-combined-ca-bundle\") pod 
\"glance-default-internal-api-0\" (UID: \"b8c473e5-0132-4ee7-a14e-ada0780d358c\") " pod="openstack/glance-default-internal-api-0" Oct 08 13:16:36 crc kubenswrapper[4735]: I1008 13:16:36.603713 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8c473e5-0132-4ee7-a14e-ada0780d358c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b8c473e5-0132-4ee7-a14e-ada0780d358c\") " pod="openstack/glance-default-internal-api-0" Oct 08 13:16:36 crc kubenswrapper[4735]: I1008 13:16:36.603779 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8c473e5-0132-4ee7-a14e-ada0780d358c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b8c473e5-0132-4ee7-a14e-ada0780d358c\") " pod="openstack/glance-default-internal-api-0" Oct 08 13:16:36 crc kubenswrapper[4735]: I1008 13:16:36.603892 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b8c473e5-0132-4ee7-a14e-ada0780d358c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b8c473e5-0132-4ee7-a14e-ada0780d358c\") " pod="openstack/glance-default-internal-api-0" Oct 08 13:16:36 crc kubenswrapper[4735]: I1008 13:16:36.603917 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22vkt\" (UniqueName: \"kubernetes.io/projected/b8c473e5-0132-4ee7-a14e-ada0780d358c-kube-api-access-22vkt\") pod \"glance-default-internal-api-0\" (UID: \"b8c473e5-0132-4ee7-a14e-ada0780d358c\") " pod="openstack/glance-default-internal-api-0" Oct 08 13:16:36 crc kubenswrapper[4735]: I1008 13:16:36.603940 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") 
pod \"glance-default-internal-api-0\" (UID: \"b8c473e5-0132-4ee7-a14e-ada0780d358c\") " pod="openstack/glance-default-internal-api-0" Oct 08 13:16:36 crc kubenswrapper[4735]: I1008 13:16:36.603963 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8c473e5-0132-4ee7-a14e-ada0780d358c-logs\") pod \"glance-default-internal-api-0\" (UID: \"b8c473e5-0132-4ee7-a14e-ada0780d358c\") " pod="openstack/glance-default-internal-api-0" Oct 08 13:16:36 crc kubenswrapper[4735]: I1008 13:16:36.705796 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8c473e5-0132-4ee7-a14e-ada0780d358c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b8c473e5-0132-4ee7-a14e-ada0780d358c\") " pod="openstack/glance-default-internal-api-0" Oct 08 13:16:36 crc kubenswrapper[4735]: I1008 13:16:36.705859 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8c473e5-0132-4ee7-a14e-ada0780d358c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b8c473e5-0132-4ee7-a14e-ada0780d358c\") " pod="openstack/glance-default-internal-api-0" Oct 08 13:16:36 crc kubenswrapper[4735]: I1008 13:16:36.705886 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8c473e5-0132-4ee7-a14e-ada0780d358c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b8c473e5-0132-4ee7-a14e-ada0780d358c\") " pod="openstack/glance-default-internal-api-0" Oct 08 13:16:36 crc kubenswrapper[4735]: I1008 13:16:36.705951 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8c473e5-0132-4ee7-a14e-ada0780d358c-config-data\") pod \"glance-default-internal-api-0\" (UID: 
\"b8c473e5-0132-4ee7-a14e-ada0780d358c\") " pod="openstack/glance-default-internal-api-0" Oct 08 13:16:36 crc kubenswrapper[4735]: I1008 13:16:36.706045 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b8c473e5-0132-4ee7-a14e-ada0780d358c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b8c473e5-0132-4ee7-a14e-ada0780d358c\") " pod="openstack/glance-default-internal-api-0" Oct 08 13:16:36 crc kubenswrapper[4735]: I1008 13:16:36.706064 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22vkt\" (UniqueName: \"kubernetes.io/projected/b8c473e5-0132-4ee7-a14e-ada0780d358c-kube-api-access-22vkt\") pod \"glance-default-internal-api-0\" (UID: \"b8c473e5-0132-4ee7-a14e-ada0780d358c\") " pod="openstack/glance-default-internal-api-0" Oct 08 13:16:36 crc kubenswrapper[4735]: I1008 13:16:36.706083 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"b8c473e5-0132-4ee7-a14e-ada0780d358c\") " pod="openstack/glance-default-internal-api-0" Oct 08 13:16:36 crc kubenswrapper[4735]: I1008 13:16:36.706102 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8c473e5-0132-4ee7-a14e-ada0780d358c-logs\") pod \"glance-default-internal-api-0\" (UID: \"b8c473e5-0132-4ee7-a14e-ada0780d358c\") " pod="openstack/glance-default-internal-api-0" Oct 08 13:16:36 crc kubenswrapper[4735]: I1008 13:16:36.706658 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8c473e5-0132-4ee7-a14e-ada0780d358c-logs\") pod \"glance-default-internal-api-0\" (UID: \"b8c473e5-0132-4ee7-a14e-ada0780d358c\") " pod="openstack/glance-default-internal-api-0" Oct 08 13:16:36 crc 
kubenswrapper[4735]: I1008 13:16:36.708118 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b8c473e5-0132-4ee7-a14e-ada0780d358c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b8c473e5-0132-4ee7-a14e-ada0780d358c\") " pod="openstack/glance-default-internal-api-0" Oct 08 13:16:36 crc kubenswrapper[4735]: I1008 13:16:36.714500 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8c473e5-0132-4ee7-a14e-ada0780d358c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b8c473e5-0132-4ee7-a14e-ada0780d358c\") " pod="openstack/glance-default-internal-api-0" Oct 08 13:16:36 crc kubenswrapper[4735]: I1008 13:16:36.715787 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8c473e5-0132-4ee7-a14e-ada0780d358c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b8c473e5-0132-4ee7-a14e-ada0780d358c\") " pod="openstack/glance-default-internal-api-0" Oct 08 13:16:36 crc kubenswrapper[4735]: I1008 13:16:36.716370 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"b8c473e5-0132-4ee7-a14e-ada0780d358c\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0" Oct 08 13:16:36 crc kubenswrapper[4735]: I1008 13:16:36.729312 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8c473e5-0132-4ee7-a14e-ada0780d358c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b8c473e5-0132-4ee7-a14e-ada0780d358c\") " pod="openstack/glance-default-internal-api-0" Oct 08 13:16:36 crc kubenswrapper[4735]: I1008 13:16:36.736786 4735 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22vkt\" (UniqueName: \"kubernetes.io/projected/b8c473e5-0132-4ee7-a14e-ada0780d358c-kube-api-access-22vkt\") pod \"glance-default-internal-api-0\" (UID: \"b8c473e5-0132-4ee7-a14e-ada0780d358c\") " pod="openstack/glance-default-internal-api-0" Oct 08 13:16:36 crc kubenswrapper[4735]: I1008 13:16:36.770036 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8c473e5-0132-4ee7-a14e-ada0780d358c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b8c473e5-0132-4ee7-a14e-ada0780d358c\") " pod="openstack/glance-default-internal-api-0" Oct 08 13:16:36 crc kubenswrapper[4735]: I1008 13:16:36.805736 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"b8c473e5-0132-4ee7-a14e-ada0780d358c\") " pod="openstack/glance-default-internal-api-0" Oct 08 13:16:36 crc kubenswrapper[4735]: I1008 13:16:36.880087 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 08 13:16:36 crc kubenswrapper[4735]: I1008 13:16:36.940834 4735 scope.go:117] "RemoveContainer" containerID="5bf07e71f83ec42daf31838c9792cb61bff2c3545d23d5b759abc92cdc49cd03" Oct 08 13:16:36 crc kubenswrapper[4735]: I1008 13:16:36.948284 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ee831ac-1cdb-4271-b7a3-054313f1d20b" path="/var/lib/kubelet/pods/2ee831ac-1cdb-4271-b7a3-054313f1d20b/volumes" Oct 08 13:16:36 crc kubenswrapper[4735]: I1008 13:16:36.948897 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="add5914b-96e1-4d52-8648-d3429ea33b0d" path="/var/lib/kubelet/pods/add5914b-96e1-4d52-8648-d3429ea33b0d/volumes" Oct 08 13:16:36 crc kubenswrapper[4735]: I1008 13:16:36.949533 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af8d6fa0-e75f-40fe-92e9-284df31eae95" path="/var/lib/kubelet/pods/af8d6fa0-e75f-40fe-92e9-284df31eae95/volumes" Oct 08 13:16:36 crc kubenswrapper[4735]: I1008 13:16:36.950979 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="daeb6ee4-aafc-4251-9512-43c8cde46c02" path="/var/lib/kubelet/pods/daeb6ee4-aafc-4251-9512-43c8cde46c02/volumes" Oct 08 13:16:36 crc kubenswrapper[4735]: I1008 13:16:36.951726 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 08 13:16:37 crc kubenswrapper[4735]: I1008 13:16:37.111299 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-cqm49" Oct 08 13:16:37 crc kubenswrapper[4735]: I1008 13:16:37.154707 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-9h9hg" Oct 08 13:16:37 crc kubenswrapper[4735]: I1008 13:16:37.180291 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 08 13:16:37 crc kubenswrapper[4735]: I1008 13:16:37.221511 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8lgf\" (UniqueName: \"kubernetes.io/projected/85383db5-f6cc-433b-b8d9-7b5397aeadd4-kube-api-access-w8lgf\") pod \"85383db5-f6cc-433b-b8d9-7b5397aeadd4\" (UID: \"85383db5-f6cc-433b-b8d9-7b5397aeadd4\") " Oct 08 13:16:37 crc kubenswrapper[4735]: I1008 13:16:37.231122 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85383db5-f6cc-433b-b8d9-7b5397aeadd4-kube-api-access-w8lgf" (OuterVolumeSpecName: "kube-api-access-w8lgf") pod "85383db5-f6cc-433b-b8d9-7b5397aeadd4" (UID: "85383db5-f6cc-433b-b8d9-7b5397aeadd4"). InnerVolumeSpecName "kube-api-access-w8lgf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:16:37 crc kubenswrapper[4735]: I1008 13:16:37.324217 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9c2r\" (UniqueName: \"kubernetes.io/projected/b5d95702-991f-4f45-94e3-45650c900b1a-kube-api-access-h9c2r\") pod \"b5d95702-991f-4f45-94e3-45650c900b1a\" (UID: \"b5d95702-991f-4f45-94e3-45650c900b1a\") " Oct 08 13:16:37 crc kubenswrapper[4735]: I1008 13:16:37.330869 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5d95702-991f-4f45-94e3-45650c900b1a-kube-api-access-h9c2r" (OuterVolumeSpecName: "kube-api-access-h9c2r") pod "b5d95702-991f-4f45-94e3-45650c900b1a" (UID: "b5d95702-991f-4f45-94e3-45650c900b1a"). InnerVolumeSpecName "kube-api-access-h9c2r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:16:37 crc kubenswrapper[4735]: I1008 13:16:37.333224 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8lgf\" (UniqueName: \"kubernetes.io/projected/85383db5-f6cc-433b-b8d9-7b5397aeadd4-kube-api-access-w8lgf\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:37 crc kubenswrapper[4735]: I1008 13:16:37.333253 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9c2r\" (UniqueName: \"kubernetes.io/projected/b5d95702-991f-4f45-94e3-45650c900b1a-kube-api-access-h9c2r\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:37 crc kubenswrapper[4735]: I1008 13:16:37.459791 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-49lbk" Oct 08 13:16:37 crc kubenswrapper[4735]: I1008 13:16:37.481221 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-9h9hg" event={"ID":"b5d95702-991f-4f45-94e3-45650c900b1a","Type":"ContainerDied","Data":"fe695692fc60a4d01063aaec8110f1596cf1dc4c08afd1eff56975e64f23d430"} Oct 08 13:16:37 crc kubenswrapper[4735]: I1008 13:16:37.481266 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe695692fc60a4d01063aaec8110f1596cf1dc4c08afd1eff56975e64f23d430" Oct 08 13:16:37 crc kubenswrapper[4735]: I1008 13:16:37.481335 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-9h9hg" Oct 08 13:16:37 crc kubenswrapper[4735]: I1008 13:16:37.508307 4735 generic.go:334] "Generic (PLEG): container finished" podID="6819ca75-acc5-4a08-93d0-c58685908257" containerID="10364ef3a94c2a9fd107b17397384560b69737827150d506ad75af5de4b8810f" exitCode=0 Oct 08 13:16:37 crc kubenswrapper[4735]: I1008 13:16:37.508368 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84bd785c49-4rn4d" event={"ID":"6819ca75-acc5-4a08-93d0-c58685908257","Type":"ContainerDied","Data":"10364ef3a94c2a9fd107b17397384560b69737827150d506ad75af5de4b8810f"} Oct 08 13:16:37 crc kubenswrapper[4735]: I1008 13:16:37.514115 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-cqm49" Oct 08 13:16:37 crc kubenswrapper[4735]: I1008 13:16:37.514115 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-cqm49" event={"ID":"85383db5-f6cc-433b-b8d9-7b5397aeadd4","Type":"ContainerDied","Data":"d43d2eca9e2d5078166ef4561d8ac155c46687584345cade59878d27243a3812"} Oct 08 13:16:37 crc kubenswrapper[4735]: I1008 13:16:37.514188 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d43d2eca9e2d5078166ef4561d8ac155c46687584345cade59878d27243a3812" Oct 08 13:16:37 crc kubenswrapper[4735]: I1008 13:16:37.538508 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5hhc\" (UniqueName: \"kubernetes.io/projected/e203cdd5-6f7f-4e2e-9be1-3d467c6005f1-kube-api-access-j5hhc\") pod \"e203cdd5-6f7f-4e2e-9be1-3d467c6005f1\" (UID: \"e203cdd5-6f7f-4e2e-9be1-3d467c6005f1\") " Oct 08 13:16:37 crc kubenswrapper[4735]: I1008 13:16:37.545766 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e203cdd5-6f7f-4e2e-9be1-3d467c6005f1-kube-api-access-j5hhc" (OuterVolumeSpecName: "kube-api-access-j5hhc") pod 
"e203cdd5-6f7f-4e2e-9be1-3d467c6005f1" (UID: "e203cdd5-6f7f-4e2e-9be1-3d467c6005f1"). InnerVolumeSpecName "kube-api-access-j5hhc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:16:37 crc kubenswrapper[4735]: I1008 13:16:37.547147 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-49lbk" event={"ID":"e203cdd5-6f7f-4e2e-9be1-3d467c6005f1","Type":"ContainerDied","Data":"fa3be51aaaf46b6750350bb7c0e6d7625cc9ff357eb51fd7a75eeeea7f3fb3a6"} Oct 08 13:16:37 crc kubenswrapper[4735]: I1008 13:16:37.547184 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa3be51aaaf46b6750350bb7c0e6d7625cc9ff357eb51fd7a75eeeea7f3fb3a6" Oct 08 13:16:37 crc kubenswrapper[4735]: I1008 13:16:37.547253 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-49lbk" Oct 08 13:16:37 crc kubenswrapper[4735]: I1008 13:16:37.556253 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b8e2103e-dd2c-4e18-b66b-0555a1c8f43a","Type":"ContainerStarted","Data":"8f8229032075f44076ad9bdb292a5dd945022eff685ef4b721d0ec3c86908f4e"} Oct 08 13:16:37 crc kubenswrapper[4735]: I1008 13:16:37.643591 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5hhc\" (UniqueName: \"kubernetes.io/projected/e203cdd5-6f7f-4e2e-9be1-3d467c6005f1-kube-api-access-j5hhc\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:37 crc kubenswrapper[4735]: I1008 13:16:37.784083 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 13:16:37 crc kubenswrapper[4735]: W1008 13:16:37.802459 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8c473e5_0132_4ee7_a14e_ada0780d358c.slice/crio-beb05430b2e4377c90fcb24d6c1b3d571a3fe48e638ced052b15421e516051f6 WatchSource:0}: Error finding 
container beb05430b2e4377c90fcb24d6c1b3d571a3fe48e638ced052b15421e516051f6: Status 404 returned error can't find the container with id beb05430b2e4377c90fcb24d6c1b3d571a3fe48e638ced052b15421e516051f6 Oct 08 13:16:37 crc kubenswrapper[4735]: I1008 13:16:37.923315 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 13:16:37 crc kubenswrapper[4735]: I1008 13:16:37.989883 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 13:16:38 crc kubenswrapper[4735]: I1008 13:16:38.520825 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/kube-state-metrics-0" podUID="add5914b-96e1-4d52-8648-d3429ea33b0d" containerName="kube-state-metrics" probeResult="failure" output="Get \"http://10.217.0.107:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 08 13:16:38 crc kubenswrapper[4735]: I1008 13:16:38.608357 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c0cb7948-9a3d-40fa-b4e7-bb6b765af52f","Type":"ContainerStarted","Data":"9b9000366a32ba84dbf78ae5b17fcafb2500b541ed982f02d42d0094ea6b4c80"} Oct 08 13:16:38 crc kubenswrapper[4735]: I1008 13:16:38.615398 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b8e2103e-dd2c-4e18-b66b-0555a1c8f43a","Type":"ContainerStarted","Data":"ff75cfda3a642f58adcba7bc55060219ea312fa63294148521fb3e2d8da99ab9"} Oct 08 13:16:38 crc kubenswrapper[4735]: I1008 13:16:38.616756 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 08 13:16:38 crc kubenswrapper[4735]: I1008 13:16:38.620364 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"b8c473e5-0132-4ee7-a14e-ada0780d358c","Type":"ContainerStarted","Data":"beb05430b2e4377c90fcb24d6c1b3d571a3fe48e638ced052b15421e516051f6"} Oct 08 13:16:38 crc kubenswrapper[4735]: I1008 13:16:38.629144 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b56f0fbb-2ce4-46f1-b766-a793f8ac780a","Type":"ContainerStarted","Data":"46b3a7cda4234d5416801e870faf084fcc3384dc0690f72159d1b11033da14ad"} Oct 08 13:16:38 crc kubenswrapper[4735]: I1008 13:16:38.635188 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=3.07944694 podStartE2EDuration="3.635164461s" podCreationTimestamp="2025-10-08 13:16:35 +0000 UTC" firstStartedPulling="2025-10-08 13:16:37.349515275 +0000 UTC m=+1247.157819819" lastFinishedPulling="2025-10-08 13:16:37.905232796 +0000 UTC m=+1247.713537340" observedRunningTime="2025-10-08 13:16:38.632355978 +0000 UTC m=+1248.440660532" watchObservedRunningTime="2025-10-08 13:16:38.635164461 +0000 UTC m=+1248.443469015" Oct 08 13:16:38 crc kubenswrapper[4735]: I1008 13:16:38.643604 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee","Type":"ContainerStarted","Data":"7d4e033e43405a1bc131fb9c3261d4e3e9c5f3c78368201ede6fed02dadd31d6"} Oct 08 13:16:38 crc kubenswrapper[4735]: I1008 13:16:38.653896 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1670833a-54b6-4088-bcf1-db1daecd4fb4","Type":"ContainerStarted","Data":"a010040bd0e6262af9aabd39806c1f2f13dedc65eeecea6ef3f219ff4c3e054d"} Oct 08 13:16:38 crc kubenswrapper[4735]: I1008 13:16:38.668303 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84bd785c49-4rn4d" event={"ID":"6819ca75-acc5-4a08-93d0-c58685908257","Type":"ContainerStarted","Data":"fd4d662762db3c9dc648b8dfa89bbc23539407d1c35bc2f4b1809028f3aeca4e"} Oct 08 
13:16:38 crc kubenswrapper[4735]: I1008 13:16:38.669320 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-84bd785c49-4rn4d" Oct 08 13:16:38 crc kubenswrapper[4735]: I1008 13:16:38.698508 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-84bd785c49-4rn4d" podStartSLOduration=5.698469432 podStartE2EDuration="5.698469432s" podCreationTimestamp="2025-10-08 13:16:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 13:16:38.690833054 +0000 UTC m=+1248.499137608" watchObservedRunningTime="2025-10-08 13:16:38.698469432 +0000 UTC m=+1248.506773976" Oct 08 13:16:39 crc kubenswrapper[4735]: I1008 13:16:39.066347 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-675db45d77-s6pvt" Oct 08 13:16:39 crc kubenswrapper[4735]: I1008 13:16:39.089712 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-675db45d77-s6pvt" Oct 08 13:16:39 crc kubenswrapper[4735]: I1008 13:16:39.696094 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b56f0fbb-2ce4-46f1-b766-a793f8ac780a","Type":"ContainerStarted","Data":"906d22ac69a7ed72995e737aae6e7cd79db2ad54e803f8dabade308bbde7bd97"} Oct 08 13:16:39 crc kubenswrapper[4735]: I1008 13:16:39.707515 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee","Type":"ContainerStarted","Data":"87812ca35965fb0da7ea953b973156c50579497c00f0665f29deb9214f637fa9"} Oct 08 13:16:39 crc kubenswrapper[4735]: I1008 13:16:39.717740 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"1670833a-54b6-4088-bcf1-db1daecd4fb4","Type":"ContainerStarted","Data":"eab80948551a349a49cbd0aed190e9b5728a28e43319fe2d0dc13b6d0f645a7e"} Oct 08 13:16:39 crc kubenswrapper[4735]: I1008 13:16:39.717880 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="1670833a-54b6-4088-bcf1-db1daecd4fb4" containerName="cinder-api-log" containerID="cri-o://a010040bd0e6262af9aabd39806c1f2f13dedc65eeecea6ef3f219ff4c3e054d" gracePeriod=30 Oct 08 13:16:39 crc kubenswrapper[4735]: I1008 13:16:39.718087 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 08 13:16:39 crc kubenswrapper[4735]: I1008 13:16:39.718124 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="1670833a-54b6-4088-bcf1-db1daecd4fb4" containerName="cinder-api" containerID="cri-o://eab80948551a349a49cbd0aed190e9b5728a28e43319fe2d0dc13b6d0f645a7e" gracePeriod=30 Oct 08 13:16:39 crc kubenswrapper[4735]: I1008 13:16:39.731725 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c0cb7948-9a3d-40fa-b4e7-bb6b765af52f","Type":"ContainerStarted","Data":"00bb3e3d5baf3ca592174014e03b36040aa0b638cb8a1c2566220b5a3cb65e9b"} Oct 08 13:16:39 crc kubenswrapper[4735]: I1008 13:16:39.737855 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.2703647270000005 podStartE2EDuration="6.737836114s" podCreationTimestamp="2025-10-08 13:16:33 +0000 UTC" firstStartedPulling="2025-10-08 13:16:35.589324193 +0000 UTC m=+1245.397628737" lastFinishedPulling="2025-10-08 13:16:37.05679558 +0000 UTC m=+1246.865100124" observedRunningTime="2025-10-08 13:16:39.731061019 +0000 UTC m=+1249.539365573" watchObservedRunningTime="2025-10-08 13:16:39.737836114 +0000 UTC m=+1249.546140668" Oct 08 13:16:39 crc kubenswrapper[4735]: I1008 
13:16:39.784520 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b8c473e5-0132-4ee7-a14e-ada0780d358c","Type":"ContainerStarted","Data":"59fafd30b9822a732b940ca8ee5e9d58f1e7139b22b56b4a8c5b227e38ab23a0"} Oct 08 13:16:40 crc kubenswrapper[4735]: I1008 13:16:40.491150 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 08 13:16:40 crc kubenswrapper[4735]: I1008 13:16:40.546199 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1670833a-54b6-4088-bcf1-db1daecd4fb4-combined-ca-bundle\") pod \"1670833a-54b6-4088-bcf1-db1daecd4fb4\" (UID: \"1670833a-54b6-4088-bcf1-db1daecd4fb4\") " Oct 08 13:16:40 crc kubenswrapper[4735]: I1008 13:16:40.546331 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1670833a-54b6-4088-bcf1-db1daecd4fb4-logs\") pod \"1670833a-54b6-4088-bcf1-db1daecd4fb4\" (UID: \"1670833a-54b6-4088-bcf1-db1daecd4fb4\") " Oct 08 13:16:40 crc kubenswrapper[4735]: I1008 13:16:40.546370 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1670833a-54b6-4088-bcf1-db1daecd4fb4-config-data\") pod \"1670833a-54b6-4088-bcf1-db1daecd4fb4\" (UID: \"1670833a-54b6-4088-bcf1-db1daecd4fb4\") " Oct 08 13:16:40 crc kubenswrapper[4735]: I1008 13:16:40.546406 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1670833a-54b6-4088-bcf1-db1daecd4fb4-scripts\") pod \"1670833a-54b6-4088-bcf1-db1daecd4fb4\" (UID: \"1670833a-54b6-4088-bcf1-db1daecd4fb4\") " Oct 08 13:16:40 crc kubenswrapper[4735]: I1008 13:16:40.546433 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/1670833a-54b6-4088-bcf1-db1daecd4fb4-config-data-custom\") pod \"1670833a-54b6-4088-bcf1-db1daecd4fb4\" (UID: \"1670833a-54b6-4088-bcf1-db1daecd4fb4\") " Oct 08 13:16:40 crc kubenswrapper[4735]: I1008 13:16:40.546456 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1670833a-54b6-4088-bcf1-db1daecd4fb4-etc-machine-id\") pod \"1670833a-54b6-4088-bcf1-db1daecd4fb4\" (UID: \"1670833a-54b6-4088-bcf1-db1daecd4fb4\") " Oct 08 13:16:40 crc kubenswrapper[4735]: I1008 13:16:40.546541 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7ww8\" (UniqueName: \"kubernetes.io/projected/1670833a-54b6-4088-bcf1-db1daecd4fb4-kube-api-access-b7ww8\") pod \"1670833a-54b6-4088-bcf1-db1daecd4fb4\" (UID: \"1670833a-54b6-4088-bcf1-db1daecd4fb4\") " Oct 08 13:16:40 crc kubenswrapper[4735]: I1008 13:16:40.546655 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1670833a-54b6-4088-bcf1-db1daecd4fb4-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "1670833a-54b6-4088-bcf1-db1daecd4fb4" (UID: "1670833a-54b6-4088-bcf1-db1daecd4fb4"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 13:16:40 crc kubenswrapper[4735]: I1008 13:16:40.546846 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1670833a-54b6-4088-bcf1-db1daecd4fb4-logs" (OuterVolumeSpecName: "logs") pod "1670833a-54b6-4088-bcf1-db1daecd4fb4" (UID: "1670833a-54b6-4088-bcf1-db1daecd4fb4"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:16:40 crc kubenswrapper[4735]: I1008 13:16:40.547411 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1670833a-54b6-4088-bcf1-db1daecd4fb4-logs\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:40 crc kubenswrapper[4735]: I1008 13:16:40.547438 4735 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1670833a-54b6-4088-bcf1-db1daecd4fb4-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:40 crc kubenswrapper[4735]: I1008 13:16:40.555771 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1670833a-54b6-4088-bcf1-db1daecd4fb4-scripts" (OuterVolumeSpecName: "scripts") pod "1670833a-54b6-4088-bcf1-db1daecd4fb4" (UID: "1670833a-54b6-4088-bcf1-db1daecd4fb4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:16:40 crc kubenswrapper[4735]: I1008 13:16:40.568531 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1670833a-54b6-4088-bcf1-db1daecd4fb4-kube-api-access-b7ww8" (OuterVolumeSpecName: "kube-api-access-b7ww8") pod "1670833a-54b6-4088-bcf1-db1daecd4fb4" (UID: "1670833a-54b6-4088-bcf1-db1daecd4fb4"). InnerVolumeSpecName "kube-api-access-b7ww8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:16:40 crc kubenswrapper[4735]: I1008 13:16:40.574462 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1670833a-54b6-4088-bcf1-db1daecd4fb4-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1670833a-54b6-4088-bcf1-db1daecd4fb4" (UID: "1670833a-54b6-4088-bcf1-db1daecd4fb4"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:16:40 crc kubenswrapper[4735]: I1008 13:16:40.619614 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1670833a-54b6-4088-bcf1-db1daecd4fb4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1670833a-54b6-4088-bcf1-db1daecd4fb4" (UID: "1670833a-54b6-4088-bcf1-db1daecd4fb4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:16:40 crc kubenswrapper[4735]: I1008 13:16:40.648653 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1670833a-54b6-4088-bcf1-db1daecd4fb4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:40 crc kubenswrapper[4735]: I1008 13:16:40.648684 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1670833a-54b6-4088-bcf1-db1daecd4fb4-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:40 crc kubenswrapper[4735]: I1008 13:16:40.648697 4735 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1670833a-54b6-4088-bcf1-db1daecd4fb4-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:40 crc kubenswrapper[4735]: I1008 13:16:40.648714 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7ww8\" (UniqueName: \"kubernetes.io/projected/1670833a-54b6-4088-bcf1-db1daecd4fb4-kube-api-access-b7ww8\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:40 crc kubenswrapper[4735]: I1008 13:16:40.677635 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1670833a-54b6-4088-bcf1-db1daecd4fb4-config-data" (OuterVolumeSpecName: "config-data") pod "1670833a-54b6-4088-bcf1-db1daecd4fb4" (UID: "1670833a-54b6-4088-bcf1-db1daecd4fb4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:16:40 crc kubenswrapper[4735]: I1008 13:16:40.750111 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1670833a-54b6-4088-bcf1-db1daecd4fb4-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:40 crc kubenswrapper[4735]: I1008 13:16:40.793513 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c0cb7948-9a3d-40fa-b4e7-bb6b765af52f","Type":"ContainerStarted","Data":"7cbf4814a5d8742756a2f0076ccbdc83fc3186c93b8940d1bc61f10728f5e789"} Oct 08 13:16:40 crc kubenswrapper[4735]: I1008 13:16:40.795114 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b8c473e5-0132-4ee7-a14e-ada0780d358c","Type":"ContainerStarted","Data":"89675ae013264bf40ddb13e38257d27a7ce994542ac6e288c014f3d00f9d0b13"} Oct 08 13:16:40 crc kubenswrapper[4735]: I1008 13:16:40.797315 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b56f0fbb-2ce4-46f1-b766-a793f8ac780a","Type":"ContainerStarted","Data":"11f2c5eaf6c7c6de637ab2b70e9a93f6f8393fe808469b003366f42193f5aba8"} Oct 08 13:16:40 crc kubenswrapper[4735]: I1008 13:16:40.798766 4735 generic.go:334] "Generic (PLEG): container finished" podID="1670833a-54b6-4088-bcf1-db1daecd4fb4" containerID="eab80948551a349a49cbd0aed190e9b5728a28e43319fe2d0dc13b6d0f645a7e" exitCode=0 Oct 08 13:16:40 crc kubenswrapper[4735]: I1008 13:16:40.798799 4735 generic.go:334] "Generic (PLEG): container finished" podID="1670833a-54b6-4088-bcf1-db1daecd4fb4" containerID="a010040bd0e6262af9aabd39806c1f2f13dedc65eeecea6ef3f219ff4c3e054d" exitCode=143 Oct 08 13:16:40 crc kubenswrapper[4735]: I1008 13:16:40.798926 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 08 13:16:40 crc kubenswrapper[4735]: I1008 13:16:40.798931 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1670833a-54b6-4088-bcf1-db1daecd4fb4","Type":"ContainerDied","Data":"eab80948551a349a49cbd0aed190e9b5728a28e43319fe2d0dc13b6d0f645a7e"} Oct 08 13:16:40 crc kubenswrapper[4735]: I1008 13:16:40.799082 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1670833a-54b6-4088-bcf1-db1daecd4fb4","Type":"ContainerDied","Data":"a010040bd0e6262af9aabd39806c1f2f13dedc65eeecea6ef3f219ff4c3e054d"} Oct 08 13:16:40 crc kubenswrapper[4735]: I1008 13:16:40.799136 4735 scope.go:117] "RemoveContainer" containerID="eab80948551a349a49cbd0aed190e9b5728a28e43319fe2d0dc13b6d0f645a7e" Oct 08 13:16:40 crc kubenswrapper[4735]: I1008 13:16:40.799150 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1670833a-54b6-4088-bcf1-db1daecd4fb4","Type":"ContainerDied","Data":"3345e5386e2a0e69fb51f10003ace24cdc8229d7a948c9822abdc39ae7c186ab"} Oct 08 13:16:40 crc kubenswrapper[4735]: I1008 13:16:40.830199 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.830175891 podStartE2EDuration="5.830175891s" podCreationTimestamp="2025-10-08 13:16:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 13:16:40.815152242 +0000 UTC m=+1250.623456806" watchObservedRunningTime="2025-10-08 13:16:40.830175891 +0000 UTC m=+1250.638480445" Oct 08 13:16:40 crc kubenswrapper[4735]: I1008 13:16:40.831870 4735 scope.go:117] "RemoveContainer" containerID="a010040bd0e6262af9aabd39806c1f2f13dedc65eeecea6ef3f219ff4c3e054d" Oct 08 13:16:40 crc kubenswrapper[4735]: I1008 13:16:40.867556 4735 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.867525889 podStartE2EDuration="4.867525889s" podCreationTimestamp="2025-10-08 13:16:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 13:16:40.858074994 +0000 UTC m=+1250.666379558" watchObservedRunningTime="2025-10-08 13:16:40.867525889 +0000 UTC m=+1250.675830443" Oct 08 13:16:40 crc kubenswrapper[4735]: I1008 13:16:40.871023 4735 scope.go:117] "RemoveContainer" containerID="eab80948551a349a49cbd0aed190e9b5728a28e43319fe2d0dc13b6d0f645a7e" Oct 08 13:16:40 crc kubenswrapper[4735]: E1008 13:16:40.871770 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eab80948551a349a49cbd0aed190e9b5728a28e43319fe2d0dc13b6d0f645a7e\": container with ID starting with eab80948551a349a49cbd0aed190e9b5728a28e43319fe2d0dc13b6d0f645a7e not found: ID does not exist" containerID="eab80948551a349a49cbd0aed190e9b5728a28e43319fe2d0dc13b6d0f645a7e" Oct 08 13:16:40 crc kubenswrapper[4735]: I1008 13:16:40.871835 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eab80948551a349a49cbd0aed190e9b5728a28e43319fe2d0dc13b6d0f645a7e"} err="failed to get container status \"eab80948551a349a49cbd0aed190e9b5728a28e43319fe2d0dc13b6d0f645a7e\": rpc error: code = NotFound desc = could not find container \"eab80948551a349a49cbd0aed190e9b5728a28e43319fe2d0dc13b6d0f645a7e\": container with ID starting with eab80948551a349a49cbd0aed190e9b5728a28e43319fe2d0dc13b6d0f645a7e not found: ID does not exist" Oct 08 13:16:40 crc kubenswrapper[4735]: I1008 13:16:40.871862 4735 scope.go:117] "RemoveContainer" containerID="a010040bd0e6262af9aabd39806c1f2f13dedc65eeecea6ef3f219ff4c3e054d" Oct 08 13:16:40 crc kubenswrapper[4735]: E1008 13:16:40.872113 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"a010040bd0e6262af9aabd39806c1f2f13dedc65eeecea6ef3f219ff4c3e054d\": container with ID starting with a010040bd0e6262af9aabd39806c1f2f13dedc65eeecea6ef3f219ff4c3e054d not found: ID does not exist" containerID="a010040bd0e6262af9aabd39806c1f2f13dedc65eeecea6ef3f219ff4c3e054d" Oct 08 13:16:40 crc kubenswrapper[4735]: I1008 13:16:40.872135 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a010040bd0e6262af9aabd39806c1f2f13dedc65eeecea6ef3f219ff4c3e054d"} err="failed to get container status \"a010040bd0e6262af9aabd39806c1f2f13dedc65eeecea6ef3f219ff4c3e054d\": rpc error: code = NotFound desc = could not find container \"a010040bd0e6262af9aabd39806c1f2f13dedc65eeecea6ef3f219ff4c3e054d\": container with ID starting with a010040bd0e6262af9aabd39806c1f2f13dedc65eeecea6ef3f219ff4c3e054d not found: ID does not exist" Oct 08 13:16:40 crc kubenswrapper[4735]: I1008 13:16:40.872162 4735 scope.go:117] "RemoveContainer" containerID="eab80948551a349a49cbd0aed190e9b5728a28e43319fe2d0dc13b6d0f645a7e" Oct 08 13:16:40 crc kubenswrapper[4735]: I1008 13:16:40.872589 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eab80948551a349a49cbd0aed190e9b5728a28e43319fe2d0dc13b6d0f645a7e"} err="failed to get container status \"eab80948551a349a49cbd0aed190e9b5728a28e43319fe2d0dc13b6d0f645a7e\": rpc error: code = NotFound desc = could not find container \"eab80948551a349a49cbd0aed190e9b5728a28e43319fe2d0dc13b6d0f645a7e\": container with ID starting with eab80948551a349a49cbd0aed190e9b5728a28e43319fe2d0dc13b6d0f645a7e not found: ID does not exist" Oct 08 13:16:40 crc kubenswrapper[4735]: I1008 13:16:40.872625 4735 scope.go:117] "RemoveContainer" containerID="a010040bd0e6262af9aabd39806c1f2f13dedc65eeecea6ef3f219ff4c3e054d" Oct 08 13:16:40 crc kubenswrapper[4735]: I1008 13:16:40.873185 4735 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"a010040bd0e6262af9aabd39806c1f2f13dedc65eeecea6ef3f219ff4c3e054d"} err="failed to get container status \"a010040bd0e6262af9aabd39806c1f2f13dedc65eeecea6ef3f219ff4c3e054d\": rpc error: code = NotFound desc = could not find container \"a010040bd0e6262af9aabd39806c1f2f13dedc65eeecea6ef3f219ff4c3e054d\": container with ID starting with a010040bd0e6262af9aabd39806c1f2f13dedc65eeecea6ef3f219ff4c3e054d not found: ID does not exist" Oct 08 13:16:40 crc kubenswrapper[4735]: I1008 13:16:40.948546 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 08 13:16:40 crc kubenswrapper[4735]: I1008 13:16:40.957741 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 08 13:16:40 crc kubenswrapper[4735]: I1008 13:16:40.972544 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 08 13:16:40 crc kubenswrapper[4735]: E1008 13:16:40.972975 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1670833a-54b6-4088-bcf1-db1daecd4fb4" containerName="cinder-api" Oct 08 13:16:40 crc kubenswrapper[4735]: I1008 13:16:40.972994 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="1670833a-54b6-4088-bcf1-db1daecd4fb4" containerName="cinder-api" Oct 08 13:16:40 crc kubenswrapper[4735]: E1008 13:16:40.973009 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1670833a-54b6-4088-bcf1-db1daecd4fb4" containerName="cinder-api-log" Oct 08 13:16:40 crc kubenswrapper[4735]: I1008 13:16:40.973016 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="1670833a-54b6-4088-bcf1-db1daecd4fb4" containerName="cinder-api-log" Oct 08 13:16:40 crc kubenswrapper[4735]: E1008 13:16:40.973029 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e203cdd5-6f7f-4e2e-9be1-3d467c6005f1" containerName="mariadb-database-create" Oct 08 13:16:40 crc kubenswrapper[4735]: I1008 13:16:40.973035 4735 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="e203cdd5-6f7f-4e2e-9be1-3d467c6005f1" containerName="mariadb-database-create" Oct 08 13:16:40 crc kubenswrapper[4735]: E1008 13:16:40.973048 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5d95702-991f-4f45-94e3-45650c900b1a" containerName="mariadb-database-create" Oct 08 13:16:40 crc kubenswrapper[4735]: I1008 13:16:40.973054 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5d95702-991f-4f45-94e3-45650c900b1a" containerName="mariadb-database-create" Oct 08 13:16:40 crc kubenswrapper[4735]: E1008 13:16:40.973072 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85383db5-f6cc-433b-b8d9-7b5397aeadd4" containerName="mariadb-database-create" Oct 08 13:16:40 crc kubenswrapper[4735]: I1008 13:16:40.973077 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="85383db5-f6cc-433b-b8d9-7b5397aeadd4" containerName="mariadb-database-create" Oct 08 13:16:40 crc kubenswrapper[4735]: I1008 13:16:40.973249 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="1670833a-54b6-4088-bcf1-db1daecd4fb4" containerName="cinder-api" Oct 08 13:16:40 crc kubenswrapper[4735]: I1008 13:16:40.973262 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="1670833a-54b6-4088-bcf1-db1daecd4fb4" containerName="cinder-api-log" Oct 08 13:16:40 crc kubenswrapper[4735]: I1008 13:16:40.973279 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="85383db5-f6cc-433b-b8d9-7b5397aeadd4" containerName="mariadb-database-create" Oct 08 13:16:40 crc kubenswrapper[4735]: I1008 13:16:40.973290 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5d95702-991f-4f45-94e3-45650c900b1a" containerName="mariadb-database-create" Oct 08 13:16:40 crc kubenswrapper[4735]: I1008 13:16:40.973300 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="e203cdd5-6f7f-4e2e-9be1-3d467c6005f1" containerName="mariadb-database-create" Oct 08 13:16:40 crc 
kubenswrapper[4735]: I1008 13:16:40.974272 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 08 13:16:40 crc kubenswrapper[4735]: I1008 13:16:40.974349 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 08 13:16:40 crc kubenswrapper[4735]: I1008 13:16:40.976846 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Oct 08 13:16:40 crc kubenswrapper[4735]: I1008 13:16:40.977157 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 08 13:16:40 crc kubenswrapper[4735]: I1008 13:16:40.982079 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Oct 08 13:16:41 crc kubenswrapper[4735]: I1008 13:16:41.066659 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2-scripts\") pod \"cinder-api-0\" (UID: \"2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2\") " pod="openstack/cinder-api-0" Oct 08 13:16:41 crc kubenswrapper[4735]: I1008 13:16:41.066720 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2-public-tls-certs\") pod \"cinder-api-0\" (UID: \"2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2\") " pod="openstack/cinder-api-0" Oct 08 13:16:41 crc kubenswrapper[4735]: I1008 13:16:41.066897 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2\") " pod="openstack/cinder-api-0" Oct 08 13:16:41 crc kubenswrapper[4735]: I1008 13:16:41.066966 4735 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2-logs\") pod \"cinder-api-0\" (UID: \"2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2\") " pod="openstack/cinder-api-0" Oct 08 13:16:41 crc kubenswrapper[4735]: I1008 13:16:41.067347 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2\") " pod="openstack/cinder-api-0" Oct 08 13:16:41 crc kubenswrapper[4735]: I1008 13:16:41.067397 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2\") " pod="openstack/cinder-api-0" Oct 08 13:16:41 crc kubenswrapper[4735]: I1008 13:16:41.067464 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2-config-data-custom\") pod \"cinder-api-0\" (UID: \"2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2\") " pod="openstack/cinder-api-0" Oct 08 13:16:41 crc kubenswrapper[4735]: I1008 13:16:41.067579 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2-config-data\") pod \"cinder-api-0\" (UID: \"2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2\") " pod="openstack/cinder-api-0" Oct 08 13:16:41 crc kubenswrapper[4735]: I1008 13:16:41.067691 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5qjl\" (UniqueName: 
\"kubernetes.io/projected/2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2-kube-api-access-m5qjl\") pod \"cinder-api-0\" (UID: \"2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2\") " pod="openstack/cinder-api-0" Oct 08 13:16:41 crc kubenswrapper[4735]: I1008 13:16:41.169331 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5qjl\" (UniqueName: \"kubernetes.io/projected/2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2-kube-api-access-m5qjl\") pod \"cinder-api-0\" (UID: \"2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2\") " pod="openstack/cinder-api-0" Oct 08 13:16:41 crc kubenswrapper[4735]: I1008 13:16:41.169431 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2-scripts\") pod \"cinder-api-0\" (UID: \"2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2\") " pod="openstack/cinder-api-0" Oct 08 13:16:41 crc kubenswrapper[4735]: I1008 13:16:41.169485 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2-public-tls-certs\") pod \"cinder-api-0\" (UID: \"2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2\") " pod="openstack/cinder-api-0" Oct 08 13:16:41 crc kubenswrapper[4735]: I1008 13:16:41.169535 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2\") " pod="openstack/cinder-api-0" Oct 08 13:16:41 crc kubenswrapper[4735]: I1008 13:16:41.169558 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2-logs\") pod \"cinder-api-0\" (UID: \"2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2\") " pod="openstack/cinder-api-0" Oct 08 13:16:41 crc kubenswrapper[4735]: 
I1008 13:16:41.169662 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2\") " pod="openstack/cinder-api-0" Oct 08 13:16:41 crc kubenswrapper[4735]: I1008 13:16:41.169685 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2\") " pod="openstack/cinder-api-0" Oct 08 13:16:41 crc kubenswrapper[4735]: I1008 13:16:41.169710 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2-config-data-custom\") pod \"cinder-api-0\" (UID: \"2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2\") " pod="openstack/cinder-api-0" Oct 08 13:16:41 crc kubenswrapper[4735]: I1008 13:16:41.169747 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2-config-data\") pod \"cinder-api-0\" (UID: \"2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2\") " pod="openstack/cinder-api-0" Oct 08 13:16:41 crc kubenswrapper[4735]: I1008 13:16:41.170123 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2\") " pod="openstack/cinder-api-0" Oct 08 13:16:41 crc kubenswrapper[4735]: I1008 13:16:41.170406 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2-logs\") pod \"cinder-api-0\" (UID: 
\"2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2\") " pod="openstack/cinder-api-0" Oct 08 13:16:41 crc kubenswrapper[4735]: I1008 13:16:41.175109 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2\") " pod="openstack/cinder-api-0" Oct 08 13:16:41 crc kubenswrapper[4735]: I1008 13:16:41.177807 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2\") " pod="openstack/cinder-api-0" Oct 08 13:16:41 crc kubenswrapper[4735]: I1008 13:16:41.178268 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2-config-data\") pod \"cinder-api-0\" (UID: \"2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2\") " pod="openstack/cinder-api-0" Oct 08 13:16:41 crc kubenswrapper[4735]: I1008 13:16:41.179984 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2-public-tls-certs\") pod \"cinder-api-0\" (UID: \"2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2\") " pod="openstack/cinder-api-0" Oct 08 13:16:41 crc kubenswrapper[4735]: I1008 13:16:41.181204 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2-scripts\") pod \"cinder-api-0\" (UID: \"2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2\") " pod="openstack/cinder-api-0" Oct 08 13:16:41 crc kubenswrapper[4735]: I1008 13:16:41.194499 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2-config-data-custom\") pod \"cinder-api-0\" (UID: \"2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2\") " pod="openstack/cinder-api-0" Oct 08 13:16:41 crc kubenswrapper[4735]: I1008 13:16:41.198357 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5qjl\" (UniqueName: \"kubernetes.io/projected/2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2-kube-api-access-m5qjl\") pod \"cinder-api-0\" (UID: \"2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2\") " pod="openstack/cinder-api-0" Oct 08 13:16:41 crc kubenswrapper[4735]: I1008 13:16:41.302846 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 08 13:16:41 crc kubenswrapper[4735]: I1008 13:16:41.833560 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 08 13:16:41 crc kubenswrapper[4735]: W1008 13:16:41.841686 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2bc218ce_b4bf_4477_afaa_ff9e78a7c1d2.slice/crio-6b73e013ac9ec7f447b8c835db5a8692a5faf8e441ae811092f34d1b4454c24f WatchSource:0}: Error finding container 6b73e013ac9ec7f447b8c835db5a8692a5faf8e441ae811092f34d1b4454c24f: Status 404 returned error can't find the container with id 6b73e013ac9ec7f447b8c835db5a8692a5faf8e441ae811092f34d1b4454c24f Oct 08 13:16:42 crc kubenswrapper[4735]: I1008 13:16:42.848659 4735 generic.go:334] "Generic (PLEG): container finished" podID="b56f0fbb-2ce4-46f1-b766-a793f8ac780a" containerID="e769d1a559404c20cda2c24c922851eb71be2f8adb033c51dee877fa3af16ff7" exitCode=1 Oct 08 13:16:42 crc kubenswrapper[4735]: I1008 13:16:42.849259 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b56f0fbb-2ce4-46f1-b766-a793f8ac780a","Type":"ContainerDied","Data":"e769d1a559404c20cda2c24c922851eb71be2f8adb033c51dee877fa3af16ff7"} Oct 08 13:16:42 crc kubenswrapper[4735]: 
I1008 13:16:42.849407 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b56f0fbb-2ce4-46f1-b766-a793f8ac780a" containerName="ceilometer-central-agent" containerID="cri-o://46b3a7cda4234d5416801e870faf084fcc3384dc0690f72159d1b11033da14ad" gracePeriod=30 Oct 08 13:16:42 crc kubenswrapper[4735]: I1008 13:16:42.849696 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b56f0fbb-2ce4-46f1-b766-a793f8ac780a" containerName="sg-core" containerID="cri-o://11f2c5eaf6c7c6de637ab2b70e9a93f6f8393fe808469b003366f42193f5aba8" gracePeriod=30 Oct 08 13:16:42 crc kubenswrapper[4735]: I1008 13:16:42.849749 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b56f0fbb-2ce4-46f1-b766-a793f8ac780a" containerName="ceilometer-notification-agent" containerID="cri-o://906d22ac69a7ed72995e737aae6e7cd79db2ad54e803f8dabade308bbde7bd97" gracePeriod=30 Oct 08 13:16:42 crc kubenswrapper[4735]: I1008 13:16:42.852170 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2","Type":"ContainerStarted","Data":"ef4967d1505370e0cd47289c67232c6f8769a456bdabc3b30bd020b8c2020a1e"} Oct 08 13:16:42 crc kubenswrapper[4735]: I1008 13:16:42.852196 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2","Type":"ContainerStarted","Data":"6b73e013ac9ec7f447b8c835db5a8692a5faf8e441ae811092f34d1b4454c24f"} Oct 08 13:16:42 crc kubenswrapper[4735]: I1008 13:16:42.908138 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1670833a-54b6-4088-bcf1-db1daecd4fb4" path="/var/lib/kubelet/pods/1670833a-54b6-4088-bcf1-db1daecd4fb4/volumes" Oct 08 13:16:43 crc kubenswrapper[4735]: I1008 13:16:43.864201 4735 generic.go:334] "Generic (PLEG): container finished" 
podID="b56f0fbb-2ce4-46f1-b766-a793f8ac780a" containerID="11f2c5eaf6c7c6de637ab2b70e9a93f6f8393fe808469b003366f42193f5aba8" exitCode=2 Oct 08 13:16:43 crc kubenswrapper[4735]: I1008 13:16:43.864514 4735 generic.go:334] "Generic (PLEG): container finished" podID="b56f0fbb-2ce4-46f1-b766-a793f8ac780a" containerID="906d22ac69a7ed72995e737aae6e7cd79db2ad54e803f8dabade308bbde7bd97" exitCode=0 Oct 08 13:16:43 crc kubenswrapper[4735]: I1008 13:16:43.864567 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b56f0fbb-2ce4-46f1-b766-a793f8ac780a","Type":"ContainerDied","Data":"11f2c5eaf6c7c6de637ab2b70e9a93f6f8393fe808469b003366f42193f5aba8"} Oct 08 13:16:43 crc kubenswrapper[4735]: I1008 13:16:43.864599 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b56f0fbb-2ce4-46f1-b766-a793f8ac780a","Type":"ContainerDied","Data":"906d22ac69a7ed72995e737aae6e7cd79db2ad54e803f8dabade308bbde7bd97"} Oct 08 13:16:43 crc kubenswrapper[4735]: I1008 13:16:43.866166 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2","Type":"ContainerStarted","Data":"52bb4221516a6482adaee2563d79681bb7ba3599c014219c89382ac18144524b"} Oct 08 13:16:43 crc kubenswrapper[4735]: I1008 13:16:43.867711 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 08 13:16:43 crc kubenswrapper[4735]: I1008 13:16:43.900109 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.900086563 podStartE2EDuration="3.900086563s" podCreationTimestamp="2025-10-08 13:16:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 13:16:43.885464714 +0000 UTC m=+1253.693769258" watchObservedRunningTime="2025-10-08 13:16:43.900086563 +0000 UTC m=+1253.708391107" 
Oct 08 13:16:44 crc kubenswrapper[4735]: I1008 13:16:44.671551 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 08 13:16:44 crc kubenswrapper[4735]: I1008 13:16:44.820736 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-84bd785c49-4rn4d" Oct 08 13:16:44 crc kubenswrapper[4735]: I1008 13:16:44.883736 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 08 13:16:44 crc kubenswrapper[4735]: I1008 13:16:44.942668 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c78787df7-zjd4g"] Oct 08 13:16:44 crc kubenswrapper[4735]: I1008 13:16:44.942995 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c78787df7-zjd4g" podUID="28f754cf-0740-4da5-bec9-3e238e248795" containerName="dnsmasq-dns" containerID="cri-o://028118fa7025989d5931676c283baf519dd6471cda2508c96df5cd4fc5d1b256" gracePeriod=10 Oct 08 13:16:44 crc kubenswrapper[4735]: I1008 13:16:44.978651 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 08 13:16:45 crc kubenswrapper[4735]: I1008 13:16:45.356845 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 13:16:45 crc kubenswrapper[4735]: I1008 13:16:45.392914 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b56f0fbb-2ce4-46f1-b766-a793f8ac780a-log-httpd\") pod \"b56f0fbb-2ce4-46f1-b766-a793f8ac780a\" (UID: \"b56f0fbb-2ce4-46f1-b766-a793f8ac780a\") " Oct 08 13:16:45 crc kubenswrapper[4735]: I1008 13:16:45.393023 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b56f0fbb-2ce4-46f1-b766-a793f8ac780a-combined-ca-bundle\") pod \"b56f0fbb-2ce4-46f1-b766-a793f8ac780a\" (UID: \"b56f0fbb-2ce4-46f1-b766-a793f8ac780a\") " Oct 08 13:16:45 crc kubenswrapper[4735]: I1008 13:16:45.393080 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b56f0fbb-2ce4-46f1-b766-a793f8ac780a-scripts\") pod \"b56f0fbb-2ce4-46f1-b766-a793f8ac780a\" (UID: \"b56f0fbb-2ce4-46f1-b766-a793f8ac780a\") " Oct 08 13:16:45 crc kubenswrapper[4735]: I1008 13:16:45.393135 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b56f0fbb-2ce4-46f1-b766-a793f8ac780a-sg-core-conf-yaml\") pod \"b56f0fbb-2ce4-46f1-b766-a793f8ac780a\" (UID: \"b56f0fbb-2ce4-46f1-b766-a793f8ac780a\") " Oct 08 13:16:45 crc kubenswrapper[4735]: I1008 13:16:45.393188 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b56f0fbb-2ce4-46f1-b766-a793f8ac780a-config-data\") pod \"b56f0fbb-2ce4-46f1-b766-a793f8ac780a\" (UID: \"b56f0fbb-2ce4-46f1-b766-a793f8ac780a\") " Oct 08 13:16:45 crc kubenswrapper[4735]: I1008 13:16:45.393220 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/b56f0fbb-2ce4-46f1-b766-a793f8ac780a-run-httpd\") pod \"b56f0fbb-2ce4-46f1-b766-a793f8ac780a\" (UID: \"b56f0fbb-2ce4-46f1-b766-a793f8ac780a\") " Oct 08 13:16:45 crc kubenswrapper[4735]: I1008 13:16:45.393246 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8wbw\" (UniqueName: \"kubernetes.io/projected/b56f0fbb-2ce4-46f1-b766-a793f8ac780a-kube-api-access-l8wbw\") pod \"b56f0fbb-2ce4-46f1-b766-a793f8ac780a\" (UID: \"b56f0fbb-2ce4-46f1-b766-a793f8ac780a\") " Oct 08 13:16:45 crc kubenswrapper[4735]: I1008 13:16:45.394175 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b56f0fbb-2ce4-46f1-b766-a793f8ac780a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b56f0fbb-2ce4-46f1-b766-a793f8ac780a" (UID: "b56f0fbb-2ce4-46f1-b766-a793f8ac780a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:16:45 crc kubenswrapper[4735]: I1008 13:16:45.394618 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b56f0fbb-2ce4-46f1-b766-a793f8ac780a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b56f0fbb-2ce4-46f1-b766-a793f8ac780a" (UID: "b56f0fbb-2ce4-46f1-b766-a793f8ac780a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:16:45 crc kubenswrapper[4735]: I1008 13:16:45.412195 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b56f0fbb-2ce4-46f1-b766-a793f8ac780a-kube-api-access-l8wbw" (OuterVolumeSpecName: "kube-api-access-l8wbw") pod "b56f0fbb-2ce4-46f1-b766-a793f8ac780a" (UID: "b56f0fbb-2ce4-46f1-b766-a793f8ac780a"). InnerVolumeSpecName "kube-api-access-l8wbw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:16:45 crc kubenswrapper[4735]: I1008 13:16:45.412439 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b56f0fbb-2ce4-46f1-b766-a793f8ac780a-scripts" (OuterVolumeSpecName: "scripts") pod "b56f0fbb-2ce4-46f1-b766-a793f8ac780a" (UID: "b56f0fbb-2ce4-46f1-b766-a793f8ac780a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:16:45 crc kubenswrapper[4735]: I1008 13:16:45.440659 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b56f0fbb-2ce4-46f1-b766-a793f8ac780a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b56f0fbb-2ce4-46f1-b766-a793f8ac780a" (UID: "b56f0fbb-2ce4-46f1-b766-a793f8ac780a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:16:45 crc kubenswrapper[4735]: I1008 13:16:45.459970 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c78787df7-zjd4g" Oct 08 13:16:45 crc kubenswrapper[4735]: I1008 13:16:45.494521 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/28f754cf-0740-4da5-bec9-3e238e248795-ovsdbserver-nb\") pod \"28f754cf-0740-4da5-bec9-3e238e248795\" (UID: \"28f754cf-0740-4da5-bec9-3e238e248795\") " Oct 08 13:16:45 crc kubenswrapper[4735]: I1008 13:16:45.494923 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/28f754cf-0740-4da5-bec9-3e238e248795-dns-swift-storage-0\") pod \"28f754cf-0740-4da5-bec9-3e238e248795\" (UID: \"28f754cf-0740-4da5-bec9-3e238e248795\") " Oct 08 13:16:45 crc kubenswrapper[4735]: I1008 13:16:45.495204 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/28f754cf-0740-4da5-bec9-3e238e248795-ovsdbserver-sb\") pod \"28f754cf-0740-4da5-bec9-3e238e248795\" (UID: \"28f754cf-0740-4da5-bec9-3e238e248795\") " Oct 08 13:16:45 crc kubenswrapper[4735]: I1008 13:16:45.495287 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/28f754cf-0740-4da5-bec9-3e238e248795-dns-svc\") pod \"28f754cf-0740-4da5-bec9-3e238e248795\" (UID: \"28f754cf-0740-4da5-bec9-3e238e248795\") " Oct 08 13:16:45 crc kubenswrapper[4735]: I1008 13:16:45.495391 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4p22\" (UniqueName: \"kubernetes.io/projected/28f754cf-0740-4da5-bec9-3e238e248795-kube-api-access-g4p22\") pod \"28f754cf-0740-4da5-bec9-3e238e248795\" (UID: \"28f754cf-0740-4da5-bec9-3e238e248795\") " Oct 08 13:16:45 crc kubenswrapper[4735]: I1008 13:16:45.495450 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/28f754cf-0740-4da5-bec9-3e238e248795-config\") pod \"28f754cf-0740-4da5-bec9-3e238e248795\" (UID: \"28f754cf-0740-4da5-bec9-3e238e248795\") " Oct 08 13:16:45 crc kubenswrapper[4735]: I1008 13:16:45.496653 4735 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b56f0fbb-2ce4-46f1-b766-a793f8ac780a-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:45 crc kubenswrapper[4735]: I1008 13:16:45.496697 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b56f0fbb-2ce4-46f1-b766-a793f8ac780a-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:45 crc kubenswrapper[4735]: I1008 13:16:45.496712 4735 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b56f0fbb-2ce4-46f1-b766-a793f8ac780a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:45 crc kubenswrapper[4735]: I1008 13:16:45.496724 4735 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b56f0fbb-2ce4-46f1-b766-a793f8ac780a-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:45 crc kubenswrapper[4735]: I1008 13:16:45.496736 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8wbw\" (UniqueName: \"kubernetes.io/projected/b56f0fbb-2ce4-46f1-b766-a793f8ac780a-kube-api-access-l8wbw\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:45 crc kubenswrapper[4735]: I1008 13:16:45.501862 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28f754cf-0740-4da5-bec9-3e238e248795-kube-api-access-g4p22" (OuterVolumeSpecName: "kube-api-access-g4p22") pod "28f754cf-0740-4da5-bec9-3e238e248795" (UID: "28f754cf-0740-4da5-bec9-3e238e248795"). InnerVolumeSpecName "kube-api-access-g4p22". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:16:45 crc kubenswrapper[4735]: I1008 13:16:45.533115 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b56f0fbb-2ce4-46f1-b766-a793f8ac780a-config-data" (OuterVolumeSpecName: "config-data") pod "b56f0fbb-2ce4-46f1-b766-a793f8ac780a" (UID: "b56f0fbb-2ce4-46f1-b766-a793f8ac780a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:16:45 crc kubenswrapper[4735]: I1008 13:16:45.542918 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b56f0fbb-2ce4-46f1-b766-a793f8ac780a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b56f0fbb-2ce4-46f1-b766-a793f8ac780a" (UID: "b56f0fbb-2ce4-46f1-b766-a793f8ac780a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:16:45 crc kubenswrapper[4735]: I1008 13:16:45.551134 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28f754cf-0740-4da5-bec9-3e238e248795-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "28f754cf-0740-4da5-bec9-3e238e248795" (UID: "28f754cf-0740-4da5-bec9-3e238e248795"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:16:45 crc kubenswrapper[4735]: I1008 13:16:45.570767 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28f754cf-0740-4da5-bec9-3e238e248795-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "28f754cf-0740-4da5-bec9-3e238e248795" (UID: "28f754cf-0740-4da5-bec9-3e238e248795"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:16:45 crc kubenswrapper[4735]: I1008 13:16:45.572261 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28f754cf-0740-4da5-bec9-3e238e248795-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "28f754cf-0740-4da5-bec9-3e238e248795" (UID: "28f754cf-0740-4da5-bec9-3e238e248795"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:16:45 crc kubenswrapper[4735]: I1008 13:16:45.596859 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28f754cf-0740-4da5-bec9-3e238e248795-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "28f754cf-0740-4da5-bec9-3e238e248795" (UID: "28f754cf-0740-4da5-bec9-3e238e248795"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:16:45 crc kubenswrapper[4735]: I1008 13:16:45.597429 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/28f754cf-0740-4da5-bec9-3e238e248795-dns-svc\") pod \"28f754cf-0740-4da5-bec9-3e238e248795\" (UID: \"28f754cf-0740-4da5-bec9-3e238e248795\") " Oct 08 13:16:45 crc kubenswrapper[4735]: I1008 13:16:45.598088 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/28f754cf-0740-4da5-bec9-3e238e248795-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:45 crc kubenswrapper[4735]: I1008 13:16:45.598115 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b56f0fbb-2ce4-46f1-b766-a793f8ac780a-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:45 crc kubenswrapper[4735]: I1008 13:16:45.598129 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4p22\" (UniqueName: 
\"kubernetes.io/projected/28f754cf-0740-4da5-bec9-3e238e248795-kube-api-access-g4p22\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:45 crc kubenswrapper[4735]: I1008 13:16:45.598143 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/28f754cf-0740-4da5-bec9-3e238e248795-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:45 crc kubenswrapper[4735]: W1008 13:16:45.598148 4735 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/28f754cf-0740-4da5-bec9-3e238e248795/volumes/kubernetes.io~configmap/dns-svc Oct 08 13:16:45 crc kubenswrapper[4735]: I1008 13:16:45.598155 4735 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/28f754cf-0740-4da5-bec9-3e238e248795-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:45 crc kubenswrapper[4735]: I1008 13:16:45.598160 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28f754cf-0740-4da5-bec9-3e238e248795-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "28f754cf-0740-4da5-bec9-3e238e248795" (UID: "28f754cf-0740-4da5-bec9-3e238e248795"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:16:45 crc kubenswrapper[4735]: I1008 13:16:45.598171 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b56f0fbb-2ce4-46f1-b766-a793f8ac780a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:45 crc kubenswrapper[4735]: I1008 13:16:45.602969 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28f754cf-0740-4da5-bec9-3e238e248795-config" (OuterVolumeSpecName: "config") pod "28f754cf-0740-4da5-bec9-3e238e248795" (UID: "28f754cf-0740-4da5-bec9-3e238e248795"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:16:45 crc kubenswrapper[4735]: I1008 13:16:45.700047 4735 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/28f754cf-0740-4da5-bec9-3e238e248795-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:45 crc kubenswrapper[4735]: I1008 13:16:45.700097 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28f754cf-0740-4da5-bec9-3e238e248795-config\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:45 crc kubenswrapper[4735]: I1008 13:16:45.905143 4735 generic.go:334] "Generic (PLEG): container finished" podID="b56f0fbb-2ce4-46f1-b766-a793f8ac780a" containerID="46b3a7cda4234d5416801e870faf084fcc3384dc0690f72159d1b11033da14ad" exitCode=0 Oct 08 13:16:45 crc kubenswrapper[4735]: I1008 13:16:45.905230 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 13:16:45 crc kubenswrapper[4735]: I1008 13:16:45.905242 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b56f0fbb-2ce4-46f1-b766-a793f8ac780a","Type":"ContainerDied","Data":"46b3a7cda4234d5416801e870faf084fcc3384dc0690f72159d1b11033da14ad"} Oct 08 13:16:45 crc kubenswrapper[4735]: I1008 13:16:45.905272 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b56f0fbb-2ce4-46f1-b766-a793f8ac780a","Type":"ContainerDied","Data":"c0819f78ea091dd751dd408a292aa10571cf496ead726edef0c413d6d57c804c"} Oct 08 13:16:45 crc kubenswrapper[4735]: I1008 13:16:45.905291 4735 scope.go:117] "RemoveContainer" containerID="e769d1a559404c20cda2c24c922851eb71be2f8adb033c51dee877fa3af16ff7" Oct 08 13:16:45 crc kubenswrapper[4735]: I1008 13:16:45.924836 4735 generic.go:334] "Generic (PLEG): container finished" podID="28f754cf-0740-4da5-bec9-3e238e248795" 
containerID="028118fa7025989d5931676c283baf519dd6471cda2508c96df5cd4fc5d1b256" exitCode=0 Oct 08 13:16:45 crc kubenswrapper[4735]: I1008 13:16:45.924902 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c78787df7-zjd4g" Oct 08 13:16:45 crc kubenswrapper[4735]: I1008 13:16:45.924906 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c78787df7-zjd4g" event={"ID":"28f754cf-0740-4da5-bec9-3e238e248795","Type":"ContainerDied","Data":"028118fa7025989d5931676c283baf519dd6471cda2508c96df5cd4fc5d1b256"} Oct 08 13:16:45 crc kubenswrapper[4735]: I1008 13:16:45.925055 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c78787df7-zjd4g" event={"ID":"28f754cf-0740-4da5-bec9-3e238e248795","Type":"ContainerDied","Data":"b59b543267d73d9e9ff69aecdae0c761eaefee8c907420700bd5324b04dc9e55"} Oct 08 13:16:45 crc kubenswrapper[4735]: I1008 13:16:45.925928 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee" containerName="cinder-scheduler" containerID="cri-o://7d4e033e43405a1bc131fb9c3261d4e3e9c5f3c78368201ede6fed02dadd31d6" gracePeriod=30 Oct 08 13:16:45 crc kubenswrapper[4735]: I1008 13:16:45.925990 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee" containerName="probe" containerID="cri-o://87812ca35965fb0da7ea953b973156c50579497c00f0665f29deb9214f637fa9" gracePeriod=30 Oct 08 13:16:45 crc kubenswrapper[4735]: I1008 13:16:45.995012 4735 scope.go:117] "RemoveContainer" containerID="11f2c5eaf6c7c6de637ab2b70e9a93f6f8393fe808469b003366f42193f5aba8" Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.034412 4735 scope.go:117] "RemoveContainer" containerID="906d22ac69a7ed72995e737aae6e7cd79db2ad54e803f8dabade308bbde7bd97" Oct 08 13:16:46 crc kubenswrapper[4735]: 
I1008 13:16:46.037673 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.040219 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c78787df7-zjd4g"] Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.047927 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c78787df7-zjd4g"] Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.073752 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.079620 4735 scope.go:117] "RemoveContainer" containerID="46b3a7cda4234d5416801e870faf084fcc3384dc0690f72159d1b11033da14ad" Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.092498 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.099295 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 08 13:16:46 crc kubenswrapper[4735]: E1008 13:16:46.099713 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28f754cf-0740-4da5-bec9-3e238e248795" containerName="dnsmasq-dns" Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.099729 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="28f754cf-0740-4da5-bec9-3e238e248795" containerName="dnsmasq-dns" Oct 08 13:16:46 crc kubenswrapper[4735]: E1008 13:16:46.099743 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b56f0fbb-2ce4-46f1-b766-a793f8ac780a" containerName="proxy-httpd" Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.099749 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="b56f0fbb-2ce4-46f1-b766-a793f8ac780a" containerName="proxy-httpd" Oct 08 13:16:46 crc kubenswrapper[4735]: E1008 13:16:46.099766 4735 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b56f0fbb-2ce4-46f1-b766-a793f8ac780a" containerName="ceilometer-central-agent" Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.099774 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="b56f0fbb-2ce4-46f1-b766-a793f8ac780a" containerName="ceilometer-central-agent" Oct 08 13:16:46 crc kubenswrapper[4735]: E1008 13:16:46.099792 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28f754cf-0740-4da5-bec9-3e238e248795" containerName="init" Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.099798 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="28f754cf-0740-4da5-bec9-3e238e248795" containerName="init" Oct 08 13:16:46 crc kubenswrapper[4735]: E1008 13:16:46.099809 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b56f0fbb-2ce4-46f1-b766-a793f8ac780a" containerName="sg-core" Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.099815 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="b56f0fbb-2ce4-46f1-b766-a793f8ac780a" containerName="sg-core" Oct 08 13:16:46 crc kubenswrapper[4735]: E1008 13:16:46.099828 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b56f0fbb-2ce4-46f1-b766-a793f8ac780a" containerName="ceilometer-notification-agent" Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.099833 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="b56f0fbb-2ce4-46f1-b766-a793f8ac780a" containerName="ceilometer-notification-agent" Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.100020 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="b56f0fbb-2ce4-46f1-b766-a793f8ac780a" containerName="ceilometer-notification-agent" Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.100031 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="b56f0fbb-2ce4-46f1-b766-a793f8ac780a" containerName="proxy-httpd" Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.100045 4735 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="b56f0fbb-2ce4-46f1-b766-a793f8ac780a" containerName="ceilometer-central-agent" Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.100053 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="b56f0fbb-2ce4-46f1-b766-a793f8ac780a" containerName="sg-core" Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.100070 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="28f754cf-0740-4da5-bec9-3e238e248795" containerName="dnsmasq-dns" Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.101652 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.105387 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.105509 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.106222 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.119135 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.126492 4735 scope.go:117] "RemoveContainer" containerID="e769d1a559404c20cda2c24c922851eb71be2f8adb033c51dee877fa3af16ff7" Oct 08 13:16:46 crc kubenswrapper[4735]: E1008 13:16:46.127068 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e769d1a559404c20cda2c24c922851eb71be2f8adb033c51dee877fa3af16ff7\": container with ID starting with e769d1a559404c20cda2c24c922851eb71be2f8adb033c51dee877fa3af16ff7 not found: ID does not exist" containerID="e769d1a559404c20cda2c24c922851eb71be2f8adb033c51dee877fa3af16ff7" Oct 08 13:16:46 crc 
kubenswrapper[4735]: I1008 13:16:46.127113 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e769d1a559404c20cda2c24c922851eb71be2f8adb033c51dee877fa3af16ff7"} err="failed to get container status \"e769d1a559404c20cda2c24c922851eb71be2f8adb033c51dee877fa3af16ff7\": rpc error: code = NotFound desc = could not find container \"e769d1a559404c20cda2c24c922851eb71be2f8adb033c51dee877fa3af16ff7\": container with ID starting with e769d1a559404c20cda2c24c922851eb71be2f8adb033c51dee877fa3af16ff7 not found: ID does not exist" Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.127140 4735 scope.go:117] "RemoveContainer" containerID="11f2c5eaf6c7c6de637ab2b70e9a93f6f8393fe808469b003366f42193f5aba8" Oct 08 13:16:46 crc kubenswrapper[4735]: E1008 13:16:46.127444 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11f2c5eaf6c7c6de637ab2b70e9a93f6f8393fe808469b003366f42193f5aba8\": container with ID starting with 11f2c5eaf6c7c6de637ab2b70e9a93f6f8393fe808469b003366f42193f5aba8 not found: ID does not exist" containerID="11f2c5eaf6c7c6de637ab2b70e9a93f6f8393fe808469b003366f42193f5aba8" Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.127491 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11f2c5eaf6c7c6de637ab2b70e9a93f6f8393fe808469b003366f42193f5aba8"} err="failed to get container status \"11f2c5eaf6c7c6de637ab2b70e9a93f6f8393fe808469b003366f42193f5aba8\": rpc error: code = NotFound desc = could not find container \"11f2c5eaf6c7c6de637ab2b70e9a93f6f8393fe808469b003366f42193f5aba8\": container with ID starting with 11f2c5eaf6c7c6de637ab2b70e9a93f6f8393fe808469b003366f42193f5aba8 not found: ID does not exist" Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.127513 4735 scope.go:117] "RemoveContainer" containerID="906d22ac69a7ed72995e737aae6e7cd79db2ad54e803f8dabade308bbde7bd97" Oct 08 
13:16:46 crc kubenswrapper[4735]: E1008 13:16:46.128710 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"906d22ac69a7ed72995e737aae6e7cd79db2ad54e803f8dabade308bbde7bd97\": container with ID starting with 906d22ac69a7ed72995e737aae6e7cd79db2ad54e803f8dabade308bbde7bd97 not found: ID does not exist" containerID="906d22ac69a7ed72995e737aae6e7cd79db2ad54e803f8dabade308bbde7bd97" Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.128858 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"906d22ac69a7ed72995e737aae6e7cd79db2ad54e803f8dabade308bbde7bd97"} err="failed to get container status \"906d22ac69a7ed72995e737aae6e7cd79db2ad54e803f8dabade308bbde7bd97\": rpc error: code = NotFound desc = could not find container \"906d22ac69a7ed72995e737aae6e7cd79db2ad54e803f8dabade308bbde7bd97\": container with ID starting with 906d22ac69a7ed72995e737aae6e7cd79db2ad54e803f8dabade308bbde7bd97 not found: ID does not exist" Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.128890 4735 scope.go:117] "RemoveContainer" containerID="46b3a7cda4234d5416801e870faf084fcc3384dc0690f72159d1b11033da14ad" Oct 08 13:16:46 crc kubenswrapper[4735]: E1008 13:16:46.129253 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46b3a7cda4234d5416801e870faf084fcc3384dc0690f72159d1b11033da14ad\": container with ID starting with 46b3a7cda4234d5416801e870faf084fcc3384dc0690f72159d1b11033da14ad not found: ID does not exist" containerID="46b3a7cda4234d5416801e870faf084fcc3384dc0690f72159d1b11033da14ad" Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.129295 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46b3a7cda4234d5416801e870faf084fcc3384dc0690f72159d1b11033da14ad"} err="failed to get container status 
\"46b3a7cda4234d5416801e870faf084fcc3384dc0690f72159d1b11033da14ad\": rpc error: code = NotFound desc = could not find container \"46b3a7cda4234d5416801e870faf084fcc3384dc0690f72159d1b11033da14ad\": container with ID starting with 46b3a7cda4234d5416801e870faf084fcc3384dc0690f72159d1b11033da14ad not found: ID does not exist" Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.129309 4735 scope.go:117] "RemoveContainer" containerID="028118fa7025989d5931676c283baf519dd6471cda2508c96df5cd4fc5d1b256" Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.170861 4735 scope.go:117] "RemoveContainer" containerID="f98cc139ed1e0eae464b412deb3f650748a9d91223711fd0c2ffa1a03a5ba1f8" Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.195793 4735 scope.go:117] "RemoveContainer" containerID="028118fa7025989d5931676c283baf519dd6471cda2508c96df5cd4fc5d1b256" Oct 08 13:16:46 crc kubenswrapper[4735]: E1008 13:16:46.198446 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"028118fa7025989d5931676c283baf519dd6471cda2508c96df5cd4fc5d1b256\": container with ID starting with 028118fa7025989d5931676c283baf519dd6471cda2508c96df5cd4fc5d1b256 not found: ID does not exist" containerID="028118fa7025989d5931676c283baf519dd6471cda2508c96df5cd4fc5d1b256" Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.198505 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"028118fa7025989d5931676c283baf519dd6471cda2508c96df5cd4fc5d1b256"} err="failed to get container status \"028118fa7025989d5931676c283baf519dd6471cda2508c96df5cd4fc5d1b256\": rpc error: code = NotFound desc = could not find container \"028118fa7025989d5931676c283baf519dd6471cda2508c96df5cd4fc5d1b256\": container with ID starting with 028118fa7025989d5931676c283baf519dd6471cda2508c96df5cd4fc5d1b256 not found: ID does not exist" Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.198534 4735 
scope.go:117] "RemoveContainer" containerID="f98cc139ed1e0eae464b412deb3f650748a9d91223711fd0c2ffa1a03a5ba1f8" Oct 08 13:16:46 crc kubenswrapper[4735]: E1008 13:16:46.199991 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f98cc139ed1e0eae464b412deb3f650748a9d91223711fd0c2ffa1a03a5ba1f8\": container with ID starting with f98cc139ed1e0eae464b412deb3f650748a9d91223711fd0c2ffa1a03a5ba1f8 not found: ID does not exist" containerID="f98cc139ed1e0eae464b412deb3f650748a9d91223711fd0c2ffa1a03a5ba1f8" Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.200057 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f98cc139ed1e0eae464b412deb3f650748a9d91223711fd0c2ffa1a03a5ba1f8"} err="failed to get container status \"f98cc139ed1e0eae464b412deb3f650748a9d91223711fd0c2ffa1a03a5ba1f8\": rpc error: code = NotFound desc = could not find container \"f98cc139ed1e0eae464b412deb3f650748a9d91223711fd0c2ffa1a03a5ba1f8\": container with ID starting with f98cc139ed1e0eae464b412deb3f650748a9d91223711fd0c2ffa1a03a5ba1f8 not found: ID does not exist" Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.210313 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4-config-data\") pod \"ceilometer-0\" (UID: \"c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4\") " pod="openstack/ceilometer-0" Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.211931 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mplbv\" (UniqueName: \"kubernetes.io/projected/c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4-kube-api-access-mplbv\") pod \"ceilometer-0\" (UID: \"c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4\") " pod="openstack/ceilometer-0" Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.228577 4735 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4\") " pod="openstack/ceilometer-0" Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.229984 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4\") " pod="openstack/ceilometer-0" Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.230271 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4-scripts\") pod \"ceilometer-0\" (UID: \"c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4\") " pod="openstack/ceilometer-0" Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.232423 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4\") " pod="openstack/ceilometer-0" Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.233580 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4-log-httpd\") pod \"ceilometer-0\" (UID: \"c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4\") " pod="openstack/ceilometer-0" Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.233950 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4-run-httpd\") pod \"ceilometer-0\" (UID: \"c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4\") " pod="openstack/ceilometer-0" Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.336208 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4-config-data\") pod \"ceilometer-0\" (UID: \"c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4\") " pod="openstack/ceilometer-0" Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.336274 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mplbv\" (UniqueName: \"kubernetes.io/projected/c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4-kube-api-access-mplbv\") pod \"ceilometer-0\" (UID: \"c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4\") " pod="openstack/ceilometer-0" Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.336318 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4\") " pod="openstack/ceilometer-0" Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.336359 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4\") " pod="openstack/ceilometer-0" Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.336410 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4-scripts\") pod \"ceilometer-0\" (UID: \"c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4\") " pod="openstack/ceilometer-0" Oct 08 13:16:46 crc kubenswrapper[4735]: 
I1008 13:16:46.336436 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4\") " pod="openstack/ceilometer-0" Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.336571 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4-log-httpd\") pod \"ceilometer-0\" (UID: \"c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4\") " pod="openstack/ceilometer-0" Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.337083 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4-run-httpd\") pod \"ceilometer-0\" (UID: \"c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4\") " pod="openstack/ceilometer-0" Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.337463 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4-log-httpd\") pod \"ceilometer-0\" (UID: \"c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4\") " pod="openstack/ceilometer-0" Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.337510 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4-run-httpd\") pod \"ceilometer-0\" (UID: \"c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4\") " pod="openstack/ceilometer-0" Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.340712 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4\") " pod="openstack/ceilometer-0" Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.340927 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4-scripts\") pod \"ceilometer-0\" (UID: \"c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4\") " pod="openstack/ceilometer-0" Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.341189 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4\") " pod="openstack/ceilometer-0" Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.341513 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4-config-data\") pod \"ceilometer-0\" (UID: \"c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4\") " pod="openstack/ceilometer-0" Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.350424 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4\") " pod="openstack/ceilometer-0" Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.356981 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mplbv\" (UniqueName: \"kubernetes.io/projected/c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4-kube-api-access-mplbv\") pod \"ceilometer-0\" (UID: \"c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4\") " pod="openstack/ceilometer-0" Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.389258 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/glance-default-external-api-0" Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.389304 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.421008 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.430365 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.436576 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.881464 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.881858 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.883272 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.914908 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28f754cf-0740-4da5-bec9-3e238e248795" path="/var/lib/kubelet/pods/28f754cf-0740-4da5-bec9-3e238e248795/volumes" Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.916302 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b56f0fbb-2ce4-46f1-b766-a793f8ac780a" path="/var/lib/kubelet/pods/b56f0fbb-2ce4-46f1-b766-a793f8ac780a/volumes" Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.919594 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 08 13:16:46 crc 
kubenswrapper[4735]: I1008 13:16:46.931692 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.944157 4735 generic.go:334] "Generic (PLEG): container finished" podID="8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee" containerID="87812ca35965fb0da7ea953b973156c50579497c00f0665f29deb9214f637fa9" exitCode=0 Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.944260 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee","Type":"ContainerDied","Data":"87812ca35965fb0da7ea953b973156c50579497c00f0665f29deb9214f637fa9"} Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.948948 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4","Type":"ContainerStarted","Data":"9d5a538af64959a7641301ace964597e6f1bf36bca4441b300b65f39f4b4c6eb"} Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.954351 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.954385 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.954395 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 08 13:16:46 crc kubenswrapper[4735]: I1008 13:16:46.954402 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 08 13:16:47 crc kubenswrapper[4735]: I1008 13:16:47.056985 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 13:16:47 crc kubenswrapper[4735]: I1008 13:16:47.966128 4735 generic.go:334] "Generic (PLEG): container 
finished" podID="8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee" containerID="7d4e033e43405a1bc131fb9c3261d4e3e9c5f3c78368201ede6fed02dadd31d6" exitCode=0 Oct 08 13:16:47 crc kubenswrapper[4735]: I1008 13:16:47.967030 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee","Type":"ContainerDied","Data":"7d4e033e43405a1bc131fb9c3261d4e3e9c5f3c78368201ede6fed02dadd31d6"} Oct 08 13:16:48 crc kubenswrapper[4735]: I1008 13:16:48.501032 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 08 13:16:48 crc kubenswrapper[4735]: I1008 13:16:48.684294 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee-config-data\") pod \"8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee\" (UID: \"8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee\") " Oct 08 13:16:48 crc kubenswrapper[4735]: I1008 13:16:48.684377 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xx7lw\" (UniqueName: \"kubernetes.io/projected/8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee-kube-api-access-xx7lw\") pod \"8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee\" (UID: \"8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee\") " Oct 08 13:16:48 crc kubenswrapper[4735]: I1008 13:16:48.684396 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee-scripts\") pod \"8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee\" (UID: \"8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee\") " Oct 08 13:16:48 crc kubenswrapper[4735]: I1008 13:16:48.684421 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee-config-data-custom\") pod \"8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee\" (UID: 
\"8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee\") " Oct 08 13:16:48 crc kubenswrapper[4735]: I1008 13:16:48.684599 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee-combined-ca-bundle\") pod \"8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee\" (UID: \"8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee\") " Oct 08 13:16:48 crc kubenswrapper[4735]: I1008 13:16:48.684651 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee-etc-machine-id\") pod \"8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee\" (UID: \"8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee\") " Oct 08 13:16:48 crc kubenswrapper[4735]: I1008 13:16:48.685374 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee" (UID: "8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 13:16:48 crc kubenswrapper[4735]: I1008 13:16:48.690293 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee" (UID: "8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:16:48 crc kubenswrapper[4735]: I1008 13:16:48.690503 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee-kube-api-access-xx7lw" (OuterVolumeSpecName: "kube-api-access-xx7lw") pod "8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee" (UID: "8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee"). InnerVolumeSpecName "kube-api-access-xx7lw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:16:48 crc kubenswrapper[4735]: I1008 13:16:48.692572 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee-scripts" (OuterVolumeSpecName: "scripts") pod "8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee" (UID: "8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:16:48 crc kubenswrapper[4735]: I1008 13:16:48.739755 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee" (UID: "8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:16:48 crc kubenswrapper[4735]: I1008 13:16:48.787106 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:48 crc kubenswrapper[4735]: I1008 13:16:48.787149 4735 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:48 crc kubenswrapper[4735]: I1008 13:16:48.787164 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xx7lw\" (UniqueName: \"kubernetes.io/projected/8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee-kube-api-access-xx7lw\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:48 crc kubenswrapper[4735]: I1008 13:16:48.787178 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:48 crc kubenswrapper[4735]: I1008 13:16:48.787190 4735 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:48 crc kubenswrapper[4735]: I1008 13:16:48.796601 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee-config-data" (OuterVolumeSpecName: "config-data") pod "8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee" (UID: "8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:16:48 crc kubenswrapper[4735]: I1008 13:16:48.888355 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:48 crc kubenswrapper[4735]: I1008 13:16:48.977274 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee","Type":"ContainerDied","Data":"bccb983981b1f26098aa949485a54cdd121dc8f152de2d2e5562f744c198bc4b"} Oct 08 13:16:48 crc kubenswrapper[4735]: I1008 13:16:48.977325 4735 scope.go:117] "RemoveContainer" containerID="87812ca35965fb0da7ea953b973156c50579497c00f0665f29deb9214f637fa9" Oct 08 13:16:48 crc kubenswrapper[4735]: I1008 13:16:48.977451 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 08 13:16:48 crc kubenswrapper[4735]: I1008 13:16:48.981083 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4","Type":"ContainerStarted","Data":"03ff4f3b7915d4d32f5ee0480b31a19b966ae0f651351644c3ec7d0331ef0f86"} Oct 08 13:16:49 crc kubenswrapper[4735]: I1008 13:16:49.003077 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 08 13:16:49 crc kubenswrapper[4735]: I1008 13:16:49.014426 4735 scope.go:117] "RemoveContainer" containerID="7d4e033e43405a1bc131fb9c3261d4e3e9c5f3c78368201ede6fed02dadd31d6" Oct 08 13:16:49 crc kubenswrapper[4735]: I1008 13:16:49.018127 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 08 13:16:49 crc kubenswrapper[4735]: I1008 13:16:49.034208 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 08 13:16:49 crc kubenswrapper[4735]: E1008 13:16:49.035226 4735 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee" containerName="probe" Oct 08 13:16:49 crc kubenswrapper[4735]: I1008 13:16:49.035250 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee" containerName="probe" Oct 08 13:16:49 crc kubenswrapper[4735]: E1008 13:16:49.035270 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee" containerName="cinder-scheduler" Oct 08 13:16:49 crc kubenswrapper[4735]: I1008 13:16:49.035279 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee" containerName="cinder-scheduler" Oct 08 13:16:49 crc kubenswrapper[4735]: I1008 13:16:49.035530 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee" containerName="probe" Oct 08 13:16:49 crc kubenswrapper[4735]: I1008 13:16:49.035572 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee" containerName="cinder-scheduler" Oct 08 13:16:49 crc kubenswrapper[4735]: I1008 13:16:49.036707 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 08 13:16:49 crc kubenswrapper[4735]: I1008 13:16:49.049752 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 08 13:16:49 crc kubenswrapper[4735]: I1008 13:16:49.090170 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 08 13:16:49 crc kubenswrapper[4735]: I1008 13:16:49.198593 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/22f4e804-ab3e-4858-87b4-84a310c21249-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"22f4e804-ab3e-4858-87b4-84a310c21249\") " pod="openstack/cinder-scheduler-0" Oct 08 13:16:49 crc kubenswrapper[4735]: I1008 13:16:49.198635 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22f4e804-ab3e-4858-87b4-84a310c21249-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"22f4e804-ab3e-4858-87b4-84a310c21249\") " pod="openstack/cinder-scheduler-0" Oct 08 13:16:49 crc kubenswrapper[4735]: I1008 13:16:49.198838 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22f4e804-ab3e-4858-87b4-84a310c21249-scripts\") pod \"cinder-scheduler-0\" (UID: \"22f4e804-ab3e-4858-87b4-84a310c21249\") " pod="openstack/cinder-scheduler-0" Oct 08 13:16:49 crc kubenswrapper[4735]: I1008 13:16:49.198892 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22f4e804-ab3e-4858-87b4-84a310c21249-config-data\") pod \"cinder-scheduler-0\" (UID: \"22f4e804-ab3e-4858-87b4-84a310c21249\") " pod="openstack/cinder-scheduler-0" Oct 08 13:16:49 crc kubenswrapper[4735]: I1008 13:16:49.198991 4735 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d27nt\" (UniqueName: \"kubernetes.io/projected/22f4e804-ab3e-4858-87b4-84a310c21249-kube-api-access-d27nt\") pod \"cinder-scheduler-0\" (UID: \"22f4e804-ab3e-4858-87b4-84a310c21249\") " pod="openstack/cinder-scheduler-0" Oct 08 13:16:49 crc kubenswrapper[4735]: I1008 13:16:49.199084 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/22f4e804-ab3e-4858-87b4-84a310c21249-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"22f4e804-ab3e-4858-87b4-84a310c21249\") " pod="openstack/cinder-scheduler-0" Oct 08 13:16:49 crc kubenswrapper[4735]: I1008 13:16:49.290231 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 08 13:16:49 crc kubenswrapper[4735]: I1008 13:16:49.290337 4735 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 08 13:16:49 crc kubenswrapper[4735]: I1008 13:16:49.290825 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 08 13:16:49 crc kubenswrapper[4735]: I1008 13:16:49.300287 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22f4e804-ab3e-4858-87b4-84a310c21249-scripts\") pod \"cinder-scheduler-0\" (UID: \"22f4e804-ab3e-4858-87b4-84a310c21249\") " pod="openstack/cinder-scheduler-0" Oct 08 13:16:49 crc kubenswrapper[4735]: I1008 13:16:49.300334 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22f4e804-ab3e-4858-87b4-84a310c21249-config-data\") pod \"cinder-scheduler-0\" (UID: \"22f4e804-ab3e-4858-87b4-84a310c21249\") " pod="openstack/cinder-scheduler-0" Oct 08 13:16:49 crc kubenswrapper[4735]: I1008 
13:16:49.300380 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d27nt\" (UniqueName: \"kubernetes.io/projected/22f4e804-ab3e-4858-87b4-84a310c21249-kube-api-access-d27nt\") pod \"cinder-scheduler-0\" (UID: \"22f4e804-ab3e-4858-87b4-84a310c21249\") " pod="openstack/cinder-scheduler-0" Oct 08 13:16:49 crc kubenswrapper[4735]: I1008 13:16:49.300419 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/22f4e804-ab3e-4858-87b4-84a310c21249-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"22f4e804-ab3e-4858-87b4-84a310c21249\") " pod="openstack/cinder-scheduler-0" Oct 08 13:16:49 crc kubenswrapper[4735]: I1008 13:16:49.300545 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/22f4e804-ab3e-4858-87b4-84a310c21249-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"22f4e804-ab3e-4858-87b4-84a310c21249\") " pod="openstack/cinder-scheduler-0" Oct 08 13:16:49 crc kubenswrapper[4735]: I1008 13:16:49.300566 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22f4e804-ab3e-4858-87b4-84a310c21249-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"22f4e804-ab3e-4858-87b4-84a310c21249\") " pod="openstack/cinder-scheduler-0" Oct 08 13:16:49 crc kubenswrapper[4735]: I1008 13:16:49.303935 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/22f4e804-ab3e-4858-87b4-84a310c21249-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"22f4e804-ab3e-4858-87b4-84a310c21249\") " pod="openstack/cinder-scheduler-0" Oct 08 13:16:49 crc kubenswrapper[4735]: I1008 13:16:49.307705 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/22f4e804-ab3e-4858-87b4-84a310c21249-config-data\") pod \"cinder-scheduler-0\" (UID: \"22f4e804-ab3e-4858-87b4-84a310c21249\") " pod="openstack/cinder-scheduler-0" Oct 08 13:16:49 crc kubenswrapper[4735]: I1008 13:16:49.308061 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22f4e804-ab3e-4858-87b4-84a310c21249-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"22f4e804-ab3e-4858-87b4-84a310c21249\") " pod="openstack/cinder-scheduler-0" Oct 08 13:16:49 crc kubenswrapper[4735]: I1008 13:16:49.310347 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/22f4e804-ab3e-4858-87b4-84a310c21249-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"22f4e804-ab3e-4858-87b4-84a310c21249\") " pod="openstack/cinder-scheduler-0" Oct 08 13:16:49 crc kubenswrapper[4735]: I1008 13:16:49.320037 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22f4e804-ab3e-4858-87b4-84a310c21249-scripts\") pod \"cinder-scheduler-0\" (UID: \"22f4e804-ab3e-4858-87b4-84a310c21249\") " pod="openstack/cinder-scheduler-0" Oct 08 13:16:49 crc kubenswrapper[4735]: I1008 13:16:49.348155 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d27nt\" (UniqueName: \"kubernetes.io/projected/22f4e804-ab3e-4858-87b4-84a310c21249-kube-api-access-d27nt\") pod \"cinder-scheduler-0\" (UID: \"22f4e804-ab3e-4858-87b4-84a310c21249\") " pod="openstack/cinder-scheduler-0" Oct 08 13:16:49 crc kubenswrapper[4735]: I1008 13:16:49.365161 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 08 13:16:49 crc kubenswrapper[4735]: I1008 13:16:49.714890 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 08 13:16:49 crc kubenswrapper[4735]: I1008 13:16:49.715468 4735 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 08 13:16:49 crc kubenswrapper[4735]: I1008 13:16:49.716815 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 08 13:16:49 crc kubenswrapper[4735]: I1008 13:16:49.896450 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 08 13:16:50 crc kubenswrapper[4735]: I1008 13:16:50.013602 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"22f4e804-ab3e-4858-87b4-84a310c21249","Type":"ContainerStarted","Data":"9bd4613a433c56bf34cefa6d3b8d95de96342f543309ed08011b6cc7148e8c82"} Oct 08 13:16:50 crc kubenswrapper[4735]: I1008 13:16:50.022274 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4","Type":"ContainerStarted","Data":"20cdcdcacf462e968ddff1efa5ae532a0c9694c09f2f71c8feec4785d9a23952"} Oct 08 13:16:50 crc kubenswrapper[4735]: I1008 13:16:50.800953 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-86a1-account-create-xgf9d"] Oct 08 13:16:50 crc kubenswrapper[4735]: I1008 13:16:50.802327 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-86a1-account-create-xgf9d" Oct 08 13:16:50 crc kubenswrapper[4735]: I1008 13:16:50.806600 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 08 13:16:50 crc kubenswrapper[4735]: I1008 13:16:50.814634 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-86a1-account-create-xgf9d"] Oct 08 13:16:50 crc kubenswrapper[4735]: I1008 13:16:50.948017 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59mq2\" (UniqueName: \"kubernetes.io/projected/1d4ee7a9-f521-442b-8c8e-354fbc7dbbda-kube-api-access-59mq2\") pod \"nova-api-86a1-account-create-xgf9d\" (UID: \"1d4ee7a9-f521-442b-8c8e-354fbc7dbbda\") " pod="openstack/nova-api-86a1-account-create-xgf9d" Oct 08 13:16:50 crc kubenswrapper[4735]: I1008 13:16:50.954351 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee" path="/var/lib/kubelet/pods/8093aedd-e8c4-4d72-9d5f-b8e9fb2595ee/volumes" Oct 08 13:16:51 crc kubenswrapper[4735]: I1008 13:16:51.034192 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-66dd-account-create-4q8hl"] Oct 08 13:16:51 crc kubenswrapper[4735]: I1008 13:16:51.035255 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-66dd-account-create-4q8hl" Oct 08 13:16:51 crc kubenswrapper[4735]: I1008 13:16:51.037939 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Oct 08 13:16:51 crc kubenswrapper[4735]: I1008 13:16:51.050460 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59mq2\" (UniqueName: \"kubernetes.io/projected/1d4ee7a9-f521-442b-8c8e-354fbc7dbbda-kube-api-access-59mq2\") pod \"nova-api-86a1-account-create-xgf9d\" (UID: \"1d4ee7a9-f521-442b-8c8e-354fbc7dbbda\") " pod="openstack/nova-api-86a1-account-create-xgf9d" Oct 08 13:16:51 crc kubenswrapper[4735]: I1008 13:16:51.065014 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"22f4e804-ab3e-4858-87b4-84a310c21249","Type":"ContainerStarted","Data":"f976efe16a474e89b33e73cf876bd99fd72a4f29125be28a2094105456065a00"} Oct 08 13:16:51 crc kubenswrapper[4735]: I1008 13:16:51.096578 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-66dd-account-create-4q8hl"] Oct 08 13:16:51 crc kubenswrapper[4735]: I1008 13:16:51.098611 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59mq2\" (UniqueName: \"kubernetes.io/projected/1d4ee7a9-f521-442b-8c8e-354fbc7dbbda-kube-api-access-59mq2\") pod \"nova-api-86a1-account-create-xgf9d\" (UID: \"1d4ee7a9-f521-442b-8c8e-354fbc7dbbda\") " pod="openstack/nova-api-86a1-account-create-xgf9d" Oct 08 13:16:51 crc kubenswrapper[4735]: I1008 13:16:51.152606 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jmbj\" (UniqueName: \"kubernetes.io/projected/72461694-86b9-4133-9970-302416099748-kube-api-access-7jmbj\") pod \"nova-cell0-66dd-account-create-4q8hl\" (UID: \"72461694-86b9-4133-9970-302416099748\") " pod="openstack/nova-cell0-66dd-account-create-4q8hl" Oct 08 13:16:51 crc 
kubenswrapper[4735]: I1008 13:16:51.191000 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-726a-account-create-5szhf"] Oct 08 13:16:51 crc kubenswrapper[4735]: I1008 13:16:51.197224 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-726a-account-create-5szhf" Oct 08 13:16:51 crc kubenswrapper[4735]: I1008 13:16:51.202345 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Oct 08 13:16:51 crc kubenswrapper[4735]: I1008 13:16:51.206556 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-726a-account-create-5szhf"] Oct 08 13:16:51 crc kubenswrapper[4735]: I1008 13:16:51.255735 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jmbj\" (UniqueName: \"kubernetes.io/projected/72461694-86b9-4133-9970-302416099748-kube-api-access-7jmbj\") pod \"nova-cell0-66dd-account-create-4q8hl\" (UID: \"72461694-86b9-4133-9970-302416099748\") " pod="openstack/nova-cell0-66dd-account-create-4q8hl" Oct 08 13:16:51 crc kubenswrapper[4735]: I1008 13:16:51.255889 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spjnj\" (UniqueName: \"kubernetes.io/projected/b452547c-5bac-4df5-a0e6-d969416877fc-kube-api-access-spjnj\") pod \"nova-cell1-726a-account-create-5szhf\" (UID: \"b452547c-5bac-4df5-a0e6-d969416877fc\") " pod="openstack/nova-cell1-726a-account-create-5szhf" Oct 08 13:16:51 crc kubenswrapper[4735]: I1008 13:16:51.289029 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-86a1-account-create-xgf9d" Oct 08 13:16:51 crc kubenswrapper[4735]: I1008 13:16:51.291833 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jmbj\" (UniqueName: \"kubernetes.io/projected/72461694-86b9-4133-9970-302416099748-kube-api-access-7jmbj\") pod \"nova-cell0-66dd-account-create-4q8hl\" (UID: \"72461694-86b9-4133-9970-302416099748\") " pod="openstack/nova-cell0-66dd-account-create-4q8hl" Oct 08 13:16:51 crc kubenswrapper[4735]: I1008 13:16:51.358613 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spjnj\" (UniqueName: \"kubernetes.io/projected/b452547c-5bac-4df5-a0e6-d969416877fc-kube-api-access-spjnj\") pod \"nova-cell1-726a-account-create-5szhf\" (UID: \"b452547c-5bac-4df5-a0e6-d969416877fc\") " pod="openstack/nova-cell1-726a-account-create-5szhf" Oct 08 13:16:51 crc kubenswrapper[4735]: I1008 13:16:51.384950 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spjnj\" (UniqueName: \"kubernetes.io/projected/b452547c-5bac-4df5-a0e6-d969416877fc-kube-api-access-spjnj\") pod \"nova-cell1-726a-account-create-5szhf\" (UID: \"b452547c-5bac-4df5-a0e6-d969416877fc\") " pod="openstack/nova-cell1-726a-account-create-5szhf" Oct 08 13:16:51 crc kubenswrapper[4735]: I1008 13:16:51.399117 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-66dd-account-create-4q8hl" Oct 08 13:16:51 crc kubenswrapper[4735]: I1008 13:16:51.545511 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-726a-account-create-5szhf" Oct 08 13:16:52 crc kubenswrapper[4735]: I1008 13:16:52.003851 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-86a1-account-create-xgf9d"] Oct 08 13:16:52 crc kubenswrapper[4735]: I1008 13:16:52.050756 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 08 13:16:52 crc kubenswrapper[4735]: I1008 13:16:52.098640 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"22f4e804-ab3e-4858-87b4-84a310c21249","Type":"ContainerStarted","Data":"b7bc2b861ffd2f1d3a1ede5b87a767f230be0eb8b6d7f935ee21e11963a9f933"} Oct 08 13:16:52 crc kubenswrapper[4735]: I1008 13:16:52.135621 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4","Type":"ContainerStarted","Data":"e1244d8badbeeba1b7f1e9995d759cf99b22e94147e91d0d448fb758690bb3ff"} Oct 08 13:16:52 crc kubenswrapper[4735]: I1008 13:16:52.137282 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-86a1-account-create-xgf9d" event={"ID":"1d4ee7a9-f521-442b-8c8e-354fbc7dbbda","Type":"ContainerStarted","Data":"9f6ef77319e408022ec777ecb97fe2e2810c4671c7b3774d61fbdf31d0dcfeb5"} Oct 08 13:16:52 crc kubenswrapper[4735]: I1008 13:16:52.188780 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.1887505210000002 podStartE2EDuration="3.188750521s" podCreationTimestamp="2025-10-08 13:16:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 13:16:52.174630275 +0000 UTC m=+1261.982934809" watchObservedRunningTime="2025-10-08 13:16:52.188750521 +0000 UTC m=+1261.997055065" Oct 08 13:16:52 crc kubenswrapper[4735]: I1008 13:16:52.196783 4735 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/nova-cell0-66dd-account-create-4q8hl"] Oct 08 13:16:52 crc kubenswrapper[4735]: W1008 13:16:52.239166 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72461694_86b9_4133_9970_302416099748.slice/crio-a19a659e0ef76b3e1fd8f1e7afc8bf4995578ed5db890412d1820c1078506725 WatchSource:0}: Error finding container a19a659e0ef76b3e1fd8f1e7afc8bf4995578ed5db890412d1820c1078506725: Status 404 returned error can't find the container with id a19a659e0ef76b3e1fd8f1e7afc8bf4995578ed5db890412d1820c1078506725 Oct 08 13:16:52 crc kubenswrapper[4735]: I1008 13:16:52.384135 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-726a-account-create-5szhf"] Oct 08 13:16:53 crc kubenswrapper[4735]: I1008 13:16:53.170390 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-86a1-account-create-xgf9d" event={"ID":"1d4ee7a9-f521-442b-8c8e-354fbc7dbbda","Type":"ContainerStarted","Data":"291cb29e3cba296e0d831cdae9bfa80b48fc271ca1f0dd6ea086ecdd10b45c37"} Oct 08 13:16:53 crc kubenswrapper[4735]: I1008 13:16:53.177011 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-726a-account-create-5szhf" event={"ID":"b452547c-5bac-4df5-a0e6-d969416877fc","Type":"ContainerStarted","Data":"caf2689e62a63c0e957ec5e45b184662c220651a79a995c9d7efc5643cfa216b"} Oct 08 13:16:53 crc kubenswrapper[4735]: I1008 13:16:53.177318 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-726a-account-create-5szhf" event={"ID":"b452547c-5bac-4df5-a0e6-d969416877fc","Type":"ContainerStarted","Data":"fa92c82df8cba82355db34313d26195897daad9f2cd95531444bcca1bc2d76a1"} Oct 08 13:16:53 crc kubenswrapper[4735]: I1008 13:16:53.178775 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-66dd-account-create-4q8hl" 
event={"ID":"72461694-86b9-4133-9970-302416099748","Type":"ContainerStarted","Data":"96ea68b8faaa1a0dac49635d32f8379baed6ab70cff4b930e0003c6cce8a448f"} Oct 08 13:16:53 crc kubenswrapper[4735]: I1008 13:16:53.178825 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-66dd-account-create-4q8hl" event={"ID":"72461694-86b9-4133-9970-302416099748","Type":"ContainerStarted","Data":"a19a659e0ef76b3e1fd8f1e7afc8bf4995578ed5db890412d1820c1078506725"} Oct 08 13:16:53 crc kubenswrapper[4735]: I1008 13:16:53.195550 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4","Type":"ContainerStarted","Data":"4a8fab144d84740b96c1e209e88fc9c8b1ee29293ddb7e50d51c8e95ffd6027e"} Oct 08 13:16:53 crc kubenswrapper[4735]: I1008 13:16:53.195716 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 08 13:16:53 crc kubenswrapper[4735]: I1008 13:16:53.195699 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4" containerName="ceilometer-central-agent" containerID="cri-o://03ff4f3b7915d4d32f5ee0480b31a19b966ae0f651351644c3ec7d0331ef0f86" gracePeriod=30 Oct 08 13:16:53 crc kubenswrapper[4735]: I1008 13:16:53.195967 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4" containerName="sg-core" containerID="cri-o://e1244d8badbeeba1b7f1e9995d759cf99b22e94147e91d0d448fb758690bb3ff" gracePeriod=30 Oct 08 13:16:53 crc kubenswrapper[4735]: I1008 13:16:53.196043 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4" containerName="proxy-httpd" containerID="cri-o://4a8fab144d84740b96c1e209e88fc9c8b1ee29293ddb7e50d51c8e95ffd6027e" gracePeriod=30 Oct 08 13:16:53 crc 
kubenswrapper[4735]: I1008 13:16:53.195980 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4" containerName="ceilometer-notification-agent" containerID="cri-o://20cdcdcacf462e968ddff1efa5ae532a0c9694c09f2f71c8feec4785d9a23952" gracePeriod=30 Oct 08 13:16:53 crc kubenswrapper[4735]: I1008 13:16:53.198095 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-86a1-account-create-xgf9d" podStartSLOduration=3.198080087 podStartE2EDuration="3.198080087s" podCreationTimestamp="2025-10-08 13:16:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 13:16:53.185215473 +0000 UTC m=+1262.993520017" watchObservedRunningTime="2025-10-08 13:16:53.198080087 +0000 UTC m=+1263.006384631" Oct 08 13:16:53 crc kubenswrapper[4735]: I1008 13:16:53.207294 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-726a-account-create-5szhf" podStartSLOduration=2.207276385 podStartE2EDuration="2.207276385s" podCreationTimestamp="2025-10-08 13:16:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 13:16:53.20206226 +0000 UTC m=+1263.010366804" watchObservedRunningTime="2025-10-08 13:16:53.207276385 +0000 UTC m=+1263.015580929" Oct 08 13:16:53 crc kubenswrapper[4735]: I1008 13:16:53.229670 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-66dd-account-create-4q8hl" podStartSLOduration=3.229643324 podStartE2EDuration="3.229643324s" podCreationTimestamp="2025-10-08 13:16:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 13:16:53.225965339 +0000 UTC m=+1263.034269893" 
watchObservedRunningTime="2025-10-08 13:16:53.229643324 +0000 UTC m=+1263.037947858" Oct 08 13:16:53 crc kubenswrapper[4735]: I1008 13:16:53.339821 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.834947079 podStartE2EDuration="7.339787299s" podCreationTimestamp="2025-10-08 13:16:46 +0000 UTC" firstStartedPulling="2025-10-08 13:16:46.888881043 +0000 UTC m=+1256.697185617" lastFinishedPulling="2025-10-08 13:16:52.393721293 +0000 UTC m=+1262.202025837" observedRunningTime="2025-10-08 13:16:53.252718452 +0000 UTC m=+1263.061022996" watchObservedRunningTime="2025-10-08 13:16:53.339787299 +0000 UTC m=+1263.148091853" Oct 08 13:16:53 crc kubenswrapper[4735]: I1008 13:16:53.825867 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 08 13:16:54 crc kubenswrapper[4735]: I1008 13:16:54.209093 4735 generic.go:334] "Generic (PLEG): container finished" podID="72461694-86b9-4133-9970-302416099748" containerID="96ea68b8faaa1a0dac49635d32f8379baed6ab70cff4b930e0003c6cce8a448f" exitCode=0 Oct 08 13:16:54 crc kubenswrapper[4735]: I1008 13:16:54.209225 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-66dd-account-create-4q8hl" event={"ID":"72461694-86b9-4133-9970-302416099748","Type":"ContainerDied","Data":"96ea68b8faaa1a0dac49635d32f8379baed6ab70cff4b930e0003c6cce8a448f"} Oct 08 13:16:54 crc kubenswrapper[4735]: I1008 13:16:54.220534 4735 generic.go:334] "Generic (PLEG): container finished" podID="c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4" containerID="e1244d8badbeeba1b7f1e9995d759cf99b22e94147e91d0d448fb758690bb3ff" exitCode=2 Oct 08 13:16:54 crc kubenswrapper[4735]: I1008 13:16:54.220566 4735 generic.go:334] "Generic (PLEG): container finished" podID="c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4" containerID="20cdcdcacf462e968ddff1efa5ae532a0c9694c09f2f71c8feec4785d9a23952" exitCode=0 Oct 08 13:16:54 crc kubenswrapper[4735]: I1008 
13:16:54.220611 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4","Type":"ContainerDied","Data":"e1244d8badbeeba1b7f1e9995d759cf99b22e94147e91d0d448fb758690bb3ff"} Oct 08 13:16:54 crc kubenswrapper[4735]: I1008 13:16:54.220638 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4","Type":"ContainerDied","Data":"20cdcdcacf462e968ddff1efa5ae532a0c9694c09f2f71c8feec4785d9a23952"} Oct 08 13:16:54 crc kubenswrapper[4735]: I1008 13:16:54.224369 4735 generic.go:334] "Generic (PLEG): container finished" podID="1d4ee7a9-f521-442b-8c8e-354fbc7dbbda" containerID="291cb29e3cba296e0d831cdae9bfa80b48fc271ca1f0dd6ea086ecdd10b45c37" exitCode=0 Oct 08 13:16:54 crc kubenswrapper[4735]: I1008 13:16:54.224451 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-86a1-account-create-xgf9d" event={"ID":"1d4ee7a9-f521-442b-8c8e-354fbc7dbbda","Type":"ContainerDied","Data":"291cb29e3cba296e0d831cdae9bfa80b48fc271ca1f0dd6ea086ecdd10b45c37"} Oct 08 13:16:54 crc kubenswrapper[4735]: I1008 13:16:54.233242 4735 generic.go:334] "Generic (PLEG): container finished" podID="b452547c-5bac-4df5-a0e6-d969416877fc" containerID="caf2689e62a63c0e957ec5e45b184662c220651a79a995c9d7efc5643cfa216b" exitCode=0 Oct 08 13:16:54 crc kubenswrapper[4735]: I1008 13:16:54.233291 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-726a-account-create-5szhf" event={"ID":"b452547c-5bac-4df5-a0e6-d969416877fc","Type":"ContainerDied","Data":"caf2689e62a63c0e957ec5e45b184662c220651a79a995c9d7efc5643cfa216b"} Oct 08 13:16:54 crc kubenswrapper[4735]: I1008 13:16:54.366968 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 08 13:16:54 crc kubenswrapper[4735]: I1008 13:16:54.878315 4735 patch_prober.go:28] interesting 
pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 13:16:54 crc kubenswrapper[4735]: I1008 13:16:54.879135 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 13:16:54 crc kubenswrapper[4735]: I1008 13:16:54.879317 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" Oct 08 13:16:54 crc kubenswrapper[4735]: I1008 13:16:54.880181 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9f7b3e9d44554f3c931f516096c49ab670e3d36bde045f3e21679319bf30124a"} pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 13:16:54 crc kubenswrapper[4735]: I1008 13:16:54.880320 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" containerID="cri-o://9f7b3e9d44554f3c931f516096c49ab670e3d36bde045f3e21679319bf30124a" gracePeriod=600 Oct 08 13:16:55 crc kubenswrapper[4735]: I1008 13:16:55.247270 4735 generic.go:334] "Generic (PLEG): container finished" podID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerID="9f7b3e9d44554f3c931f516096c49ab670e3d36bde045f3e21679319bf30124a" exitCode=0 Oct 08 13:16:55 crc kubenswrapper[4735]: I1008 13:16:55.247431 
4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" event={"ID":"2dfd0474-f64a-4d3b-9990-df182d079b10","Type":"ContainerDied","Data":"9f7b3e9d44554f3c931f516096c49ab670e3d36bde045f3e21679319bf30124a"} Oct 08 13:16:55 crc kubenswrapper[4735]: I1008 13:16:55.247490 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" event={"ID":"2dfd0474-f64a-4d3b-9990-df182d079b10","Type":"ContainerStarted","Data":"271e91019953de2464d625764fc5ce2f322cb5494932f22012a17b3cd3952c78"} Oct 08 13:16:55 crc kubenswrapper[4735]: I1008 13:16:55.247510 4735 scope.go:117] "RemoveContainer" containerID="933a14f7e22905eb9791da68c8e9bb68f5475b7d23363338a177f03a39161d6c" Oct 08 13:16:55 crc kubenswrapper[4735]: I1008 13:16:55.693623 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-66dd-account-create-4q8hl" Oct 08 13:16:55 crc kubenswrapper[4735]: I1008 13:16:55.774258 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jmbj\" (UniqueName: \"kubernetes.io/projected/72461694-86b9-4133-9970-302416099748-kube-api-access-7jmbj\") pod \"72461694-86b9-4133-9970-302416099748\" (UID: \"72461694-86b9-4133-9970-302416099748\") " Oct 08 13:16:55 crc kubenswrapper[4735]: I1008 13:16:55.780137 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72461694-86b9-4133-9970-302416099748-kube-api-access-7jmbj" (OuterVolumeSpecName: "kube-api-access-7jmbj") pod "72461694-86b9-4133-9970-302416099748" (UID: "72461694-86b9-4133-9970-302416099748"). InnerVolumeSpecName "kube-api-access-7jmbj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:16:55 crc kubenswrapper[4735]: I1008 13:16:55.800823 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-726a-account-create-5szhf" Oct 08 13:16:55 crc kubenswrapper[4735]: I1008 13:16:55.803257 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-86a1-account-create-xgf9d" Oct 08 13:16:55 crc kubenswrapper[4735]: I1008 13:16:55.876693 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59mq2\" (UniqueName: \"kubernetes.io/projected/1d4ee7a9-f521-442b-8c8e-354fbc7dbbda-kube-api-access-59mq2\") pod \"1d4ee7a9-f521-442b-8c8e-354fbc7dbbda\" (UID: \"1d4ee7a9-f521-442b-8c8e-354fbc7dbbda\") " Oct 08 13:16:55 crc kubenswrapper[4735]: I1008 13:16:55.877033 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spjnj\" (UniqueName: \"kubernetes.io/projected/b452547c-5bac-4df5-a0e6-d969416877fc-kube-api-access-spjnj\") pod \"b452547c-5bac-4df5-a0e6-d969416877fc\" (UID: \"b452547c-5bac-4df5-a0e6-d969416877fc\") " Oct 08 13:16:55 crc kubenswrapper[4735]: I1008 13:16:55.877678 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jmbj\" (UniqueName: \"kubernetes.io/projected/72461694-86b9-4133-9970-302416099748-kube-api-access-7jmbj\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:55 crc kubenswrapper[4735]: I1008 13:16:55.883190 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d4ee7a9-f521-442b-8c8e-354fbc7dbbda-kube-api-access-59mq2" (OuterVolumeSpecName: "kube-api-access-59mq2") pod "1d4ee7a9-f521-442b-8c8e-354fbc7dbbda" (UID: "1d4ee7a9-f521-442b-8c8e-354fbc7dbbda"). InnerVolumeSpecName "kube-api-access-59mq2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:16:55 crc kubenswrapper[4735]: I1008 13:16:55.883410 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b452547c-5bac-4df5-a0e6-d969416877fc-kube-api-access-spjnj" (OuterVolumeSpecName: "kube-api-access-spjnj") pod "b452547c-5bac-4df5-a0e6-d969416877fc" (UID: "b452547c-5bac-4df5-a0e6-d969416877fc"). InnerVolumeSpecName "kube-api-access-spjnj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:16:55 crc kubenswrapper[4735]: I1008 13:16:55.980931 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-spjnj\" (UniqueName: \"kubernetes.io/projected/b452547c-5bac-4df5-a0e6-d969416877fc-kube-api-access-spjnj\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:55 crc kubenswrapper[4735]: I1008 13:16:55.981162 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59mq2\" (UniqueName: \"kubernetes.io/projected/1d4ee7a9-f521-442b-8c8e-354fbc7dbbda-kube-api-access-59mq2\") on node \"crc\" DevicePath \"\"" Oct 08 13:16:56 crc kubenswrapper[4735]: I1008 13:16:56.262023 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-86a1-account-create-xgf9d" event={"ID":"1d4ee7a9-f521-442b-8c8e-354fbc7dbbda","Type":"ContainerDied","Data":"9f6ef77319e408022ec777ecb97fe2e2810c4671c7b3774d61fbdf31d0dcfeb5"} Oct 08 13:16:56 crc kubenswrapper[4735]: I1008 13:16:56.262094 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f6ef77319e408022ec777ecb97fe2e2810c4671c7b3774d61fbdf31d0dcfeb5" Oct 08 13:16:56 crc kubenswrapper[4735]: I1008 13:16:56.262035 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-86a1-account-create-xgf9d" Oct 08 13:16:56 crc kubenswrapper[4735]: I1008 13:16:56.263451 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-726a-account-create-5szhf" event={"ID":"b452547c-5bac-4df5-a0e6-d969416877fc","Type":"ContainerDied","Data":"fa92c82df8cba82355db34313d26195897daad9f2cd95531444bcca1bc2d76a1"} Oct 08 13:16:56 crc kubenswrapper[4735]: I1008 13:16:56.263499 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa92c82df8cba82355db34313d26195897daad9f2cd95531444bcca1bc2d76a1" Oct 08 13:16:56 crc kubenswrapper[4735]: I1008 13:16:56.263458 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-726a-account-create-5szhf" Oct 08 13:16:56 crc kubenswrapper[4735]: I1008 13:16:56.264819 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-66dd-account-create-4q8hl" event={"ID":"72461694-86b9-4133-9970-302416099748","Type":"ContainerDied","Data":"a19a659e0ef76b3e1fd8f1e7afc8bf4995578ed5db890412d1820c1078506725"} Oct 08 13:16:56 crc kubenswrapper[4735]: I1008 13:16:56.264845 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a19a659e0ef76b3e1fd8f1e7afc8bf4995578ed5db890412d1820c1078506725" Oct 08 13:16:56 crc kubenswrapper[4735]: I1008 13:16:56.264913 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-66dd-account-create-4q8hl" Oct 08 13:16:59 crc kubenswrapper[4735]: I1008 13:16:59.294765 4735 generic.go:334] "Generic (PLEG): container finished" podID="c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4" containerID="03ff4f3b7915d4d32f5ee0480b31a19b966ae0f651351644c3ec7d0331ef0f86" exitCode=0 Oct 08 13:16:59 crc kubenswrapper[4735]: I1008 13:16:59.294937 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4","Type":"ContainerDied","Data":"03ff4f3b7915d4d32f5ee0480b31a19b966ae0f651351644c3ec7d0331ef0f86"} Oct 08 13:16:59 crc kubenswrapper[4735]: I1008 13:16:59.586513 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 08 13:17:01 crc kubenswrapper[4735]: I1008 13:17:01.203816 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-j7sl5"] Oct 08 13:17:01 crc kubenswrapper[4735]: E1008 13:17:01.204400 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d4ee7a9-f521-442b-8c8e-354fbc7dbbda" containerName="mariadb-account-create" Oct 08 13:17:01 crc kubenswrapper[4735]: I1008 13:17:01.204412 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d4ee7a9-f521-442b-8c8e-354fbc7dbbda" containerName="mariadb-account-create" Oct 08 13:17:01 crc kubenswrapper[4735]: E1008 13:17:01.204426 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72461694-86b9-4133-9970-302416099748" containerName="mariadb-account-create" Oct 08 13:17:01 crc kubenswrapper[4735]: I1008 13:17:01.204432 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="72461694-86b9-4133-9970-302416099748" containerName="mariadb-account-create" Oct 08 13:17:01 crc kubenswrapper[4735]: E1008 13:17:01.204452 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b452547c-5bac-4df5-a0e6-d969416877fc" 
containerName="mariadb-account-create" Oct 08 13:17:01 crc kubenswrapper[4735]: I1008 13:17:01.204458 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="b452547c-5bac-4df5-a0e6-d969416877fc" containerName="mariadb-account-create" Oct 08 13:17:01 crc kubenswrapper[4735]: I1008 13:17:01.204637 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d4ee7a9-f521-442b-8c8e-354fbc7dbbda" containerName="mariadb-account-create" Oct 08 13:17:01 crc kubenswrapper[4735]: I1008 13:17:01.204657 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="72461694-86b9-4133-9970-302416099748" containerName="mariadb-account-create" Oct 08 13:17:01 crc kubenswrapper[4735]: I1008 13:17:01.204674 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="b452547c-5bac-4df5-a0e6-d969416877fc" containerName="mariadb-account-create" Oct 08 13:17:01 crc kubenswrapper[4735]: I1008 13:17:01.211806 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-j7sl5" Oct 08 13:17:01 crc kubenswrapper[4735]: I1008 13:17:01.213724 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-vgjtw" Oct 08 13:17:01 crc kubenswrapper[4735]: I1008 13:17:01.213973 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Oct 08 13:17:01 crc kubenswrapper[4735]: I1008 13:17:01.229572 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 08 13:17:01 crc kubenswrapper[4735]: I1008 13:17:01.231534 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-j7sl5"] Oct 08 13:17:01 crc kubenswrapper[4735]: I1008 13:17:01.280096 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfg48\" (UniqueName: 
\"kubernetes.io/projected/c138807f-c003-44cc-ae0e-2491ba8602df-kube-api-access-gfg48\") pod \"nova-cell0-conductor-db-sync-j7sl5\" (UID: \"c138807f-c003-44cc-ae0e-2491ba8602df\") " pod="openstack/nova-cell0-conductor-db-sync-j7sl5" Oct 08 13:17:01 crc kubenswrapper[4735]: I1008 13:17:01.280318 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c138807f-c003-44cc-ae0e-2491ba8602df-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-j7sl5\" (UID: \"c138807f-c003-44cc-ae0e-2491ba8602df\") " pod="openstack/nova-cell0-conductor-db-sync-j7sl5" Oct 08 13:17:01 crc kubenswrapper[4735]: I1008 13:17:01.280637 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c138807f-c003-44cc-ae0e-2491ba8602df-scripts\") pod \"nova-cell0-conductor-db-sync-j7sl5\" (UID: \"c138807f-c003-44cc-ae0e-2491ba8602df\") " pod="openstack/nova-cell0-conductor-db-sync-j7sl5" Oct 08 13:17:01 crc kubenswrapper[4735]: I1008 13:17:01.280702 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c138807f-c003-44cc-ae0e-2491ba8602df-config-data\") pod \"nova-cell0-conductor-db-sync-j7sl5\" (UID: \"c138807f-c003-44cc-ae0e-2491ba8602df\") " pod="openstack/nova-cell0-conductor-db-sync-j7sl5" Oct 08 13:17:01 crc kubenswrapper[4735]: I1008 13:17:01.382654 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c138807f-c003-44cc-ae0e-2491ba8602df-scripts\") pod \"nova-cell0-conductor-db-sync-j7sl5\" (UID: \"c138807f-c003-44cc-ae0e-2491ba8602df\") " pod="openstack/nova-cell0-conductor-db-sync-j7sl5" Oct 08 13:17:01 crc kubenswrapper[4735]: I1008 13:17:01.382709 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/c138807f-c003-44cc-ae0e-2491ba8602df-config-data\") pod \"nova-cell0-conductor-db-sync-j7sl5\" (UID: \"c138807f-c003-44cc-ae0e-2491ba8602df\") " pod="openstack/nova-cell0-conductor-db-sync-j7sl5" Oct 08 13:17:01 crc kubenswrapper[4735]: I1008 13:17:01.382782 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfg48\" (UniqueName: \"kubernetes.io/projected/c138807f-c003-44cc-ae0e-2491ba8602df-kube-api-access-gfg48\") pod \"nova-cell0-conductor-db-sync-j7sl5\" (UID: \"c138807f-c003-44cc-ae0e-2491ba8602df\") " pod="openstack/nova-cell0-conductor-db-sync-j7sl5" Oct 08 13:17:01 crc kubenswrapper[4735]: I1008 13:17:01.382851 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c138807f-c003-44cc-ae0e-2491ba8602df-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-j7sl5\" (UID: \"c138807f-c003-44cc-ae0e-2491ba8602df\") " pod="openstack/nova-cell0-conductor-db-sync-j7sl5" Oct 08 13:17:01 crc kubenswrapper[4735]: I1008 13:17:01.397302 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c138807f-c003-44cc-ae0e-2491ba8602df-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-j7sl5\" (UID: \"c138807f-c003-44cc-ae0e-2491ba8602df\") " pod="openstack/nova-cell0-conductor-db-sync-j7sl5" Oct 08 13:17:01 crc kubenswrapper[4735]: I1008 13:17:01.397641 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c138807f-c003-44cc-ae0e-2491ba8602df-config-data\") pod \"nova-cell0-conductor-db-sync-j7sl5\" (UID: \"c138807f-c003-44cc-ae0e-2491ba8602df\") " pod="openstack/nova-cell0-conductor-db-sync-j7sl5" Oct 08 13:17:01 crc kubenswrapper[4735]: I1008 13:17:01.398214 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/c138807f-c003-44cc-ae0e-2491ba8602df-scripts\") pod \"nova-cell0-conductor-db-sync-j7sl5\" (UID: \"c138807f-c003-44cc-ae0e-2491ba8602df\") " pod="openstack/nova-cell0-conductor-db-sync-j7sl5" Oct 08 13:17:01 crc kubenswrapper[4735]: I1008 13:17:01.412903 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfg48\" (UniqueName: \"kubernetes.io/projected/c138807f-c003-44cc-ae0e-2491ba8602df-kube-api-access-gfg48\") pod \"nova-cell0-conductor-db-sync-j7sl5\" (UID: \"c138807f-c003-44cc-ae0e-2491ba8602df\") " pod="openstack/nova-cell0-conductor-db-sync-j7sl5" Oct 08 13:17:01 crc kubenswrapper[4735]: I1008 13:17:01.540733 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-j7sl5" Oct 08 13:17:02 crc kubenswrapper[4735]: I1008 13:17:02.015898 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-j7sl5"] Oct 08 13:17:02 crc kubenswrapper[4735]: I1008 13:17:02.330122 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-j7sl5" event={"ID":"c138807f-c003-44cc-ae0e-2491ba8602df","Type":"ContainerStarted","Data":"03a7537813cb5421b3f878927df33405c422ba52d91b1ddf9e1c4dc2bcb5bb75"} Oct 08 13:17:11 crc kubenswrapper[4735]: I1008 13:17:11.411443 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-j7sl5" event={"ID":"c138807f-c003-44cc-ae0e-2491ba8602df","Type":"ContainerStarted","Data":"c8f2657ed8007a1620dd173bcf75075ec5b0c928943946ed6787c2bfb93faaa4"} Oct 08 13:17:11 crc kubenswrapper[4735]: I1008 13:17:11.432304 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-j7sl5" podStartSLOduration=1.858299976 podStartE2EDuration="10.432283508s" podCreationTimestamp="2025-10-08 13:17:01 +0000 UTC" firstStartedPulling="2025-10-08 13:17:02.024083268 
+0000 UTC m=+1271.832387822" lastFinishedPulling="2025-10-08 13:17:10.59806681 +0000 UTC m=+1280.406371354" observedRunningTime="2025-10-08 13:17:11.426684773 +0000 UTC m=+1281.234989347" watchObservedRunningTime="2025-10-08 13:17:11.432283508 +0000 UTC m=+1281.240588052" Oct 08 13:17:16 crc kubenswrapper[4735]: I1008 13:17:16.435582 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Oct 08 13:17:21 crc kubenswrapper[4735]: I1008 13:17:21.497252 4735 generic.go:334] "Generic (PLEG): container finished" podID="c138807f-c003-44cc-ae0e-2491ba8602df" containerID="c8f2657ed8007a1620dd173bcf75075ec5b0c928943946ed6787c2bfb93faaa4" exitCode=0 Oct 08 13:17:21 crc kubenswrapper[4735]: I1008 13:17:21.497300 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-j7sl5" event={"ID":"c138807f-c003-44cc-ae0e-2491ba8602df","Type":"ContainerDied","Data":"c8f2657ed8007a1620dd173bcf75075ec5b0c928943946ed6787c2bfb93faaa4"} Oct 08 13:17:22 crc kubenswrapper[4735]: I1008 13:17:22.844089 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-j7sl5" Oct 08 13:17:22 crc kubenswrapper[4735]: I1008 13:17:22.887600 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c138807f-c003-44cc-ae0e-2491ba8602df-combined-ca-bundle\") pod \"c138807f-c003-44cc-ae0e-2491ba8602df\" (UID: \"c138807f-c003-44cc-ae0e-2491ba8602df\") " Oct 08 13:17:22 crc kubenswrapper[4735]: I1008 13:17:22.887781 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c138807f-c003-44cc-ae0e-2491ba8602df-scripts\") pod \"c138807f-c003-44cc-ae0e-2491ba8602df\" (UID: \"c138807f-c003-44cc-ae0e-2491ba8602df\") " Oct 08 13:17:22 crc kubenswrapper[4735]: I1008 13:17:22.887810 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c138807f-c003-44cc-ae0e-2491ba8602df-config-data\") pod \"c138807f-c003-44cc-ae0e-2491ba8602df\" (UID: \"c138807f-c003-44cc-ae0e-2491ba8602df\") " Oct 08 13:17:22 crc kubenswrapper[4735]: I1008 13:17:22.887862 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfg48\" (UniqueName: \"kubernetes.io/projected/c138807f-c003-44cc-ae0e-2491ba8602df-kube-api-access-gfg48\") pod \"c138807f-c003-44cc-ae0e-2491ba8602df\" (UID: \"c138807f-c003-44cc-ae0e-2491ba8602df\") " Oct 08 13:17:22 crc kubenswrapper[4735]: I1008 13:17:22.898702 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c138807f-c003-44cc-ae0e-2491ba8602df-kube-api-access-gfg48" (OuterVolumeSpecName: "kube-api-access-gfg48") pod "c138807f-c003-44cc-ae0e-2491ba8602df" (UID: "c138807f-c003-44cc-ae0e-2491ba8602df"). InnerVolumeSpecName "kube-api-access-gfg48". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:17:22 crc kubenswrapper[4735]: I1008 13:17:22.898792 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c138807f-c003-44cc-ae0e-2491ba8602df-scripts" (OuterVolumeSpecName: "scripts") pod "c138807f-c003-44cc-ae0e-2491ba8602df" (UID: "c138807f-c003-44cc-ae0e-2491ba8602df"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:17:22 crc kubenswrapper[4735]: I1008 13:17:22.917383 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c138807f-c003-44cc-ae0e-2491ba8602df-config-data" (OuterVolumeSpecName: "config-data") pod "c138807f-c003-44cc-ae0e-2491ba8602df" (UID: "c138807f-c003-44cc-ae0e-2491ba8602df"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:17:22 crc kubenswrapper[4735]: I1008 13:17:22.917796 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c138807f-c003-44cc-ae0e-2491ba8602df-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c138807f-c003-44cc-ae0e-2491ba8602df" (UID: "c138807f-c003-44cc-ae0e-2491ba8602df"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:17:22 crc kubenswrapper[4735]: I1008 13:17:22.989898 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c138807f-c003-44cc-ae0e-2491ba8602df-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 13:17:22 crc kubenswrapper[4735]: I1008 13:17:22.989944 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c138807f-c003-44cc-ae0e-2491ba8602df-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 13:17:22 crc kubenswrapper[4735]: I1008 13:17:22.989952 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c138807f-c003-44cc-ae0e-2491ba8602df-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 13:17:22 crc kubenswrapper[4735]: I1008 13:17:22.989964 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfg48\" (UniqueName: \"kubernetes.io/projected/c138807f-c003-44cc-ae0e-2491ba8602df-kube-api-access-gfg48\") on node \"crc\" DevicePath \"\"" Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.485319 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.497623 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4-config-data\") pod \"c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4\" (UID: \"c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4\") " Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.497756 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4-log-httpd\") pod \"c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4\" (UID: \"c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4\") " Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.497786 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4-run-httpd\") pod \"c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4\" (UID: \"c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4\") " Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.497833 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4-ceilometer-tls-certs\") pod \"c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4\" (UID: \"c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4\") " Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.497858 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4-combined-ca-bundle\") pod \"c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4\" (UID: \"c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4\") " Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.497909 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4-scripts\") pod \"c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4\" (UID: \"c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4\") " Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.497955 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mplbv\" (UniqueName: \"kubernetes.io/projected/c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4-kube-api-access-mplbv\") pod \"c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4\" (UID: \"c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4\") " Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.497980 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4-sg-core-conf-yaml\") pod \"c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4\" (UID: \"c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4\") " Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.498639 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4" (UID: "c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.499659 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4" (UID: "c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.503222 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4-scripts" (OuterVolumeSpecName: "scripts") pod "c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4" (UID: "c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.503318 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4-kube-api-access-mplbv" (OuterVolumeSpecName: "kube-api-access-mplbv") pod "c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4" (UID: "c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4"). InnerVolumeSpecName "kube-api-access-mplbv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.535834 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4" (UID: "c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.544988 4735 generic.go:334] "Generic (PLEG): container finished" podID="c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4" containerID="4a8fab144d84740b96c1e209e88fc9c8b1ee29293ddb7e50d51c8e95ffd6027e" exitCode=137 Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.545087 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4","Type":"ContainerDied","Data":"4a8fab144d84740b96c1e209e88fc9c8b1ee29293ddb7e50d51c8e95ffd6027e"} Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.545139 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4","Type":"ContainerDied","Data":"9d5a538af64959a7641301ace964597e6f1bf36bca4441b300b65f39f4b4c6eb"} Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.545158 4735 scope.go:117] "RemoveContainer" containerID="4a8fab144d84740b96c1e209e88fc9c8b1ee29293ddb7e50d51c8e95ffd6027e" Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.545369 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.553074 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-j7sl5" event={"ID":"c138807f-c003-44cc-ae0e-2491ba8602df","Type":"ContainerDied","Data":"03a7537813cb5421b3f878927df33405c422ba52d91b1ddf9e1c4dc2bcb5bb75"} Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.553124 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="03a7537813cb5421b3f878927df33405c422ba52d91b1ddf9e1c4dc2bcb5bb75" Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.553311 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-j7sl5" Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.576195 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4" (UID: "c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.577184 4735 scope.go:117] "RemoveContainer" containerID="e1244d8badbeeba1b7f1e9995d759cf99b22e94147e91d0d448fb758690bb3ff" Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.600134 4735 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.600171 4735 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.600182 4735 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.600196 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.600207 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mplbv\" (UniqueName: 
\"kubernetes.io/projected/c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4-kube-api-access-mplbv\") on node \"crc\" DevicePath \"\"" Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.600217 4735 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.619636 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4" (UID: "c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.626655 4735 scope.go:117] "RemoveContainer" containerID="20cdcdcacf462e968ddff1efa5ae532a0c9694c09f2f71c8feec4785d9a23952" Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.634955 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.635044 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4-config-data" (OuterVolumeSpecName: "config-data") pod "c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4" (UID: "c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:17:23 crc kubenswrapper[4735]: E1008 13:17:23.635896 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4" containerName="sg-core" Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.635922 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4" containerName="sg-core" Oct 08 13:17:23 crc kubenswrapper[4735]: E1008 13:17:23.635966 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c138807f-c003-44cc-ae0e-2491ba8602df" containerName="nova-cell0-conductor-db-sync" Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.636084 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="c138807f-c003-44cc-ae0e-2491ba8602df" containerName="nova-cell0-conductor-db-sync" Oct 08 13:17:23 crc kubenswrapper[4735]: E1008 13:17:23.636106 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4" containerName="ceilometer-central-agent" Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.636114 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4" containerName="ceilometer-central-agent" Oct 08 13:17:23 crc kubenswrapper[4735]: E1008 13:17:23.636134 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4" containerName="proxy-httpd" Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.636143 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4" containerName="proxy-httpd" Oct 08 13:17:23 crc kubenswrapper[4735]: E1008 13:17:23.636160 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4" containerName="ceilometer-notification-agent" Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.636168 4735 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4" containerName="ceilometer-notification-agent" Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.636401 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4" containerName="ceilometer-central-agent" Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.636425 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="c138807f-c003-44cc-ae0e-2491ba8602df" containerName="nova-cell0-conductor-db-sync" Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.636440 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4" containerName="sg-core" Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.636452 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4" containerName="ceilometer-notification-agent" Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.636528 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4" containerName="proxy-httpd" Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.637452 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.639827 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.640268 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-vgjtw" Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.646258 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.658139 4735 scope.go:117] "RemoveContainer" containerID="03ff4f3b7915d4d32f5ee0480b31a19b966ae0f651351644c3ec7d0331ef0f86" Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.678320 4735 scope.go:117] "RemoveContainer" containerID="4a8fab144d84740b96c1e209e88fc9c8b1ee29293ddb7e50d51c8e95ffd6027e" Oct 08 13:17:23 crc kubenswrapper[4735]: E1008 13:17:23.678983 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a8fab144d84740b96c1e209e88fc9c8b1ee29293ddb7e50d51c8e95ffd6027e\": container with ID starting with 4a8fab144d84740b96c1e209e88fc9c8b1ee29293ddb7e50d51c8e95ffd6027e not found: ID does not exist" containerID="4a8fab144d84740b96c1e209e88fc9c8b1ee29293ddb7e50d51c8e95ffd6027e" Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.679032 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a8fab144d84740b96c1e209e88fc9c8b1ee29293ddb7e50d51c8e95ffd6027e"} err="failed to get container status \"4a8fab144d84740b96c1e209e88fc9c8b1ee29293ddb7e50d51c8e95ffd6027e\": rpc error: code = NotFound desc = could not find container \"4a8fab144d84740b96c1e209e88fc9c8b1ee29293ddb7e50d51c8e95ffd6027e\": container with ID starting with 4a8fab144d84740b96c1e209e88fc9c8b1ee29293ddb7e50d51c8e95ffd6027e not found: ID does not exist" Oct 08 
13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.679065 4735 scope.go:117] "RemoveContainer" containerID="e1244d8badbeeba1b7f1e9995d759cf99b22e94147e91d0d448fb758690bb3ff" Oct 08 13:17:23 crc kubenswrapper[4735]: E1008 13:17:23.679544 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1244d8badbeeba1b7f1e9995d759cf99b22e94147e91d0d448fb758690bb3ff\": container with ID starting with e1244d8badbeeba1b7f1e9995d759cf99b22e94147e91d0d448fb758690bb3ff not found: ID does not exist" containerID="e1244d8badbeeba1b7f1e9995d759cf99b22e94147e91d0d448fb758690bb3ff" Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.679564 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1244d8badbeeba1b7f1e9995d759cf99b22e94147e91d0d448fb758690bb3ff"} err="failed to get container status \"e1244d8badbeeba1b7f1e9995d759cf99b22e94147e91d0d448fb758690bb3ff\": rpc error: code = NotFound desc = could not find container \"e1244d8badbeeba1b7f1e9995d759cf99b22e94147e91d0d448fb758690bb3ff\": container with ID starting with e1244d8badbeeba1b7f1e9995d759cf99b22e94147e91d0d448fb758690bb3ff not found: ID does not exist" Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.679576 4735 scope.go:117] "RemoveContainer" containerID="20cdcdcacf462e968ddff1efa5ae532a0c9694c09f2f71c8feec4785d9a23952" Oct 08 13:17:23 crc kubenswrapper[4735]: E1008 13:17:23.679844 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20cdcdcacf462e968ddff1efa5ae532a0c9694c09f2f71c8feec4785d9a23952\": container with ID starting with 20cdcdcacf462e968ddff1efa5ae532a0c9694c09f2f71c8feec4785d9a23952 not found: ID does not exist" containerID="20cdcdcacf462e968ddff1efa5ae532a0c9694c09f2f71c8feec4785d9a23952" Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.679861 4735 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"20cdcdcacf462e968ddff1efa5ae532a0c9694c09f2f71c8feec4785d9a23952"} err="failed to get container status \"20cdcdcacf462e968ddff1efa5ae532a0c9694c09f2f71c8feec4785d9a23952\": rpc error: code = NotFound desc = could not find container \"20cdcdcacf462e968ddff1efa5ae532a0c9694c09f2f71c8feec4785d9a23952\": container with ID starting with 20cdcdcacf462e968ddff1efa5ae532a0c9694c09f2f71c8feec4785d9a23952 not found: ID does not exist" Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.679874 4735 scope.go:117] "RemoveContainer" containerID="03ff4f3b7915d4d32f5ee0480b31a19b966ae0f651351644c3ec7d0331ef0f86" Oct 08 13:17:23 crc kubenswrapper[4735]: E1008 13:17:23.680164 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03ff4f3b7915d4d32f5ee0480b31a19b966ae0f651351644c3ec7d0331ef0f86\": container with ID starting with 03ff4f3b7915d4d32f5ee0480b31a19b966ae0f651351644c3ec7d0331ef0f86 not found: ID does not exist" containerID="03ff4f3b7915d4d32f5ee0480b31a19b966ae0f651351644c3ec7d0331ef0f86" Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.680191 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03ff4f3b7915d4d32f5ee0480b31a19b966ae0f651351644c3ec7d0331ef0f86"} err="failed to get container status \"03ff4f3b7915d4d32f5ee0480b31a19b966ae0f651351644c3ec7d0331ef0f86\": rpc error: code = NotFound desc = could not find container \"03ff4f3b7915d4d32f5ee0480b31a19b966ae0f651351644c3ec7d0331ef0f86\": container with ID starting with 03ff4f3b7915d4d32f5ee0480b31a19b966ae0f651351644c3ec7d0331ef0f86 not found: ID does not exist" Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.702083 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84fa39bb-f66a-49ed-960c-7598ec87854a-config-data\") pod 
\"nova-cell0-conductor-0\" (UID: \"84fa39bb-f66a-49ed-960c-7598ec87854a\") " pod="openstack/nova-cell0-conductor-0" Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.702257 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84fa39bb-f66a-49ed-960c-7598ec87854a-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"84fa39bb-f66a-49ed-960c-7598ec87854a\") " pod="openstack/nova-cell0-conductor-0" Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.702368 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hb57t\" (UniqueName: \"kubernetes.io/projected/84fa39bb-f66a-49ed-960c-7598ec87854a-kube-api-access-hb57t\") pod \"nova-cell0-conductor-0\" (UID: \"84fa39bb-f66a-49ed-960c-7598ec87854a\") " pod="openstack/nova-cell0-conductor-0" Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.702584 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.702663 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.804771 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84fa39bb-f66a-49ed-960c-7598ec87854a-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"84fa39bb-f66a-49ed-960c-7598ec87854a\") " pod="openstack/nova-cell0-conductor-0" Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.805157 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/84fa39bb-f66a-49ed-960c-7598ec87854a-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"84fa39bb-f66a-49ed-960c-7598ec87854a\") " pod="openstack/nova-cell0-conductor-0" Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.805299 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hb57t\" (UniqueName: \"kubernetes.io/projected/84fa39bb-f66a-49ed-960c-7598ec87854a-kube-api-access-hb57t\") pod \"nova-cell0-conductor-0\" (UID: \"84fa39bb-f66a-49ed-960c-7598ec87854a\") " pod="openstack/nova-cell0-conductor-0" Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.809102 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84fa39bb-f66a-49ed-960c-7598ec87854a-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"84fa39bb-f66a-49ed-960c-7598ec87854a\") " pod="openstack/nova-cell0-conductor-0" Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.812332 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84fa39bb-f66a-49ed-960c-7598ec87854a-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"84fa39bb-f66a-49ed-960c-7598ec87854a\") " pod="openstack/nova-cell0-conductor-0" Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.820538 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hb57t\" (UniqueName: \"kubernetes.io/projected/84fa39bb-f66a-49ed-960c-7598ec87854a-kube-api-access-hb57t\") pod \"nova-cell0-conductor-0\" (UID: \"84fa39bb-f66a-49ed-960c-7598ec87854a\") " pod="openstack/nova-cell0-conductor-0" Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.883413 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.892309 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/ceilometer-0"] Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.908089 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.917383 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.918813 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.920162 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.922736 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.923932 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 13:17:23 crc kubenswrapper[4735]: I1008 13:17:23.959896 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 08 13:17:24 crc kubenswrapper[4735]: I1008 13:17:24.008130 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8b20f18-fb68-45d8-9b04-c0c4096c44a7-run-httpd\") pod \"ceilometer-0\" (UID: \"f8b20f18-fb68-45d8-9b04-c0c4096c44a7\") " pod="openstack/ceilometer-0" Oct 08 13:17:24 crc kubenswrapper[4735]: I1008 13:17:24.008561 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vcft\" (UniqueName: \"kubernetes.io/projected/f8b20f18-fb68-45d8-9b04-c0c4096c44a7-kube-api-access-6vcft\") pod \"ceilometer-0\" (UID: \"f8b20f18-fb68-45d8-9b04-c0c4096c44a7\") " pod="openstack/ceilometer-0" Oct 08 13:17:24 crc kubenswrapper[4735]: I1008 13:17:24.008618 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8b20f18-fb68-45d8-9b04-c0c4096c44a7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f8b20f18-fb68-45d8-9b04-c0c4096c44a7\") " pod="openstack/ceilometer-0" Oct 08 13:17:24 crc kubenswrapper[4735]: I1008 13:17:24.008719 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8b20f18-fb68-45d8-9b04-c0c4096c44a7-log-httpd\") pod \"ceilometer-0\" (UID: \"f8b20f18-fb68-45d8-9b04-c0c4096c44a7\") " pod="openstack/ceilometer-0" Oct 08 13:17:24 crc kubenswrapper[4735]: I1008 13:17:24.008771 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f8b20f18-fb68-45d8-9b04-c0c4096c44a7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f8b20f18-fb68-45d8-9b04-c0c4096c44a7\") " pod="openstack/ceilometer-0" Oct 08 13:17:24 crc kubenswrapper[4735]: I1008 
13:17:24.008839 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8b20f18-fb68-45d8-9b04-c0c4096c44a7-scripts\") pod \"ceilometer-0\" (UID: \"f8b20f18-fb68-45d8-9b04-c0c4096c44a7\") " pod="openstack/ceilometer-0" Oct 08 13:17:24 crc kubenswrapper[4735]: I1008 13:17:24.008909 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8b20f18-fb68-45d8-9b04-c0c4096c44a7-config-data\") pod \"ceilometer-0\" (UID: \"f8b20f18-fb68-45d8-9b04-c0c4096c44a7\") " pod="openstack/ceilometer-0" Oct 08 13:17:24 crc kubenswrapper[4735]: I1008 13:17:24.008935 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8b20f18-fb68-45d8-9b04-c0c4096c44a7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f8b20f18-fb68-45d8-9b04-c0c4096c44a7\") " pod="openstack/ceilometer-0" Oct 08 13:17:24 crc kubenswrapper[4735]: I1008 13:17:24.110581 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8b20f18-fb68-45d8-9b04-c0c4096c44a7-config-data\") pod \"ceilometer-0\" (UID: \"f8b20f18-fb68-45d8-9b04-c0c4096c44a7\") " pod="openstack/ceilometer-0" Oct 08 13:17:24 crc kubenswrapper[4735]: I1008 13:17:24.110633 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8b20f18-fb68-45d8-9b04-c0c4096c44a7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f8b20f18-fb68-45d8-9b04-c0c4096c44a7\") " pod="openstack/ceilometer-0" Oct 08 13:17:24 crc kubenswrapper[4735]: I1008 13:17:24.110710 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/f8b20f18-fb68-45d8-9b04-c0c4096c44a7-run-httpd\") pod \"ceilometer-0\" (UID: \"f8b20f18-fb68-45d8-9b04-c0c4096c44a7\") " pod="openstack/ceilometer-0" Oct 08 13:17:24 crc kubenswrapper[4735]: I1008 13:17:24.110785 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vcft\" (UniqueName: \"kubernetes.io/projected/f8b20f18-fb68-45d8-9b04-c0c4096c44a7-kube-api-access-6vcft\") pod \"ceilometer-0\" (UID: \"f8b20f18-fb68-45d8-9b04-c0c4096c44a7\") " pod="openstack/ceilometer-0" Oct 08 13:17:24 crc kubenswrapper[4735]: I1008 13:17:24.110838 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8b20f18-fb68-45d8-9b04-c0c4096c44a7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f8b20f18-fb68-45d8-9b04-c0c4096c44a7\") " pod="openstack/ceilometer-0" Oct 08 13:17:24 crc kubenswrapper[4735]: I1008 13:17:24.110930 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8b20f18-fb68-45d8-9b04-c0c4096c44a7-log-httpd\") pod \"ceilometer-0\" (UID: \"f8b20f18-fb68-45d8-9b04-c0c4096c44a7\") " pod="openstack/ceilometer-0" Oct 08 13:17:24 crc kubenswrapper[4735]: I1008 13:17:24.111000 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f8b20f18-fb68-45d8-9b04-c0c4096c44a7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f8b20f18-fb68-45d8-9b04-c0c4096c44a7\") " pod="openstack/ceilometer-0" Oct 08 13:17:24 crc kubenswrapper[4735]: I1008 13:17:24.111053 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8b20f18-fb68-45d8-9b04-c0c4096c44a7-scripts\") pod \"ceilometer-0\" (UID: \"f8b20f18-fb68-45d8-9b04-c0c4096c44a7\") " pod="openstack/ceilometer-0" Oct 08 13:17:24 crc 
kubenswrapper[4735]: I1008 13:17:24.112811 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8b20f18-fb68-45d8-9b04-c0c4096c44a7-run-httpd\") pod \"ceilometer-0\" (UID: \"f8b20f18-fb68-45d8-9b04-c0c4096c44a7\") " pod="openstack/ceilometer-0" Oct 08 13:17:24 crc kubenswrapper[4735]: I1008 13:17:24.112826 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8b20f18-fb68-45d8-9b04-c0c4096c44a7-log-httpd\") pod \"ceilometer-0\" (UID: \"f8b20f18-fb68-45d8-9b04-c0c4096c44a7\") " pod="openstack/ceilometer-0" Oct 08 13:17:24 crc kubenswrapper[4735]: I1008 13:17:24.116985 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8b20f18-fb68-45d8-9b04-c0c4096c44a7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f8b20f18-fb68-45d8-9b04-c0c4096c44a7\") " pod="openstack/ceilometer-0" Oct 08 13:17:24 crc kubenswrapper[4735]: I1008 13:17:24.117413 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8b20f18-fb68-45d8-9b04-c0c4096c44a7-config-data\") pod \"ceilometer-0\" (UID: \"f8b20f18-fb68-45d8-9b04-c0c4096c44a7\") " pod="openstack/ceilometer-0" Oct 08 13:17:24 crc kubenswrapper[4735]: I1008 13:17:24.117889 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8b20f18-fb68-45d8-9b04-c0c4096c44a7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f8b20f18-fb68-45d8-9b04-c0c4096c44a7\") " pod="openstack/ceilometer-0" Oct 08 13:17:24 crc kubenswrapper[4735]: I1008 13:17:24.118755 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f8b20f18-fb68-45d8-9b04-c0c4096c44a7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"f8b20f18-fb68-45d8-9b04-c0c4096c44a7\") " pod="openstack/ceilometer-0" Oct 08 13:17:24 crc kubenswrapper[4735]: I1008 13:17:24.135707 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vcft\" (UniqueName: \"kubernetes.io/projected/f8b20f18-fb68-45d8-9b04-c0c4096c44a7-kube-api-access-6vcft\") pod \"ceilometer-0\" (UID: \"f8b20f18-fb68-45d8-9b04-c0c4096c44a7\") " pod="openstack/ceilometer-0" Oct 08 13:17:24 crc kubenswrapper[4735]: I1008 13:17:24.136826 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8b20f18-fb68-45d8-9b04-c0c4096c44a7-scripts\") pod \"ceilometer-0\" (UID: \"f8b20f18-fb68-45d8-9b04-c0c4096c44a7\") " pod="openstack/ceilometer-0" Oct 08 13:17:24 crc kubenswrapper[4735]: I1008 13:17:24.239758 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 13:17:24 crc kubenswrapper[4735]: I1008 13:17:24.407294 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 08 13:17:24 crc kubenswrapper[4735]: I1008 13:17:24.564449 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"84fa39bb-f66a-49ed-960c-7598ec87854a","Type":"ContainerStarted","Data":"65961e73fd2089c78d050c0a8560cbc4a4a014886313015a340f2a2690973e3b"} Oct 08 13:17:24 crc kubenswrapper[4735]: I1008 13:17:24.655526 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 13:17:24 crc kubenswrapper[4735]: W1008 13:17:24.660863 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8b20f18_fb68_45d8_9b04_c0c4096c44a7.slice/crio-c44edb0bb8e63573b8d7cdc226dd53661403e796ef260b0aff9cd9dd5f9ec7f0 WatchSource:0}: Error finding container c44edb0bb8e63573b8d7cdc226dd53661403e796ef260b0aff9cd9dd5f9ec7f0: Status 404 returned error can't 
find the container with id c44edb0bb8e63573b8d7cdc226dd53661403e796ef260b0aff9cd9dd5f9ec7f0 Oct 08 13:17:24 crc kubenswrapper[4735]: I1008 13:17:24.904288 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4" path="/var/lib/kubelet/pods/c4b0e138-ca78-472c-b6cc-7d8f0a42e2b4/volumes" Oct 08 13:17:25 crc kubenswrapper[4735]: I1008 13:17:25.575758 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"84fa39bb-f66a-49ed-960c-7598ec87854a","Type":"ContainerStarted","Data":"eb3a0242ded2d9f65f661e234e3d416b8c948c5e34c221a07c2a3c7ce662b2fb"} Oct 08 13:17:25 crc kubenswrapper[4735]: I1008 13:17:25.576079 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 08 13:17:25 crc kubenswrapper[4735]: I1008 13:17:25.577492 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f8b20f18-fb68-45d8-9b04-c0c4096c44a7","Type":"ContainerStarted","Data":"c44edb0bb8e63573b8d7cdc226dd53661403e796ef260b0aff9cd9dd5f9ec7f0"} Oct 08 13:17:25 crc kubenswrapper[4735]: I1008 13:17:25.593528 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.593503573 podStartE2EDuration="2.593503573s" podCreationTimestamp="2025-10-08 13:17:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 13:17:25.592065056 +0000 UTC m=+1295.400369620" watchObservedRunningTime="2025-10-08 13:17:25.593503573 +0000 UTC m=+1295.401808117" Oct 08 13:17:26 crc kubenswrapper[4735]: I1008 13:17:26.597931 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f8b20f18-fb68-45d8-9b04-c0c4096c44a7","Type":"ContainerStarted","Data":"17739a3f7830b200456b6d2206c73fd49f8257d50ad7aa0a8c71b9123dea8c86"} Oct 08 13:17:26 
crc kubenswrapper[4735]: I1008 13:17:26.597970 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f8b20f18-fb68-45d8-9b04-c0c4096c44a7","Type":"ContainerStarted","Data":"2d43f3b93ddd0f72a83e2fe39fafe892eb37bbe6b1005a8fe15e30558f6cb290"} Oct 08 13:17:27 crc kubenswrapper[4735]: I1008 13:17:27.607439 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f8b20f18-fb68-45d8-9b04-c0c4096c44a7","Type":"ContainerStarted","Data":"4a83db09d56bf29053706700fad48316d4233b68bd94c951e3a217ed1ef2b57e"} Oct 08 13:17:29 crc kubenswrapper[4735]: I1008 13:17:29.627424 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f8b20f18-fb68-45d8-9b04-c0c4096c44a7","Type":"ContainerStarted","Data":"cac35ae518f55fdeecc2865584d2002921d9a9f5681e965ea39c2a3b4a2858db"} Oct 08 13:17:29 crc kubenswrapper[4735]: I1008 13:17:29.628081 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 08 13:17:29 crc kubenswrapper[4735]: I1008 13:17:29.698293 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.96585845 podStartE2EDuration="6.69826692s" podCreationTimestamp="2025-10-08 13:17:23 +0000 UTC" firstStartedPulling="2025-10-08 13:17:24.663619936 +0000 UTC m=+1294.471924480" lastFinishedPulling="2025-10-08 13:17:28.396028406 +0000 UTC m=+1298.204332950" observedRunningTime="2025-10-08 13:17:29.687333357 +0000 UTC m=+1299.495637901" watchObservedRunningTime="2025-10-08 13:17:29.69826692 +0000 UTC m=+1299.506571464" Oct 08 13:17:33 crc kubenswrapper[4735]: I1008 13:17:33.987399 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.438054 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-nhxn5"] Oct 08 
13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.439651 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-nhxn5" Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.443329 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.447398 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.450988 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-nhxn5"] Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.499012 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee80dca5-9f7b-47ca-82e9-2c210a8c507d-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-nhxn5\" (UID: \"ee80dca5-9f7b-47ca-82e9-2c210a8c507d\") " pod="openstack/nova-cell0-cell-mapping-nhxn5" Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.499343 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee80dca5-9f7b-47ca-82e9-2c210a8c507d-scripts\") pod \"nova-cell0-cell-mapping-nhxn5\" (UID: \"ee80dca5-9f7b-47ca-82e9-2c210a8c507d\") " pod="openstack/nova-cell0-cell-mapping-nhxn5" Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.499631 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee80dca5-9f7b-47ca-82e9-2c210a8c507d-config-data\") pod \"nova-cell0-cell-mapping-nhxn5\" (UID: \"ee80dca5-9f7b-47ca-82e9-2c210a8c507d\") " pod="openstack/nova-cell0-cell-mapping-nhxn5" Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.499790 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nfkm\" (UniqueName: \"kubernetes.io/projected/ee80dca5-9f7b-47ca-82e9-2c210a8c507d-kube-api-access-4nfkm\") pod \"nova-cell0-cell-mapping-nhxn5\" (UID: \"ee80dca5-9f7b-47ca-82e9-2c210a8c507d\") " pod="openstack/nova-cell0-cell-mapping-nhxn5" Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.580649 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.582162 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.585211 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.599381 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.601238 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74384674-a8ff-43a8-b776-b26ff882ee27-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"74384674-a8ff-43a8-b776-b26ff882ee27\") " pod="openstack/nova-api-0" Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.601342 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee80dca5-9f7b-47ca-82e9-2c210a8c507d-config-data\") pod \"nova-cell0-cell-mapping-nhxn5\" (UID: \"ee80dca5-9f7b-47ca-82e9-2c210a8c507d\") " pod="openstack/nova-cell0-cell-mapping-nhxn5" Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.601373 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nfkm\" (UniqueName: \"kubernetes.io/projected/ee80dca5-9f7b-47ca-82e9-2c210a8c507d-kube-api-access-4nfkm\") pod 
\"nova-cell0-cell-mapping-nhxn5\" (UID: \"ee80dca5-9f7b-47ca-82e9-2c210a8c507d\") " pod="openstack/nova-cell0-cell-mapping-nhxn5" Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.601392 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ddgz\" (UniqueName: \"kubernetes.io/projected/74384674-a8ff-43a8-b776-b26ff882ee27-kube-api-access-4ddgz\") pod \"nova-api-0\" (UID: \"74384674-a8ff-43a8-b776-b26ff882ee27\") " pod="openstack/nova-api-0" Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.601426 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee80dca5-9f7b-47ca-82e9-2c210a8c507d-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-nhxn5\" (UID: \"ee80dca5-9f7b-47ca-82e9-2c210a8c507d\") " pod="openstack/nova-cell0-cell-mapping-nhxn5" Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.601457 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74384674-a8ff-43a8-b776-b26ff882ee27-logs\") pod \"nova-api-0\" (UID: \"74384674-a8ff-43a8-b776-b26ff882ee27\") " pod="openstack/nova-api-0" Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.601489 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74384674-a8ff-43a8-b776-b26ff882ee27-config-data\") pod \"nova-api-0\" (UID: \"74384674-a8ff-43a8-b776-b26ff882ee27\") " pod="openstack/nova-api-0" Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.601522 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee80dca5-9f7b-47ca-82e9-2c210a8c507d-scripts\") pod \"nova-cell0-cell-mapping-nhxn5\" (UID: \"ee80dca5-9f7b-47ca-82e9-2c210a8c507d\") " 
pod="openstack/nova-cell0-cell-mapping-nhxn5" Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.609977 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee80dca5-9f7b-47ca-82e9-2c210a8c507d-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-nhxn5\" (UID: \"ee80dca5-9f7b-47ca-82e9-2c210a8c507d\") " pod="openstack/nova-cell0-cell-mapping-nhxn5" Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.610949 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee80dca5-9f7b-47ca-82e9-2c210a8c507d-scripts\") pod \"nova-cell0-cell-mapping-nhxn5\" (UID: \"ee80dca5-9f7b-47ca-82e9-2c210a8c507d\") " pod="openstack/nova-cell0-cell-mapping-nhxn5" Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.613286 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee80dca5-9f7b-47ca-82e9-2c210a8c507d-config-data\") pod \"nova-cell0-cell-mapping-nhxn5\" (UID: \"ee80dca5-9f7b-47ca-82e9-2c210a8c507d\") " pod="openstack/nova-cell0-cell-mapping-nhxn5" Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.656118 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nfkm\" (UniqueName: \"kubernetes.io/projected/ee80dca5-9f7b-47ca-82e9-2c210a8c507d-kube-api-access-4nfkm\") pod \"nova-cell0-cell-mapping-nhxn5\" (UID: \"ee80dca5-9f7b-47ca-82e9-2c210a8c507d\") " pod="openstack/nova-cell0-cell-mapping-nhxn5" Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.669544 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.671416 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.696962 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.703943 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ddgz\" (UniqueName: \"kubernetes.io/projected/74384674-a8ff-43a8-b776-b26ff882ee27-kube-api-access-4ddgz\") pod \"nova-api-0\" (UID: \"74384674-a8ff-43a8-b776-b26ff882ee27\") " pod="openstack/nova-api-0" Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.704006 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74384674-a8ff-43a8-b776-b26ff882ee27-logs\") pod \"nova-api-0\" (UID: \"74384674-a8ff-43a8-b776-b26ff882ee27\") " pod="openstack/nova-api-0" Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.704043 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f1fc341-ba28-40e6-80cd-574dcfd48983-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5f1fc341-ba28-40e6-80cd-574dcfd48983\") " pod="openstack/nova-metadata-0" Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.704066 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74384674-a8ff-43a8-b776-b26ff882ee27-config-data\") pod \"nova-api-0\" (UID: \"74384674-a8ff-43a8-b776-b26ff882ee27\") " pod="openstack/nova-api-0" Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.704100 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f1fc341-ba28-40e6-80cd-574dcfd48983-config-data\") pod \"nova-metadata-0\" (UID: \"5f1fc341-ba28-40e6-80cd-574dcfd48983\") " 
pod="openstack/nova-metadata-0" Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.704135 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f1fc341-ba28-40e6-80cd-574dcfd48983-logs\") pod \"nova-metadata-0\" (UID: \"5f1fc341-ba28-40e6-80cd-574dcfd48983\") " pod="openstack/nova-metadata-0" Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.704164 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74384674-a8ff-43a8-b776-b26ff882ee27-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"74384674-a8ff-43a8-b776-b26ff882ee27\") " pod="openstack/nova-api-0" Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.704194 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grrr2\" (UniqueName: \"kubernetes.io/projected/5f1fc341-ba28-40e6-80cd-574dcfd48983-kube-api-access-grrr2\") pod \"nova-metadata-0\" (UID: \"5f1fc341-ba28-40e6-80cd-574dcfd48983\") " pod="openstack/nova-metadata-0" Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.704605 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74384674-a8ff-43a8-b776-b26ff882ee27-logs\") pod \"nova-api-0\" (UID: \"74384674-a8ff-43a8-b776-b26ff882ee27\") " pod="openstack/nova-api-0" Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.717170 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74384674-a8ff-43a8-b776-b26ff882ee27-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"74384674-a8ff-43a8-b776-b26ff882ee27\") " pod="openstack/nova-api-0" Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.717748 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/74384674-a8ff-43a8-b776-b26ff882ee27-config-data\") pod \"nova-api-0\" (UID: \"74384674-a8ff-43a8-b776-b26ff882ee27\") " pod="openstack/nova-api-0" Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.718299 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.745131 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ddgz\" (UniqueName: \"kubernetes.io/projected/74384674-a8ff-43a8-b776-b26ff882ee27-kube-api-access-4ddgz\") pod \"nova-api-0\" (UID: \"74384674-a8ff-43a8-b776-b26ff882ee27\") " pod="openstack/nova-api-0" Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.752866 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.756986 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.763772 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-nhxn5" Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.764567 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.785810 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.787046 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.798191 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.798622 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.807745 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2861365a-b33e-4cf2-b2db-eb3e43c55535-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2861365a-b33e-4cf2-b2db-eb3e43c55535\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.807831 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grrr2\" (UniqueName: \"kubernetes.io/projected/5f1fc341-ba28-40e6-80cd-574dcfd48983-kube-api-access-grrr2\") pod \"nova-metadata-0\" (UID: \"5f1fc341-ba28-40e6-80cd-574dcfd48983\") " pod="openstack/nova-metadata-0" Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.807868 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2861365a-b33e-4cf2-b2db-eb3e43c55535-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2861365a-b33e-4cf2-b2db-eb3e43c55535\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.807940 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7964t\" (UniqueName: \"kubernetes.io/projected/2861365a-b33e-4cf2-b2db-eb3e43c55535-kube-api-access-7964t\") pod \"nova-cell1-novncproxy-0\" (UID: \"2861365a-b33e-4cf2-b2db-eb3e43c55535\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 13:17:34 crc 
kubenswrapper[4735]: I1008 13:17:34.808011 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f1fc341-ba28-40e6-80cd-574dcfd48983-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5f1fc341-ba28-40e6-80cd-574dcfd48983\") " pod="openstack/nova-metadata-0" Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.808036 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7010eae7-51d7-4f0c-a4ca-a4fbbbd33071-config-data\") pod \"nova-scheduler-0\" (UID: \"7010eae7-51d7-4f0c-a4ca-a4fbbbd33071\") " pod="openstack/nova-scheduler-0" Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.808080 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7010eae7-51d7-4f0c-a4ca-a4fbbbd33071-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7010eae7-51d7-4f0c-a4ca-a4fbbbd33071\") " pod="openstack/nova-scheduler-0" Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.808105 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6v7v\" (UniqueName: \"kubernetes.io/projected/7010eae7-51d7-4f0c-a4ca-a4fbbbd33071-kube-api-access-n6v7v\") pod \"nova-scheduler-0\" (UID: \"7010eae7-51d7-4f0c-a4ca-a4fbbbd33071\") " pod="openstack/nova-scheduler-0" Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.808129 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f1fc341-ba28-40e6-80cd-574dcfd48983-config-data\") pod \"nova-metadata-0\" (UID: \"5f1fc341-ba28-40e6-80cd-574dcfd48983\") " pod="openstack/nova-metadata-0" Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.808164 4735 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f1fc341-ba28-40e6-80cd-574dcfd48983-logs\") pod \"nova-metadata-0\" (UID: \"5f1fc341-ba28-40e6-80cd-574dcfd48983\") " pod="openstack/nova-metadata-0" Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.808763 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f1fc341-ba28-40e6-80cd-574dcfd48983-logs\") pod \"nova-metadata-0\" (UID: \"5f1fc341-ba28-40e6-80cd-574dcfd48983\") " pod="openstack/nova-metadata-0" Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.820215 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f1fc341-ba28-40e6-80cd-574dcfd48983-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5f1fc341-ba28-40e6-80cd-574dcfd48983\") " pod="openstack/nova-metadata-0" Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.828005 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f1fc341-ba28-40e6-80cd-574dcfd48983-config-data\") pod \"nova-metadata-0\" (UID: \"5f1fc341-ba28-40e6-80cd-574dcfd48983\") " pod="openstack/nova-metadata-0" Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.838887 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.848951 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grrr2\" (UniqueName: \"kubernetes.io/projected/5f1fc341-ba28-40e6-80cd-574dcfd48983-kube-api-access-grrr2\") pod \"nova-metadata-0\" (UID: \"5f1fc341-ba28-40e6-80cd-574dcfd48983\") " pod="openstack/nova-metadata-0" Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.862463 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.900157 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.910321 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2861365a-b33e-4cf2-b2db-eb3e43c55535-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2861365a-b33e-4cf2-b2db-eb3e43c55535\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.910453 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7964t\" (UniqueName: \"kubernetes.io/projected/2861365a-b33e-4cf2-b2db-eb3e43c55535-kube-api-access-7964t\") pod \"nova-cell1-novncproxy-0\" (UID: \"2861365a-b33e-4cf2-b2db-eb3e43c55535\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.910597 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7010eae7-51d7-4f0c-a4ca-a4fbbbd33071-config-data\") pod \"nova-scheduler-0\" (UID: \"7010eae7-51d7-4f0c-a4ca-a4fbbbd33071\") " pod="openstack/nova-scheduler-0" Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.910665 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7010eae7-51d7-4f0c-a4ca-a4fbbbd33071-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7010eae7-51d7-4f0c-a4ca-a4fbbbd33071\") " pod="openstack/nova-scheduler-0" Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.910692 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6v7v\" (UniqueName: \"kubernetes.io/projected/7010eae7-51d7-4f0c-a4ca-a4fbbbd33071-kube-api-access-n6v7v\") 
pod \"nova-scheduler-0\" (UID: \"7010eae7-51d7-4f0c-a4ca-a4fbbbd33071\") " pod="openstack/nova-scheduler-0" Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.910765 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2861365a-b33e-4cf2-b2db-eb3e43c55535-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2861365a-b33e-4cf2-b2db-eb3e43c55535\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.918596 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2861365a-b33e-4cf2-b2db-eb3e43c55535-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2861365a-b33e-4cf2-b2db-eb3e43c55535\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.927829 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7010eae7-51d7-4f0c-a4ca-a4fbbbd33071-config-data\") pod \"nova-scheduler-0\" (UID: \"7010eae7-51d7-4f0c-a4ca-a4fbbbd33071\") " pod="openstack/nova-scheduler-0" Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.928954 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7010eae7-51d7-4f0c-a4ca-a4fbbbd33071-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7010eae7-51d7-4f0c-a4ca-a4fbbbd33071\") " pod="openstack/nova-scheduler-0" Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.929868 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2861365a-b33e-4cf2-b2db-eb3e43c55535-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2861365a-b33e-4cf2-b2db-eb3e43c55535\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.945587 4735 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6v7v\" (UniqueName: \"kubernetes.io/projected/7010eae7-51d7-4f0c-a4ca-a4fbbbd33071-kube-api-access-n6v7v\") pod \"nova-scheduler-0\" (UID: \"7010eae7-51d7-4f0c-a4ca-a4fbbbd33071\") " pod="openstack/nova-scheduler-0" Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.947501 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6ffc974fdf-6wrx7"] Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.949181 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6ffc974fdf-6wrx7"] Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.951151 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6ffc974fdf-6wrx7" Oct 08 13:17:34 crc kubenswrapper[4735]: I1008 13:17:34.953397 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7964t\" (UniqueName: \"kubernetes.io/projected/2861365a-b33e-4cf2-b2db-eb3e43c55535-kube-api-access-7964t\") pod \"nova-cell1-novncproxy-0\" (UID: \"2861365a-b33e-4cf2-b2db-eb3e43c55535\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 13:17:35 crc kubenswrapper[4735]: I1008 13:17:35.115863 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f83be1f-9107-43fc-afad-d20a16cc0911-dns-svc\") pod \"dnsmasq-dns-6ffc974fdf-6wrx7\" (UID: \"0f83be1f-9107-43fc-afad-d20a16cc0911\") " pod="openstack/dnsmasq-dns-6ffc974fdf-6wrx7" Oct 08 13:17:35 crc kubenswrapper[4735]: I1008 13:17:35.115995 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ms5lf\" (UniqueName: \"kubernetes.io/projected/0f83be1f-9107-43fc-afad-d20a16cc0911-kube-api-access-ms5lf\") pod \"dnsmasq-dns-6ffc974fdf-6wrx7\" (UID: \"0f83be1f-9107-43fc-afad-d20a16cc0911\") " 
pod="openstack/dnsmasq-dns-6ffc974fdf-6wrx7" Oct 08 13:17:35 crc kubenswrapper[4735]: I1008 13:17:35.116058 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f83be1f-9107-43fc-afad-d20a16cc0911-config\") pod \"dnsmasq-dns-6ffc974fdf-6wrx7\" (UID: \"0f83be1f-9107-43fc-afad-d20a16cc0911\") " pod="openstack/dnsmasq-dns-6ffc974fdf-6wrx7" Oct 08 13:17:35 crc kubenswrapper[4735]: I1008 13:17:35.116120 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0f83be1f-9107-43fc-afad-d20a16cc0911-dns-swift-storage-0\") pod \"dnsmasq-dns-6ffc974fdf-6wrx7\" (UID: \"0f83be1f-9107-43fc-afad-d20a16cc0911\") " pod="openstack/dnsmasq-dns-6ffc974fdf-6wrx7" Oct 08 13:17:35 crc kubenswrapper[4735]: I1008 13:17:35.116149 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f83be1f-9107-43fc-afad-d20a16cc0911-ovsdbserver-nb\") pod \"dnsmasq-dns-6ffc974fdf-6wrx7\" (UID: \"0f83be1f-9107-43fc-afad-d20a16cc0911\") " pod="openstack/dnsmasq-dns-6ffc974fdf-6wrx7" Oct 08 13:17:35 crc kubenswrapper[4735]: I1008 13:17:35.116177 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f83be1f-9107-43fc-afad-d20a16cc0911-ovsdbserver-sb\") pod \"dnsmasq-dns-6ffc974fdf-6wrx7\" (UID: \"0f83be1f-9107-43fc-afad-d20a16cc0911\") " pod="openstack/dnsmasq-dns-6ffc974fdf-6wrx7" Oct 08 13:17:35 crc kubenswrapper[4735]: I1008 13:17:35.193958 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 13:17:35 crc kubenswrapper[4735]: I1008 13:17:35.217748 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f83be1f-9107-43fc-afad-d20a16cc0911-dns-svc\") pod \"dnsmasq-dns-6ffc974fdf-6wrx7\" (UID: \"0f83be1f-9107-43fc-afad-d20a16cc0911\") " pod="openstack/dnsmasq-dns-6ffc974fdf-6wrx7" Oct 08 13:17:35 crc kubenswrapper[4735]: I1008 13:17:35.217849 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ms5lf\" (UniqueName: \"kubernetes.io/projected/0f83be1f-9107-43fc-afad-d20a16cc0911-kube-api-access-ms5lf\") pod \"dnsmasq-dns-6ffc974fdf-6wrx7\" (UID: \"0f83be1f-9107-43fc-afad-d20a16cc0911\") " pod="openstack/dnsmasq-dns-6ffc974fdf-6wrx7" Oct 08 13:17:35 crc kubenswrapper[4735]: I1008 13:17:35.217910 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f83be1f-9107-43fc-afad-d20a16cc0911-config\") pod \"dnsmasq-dns-6ffc974fdf-6wrx7\" (UID: \"0f83be1f-9107-43fc-afad-d20a16cc0911\") " pod="openstack/dnsmasq-dns-6ffc974fdf-6wrx7" Oct 08 13:17:35 crc kubenswrapper[4735]: I1008 13:17:35.217951 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0f83be1f-9107-43fc-afad-d20a16cc0911-dns-swift-storage-0\") pod \"dnsmasq-dns-6ffc974fdf-6wrx7\" (UID: \"0f83be1f-9107-43fc-afad-d20a16cc0911\") " pod="openstack/dnsmasq-dns-6ffc974fdf-6wrx7" Oct 08 13:17:35 crc kubenswrapper[4735]: I1008 13:17:35.217984 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f83be1f-9107-43fc-afad-d20a16cc0911-ovsdbserver-nb\") pod \"dnsmasq-dns-6ffc974fdf-6wrx7\" (UID: \"0f83be1f-9107-43fc-afad-d20a16cc0911\") " pod="openstack/dnsmasq-dns-6ffc974fdf-6wrx7" 
Oct 08 13:17:35 crc kubenswrapper[4735]: I1008 13:17:35.218009 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f83be1f-9107-43fc-afad-d20a16cc0911-ovsdbserver-sb\") pod \"dnsmasq-dns-6ffc974fdf-6wrx7\" (UID: \"0f83be1f-9107-43fc-afad-d20a16cc0911\") " pod="openstack/dnsmasq-dns-6ffc974fdf-6wrx7" Oct 08 13:17:35 crc kubenswrapper[4735]: I1008 13:17:35.219101 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f83be1f-9107-43fc-afad-d20a16cc0911-ovsdbserver-sb\") pod \"dnsmasq-dns-6ffc974fdf-6wrx7\" (UID: \"0f83be1f-9107-43fc-afad-d20a16cc0911\") " pod="openstack/dnsmasq-dns-6ffc974fdf-6wrx7" Oct 08 13:17:35 crc kubenswrapper[4735]: I1008 13:17:35.219777 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f83be1f-9107-43fc-afad-d20a16cc0911-dns-svc\") pod \"dnsmasq-dns-6ffc974fdf-6wrx7\" (UID: \"0f83be1f-9107-43fc-afad-d20a16cc0911\") " pod="openstack/dnsmasq-dns-6ffc974fdf-6wrx7" Oct 08 13:17:35 crc kubenswrapper[4735]: I1008 13:17:35.219825 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f83be1f-9107-43fc-afad-d20a16cc0911-config\") pod \"dnsmasq-dns-6ffc974fdf-6wrx7\" (UID: \"0f83be1f-9107-43fc-afad-d20a16cc0911\") " pod="openstack/dnsmasq-dns-6ffc974fdf-6wrx7" Oct 08 13:17:35 crc kubenswrapper[4735]: I1008 13:17:35.220439 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0f83be1f-9107-43fc-afad-d20a16cc0911-dns-swift-storage-0\") pod \"dnsmasq-dns-6ffc974fdf-6wrx7\" (UID: \"0f83be1f-9107-43fc-afad-d20a16cc0911\") " pod="openstack/dnsmasq-dns-6ffc974fdf-6wrx7" Oct 08 13:17:35 crc kubenswrapper[4735]: I1008 13:17:35.220541 4735 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f83be1f-9107-43fc-afad-d20a16cc0911-ovsdbserver-nb\") pod \"dnsmasq-dns-6ffc974fdf-6wrx7\" (UID: \"0f83be1f-9107-43fc-afad-d20a16cc0911\") " pod="openstack/dnsmasq-dns-6ffc974fdf-6wrx7" Oct 08 13:17:35 crc kubenswrapper[4735]: I1008 13:17:35.221853 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 08 13:17:35 crc kubenswrapper[4735]: I1008 13:17:35.253054 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ms5lf\" (UniqueName: \"kubernetes.io/projected/0f83be1f-9107-43fc-afad-d20a16cc0911-kube-api-access-ms5lf\") pod \"dnsmasq-dns-6ffc974fdf-6wrx7\" (UID: \"0f83be1f-9107-43fc-afad-d20a16cc0911\") " pod="openstack/dnsmasq-dns-6ffc974fdf-6wrx7" Oct 08 13:17:35 crc kubenswrapper[4735]: I1008 13:17:35.281245 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6ffc974fdf-6wrx7" Oct 08 13:17:35 crc kubenswrapper[4735]: I1008 13:17:35.398949 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-nhxn5"] Oct 08 13:17:35 crc kubenswrapper[4735]: I1008 13:17:35.410800 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 13:17:35 crc kubenswrapper[4735]: W1008 13:17:35.417909 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee80dca5_9f7b_47ca_82e9_2c210a8c507d.slice/crio-b597f7baa744555ce0757603c3b23559f40332358543886911eaa017379bcc7d WatchSource:0}: Error finding container b597f7baa744555ce0757603c3b23559f40332358543886911eaa017379bcc7d: Status 404 returned error can't find the container with id b597f7baa744555ce0757603c3b23559f40332358543886911eaa017379bcc7d Oct 08 13:17:35 crc kubenswrapper[4735]: I1008 13:17:35.539149 4735 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/nova-cell1-conductor-db-sync-ccrws"] Oct 08 13:17:35 crc kubenswrapper[4735]: I1008 13:17:35.540239 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-ccrws" Oct 08 13:17:35 crc kubenswrapper[4735]: I1008 13:17:35.543027 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 08 13:17:35 crc kubenswrapper[4735]: I1008 13:17:35.544735 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Oct 08 13:17:35 crc kubenswrapper[4735]: I1008 13:17:35.553563 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 08 13:17:35 crc kubenswrapper[4735]: I1008 13:17:35.567232 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-ccrws"] Oct 08 13:17:35 crc kubenswrapper[4735]: I1008 13:17:35.641625 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/826e70f1-e543-4a90-93ba-a671ee1767b2-scripts\") pod \"nova-cell1-conductor-db-sync-ccrws\" (UID: \"826e70f1-e543-4a90-93ba-a671ee1767b2\") " pod="openstack/nova-cell1-conductor-db-sync-ccrws" Oct 08 13:17:35 crc kubenswrapper[4735]: I1008 13:17:35.641964 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/826e70f1-e543-4a90-93ba-a671ee1767b2-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-ccrws\" (UID: \"826e70f1-e543-4a90-93ba-a671ee1767b2\") " pod="openstack/nova-cell1-conductor-db-sync-ccrws" Oct 08 13:17:35 crc kubenswrapper[4735]: I1008 13:17:35.642050 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/826e70f1-e543-4a90-93ba-a671ee1767b2-config-data\") pod \"nova-cell1-conductor-db-sync-ccrws\" (UID: \"826e70f1-e543-4a90-93ba-a671ee1767b2\") " pod="openstack/nova-cell1-conductor-db-sync-ccrws" Oct 08 13:17:35 crc kubenswrapper[4735]: I1008 13:17:35.642437 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkbnr\" (UniqueName: \"kubernetes.io/projected/826e70f1-e543-4a90-93ba-a671ee1767b2-kube-api-access-jkbnr\") pod \"nova-cell1-conductor-db-sync-ccrws\" (UID: \"826e70f1-e543-4a90-93ba-a671ee1767b2\") " pod="openstack/nova-cell1-conductor-db-sync-ccrws" Oct 08 13:17:35 crc kubenswrapper[4735]: I1008 13:17:35.726008 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 13:17:35 crc kubenswrapper[4735]: I1008 13:17:35.742361 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5f1fc341-ba28-40e6-80cd-574dcfd48983","Type":"ContainerStarted","Data":"0c7c501c370ce194388bec05f592c0e016fc572d893c2f8e81516bbadea1ae2e"} Oct 08 13:17:35 crc kubenswrapper[4735]: I1008 13:17:35.744436 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkbnr\" (UniqueName: \"kubernetes.io/projected/826e70f1-e543-4a90-93ba-a671ee1767b2-kube-api-access-jkbnr\") pod \"nova-cell1-conductor-db-sync-ccrws\" (UID: \"826e70f1-e543-4a90-93ba-a671ee1767b2\") " pod="openstack/nova-cell1-conductor-db-sync-ccrws" Oct 08 13:17:35 crc kubenswrapper[4735]: I1008 13:17:35.744573 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/826e70f1-e543-4a90-93ba-a671ee1767b2-scripts\") pod \"nova-cell1-conductor-db-sync-ccrws\" (UID: \"826e70f1-e543-4a90-93ba-a671ee1767b2\") " pod="openstack/nova-cell1-conductor-db-sync-ccrws" Oct 08 13:17:35 crc kubenswrapper[4735]: I1008 13:17:35.744629 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/826e70f1-e543-4a90-93ba-a671ee1767b2-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-ccrws\" (UID: \"826e70f1-e543-4a90-93ba-a671ee1767b2\") " pod="openstack/nova-cell1-conductor-db-sync-ccrws" Oct 08 13:17:35 crc kubenswrapper[4735]: I1008 13:17:35.744681 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/826e70f1-e543-4a90-93ba-a671ee1767b2-config-data\") pod \"nova-cell1-conductor-db-sync-ccrws\" (UID: \"826e70f1-e543-4a90-93ba-a671ee1767b2\") " pod="openstack/nova-cell1-conductor-db-sync-ccrws" Oct 08 13:17:35 crc kubenswrapper[4735]: I1008 13:17:35.745188 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"74384674-a8ff-43a8-b776-b26ff882ee27","Type":"ContainerStarted","Data":"5ec7000585e49f7667f9a841d830d90eec70f72b991f997565d83899251cd472"} Oct 08 13:17:35 crc kubenswrapper[4735]: I1008 13:17:35.749073 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 13:17:35 crc kubenswrapper[4735]: I1008 13:17:35.750060 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/826e70f1-e543-4a90-93ba-a671ee1767b2-config-data\") pod \"nova-cell1-conductor-db-sync-ccrws\" (UID: \"826e70f1-e543-4a90-93ba-a671ee1767b2\") " pod="openstack/nova-cell1-conductor-db-sync-ccrws" Oct 08 13:17:35 crc kubenswrapper[4735]: I1008 13:17:35.756134 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/826e70f1-e543-4a90-93ba-a671ee1767b2-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-ccrws\" (UID: \"826e70f1-e543-4a90-93ba-a671ee1767b2\") " pod="openstack/nova-cell1-conductor-db-sync-ccrws" Oct 08 13:17:35 crc kubenswrapper[4735]: I1008 
13:17:35.756285 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/826e70f1-e543-4a90-93ba-a671ee1767b2-scripts\") pod \"nova-cell1-conductor-db-sync-ccrws\" (UID: \"826e70f1-e543-4a90-93ba-a671ee1767b2\") " pod="openstack/nova-cell1-conductor-db-sync-ccrws" Oct 08 13:17:35 crc kubenswrapper[4735]: I1008 13:17:35.760339 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-nhxn5" event={"ID":"ee80dca5-9f7b-47ca-82e9-2c210a8c507d","Type":"ContainerStarted","Data":"b597f7baa744555ce0757603c3b23559f40332358543886911eaa017379bcc7d"} Oct 08 13:17:35 crc kubenswrapper[4735]: I1008 13:17:35.764122 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkbnr\" (UniqueName: \"kubernetes.io/projected/826e70f1-e543-4a90-93ba-a671ee1767b2-kube-api-access-jkbnr\") pod \"nova-cell1-conductor-db-sync-ccrws\" (UID: \"826e70f1-e543-4a90-93ba-a671ee1767b2\") " pod="openstack/nova-cell1-conductor-db-sync-ccrws" Oct 08 13:17:35 crc kubenswrapper[4735]: I1008 13:17:35.835549 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-ccrws" Oct 08 13:17:35 crc kubenswrapper[4735]: I1008 13:17:35.932257 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6ffc974fdf-6wrx7"] Oct 08 13:17:36 crc kubenswrapper[4735]: I1008 13:17:36.338548 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-ccrws"] Oct 08 13:17:36 crc kubenswrapper[4735]: W1008 13:17:36.345629 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod826e70f1_e543_4a90_93ba_a671ee1767b2.slice/crio-d9c5b3045a6c066b5ac257f986506560f67609d7119a56f3d9d78136ce896353 WatchSource:0}: Error finding container d9c5b3045a6c066b5ac257f986506560f67609d7119a56f3d9d78136ce896353: Status 404 returned error can't find the container with id d9c5b3045a6c066b5ac257f986506560f67609d7119a56f3d9d78136ce896353 Oct 08 13:17:36 crc kubenswrapper[4735]: I1008 13:17:36.772921 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2861365a-b33e-4cf2-b2db-eb3e43c55535","Type":"ContainerStarted","Data":"9efef6c5858793fc76a4049ce3e9065d4a9178cdd0bbca1bf55dffe2c21046d5"} Oct 08 13:17:36 crc kubenswrapper[4735]: I1008 13:17:36.777909 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7010eae7-51d7-4f0c-a4ca-a4fbbbd33071","Type":"ContainerStarted","Data":"4d2745920da8691f70efbd96e889a722dd8c29e609e5d94ccfcc2ed658e72255"} Oct 08 13:17:36 crc kubenswrapper[4735]: I1008 13:17:36.781667 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-ccrws" event={"ID":"826e70f1-e543-4a90-93ba-a671ee1767b2","Type":"ContainerStarted","Data":"14d921e0baa7ca18db075fc0465d68042c4943f3d81357b3ee8a1af87fd6d390"} Oct 08 13:17:36 crc kubenswrapper[4735]: I1008 13:17:36.781714 4735 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/nova-cell1-conductor-db-sync-ccrws" event={"ID":"826e70f1-e543-4a90-93ba-a671ee1767b2","Type":"ContainerStarted","Data":"d9c5b3045a6c066b5ac257f986506560f67609d7119a56f3d9d78136ce896353"} Oct 08 13:17:36 crc kubenswrapper[4735]: I1008 13:17:36.789435 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-nhxn5" event={"ID":"ee80dca5-9f7b-47ca-82e9-2c210a8c507d","Type":"ContainerStarted","Data":"fe0b50aebd53217e466df2d09bfb549c0d6a2577d3c2470fded5a0c5db88732e"} Oct 08 13:17:36 crc kubenswrapper[4735]: I1008 13:17:36.796258 4735 generic.go:334] "Generic (PLEG): container finished" podID="0f83be1f-9107-43fc-afad-d20a16cc0911" containerID="777011447d701c38d307c6ae1785e771e9b66b3172113a6d6955d2f32dd6037c" exitCode=0 Oct 08 13:17:36 crc kubenswrapper[4735]: I1008 13:17:36.796321 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ffc974fdf-6wrx7" event={"ID":"0f83be1f-9107-43fc-afad-d20a16cc0911","Type":"ContainerDied","Data":"777011447d701c38d307c6ae1785e771e9b66b3172113a6d6955d2f32dd6037c"} Oct 08 13:17:36 crc kubenswrapper[4735]: I1008 13:17:36.796349 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ffc974fdf-6wrx7" event={"ID":"0f83be1f-9107-43fc-afad-d20a16cc0911","Type":"ContainerStarted","Data":"38d913cdaaf70c0d6d56fa0be0bb952435b488307b5cebfe9d448eeb359031e3"} Oct 08 13:17:36 crc kubenswrapper[4735]: I1008 13:17:36.804449 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-ccrws" podStartSLOduration=1.804425586 podStartE2EDuration="1.804425586s" podCreationTimestamp="2025-10-08 13:17:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 13:17:36.799679983 +0000 UTC m=+1306.607984527" watchObservedRunningTime="2025-10-08 13:17:36.804425586 +0000 UTC m=+1306.612730140" Oct 08 13:17:36 
crc kubenswrapper[4735]: I1008 13:17:36.851538 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-nhxn5" podStartSLOduration=2.8515220660000002 podStartE2EDuration="2.851522066s" podCreationTimestamp="2025-10-08 13:17:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 13:17:36.843138849 +0000 UTC m=+1306.651443393" watchObservedRunningTime="2025-10-08 13:17:36.851522066 +0000 UTC m=+1306.659826610" Oct 08 13:17:38 crc kubenswrapper[4735]: I1008 13:17:38.341529 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 13:17:38 crc kubenswrapper[4735]: I1008 13:17:38.355611 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 13:17:40 crc kubenswrapper[4735]: I1008 13:17:40.863808 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ffc974fdf-6wrx7" event={"ID":"0f83be1f-9107-43fc-afad-d20a16cc0911","Type":"ContainerStarted","Data":"df3cee6af5ec9b69328079e9be4776e7e7a521bb2c19df39ae8793d8d6f34c6b"} Oct 08 13:17:40 crc kubenswrapper[4735]: I1008 13:17:40.864319 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6ffc974fdf-6wrx7" Oct 08 13:17:40 crc kubenswrapper[4735]: I1008 13:17:40.866201 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5f1fc341-ba28-40e6-80cd-574dcfd48983","Type":"ContainerStarted","Data":"bfdc2bbddf5a559da3c78a8cb972c55f12ec5e9b3375226be6e18094289c9ea4"} Oct 08 13:17:40 crc kubenswrapper[4735]: I1008 13:17:40.866230 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5f1fc341-ba28-40e6-80cd-574dcfd48983","Type":"ContainerStarted","Data":"384094abefaa07d55b60c391c4f6b9a89bb4c01f9a5256e31cc3311408cadb4b"} Oct 08 13:17:40 crc 
kubenswrapper[4735]: I1008 13:17:40.866291 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="5f1fc341-ba28-40e6-80cd-574dcfd48983" containerName="nova-metadata-log" containerID="cri-o://384094abefaa07d55b60c391c4f6b9a89bb4c01f9a5256e31cc3311408cadb4b" gracePeriod=30 Oct 08 13:17:40 crc kubenswrapper[4735]: I1008 13:17:40.866309 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="5f1fc341-ba28-40e6-80cd-574dcfd48983" containerName="nova-metadata-metadata" containerID="cri-o://bfdc2bbddf5a559da3c78a8cb972c55f12ec5e9b3375226be6e18094289c9ea4" gracePeriod=30 Oct 08 13:17:40 crc kubenswrapper[4735]: I1008 13:17:40.868400 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2861365a-b33e-4cf2-b2db-eb3e43c55535","Type":"ContainerStarted","Data":"f37619629254693188af3bf21fb4741a55f07777900aae9d0cdad72ae36cb38d"} Oct 08 13:17:40 crc kubenswrapper[4735]: I1008 13:17:40.868562 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="2861365a-b33e-4cf2-b2db-eb3e43c55535" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://f37619629254693188af3bf21fb4741a55f07777900aae9d0cdad72ae36cb38d" gracePeriod=30 Oct 08 13:17:40 crc kubenswrapper[4735]: I1008 13:17:40.876512 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"74384674-a8ff-43a8-b776-b26ff882ee27","Type":"ContainerStarted","Data":"233ed0c3ce43ad7bb410dfceb620839af1ad100befe9f6caf2441f07c4c5e9f5"} Oct 08 13:17:40 crc kubenswrapper[4735]: I1008 13:17:40.876553 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"74384674-a8ff-43a8-b776-b26ff882ee27","Type":"ContainerStarted","Data":"18456d1abeed57d1da086fc444c382fe9513cd188f5bf35d712e3aa0181484e0"} Oct 08 13:17:40 crc 
kubenswrapper[4735]: I1008 13:17:40.879360 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7010eae7-51d7-4f0c-a4ca-a4fbbbd33071","Type":"ContainerStarted","Data":"4b7aa31a055dd58a9b888228bc73ddeafe15438c07ca586508b087a1f3f12295"} Oct 08 13:17:40 crc kubenswrapper[4735]: I1008 13:17:40.895236 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6ffc974fdf-6wrx7" podStartSLOduration=6.895220882 podStartE2EDuration="6.895220882s" podCreationTimestamp="2025-10-08 13:17:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 13:17:40.884588037 +0000 UTC m=+1310.692892591" watchObservedRunningTime="2025-10-08 13:17:40.895220882 +0000 UTC m=+1310.703525426" Oct 08 13:17:40 crc kubenswrapper[4735]: I1008 13:17:40.907935 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.637309614 podStartE2EDuration="6.907914831s" podCreationTimestamp="2025-10-08 13:17:34 +0000 UTC" firstStartedPulling="2025-10-08 13:17:35.759060367 +0000 UTC m=+1305.567364911" lastFinishedPulling="2025-10-08 13:17:40.029665584 +0000 UTC m=+1309.837970128" observedRunningTime="2025-10-08 13:17:40.90170511 +0000 UTC m=+1310.710009654" watchObservedRunningTime="2025-10-08 13:17:40.907914831 +0000 UTC m=+1310.716219375" Oct 08 13:17:40 crc kubenswrapper[4735]: I1008 13:17:40.927808 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.326354707 podStartE2EDuration="6.927785886s" podCreationTimestamp="2025-10-08 13:17:34 +0000 UTC" firstStartedPulling="2025-10-08 13:17:35.416134601 +0000 UTC m=+1305.224439145" lastFinishedPulling="2025-10-08 13:17:40.01756578 +0000 UTC m=+1309.825870324" observedRunningTime="2025-10-08 13:17:40.921915464 +0000 UTC m=+1310.730220008" 
watchObservedRunningTime="2025-10-08 13:17:40.927785886 +0000 UTC m=+1310.736090430" Oct 08 13:17:40 crc kubenswrapper[4735]: I1008 13:17:40.939159 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.5849035860000003 podStartE2EDuration="6.938851023s" podCreationTimestamp="2025-10-08 13:17:34 +0000 UTC" firstStartedPulling="2025-10-08 13:17:35.645612117 +0000 UTC m=+1305.453916661" lastFinishedPulling="2025-10-08 13:17:39.999559544 +0000 UTC m=+1309.807864098" observedRunningTime="2025-10-08 13:17:40.937652712 +0000 UTC m=+1310.745957246" watchObservedRunningTime="2025-10-08 13:17:40.938851023 +0000 UTC m=+1310.747155577" Oct 08 13:17:40 crc kubenswrapper[4735]: I1008 13:17:40.958282 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.6994012339999998 podStartE2EDuration="6.958263096s" podCreationTimestamp="2025-10-08 13:17:34 +0000 UTC" firstStartedPulling="2025-10-08 13:17:35.740396044 +0000 UTC m=+1305.548700608" lastFinishedPulling="2025-10-08 13:17:39.999257926 +0000 UTC m=+1309.807562470" observedRunningTime="2025-10-08 13:17:40.954210261 +0000 UTC m=+1310.762514795" watchObservedRunningTime="2025-10-08 13:17:40.958263096 +0000 UTC m=+1310.766567640" Oct 08 13:17:41 crc kubenswrapper[4735]: I1008 13:17:41.889736 4735 generic.go:334] "Generic (PLEG): container finished" podID="5f1fc341-ba28-40e6-80cd-574dcfd48983" containerID="bfdc2bbddf5a559da3c78a8cb972c55f12ec5e9b3375226be6e18094289c9ea4" exitCode=0 Oct 08 13:17:41 crc kubenswrapper[4735]: I1008 13:17:41.890061 4735 generic.go:334] "Generic (PLEG): container finished" podID="5f1fc341-ba28-40e6-80cd-574dcfd48983" containerID="384094abefaa07d55b60c391c4f6b9a89bb4c01f9a5256e31cc3311408cadb4b" exitCode=143 Oct 08 13:17:41 crc kubenswrapper[4735]: I1008 13:17:41.889803 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"5f1fc341-ba28-40e6-80cd-574dcfd48983","Type":"ContainerDied","Data":"bfdc2bbddf5a559da3c78a8cb972c55f12ec5e9b3375226be6e18094289c9ea4"} Oct 08 13:17:41 crc kubenswrapper[4735]: I1008 13:17:41.890102 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5f1fc341-ba28-40e6-80cd-574dcfd48983","Type":"ContainerDied","Data":"384094abefaa07d55b60c391c4f6b9a89bb4c01f9a5256e31cc3311408cadb4b"} Oct 08 13:17:41 crc kubenswrapper[4735]: I1008 13:17:41.890120 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5f1fc341-ba28-40e6-80cd-574dcfd48983","Type":"ContainerDied","Data":"0c7c501c370ce194388bec05f592c0e016fc572d893c2f8e81516bbadea1ae2e"} Oct 08 13:17:41 crc kubenswrapper[4735]: I1008 13:17:41.890132 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c7c501c370ce194388bec05f592c0e016fc572d893c2f8e81516bbadea1ae2e" Oct 08 13:17:41 crc kubenswrapper[4735]: I1008 13:17:41.942331 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 13:17:41 crc kubenswrapper[4735]: I1008 13:17:41.999828 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f1fc341-ba28-40e6-80cd-574dcfd48983-combined-ca-bundle\") pod \"5f1fc341-ba28-40e6-80cd-574dcfd48983\" (UID: \"5f1fc341-ba28-40e6-80cd-574dcfd48983\") " Oct 08 13:17:42 crc kubenswrapper[4735]: I1008 13:17:41.999903 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f1fc341-ba28-40e6-80cd-574dcfd48983-config-data\") pod \"5f1fc341-ba28-40e6-80cd-574dcfd48983\" (UID: \"5f1fc341-ba28-40e6-80cd-574dcfd48983\") " Oct 08 13:17:42 crc kubenswrapper[4735]: I1008 13:17:41.999999 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f1fc341-ba28-40e6-80cd-574dcfd48983-logs\") pod \"5f1fc341-ba28-40e6-80cd-574dcfd48983\" (UID: \"5f1fc341-ba28-40e6-80cd-574dcfd48983\") " Oct 08 13:17:42 crc kubenswrapper[4735]: I1008 13:17:42.000089 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grrr2\" (UniqueName: \"kubernetes.io/projected/5f1fc341-ba28-40e6-80cd-574dcfd48983-kube-api-access-grrr2\") pod \"5f1fc341-ba28-40e6-80cd-574dcfd48983\" (UID: \"5f1fc341-ba28-40e6-80cd-574dcfd48983\") " Oct 08 13:17:42 crc kubenswrapper[4735]: I1008 13:17:42.005694 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f1fc341-ba28-40e6-80cd-574dcfd48983-logs" (OuterVolumeSpecName: "logs") pod "5f1fc341-ba28-40e6-80cd-574dcfd48983" (UID: "5f1fc341-ba28-40e6-80cd-574dcfd48983"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:17:42 crc kubenswrapper[4735]: I1008 13:17:42.008819 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f1fc341-ba28-40e6-80cd-574dcfd48983-kube-api-access-grrr2" (OuterVolumeSpecName: "kube-api-access-grrr2") pod "5f1fc341-ba28-40e6-80cd-574dcfd48983" (UID: "5f1fc341-ba28-40e6-80cd-574dcfd48983"). InnerVolumeSpecName "kube-api-access-grrr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:17:42 crc kubenswrapper[4735]: I1008 13:17:42.033045 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f1fc341-ba28-40e6-80cd-574dcfd48983-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5f1fc341-ba28-40e6-80cd-574dcfd48983" (UID: "5f1fc341-ba28-40e6-80cd-574dcfd48983"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:17:42 crc kubenswrapper[4735]: I1008 13:17:42.044909 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f1fc341-ba28-40e6-80cd-574dcfd48983-config-data" (OuterVolumeSpecName: "config-data") pod "5f1fc341-ba28-40e6-80cd-574dcfd48983" (UID: "5f1fc341-ba28-40e6-80cd-574dcfd48983"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 13:17:42 crc kubenswrapper[4735]: I1008 13:17:42.102652 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f1fc341-ba28-40e6-80cd-574dcfd48983-logs\") on node \"crc\" DevicePath \"\""
Oct 08 13:17:42 crc kubenswrapper[4735]: I1008 13:17:42.102685 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grrr2\" (UniqueName: \"kubernetes.io/projected/5f1fc341-ba28-40e6-80cd-574dcfd48983-kube-api-access-grrr2\") on node \"crc\" DevicePath \"\""
Oct 08 13:17:42 crc kubenswrapper[4735]: I1008 13:17:42.102697 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f1fc341-ba28-40e6-80cd-574dcfd48983-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 08 13:17:42 crc kubenswrapper[4735]: I1008 13:17:42.102706 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f1fc341-ba28-40e6-80cd-574dcfd48983-config-data\") on node \"crc\" DevicePath \"\""
Oct 08 13:17:42 crc kubenswrapper[4735]: I1008 13:17:42.903265 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 08 13:17:42 crc kubenswrapper[4735]: I1008 13:17:42.958046 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Oct 08 13:17:42 crc kubenswrapper[4735]: I1008 13:17:42.988718 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Oct 08 13:17:43 crc kubenswrapper[4735]: I1008 13:17:43.001076 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Oct 08 13:17:43 crc kubenswrapper[4735]: E1008 13:17:43.001670 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f1fc341-ba28-40e6-80cd-574dcfd48983" containerName="nova-metadata-log"
Oct 08 13:17:43 crc kubenswrapper[4735]: I1008 13:17:43.001699 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f1fc341-ba28-40e6-80cd-574dcfd48983" containerName="nova-metadata-log"
Oct 08 13:17:43 crc kubenswrapper[4735]: E1008 13:17:43.001809 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f1fc341-ba28-40e6-80cd-574dcfd48983" containerName="nova-metadata-metadata"
Oct 08 13:17:43 crc kubenswrapper[4735]: I1008 13:17:43.002062 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f1fc341-ba28-40e6-80cd-574dcfd48983" containerName="nova-metadata-metadata"
Oct 08 13:17:43 crc kubenswrapper[4735]: I1008 13:17:43.002382 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f1fc341-ba28-40e6-80cd-574dcfd48983" containerName="nova-metadata-log"
Oct 08 13:17:43 crc kubenswrapper[4735]: I1008 13:17:43.002812 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f1fc341-ba28-40e6-80cd-574dcfd48983" containerName="nova-metadata-metadata"
Oct 08 13:17:43 crc kubenswrapper[4735]: I1008 13:17:43.005347 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 08 13:17:43 crc kubenswrapper[4735]: I1008 13:17:43.007788 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Oct 08 13:17:43 crc kubenswrapper[4735]: I1008 13:17:43.012953 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Oct 08 13:17:43 crc kubenswrapper[4735]: I1008 13:17:43.014109 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Oct 08 13:17:43 crc kubenswrapper[4735]: I1008 13:17:43.130920 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6\") " pod="openstack/nova-metadata-0"
Oct 08 13:17:43 crc kubenswrapper[4735]: I1008 13:17:43.131060 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwqgk\" (UniqueName: \"kubernetes.io/projected/81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6-kube-api-access-wwqgk\") pod \"nova-metadata-0\" (UID: \"81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6\") " pod="openstack/nova-metadata-0"
Oct 08 13:17:43 crc kubenswrapper[4735]: I1008 13:17:43.131126 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6\") " pod="openstack/nova-metadata-0"
Oct 08 13:17:43 crc kubenswrapper[4735]: I1008 13:17:43.131175 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6-logs\") pod \"nova-metadata-0\" (UID: \"81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6\") " pod="openstack/nova-metadata-0"
Oct 08 13:17:43 crc kubenswrapper[4735]: I1008 13:17:43.131196 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6-config-data\") pod \"nova-metadata-0\" (UID: \"81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6\") " pod="openstack/nova-metadata-0"
Oct 08 13:17:43 crc kubenswrapper[4735]: I1008 13:17:43.232963 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6-logs\") pod \"nova-metadata-0\" (UID: \"81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6\") " pod="openstack/nova-metadata-0"
Oct 08 13:17:43 crc kubenswrapper[4735]: I1008 13:17:43.233010 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6-config-data\") pod \"nova-metadata-0\" (UID: \"81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6\") " pod="openstack/nova-metadata-0"
Oct 08 13:17:43 crc kubenswrapper[4735]: I1008 13:17:43.233071 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6\") " pod="openstack/nova-metadata-0"
Oct 08 13:17:43 crc kubenswrapper[4735]: I1008 13:17:43.233166 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwqgk\" (UniqueName: \"kubernetes.io/projected/81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6-kube-api-access-wwqgk\") pod \"nova-metadata-0\" (UID: \"81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6\") " pod="openstack/nova-metadata-0"
Oct 08 13:17:43 crc kubenswrapper[4735]: I1008 13:17:43.233222 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6\") " pod="openstack/nova-metadata-0"
Oct 08 13:17:43 crc kubenswrapper[4735]: I1008 13:17:43.235525 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6-logs\") pod \"nova-metadata-0\" (UID: \"81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6\") " pod="openstack/nova-metadata-0"
Oct 08 13:17:43 crc kubenswrapper[4735]: I1008 13:17:43.245467 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6\") " pod="openstack/nova-metadata-0"
Oct 08 13:17:43 crc kubenswrapper[4735]: I1008 13:17:43.245904 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6\") " pod="openstack/nova-metadata-0"
Oct 08 13:17:43 crc kubenswrapper[4735]: I1008 13:17:43.251394 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6-config-data\") pod \"nova-metadata-0\" (UID: \"81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6\") " pod="openstack/nova-metadata-0"
Oct 08 13:17:43 crc kubenswrapper[4735]: I1008 13:17:43.253858 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwqgk\" (UniqueName: \"kubernetes.io/projected/81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6-kube-api-access-wwqgk\") pod \"nova-metadata-0\" (UID: \"81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6\") " pod="openstack/nova-metadata-0"
Oct 08 13:17:43 crc kubenswrapper[4735]: I1008 13:17:43.328702 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 08 13:17:43 crc kubenswrapper[4735]: I1008 13:17:43.778598 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Oct 08 13:17:43 crc kubenswrapper[4735]: I1008 13:17:43.916231 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6","Type":"ContainerStarted","Data":"5da538b9f570d9c384c25e6bf4e237793d459ff9faa0efcd56ebffc83b07cbfe"}
Oct 08 13:17:44 crc kubenswrapper[4735]: I1008 13:17:44.906647 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f1fc341-ba28-40e6-80cd-574dcfd48983" path="/var/lib/kubelet/pods/5f1fc341-ba28-40e6-80cd-574dcfd48983/volumes"
Oct 08 13:17:44 crc kubenswrapper[4735]: I1008 13:17:44.907443 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Oct 08 13:17:44 crc kubenswrapper[4735]: I1008 13:17:44.907465 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Oct 08 13:17:44 crc kubenswrapper[4735]: I1008 13:17:44.928747 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6","Type":"ContainerStarted","Data":"1854abee6edb5d8d474f3526f3fbde3a02ded75c3ba7c61a9317ed055b93f0b1"}
Oct 08 13:17:44 crc kubenswrapper[4735]: I1008 13:17:44.928790 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6","Type":"ContainerStarted","Data":"58385b9eca5b3273cbf0f780a46c3f27cf81120010f1b80c3965e5d8d2c0b173"}
Oct 08 13:17:44 crc kubenswrapper[4735]: I1008 13:17:44.931903 4735 generic.go:334] "Generic (PLEG): container finished" podID="ee80dca5-9f7b-47ca-82e9-2c210a8c507d" containerID="fe0b50aebd53217e466df2d09bfb549c0d6a2577d3c2470fded5a0c5db88732e" exitCode=0
Oct 08 13:17:44 crc kubenswrapper[4735]: I1008 13:17:44.931939 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-nhxn5" event={"ID":"ee80dca5-9f7b-47ca-82e9-2c210a8c507d","Type":"ContainerDied","Data":"fe0b50aebd53217e466df2d09bfb549c0d6a2577d3c2470fded5a0c5db88732e"}
Oct 08 13:17:44 crc kubenswrapper[4735]: I1008 13:17:44.957219 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.957197952 podStartE2EDuration="2.957197952s" podCreationTimestamp="2025-10-08 13:17:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 13:17:44.957186912 +0000 UTC m=+1314.765491456" watchObservedRunningTime="2025-10-08 13:17:44.957197952 +0000 UTC m=+1314.765502506"
Oct 08 13:17:45 crc kubenswrapper[4735]: I1008 13:17:45.194657 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Oct 08 13:17:45 crc kubenswrapper[4735]: I1008 13:17:45.194888 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Oct 08 13:17:45 crc kubenswrapper[4735]: I1008 13:17:45.222697 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Oct 08 13:17:45 crc kubenswrapper[4735]: I1008 13:17:45.227165 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Oct 08 13:17:45 crc kubenswrapper[4735]: I1008 13:17:45.283658 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6ffc974fdf-6wrx7"
Oct 08 13:17:45 crc kubenswrapper[4735]: I1008 13:17:45.351532 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84bd785c49-4rn4d"]
Oct 08 13:17:45 crc kubenswrapper[4735]: I1008 13:17:45.351763 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-84bd785c49-4rn4d" podUID="6819ca75-acc5-4a08-93d0-c58685908257" containerName="dnsmasq-dns" containerID="cri-o://fd4d662762db3c9dc648b8dfa89bbc23539407d1c35bc2f4b1809028f3aeca4e" gracePeriod=10
Oct 08 13:17:45 crc kubenswrapper[4735]: I1008 13:17:45.905923 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84bd785c49-4rn4d"
Oct 08 13:17:45 crc kubenswrapper[4735]: I1008 13:17:45.972722 4735 generic.go:334] "Generic (PLEG): container finished" podID="826e70f1-e543-4a90-93ba-a671ee1767b2" containerID="14d921e0baa7ca18db075fc0465d68042c4943f3d81357b3ee8a1af87fd6d390" exitCode=0
Oct 08 13:17:45 crc kubenswrapper[4735]: I1008 13:17:45.972828 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-ccrws" event={"ID":"826e70f1-e543-4a90-93ba-a671ee1767b2","Type":"ContainerDied","Data":"14d921e0baa7ca18db075fc0465d68042c4943f3d81357b3ee8a1af87fd6d390"}
Oct 08 13:17:45 crc kubenswrapper[4735]: I1008 13:17:45.983692 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="74384674-a8ff-43a8-b776-b26ff882ee27" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.184:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Oct 08 13:17:45 crc kubenswrapper[4735]: I1008 13:17:45.984882 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="74384674-a8ff-43a8-b776-b26ff882ee27" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.184:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Oct 08 13:17:45 crc kubenswrapper[4735]: I1008 13:17:45.989018 4735 generic.go:334] "Generic (PLEG): container finished" podID="6819ca75-acc5-4a08-93d0-c58685908257" containerID="fd4d662762db3c9dc648b8dfa89bbc23539407d1c35bc2f4b1809028f3aeca4e" exitCode=0
Oct 08 13:17:45 crc kubenswrapper[4735]: I1008 13:17:45.989934 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6819ca75-acc5-4a08-93d0-c58685908257-ovsdbserver-sb\") pod \"6819ca75-acc5-4a08-93d0-c58685908257\" (UID: \"6819ca75-acc5-4a08-93d0-c58685908257\") "
Oct 08 13:17:45 crc kubenswrapper[4735]: I1008 13:17:45.990000 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6819ca75-acc5-4a08-93d0-c58685908257-config\") pod \"6819ca75-acc5-4a08-93d0-c58685908257\" (UID: \"6819ca75-acc5-4a08-93d0-c58685908257\") "
Oct 08 13:17:45 crc kubenswrapper[4735]: I1008 13:17:45.990022 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6819ca75-acc5-4a08-93d0-c58685908257-ovsdbserver-nb\") pod \"6819ca75-acc5-4a08-93d0-c58685908257\" (UID: \"6819ca75-acc5-4a08-93d0-c58685908257\") "
Oct 08 13:17:45 crc kubenswrapper[4735]: I1008 13:17:45.990107 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6819ca75-acc5-4a08-93d0-c58685908257-dns-svc\") pod \"6819ca75-acc5-4a08-93d0-c58685908257\" (UID: \"6819ca75-acc5-4a08-93d0-c58685908257\") "
Oct 08 13:17:45 crc kubenswrapper[4735]: I1008 13:17:45.990145 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6819ca75-acc5-4a08-93d0-c58685908257-dns-swift-storage-0\") pod \"6819ca75-acc5-4a08-93d0-c58685908257\" (UID: \"6819ca75-acc5-4a08-93d0-c58685908257\") "
Oct 08 13:17:45 crc kubenswrapper[4735]: I1008 13:17:45.990215 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wsghf\" (UniqueName: \"kubernetes.io/projected/6819ca75-acc5-4a08-93d0-c58685908257-kube-api-access-wsghf\") pod \"6819ca75-acc5-4a08-93d0-c58685908257\" (UID: \"6819ca75-acc5-4a08-93d0-c58685908257\") "
Oct 08 13:17:45 crc kubenswrapper[4735]: I1008 13:17:45.991036 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84bd785c49-4rn4d"
Oct 08 13:17:45 crc kubenswrapper[4735]: I1008 13:17:45.992723 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84bd785c49-4rn4d" event={"ID":"6819ca75-acc5-4a08-93d0-c58685908257","Type":"ContainerDied","Data":"fd4d662762db3c9dc648b8dfa89bbc23539407d1c35bc2f4b1809028f3aeca4e"}
Oct 08 13:17:45 crc kubenswrapper[4735]: I1008 13:17:45.992769 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84bd785c49-4rn4d" event={"ID":"6819ca75-acc5-4a08-93d0-c58685908257","Type":"ContainerDied","Data":"02297bd6a3ddb02680c567ce4c77f24e12e80b85d8400e43bcbf7c8d5591a531"}
Oct 08 13:17:45 crc kubenswrapper[4735]: I1008 13:17:45.992790 4735 scope.go:117] "RemoveContainer" containerID="fd4d662762db3c9dc648b8dfa89bbc23539407d1c35bc2f4b1809028f3aeca4e"
Oct 08 13:17:46 crc kubenswrapper[4735]: I1008 13:17:46.019353 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6819ca75-acc5-4a08-93d0-c58685908257-kube-api-access-wsghf" (OuterVolumeSpecName: "kube-api-access-wsghf") pod "6819ca75-acc5-4a08-93d0-c58685908257" (UID: "6819ca75-acc5-4a08-93d0-c58685908257"). InnerVolumeSpecName "kube-api-access-wsghf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 13:17:46 crc kubenswrapper[4735]: I1008 13:17:46.034653 4735 scope.go:117] "RemoveContainer" containerID="10364ef3a94c2a9fd107b17397384560b69737827150d506ad75af5de4b8810f"
Oct 08 13:17:46 crc kubenswrapper[4735]: I1008 13:17:46.053874 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Oct 08 13:17:46 crc kubenswrapper[4735]: I1008 13:17:46.085371 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6819ca75-acc5-4a08-93d0-c58685908257-config" (OuterVolumeSpecName: "config") pod "6819ca75-acc5-4a08-93d0-c58685908257" (UID: "6819ca75-acc5-4a08-93d0-c58685908257"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 13:17:46 crc kubenswrapper[4735]: I1008 13:17:46.094077 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6819ca75-acc5-4a08-93d0-c58685908257-config\") on node \"crc\" DevicePath \"\""
Oct 08 13:17:46 crc kubenswrapper[4735]: I1008 13:17:46.094109 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wsghf\" (UniqueName: \"kubernetes.io/projected/6819ca75-acc5-4a08-93d0-c58685908257-kube-api-access-wsghf\") on node \"crc\" DevicePath \"\""
Oct 08 13:17:46 crc kubenswrapper[4735]: I1008 13:17:46.111073 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6819ca75-acc5-4a08-93d0-c58685908257-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6819ca75-acc5-4a08-93d0-c58685908257" (UID: "6819ca75-acc5-4a08-93d0-c58685908257"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 13:17:46 crc kubenswrapper[4735]: I1008 13:17:46.115685 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6819ca75-acc5-4a08-93d0-c58685908257-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6819ca75-acc5-4a08-93d0-c58685908257" (UID: "6819ca75-acc5-4a08-93d0-c58685908257"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 13:17:46 crc kubenswrapper[4735]: I1008 13:17:46.118766 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6819ca75-acc5-4a08-93d0-c58685908257-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6819ca75-acc5-4a08-93d0-c58685908257" (UID: "6819ca75-acc5-4a08-93d0-c58685908257"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 13:17:46 crc kubenswrapper[4735]: I1008 13:17:46.134178 4735 scope.go:117] "RemoveContainer" containerID="fd4d662762db3c9dc648b8dfa89bbc23539407d1c35bc2f4b1809028f3aeca4e"
Oct 08 13:17:46 crc kubenswrapper[4735]: I1008 13:17:46.134911 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6819ca75-acc5-4a08-93d0-c58685908257-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6819ca75-acc5-4a08-93d0-c58685908257" (UID: "6819ca75-acc5-4a08-93d0-c58685908257"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 13:17:46 crc kubenswrapper[4735]: E1008 13:17:46.135008 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd4d662762db3c9dc648b8dfa89bbc23539407d1c35bc2f4b1809028f3aeca4e\": container with ID starting with fd4d662762db3c9dc648b8dfa89bbc23539407d1c35bc2f4b1809028f3aeca4e not found: ID does not exist" containerID="fd4d662762db3c9dc648b8dfa89bbc23539407d1c35bc2f4b1809028f3aeca4e"
Oct 08 13:17:46 crc kubenswrapper[4735]: I1008 13:17:46.135053 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd4d662762db3c9dc648b8dfa89bbc23539407d1c35bc2f4b1809028f3aeca4e"} err="failed to get container status \"fd4d662762db3c9dc648b8dfa89bbc23539407d1c35bc2f4b1809028f3aeca4e\": rpc error: code = NotFound desc = could not find container \"fd4d662762db3c9dc648b8dfa89bbc23539407d1c35bc2f4b1809028f3aeca4e\": container with ID starting with fd4d662762db3c9dc648b8dfa89bbc23539407d1c35bc2f4b1809028f3aeca4e not found: ID does not exist"
Oct 08 13:17:46 crc kubenswrapper[4735]: I1008 13:17:46.135079 4735 scope.go:117] "RemoveContainer" containerID="10364ef3a94c2a9fd107b17397384560b69737827150d506ad75af5de4b8810f"
Oct 08 13:17:46 crc kubenswrapper[4735]: E1008 13:17:46.135633 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10364ef3a94c2a9fd107b17397384560b69737827150d506ad75af5de4b8810f\": container with ID starting with 10364ef3a94c2a9fd107b17397384560b69737827150d506ad75af5de4b8810f not found: ID does not exist" containerID="10364ef3a94c2a9fd107b17397384560b69737827150d506ad75af5de4b8810f"
Oct 08 13:17:46 crc kubenswrapper[4735]: I1008 13:17:46.135668 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10364ef3a94c2a9fd107b17397384560b69737827150d506ad75af5de4b8810f"} err="failed to get container status \"10364ef3a94c2a9fd107b17397384560b69737827150d506ad75af5de4b8810f\": rpc error: code = NotFound desc = could not find container \"10364ef3a94c2a9fd107b17397384560b69737827150d506ad75af5de4b8810f\": container with ID starting with 10364ef3a94c2a9fd107b17397384560b69737827150d506ad75af5de4b8810f not found: ID does not exist"
Oct 08 13:17:46 crc kubenswrapper[4735]: I1008 13:17:46.197562 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6819ca75-acc5-4a08-93d0-c58685908257-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 08 13:17:46 crc kubenswrapper[4735]: I1008 13:17:46.197592 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6819ca75-acc5-4a08-93d0-c58685908257-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 08 13:17:46 crc kubenswrapper[4735]: I1008 13:17:46.197603 4735 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6819ca75-acc5-4a08-93d0-c58685908257-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 08 13:17:46 crc kubenswrapper[4735]: I1008 13:17:46.197612 4735 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6819ca75-acc5-4a08-93d0-c58685908257-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Oct 08 13:17:46 crc kubenswrapper[4735]: I1008 13:17:46.299896 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-nhxn5"
Oct 08 13:17:46 crc kubenswrapper[4735]: I1008 13:17:46.344898 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84bd785c49-4rn4d"]
Oct 08 13:17:46 crc kubenswrapper[4735]: I1008 13:17:46.357141 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84bd785c49-4rn4d"]
Oct 08 13:17:46 crc kubenswrapper[4735]: I1008 13:17:46.400329 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee80dca5-9f7b-47ca-82e9-2c210a8c507d-scripts\") pod \"ee80dca5-9f7b-47ca-82e9-2c210a8c507d\" (UID: \"ee80dca5-9f7b-47ca-82e9-2c210a8c507d\") "
Oct 08 13:17:46 crc kubenswrapper[4735]: I1008 13:17:46.400383 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee80dca5-9f7b-47ca-82e9-2c210a8c507d-config-data\") pod \"ee80dca5-9f7b-47ca-82e9-2c210a8c507d\" (UID: \"ee80dca5-9f7b-47ca-82e9-2c210a8c507d\") "
Oct 08 13:17:46 crc kubenswrapper[4735]: I1008 13:17:46.400413 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee80dca5-9f7b-47ca-82e9-2c210a8c507d-combined-ca-bundle\") pod \"ee80dca5-9f7b-47ca-82e9-2c210a8c507d\" (UID: \"ee80dca5-9f7b-47ca-82e9-2c210a8c507d\") "
Oct 08 13:17:46 crc kubenswrapper[4735]: I1008 13:17:46.400545 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4nfkm\" (UniqueName: \"kubernetes.io/projected/ee80dca5-9f7b-47ca-82e9-2c210a8c507d-kube-api-access-4nfkm\") pod \"ee80dca5-9f7b-47ca-82e9-2c210a8c507d\" (UID: \"ee80dca5-9f7b-47ca-82e9-2c210a8c507d\") "
Oct 08 13:17:46 crc kubenswrapper[4735]: I1008 13:17:46.405227 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee80dca5-9f7b-47ca-82e9-2c210a8c507d-scripts" (OuterVolumeSpecName: "scripts") pod "ee80dca5-9f7b-47ca-82e9-2c210a8c507d" (UID: "ee80dca5-9f7b-47ca-82e9-2c210a8c507d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 13:17:46 crc kubenswrapper[4735]: I1008 13:17:46.405356 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee80dca5-9f7b-47ca-82e9-2c210a8c507d-kube-api-access-4nfkm" (OuterVolumeSpecName: "kube-api-access-4nfkm") pod "ee80dca5-9f7b-47ca-82e9-2c210a8c507d" (UID: "ee80dca5-9f7b-47ca-82e9-2c210a8c507d"). InnerVolumeSpecName "kube-api-access-4nfkm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 13:17:46 crc kubenswrapper[4735]: I1008 13:17:46.427229 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee80dca5-9f7b-47ca-82e9-2c210a8c507d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ee80dca5-9f7b-47ca-82e9-2c210a8c507d" (UID: "ee80dca5-9f7b-47ca-82e9-2c210a8c507d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 13:17:46 crc kubenswrapper[4735]: I1008 13:17:46.432282 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee80dca5-9f7b-47ca-82e9-2c210a8c507d-config-data" (OuterVolumeSpecName: "config-data") pod "ee80dca5-9f7b-47ca-82e9-2c210a8c507d" (UID: "ee80dca5-9f7b-47ca-82e9-2c210a8c507d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 13:17:46 crc kubenswrapper[4735]: I1008 13:17:46.502934 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee80dca5-9f7b-47ca-82e9-2c210a8c507d-scripts\") on node \"crc\" DevicePath \"\""
Oct 08 13:17:46 crc kubenswrapper[4735]: I1008 13:17:46.503285 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee80dca5-9f7b-47ca-82e9-2c210a8c507d-config-data\") on node \"crc\" DevicePath \"\""
Oct 08 13:17:46 crc kubenswrapper[4735]: I1008 13:17:46.503300 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee80dca5-9f7b-47ca-82e9-2c210a8c507d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 08 13:17:46 crc kubenswrapper[4735]: I1008 13:17:46.503314 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4nfkm\" (UniqueName: \"kubernetes.io/projected/ee80dca5-9f7b-47ca-82e9-2c210a8c507d-kube-api-access-4nfkm\") on node \"crc\" DevicePath \"\""
Oct 08 13:17:46 crc kubenswrapper[4735]: I1008 13:17:46.906214 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6819ca75-acc5-4a08-93d0-c58685908257" path="/var/lib/kubelet/pods/6819ca75-acc5-4a08-93d0-c58685908257/volumes"
Oct 08 13:17:47 crc kubenswrapper[4735]: I1008 13:17:47.024504 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-nhxn5"
Oct 08 13:17:47 crc kubenswrapper[4735]: I1008 13:17:47.025130 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-nhxn5" event={"ID":"ee80dca5-9f7b-47ca-82e9-2c210a8c507d","Type":"ContainerDied","Data":"b597f7baa744555ce0757603c3b23559f40332358543886911eaa017379bcc7d"}
Oct 08 13:17:47 crc kubenswrapper[4735]: I1008 13:17:47.025158 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b597f7baa744555ce0757603c3b23559f40332358543886911eaa017379bcc7d"
Oct 08 13:17:47 crc kubenswrapper[4735]: I1008 13:17:47.177145 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Oct 08 13:17:47 crc kubenswrapper[4735]: I1008 13:17:47.177418 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="74384674-a8ff-43a8-b776-b26ff882ee27" containerName="nova-api-log" containerID="cri-o://18456d1abeed57d1da086fc444c382fe9513cd188f5bf35d712e3aa0181484e0" gracePeriod=30
Oct 08 13:17:47 crc kubenswrapper[4735]: I1008 13:17:47.177896 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="74384674-a8ff-43a8-b776-b26ff882ee27" containerName="nova-api-api" containerID="cri-o://233ed0c3ce43ad7bb410dfceb620839af1ad100befe9f6caf2441f07c4c5e9f5" gracePeriod=30
Oct 08 13:17:47 crc kubenswrapper[4735]: I1008 13:17:47.187262 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 08 13:17:47 crc kubenswrapper[4735]: I1008 13:17:47.252268 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Oct 08 13:17:47 crc kubenswrapper[4735]: I1008 13:17:47.252995 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6" containerName="nova-metadata-log" containerID="cri-o://58385b9eca5b3273cbf0f780a46c3f27cf81120010f1b80c3965e5d8d2c0b173" gracePeriod=30
Oct 08 13:17:47 crc kubenswrapper[4735]: I1008 13:17:47.253066 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6" containerName="nova-metadata-metadata" containerID="cri-o://1854abee6edb5d8d474f3526f3fbde3a02ded75c3ba7c61a9317ed055b93f0b1" gracePeriod=30
Oct 08 13:17:47 crc kubenswrapper[4735]: I1008 13:17:47.552562 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-ccrws"
Oct 08 13:17:47 crc kubenswrapper[4735]: I1008 13:17:47.632611 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/826e70f1-e543-4a90-93ba-a671ee1767b2-config-data\") pod \"826e70f1-e543-4a90-93ba-a671ee1767b2\" (UID: \"826e70f1-e543-4a90-93ba-a671ee1767b2\") "
Oct 08 13:17:47 crc kubenswrapper[4735]: I1008 13:17:47.633155 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/826e70f1-e543-4a90-93ba-a671ee1767b2-combined-ca-bundle\") pod \"826e70f1-e543-4a90-93ba-a671ee1767b2\" (UID: \"826e70f1-e543-4a90-93ba-a671ee1767b2\") "
Oct 08 13:17:47 crc kubenswrapper[4735]: I1008 13:17:47.633208 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkbnr\" (UniqueName: \"kubernetes.io/projected/826e70f1-e543-4a90-93ba-a671ee1767b2-kube-api-access-jkbnr\") pod \"826e70f1-e543-4a90-93ba-a671ee1767b2\" (UID: \"826e70f1-e543-4a90-93ba-a671ee1767b2\") "
Oct 08 13:17:47 crc kubenswrapper[4735]: I1008 13:17:47.633314 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/826e70f1-e543-4a90-93ba-a671ee1767b2-scripts\") pod \"826e70f1-e543-4a90-93ba-a671ee1767b2\" (UID: \"826e70f1-e543-4a90-93ba-a671ee1767b2\") "
Oct 08 13:17:47 crc kubenswrapper[4735]: I1008 13:17:47.642760 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/826e70f1-e543-4a90-93ba-a671ee1767b2-kube-api-access-jkbnr" (OuterVolumeSpecName: "kube-api-access-jkbnr") pod "826e70f1-e543-4a90-93ba-a671ee1767b2" (UID: "826e70f1-e543-4a90-93ba-a671ee1767b2"). InnerVolumeSpecName "kube-api-access-jkbnr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 13:17:47 crc kubenswrapper[4735]: I1008 13:17:47.655732 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/826e70f1-e543-4a90-93ba-a671ee1767b2-scripts" (OuterVolumeSpecName: "scripts") pod "826e70f1-e543-4a90-93ba-a671ee1767b2" (UID: "826e70f1-e543-4a90-93ba-a671ee1767b2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 13:17:47 crc kubenswrapper[4735]: I1008 13:17:47.682309 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/826e70f1-e543-4a90-93ba-a671ee1767b2-config-data" (OuterVolumeSpecName: "config-data") pod "826e70f1-e543-4a90-93ba-a671ee1767b2" (UID: "826e70f1-e543-4a90-93ba-a671ee1767b2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 13:17:47 crc kubenswrapper[4735]: I1008 13:17:47.684955 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/826e70f1-e543-4a90-93ba-a671ee1767b2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "826e70f1-e543-4a90-93ba-a671ee1767b2" (UID: "826e70f1-e543-4a90-93ba-a671ee1767b2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 13:17:47 crc kubenswrapper[4735]: I1008 13:17:47.735331 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/826e70f1-e543-4a90-93ba-a671ee1767b2-config-data\") on node \"crc\" DevicePath \"\""
Oct 08 13:17:47 crc kubenswrapper[4735]: I1008 13:17:47.735370 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/826e70f1-e543-4a90-93ba-a671ee1767b2-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 08 13:17:47 crc kubenswrapper[4735]: I1008 13:17:47.735384 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkbnr\" (UniqueName: \"kubernetes.io/projected/826e70f1-e543-4a90-93ba-a671ee1767b2-kube-api-access-jkbnr\") on node \"crc\" DevicePath \"\""
Oct 08 13:17:47 crc kubenswrapper[4735]: I1008 13:17:47.735397 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/826e70f1-e543-4a90-93ba-a671ee1767b2-scripts\") on node \"crc\" DevicePath \"\""
Oct 08 13:17:47 crc kubenswrapper[4735]: I1008 13:17:47.815219 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 08 13:17:47 crc kubenswrapper[4735]: I1008 13:17:47.939190 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6-nova-metadata-tls-certs\") pod \"81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6\" (UID: \"81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6\") "
Oct 08 13:17:47 crc kubenswrapper[4735]: I1008 13:17:47.939321 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6-combined-ca-bundle\") pod \"81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6\" (UID: \"81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6\") "
Oct 08 13:17:47 crc kubenswrapper[4735]: I1008 13:17:47.939389 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6-config-data\") pod \"81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6\" (UID: \"81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6\") "
Oct 08 13:17:47 crc kubenswrapper[4735]: I1008 13:17:47.939438 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwqgk\" (UniqueName: \"kubernetes.io/projected/81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6-kube-api-access-wwqgk\") pod \"81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6\" (UID: \"81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6\") "
Oct 08 13:17:47 crc kubenswrapper[4735]: I1008 13:17:47.939477 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6-logs\") pod \"81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6\" (UID: \"81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6\") "
Oct 08 13:17:47 crc kubenswrapper[4735]: I1008 13:17:47.940147 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume
"kubernetes.io/empty-dir/81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6-logs" (OuterVolumeSpecName: "logs") pod "81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6" (UID: "81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:17:47 crc kubenswrapper[4735]: I1008 13:17:47.946302 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6-kube-api-access-wwqgk" (OuterVolumeSpecName: "kube-api-access-wwqgk") pod "81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6" (UID: "81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6"). InnerVolumeSpecName "kube-api-access-wwqgk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:17:47 crc kubenswrapper[4735]: I1008 13:17:47.967090 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6-config-data" (OuterVolumeSpecName: "config-data") pod "81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6" (UID: "81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:17:47 crc kubenswrapper[4735]: I1008 13:17:47.968654 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6" (UID: "81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.004980 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6" (UID: "81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6"). 
InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.041135 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.041171 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.041185 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwqgk\" (UniqueName: \"kubernetes.io/projected/81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6-kube-api-access-wwqgk\") on node \"crc\" DevicePath \"\"" Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.041198 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6-logs\") on node \"crc\" DevicePath \"\"" Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.041211 4735 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.041993 4735 generic.go:334] "Generic (PLEG): container finished" podID="74384674-a8ff-43a8-b776-b26ff882ee27" containerID="18456d1abeed57d1da086fc444c382fe9513cd188f5bf35d712e3aa0181484e0" exitCode=143 Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.042089 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"74384674-a8ff-43a8-b776-b26ff882ee27","Type":"ContainerDied","Data":"18456d1abeed57d1da086fc444c382fe9513cd188f5bf35d712e3aa0181484e0"} 
Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.045679 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-ccrws" event={"ID":"826e70f1-e543-4a90-93ba-a671ee1767b2","Type":"ContainerDied","Data":"d9c5b3045a6c066b5ac257f986506560f67609d7119a56f3d9d78136ce896353"} Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.045734 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9c5b3045a6c066b5ac257f986506560f67609d7119a56f3d9d78136ce896353" Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.045806 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-ccrws" Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.048025 4735 generic.go:334] "Generic (PLEG): container finished" podID="81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6" containerID="1854abee6edb5d8d474f3526f3fbde3a02ded75c3ba7c61a9317ed055b93f0b1" exitCode=0 Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.048055 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.048071 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6","Type":"ContainerDied","Data":"1854abee6edb5d8d474f3526f3fbde3a02ded75c3ba7c61a9317ed055b93f0b1"} Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.048098 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6","Type":"ContainerDied","Data":"58385b9eca5b3273cbf0f780a46c3f27cf81120010f1b80c3965e5d8d2c0b173"} Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.048117 4735 scope.go:117] "RemoveContainer" containerID="1854abee6edb5d8d474f3526f3fbde3a02ded75c3ba7c61a9317ed055b93f0b1" Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.048058 4735 generic.go:334] "Generic (PLEG): container finished" podID="81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6" containerID="58385b9eca5b3273cbf0f780a46c3f27cf81120010f1b80c3965e5d8d2c0b173" exitCode=143 Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.048336 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6","Type":"ContainerDied","Data":"5da538b9f570d9c384c25e6bf4e237793d459ff9faa0efcd56ebffc83b07cbfe"} Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.077652 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 08 13:17:48 crc kubenswrapper[4735]: E1008 13:17:48.081535 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6819ca75-acc5-4a08-93d0-c58685908257" containerName="init" Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.081575 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="6819ca75-acc5-4a08-93d0-c58685908257" containerName="init" Oct 08 13:17:48 crc kubenswrapper[4735]: E1008 13:17:48.081602 4735 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="826e70f1-e543-4a90-93ba-a671ee1767b2" containerName="nova-cell1-conductor-db-sync" Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.081611 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="826e70f1-e543-4a90-93ba-a671ee1767b2" containerName="nova-cell1-conductor-db-sync" Oct 08 13:17:48 crc kubenswrapper[4735]: E1008 13:17:48.081619 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6" containerName="nova-metadata-metadata" Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.081628 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6" containerName="nova-metadata-metadata" Oct 08 13:17:48 crc kubenswrapper[4735]: E1008 13:17:48.081642 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6819ca75-acc5-4a08-93d0-c58685908257" containerName="dnsmasq-dns" Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.081649 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="6819ca75-acc5-4a08-93d0-c58685908257" containerName="dnsmasq-dns" Oct 08 13:17:48 crc kubenswrapper[4735]: E1008 13:17:48.081685 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee80dca5-9f7b-47ca-82e9-2c210a8c507d" containerName="nova-manage" Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.081692 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee80dca5-9f7b-47ca-82e9-2c210a8c507d" containerName="nova-manage" Oct 08 13:17:48 crc kubenswrapper[4735]: E1008 13:17:48.081707 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6" containerName="nova-metadata-log" Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.081715 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6" containerName="nova-metadata-log" Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 
13:17:48.082059 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6" containerName="nova-metadata-log" Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.082078 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="826e70f1-e543-4a90-93ba-a671ee1767b2" containerName="nova-cell1-conductor-db-sync" Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.082097 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee80dca5-9f7b-47ca-82e9-2c210a8c507d" containerName="nova-manage" Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.082110 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="6819ca75-acc5-4a08-93d0-c58685908257" containerName="dnsmasq-dns" Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.082119 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6" containerName="nova-metadata-metadata" Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.083027 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.085040 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.096304 4735 scope.go:117] "RemoveContainer" containerID="58385b9eca5b3273cbf0f780a46c3f27cf81120010f1b80c3965e5d8d2c0b173" Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.101592 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.118440 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.130406 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.134702 4735 scope.go:117] "RemoveContainer" containerID="1854abee6edb5d8d474f3526f3fbde3a02ded75c3ba7c61a9317ed055b93f0b1" Oct 08 13:17:48 crc kubenswrapper[4735]: E1008 13:17:48.135215 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1854abee6edb5d8d474f3526f3fbde3a02ded75c3ba7c61a9317ed055b93f0b1\": container with ID starting with 1854abee6edb5d8d474f3526f3fbde3a02ded75c3ba7c61a9317ed055b93f0b1 not found: ID does not exist" containerID="1854abee6edb5d8d474f3526f3fbde3a02ded75c3ba7c61a9317ed055b93f0b1" Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.135267 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1854abee6edb5d8d474f3526f3fbde3a02ded75c3ba7c61a9317ed055b93f0b1"} err="failed to get container status \"1854abee6edb5d8d474f3526f3fbde3a02ded75c3ba7c61a9317ed055b93f0b1\": rpc error: code = NotFound desc = could not find container \"1854abee6edb5d8d474f3526f3fbde3a02ded75c3ba7c61a9317ed055b93f0b1\": container 
with ID starting with 1854abee6edb5d8d474f3526f3fbde3a02ded75c3ba7c61a9317ed055b93f0b1 not found: ID does not exist" Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.135302 4735 scope.go:117] "RemoveContainer" containerID="58385b9eca5b3273cbf0f780a46c3f27cf81120010f1b80c3965e5d8d2c0b173" Oct 08 13:17:48 crc kubenswrapper[4735]: E1008 13:17:48.137545 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58385b9eca5b3273cbf0f780a46c3f27cf81120010f1b80c3965e5d8d2c0b173\": container with ID starting with 58385b9eca5b3273cbf0f780a46c3f27cf81120010f1b80c3965e5d8d2c0b173 not found: ID does not exist" containerID="58385b9eca5b3273cbf0f780a46c3f27cf81120010f1b80c3965e5d8d2c0b173" Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.137588 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58385b9eca5b3273cbf0f780a46c3f27cf81120010f1b80c3965e5d8d2c0b173"} err="failed to get container status \"58385b9eca5b3273cbf0f780a46c3f27cf81120010f1b80c3965e5d8d2c0b173\": rpc error: code = NotFound desc = could not find container \"58385b9eca5b3273cbf0f780a46c3f27cf81120010f1b80c3965e5d8d2c0b173\": container with ID starting with 58385b9eca5b3273cbf0f780a46c3f27cf81120010f1b80c3965e5d8d2c0b173 not found: ID does not exist" Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.137615 4735 scope.go:117] "RemoveContainer" containerID="1854abee6edb5d8d474f3526f3fbde3a02ded75c3ba7c61a9317ed055b93f0b1" Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.138141 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1854abee6edb5d8d474f3526f3fbde3a02ded75c3ba7c61a9317ed055b93f0b1"} err="failed to get container status \"1854abee6edb5d8d474f3526f3fbde3a02ded75c3ba7c61a9317ed055b93f0b1\": rpc error: code = NotFound desc = could not find container \"1854abee6edb5d8d474f3526f3fbde3a02ded75c3ba7c61a9317ed055b93f0b1\": 
container with ID starting with 1854abee6edb5d8d474f3526f3fbde3a02ded75c3ba7c61a9317ed055b93f0b1 not found: ID does not exist" Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.138196 4735 scope.go:117] "RemoveContainer" containerID="58385b9eca5b3273cbf0f780a46c3f27cf81120010f1b80c3965e5d8d2c0b173" Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.139164 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58385b9eca5b3273cbf0f780a46c3f27cf81120010f1b80c3965e5d8d2c0b173"} err="failed to get container status \"58385b9eca5b3273cbf0f780a46c3f27cf81120010f1b80c3965e5d8d2c0b173\": rpc error: code = NotFound desc = could not find container \"58385b9eca5b3273cbf0f780a46c3f27cf81120010f1b80c3965e5d8d2c0b173\": container with ID starting with 58385b9eca5b3273cbf0f780a46c3f27cf81120010f1b80c3965e5d8d2c0b173 not found: ID does not exist" Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.143286 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.144922 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtvbf\" (UniqueName: \"kubernetes.io/projected/8953f639-e468-4f85-8093-6c29fa380d8a-kube-api-access-vtvbf\") pod \"nova-cell1-conductor-0\" (UID: \"8953f639-e468-4f85-8093-6c29fa380d8a\") " pod="openstack/nova-cell1-conductor-0" Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.145061 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8953f639-e468-4f85-8093-6c29fa380d8a-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"8953f639-e468-4f85-8093-6c29fa380d8a\") " pod="openstack/nova-cell1-conductor-0" Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.145266 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8953f639-e468-4f85-8093-6c29fa380d8a-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"8953f639-e468-4f85-8093-6c29fa380d8a\") " pod="openstack/nova-cell1-conductor-0" Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.145393 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.147836 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.152101 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.171074 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.247685 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtvbf\" (UniqueName: \"kubernetes.io/projected/8953f639-e468-4f85-8093-6c29fa380d8a-kube-api-access-vtvbf\") pod \"nova-cell1-conductor-0\" (UID: \"8953f639-e468-4f85-8093-6c29fa380d8a\") " pod="openstack/nova-cell1-conductor-0" Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.247753 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8953f639-e468-4f85-8093-6c29fa380d8a-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"8953f639-e468-4f85-8093-6c29fa380d8a\") " pod="openstack/nova-cell1-conductor-0" Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.247847 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzj76\" (UniqueName: \"kubernetes.io/projected/ee128310-e5f8-4197-9c56-caa724cd15fc-kube-api-access-rzj76\") pod \"nova-metadata-0\" (UID: 
\"ee128310-e5f8-4197-9c56-caa724cd15fc\") " pod="openstack/nova-metadata-0" Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.247932 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee128310-e5f8-4197-9c56-caa724cd15fc-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ee128310-e5f8-4197-9c56-caa724cd15fc\") " pod="openstack/nova-metadata-0" Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.247966 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8953f639-e468-4f85-8093-6c29fa380d8a-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"8953f639-e468-4f85-8093-6c29fa380d8a\") " pod="openstack/nova-cell1-conductor-0" Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.248013 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee128310-e5f8-4197-9c56-caa724cd15fc-logs\") pod \"nova-metadata-0\" (UID: \"ee128310-e5f8-4197-9c56-caa724cd15fc\") " pod="openstack/nova-metadata-0" Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.248068 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee128310-e5f8-4197-9c56-caa724cd15fc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ee128310-e5f8-4197-9c56-caa724cd15fc\") " pod="openstack/nova-metadata-0" Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.248134 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee128310-e5f8-4197-9c56-caa724cd15fc-config-data\") pod \"nova-metadata-0\" (UID: \"ee128310-e5f8-4197-9c56-caa724cd15fc\") " pod="openstack/nova-metadata-0" Oct 08 13:17:48 crc 
kubenswrapper[4735]: I1008 13:17:48.252630 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8953f639-e468-4f85-8093-6c29fa380d8a-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"8953f639-e468-4f85-8093-6c29fa380d8a\") " pod="openstack/nova-cell1-conductor-0" Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.252639 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8953f639-e468-4f85-8093-6c29fa380d8a-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"8953f639-e468-4f85-8093-6c29fa380d8a\") " pod="openstack/nova-cell1-conductor-0" Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.271511 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtvbf\" (UniqueName: \"kubernetes.io/projected/8953f639-e468-4f85-8093-6c29fa380d8a-kube-api-access-vtvbf\") pod \"nova-cell1-conductor-0\" (UID: \"8953f639-e468-4f85-8093-6c29fa380d8a\") " pod="openstack/nova-cell1-conductor-0" Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.350115 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzj76\" (UniqueName: \"kubernetes.io/projected/ee128310-e5f8-4197-9c56-caa724cd15fc-kube-api-access-rzj76\") pod \"nova-metadata-0\" (UID: \"ee128310-e5f8-4197-9c56-caa724cd15fc\") " pod="openstack/nova-metadata-0" Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.350252 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee128310-e5f8-4197-9c56-caa724cd15fc-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ee128310-e5f8-4197-9c56-caa724cd15fc\") " pod="openstack/nova-metadata-0" Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.350310 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee128310-e5f8-4197-9c56-caa724cd15fc-logs\") pod \"nova-metadata-0\" (UID: \"ee128310-e5f8-4197-9c56-caa724cd15fc\") " pod="openstack/nova-metadata-0" Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.350356 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee128310-e5f8-4197-9c56-caa724cd15fc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ee128310-e5f8-4197-9c56-caa724cd15fc\") " pod="openstack/nova-metadata-0" Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.350415 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee128310-e5f8-4197-9c56-caa724cd15fc-config-data\") pod \"nova-metadata-0\" (UID: \"ee128310-e5f8-4197-9c56-caa724cd15fc\") " pod="openstack/nova-metadata-0" Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.351184 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee128310-e5f8-4197-9c56-caa724cd15fc-logs\") pod \"nova-metadata-0\" (UID: \"ee128310-e5f8-4197-9c56-caa724cd15fc\") " pod="openstack/nova-metadata-0" Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.353954 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee128310-e5f8-4197-9c56-caa724cd15fc-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ee128310-e5f8-4197-9c56-caa724cd15fc\") " pod="openstack/nova-metadata-0" Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.354634 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee128310-e5f8-4197-9c56-caa724cd15fc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ee128310-e5f8-4197-9c56-caa724cd15fc\") " pod="openstack/nova-metadata-0" Oct 08 
13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.355089 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee128310-e5f8-4197-9c56-caa724cd15fc-config-data\") pod \"nova-metadata-0\" (UID: \"ee128310-e5f8-4197-9c56-caa724cd15fc\") " pod="openstack/nova-metadata-0" Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.368302 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzj76\" (UniqueName: \"kubernetes.io/projected/ee128310-e5f8-4197-9c56-caa724cd15fc-kube-api-access-rzj76\") pod \"nova-metadata-0\" (UID: \"ee128310-e5f8-4197-9c56-caa724cd15fc\") " pod="openstack/nova-metadata-0" Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.405654 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.489106 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.869411 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.911133 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6" path="/var/lib/kubelet/pods/81cfed7f-ac2b-4d3b-9f2d-4c5fa6f4e6f6/volumes" Oct 08 13:17:48 crc kubenswrapper[4735]: I1008 13:17:48.977554 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 13:17:48 crc kubenswrapper[4735]: W1008 13:17:48.978900 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee128310_e5f8_4197_9c56_caa724cd15fc.slice/crio-927fec1187ffc26cdbbf7cc3ae7df8bb9395ae0b402be62c7c3fc4d0a94b295e WatchSource:0}: Error finding container 
927fec1187ffc26cdbbf7cc3ae7df8bb9395ae0b402be62c7c3fc4d0a94b295e: Status 404 returned error can't find the container with id 927fec1187ffc26cdbbf7cc3ae7df8bb9395ae0b402be62c7c3fc4d0a94b295e
Oct 08 13:17:49 crc kubenswrapper[4735]: I1008 13:17:49.065284 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ee128310-e5f8-4197-9c56-caa724cd15fc","Type":"ContainerStarted","Data":"927fec1187ffc26cdbbf7cc3ae7df8bb9395ae0b402be62c7c3fc4d0a94b295e"}
Oct 08 13:17:49 crc kubenswrapper[4735]: I1008 13:17:49.066700 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"8953f639-e468-4f85-8093-6c29fa380d8a","Type":"ContainerStarted","Data":"93dba9ad72f36fc0015ef57446883bf6d77cfc2be679a3477a88c6d950bfd50f"}
Oct 08 13:17:49 crc kubenswrapper[4735]: I1008 13:17:49.068238 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="7010eae7-51d7-4f0c-a4ca-a4fbbbd33071" containerName="nova-scheduler-scheduler" containerID="cri-o://4b7aa31a055dd58a9b888228bc73ddeafe15438c07ca586508b087a1f3f12295" gracePeriod=30
Oct 08 13:17:50 crc kubenswrapper[4735]: I1008 13:17:50.078691 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ee128310-e5f8-4197-9c56-caa724cd15fc","Type":"ContainerStarted","Data":"1c20037054f16163cc833fef0b60054bd2fbb164d84491c280e121742f97f477"}
Oct 08 13:17:50 crc kubenswrapper[4735]: I1008 13:17:50.079002 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ee128310-e5f8-4197-9c56-caa724cd15fc","Type":"ContainerStarted","Data":"54a9922b05de266a075643e7f846beead00a52ac35249d6b83aad5904de910fd"}
Oct 08 13:17:50 crc kubenswrapper[4735]: I1008 13:17:50.082715 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"8953f639-e468-4f85-8093-6c29fa380d8a","Type":"ContainerStarted","Data":"fadbbd45ef112a85da8685a542a9f033c1f9da61e0a242aea8894a27eef0aecf"}
Oct 08 13:17:50 crc kubenswrapper[4735]: I1008 13:17:50.083271 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Oct 08 13:17:50 crc kubenswrapper[4735]: I1008 13:17:50.103453 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.1034331 podStartE2EDuration="2.1034331s" podCreationTimestamp="2025-10-08 13:17:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 13:17:50.098930993 +0000 UTC m=+1319.907235537" watchObservedRunningTime="2025-10-08 13:17:50.1034331 +0000 UTC m=+1319.911737644"
Oct 08 13:17:50 crc kubenswrapper[4735]: I1008 13:17:50.123663 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.123645743 podStartE2EDuration="2.123645743s" podCreationTimestamp="2025-10-08 13:17:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 13:17:50.120017959 +0000 UTC m=+1319.928322523" watchObservedRunningTime="2025-10-08 13:17:50.123645743 +0000 UTC m=+1319.931950287"
Oct 08 13:17:50 crc kubenswrapper[4735]: E1008 13:17:50.196282 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4b7aa31a055dd58a9b888228bc73ddeafe15438c07ca586508b087a1f3f12295" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Oct 08 13:17:50 crc kubenswrapper[4735]: E1008 13:17:50.198465 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4b7aa31a055dd58a9b888228bc73ddeafe15438c07ca586508b087a1f3f12295" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Oct 08 13:17:50 crc kubenswrapper[4735]: E1008 13:17:50.199760 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4b7aa31a055dd58a9b888228bc73ddeafe15438c07ca586508b087a1f3f12295" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Oct 08 13:17:50 crc kubenswrapper[4735]: E1008 13:17:50.199825 4735 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="7010eae7-51d7-4f0c-a4ca-a4fbbbd33071" containerName="nova-scheduler-scheduler"
Oct 08 13:17:51 crc kubenswrapper[4735]: I1008 13:17:51.977150 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 08 13:17:52 crc kubenswrapper[4735]: I1008 13:17:52.024324 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ddgz\" (UniqueName: \"kubernetes.io/projected/74384674-a8ff-43a8-b776-b26ff882ee27-kube-api-access-4ddgz\") pod \"74384674-a8ff-43a8-b776-b26ff882ee27\" (UID: \"74384674-a8ff-43a8-b776-b26ff882ee27\") "
Oct 08 13:17:52 crc kubenswrapper[4735]: I1008 13:17:52.024445 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74384674-a8ff-43a8-b776-b26ff882ee27-combined-ca-bundle\") pod \"74384674-a8ff-43a8-b776-b26ff882ee27\" (UID: \"74384674-a8ff-43a8-b776-b26ff882ee27\") "
Oct 08 13:17:52 crc kubenswrapper[4735]: I1008 13:17:52.024672 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74384674-a8ff-43a8-b776-b26ff882ee27-logs\") pod \"74384674-a8ff-43a8-b776-b26ff882ee27\" (UID: \"74384674-a8ff-43a8-b776-b26ff882ee27\") "
Oct 08 13:17:52 crc kubenswrapper[4735]: I1008 13:17:52.024733 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74384674-a8ff-43a8-b776-b26ff882ee27-config-data\") pod \"74384674-a8ff-43a8-b776-b26ff882ee27\" (UID: \"74384674-a8ff-43a8-b776-b26ff882ee27\") "
Oct 08 13:17:52 crc kubenswrapper[4735]: I1008 13:17:52.025393 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74384674-a8ff-43a8-b776-b26ff882ee27-logs" (OuterVolumeSpecName: "logs") pod "74384674-a8ff-43a8-b776-b26ff882ee27" (UID: "74384674-a8ff-43a8-b776-b26ff882ee27"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 08 13:17:52 crc kubenswrapper[4735]: I1008 13:17:52.029777 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74384674-a8ff-43a8-b776-b26ff882ee27-kube-api-access-4ddgz" (OuterVolumeSpecName: "kube-api-access-4ddgz") pod "74384674-a8ff-43a8-b776-b26ff882ee27" (UID: "74384674-a8ff-43a8-b776-b26ff882ee27"). InnerVolumeSpecName "kube-api-access-4ddgz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 13:17:52 crc kubenswrapper[4735]: I1008 13:17:52.053153 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74384674-a8ff-43a8-b776-b26ff882ee27-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "74384674-a8ff-43a8-b776-b26ff882ee27" (UID: "74384674-a8ff-43a8-b776-b26ff882ee27"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 13:17:52 crc kubenswrapper[4735]: I1008 13:17:52.055101 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74384674-a8ff-43a8-b776-b26ff882ee27-config-data" (OuterVolumeSpecName: "config-data") pod "74384674-a8ff-43a8-b776-b26ff882ee27" (UID: "74384674-a8ff-43a8-b776-b26ff882ee27"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 13:17:52 crc kubenswrapper[4735]: I1008 13:17:52.106693 4735 generic.go:334] "Generic (PLEG): container finished" podID="74384674-a8ff-43a8-b776-b26ff882ee27" containerID="233ed0c3ce43ad7bb410dfceb620839af1ad100befe9f6caf2441f07c4c5e9f5" exitCode=0
Oct 08 13:17:52 crc kubenswrapper[4735]: I1008 13:17:52.106803 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 08 13:17:52 crc kubenswrapper[4735]: I1008 13:17:52.107943 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"74384674-a8ff-43a8-b776-b26ff882ee27","Type":"ContainerDied","Data":"233ed0c3ce43ad7bb410dfceb620839af1ad100befe9f6caf2441f07c4c5e9f5"}
Oct 08 13:17:52 crc kubenswrapper[4735]: I1008 13:17:52.108012 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"74384674-a8ff-43a8-b776-b26ff882ee27","Type":"ContainerDied","Data":"5ec7000585e49f7667f9a841d830d90eec70f72b991f997565d83899251cd472"}
Oct 08 13:17:52 crc kubenswrapper[4735]: I1008 13:17:52.108034 4735 scope.go:117] "RemoveContainer" containerID="233ed0c3ce43ad7bb410dfceb620839af1ad100befe9f6caf2441f07c4c5e9f5"
Oct 08 13:17:52 crc kubenswrapper[4735]: I1008 13:17:52.112797 4735 generic.go:334] "Generic (PLEG): container finished" podID="7010eae7-51d7-4f0c-a4ca-a4fbbbd33071" containerID="4b7aa31a055dd58a9b888228bc73ddeafe15438c07ca586508b087a1f3f12295" exitCode=0
Oct 08 13:17:52 crc kubenswrapper[4735]: I1008 13:17:52.113064 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7010eae7-51d7-4f0c-a4ca-a4fbbbd33071","Type":"ContainerDied","Data":"4b7aa31a055dd58a9b888228bc73ddeafe15438c07ca586508b087a1f3f12295"}
Oct 08 13:17:52 crc kubenswrapper[4735]: I1008 13:17:52.127062 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74384674-a8ff-43a8-b776-b26ff882ee27-logs\") on node \"crc\" DevicePath \"\""
Oct 08 13:17:52 crc kubenswrapper[4735]: I1008 13:17:52.127117 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74384674-a8ff-43a8-b776-b26ff882ee27-config-data\") on node \"crc\" DevicePath \"\""
Oct 08 13:17:52 crc kubenswrapper[4735]: I1008 13:17:52.127157 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ddgz\" (UniqueName: \"kubernetes.io/projected/74384674-a8ff-43a8-b776-b26ff882ee27-kube-api-access-4ddgz\") on node \"crc\" DevicePath \"\""
Oct 08 13:17:52 crc kubenswrapper[4735]: I1008 13:17:52.127166 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74384674-a8ff-43a8-b776-b26ff882ee27-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 08 13:17:52 crc kubenswrapper[4735]: I1008 13:17:52.168470 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Oct 08 13:17:52 crc kubenswrapper[4735]: I1008 13:17:52.176353 4735 scope.go:117] "RemoveContainer" containerID="18456d1abeed57d1da086fc444c382fe9513cd188f5bf35d712e3aa0181484e0"
Oct 08 13:17:52 crc kubenswrapper[4735]: I1008 13:17:52.180993 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Oct 08 13:17:52 crc kubenswrapper[4735]: I1008 13:17:52.197712 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Oct 08 13:17:52 crc kubenswrapper[4735]: E1008 13:17:52.198814 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74384674-a8ff-43a8-b776-b26ff882ee27" containerName="nova-api-api"
Oct 08 13:17:52 crc kubenswrapper[4735]: I1008 13:17:52.198832 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="74384674-a8ff-43a8-b776-b26ff882ee27" containerName="nova-api-api"
Oct 08 13:17:52 crc kubenswrapper[4735]: E1008 13:17:52.198894 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74384674-a8ff-43a8-b776-b26ff882ee27" containerName="nova-api-log"
Oct 08 13:17:52 crc kubenswrapper[4735]: I1008 13:17:52.198903 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="74384674-a8ff-43a8-b776-b26ff882ee27" containerName="nova-api-log"
Oct 08 13:17:52 crc kubenswrapper[4735]: I1008 13:17:52.199152 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="74384674-a8ff-43a8-b776-b26ff882ee27" containerName="nova-api-api"
Oct 08 13:17:52 crc kubenswrapper[4735]: I1008 13:17:52.199175 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="74384674-a8ff-43a8-b776-b26ff882ee27" containerName="nova-api-log"
Oct 08 13:17:52 crc kubenswrapper[4735]: I1008 13:17:52.201599 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 08 13:17:52 crc kubenswrapper[4735]: I1008 13:17:52.205233 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Oct 08 13:17:52 crc kubenswrapper[4735]: I1008 13:17:52.209007 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Oct 08 13:17:52 crc kubenswrapper[4735]: I1008 13:17:52.228535 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-784cr\" (UniqueName: \"kubernetes.io/projected/a7b8c256-386b-4a72-aff0-09419fa4164c-kube-api-access-784cr\") pod \"nova-api-0\" (UID: \"a7b8c256-386b-4a72-aff0-09419fa4164c\") " pod="openstack/nova-api-0"
Oct 08 13:17:52 crc kubenswrapper[4735]: I1008 13:17:52.228630 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7b8c256-386b-4a72-aff0-09419fa4164c-config-data\") pod \"nova-api-0\" (UID: \"a7b8c256-386b-4a72-aff0-09419fa4164c\") " pod="openstack/nova-api-0"
Oct 08 13:17:52 crc kubenswrapper[4735]: I1008 13:17:52.228855 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7b8c256-386b-4a72-aff0-09419fa4164c-logs\") pod \"nova-api-0\" (UID: \"a7b8c256-386b-4a72-aff0-09419fa4164c\") " pod="openstack/nova-api-0"
Oct 08 13:17:52 crc kubenswrapper[4735]: I1008 13:17:52.228959 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7b8c256-386b-4a72-aff0-09419fa4164c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a7b8c256-386b-4a72-aff0-09419fa4164c\") " pod="openstack/nova-api-0"
Oct 08 13:17:52 crc kubenswrapper[4735]: I1008 13:17:52.238368 4735 scope.go:117] "RemoveContainer" containerID="233ed0c3ce43ad7bb410dfceb620839af1ad100befe9f6caf2441f07c4c5e9f5"
Oct 08 13:17:52 crc kubenswrapper[4735]: E1008 13:17:52.238765 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"233ed0c3ce43ad7bb410dfceb620839af1ad100befe9f6caf2441f07c4c5e9f5\": container with ID starting with 233ed0c3ce43ad7bb410dfceb620839af1ad100befe9f6caf2441f07c4c5e9f5 not found: ID does not exist" containerID="233ed0c3ce43ad7bb410dfceb620839af1ad100befe9f6caf2441f07c4c5e9f5"
Oct 08 13:17:52 crc kubenswrapper[4735]: I1008 13:17:52.238795 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"233ed0c3ce43ad7bb410dfceb620839af1ad100befe9f6caf2441f07c4c5e9f5"} err="failed to get container status \"233ed0c3ce43ad7bb410dfceb620839af1ad100befe9f6caf2441f07c4c5e9f5\": rpc error: code = NotFound desc = could not find container \"233ed0c3ce43ad7bb410dfceb620839af1ad100befe9f6caf2441f07c4c5e9f5\": container with ID starting with 233ed0c3ce43ad7bb410dfceb620839af1ad100befe9f6caf2441f07c4c5e9f5 not found: ID does not exist"
Oct 08 13:17:52 crc kubenswrapper[4735]: I1008 13:17:52.238820 4735 scope.go:117] "RemoveContainer" containerID="18456d1abeed57d1da086fc444c382fe9513cd188f5bf35d712e3aa0181484e0"
Oct 08 13:17:52 crc kubenswrapper[4735]: E1008 13:17:52.239018 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18456d1abeed57d1da086fc444c382fe9513cd188f5bf35d712e3aa0181484e0\": container with ID starting with 18456d1abeed57d1da086fc444c382fe9513cd188f5bf35d712e3aa0181484e0 not found: ID does not exist" containerID="18456d1abeed57d1da086fc444c382fe9513cd188f5bf35d712e3aa0181484e0"
Oct 08 13:17:52 crc kubenswrapper[4735]: I1008 13:17:52.239048 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18456d1abeed57d1da086fc444c382fe9513cd188f5bf35d712e3aa0181484e0"} err="failed to get container status \"18456d1abeed57d1da086fc444c382fe9513cd188f5bf35d712e3aa0181484e0\": rpc error: code = NotFound desc = could not find container \"18456d1abeed57d1da086fc444c382fe9513cd188f5bf35d712e3aa0181484e0\": container with ID starting with 18456d1abeed57d1da086fc444c382fe9513cd188f5bf35d712e3aa0181484e0 not found: ID does not exist"
Oct 08 13:17:52 crc kubenswrapper[4735]: I1008 13:17:52.330935 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-784cr\" (UniqueName: \"kubernetes.io/projected/a7b8c256-386b-4a72-aff0-09419fa4164c-kube-api-access-784cr\") pod \"nova-api-0\" (UID: \"a7b8c256-386b-4a72-aff0-09419fa4164c\") " pod="openstack/nova-api-0"
Oct 08 13:17:52 crc kubenswrapper[4735]: I1008 13:17:52.330988 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7b8c256-386b-4a72-aff0-09419fa4164c-config-data\") pod \"nova-api-0\" (UID: \"a7b8c256-386b-4a72-aff0-09419fa4164c\") " pod="openstack/nova-api-0"
Oct 08 13:17:52 crc kubenswrapper[4735]: I1008 13:17:52.331082 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7b8c256-386b-4a72-aff0-09419fa4164c-logs\") pod \"nova-api-0\" (UID: \"a7b8c256-386b-4a72-aff0-09419fa4164c\") " pod="openstack/nova-api-0"
Oct 08 13:17:52 crc kubenswrapper[4735]: I1008 13:17:52.331119 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7b8c256-386b-4a72-aff0-09419fa4164c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a7b8c256-386b-4a72-aff0-09419fa4164c\") " pod="openstack/nova-api-0"
Oct 08 13:17:52 crc kubenswrapper[4735]: I1008 13:17:52.333011 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7b8c256-386b-4a72-aff0-09419fa4164c-logs\") pod \"nova-api-0\" (UID: \"a7b8c256-386b-4a72-aff0-09419fa4164c\") " pod="openstack/nova-api-0"
Oct 08 13:17:52 crc kubenswrapper[4735]: I1008 13:17:52.336715 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7b8c256-386b-4a72-aff0-09419fa4164c-config-data\") pod \"nova-api-0\" (UID: \"a7b8c256-386b-4a72-aff0-09419fa4164c\") " pod="openstack/nova-api-0"
Oct 08 13:17:52 crc kubenswrapper[4735]: I1008 13:17:52.337080 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7b8c256-386b-4a72-aff0-09419fa4164c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a7b8c256-386b-4a72-aff0-09419fa4164c\") " pod="openstack/nova-api-0"
Oct 08 13:17:52 crc kubenswrapper[4735]: I1008 13:17:52.350706 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-784cr\" (UniqueName: \"kubernetes.io/projected/a7b8c256-386b-4a72-aff0-09419fa4164c-kube-api-access-784cr\") pod \"nova-api-0\" (UID: \"a7b8c256-386b-4a72-aff0-09419fa4164c\") " pod="openstack/nova-api-0"
Oct 08 13:17:52 crc kubenswrapper[4735]: I1008 13:17:52.372008 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Oct 08 13:17:52 crc kubenswrapper[4735]: I1008 13:17:52.432793 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6v7v\" (UniqueName: \"kubernetes.io/projected/7010eae7-51d7-4f0c-a4ca-a4fbbbd33071-kube-api-access-n6v7v\") pod \"7010eae7-51d7-4f0c-a4ca-a4fbbbd33071\" (UID: \"7010eae7-51d7-4f0c-a4ca-a4fbbbd33071\") "
Oct 08 13:17:52 crc kubenswrapper[4735]: I1008 13:17:52.432879 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7010eae7-51d7-4f0c-a4ca-a4fbbbd33071-combined-ca-bundle\") pod \"7010eae7-51d7-4f0c-a4ca-a4fbbbd33071\" (UID: \"7010eae7-51d7-4f0c-a4ca-a4fbbbd33071\") "
Oct 08 13:17:52 crc kubenswrapper[4735]: I1008 13:17:52.432919 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7010eae7-51d7-4f0c-a4ca-a4fbbbd33071-config-data\") pod \"7010eae7-51d7-4f0c-a4ca-a4fbbbd33071\" (UID: \"7010eae7-51d7-4f0c-a4ca-a4fbbbd33071\") "
Oct 08 13:17:52 crc kubenswrapper[4735]: I1008 13:17:52.436787 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7010eae7-51d7-4f0c-a4ca-a4fbbbd33071-kube-api-access-n6v7v" (OuterVolumeSpecName: "kube-api-access-n6v7v") pod "7010eae7-51d7-4f0c-a4ca-a4fbbbd33071" (UID: "7010eae7-51d7-4f0c-a4ca-a4fbbbd33071"). InnerVolumeSpecName "kube-api-access-n6v7v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 13:17:52 crc kubenswrapper[4735]: I1008 13:17:52.459695 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7010eae7-51d7-4f0c-a4ca-a4fbbbd33071-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7010eae7-51d7-4f0c-a4ca-a4fbbbd33071" (UID: "7010eae7-51d7-4f0c-a4ca-a4fbbbd33071"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 13:17:52 crc kubenswrapper[4735]: I1008 13:17:52.461171 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7010eae7-51d7-4f0c-a4ca-a4fbbbd33071-config-data" (OuterVolumeSpecName: "config-data") pod "7010eae7-51d7-4f0c-a4ca-a4fbbbd33071" (UID: "7010eae7-51d7-4f0c-a4ca-a4fbbbd33071"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 13:17:52 crc kubenswrapper[4735]: I1008 13:17:52.524978 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 08 13:17:52 crc kubenswrapper[4735]: I1008 13:17:52.534849 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6v7v\" (UniqueName: \"kubernetes.io/projected/7010eae7-51d7-4f0c-a4ca-a4fbbbd33071-kube-api-access-n6v7v\") on node \"crc\" DevicePath \"\""
Oct 08 13:17:52 crc kubenswrapper[4735]: I1008 13:17:52.535093 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7010eae7-51d7-4f0c-a4ca-a4fbbbd33071-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 08 13:17:52 crc kubenswrapper[4735]: I1008 13:17:52.535186 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7010eae7-51d7-4f0c-a4ca-a4fbbbd33071-config-data\") on node \"crc\" DevicePath \"\""
Oct 08 13:17:52 crc kubenswrapper[4735]: I1008 13:17:52.904134 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74384674-a8ff-43a8-b776-b26ff882ee27" path="/var/lib/kubelet/pods/74384674-a8ff-43a8-b776-b26ff882ee27/volumes"
Oct 08 13:17:52 crc kubenswrapper[4735]: I1008 13:17:52.971941 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Oct 08 13:17:52 crc kubenswrapper[4735]: W1008 13:17:52.977665 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7b8c256_386b_4a72_aff0_09419fa4164c.slice/crio-bc7d52229442d3f81ff9a7a5648aaa0e92ecee95f4fd0f807a5586b9333ef8fe WatchSource:0}: Error finding container bc7d52229442d3f81ff9a7a5648aaa0e92ecee95f4fd0f807a5586b9333ef8fe: Status 404 returned error can't find the container with id bc7d52229442d3f81ff9a7a5648aaa0e92ecee95f4fd0f807a5586b9333ef8fe
Oct 08 13:17:53 crc kubenswrapper[4735]: I1008 13:17:53.127427 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7010eae7-51d7-4f0c-a4ca-a4fbbbd33071","Type":"ContainerDied","Data":"4d2745920da8691f70efbd96e889a722dd8c29e609e5d94ccfcc2ed658e72255"}
Oct 08 13:17:53 crc kubenswrapper[4735]: I1008 13:17:53.127738 4735 scope.go:117] "RemoveContainer" containerID="4b7aa31a055dd58a9b888228bc73ddeafe15438c07ca586508b087a1f3f12295"
Oct 08 13:17:53 crc kubenswrapper[4735]: I1008 13:17:53.127481 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Oct 08 13:17:53 crc kubenswrapper[4735]: I1008 13:17:53.129374 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a7b8c256-386b-4a72-aff0-09419fa4164c","Type":"ContainerStarted","Data":"bc7d52229442d3f81ff9a7a5648aaa0e92ecee95f4fd0f807a5586b9333ef8fe"}
Oct 08 13:17:53 crc kubenswrapper[4735]: I1008 13:17:53.152892 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 08 13:17:53 crc kubenswrapper[4735]: I1008 13:17:53.172325 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 08 13:17:53 crc kubenswrapper[4735]: I1008 13:17:53.185985 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Oct 08 13:17:53 crc kubenswrapper[4735]: E1008 13:17:53.186533 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7010eae7-51d7-4f0c-a4ca-a4fbbbd33071" containerName="nova-scheduler-scheduler"
Oct 08 13:17:53 crc kubenswrapper[4735]: I1008 13:17:53.186556 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="7010eae7-51d7-4f0c-a4ca-a4fbbbd33071" containerName="nova-scheduler-scheduler"
Oct 08 13:17:53 crc kubenswrapper[4735]: I1008 13:17:53.186772 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="7010eae7-51d7-4f0c-a4ca-a4fbbbd33071" containerName="nova-scheduler-scheduler"
Oct 08 13:17:53 crc kubenswrapper[4735]: I1008 13:17:53.187502 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Oct 08 13:17:53 crc kubenswrapper[4735]: I1008 13:17:53.193334 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Oct 08 13:17:53 crc kubenswrapper[4735]: I1008 13:17:53.194251 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 08 13:17:53 crc kubenswrapper[4735]: I1008 13:17:53.253879 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee510ab0-ef69-47d0-9b5b-6f8ae4b93601-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ee510ab0-ef69-47d0-9b5b-6f8ae4b93601\") " pod="openstack/nova-scheduler-0"
Oct 08 13:17:53 crc kubenswrapper[4735]: I1008 13:17:53.253978 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee510ab0-ef69-47d0-9b5b-6f8ae4b93601-config-data\") pod \"nova-scheduler-0\" (UID: \"ee510ab0-ef69-47d0-9b5b-6f8ae4b93601\") " pod="openstack/nova-scheduler-0"
Oct 08 13:17:53 crc kubenswrapper[4735]: I1008 13:17:53.254079 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjczd\" (UniqueName: \"kubernetes.io/projected/ee510ab0-ef69-47d0-9b5b-6f8ae4b93601-kube-api-access-qjczd\") pod \"nova-scheduler-0\" (UID: \"ee510ab0-ef69-47d0-9b5b-6f8ae4b93601\") " pod="openstack/nova-scheduler-0"
Oct 08 13:17:53 crc kubenswrapper[4735]: I1008 13:17:53.356250 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjczd\" (UniqueName: \"kubernetes.io/projected/ee510ab0-ef69-47d0-9b5b-6f8ae4b93601-kube-api-access-qjczd\") pod \"nova-scheduler-0\" (UID: \"ee510ab0-ef69-47d0-9b5b-6f8ae4b93601\") " pod="openstack/nova-scheduler-0"
Oct 08 13:17:53 crc kubenswrapper[4735]: I1008 13:17:53.356364 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee510ab0-ef69-47d0-9b5b-6f8ae4b93601-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ee510ab0-ef69-47d0-9b5b-6f8ae4b93601\") " pod="openstack/nova-scheduler-0"
Oct 08 13:17:53 crc kubenswrapper[4735]: I1008 13:17:53.356421 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee510ab0-ef69-47d0-9b5b-6f8ae4b93601-config-data\") pod \"nova-scheduler-0\" (UID: \"ee510ab0-ef69-47d0-9b5b-6f8ae4b93601\") " pod="openstack/nova-scheduler-0"
Oct 08 13:17:53 crc kubenswrapper[4735]: I1008 13:17:53.360017 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee510ab0-ef69-47d0-9b5b-6f8ae4b93601-config-data\") pod \"nova-scheduler-0\" (UID: \"ee510ab0-ef69-47d0-9b5b-6f8ae4b93601\") " pod="openstack/nova-scheduler-0"
Oct 08 13:17:53 crc kubenswrapper[4735]: I1008 13:17:53.360443 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee510ab0-ef69-47d0-9b5b-6f8ae4b93601-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ee510ab0-ef69-47d0-9b5b-6f8ae4b93601\") " pod="openstack/nova-scheduler-0"
Oct 08 13:17:53 crc kubenswrapper[4735]: I1008 13:17:53.372994 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjczd\" (UniqueName: \"kubernetes.io/projected/ee510ab0-ef69-47d0-9b5b-6f8ae4b93601-kube-api-access-qjczd\") pod \"nova-scheduler-0\" (UID: \"ee510ab0-ef69-47d0-9b5b-6f8ae4b93601\") " pod="openstack/nova-scheduler-0"
Oct 08 13:17:53 crc kubenswrapper[4735]: I1008 13:17:53.490052 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Oct 08 13:17:53 crc kubenswrapper[4735]: I1008 13:17:53.490103 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Oct 08 13:17:53 crc kubenswrapper[4735]: I1008 13:17:53.512591 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Oct 08 13:17:53 crc kubenswrapper[4735]: I1008 13:17:53.961210 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 08 13:17:54 crc kubenswrapper[4735]: I1008 13:17:54.139720 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ee510ab0-ef69-47d0-9b5b-6f8ae4b93601","Type":"ContainerStarted","Data":"863833f1f5aa88fa5ed67eb59c2783f052cf0f1af0a180c347006bbfa2246ce5"}
Oct 08 13:17:54 crc kubenswrapper[4735]: I1008 13:17:54.141735 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a7b8c256-386b-4a72-aff0-09419fa4164c","Type":"ContainerStarted","Data":"a716b1f09502b2caaaeaff8020cc838aa020c5e3e5708cb1585f2bd030595d2d"}
Oct 08 13:17:54 crc kubenswrapper[4735]: I1008 13:17:54.141804 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a7b8c256-386b-4a72-aff0-09419fa4164c","Type":"ContainerStarted","Data":"ac2332e341745ddc4c518e37a7e99e74c529e81162a40586f73c374e55e84771"}
Oct 08 13:17:54 crc kubenswrapper[4735]: I1008 13:17:54.158100 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.15808366 podStartE2EDuration="2.15808366s" podCreationTimestamp="2025-10-08 13:17:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 13:17:54.155837601 +0000 UTC m=+1323.964142145" watchObservedRunningTime="2025-10-08 13:17:54.15808366 +0000 UTC m=+1323.966388204"
Oct 08 13:17:54 crc kubenswrapper[4735]: I1008 13:17:54.267376 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Oct 08 13:17:54 crc kubenswrapper[4735]: I1008 13:17:54.907879 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7010eae7-51d7-4f0c-a4ca-a4fbbbd33071" path="/var/lib/kubelet/pods/7010eae7-51d7-4f0c-a4ca-a4fbbbd33071/volumes"
Oct 08 13:17:55 crc kubenswrapper[4735]: I1008 13:17:55.155035 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ee510ab0-ef69-47d0-9b5b-6f8ae4b93601","Type":"ContainerStarted","Data":"86ca0a7bab642bbd7a78023f6fc0a6dbf5894fee35b063fa337c0d9e9e762d24"}
Oct 08 13:17:55 crc kubenswrapper[4735]: I1008 13:17:55.174579 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.174557589 podStartE2EDuration="2.174557589s" podCreationTimestamp="2025-10-08 13:17:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 13:17:55.168865222 +0000 UTC m=+1324.977169776" watchObservedRunningTime="2025-10-08 13:17:55.174557589 +0000 UTC m=+1324.982862133"
Oct 08 13:17:58 crc kubenswrapper[4735]: I1008 13:17:58.444906 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0"
Oct 08 13:17:58 crc kubenswrapper[4735]: I1008 13:17:58.490138 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Oct 08 13:17:58 crc kubenswrapper[4735]: I1008 13:17:58.490197 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Oct 08 13:17:58 crc kubenswrapper[4735]: I1008 13:17:58.513873 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Oct 08 13:17:59 crc kubenswrapper[4735]: I1008 13:17:59.508803 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ee128310-e5f8-4197-9c56-caa724cd15fc" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.192:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Oct 08 13:17:59 crc kubenswrapper[4735]: I1008 13:17:59.508803 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ee128310-e5f8-4197-9c56-caa724cd15fc" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.192:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Oct 08 13:18:02 crc kubenswrapper[4735]: I1008 13:18:02.526701 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Oct 08 13:18:02 crc kubenswrapper[4735]: I1008 13:18:02.527076 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Oct 08 13:18:03 crc kubenswrapper[4735]: I1008 13:18:03.513782 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Oct 08 13:18:03 crc kubenswrapper[4735]: I1008 13:18:03.548060 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Oct 08 13:18:03 crc kubenswrapper[4735]: I1008 13:18:03.608693 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a7b8c256-386b-4a72-aff0-09419fa4164c" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.193:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Oct 08 13:18:03 crc kubenswrapper[4735]: I1008 13:18:03.608699 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a7b8c256-386b-4a72-aff0-09419fa4164c" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.193:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Oct 08 13:18:04 crc kubenswrapper[4735]: I1008 13:18:04.275823 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Oct 08 13:18:08 crc kubenswrapper[4735]: I1008 13:18:08.495905 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Oct 08 13:18:08 crc kubenswrapper[4735]: I1008 13:18:08.496419 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Oct 08 13:18:08 crc kubenswrapper[4735]: I1008 13:18:08.501347 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Oct 08 13:18:08 crc kubenswrapper[4735]: I1008 13:18:08.504425 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Oct 08 13:18:11 crc kubenswrapper[4735]: I1008 13:18:11.309500 4735 generic.go:334] "Generic (PLEG): container finished" podID="2861365a-b33e-4cf2-b2db-eb3e43c55535" containerID="f37619629254693188af3bf21fb4741a55f07777900aae9d0cdad72ae36cb38d" exitCode=137
Oct 08 13:18:11 crc kubenswrapper[4735]: I1008 13:18:11.309909 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2861365a-b33e-4cf2-b2db-eb3e43c55535","Type":"ContainerDied","Data":"f37619629254693188af3bf21fb4741a55f07777900aae9d0cdad72ae36cb38d"}
Oct 08 13:18:11 crc kubenswrapper[4735]: I1008 13:18:11.309941 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2861365a-b33e-4cf2-b2db-eb3e43c55535","Type":"ContainerDied","Data":"9efef6c5858793fc76a4049ce3e9065d4a9178cdd0bbca1bf55dffe2c21046d5"}
Oct 08 13:18:11 crc kubenswrapper[4735]: I1008 13:18:11.309953 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9efef6c5858793fc76a4049ce3e9065d4a9178cdd0bbca1bf55dffe2c21046d5"
Oct 08 13:18:11 crc kubenswrapper[4735]: I1008 13:18:11.318549
4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 08 13:18:11 crc kubenswrapper[4735]: I1008 13:18:11.473012 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7964t\" (UniqueName: \"kubernetes.io/projected/2861365a-b33e-4cf2-b2db-eb3e43c55535-kube-api-access-7964t\") pod \"2861365a-b33e-4cf2-b2db-eb3e43c55535\" (UID: \"2861365a-b33e-4cf2-b2db-eb3e43c55535\") " Oct 08 13:18:11 crc kubenswrapper[4735]: I1008 13:18:11.473363 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2861365a-b33e-4cf2-b2db-eb3e43c55535-combined-ca-bundle\") pod \"2861365a-b33e-4cf2-b2db-eb3e43c55535\" (UID: \"2861365a-b33e-4cf2-b2db-eb3e43c55535\") " Oct 08 13:18:11 crc kubenswrapper[4735]: I1008 13:18:11.473393 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2861365a-b33e-4cf2-b2db-eb3e43c55535-config-data\") pod \"2861365a-b33e-4cf2-b2db-eb3e43c55535\" (UID: \"2861365a-b33e-4cf2-b2db-eb3e43c55535\") " Oct 08 13:18:11 crc kubenswrapper[4735]: I1008 13:18:11.478588 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2861365a-b33e-4cf2-b2db-eb3e43c55535-kube-api-access-7964t" (OuterVolumeSpecName: "kube-api-access-7964t") pod "2861365a-b33e-4cf2-b2db-eb3e43c55535" (UID: "2861365a-b33e-4cf2-b2db-eb3e43c55535"). InnerVolumeSpecName "kube-api-access-7964t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:18:11 crc kubenswrapper[4735]: I1008 13:18:11.505242 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2861365a-b33e-4cf2-b2db-eb3e43c55535-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2861365a-b33e-4cf2-b2db-eb3e43c55535" (UID: "2861365a-b33e-4cf2-b2db-eb3e43c55535"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:18:11 crc kubenswrapper[4735]: I1008 13:18:11.506292 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2861365a-b33e-4cf2-b2db-eb3e43c55535-config-data" (OuterVolumeSpecName: "config-data") pod "2861365a-b33e-4cf2-b2db-eb3e43c55535" (UID: "2861365a-b33e-4cf2-b2db-eb3e43c55535"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:18:11 crc kubenswrapper[4735]: I1008 13:18:11.575323 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7964t\" (UniqueName: \"kubernetes.io/projected/2861365a-b33e-4cf2-b2db-eb3e43c55535-kube-api-access-7964t\") on node \"crc\" DevicePath \"\"" Oct 08 13:18:11 crc kubenswrapper[4735]: I1008 13:18:11.575350 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2861365a-b33e-4cf2-b2db-eb3e43c55535-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 13:18:11 crc kubenswrapper[4735]: I1008 13:18:11.575358 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2861365a-b33e-4cf2-b2db-eb3e43c55535-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 13:18:12 crc kubenswrapper[4735]: I1008 13:18:12.320681 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 08 13:18:12 crc kubenswrapper[4735]: I1008 13:18:12.358724 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 13:18:12 crc kubenswrapper[4735]: I1008 13:18:12.370279 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 13:18:12 crc kubenswrapper[4735]: I1008 13:18:12.400941 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 13:18:12 crc kubenswrapper[4735]: E1008 13:18:12.401353 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2861365a-b33e-4cf2-b2db-eb3e43c55535" containerName="nova-cell1-novncproxy-novncproxy" Oct 08 13:18:12 crc kubenswrapper[4735]: I1008 13:18:12.401370 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="2861365a-b33e-4cf2-b2db-eb3e43c55535" containerName="nova-cell1-novncproxy-novncproxy" Oct 08 13:18:12 crc kubenswrapper[4735]: I1008 13:18:12.401673 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="2861365a-b33e-4cf2-b2db-eb3e43c55535" containerName="nova-cell1-novncproxy-novncproxy" Oct 08 13:18:12 crc kubenswrapper[4735]: I1008 13:18:12.402435 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 08 13:18:12 crc kubenswrapper[4735]: I1008 13:18:12.405427 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Oct 08 13:18:12 crc kubenswrapper[4735]: I1008 13:18:12.405590 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Oct 08 13:18:12 crc kubenswrapper[4735]: I1008 13:18:12.405442 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 08 13:18:12 crc kubenswrapper[4735]: I1008 13:18:12.418196 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 13:18:12 crc kubenswrapper[4735]: I1008 13:18:12.531320 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 08 13:18:12 crc kubenswrapper[4735]: I1008 13:18:12.532385 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 08 13:18:12 crc kubenswrapper[4735]: I1008 13:18:12.536190 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 08 13:18:12 crc kubenswrapper[4735]: I1008 13:18:12.536776 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 08 13:18:12 crc kubenswrapper[4735]: I1008 13:18:12.598545 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31465d06-bf56-4173-95c1-3c13d776b1f9-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"31465d06-bf56-4173-95c1-3c13d776b1f9\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 13:18:12 crc kubenswrapper[4735]: I1008 13:18:12.599249 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-jqrgf\" (UniqueName: \"kubernetes.io/projected/31465d06-bf56-4173-95c1-3c13d776b1f9-kube-api-access-jqrgf\") pod \"nova-cell1-novncproxy-0\" (UID: \"31465d06-bf56-4173-95c1-3c13d776b1f9\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 13:18:12 crc kubenswrapper[4735]: I1008 13:18:12.600077 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/31465d06-bf56-4173-95c1-3c13d776b1f9-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"31465d06-bf56-4173-95c1-3c13d776b1f9\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 13:18:12 crc kubenswrapper[4735]: I1008 13:18:12.600134 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/31465d06-bf56-4173-95c1-3c13d776b1f9-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"31465d06-bf56-4173-95c1-3c13d776b1f9\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 13:18:12 crc kubenswrapper[4735]: I1008 13:18:12.600174 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31465d06-bf56-4173-95c1-3c13d776b1f9-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"31465d06-bf56-4173-95c1-3c13d776b1f9\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 13:18:12 crc kubenswrapper[4735]: I1008 13:18:12.701224 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqrgf\" (UniqueName: \"kubernetes.io/projected/31465d06-bf56-4173-95c1-3c13d776b1f9-kube-api-access-jqrgf\") pod \"nova-cell1-novncproxy-0\" (UID: \"31465d06-bf56-4173-95c1-3c13d776b1f9\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 13:18:12 crc kubenswrapper[4735]: I1008 13:18:12.701376 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/31465d06-bf56-4173-95c1-3c13d776b1f9-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"31465d06-bf56-4173-95c1-3c13d776b1f9\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 13:18:12 crc kubenswrapper[4735]: I1008 13:18:12.701412 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/31465d06-bf56-4173-95c1-3c13d776b1f9-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"31465d06-bf56-4173-95c1-3c13d776b1f9\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 13:18:12 crc kubenswrapper[4735]: I1008 13:18:12.701440 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31465d06-bf56-4173-95c1-3c13d776b1f9-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"31465d06-bf56-4173-95c1-3c13d776b1f9\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 13:18:12 crc kubenswrapper[4735]: I1008 13:18:12.701504 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31465d06-bf56-4173-95c1-3c13d776b1f9-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"31465d06-bf56-4173-95c1-3c13d776b1f9\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 13:18:12 crc kubenswrapper[4735]: I1008 13:18:12.716015 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31465d06-bf56-4173-95c1-3c13d776b1f9-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"31465d06-bf56-4173-95c1-3c13d776b1f9\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 13:18:12 crc kubenswrapper[4735]: I1008 13:18:12.716060 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/31465d06-bf56-4173-95c1-3c13d776b1f9-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"31465d06-bf56-4173-95c1-3c13d776b1f9\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 13:18:12 crc kubenswrapper[4735]: I1008 13:18:12.716118 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/31465d06-bf56-4173-95c1-3c13d776b1f9-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"31465d06-bf56-4173-95c1-3c13d776b1f9\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 13:18:12 crc kubenswrapper[4735]: I1008 13:18:12.716390 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/31465d06-bf56-4173-95c1-3c13d776b1f9-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"31465d06-bf56-4173-95c1-3c13d776b1f9\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 13:18:12 crc kubenswrapper[4735]: I1008 13:18:12.720042 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqrgf\" (UniqueName: \"kubernetes.io/projected/31465d06-bf56-4173-95c1-3c13d776b1f9-kube-api-access-jqrgf\") pod \"nova-cell1-novncproxy-0\" (UID: \"31465d06-bf56-4173-95c1-3c13d776b1f9\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 13:18:12 crc kubenswrapper[4735]: I1008 13:18:12.725573 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 08 13:18:12 crc kubenswrapper[4735]: I1008 13:18:12.906662 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2861365a-b33e-4cf2-b2db-eb3e43c55535" path="/var/lib/kubelet/pods/2861365a-b33e-4cf2-b2db-eb3e43c55535/volumes" Oct 08 13:18:13 crc kubenswrapper[4735]: I1008 13:18:13.152494 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 13:18:13 crc kubenswrapper[4735]: I1008 13:18:13.332655 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"31465d06-bf56-4173-95c1-3c13d776b1f9","Type":"ContainerStarted","Data":"14e85f27e0b1cd9a858c207c3b1d54ff367ef60813c5473bf5978b40bd34aaf0"} Oct 08 13:18:13 crc kubenswrapper[4735]: I1008 13:18:13.333023 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 08 13:18:13 crc kubenswrapper[4735]: I1008 13:18:13.339780 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 08 13:18:13 crc kubenswrapper[4735]: I1008 13:18:13.500273 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d4d96bb9-7gpr6"] Oct 08 13:18:13 crc kubenswrapper[4735]: I1008 13:18:13.514647 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d4d96bb9-7gpr6" Oct 08 13:18:13 crc kubenswrapper[4735]: I1008 13:18:13.546298 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d4d96bb9-7gpr6"] Oct 08 13:18:13 crc kubenswrapper[4735]: I1008 13:18:13.616653 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f333daa-559f-421a-b0a9-fa1de98b72cf-dns-svc\") pod \"dnsmasq-dns-6d4d96bb9-7gpr6\" (UID: \"1f333daa-559f-421a-b0a9-fa1de98b72cf\") " pod="openstack/dnsmasq-dns-6d4d96bb9-7gpr6" Oct 08 13:18:13 crc kubenswrapper[4735]: I1008 13:18:13.616737 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f333daa-559f-421a-b0a9-fa1de98b72cf-ovsdbserver-nb\") pod \"dnsmasq-dns-6d4d96bb9-7gpr6\" (UID: \"1f333daa-559f-421a-b0a9-fa1de98b72cf\") " pod="openstack/dnsmasq-dns-6d4d96bb9-7gpr6" Oct 08 13:18:13 crc kubenswrapper[4735]: I1008 13:18:13.616840 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f333daa-559f-421a-b0a9-fa1de98b72cf-config\") pod \"dnsmasq-dns-6d4d96bb9-7gpr6\" (UID: \"1f333daa-559f-421a-b0a9-fa1de98b72cf\") " pod="openstack/dnsmasq-dns-6d4d96bb9-7gpr6" Oct 08 13:18:13 crc kubenswrapper[4735]: I1008 13:18:13.616888 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vf4b6\" (UniqueName: \"kubernetes.io/projected/1f333daa-559f-421a-b0a9-fa1de98b72cf-kube-api-access-vf4b6\") pod \"dnsmasq-dns-6d4d96bb9-7gpr6\" (UID: \"1f333daa-559f-421a-b0a9-fa1de98b72cf\") " pod="openstack/dnsmasq-dns-6d4d96bb9-7gpr6" Oct 08 13:18:13 crc kubenswrapper[4735]: I1008 13:18:13.617139 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f333daa-559f-421a-b0a9-fa1de98b72cf-ovsdbserver-sb\") pod \"dnsmasq-dns-6d4d96bb9-7gpr6\" (UID: \"1f333daa-559f-421a-b0a9-fa1de98b72cf\") " pod="openstack/dnsmasq-dns-6d4d96bb9-7gpr6" Oct 08 13:18:13 crc kubenswrapper[4735]: I1008 13:18:13.617235 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1f333daa-559f-421a-b0a9-fa1de98b72cf-dns-swift-storage-0\") pod \"dnsmasq-dns-6d4d96bb9-7gpr6\" (UID: \"1f333daa-559f-421a-b0a9-fa1de98b72cf\") " pod="openstack/dnsmasq-dns-6d4d96bb9-7gpr6" Oct 08 13:18:13 crc kubenswrapper[4735]: I1008 13:18:13.718854 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f333daa-559f-421a-b0a9-fa1de98b72cf-dns-svc\") pod \"dnsmasq-dns-6d4d96bb9-7gpr6\" (UID: \"1f333daa-559f-421a-b0a9-fa1de98b72cf\") " pod="openstack/dnsmasq-dns-6d4d96bb9-7gpr6" Oct 08 13:18:13 crc kubenswrapper[4735]: I1008 13:18:13.718903 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f333daa-559f-421a-b0a9-fa1de98b72cf-ovsdbserver-nb\") pod \"dnsmasq-dns-6d4d96bb9-7gpr6\" (UID: \"1f333daa-559f-421a-b0a9-fa1de98b72cf\") " pod="openstack/dnsmasq-dns-6d4d96bb9-7gpr6" Oct 08 13:18:13 crc kubenswrapper[4735]: I1008 13:18:13.718947 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f333daa-559f-421a-b0a9-fa1de98b72cf-config\") pod \"dnsmasq-dns-6d4d96bb9-7gpr6\" (UID: \"1f333daa-559f-421a-b0a9-fa1de98b72cf\") " pod="openstack/dnsmasq-dns-6d4d96bb9-7gpr6" Oct 08 13:18:13 crc kubenswrapper[4735]: I1008 13:18:13.718969 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vf4b6\" (UniqueName: 
\"kubernetes.io/projected/1f333daa-559f-421a-b0a9-fa1de98b72cf-kube-api-access-vf4b6\") pod \"dnsmasq-dns-6d4d96bb9-7gpr6\" (UID: \"1f333daa-559f-421a-b0a9-fa1de98b72cf\") " pod="openstack/dnsmasq-dns-6d4d96bb9-7gpr6" Oct 08 13:18:13 crc kubenswrapper[4735]: I1008 13:18:13.719048 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f333daa-559f-421a-b0a9-fa1de98b72cf-ovsdbserver-sb\") pod \"dnsmasq-dns-6d4d96bb9-7gpr6\" (UID: \"1f333daa-559f-421a-b0a9-fa1de98b72cf\") " pod="openstack/dnsmasq-dns-6d4d96bb9-7gpr6" Oct 08 13:18:13 crc kubenswrapper[4735]: I1008 13:18:13.719088 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1f333daa-559f-421a-b0a9-fa1de98b72cf-dns-swift-storage-0\") pod \"dnsmasq-dns-6d4d96bb9-7gpr6\" (UID: \"1f333daa-559f-421a-b0a9-fa1de98b72cf\") " pod="openstack/dnsmasq-dns-6d4d96bb9-7gpr6" Oct 08 13:18:13 crc kubenswrapper[4735]: I1008 13:18:13.720009 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1f333daa-559f-421a-b0a9-fa1de98b72cf-dns-swift-storage-0\") pod \"dnsmasq-dns-6d4d96bb9-7gpr6\" (UID: \"1f333daa-559f-421a-b0a9-fa1de98b72cf\") " pod="openstack/dnsmasq-dns-6d4d96bb9-7gpr6" Oct 08 13:18:13 crc kubenswrapper[4735]: I1008 13:18:13.720176 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f333daa-559f-421a-b0a9-fa1de98b72cf-ovsdbserver-nb\") pod \"dnsmasq-dns-6d4d96bb9-7gpr6\" (UID: \"1f333daa-559f-421a-b0a9-fa1de98b72cf\") " pod="openstack/dnsmasq-dns-6d4d96bb9-7gpr6" Oct 08 13:18:13 crc kubenswrapper[4735]: I1008 13:18:13.721232 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1f333daa-559f-421a-b0a9-fa1de98b72cf-config\") pod \"dnsmasq-dns-6d4d96bb9-7gpr6\" (UID: \"1f333daa-559f-421a-b0a9-fa1de98b72cf\") " pod="openstack/dnsmasq-dns-6d4d96bb9-7gpr6" Oct 08 13:18:13 crc kubenswrapper[4735]: I1008 13:18:13.721800 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f333daa-559f-421a-b0a9-fa1de98b72cf-ovsdbserver-sb\") pod \"dnsmasq-dns-6d4d96bb9-7gpr6\" (UID: \"1f333daa-559f-421a-b0a9-fa1de98b72cf\") " pod="openstack/dnsmasq-dns-6d4d96bb9-7gpr6" Oct 08 13:18:13 crc kubenswrapper[4735]: I1008 13:18:13.721832 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f333daa-559f-421a-b0a9-fa1de98b72cf-dns-svc\") pod \"dnsmasq-dns-6d4d96bb9-7gpr6\" (UID: \"1f333daa-559f-421a-b0a9-fa1de98b72cf\") " pod="openstack/dnsmasq-dns-6d4d96bb9-7gpr6" Oct 08 13:18:13 crc kubenswrapper[4735]: I1008 13:18:13.738089 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vf4b6\" (UniqueName: \"kubernetes.io/projected/1f333daa-559f-421a-b0a9-fa1de98b72cf-kube-api-access-vf4b6\") pod \"dnsmasq-dns-6d4d96bb9-7gpr6\" (UID: \"1f333daa-559f-421a-b0a9-fa1de98b72cf\") " pod="openstack/dnsmasq-dns-6d4d96bb9-7gpr6" Oct 08 13:18:13 crc kubenswrapper[4735]: I1008 13:18:13.852731 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d4d96bb9-7gpr6" Oct 08 13:18:14 crc kubenswrapper[4735]: I1008 13:18:14.328943 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d4d96bb9-7gpr6"] Oct 08 13:18:14 crc kubenswrapper[4735]: I1008 13:18:14.345228 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d4d96bb9-7gpr6" event={"ID":"1f333daa-559f-421a-b0a9-fa1de98b72cf","Type":"ContainerStarted","Data":"708a497a7fe6cec61076ba788b81bcb2755d35d47993f250f92891d45b78b2b9"} Oct 08 13:18:14 crc kubenswrapper[4735]: I1008 13:18:14.352084 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"31465d06-bf56-4173-95c1-3c13d776b1f9","Type":"ContainerStarted","Data":"e6bc8e996460d65d28494717119455e8c9a9619cda09617d493e57e6fb07bac3"} Oct 08 13:18:14 crc kubenswrapper[4735]: I1008 13:18:14.377381 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.377362163 podStartE2EDuration="2.377362163s" podCreationTimestamp="2025-10-08 13:18:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 13:18:14.376677325 +0000 UTC m=+1344.184981869" watchObservedRunningTime="2025-10-08 13:18:14.377362163 +0000 UTC m=+1344.185666707" Oct 08 13:18:15 crc kubenswrapper[4735]: I1008 13:18:15.400232 4735 generic.go:334] "Generic (PLEG): container finished" podID="1f333daa-559f-421a-b0a9-fa1de98b72cf" containerID="badbfb090802af15f374dcb0a7e9ec38f6812bb6da7f93af0cc1dcf2371d9a2c" exitCode=0 Oct 08 13:18:15 crc kubenswrapper[4735]: I1008 13:18:15.400338 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d4d96bb9-7gpr6" event={"ID":"1f333daa-559f-421a-b0a9-fa1de98b72cf","Type":"ContainerDied","Data":"badbfb090802af15f374dcb0a7e9ec38f6812bb6da7f93af0cc1dcf2371d9a2c"} Oct 08 
13:18:15 crc kubenswrapper[4735]: I1008 13:18:15.810967 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 13:18:15 crc kubenswrapper[4735]: I1008 13:18:15.811528 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f8b20f18-fb68-45d8-9b04-c0c4096c44a7" containerName="ceilometer-central-agent" containerID="cri-o://2d43f3b93ddd0f72a83e2fe39fafe892eb37bbe6b1005a8fe15e30558f6cb290" gracePeriod=30 Oct 08 13:18:15 crc kubenswrapper[4735]: I1008 13:18:15.811650 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f8b20f18-fb68-45d8-9b04-c0c4096c44a7" containerName="proxy-httpd" containerID="cri-o://cac35ae518f55fdeecc2865584d2002921d9a9f5681e965ea39c2a3b4a2858db" gracePeriod=30 Oct 08 13:18:15 crc kubenswrapper[4735]: I1008 13:18:15.811703 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f8b20f18-fb68-45d8-9b04-c0c4096c44a7" containerName="sg-core" containerID="cri-o://4a83db09d56bf29053706700fad48316d4233b68bd94c951e3a217ed1ef2b57e" gracePeriod=30 Oct 08 13:18:15 crc kubenswrapper[4735]: I1008 13:18:15.811736 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f8b20f18-fb68-45d8-9b04-c0c4096c44a7" containerName="ceilometer-notification-agent" containerID="cri-o://17739a3f7830b200456b6d2206c73fd49f8257d50ad7aa0a8c71b9123dea8c86" gracePeriod=30 Oct 08 13:18:16 crc kubenswrapper[4735]: I1008 13:18:16.237227 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 08 13:18:16 crc kubenswrapper[4735]: I1008 13:18:16.412337 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d4d96bb9-7gpr6" event={"ID":"1f333daa-559f-421a-b0a9-fa1de98b72cf","Type":"ContainerStarted","Data":"1c0e8f5f9cd61eb363c77155195f6cdd017ff1a8da9e953758ecbda931689b32"} 
Oct 08 13:18:16 crc kubenswrapper[4735]: I1008 13:18:16.415117 4735 generic.go:334] "Generic (PLEG): container finished" podID="f8b20f18-fb68-45d8-9b04-c0c4096c44a7" containerID="cac35ae518f55fdeecc2865584d2002921d9a9f5681e965ea39c2a3b4a2858db" exitCode=0 Oct 08 13:18:16 crc kubenswrapper[4735]: I1008 13:18:16.415144 4735 generic.go:334] "Generic (PLEG): container finished" podID="f8b20f18-fb68-45d8-9b04-c0c4096c44a7" containerID="4a83db09d56bf29053706700fad48316d4233b68bd94c951e3a217ed1ef2b57e" exitCode=2 Oct 08 13:18:16 crc kubenswrapper[4735]: I1008 13:18:16.415152 4735 generic.go:334] "Generic (PLEG): container finished" podID="f8b20f18-fb68-45d8-9b04-c0c4096c44a7" containerID="2d43f3b93ddd0f72a83e2fe39fafe892eb37bbe6b1005a8fe15e30558f6cb290" exitCode=0 Oct 08 13:18:16 crc kubenswrapper[4735]: I1008 13:18:16.415187 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f8b20f18-fb68-45d8-9b04-c0c4096c44a7","Type":"ContainerDied","Data":"cac35ae518f55fdeecc2865584d2002921d9a9f5681e965ea39c2a3b4a2858db"} Oct 08 13:18:16 crc kubenswrapper[4735]: I1008 13:18:16.415216 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f8b20f18-fb68-45d8-9b04-c0c4096c44a7","Type":"ContainerDied","Data":"4a83db09d56bf29053706700fad48316d4233b68bd94c951e3a217ed1ef2b57e"} Oct 08 13:18:16 crc kubenswrapper[4735]: I1008 13:18:16.415229 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f8b20f18-fb68-45d8-9b04-c0c4096c44a7","Type":"ContainerDied","Data":"2d43f3b93ddd0f72a83e2fe39fafe892eb37bbe6b1005a8fe15e30558f6cb290"} Oct 08 13:18:16 crc kubenswrapper[4735]: I1008 13:18:16.415340 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a7b8c256-386b-4a72-aff0-09419fa4164c" containerName="nova-api-log" containerID="cri-o://ac2332e341745ddc4c518e37a7e99e74c529e81162a40586f73c374e55e84771" gracePeriod=30 
Oct 08 13:18:16 crc kubenswrapper[4735]: I1008 13:18:16.415379 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a7b8c256-386b-4a72-aff0-09419fa4164c" containerName="nova-api-api" containerID="cri-o://a716b1f09502b2caaaeaff8020cc838aa020c5e3e5708cb1585f2bd030595d2d" gracePeriod=30 Oct 08 13:18:17 crc kubenswrapper[4735]: I1008 13:18:17.426762 4735 generic.go:334] "Generic (PLEG): container finished" podID="a7b8c256-386b-4a72-aff0-09419fa4164c" containerID="ac2332e341745ddc4c518e37a7e99e74c529e81162a40586f73c374e55e84771" exitCode=143 Oct 08 13:18:17 crc kubenswrapper[4735]: I1008 13:18:17.426849 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a7b8c256-386b-4a72-aff0-09419fa4164c","Type":"ContainerDied","Data":"ac2332e341745ddc4c518e37a7e99e74c529e81162a40586f73c374e55e84771"} Oct 08 13:18:17 crc kubenswrapper[4735]: I1008 13:18:17.427500 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d4d96bb9-7gpr6" Oct 08 13:18:17 crc kubenswrapper[4735]: I1008 13:18:17.726279 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 08 13:18:19 crc kubenswrapper[4735]: I1008 13:18:19.447607 4735 generic.go:334] "Generic (PLEG): container finished" podID="f8b20f18-fb68-45d8-9b04-c0c4096c44a7" containerID="17739a3f7830b200456b6d2206c73fd49f8257d50ad7aa0a8c71b9123dea8c86" exitCode=0 Oct 08 13:18:19 crc kubenswrapper[4735]: I1008 13:18:19.448169 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f8b20f18-fb68-45d8-9b04-c0c4096c44a7","Type":"ContainerDied","Data":"17739a3f7830b200456b6d2206c73fd49f8257d50ad7aa0a8c71b9123dea8c86"} Oct 08 13:18:19 crc kubenswrapper[4735]: I1008 13:18:19.712943 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 13:18:19 crc kubenswrapper[4735]: I1008 13:18:19.742414 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vcft\" (UniqueName: \"kubernetes.io/projected/f8b20f18-fb68-45d8-9b04-c0c4096c44a7-kube-api-access-6vcft\") pod \"f8b20f18-fb68-45d8-9b04-c0c4096c44a7\" (UID: \"f8b20f18-fb68-45d8-9b04-c0c4096c44a7\") " Oct 08 13:18:19 crc kubenswrapper[4735]: I1008 13:18:19.742535 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8b20f18-fb68-45d8-9b04-c0c4096c44a7-log-httpd\") pod \"f8b20f18-fb68-45d8-9b04-c0c4096c44a7\" (UID: \"f8b20f18-fb68-45d8-9b04-c0c4096c44a7\") " Oct 08 13:18:19 crc kubenswrapper[4735]: I1008 13:18:19.742625 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8b20f18-fb68-45d8-9b04-c0c4096c44a7-run-httpd\") pod \"f8b20f18-fb68-45d8-9b04-c0c4096c44a7\" (UID: \"f8b20f18-fb68-45d8-9b04-c0c4096c44a7\") " Oct 08 13:18:19 crc kubenswrapper[4735]: I1008 13:18:19.742651 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8b20f18-fb68-45d8-9b04-c0c4096c44a7-scripts\") pod \"f8b20f18-fb68-45d8-9b04-c0c4096c44a7\" (UID: \"f8b20f18-fb68-45d8-9b04-c0c4096c44a7\") " Oct 08 13:18:19 crc kubenswrapper[4735]: I1008 13:18:19.742690 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f8b20f18-fb68-45d8-9b04-c0c4096c44a7-sg-core-conf-yaml\") pod \"f8b20f18-fb68-45d8-9b04-c0c4096c44a7\" (UID: \"f8b20f18-fb68-45d8-9b04-c0c4096c44a7\") " Oct 08 13:18:19 crc kubenswrapper[4735]: I1008 13:18:19.742719 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f8b20f18-fb68-45d8-9b04-c0c4096c44a7-ceilometer-tls-certs\") pod \"f8b20f18-fb68-45d8-9b04-c0c4096c44a7\" (UID: \"f8b20f18-fb68-45d8-9b04-c0c4096c44a7\") " Oct 08 13:18:19 crc kubenswrapper[4735]: I1008 13:18:19.742840 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8b20f18-fb68-45d8-9b04-c0c4096c44a7-config-data\") pod \"f8b20f18-fb68-45d8-9b04-c0c4096c44a7\" (UID: \"f8b20f18-fb68-45d8-9b04-c0c4096c44a7\") " Oct 08 13:18:19 crc kubenswrapper[4735]: I1008 13:18:19.742885 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8b20f18-fb68-45d8-9b04-c0c4096c44a7-combined-ca-bundle\") pod \"f8b20f18-fb68-45d8-9b04-c0c4096c44a7\" (UID: \"f8b20f18-fb68-45d8-9b04-c0c4096c44a7\") " Oct 08 13:18:19 crc kubenswrapper[4735]: I1008 13:18:19.748597 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8b20f18-fb68-45d8-9b04-c0c4096c44a7-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f8b20f18-fb68-45d8-9b04-c0c4096c44a7" (UID: "f8b20f18-fb68-45d8-9b04-c0c4096c44a7"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:18:19 crc kubenswrapper[4735]: I1008 13:18:19.748834 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8b20f18-fb68-45d8-9b04-c0c4096c44a7-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f8b20f18-fb68-45d8-9b04-c0c4096c44a7" (UID: "f8b20f18-fb68-45d8-9b04-c0c4096c44a7"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:18:19 crc kubenswrapper[4735]: I1008 13:18:19.753572 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8b20f18-fb68-45d8-9b04-c0c4096c44a7-scripts" (OuterVolumeSpecName: "scripts") pod "f8b20f18-fb68-45d8-9b04-c0c4096c44a7" (UID: "f8b20f18-fb68-45d8-9b04-c0c4096c44a7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:18:19 crc kubenswrapper[4735]: I1008 13:18:19.757146 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8b20f18-fb68-45d8-9b04-c0c4096c44a7-kube-api-access-6vcft" (OuterVolumeSpecName: "kube-api-access-6vcft") pod "f8b20f18-fb68-45d8-9b04-c0c4096c44a7" (UID: "f8b20f18-fb68-45d8-9b04-c0c4096c44a7"). InnerVolumeSpecName "kube-api-access-6vcft". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:18:19 crc kubenswrapper[4735]: I1008 13:18:19.775286 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d4d96bb9-7gpr6" podStartSLOduration=6.77522254 podStartE2EDuration="6.77522254s" podCreationTimestamp="2025-10-08 13:18:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 13:18:16.431925423 +0000 UTC m=+1346.240229967" watchObservedRunningTime="2025-10-08 13:18:19.77522254 +0000 UTC m=+1349.583527104" Oct 08 13:18:19 crc kubenswrapper[4735]: I1008 13:18:19.777932 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8b20f18-fb68-45d8-9b04-c0c4096c44a7-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f8b20f18-fb68-45d8-9b04-c0c4096c44a7" (UID: "f8b20f18-fb68-45d8-9b04-c0c4096c44a7"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:18:19 crc kubenswrapper[4735]: I1008 13:18:19.816585 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8b20f18-fb68-45d8-9b04-c0c4096c44a7-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "f8b20f18-fb68-45d8-9b04-c0c4096c44a7" (UID: "f8b20f18-fb68-45d8-9b04-c0c4096c44a7"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:18:19 crc kubenswrapper[4735]: I1008 13:18:19.833543 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8b20f18-fb68-45d8-9b04-c0c4096c44a7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f8b20f18-fb68-45d8-9b04-c0c4096c44a7" (UID: "f8b20f18-fb68-45d8-9b04-c0c4096c44a7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:18:19 crc kubenswrapper[4735]: I1008 13:18:19.845625 4735 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f8b20f18-fb68-45d8-9b04-c0c4096c44a7-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 08 13:18:19 crc kubenswrapper[4735]: I1008 13:18:19.845674 4735 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8b20f18-fb68-45d8-9b04-c0c4096c44a7-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 13:18:19 crc kubenswrapper[4735]: I1008 13:18:19.845685 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8b20f18-fb68-45d8-9b04-c0c4096c44a7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 13:18:19 crc kubenswrapper[4735]: I1008 13:18:19.845694 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vcft\" (UniqueName: 
\"kubernetes.io/projected/f8b20f18-fb68-45d8-9b04-c0c4096c44a7-kube-api-access-6vcft\") on node \"crc\" DevicePath \"\"" Oct 08 13:18:19 crc kubenswrapper[4735]: I1008 13:18:19.845705 4735 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8b20f18-fb68-45d8-9b04-c0c4096c44a7-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 13:18:19 crc kubenswrapper[4735]: I1008 13:18:19.845728 4735 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8b20f18-fb68-45d8-9b04-c0c4096c44a7-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 13:18:19 crc kubenswrapper[4735]: I1008 13:18:19.845736 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8b20f18-fb68-45d8-9b04-c0c4096c44a7-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 13:18:19 crc kubenswrapper[4735]: I1008 13:18:19.854655 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8b20f18-fb68-45d8-9b04-c0c4096c44a7-config-data" (OuterVolumeSpecName: "config-data") pod "f8b20f18-fb68-45d8-9b04-c0c4096c44a7" (UID: "f8b20f18-fb68-45d8-9b04-c0c4096c44a7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:18:19 crc kubenswrapper[4735]: I1008 13:18:19.947660 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8b20f18-fb68-45d8-9b04-c0c4096c44a7-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 13:18:19 crc kubenswrapper[4735]: I1008 13:18:19.980646 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.052150 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7b8c256-386b-4a72-aff0-09419fa4164c-logs" (OuterVolumeSpecName: "logs") pod "a7b8c256-386b-4a72-aff0-09419fa4164c" (UID: "a7b8c256-386b-4a72-aff0-09419fa4164c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.052515 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7b8c256-386b-4a72-aff0-09419fa4164c-logs\") pod \"a7b8c256-386b-4a72-aff0-09419fa4164c\" (UID: \"a7b8c256-386b-4a72-aff0-09419fa4164c\") " Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.052803 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-784cr\" (UniqueName: \"kubernetes.io/projected/a7b8c256-386b-4a72-aff0-09419fa4164c-kube-api-access-784cr\") pod \"a7b8c256-386b-4a72-aff0-09419fa4164c\" (UID: \"a7b8c256-386b-4a72-aff0-09419fa4164c\") " Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.052864 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7b8c256-386b-4a72-aff0-09419fa4164c-config-data\") pod \"a7b8c256-386b-4a72-aff0-09419fa4164c\" (UID: \"a7b8c256-386b-4a72-aff0-09419fa4164c\") " Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.052898 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7b8c256-386b-4a72-aff0-09419fa4164c-combined-ca-bundle\") pod \"a7b8c256-386b-4a72-aff0-09419fa4164c\" (UID: \"a7b8c256-386b-4a72-aff0-09419fa4164c\") " Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.053389 4735 reconciler_common.go:293] "Volume detached for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7b8c256-386b-4a72-aff0-09419fa4164c-logs\") on node \"crc\" DevicePath \"\"" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.056534 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7b8c256-386b-4a72-aff0-09419fa4164c-kube-api-access-784cr" (OuterVolumeSpecName: "kube-api-access-784cr") pod "a7b8c256-386b-4a72-aff0-09419fa4164c" (UID: "a7b8c256-386b-4a72-aff0-09419fa4164c"). InnerVolumeSpecName "kube-api-access-784cr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.089707 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7b8c256-386b-4a72-aff0-09419fa4164c-config-data" (OuterVolumeSpecName: "config-data") pod "a7b8c256-386b-4a72-aff0-09419fa4164c" (UID: "a7b8c256-386b-4a72-aff0-09419fa4164c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.089726 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7b8c256-386b-4a72-aff0-09419fa4164c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a7b8c256-386b-4a72-aff0-09419fa4164c" (UID: "a7b8c256-386b-4a72-aff0-09419fa4164c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.155016 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-784cr\" (UniqueName: \"kubernetes.io/projected/a7b8c256-386b-4a72-aff0-09419fa4164c-kube-api-access-784cr\") on node \"crc\" DevicePath \"\"" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.155049 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7b8c256-386b-4a72-aff0-09419fa4164c-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.155059 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7b8c256-386b-4a72-aff0-09419fa4164c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.459309 4735 generic.go:334] "Generic (PLEG): container finished" podID="a7b8c256-386b-4a72-aff0-09419fa4164c" containerID="a716b1f09502b2caaaeaff8020cc838aa020c5e3e5708cb1585f2bd030595d2d" exitCode=0 Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.459375 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a7b8c256-386b-4a72-aff0-09419fa4164c","Type":"ContainerDied","Data":"a716b1f09502b2caaaeaff8020cc838aa020c5e3e5708cb1585f2bd030595d2d"} Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.459408 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a7b8c256-386b-4a72-aff0-09419fa4164c","Type":"ContainerDied","Data":"bc7d52229442d3f81ff9a7a5648aaa0e92ecee95f4fd0f807a5586b9333ef8fe"} Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.459429 4735 scope.go:117] "RemoveContainer" containerID="a716b1f09502b2caaaeaff8020cc838aa020c5e3e5708cb1585f2bd030595d2d" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.459374 4735 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openstack/nova-api-0" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.463763 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f8b20f18-fb68-45d8-9b04-c0c4096c44a7","Type":"ContainerDied","Data":"c44edb0bb8e63573b8d7cdc226dd53661403e796ef260b0aff9cd9dd5f9ec7f0"} Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.463831 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.480704 4735 scope.go:117] "RemoveContainer" containerID="ac2332e341745ddc4c518e37a7e99e74c529e81162a40586f73c374e55e84771" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.505670 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.513704 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.516701 4735 scope.go:117] "RemoveContainer" containerID="a716b1f09502b2caaaeaff8020cc838aa020c5e3e5708cb1585f2bd030595d2d" Oct 08 13:18:20 crc kubenswrapper[4735]: E1008 13:18:20.517177 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a716b1f09502b2caaaeaff8020cc838aa020c5e3e5708cb1585f2bd030595d2d\": container with ID starting with a716b1f09502b2caaaeaff8020cc838aa020c5e3e5708cb1585f2bd030595d2d not found: ID does not exist" containerID="a716b1f09502b2caaaeaff8020cc838aa020c5e3e5708cb1585f2bd030595d2d" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.517294 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a716b1f09502b2caaaeaff8020cc838aa020c5e3e5708cb1585f2bd030595d2d"} err="failed to get container status \"a716b1f09502b2caaaeaff8020cc838aa020c5e3e5708cb1585f2bd030595d2d\": rpc error: code 
= NotFound desc = could not find container \"a716b1f09502b2caaaeaff8020cc838aa020c5e3e5708cb1585f2bd030595d2d\": container with ID starting with a716b1f09502b2caaaeaff8020cc838aa020c5e3e5708cb1585f2bd030595d2d not found: ID does not exist" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.517402 4735 scope.go:117] "RemoveContainer" containerID="ac2332e341745ddc4c518e37a7e99e74c529e81162a40586f73c374e55e84771" Oct 08 13:18:20 crc kubenswrapper[4735]: E1008 13:18:20.517822 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac2332e341745ddc4c518e37a7e99e74c529e81162a40586f73c374e55e84771\": container with ID starting with ac2332e341745ddc4c518e37a7e99e74c529e81162a40586f73c374e55e84771 not found: ID does not exist" containerID="ac2332e341745ddc4c518e37a7e99e74c529e81162a40586f73c374e55e84771" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.517868 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac2332e341745ddc4c518e37a7e99e74c529e81162a40586f73c374e55e84771"} err="failed to get container status \"ac2332e341745ddc4c518e37a7e99e74c529e81162a40586f73c374e55e84771\": rpc error: code = NotFound desc = could not find container \"ac2332e341745ddc4c518e37a7e99e74c529e81162a40586f73c374e55e84771\": container with ID starting with ac2332e341745ddc4c518e37a7e99e74c529e81162a40586f73c374e55e84771 not found: ID does not exist" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.517891 4735 scope.go:117] "RemoveContainer" containerID="cac35ae518f55fdeecc2865584d2002921d9a9f5681e965ea39c2a3b4a2858db" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.521972 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.537401 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 
13:18:20.549560 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 08 13:18:20 crc kubenswrapper[4735]: E1008 13:18:20.550046 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8b20f18-fb68-45d8-9b04-c0c4096c44a7" containerName="proxy-httpd" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.550067 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8b20f18-fb68-45d8-9b04-c0c4096c44a7" containerName="proxy-httpd" Oct 08 13:18:20 crc kubenswrapper[4735]: E1008 13:18:20.550098 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8b20f18-fb68-45d8-9b04-c0c4096c44a7" containerName="ceilometer-notification-agent" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.550106 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8b20f18-fb68-45d8-9b04-c0c4096c44a7" containerName="ceilometer-notification-agent" Oct 08 13:18:20 crc kubenswrapper[4735]: E1008 13:18:20.550127 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8b20f18-fb68-45d8-9b04-c0c4096c44a7" containerName="sg-core" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.550135 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8b20f18-fb68-45d8-9b04-c0c4096c44a7" containerName="sg-core" Oct 08 13:18:20 crc kubenswrapper[4735]: E1008 13:18:20.550157 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8b20f18-fb68-45d8-9b04-c0c4096c44a7" containerName="ceilometer-central-agent" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.550165 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8b20f18-fb68-45d8-9b04-c0c4096c44a7" containerName="ceilometer-central-agent" Oct 08 13:18:20 crc kubenswrapper[4735]: E1008 13:18:20.550176 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7b8c256-386b-4a72-aff0-09419fa4164c" containerName="nova-api-api" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.550183 4735 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="a7b8c256-386b-4a72-aff0-09419fa4164c" containerName="nova-api-api" Oct 08 13:18:20 crc kubenswrapper[4735]: E1008 13:18:20.550199 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7b8c256-386b-4a72-aff0-09419fa4164c" containerName="nova-api-log" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.550207 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7b8c256-386b-4a72-aff0-09419fa4164c" containerName="nova-api-log" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.550422 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7b8c256-386b-4a72-aff0-09419fa4164c" containerName="nova-api-log" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.550443 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7b8c256-386b-4a72-aff0-09419fa4164c" containerName="nova-api-api" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.550460 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8b20f18-fb68-45d8-9b04-c0c4096c44a7" containerName="sg-core" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.550491 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8b20f18-fb68-45d8-9b04-c0c4096c44a7" containerName="proxy-httpd" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.550505 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8b20f18-fb68-45d8-9b04-c0c4096c44a7" containerName="ceilometer-central-agent" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.550518 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8b20f18-fb68-45d8-9b04-c0c4096c44a7" containerName="ceilometer-notification-agent" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.551772 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.556904 4735 scope.go:117] "RemoveContainer" containerID="4a83db09d56bf29053706700fad48316d4233b68bd94c951e3a217ed1ef2b57e" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.557379 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.571003 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.571266 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.577119 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.580049 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.585106 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.585310 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.588548 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.589776 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.600834 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.625548 4735 scope.go:117] "RemoveContainer" 
containerID="17739a3f7830b200456b6d2206c73fd49f8257d50ad7aa0a8c71b9123dea8c86" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.651655 4735 scope.go:117] "RemoveContainer" containerID="2d43f3b93ddd0f72a83e2fe39fafe892eb37bbe6b1005a8fe15e30558f6cb290" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.667841 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe\") " pod="openstack/nova-api-0" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.667892 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe-config-data\") pod \"nova-api-0\" (UID: \"a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe\") " pod="openstack/nova-api-0" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.667935 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b1303e4-c7b5-456a-aba6-c4674b8bcfed-scripts\") pod \"ceilometer-0\" (UID: \"3b1303e4-c7b5-456a-aba6-c4674b8bcfed\") " pod="openstack/ceilometer-0" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.667957 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe\") " pod="openstack/nova-api-0" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.668010 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/3b1303e4-c7b5-456a-aba6-c4674b8bcfed-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3b1303e4-c7b5-456a-aba6-c4674b8bcfed\") " pod="openstack/ceilometer-0" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.668056 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe-logs\") pod \"nova-api-0\" (UID: \"a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe\") " pod="openstack/nova-api-0" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.668082 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe-public-tls-certs\") pod \"nova-api-0\" (UID: \"a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe\") " pod="openstack/nova-api-0" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.668119 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b1303e4-c7b5-456a-aba6-c4674b8bcfed-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3b1303e4-c7b5-456a-aba6-c4674b8bcfed\") " pod="openstack/ceilometer-0" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.668145 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3b1303e4-c7b5-456a-aba6-c4674b8bcfed-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3b1303e4-c7b5-456a-aba6-c4674b8bcfed\") " pod="openstack/ceilometer-0" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.668184 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52dqk\" (UniqueName: \"kubernetes.io/projected/3b1303e4-c7b5-456a-aba6-c4674b8bcfed-kube-api-access-52dqk\") pod \"ceilometer-0\" (UID: 
\"3b1303e4-c7b5-456a-aba6-c4674b8bcfed\") " pod="openstack/ceilometer-0" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.668203 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x27q7\" (UniqueName: \"kubernetes.io/projected/a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe-kube-api-access-x27q7\") pod \"nova-api-0\" (UID: \"a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe\") " pod="openstack/nova-api-0" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.668218 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b1303e4-c7b5-456a-aba6-c4674b8bcfed-run-httpd\") pod \"ceilometer-0\" (UID: \"3b1303e4-c7b5-456a-aba6-c4674b8bcfed\") " pod="openstack/ceilometer-0" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.668243 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b1303e4-c7b5-456a-aba6-c4674b8bcfed-log-httpd\") pod \"ceilometer-0\" (UID: \"3b1303e4-c7b5-456a-aba6-c4674b8bcfed\") " pod="openstack/ceilometer-0" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.668259 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b1303e4-c7b5-456a-aba6-c4674b8bcfed-config-data\") pod \"ceilometer-0\" (UID: \"3b1303e4-c7b5-456a-aba6-c4674b8bcfed\") " pod="openstack/ceilometer-0" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.769280 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe-logs\") pod \"nova-api-0\" (UID: \"a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe\") " pod="openstack/nova-api-0" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.769329 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe-public-tls-certs\") pod \"nova-api-0\" (UID: \"a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe\") " pod="openstack/nova-api-0" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.769382 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b1303e4-c7b5-456a-aba6-c4674b8bcfed-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3b1303e4-c7b5-456a-aba6-c4674b8bcfed\") " pod="openstack/ceilometer-0" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.769415 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3b1303e4-c7b5-456a-aba6-c4674b8bcfed-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3b1303e4-c7b5-456a-aba6-c4674b8bcfed\") " pod="openstack/ceilometer-0" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.769455 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52dqk\" (UniqueName: \"kubernetes.io/projected/3b1303e4-c7b5-456a-aba6-c4674b8bcfed-kube-api-access-52dqk\") pod \"ceilometer-0\" (UID: \"3b1303e4-c7b5-456a-aba6-c4674b8bcfed\") " pod="openstack/ceilometer-0" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.769496 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x27q7\" (UniqueName: \"kubernetes.io/projected/a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe-kube-api-access-x27q7\") pod \"nova-api-0\" (UID: \"a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe\") " pod="openstack/nova-api-0" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.769520 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b1303e4-c7b5-456a-aba6-c4674b8bcfed-run-httpd\") pod \"ceilometer-0\" (UID: 
\"3b1303e4-c7b5-456a-aba6-c4674b8bcfed\") " pod="openstack/ceilometer-0" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.769550 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b1303e4-c7b5-456a-aba6-c4674b8bcfed-log-httpd\") pod \"ceilometer-0\" (UID: \"3b1303e4-c7b5-456a-aba6-c4674b8bcfed\") " pod="openstack/ceilometer-0" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.769574 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b1303e4-c7b5-456a-aba6-c4674b8bcfed-config-data\") pod \"ceilometer-0\" (UID: \"3b1303e4-c7b5-456a-aba6-c4674b8bcfed\") " pod="openstack/ceilometer-0" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.769626 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe\") " pod="openstack/nova-api-0" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.769659 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe-config-data\") pod \"nova-api-0\" (UID: \"a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe\") " pod="openstack/nova-api-0" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.769744 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b1303e4-c7b5-456a-aba6-c4674b8bcfed-scripts\") pod \"ceilometer-0\" (UID: \"3b1303e4-c7b5-456a-aba6-c4674b8bcfed\") " pod="openstack/ceilometer-0" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.769770 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe\") " pod="openstack/nova-api-0" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.769803 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b1303e4-c7b5-456a-aba6-c4674b8bcfed-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3b1303e4-c7b5-456a-aba6-c4674b8bcfed\") " pod="openstack/ceilometer-0" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.769869 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe-logs\") pod \"nova-api-0\" (UID: \"a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe\") " pod="openstack/nova-api-0" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.770281 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b1303e4-c7b5-456a-aba6-c4674b8bcfed-log-httpd\") pod \"ceilometer-0\" (UID: \"3b1303e4-c7b5-456a-aba6-c4674b8bcfed\") " pod="openstack/ceilometer-0" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.770697 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b1303e4-c7b5-456a-aba6-c4674b8bcfed-run-httpd\") pod \"ceilometer-0\" (UID: \"3b1303e4-c7b5-456a-aba6-c4674b8bcfed\") " pod="openstack/ceilometer-0" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.774703 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3b1303e4-c7b5-456a-aba6-c4674b8bcfed-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3b1303e4-c7b5-456a-aba6-c4674b8bcfed\") " pod="openstack/ceilometer-0" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.775860 4735 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b1303e4-c7b5-456a-aba6-c4674b8bcfed-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3b1303e4-c7b5-456a-aba6-c4674b8bcfed\") " pod="openstack/ceilometer-0" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.776113 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b1303e4-c7b5-456a-aba6-c4674b8bcfed-scripts\") pod \"ceilometer-0\" (UID: \"3b1303e4-c7b5-456a-aba6-c4674b8bcfed\") " pod="openstack/ceilometer-0" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.776199 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b1303e4-c7b5-456a-aba6-c4674b8bcfed-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3b1303e4-c7b5-456a-aba6-c4674b8bcfed\") " pod="openstack/ceilometer-0" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.777162 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe-public-tls-certs\") pod \"nova-api-0\" (UID: \"a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe\") " pod="openstack/nova-api-0" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.777340 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe-config-data\") pod \"nova-api-0\" (UID: \"a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe\") " pod="openstack/nova-api-0" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.778083 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b1303e4-c7b5-456a-aba6-c4674b8bcfed-config-data\") pod \"ceilometer-0\" (UID: \"3b1303e4-c7b5-456a-aba6-c4674b8bcfed\") " pod="openstack/ceilometer-0" Oct 08 13:18:20 crc 
kubenswrapper[4735]: I1008 13:18:20.778592 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe\") " pod="openstack/nova-api-0" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.781147 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe\") " pod="openstack/nova-api-0" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.785658 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x27q7\" (UniqueName: \"kubernetes.io/projected/a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe-kube-api-access-x27q7\") pod \"nova-api-0\" (UID: \"a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe\") " pod="openstack/nova-api-0" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.794080 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52dqk\" (UniqueName: \"kubernetes.io/projected/3b1303e4-c7b5-456a-aba6-c4674b8bcfed-kube-api-access-52dqk\") pod \"ceilometer-0\" (UID: \"3b1303e4-c7b5-456a-aba6-c4674b8bcfed\") " pod="openstack/ceilometer-0" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.906527 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.908169 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7b8c256-386b-4a72-aff0-09419fa4164c" path="/var/lib/kubelet/pods/a7b8c256-386b-4a72-aff0-09419fa4164c/volumes" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.908675 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 08 13:18:20 crc kubenswrapper[4735]: I1008 13:18:20.909045 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8b20f18-fb68-45d8-9b04-c0c4096c44a7" path="/var/lib/kubelet/pods/f8b20f18-fb68-45d8-9b04-c0c4096c44a7/volumes" Oct 08 13:18:21 crc kubenswrapper[4735]: I1008 13:18:21.212562 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 08 13:18:21 crc kubenswrapper[4735]: I1008 13:18:21.459220 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 13:18:21 crc kubenswrapper[4735]: W1008 13:18:21.464099 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b1303e4_c7b5_456a_aba6_c4674b8bcfed.slice/crio-7668d77282c181473f5eb368ac24c8d61b06b230a0b7cf3fc2ba2ce2e0efa3dc WatchSource:0}: Error finding container 7668d77282c181473f5eb368ac24c8d61b06b230a0b7cf3fc2ba2ce2e0efa3dc: Status 404 returned error can't find the container with id 7668d77282c181473f5eb368ac24c8d61b06b230a0b7cf3fc2ba2ce2e0efa3dc Oct 08 13:18:21 crc kubenswrapper[4735]: I1008 13:18:21.485369 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3b1303e4-c7b5-456a-aba6-c4674b8bcfed","Type":"ContainerStarted","Data":"7668d77282c181473f5eb368ac24c8d61b06b230a0b7cf3fc2ba2ce2e0efa3dc"} Oct 08 13:18:21 crc kubenswrapper[4735]: I1008 13:18:21.488504 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe","Type":"ContainerStarted","Data":"e29d2b69411ea78730e6078de70640933c7ea07c84189ba074871bcf5f25c9bb"} Oct 08 13:18:21 crc kubenswrapper[4735]: I1008 13:18:21.488758 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe","Type":"ContainerStarted","Data":"417941cd093a72fd5988a3e533aab8fa98ac01abe398bc124070235570402db8"} Oct 08 13:18:22 crc kubenswrapper[4735]: I1008 13:18:22.503278 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe","Type":"ContainerStarted","Data":"b23279fdc01384863e20083167d3f7bbbf83e132d82bd4920d347aecc7befe77"} Oct 08 13:18:22 crc kubenswrapper[4735]: I1008 13:18:22.504879 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3b1303e4-c7b5-456a-aba6-c4674b8bcfed","Type":"ContainerStarted","Data":"0817e0e9a63d415352253a5caafcd20e3b3968eec1f9e4e4f0c7f66ba7ae9ff8"} Oct 08 13:18:22 crc kubenswrapper[4735]: I1008 13:18:22.523686 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.5236668719999997 podStartE2EDuration="2.523666872s" podCreationTimestamp="2025-10-08 13:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 13:18:22.523148788 +0000 UTC m=+1352.331453332" watchObservedRunningTime="2025-10-08 13:18:22.523666872 +0000 UTC m=+1352.331971416" Oct 08 13:18:22 crc kubenswrapper[4735]: I1008 13:18:22.726238 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 08 13:18:22 crc kubenswrapper[4735]: I1008 13:18:22.743845 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Oct 08 13:18:23 crc kubenswrapper[4735]: I1008 13:18:23.515686 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3b1303e4-c7b5-456a-aba6-c4674b8bcfed","Type":"ContainerStarted","Data":"b645519406061dad8c7f4d39599dd4ab3d8adc1c94d6ca4f85885caf6312b12a"} Oct 08 13:18:23 crc 
kubenswrapper[4735]: I1008 13:18:23.535010 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Oct 08 13:18:23 crc kubenswrapper[4735]: I1008 13:18:23.723365 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-zkbts"] Oct 08 13:18:23 crc kubenswrapper[4735]: I1008 13:18:23.724739 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-zkbts" Oct 08 13:18:23 crc kubenswrapper[4735]: I1008 13:18:23.729096 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Oct 08 13:18:23 crc kubenswrapper[4735]: I1008 13:18:23.729535 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Oct 08 13:18:23 crc kubenswrapper[4735]: I1008 13:18:23.733254 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-zkbts"] Oct 08 13:18:23 crc kubenswrapper[4735]: I1008 13:18:23.821436 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7386076f-50b6-4cc9-b774-da32d6add528-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-zkbts\" (UID: \"7386076f-50b6-4cc9-b774-da32d6add528\") " pod="openstack/nova-cell1-cell-mapping-zkbts" Oct 08 13:18:23 crc kubenswrapper[4735]: I1008 13:18:23.821517 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7386076f-50b6-4cc9-b774-da32d6add528-scripts\") pod \"nova-cell1-cell-mapping-zkbts\" (UID: \"7386076f-50b6-4cc9-b774-da32d6add528\") " pod="openstack/nova-cell1-cell-mapping-zkbts" Oct 08 13:18:23 crc kubenswrapper[4735]: I1008 13:18:23.821565 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/7386076f-50b6-4cc9-b774-da32d6add528-config-data\") pod \"nova-cell1-cell-mapping-zkbts\" (UID: \"7386076f-50b6-4cc9-b774-da32d6add528\") " pod="openstack/nova-cell1-cell-mapping-zkbts" Oct 08 13:18:23 crc kubenswrapper[4735]: I1008 13:18:23.821636 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssm29\" (UniqueName: \"kubernetes.io/projected/7386076f-50b6-4cc9-b774-da32d6add528-kube-api-access-ssm29\") pod \"nova-cell1-cell-mapping-zkbts\" (UID: \"7386076f-50b6-4cc9-b774-da32d6add528\") " pod="openstack/nova-cell1-cell-mapping-zkbts" Oct 08 13:18:23 crc kubenswrapper[4735]: I1008 13:18:23.855677 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6d4d96bb9-7gpr6" Oct 08 13:18:23 crc kubenswrapper[4735]: I1008 13:18:23.910006 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6ffc974fdf-6wrx7"] Oct 08 13:18:23 crc kubenswrapper[4735]: I1008 13:18:23.910273 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6ffc974fdf-6wrx7" podUID="0f83be1f-9107-43fc-afad-d20a16cc0911" containerName="dnsmasq-dns" containerID="cri-o://df3cee6af5ec9b69328079e9be4776e7e7a521bb2c19df39ae8793d8d6f34c6b" gracePeriod=10 Oct 08 13:18:23 crc kubenswrapper[4735]: I1008 13:18:23.922902 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7386076f-50b6-4cc9-b774-da32d6add528-scripts\") pod \"nova-cell1-cell-mapping-zkbts\" (UID: \"7386076f-50b6-4cc9-b774-da32d6add528\") " pod="openstack/nova-cell1-cell-mapping-zkbts" Oct 08 13:18:23 crc kubenswrapper[4735]: I1008 13:18:23.922958 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7386076f-50b6-4cc9-b774-da32d6add528-config-data\") pod 
\"nova-cell1-cell-mapping-zkbts\" (UID: \"7386076f-50b6-4cc9-b774-da32d6add528\") " pod="openstack/nova-cell1-cell-mapping-zkbts" Oct 08 13:18:23 crc kubenswrapper[4735]: I1008 13:18:23.922994 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssm29\" (UniqueName: \"kubernetes.io/projected/7386076f-50b6-4cc9-b774-da32d6add528-kube-api-access-ssm29\") pod \"nova-cell1-cell-mapping-zkbts\" (UID: \"7386076f-50b6-4cc9-b774-da32d6add528\") " pod="openstack/nova-cell1-cell-mapping-zkbts" Oct 08 13:18:23 crc kubenswrapper[4735]: I1008 13:18:23.923137 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7386076f-50b6-4cc9-b774-da32d6add528-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-zkbts\" (UID: \"7386076f-50b6-4cc9-b774-da32d6add528\") " pod="openstack/nova-cell1-cell-mapping-zkbts" Oct 08 13:18:23 crc kubenswrapper[4735]: I1008 13:18:23.928990 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7386076f-50b6-4cc9-b774-da32d6add528-scripts\") pod \"nova-cell1-cell-mapping-zkbts\" (UID: \"7386076f-50b6-4cc9-b774-da32d6add528\") " pod="openstack/nova-cell1-cell-mapping-zkbts" Oct 08 13:18:23 crc kubenswrapper[4735]: I1008 13:18:23.932874 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7386076f-50b6-4cc9-b774-da32d6add528-config-data\") pod \"nova-cell1-cell-mapping-zkbts\" (UID: \"7386076f-50b6-4cc9-b774-da32d6add528\") " pod="openstack/nova-cell1-cell-mapping-zkbts" Oct 08 13:18:23 crc kubenswrapper[4735]: I1008 13:18:23.933791 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7386076f-50b6-4cc9-b774-da32d6add528-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-zkbts\" (UID: 
\"7386076f-50b6-4cc9-b774-da32d6add528\") " pod="openstack/nova-cell1-cell-mapping-zkbts" Oct 08 13:18:23 crc kubenswrapper[4735]: I1008 13:18:23.948183 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssm29\" (UniqueName: \"kubernetes.io/projected/7386076f-50b6-4cc9-b774-da32d6add528-kube-api-access-ssm29\") pod \"nova-cell1-cell-mapping-zkbts\" (UID: \"7386076f-50b6-4cc9-b774-da32d6add528\") " pod="openstack/nova-cell1-cell-mapping-zkbts" Oct 08 13:18:24 crc kubenswrapper[4735]: I1008 13:18:24.162929 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-zkbts" Oct 08 13:18:24 crc kubenswrapper[4735]: I1008 13:18:24.389911 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6ffc974fdf-6wrx7" Oct 08 13:18:24 crc kubenswrapper[4735]: I1008 13:18:24.436748 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ms5lf\" (UniqueName: \"kubernetes.io/projected/0f83be1f-9107-43fc-afad-d20a16cc0911-kube-api-access-ms5lf\") pod \"0f83be1f-9107-43fc-afad-d20a16cc0911\" (UID: \"0f83be1f-9107-43fc-afad-d20a16cc0911\") " Oct 08 13:18:24 crc kubenswrapper[4735]: I1008 13:18:24.436863 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f83be1f-9107-43fc-afad-d20a16cc0911-ovsdbserver-nb\") pod \"0f83be1f-9107-43fc-afad-d20a16cc0911\" (UID: \"0f83be1f-9107-43fc-afad-d20a16cc0911\") " Oct 08 13:18:24 crc kubenswrapper[4735]: I1008 13:18:24.436957 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0f83be1f-9107-43fc-afad-d20a16cc0911-dns-swift-storage-0\") pod \"0f83be1f-9107-43fc-afad-d20a16cc0911\" (UID: \"0f83be1f-9107-43fc-afad-d20a16cc0911\") " Oct 08 13:18:24 crc kubenswrapper[4735]: I1008 
13:18:24.437045 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f83be1f-9107-43fc-afad-d20a16cc0911-dns-svc\") pod \"0f83be1f-9107-43fc-afad-d20a16cc0911\" (UID: \"0f83be1f-9107-43fc-afad-d20a16cc0911\") " Oct 08 13:18:24 crc kubenswrapper[4735]: I1008 13:18:24.437174 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f83be1f-9107-43fc-afad-d20a16cc0911-ovsdbserver-sb\") pod \"0f83be1f-9107-43fc-afad-d20a16cc0911\" (UID: \"0f83be1f-9107-43fc-afad-d20a16cc0911\") " Oct 08 13:18:24 crc kubenswrapper[4735]: I1008 13:18:24.437865 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f83be1f-9107-43fc-afad-d20a16cc0911-config\") pod \"0f83be1f-9107-43fc-afad-d20a16cc0911\" (UID: \"0f83be1f-9107-43fc-afad-d20a16cc0911\") " Oct 08 13:18:24 crc kubenswrapper[4735]: I1008 13:18:24.447209 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f83be1f-9107-43fc-afad-d20a16cc0911-kube-api-access-ms5lf" (OuterVolumeSpecName: "kube-api-access-ms5lf") pod "0f83be1f-9107-43fc-afad-d20a16cc0911" (UID: "0f83be1f-9107-43fc-afad-d20a16cc0911"). InnerVolumeSpecName "kube-api-access-ms5lf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:18:24 crc kubenswrapper[4735]: I1008 13:18:24.504961 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f83be1f-9107-43fc-afad-d20a16cc0911-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0f83be1f-9107-43fc-afad-d20a16cc0911" (UID: "0f83be1f-9107-43fc-afad-d20a16cc0911"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:18:24 crc kubenswrapper[4735]: I1008 13:18:24.511841 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f83be1f-9107-43fc-afad-d20a16cc0911-config" (OuterVolumeSpecName: "config") pod "0f83be1f-9107-43fc-afad-d20a16cc0911" (UID: "0f83be1f-9107-43fc-afad-d20a16cc0911"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:18:24 crc kubenswrapper[4735]: I1008 13:18:24.518901 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f83be1f-9107-43fc-afad-d20a16cc0911-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0f83be1f-9107-43fc-afad-d20a16cc0911" (UID: "0f83be1f-9107-43fc-afad-d20a16cc0911"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:18:24 crc kubenswrapper[4735]: I1008 13:18:24.536280 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f83be1f-9107-43fc-afad-d20a16cc0911-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0f83be1f-9107-43fc-afad-d20a16cc0911" (UID: "0f83be1f-9107-43fc-afad-d20a16cc0911"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:18:24 crc kubenswrapper[4735]: I1008 13:18:24.540594 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f83be1f-9107-43fc-afad-d20a16cc0911-config\") on node \"crc\" DevicePath \"\"" Oct 08 13:18:24 crc kubenswrapper[4735]: I1008 13:18:24.540622 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ms5lf\" (UniqueName: \"kubernetes.io/projected/0f83be1f-9107-43fc-afad-d20a16cc0911-kube-api-access-ms5lf\") on node \"crc\" DevicePath \"\"" Oct 08 13:18:24 crc kubenswrapper[4735]: I1008 13:18:24.540638 4735 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0f83be1f-9107-43fc-afad-d20a16cc0911-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 08 13:18:24 crc kubenswrapper[4735]: I1008 13:18:24.540647 4735 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f83be1f-9107-43fc-afad-d20a16cc0911-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 13:18:24 crc kubenswrapper[4735]: I1008 13:18:24.540656 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f83be1f-9107-43fc-afad-d20a16cc0911-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 13:18:24 crc kubenswrapper[4735]: I1008 13:18:24.541582 4735 generic.go:334] "Generic (PLEG): container finished" podID="0f83be1f-9107-43fc-afad-d20a16cc0911" containerID="df3cee6af5ec9b69328079e9be4776e7e7a521bb2c19df39ae8793d8d6f34c6b" exitCode=0 Oct 08 13:18:24 crc kubenswrapper[4735]: I1008 13:18:24.541709 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ffc974fdf-6wrx7" event={"ID":"0f83be1f-9107-43fc-afad-d20a16cc0911","Type":"ContainerDied","Data":"df3cee6af5ec9b69328079e9be4776e7e7a521bb2c19df39ae8793d8d6f34c6b"} Oct 08 13:18:24 crc kubenswrapper[4735]: 
I1008 13:18:24.541748 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ffc974fdf-6wrx7" event={"ID":"0f83be1f-9107-43fc-afad-d20a16cc0911","Type":"ContainerDied","Data":"38d913cdaaf70c0d6d56fa0be0bb952435b488307b5cebfe9d448eeb359031e3"} Oct 08 13:18:24 crc kubenswrapper[4735]: I1008 13:18:24.541772 4735 scope.go:117] "RemoveContainer" containerID="df3cee6af5ec9b69328079e9be4776e7e7a521bb2c19df39ae8793d8d6f34c6b" Oct 08 13:18:24 crc kubenswrapper[4735]: I1008 13:18:24.541982 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6ffc974fdf-6wrx7" Oct 08 13:18:24 crc kubenswrapper[4735]: I1008 13:18:24.546667 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f83be1f-9107-43fc-afad-d20a16cc0911-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0f83be1f-9107-43fc-afad-d20a16cc0911" (UID: "0f83be1f-9107-43fc-afad-d20a16cc0911"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:18:24 crc kubenswrapper[4735]: I1008 13:18:24.555253 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3b1303e4-c7b5-456a-aba6-c4674b8bcfed","Type":"ContainerStarted","Data":"8f014700eb4b62d55701fa790058c2874eeee602afa6352080d545ba869c85bf"} Oct 08 13:18:24 crc kubenswrapper[4735]: I1008 13:18:24.579446 4735 scope.go:117] "RemoveContainer" containerID="777011447d701c38d307c6ae1785e771e9b66b3172113a6d6955d2f32dd6037c" Oct 08 13:18:24 crc kubenswrapper[4735]: I1008 13:18:24.626467 4735 scope.go:117] "RemoveContainer" containerID="df3cee6af5ec9b69328079e9be4776e7e7a521bb2c19df39ae8793d8d6f34c6b" Oct 08 13:18:24 crc kubenswrapper[4735]: E1008 13:18:24.627552 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df3cee6af5ec9b69328079e9be4776e7e7a521bb2c19df39ae8793d8d6f34c6b\": container with ID starting with df3cee6af5ec9b69328079e9be4776e7e7a521bb2c19df39ae8793d8d6f34c6b not found: ID does not exist" containerID="df3cee6af5ec9b69328079e9be4776e7e7a521bb2c19df39ae8793d8d6f34c6b" Oct 08 13:18:24 crc kubenswrapper[4735]: I1008 13:18:24.627606 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df3cee6af5ec9b69328079e9be4776e7e7a521bb2c19df39ae8793d8d6f34c6b"} err="failed to get container status \"df3cee6af5ec9b69328079e9be4776e7e7a521bb2c19df39ae8793d8d6f34c6b\": rpc error: code = NotFound desc = could not find container \"df3cee6af5ec9b69328079e9be4776e7e7a521bb2c19df39ae8793d8d6f34c6b\": container with ID starting with df3cee6af5ec9b69328079e9be4776e7e7a521bb2c19df39ae8793d8d6f34c6b not found: ID does not exist" Oct 08 13:18:24 crc kubenswrapper[4735]: I1008 13:18:24.627646 4735 scope.go:117] "RemoveContainer" containerID="777011447d701c38d307c6ae1785e771e9b66b3172113a6d6955d2f32dd6037c" Oct 08 13:18:24 crc kubenswrapper[4735]: E1008 
13:18:24.628102 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"777011447d701c38d307c6ae1785e771e9b66b3172113a6d6955d2f32dd6037c\": container with ID starting with 777011447d701c38d307c6ae1785e771e9b66b3172113a6d6955d2f32dd6037c not found: ID does not exist" containerID="777011447d701c38d307c6ae1785e771e9b66b3172113a6d6955d2f32dd6037c" Oct 08 13:18:24 crc kubenswrapper[4735]: I1008 13:18:24.628164 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"777011447d701c38d307c6ae1785e771e9b66b3172113a6d6955d2f32dd6037c"} err="failed to get container status \"777011447d701c38d307c6ae1785e771e9b66b3172113a6d6955d2f32dd6037c\": rpc error: code = NotFound desc = could not find container \"777011447d701c38d307c6ae1785e771e9b66b3172113a6d6955d2f32dd6037c\": container with ID starting with 777011447d701c38d307c6ae1785e771e9b66b3172113a6d6955d2f32dd6037c not found: ID does not exist" Oct 08 13:18:24 crc kubenswrapper[4735]: I1008 13:18:24.643420 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f83be1f-9107-43fc-afad-d20a16cc0911-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 13:18:24 crc kubenswrapper[4735]: I1008 13:18:24.725376 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-zkbts"] Oct 08 13:18:24 crc kubenswrapper[4735]: W1008 13:18:24.728749 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7386076f_50b6_4cc9_b774_da32d6add528.slice/crio-d7b735058c44fdab869731bdca4706037abf4b0ddd34be7d6a6fce33e6ffba24 WatchSource:0}: Error finding container d7b735058c44fdab869731bdca4706037abf4b0ddd34be7d6a6fce33e6ffba24: Status 404 returned error can't find the container with id d7b735058c44fdab869731bdca4706037abf4b0ddd34be7d6a6fce33e6ffba24 Oct 08 
13:18:24 crc kubenswrapper[4735]: I1008 13:18:24.925669 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6ffc974fdf-6wrx7"] Oct 08 13:18:24 crc kubenswrapper[4735]: I1008 13:18:24.933528 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6ffc974fdf-6wrx7"] Oct 08 13:18:25 crc kubenswrapper[4735]: I1008 13:18:25.567103 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-zkbts" event={"ID":"7386076f-50b6-4cc9-b774-da32d6add528","Type":"ContainerStarted","Data":"4687666d9fb769e55d9e2662909018e3d99681ac811ca1ac37fcba325c3e95ca"} Oct 08 13:18:25 crc kubenswrapper[4735]: I1008 13:18:25.567434 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-zkbts" event={"ID":"7386076f-50b6-4cc9-b774-da32d6add528","Type":"ContainerStarted","Data":"d7b735058c44fdab869731bdca4706037abf4b0ddd34be7d6a6fce33e6ffba24"} Oct 08 13:18:25 crc kubenswrapper[4735]: I1008 13:18:25.589385 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-zkbts" podStartSLOduration=2.589368855 podStartE2EDuration="2.589368855s" podCreationTimestamp="2025-10-08 13:18:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 13:18:25.581037769 +0000 UTC m=+1355.389342323" watchObservedRunningTime="2025-10-08 13:18:25.589368855 +0000 UTC m=+1355.397673399" Oct 08 13:18:26 crc kubenswrapper[4735]: I1008 13:18:26.583183 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3b1303e4-c7b5-456a-aba6-c4674b8bcfed","Type":"ContainerStarted","Data":"ca8e40a239770c1ac13452e47ffb1eb94e1acfb51391b3e3bf143cf183113523"} Oct 08 13:18:26 crc kubenswrapper[4735]: I1008 13:18:26.583646 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 08 13:18:26 
crc kubenswrapper[4735]: I1008 13:18:26.616725 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.72194088 podStartE2EDuration="6.616698026s" podCreationTimestamp="2025-10-08 13:18:20 +0000 UTC" firstStartedPulling="2025-10-08 13:18:21.469033563 +0000 UTC m=+1351.277338107" lastFinishedPulling="2025-10-08 13:18:25.363790709 +0000 UTC m=+1355.172095253" observedRunningTime="2025-10-08 13:18:26.607094568 +0000 UTC m=+1356.415399142" watchObservedRunningTime="2025-10-08 13:18:26.616698026 +0000 UTC m=+1356.425002590" Oct 08 13:18:26 crc kubenswrapper[4735]: I1008 13:18:26.910059 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f83be1f-9107-43fc-afad-d20a16cc0911" path="/var/lib/kubelet/pods/0f83be1f-9107-43fc-afad-d20a16cc0911/volumes" Oct 08 13:18:30 crc kubenswrapper[4735]: I1008 13:18:30.621098 4735 generic.go:334] "Generic (PLEG): container finished" podID="7386076f-50b6-4cc9-b774-da32d6add528" containerID="4687666d9fb769e55d9e2662909018e3d99681ac811ca1ac37fcba325c3e95ca" exitCode=0 Oct 08 13:18:30 crc kubenswrapper[4735]: I1008 13:18:30.621187 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-zkbts" event={"ID":"7386076f-50b6-4cc9-b774-da32d6add528","Type":"ContainerDied","Data":"4687666d9fb769e55d9e2662909018e3d99681ac811ca1ac37fcba325c3e95ca"} Oct 08 13:18:30 crc kubenswrapper[4735]: I1008 13:18:30.908880 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 08 13:18:30 crc kubenswrapper[4735]: I1008 13:18:30.908945 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 08 13:18:31 crc kubenswrapper[4735]: I1008 13:18:31.921646 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe" containerName="nova-api-log" probeResult="failure" output="Get 
\"https://10.217.0.197:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 08 13:18:31 crc kubenswrapper[4735]: I1008 13:18:31.922117 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.197:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 08 13:18:32 crc kubenswrapper[4735]: I1008 13:18:32.011994 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-zkbts" Oct 08 13:18:32 crc kubenswrapper[4735]: I1008 13:18:32.183539 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7386076f-50b6-4cc9-b774-da32d6add528-scripts\") pod \"7386076f-50b6-4cc9-b774-da32d6add528\" (UID: \"7386076f-50b6-4cc9-b774-da32d6add528\") " Oct 08 13:18:32 crc kubenswrapper[4735]: I1008 13:18:32.183764 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ssm29\" (UniqueName: \"kubernetes.io/projected/7386076f-50b6-4cc9-b774-da32d6add528-kube-api-access-ssm29\") pod \"7386076f-50b6-4cc9-b774-da32d6add528\" (UID: \"7386076f-50b6-4cc9-b774-da32d6add528\") " Oct 08 13:18:32 crc kubenswrapper[4735]: I1008 13:18:32.183799 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7386076f-50b6-4cc9-b774-da32d6add528-config-data\") pod \"7386076f-50b6-4cc9-b774-da32d6add528\" (UID: \"7386076f-50b6-4cc9-b774-da32d6add528\") " Oct 08 13:18:32 crc kubenswrapper[4735]: I1008 13:18:32.183856 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7386076f-50b6-4cc9-b774-da32d6add528-combined-ca-bundle\") pod 
\"7386076f-50b6-4cc9-b774-da32d6add528\" (UID: \"7386076f-50b6-4cc9-b774-da32d6add528\") " Oct 08 13:18:32 crc kubenswrapper[4735]: I1008 13:18:32.189613 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7386076f-50b6-4cc9-b774-da32d6add528-scripts" (OuterVolumeSpecName: "scripts") pod "7386076f-50b6-4cc9-b774-da32d6add528" (UID: "7386076f-50b6-4cc9-b774-da32d6add528"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:18:32 crc kubenswrapper[4735]: I1008 13:18:32.192691 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7386076f-50b6-4cc9-b774-da32d6add528-kube-api-access-ssm29" (OuterVolumeSpecName: "kube-api-access-ssm29") pod "7386076f-50b6-4cc9-b774-da32d6add528" (UID: "7386076f-50b6-4cc9-b774-da32d6add528"). InnerVolumeSpecName "kube-api-access-ssm29". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:18:32 crc kubenswrapper[4735]: I1008 13:18:32.215824 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7386076f-50b6-4cc9-b774-da32d6add528-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7386076f-50b6-4cc9-b774-da32d6add528" (UID: "7386076f-50b6-4cc9-b774-da32d6add528"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:18:32 crc kubenswrapper[4735]: I1008 13:18:32.223000 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7386076f-50b6-4cc9-b774-da32d6add528-config-data" (OuterVolumeSpecName: "config-data") pod "7386076f-50b6-4cc9-b774-da32d6add528" (UID: "7386076f-50b6-4cc9-b774-da32d6add528"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:18:32 crc kubenswrapper[4735]: I1008 13:18:32.287243 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7386076f-50b6-4cc9-b774-da32d6add528-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 13:18:32 crc kubenswrapper[4735]: I1008 13:18:32.287307 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7386076f-50b6-4cc9-b774-da32d6add528-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 13:18:32 crc kubenswrapper[4735]: I1008 13:18:32.287322 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ssm29\" (UniqueName: \"kubernetes.io/projected/7386076f-50b6-4cc9-b774-da32d6add528-kube-api-access-ssm29\") on node \"crc\" DevicePath \"\"" Oct 08 13:18:32 crc kubenswrapper[4735]: I1008 13:18:32.287358 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7386076f-50b6-4cc9-b774-da32d6add528-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 13:18:32 crc kubenswrapper[4735]: I1008 13:18:32.660249 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-zkbts" event={"ID":"7386076f-50b6-4cc9-b774-da32d6add528","Type":"ContainerDied","Data":"d7b735058c44fdab869731bdca4706037abf4b0ddd34be7d6a6fce33e6ffba24"} Oct 08 13:18:32 crc kubenswrapper[4735]: I1008 13:18:32.660300 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7b735058c44fdab869731bdca4706037abf4b0ddd34be7d6a6fce33e6ffba24" Oct 08 13:18:32 crc kubenswrapper[4735]: I1008 13:18:32.660401 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-zkbts" Oct 08 13:18:32 crc kubenswrapper[4735]: I1008 13:18:32.826592 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 08 13:18:32 crc kubenswrapper[4735]: I1008 13:18:32.826925 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe" containerName="nova-api-api" containerID="cri-o://b23279fdc01384863e20083167d3f7bbbf83e132d82bd4920d347aecc7befe77" gracePeriod=30 Oct 08 13:18:32 crc kubenswrapper[4735]: I1008 13:18:32.827022 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe" containerName="nova-api-log" containerID="cri-o://e29d2b69411ea78730e6078de70640933c7ea07c84189ba074871bcf5f25c9bb" gracePeriod=30 Oct 08 13:18:32 crc kubenswrapper[4735]: I1008 13:18:32.837035 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 13:18:32 crc kubenswrapper[4735]: I1008 13:18:32.837259 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="ee510ab0-ef69-47d0-9b5b-6f8ae4b93601" containerName="nova-scheduler-scheduler" containerID="cri-o://86ca0a7bab642bbd7a78023f6fc0a6dbf5894fee35b063fa337c0d9e9e762d24" gracePeriod=30 Oct 08 13:18:32 crc kubenswrapper[4735]: I1008 13:18:32.917297 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 13:18:32 crc kubenswrapper[4735]: I1008 13:18:32.917594 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ee128310-e5f8-4197-9c56-caa724cd15fc" containerName="nova-metadata-log" containerID="cri-o://54a9922b05de266a075643e7f846beead00a52ac35249d6b83aad5904de910fd" gracePeriod=30 Oct 08 13:18:32 crc kubenswrapper[4735]: I1008 13:18:32.917658 4735 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ee128310-e5f8-4197-9c56-caa724cd15fc" containerName="nova-metadata-metadata" containerID="cri-o://1c20037054f16163cc833fef0b60054bd2fbb164d84491c280e121742f97f477" gracePeriod=30 Oct 08 13:18:33 crc kubenswrapper[4735]: E1008 13:18:33.516466 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="86ca0a7bab642bbd7a78023f6fc0a6dbf5894fee35b063fa337c0d9e9e762d24" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 08 13:18:33 crc kubenswrapper[4735]: E1008 13:18:33.518795 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="86ca0a7bab642bbd7a78023f6fc0a6dbf5894fee35b063fa337c0d9e9e762d24" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 08 13:18:33 crc kubenswrapper[4735]: E1008 13:18:33.520342 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="86ca0a7bab642bbd7a78023f6fc0a6dbf5894fee35b063fa337c0d9e9e762d24" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 08 13:18:33 crc kubenswrapper[4735]: E1008 13:18:33.520444 4735 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="ee510ab0-ef69-47d0-9b5b-6f8ae4b93601" containerName="nova-scheduler-scheduler" Oct 08 13:18:33 crc kubenswrapper[4735]: I1008 13:18:33.671399 4735 generic.go:334] "Generic (PLEG): container finished" 
podID="ee128310-e5f8-4197-9c56-caa724cd15fc" containerID="54a9922b05de266a075643e7f846beead00a52ac35249d6b83aad5904de910fd" exitCode=143 Oct 08 13:18:33 crc kubenswrapper[4735]: I1008 13:18:33.671526 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ee128310-e5f8-4197-9c56-caa724cd15fc","Type":"ContainerDied","Data":"54a9922b05de266a075643e7f846beead00a52ac35249d6b83aad5904de910fd"} Oct 08 13:18:33 crc kubenswrapper[4735]: I1008 13:18:33.674286 4735 generic.go:334] "Generic (PLEG): container finished" podID="a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe" containerID="e29d2b69411ea78730e6078de70640933c7ea07c84189ba074871bcf5f25c9bb" exitCode=143 Oct 08 13:18:33 crc kubenswrapper[4735]: I1008 13:18:33.674345 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe","Type":"ContainerDied","Data":"e29d2b69411ea78730e6078de70640933c7ea07c84189ba074871bcf5f25c9bb"} Oct 08 13:18:36 crc kubenswrapper[4735]: I1008 13:18:36.052357 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="ee128310-e5f8-4197-9c56-caa724cd15fc" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.192:8775/\": read tcp 10.217.0.2:60280->10.217.0.192:8775: read: connection reset by peer" Oct 08 13:18:36 crc kubenswrapper[4735]: I1008 13:18:36.052680 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="ee128310-e5f8-4197-9c56-caa724cd15fc" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.192:8775/\": read tcp 10.217.0.2:60284->10.217.0.192:8775: read: connection reset by peer" Oct 08 13:18:36 crc kubenswrapper[4735]: I1008 13:18:36.509619 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 13:18:36 crc kubenswrapper[4735]: I1008 13:18:36.678501 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee128310-e5f8-4197-9c56-caa724cd15fc-nova-metadata-tls-certs\") pod \"ee128310-e5f8-4197-9c56-caa724cd15fc\" (UID: \"ee128310-e5f8-4197-9c56-caa724cd15fc\") " Oct 08 13:18:36 crc kubenswrapper[4735]: I1008 13:18:36.678606 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzj76\" (UniqueName: \"kubernetes.io/projected/ee128310-e5f8-4197-9c56-caa724cd15fc-kube-api-access-rzj76\") pod \"ee128310-e5f8-4197-9c56-caa724cd15fc\" (UID: \"ee128310-e5f8-4197-9c56-caa724cd15fc\") " Oct 08 13:18:36 crc kubenswrapper[4735]: I1008 13:18:36.678693 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee128310-e5f8-4197-9c56-caa724cd15fc-config-data\") pod \"ee128310-e5f8-4197-9c56-caa724cd15fc\" (UID: \"ee128310-e5f8-4197-9c56-caa724cd15fc\") " Oct 08 13:18:36 crc kubenswrapper[4735]: I1008 13:18:36.678749 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee128310-e5f8-4197-9c56-caa724cd15fc-combined-ca-bundle\") pod \"ee128310-e5f8-4197-9c56-caa724cd15fc\" (UID: \"ee128310-e5f8-4197-9c56-caa724cd15fc\") " Oct 08 13:18:36 crc kubenswrapper[4735]: I1008 13:18:36.678835 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee128310-e5f8-4197-9c56-caa724cd15fc-logs\") pod \"ee128310-e5f8-4197-9c56-caa724cd15fc\" (UID: \"ee128310-e5f8-4197-9c56-caa724cd15fc\") " Oct 08 13:18:36 crc kubenswrapper[4735]: I1008 13:18:36.679579 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/ee128310-e5f8-4197-9c56-caa724cd15fc-logs" (OuterVolumeSpecName: "logs") pod "ee128310-e5f8-4197-9c56-caa724cd15fc" (UID: "ee128310-e5f8-4197-9c56-caa724cd15fc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:18:36 crc kubenswrapper[4735]: I1008 13:18:36.685704 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee128310-e5f8-4197-9c56-caa724cd15fc-kube-api-access-rzj76" (OuterVolumeSpecName: "kube-api-access-rzj76") pod "ee128310-e5f8-4197-9c56-caa724cd15fc" (UID: "ee128310-e5f8-4197-9c56-caa724cd15fc"). InnerVolumeSpecName "kube-api-access-rzj76". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:18:36 crc kubenswrapper[4735]: I1008 13:18:36.702309 4735 generic.go:334] "Generic (PLEG): container finished" podID="ee128310-e5f8-4197-9c56-caa724cd15fc" containerID="1c20037054f16163cc833fef0b60054bd2fbb164d84491c280e121742f97f477" exitCode=0 Oct 08 13:18:36 crc kubenswrapper[4735]: I1008 13:18:36.702351 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ee128310-e5f8-4197-9c56-caa724cd15fc","Type":"ContainerDied","Data":"1c20037054f16163cc833fef0b60054bd2fbb164d84491c280e121742f97f477"} Oct 08 13:18:36 crc kubenswrapper[4735]: I1008 13:18:36.702378 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ee128310-e5f8-4197-9c56-caa724cd15fc","Type":"ContainerDied","Data":"927fec1187ffc26cdbbf7cc3ae7df8bb9395ae0b402be62c7c3fc4d0a94b295e"} Oct 08 13:18:36 crc kubenswrapper[4735]: I1008 13:18:36.702393 4735 scope.go:117] "RemoveContainer" containerID="1c20037054f16163cc833fef0b60054bd2fbb164d84491c280e121742f97f477" Oct 08 13:18:36 crc kubenswrapper[4735]: I1008 13:18:36.702517 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 13:18:36 crc kubenswrapper[4735]: I1008 13:18:36.720257 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee128310-e5f8-4197-9c56-caa724cd15fc-config-data" (OuterVolumeSpecName: "config-data") pod "ee128310-e5f8-4197-9c56-caa724cd15fc" (UID: "ee128310-e5f8-4197-9c56-caa724cd15fc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:18:36 crc kubenswrapper[4735]: I1008 13:18:36.724525 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee128310-e5f8-4197-9c56-caa724cd15fc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ee128310-e5f8-4197-9c56-caa724cd15fc" (UID: "ee128310-e5f8-4197-9c56-caa724cd15fc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:18:36 crc kubenswrapper[4735]: I1008 13:18:36.731677 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee128310-e5f8-4197-9c56-caa724cd15fc-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "ee128310-e5f8-4197-9c56-caa724cd15fc" (UID: "ee128310-e5f8-4197-9c56-caa724cd15fc"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:18:36 crc kubenswrapper[4735]: I1008 13:18:36.781545 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee128310-e5f8-4197-9c56-caa724cd15fc-logs\") on node \"crc\" DevicePath \"\"" Oct 08 13:18:36 crc kubenswrapper[4735]: I1008 13:18:36.781590 4735 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee128310-e5f8-4197-9c56-caa724cd15fc-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 13:18:36 crc kubenswrapper[4735]: I1008 13:18:36.781600 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzj76\" (UniqueName: \"kubernetes.io/projected/ee128310-e5f8-4197-9c56-caa724cd15fc-kube-api-access-rzj76\") on node \"crc\" DevicePath \"\"" Oct 08 13:18:36 crc kubenswrapper[4735]: I1008 13:18:36.781611 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee128310-e5f8-4197-9c56-caa724cd15fc-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 13:18:36 crc kubenswrapper[4735]: I1008 13:18:36.781622 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee128310-e5f8-4197-9c56-caa724cd15fc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 13:18:36 crc kubenswrapper[4735]: I1008 13:18:36.790643 4735 scope.go:117] "RemoveContainer" containerID="54a9922b05de266a075643e7f846beead00a52ac35249d6b83aad5904de910fd" Oct 08 13:18:36 crc kubenswrapper[4735]: I1008 13:18:36.818076 4735 scope.go:117] "RemoveContainer" containerID="1c20037054f16163cc833fef0b60054bd2fbb164d84491c280e121742f97f477" Oct 08 13:18:36 crc kubenswrapper[4735]: E1008 13:18:36.818686 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"1c20037054f16163cc833fef0b60054bd2fbb164d84491c280e121742f97f477\": container with ID starting with 1c20037054f16163cc833fef0b60054bd2fbb164d84491c280e121742f97f477 not found: ID does not exist" containerID="1c20037054f16163cc833fef0b60054bd2fbb164d84491c280e121742f97f477" Oct 08 13:18:36 crc kubenswrapper[4735]: I1008 13:18:36.818735 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c20037054f16163cc833fef0b60054bd2fbb164d84491c280e121742f97f477"} err="failed to get container status \"1c20037054f16163cc833fef0b60054bd2fbb164d84491c280e121742f97f477\": rpc error: code = NotFound desc = could not find container \"1c20037054f16163cc833fef0b60054bd2fbb164d84491c280e121742f97f477\": container with ID starting with 1c20037054f16163cc833fef0b60054bd2fbb164d84491c280e121742f97f477 not found: ID does not exist" Oct 08 13:18:36 crc kubenswrapper[4735]: I1008 13:18:36.818775 4735 scope.go:117] "RemoveContainer" containerID="54a9922b05de266a075643e7f846beead00a52ac35249d6b83aad5904de910fd" Oct 08 13:18:36 crc kubenswrapper[4735]: E1008 13:18:36.819227 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54a9922b05de266a075643e7f846beead00a52ac35249d6b83aad5904de910fd\": container with ID starting with 54a9922b05de266a075643e7f846beead00a52ac35249d6b83aad5904de910fd not found: ID does not exist" containerID="54a9922b05de266a075643e7f846beead00a52ac35249d6b83aad5904de910fd" Oct 08 13:18:36 crc kubenswrapper[4735]: I1008 13:18:36.819276 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54a9922b05de266a075643e7f846beead00a52ac35249d6b83aad5904de910fd"} err="failed to get container status \"54a9922b05de266a075643e7f846beead00a52ac35249d6b83aad5904de910fd\": rpc error: code = NotFound desc = could not find container \"54a9922b05de266a075643e7f846beead00a52ac35249d6b83aad5904de910fd\": container with ID 
starting with 54a9922b05de266a075643e7f846beead00a52ac35249d6b83aad5904de910fd not found: ID does not exist" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.032092 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.041008 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.050771 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 08 13:18:37 crc kubenswrapper[4735]: E1008 13:18:37.051206 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7386076f-50b6-4cc9-b774-da32d6add528" containerName="nova-manage" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.051232 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="7386076f-50b6-4cc9-b774-da32d6add528" containerName="nova-manage" Oct 08 13:18:37 crc kubenswrapper[4735]: E1008 13:18:37.051248 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee128310-e5f8-4197-9c56-caa724cd15fc" containerName="nova-metadata-metadata" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.051254 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee128310-e5f8-4197-9c56-caa724cd15fc" containerName="nova-metadata-metadata" Oct 08 13:18:37 crc kubenswrapper[4735]: E1008 13:18:37.051276 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f83be1f-9107-43fc-afad-d20a16cc0911" containerName="init" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.051282 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f83be1f-9107-43fc-afad-d20a16cc0911" containerName="init" Oct 08 13:18:37 crc kubenswrapper[4735]: E1008 13:18:37.051296 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee128310-e5f8-4197-9c56-caa724cd15fc" containerName="nova-metadata-log" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 
13:18:37.051301 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee128310-e5f8-4197-9c56-caa724cd15fc" containerName="nova-metadata-log" Oct 08 13:18:37 crc kubenswrapper[4735]: E1008 13:18:37.051312 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f83be1f-9107-43fc-afad-d20a16cc0911" containerName="dnsmasq-dns" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.051317 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f83be1f-9107-43fc-afad-d20a16cc0911" containerName="dnsmasq-dns" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.051550 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f83be1f-9107-43fc-afad-d20a16cc0911" containerName="dnsmasq-dns" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.051574 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="7386076f-50b6-4cc9-b774-da32d6add528" containerName="nova-manage" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.051591 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee128310-e5f8-4197-9c56-caa724cd15fc" containerName="nova-metadata-log" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.051612 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee128310-e5f8-4197-9c56-caa724cd15fc" containerName="nova-metadata-metadata" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.052874 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.055438 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.058693 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.082884 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.192962 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1db01523-cd76-4a9d-8063-e024dd919879-config-data\") pod \"nova-metadata-0\" (UID: \"1db01523-cd76-4a9d-8063-e024dd919879\") " pod="openstack/nova-metadata-0" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.193041 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1db01523-cd76-4a9d-8063-e024dd919879-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1db01523-cd76-4a9d-8063-e024dd919879\") " pod="openstack/nova-metadata-0" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.193122 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5k48\" (UniqueName: \"kubernetes.io/projected/1db01523-cd76-4a9d-8063-e024dd919879-kube-api-access-g5k48\") pod \"nova-metadata-0\" (UID: \"1db01523-cd76-4a9d-8063-e024dd919879\") " pod="openstack/nova-metadata-0" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.193200 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/1db01523-cd76-4a9d-8063-e024dd919879-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"1db01523-cd76-4a9d-8063-e024dd919879\") " pod="openstack/nova-metadata-0" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.193225 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1db01523-cd76-4a9d-8063-e024dd919879-logs\") pod \"nova-metadata-0\" (UID: \"1db01523-cd76-4a9d-8063-e024dd919879\") " pod="openstack/nova-metadata-0" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.294727 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1db01523-cd76-4a9d-8063-e024dd919879-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"1db01523-cd76-4a9d-8063-e024dd919879\") " pod="openstack/nova-metadata-0" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.295184 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1db01523-cd76-4a9d-8063-e024dd919879-logs\") pod \"nova-metadata-0\" (UID: \"1db01523-cd76-4a9d-8063-e024dd919879\") " pod="openstack/nova-metadata-0" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.295364 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1db01523-cd76-4a9d-8063-e024dd919879-logs\") pod \"nova-metadata-0\" (UID: \"1db01523-cd76-4a9d-8063-e024dd919879\") " pod="openstack/nova-metadata-0" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.295391 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1db01523-cd76-4a9d-8063-e024dd919879-config-data\") pod \"nova-metadata-0\" (UID: \"1db01523-cd76-4a9d-8063-e024dd919879\") " pod="openstack/nova-metadata-0" Oct 08 13:18:37 crc 
kubenswrapper[4735]: I1008 13:18:37.295499 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1db01523-cd76-4a9d-8063-e024dd919879-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1db01523-cd76-4a9d-8063-e024dd919879\") " pod="openstack/nova-metadata-0" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.295672 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5k48\" (UniqueName: \"kubernetes.io/projected/1db01523-cd76-4a9d-8063-e024dd919879-kube-api-access-g5k48\") pod \"nova-metadata-0\" (UID: \"1db01523-cd76-4a9d-8063-e024dd919879\") " pod="openstack/nova-metadata-0" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.300373 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1db01523-cd76-4a9d-8063-e024dd919879-config-data\") pod \"nova-metadata-0\" (UID: \"1db01523-cd76-4a9d-8063-e024dd919879\") " pod="openstack/nova-metadata-0" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.300637 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1db01523-cd76-4a9d-8063-e024dd919879-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"1db01523-cd76-4a9d-8063-e024dd919879\") " pod="openstack/nova-metadata-0" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.301284 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1db01523-cd76-4a9d-8063-e024dd919879-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1db01523-cd76-4a9d-8063-e024dd919879\") " pod="openstack/nova-metadata-0" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.326695 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5k48\" (UniqueName: 
\"kubernetes.io/projected/1db01523-cd76-4a9d-8063-e024dd919879-kube-api-access-g5k48\") pod \"nova-metadata-0\" (UID: \"1db01523-cd76-4a9d-8063-e024dd919879\") " pod="openstack/nova-metadata-0" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.376411 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.485031 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.601182 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee510ab0-ef69-47d0-9b5b-6f8ae4b93601-config-data\") pod \"ee510ab0-ef69-47d0-9b5b-6f8ae4b93601\" (UID: \"ee510ab0-ef69-47d0-9b5b-6f8ae4b93601\") " Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.601727 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjczd\" (UniqueName: \"kubernetes.io/projected/ee510ab0-ef69-47d0-9b5b-6f8ae4b93601-kube-api-access-qjczd\") pod \"ee510ab0-ef69-47d0-9b5b-6f8ae4b93601\" (UID: \"ee510ab0-ef69-47d0-9b5b-6f8ae4b93601\") " Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.601885 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee510ab0-ef69-47d0-9b5b-6f8ae4b93601-combined-ca-bundle\") pod \"ee510ab0-ef69-47d0-9b5b-6f8ae4b93601\" (UID: \"ee510ab0-ef69-47d0-9b5b-6f8ae4b93601\") " Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.609718 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee510ab0-ef69-47d0-9b5b-6f8ae4b93601-kube-api-access-qjczd" (OuterVolumeSpecName: "kube-api-access-qjczd") pod "ee510ab0-ef69-47d0-9b5b-6f8ae4b93601" (UID: "ee510ab0-ef69-47d0-9b5b-6f8ae4b93601"). 
InnerVolumeSpecName "kube-api-access-qjczd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.627842 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee510ab0-ef69-47d0-9b5b-6f8ae4b93601-config-data" (OuterVolumeSpecName: "config-data") pod "ee510ab0-ef69-47d0-9b5b-6f8ae4b93601" (UID: "ee510ab0-ef69-47d0-9b5b-6f8ae4b93601"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.644695 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee510ab0-ef69-47d0-9b5b-6f8ae4b93601-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ee510ab0-ef69-47d0-9b5b-6f8ae4b93601" (UID: "ee510ab0-ef69-47d0-9b5b-6f8ae4b93601"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.690421 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.705643 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee510ab0-ef69-47d0-9b5b-6f8ae4b93601-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.705701 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjczd\" (UniqueName: \"kubernetes.io/projected/ee510ab0-ef69-47d0-9b5b-6f8ae4b93601-kube-api-access-qjczd\") on node \"crc\" DevicePath \"\"" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.705721 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee510ab0-ef69-47d0-9b5b-6f8ae4b93601-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.718721 4735 generic.go:334] "Generic (PLEG): container finished" podID="a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe" containerID="b23279fdc01384863e20083167d3f7bbbf83e132d82bd4920d347aecc7befe77" exitCode=0 Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.718855 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.719693 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe","Type":"ContainerDied","Data":"b23279fdc01384863e20083167d3f7bbbf83e132d82bd4920d347aecc7befe77"} Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.719730 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe","Type":"ContainerDied","Data":"417941cd093a72fd5988a3e533aab8fa98ac01abe398bc124070235570402db8"} Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.719752 4735 scope.go:117] "RemoveContainer" containerID="b23279fdc01384863e20083167d3f7bbbf83e132d82bd4920d347aecc7befe77" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.725101 4735 generic.go:334] "Generic (PLEG): container finished" podID="ee510ab0-ef69-47d0-9b5b-6f8ae4b93601" containerID="86ca0a7bab642bbd7a78023f6fc0a6dbf5894fee35b063fa337c0d9e9e762d24" exitCode=0 Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.725145 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ee510ab0-ef69-47d0-9b5b-6f8ae4b93601","Type":"ContainerDied","Data":"86ca0a7bab642bbd7a78023f6fc0a6dbf5894fee35b063fa337c0d9e9e762d24"} Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.725160 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.725176 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ee510ab0-ef69-47d0-9b5b-6f8ae4b93601","Type":"ContainerDied","Data":"863833f1f5aa88fa5ed67eb59c2783f052cf0f1af0a180c347006bbfa2246ce5"} Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.763792 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.771808 4735 scope.go:117] "RemoveContainer" containerID="e29d2b69411ea78730e6078de70640933c7ea07c84189ba074871bcf5f25c9bb" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.772046 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.788529 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 13:18:37 crc kubenswrapper[4735]: E1008 13:18:37.789100 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe" containerName="nova-api-api" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.789119 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe" containerName="nova-api-api" Oct 08 13:18:37 crc kubenswrapper[4735]: E1008 13:18:37.789146 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee510ab0-ef69-47d0-9b5b-6f8ae4b93601" containerName="nova-scheduler-scheduler" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.789152 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee510ab0-ef69-47d0-9b5b-6f8ae4b93601" containerName="nova-scheduler-scheduler" Oct 08 13:18:37 crc kubenswrapper[4735]: E1008 13:18:37.789159 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe" containerName="nova-api-log" 
Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.789165 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe" containerName="nova-api-log" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.789330 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee510ab0-ef69-47d0-9b5b-6f8ae4b93601" containerName="nova-scheduler-scheduler" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.789344 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe" containerName="nova-api-log" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.789365 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe" containerName="nova-api-api" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.790012 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.793272 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.797894 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.801509 4735 scope.go:117] "RemoveContainer" containerID="b23279fdc01384863e20083167d3f7bbbf83e132d82bd4920d347aecc7befe77" Oct 08 13:18:37 crc kubenswrapper[4735]: E1008 13:18:37.804180 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b23279fdc01384863e20083167d3f7bbbf83e132d82bd4920d347aecc7befe77\": container with ID starting with b23279fdc01384863e20083167d3f7bbbf83e132d82bd4920d347aecc7befe77 not found: ID does not exist" containerID="b23279fdc01384863e20083167d3f7bbbf83e132d82bd4920d347aecc7befe77" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 
13:18:37.804245 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b23279fdc01384863e20083167d3f7bbbf83e132d82bd4920d347aecc7befe77"} err="failed to get container status \"b23279fdc01384863e20083167d3f7bbbf83e132d82bd4920d347aecc7befe77\": rpc error: code = NotFound desc = could not find container \"b23279fdc01384863e20083167d3f7bbbf83e132d82bd4920d347aecc7befe77\": container with ID starting with b23279fdc01384863e20083167d3f7bbbf83e132d82bd4920d347aecc7befe77 not found: ID does not exist" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.804269 4735 scope.go:117] "RemoveContainer" containerID="e29d2b69411ea78730e6078de70640933c7ea07c84189ba074871bcf5f25c9bb" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.806574 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe-combined-ca-bundle\") pod \"a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe\" (UID: \"a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe\") " Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.806626 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe-public-tls-certs\") pod \"a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe\" (UID: \"a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe\") " Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.806717 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe-internal-tls-certs\") pod \"a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe\" (UID: \"a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe\") " Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.806779 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe-logs\") pod \"a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe\" (UID: \"a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe\") " Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.806888 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x27q7\" (UniqueName: \"kubernetes.io/projected/a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe-kube-api-access-x27q7\") pod \"a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe\" (UID: \"a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe\") " Oct 08 13:18:37 crc kubenswrapper[4735]: E1008 13:18:37.807096 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e29d2b69411ea78730e6078de70640933c7ea07c84189ba074871bcf5f25c9bb\": container with ID starting with e29d2b69411ea78730e6078de70640933c7ea07c84189ba074871bcf5f25c9bb not found: ID does not exist" containerID="e29d2b69411ea78730e6078de70640933c7ea07c84189ba074871bcf5f25c9bb" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.807182 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e29d2b69411ea78730e6078de70640933c7ea07c84189ba074871bcf5f25c9bb"} err="failed to get container status \"e29d2b69411ea78730e6078de70640933c7ea07c84189ba074871bcf5f25c9bb\": rpc error: code = NotFound desc = could not find container \"e29d2b69411ea78730e6078de70640933c7ea07c84189ba074871bcf5f25c9bb\": container with ID starting with e29d2b69411ea78730e6078de70640933c7ea07c84189ba074871bcf5f25c9bb not found: ID does not exist" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.807407 4735 scope.go:117] "RemoveContainer" containerID="86ca0a7bab642bbd7a78023f6fc0a6dbf5894fee35b063fa337c0d9e9e762d24" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.807797 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe-config-data\") pod \"a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe\" (UID: \"a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe\") " Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.810853 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe-logs" (OuterVolumeSpecName: "logs") pod "a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe" (UID: "a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.813265 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe-kube-api-access-x27q7" (OuterVolumeSpecName: "kube-api-access-x27q7") pod "a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe" (UID: "a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe"). InnerVolumeSpecName "kube-api-access-x27q7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.829663 4735 scope.go:117] "RemoveContainer" containerID="86ca0a7bab642bbd7a78023f6fc0a6dbf5894fee35b063fa337c0d9e9e762d24" Oct 08 13:18:37 crc kubenswrapper[4735]: E1008 13:18:37.830185 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86ca0a7bab642bbd7a78023f6fc0a6dbf5894fee35b063fa337c0d9e9e762d24\": container with ID starting with 86ca0a7bab642bbd7a78023f6fc0a6dbf5894fee35b063fa337c0d9e9e762d24 not found: ID does not exist" containerID="86ca0a7bab642bbd7a78023f6fc0a6dbf5894fee35b063fa337c0d9e9e762d24" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.830353 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86ca0a7bab642bbd7a78023f6fc0a6dbf5894fee35b063fa337c0d9e9e762d24"} err="failed to get container status 
\"86ca0a7bab642bbd7a78023f6fc0a6dbf5894fee35b063fa337c0d9e9e762d24\": rpc error: code = NotFound desc = could not find container \"86ca0a7bab642bbd7a78023f6fc0a6dbf5894fee35b063fa337c0d9e9e762d24\": container with ID starting with 86ca0a7bab642bbd7a78023f6fc0a6dbf5894fee35b063fa337c0d9e9e762d24 not found: ID does not exist" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.834199 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe-config-data" (OuterVolumeSpecName: "config-data") pod "a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe" (UID: "a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.837013 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe" (UID: "a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.856741 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe" (UID: "a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.868791 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe" (UID: "a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe"). 
InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:18:37 crc kubenswrapper[4735]: W1008 13:18:37.875975 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1db01523_cd76_4a9d_8063_e024dd919879.slice/crio-9ae6450959e862d9763d81e9fae2191a7b6b4f716035130d03d9f88e6d1c3490 WatchSource:0}: Error finding container 9ae6450959e862d9763d81e9fae2191a7b6b4f716035130d03d9f88e6d1c3490: Status 404 returned error can't find the container with id 9ae6450959e862d9763d81e9fae2191a7b6b4f716035130d03d9f88e6d1c3490 Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.876328 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.912082 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53890bab-61cc-4f19-908a-33c085da8d33-config-data\") pod \"nova-scheduler-0\" (UID: \"53890bab-61cc-4f19-908a-33c085da8d33\") " pod="openstack/nova-scheduler-0" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.912233 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53890bab-61cc-4f19-908a-33c085da8d33-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"53890bab-61cc-4f19-908a-33c085da8d33\") " pod="openstack/nova-scheduler-0" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.912265 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6csk2\" (UniqueName: \"kubernetes.io/projected/53890bab-61cc-4f19-908a-33c085da8d33-kube-api-access-6csk2\") pod \"nova-scheduler-0\" (UID: \"53890bab-61cc-4f19-908a-33c085da8d33\") " pod="openstack/nova-scheduler-0" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 
13:18:37.912681 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe-logs\") on node \"crc\" DevicePath \"\"" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.912714 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x27q7\" (UniqueName: \"kubernetes.io/projected/a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe-kube-api-access-x27q7\") on node \"crc\" DevicePath \"\"" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.912726 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.912737 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.912745 4735 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 13:18:37 crc kubenswrapper[4735]: I1008 13:18:37.912753 4735 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 13:18:38 crc kubenswrapper[4735]: I1008 13:18:38.014372 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53890bab-61cc-4f19-908a-33c085da8d33-config-data\") pod \"nova-scheduler-0\" (UID: \"53890bab-61cc-4f19-908a-33c085da8d33\") " pod="openstack/nova-scheduler-0" Oct 08 13:18:38 crc kubenswrapper[4735]: I1008 13:18:38.014468 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53890bab-61cc-4f19-908a-33c085da8d33-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"53890bab-61cc-4f19-908a-33c085da8d33\") " pod="openstack/nova-scheduler-0" Oct 08 13:18:38 crc kubenswrapper[4735]: I1008 13:18:38.014611 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6csk2\" (UniqueName: \"kubernetes.io/projected/53890bab-61cc-4f19-908a-33c085da8d33-kube-api-access-6csk2\") pod \"nova-scheduler-0\" (UID: \"53890bab-61cc-4f19-908a-33c085da8d33\") " pod="openstack/nova-scheduler-0" Oct 08 13:18:38 crc kubenswrapper[4735]: I1008 13:18:38.019305 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53890bab-61cc-4f19-908a-33c085da8d33-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"53890bab-61cc-4f19-908a-33c085da8d33\") " pod="openstack/nova-scheduler-0" Oct 08 13:18:38 crc kubenswrapper[4735]: I1008 13:18:38.020540 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53890bab-61cc-4f19-908a-33c085da8d33-config-data\") pod \"nova-scheduler-0\" (UID: \"53890bab-61cc-4f19-908a-33c085da8d33\") " pod="openstack/nova-scheduler-0" Oct 08 13:18:38 crc kubenswrapper[4735]: I1008 13:18:38.032301 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6csk2\" (UniqueName: \"kubernetes.io/projected/53890bab-61cc-4f19-908a-33c085da8d33-kube-api-access-6csk2\") pod \"nova-scheduler-0\" (UID: \"53890bab-61cc-4f19-908a-33c085da8d33\") " pod="openstack/nova-scheduler-0" Oct 08 13:18:38 crc kubenswrapper[4735]: I1008 13:18:38.063684 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 08 13:18:38 crc kubenswrapper[4735]: I1008 13:18:38.080659 4735 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 08 13:18:38 crc kubenswrapper[4735]: I1008 13:18:38.091647 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 08 13:18:38 crc kubenswrapper[4735]: I1008 13:18:38.093744 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 08 13:18:38 crc kubenswrapper[4735]: I1008 13:18:38.097512 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 08 13:18:38 crc kubenswrapper[4735]: I1008 13:18:38.097718 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 08 13:18:38 crc kubenswrapper[4735]: I1008 13:18:38.098790 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 08 13:18:38 crc kubenswrapper[4735]: I1008 13:18:38.118718 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 13:18:38 crc kubenswrapper[4735]: I1008 13:18:38.119010 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 08 13:18:38 crc kubenswrapper[4735]: I1008 13:18:38.221948 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97384a50-9f20-4cbf-8b02-afffe5ddc791-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"97384a50-9f20-4cbf-8b02-afffe5ddc791\") " pod="openstack/nova-api-0" Oct 08 13:18:38 crc kubenswrapper[4735]: I1008 13:18:38.222036 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/97384a50-9f20-4cbf-8b02-afffe5ddc791-internal-tls-certs\") pod \"nova-api-0\" (UID: \"97384a50-9f20-4cbf-8b02-afffe5ddc791\") " pod="openstack/nova-api-0" Oct 08 13:18:38 crc kubenswrapper[4735]: I1008 
13:18:38.222090 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/97384a50-9f20-4cbf-8b02-afffe5ddc791-public-tls-certs\") pod \"nova-api-0\" (UID: \"97384a50-9f20-4cbf-8b02-afffe5ddc791\") " pod="openstack/nova-api-0" Oct 08 13:18:38 crc kubenswrapper[4735]: I1008 13:18:38.222129 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97384a50-9f20-4cbf-8b02-afffe5ddc791-config-data\") pod \"nova-api-0\" (UID: \"97384a50-9f20-4cbf-8b02-afffe5ddc791\") " pod="openstack/nova-api-0" Oct 08 13:18:38 crc kubenswrapper[4735]: I1008 13:18:38.222217 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82j5p\" (UniqueName: \"kubernetes.io/projected/97384a50-9f20-4cbf-8b02-afffe5ddc791-kube-api-access-82j5p\") pod \"nova-api-0\" (UID: \"97384a50-9f20-4cbf-8b02-afffe5ddc791\") " pod="openstack/nova-api-0" Oct 08 13:18:38 crc kubenswrapper[4735]: I1008 13:18:38.222318 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97384a50-9f20-4cbf-8b02-afffe5ddc791-logs\") pod \"nova-api-0\" (UID: \"97384a50-9f20-4cbf-8b02-afffe5ddc791\") " pod="openstack/nova-api-0" Oct 08 13:18:38 crc kubenswrapper[4735]: I1008 13:18:38.324469 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97384a50-9f20-4cbf-8b02-afffe5ddc791-logs\") pod \"nova-api-0\" (UID: \"97384a50-9f20-4cbf-8b02-afffe5ddc791\") " pod="openstack/nova-api-0" Oct 08 13:18:38 crc kubenswrapper[4735]: I1008 13:18:38.324856 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/97384a50-9f20-4cbf-8b02-afffe5ddc791-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"97384a50-9f20-4cbf-8b02-afffe5ddc791\") " pod="openstack/nova-api-0" Oct 08 13:18:38 crc kubenswrapper[4735]: I1008 13:18:38.324909 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/97384a50-9f20-4cbf-8b02-afffe5ddc791-internal-tls-certs\") pod \"nova-api-0\" (UID: \"97384a50-9f20-4cbf-8b02-afffe5ddc791\") " pod="openstack/nova-api-0" Oct 08 13:18:38 crc kubenswrapper[4735]: I1008 13:18:38.324944 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/97384a50-9f20-4cbf-8b02-afffe5ddc791-public-tls-certs\") pod \"nova-api-0\" (UID: \"97384a50-9f20-4cbf-8b02-afffe5ddc791\") " pod="openstack/nova-api-0" Oct 08 13:18:38 crc kubenswrapper[4735]: I1008 13:18:38.324971 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97384a50-9f20-4cbf-8b02-afffe5ddc791-config-data\") pod \"nova-api-0\" (UID: \"97384a50-9f20-4cbf-8b02-afffe5ddc791\") " pod="openstack/nova-api-0" Oct 08 13:18:38 crc kubenswrapper[4735]: I1008 13:18:38.325017 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82j5p\" (UniqueName: \"kubernetes.io/projected/97384a50-9f20-4cbf-8b02-afffe5ddc791-kube-api-access-82j5p\") pod \"nova-api-0\" (UID: \"97384a50-9f20-4cbf-8b02-afffe5ddc791\") " pod="openstack/nova-api-0" Oct 08 13:18:38 crc kubenswrapper[4735]: I1008 13:18:38.325787 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97384a50-9f20-4cbf-8b02-afffe5ddc791-logs\") pod \"nova-api-0\" (UID: \"97384a50-9f20-4cbf-8b02-afffe5ddc791\") " pod="openstack/nova-api-0" Oct 08 13:18:38 crc kubenswrapper[4735]: I1008 13:18:38.331223 4735 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/97384a50-9f20-4cbf-8b02-afffe5ddc791-internal-tls-certs\") pod \"nova-api-0\" (UID: \"97384a50-9f20-4cbf-8b02-afffe5ddc791\") " pod="openstack/nova-api-0" Oct 08 13:18:38 crc kubenswrapper[4735]: I1008 13:18:38.331249 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/97384a50-9f20-4cbf-8b02-afffe5ddc791-public-tls-certs\") pod \"nova-api-0\" (UID: \"97384a50-9f20-4cbf-8b02-afffe5ddc791\") " pod="openstack/nova-api-0" Oct 08 13:18:38 crc kubenswrapper[4735]: I1008 13:18:38.331314 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97384a50-9f20-4cbf-8b02-afffe5ddc791-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"97384a50-9f20-4cbf-8b02-afffe5ddc791\") " pod="openstack/nova-api-0" Oct 08 13:18:38 crc kubenswrapper[4735]: I1008 13:18:38.336212 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97384a50-9f20-4cbf-8b02-afffe5ddc791-config-data\") pod \"nova-api-0\" (UID: \"97384a50-9f20-4cbf-8b02-afffe5ddc791\") " pod="openstack/nova-api-0" Oct 08 13:18:38 crc kubenswrapper[4735]: I1008 13:18:38.341044 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82j5p\" (UniqueName: \"kubernetes.io/projected/97384a50-9f20-4cbf-8b02-afffe5ddc791-kube-api-access-82j5p\") pod \"nova-api-0\" (UID: \"97384a50-9f20-4cbf-8b02-afffe5ddc791\") " pod="openstack/nova-api-0" Oct 08 13:18:38 crc kubenswrapper[4735]: I1008 13:18:38.575860 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 13:18:38 crc kubenswrapper[4735]: I1008 13:18:38.577683 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 08 13:18:38 crc kubenswrapper[4735]: I1008 13:18:38.741313 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1db01523-cd76-4a9d-8063-e024dd919879","Type":"ContainerStarted","Data":"418d5ac0c79b3bbb212fb99f2df15049ca6cc912c6272d8bf5adfab9f972e7cf"} Oct 08 13:18:38 crc kubenswrapper[4735]: I1008 13:18:38.741745 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1db01523-cd76-4a9d-8063-e024dd919879","Type":"ContainerStarted","Data":"6fcbdbcd2a5b227d1b93c5af337c1732e6685ef4e33f54e3bddf01cee27e6906"} Oct 08 13:18:38 crc kubenswrapper[4735]: I1008 13:18:38.741780 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1db01523-cd76-4a9d-8063-e024dd919879","Type":"ContainerStarted","Data":"9ae6450959e862d9763d81e9fae2191a7b6b4f716035130d03d9f88e6d1c3490"} Oct 08 13:18:38 crc kubenswrapper[4735]: I1008 13:18:38.746015 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"53890bab-61cc-4f19-908a-33c085da8d33","Type":"ContainerStarted","Data":"45587d9115eeb1cbcf8295e2c9ae350f24824252a2ca6332e888f3b57d8471af"} Oct 08 13:18:38 crc kubenswrapper[4735]: I1008 13:18:38.763526 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.7635086389999999 podStartE2EDuration="1.763508639s" podCreationTimestamp="2025-10-08 13:18:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 13:18:38.760106881 +0000 UTC m=+1368.568411425" watchObservedRunningTime="2025-10-08 13:18:38.763508639 +0000 UTC m=+1368.571813183" Oct 08 13:18:38 crc kubenswrapper[4735]: I1008 13:18:38.906164 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe" path="/var/lib/kubelet/pods/a5c49afc-36d6-4f1e-aeab-9b4acb9b7ffe/volumes" Oct 08 13:18:38 crc kubenswrapper[4735]: I1008 13:18:38.907139 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee128310-e5f8-4197-9c56-caa724cd15fc" path="/var/lib/kubelet/pods/ee128310-e5f8-4197-9c56-caa724cd15fc/volumes" Oct 08 13:18:38 crc kubenswrapper[4735]: I1008 13:18:38.907723 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee510ab0-ef69-47d0-9b5b-6f8ae4b93601" path="/var/lib/kubelet/pods/ee510ab0-ef69-47d0-9b5b-6f8ae4b93601/volumes" Oct 08 13:18:39 crc kubenswrapper[4735]: I1008 13:18:39.023845 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 08 13:18:39 crc kubenswrapper[4735]: I1008 13:18:39.763341 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"97384a50-9f20-4cbf-8b02-afffe5ddc791","Type":"ContainerStarted","Data":"d6ed228dfc5748efb601fafc99f157a758cb829f235af05f716ef363f89fcc14"} Oct 08 13:18:39 crc kubenswrapper[4735]: I1008 13:18:39.764531 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"97384a50-9f20-4cbf-8b02-afffe5ddc791","Type":"ContainerStarted","Data":"205355fb6f24a0939888db6199eeaa319bf35bb6b065a98fc683f3a28c8fa854"} Oct 08 13:18:39 crc kubenswrapper[4735]: I1008 13:18:39.764559 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"97384a50-9f20-4cbf-8b02-afffe5ddc791","Type":"ContainerStarted","Data":"059d46787e8c02976e8aea98ad1dfa228b5445ba55076e35881403183aae18e4"} Oct 08 13:18:39 crc kubenswrapper[4735]: I1008 13:18:39.766279 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"53890bab-61cc-4f19-908a-33c085da8d33","Type":"ContainerStarted","Data":"7d7ad1478ae29b27029b1887e1c33fab874aabb21d8774256a3139c949c54c49"} Oct 08 13:18:39 crc 
kubenswrapper[4735]: I1008 13:18:39.787777 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.787758746 podStartE2EDuration="1.787758746s" podCreationTimestamp="2025-10-08 13:18:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 13:18:39.786866963 +0000 UTC m=+1369.595171527" watchObservedRunningTime="2025-10-08 13:18:39.787758746 +0000 UTC m=+1369.596063290" Oct 08 13:18:39 crc kubenswrapper[4735]: I1008 13:18:39.814927 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.814904201 podStartE2EDuration="2.814904201s" podCreationTimestamp="2025-10-08 13:18:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 13:18:39.806447991 +0000 UTC m=+1369.614752535" watchObservedRunningTime="2025-10-08 13:18:39.814904201 +0000 UTC m=+1369.623208745" Oct 08 13:18:42 crc kubenswrapper[4735]: I1008 13:18:42.376643 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 08 13:18:42 crc kubenswrapper[4735]: I1008 13:18:42.377018 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 08 13:18:43 crc kubenswrapper[4735]: I1008 13:18:43.119151 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 08 13:18:47 crc kubenswrapper[4735]: I1008 13:18:47.376847 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 08 13:18:47 crc kubenswrapper[4735]: I1008 13:18:47.377359 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 08 13:18:48 crc kubenswrapper[4735]: I1008 13:18:48.119513 4735 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 08 13:18:48 crc kubenswrapper[4735]: I1008 13:18:48.148716 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 08 13:18:48 crc kubenswrapper[4735]: I1008 13:18:48.390766 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="1db01523-cd76-4a9d-8063-e024dd919879" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.200:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 08 13:18:48 crc kubenswrapper[4735]: I1008 13:18:48.390782 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="1db01523-cd76-4a9d-8063-e024dd919879" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.200:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 08 13:18:48 crc kubenswrapper[4735]: I1008 13:18:48.578665 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 08 13:18:48 crc kubenswrapper[4735]: I1008 13:18:48.578727 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 08 13:18:48 crc kubenswrapper[4735]: I1008 13:18:48.909696 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 08 13:18:49 crc kubenswrapper[4735]: I1008 13:18:49.589674 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="97384a50-9f20-4cbf-8b02-afffe5ddc791" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.202:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 08 13:18:49 crc kubenswrapper[4735]: I1008 13:18:49.589677 4735 prober.go:107] "Probe failed" 
probeType="Startup" pod="openstack/nova-api-0" podUID="97384a50-9f20-4cbf-8b02-afffe5ddc791" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.202:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 08 13:18:50 crc kubenswrapper[4735]: I1008 13:18:50.948815 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 08 13:18:57 crc kubenswrapper[4735]: I1008 13:18:57.382795 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 08 13:18:57 crc kubenswrapper[4735]: I1008 13:18:57.383412 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 08 13:18:57 crc kubenswrapper[4735]: I1008 13:18:57.389802 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 08 13:18:57 crc kubenswrapper[4735]: I1008 13:18:57.391648 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 08 13:18:58 crc kubenswrapper[4735]: I1008 13:18:58.587282 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 08 13:18:58 crc kubenswrapper[4735]: I1008 13:18:58.587973 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 08 13:18:58 crc kubenswrapper[4735]: I1008 13:18:58.588083 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 08 13:18:58 crc kubenswrapper[4735]: I1008 13:18:58.594807 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 08 13:18:58 crc kubenswrapper[4735]: I1008 13:18:58.939895 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 08 13:18:58 crc kubenswrapper[4735]: I1008 13:18:58.944963 4735 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 08 13:19:04 crc kubenswrapper[4735]: I1008 13:19:04.549267 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-scqw7"] Oct 08 13:19:04 crc kubenswrapper[4735]: I1008 13:19:04.551618 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-scqw7" Oct 08 13:19:04 crc kubenswrapper[4735]: I1008 13:19:04.565162 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-scqw7"] Oct 08 13:19:04 crc kubenswrapper[4735]: I1008 13:19:04.733463 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/983bfc3d-43e8-4190-854d-c3e9497754ae-catalog-content\") pod \"redhat-operators-scqw7\" (UID: \"983bfc3d-43e8-4190-854d-c3e9497754ae\") " pod="openshift-marketplace/redhat-operators-scqw7" Oct 08 13:19:04 crc kubenswrapper[4735]: I1008 13:19:04.733564 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nljj5\" (UniqueName: \"kubernetes.io/projected/983bfc3d-43e8-4190-854d-c3e9497754ae-kube-api-access-nljj5\") pod \"redhat-operators-scqw7\" (UID: \"983bfc3d-43e8-4190-854d-c3e9497754ae\") " pod="openshift-marketplace/redhat-operators-scqw7" Oct 08 13:19:04 crc kubenswrapper[4735]: I1008 13:19:04.733937 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/983bfc3d-43e8-4190-854d-c3e9497754ae-utilities\") pod \"redhat-operators-scqw7\" (UID: \"983bfc3d-43e8-4190-854d-c3e9497754ae\") " pod="openshift-marketplace/redhat-operators-scqw7" Oct 08 13:19:04 crc kubenswrapper[4735]: I1008 13:19:04.835738 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/983bfc3d-43e8-4190-854d-c3e9497754ae-utilities\") pod \"redhat-operators-scqw7\" (UID: \"983bfc3d-43e8-4190-854d-c3e9497754ae\") " pod="openshift-marketplace/redhat-operators-scqw7" Oct 08 13:19:04 crc kubenswrapper[4735]: I1008 13:19:04.835805 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/983bfc3d-43e8-4190-854d-c3e9497754ae-catalog-content\") pod \"redhat-operators-scqw7\" (UID: \"983bfc3d-43e8-4190-854d-c3e9497754ae\") " pod="openshift-marketplace/redhat-operators-scqw7" Oct 08 13:19:04 crc kubenswrapper[4735]: I1008 13:19:04.835847 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nljj5\" (UniqueName: \"kubernetes.io/projected/983bfc3d-43e8-4190-854d-c3e9497754ae-kube-api-access-nljj5\") pod \"redhat-operators-scqw7\" (UID: \"983bfc3d-43e8-4190-854d-c3e9497754ae\") " pod="openshift-marketplace/redhat-operators-scqw7" Oct 08 13:19:04 crc kubenswrapper[4735]: I1008 13:19:04.836346 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/983bfc3d-43e8-4190-854d-c3e9497754ae-utilities\") pod \"redhat-operators-scqw7\" (UID: \"983bfc3d-43e8-4190-854d-c3e9497754ae\") " pod="openshift-marketplace/redhat-operators-scqw7" Oct 08 13:19:04 crc kubenswrapper[4735]: I1008 13:19:04.836362 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/983bfc3d-43e8-4190-854d-c3e9497754ae-catalog-content\") pod \"redhat-operators-scqw7\" (UID: \"983bfc3d-43e8-4190-854d-c3e9497754ae\") " pod="openshift-marketplace/redhat-operators-scqw7" Oct 08 13:19:04 crc kubenswrapper[4735]: I1008 13:19:04.857556 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nljj5\" (UniqueName: 
\"kubernetes.io/projected/983bfc3d-43e8-4190-854d-c3e9497754ae-kube-api-access-nljj5\") pod \"redhat-operators-scqw7\" (UID: \"983bfc3d-43e8-4190-854d-c3e9497754ae\") " pod="openshift-marketplace/redhat-operators-scqw7" Oct 08 13:19:04 crc kubenswrapper[4735]: I1008 13:19:04.882503 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-scqw7" Oct 08 13:19:05 crc kubenswrapper[4735]: I1008 13:19:05.341445 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-scqw7"] Oct 08 13:19:06 crc kubenswrapper[4735]: I1008 13:19:06.030078 4735 generic.go:334] "Generic (PLEG): container finished" podID="983bfc3d-43e8-4190-854d-c3e9497754ae" containerID="77dd9834a2ce81f7f56ee522b42602b20ac5813d7d59e4b6b4b162e4aaea8dcf" exitCode=0 Oct 08 13:19:06 crc kubenswrapper[4735]: I1008 13:19:06.030146 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-scqw7" event={"ID":"983bfc3d-43e8-4190-854d-c3e9497754ae","Type":"ContainerDied","Data":"77dd9834a2ce81f7f56ee522b42602b20ac5813d7d59e4b6b4b162e4aaea8dcf"} Oct 08 13:19:06 crc kubenswrapper[4735]: I1008 13:19:06.030383 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-scqw7" event={"ID":"983bfc3d-43e8-4190-854d-c3e9497754ae","Type":"ContainerStarted","Data":"83bbfe9e7bb0bd72ec8c4cc7813c747f2b038c3e4781ed72b0ba4adfbe289405"} Oct 08 13:19:06 crc kubenswrapper[4735]: I1008 13:19:06.032036 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 08 13:19:08 crc kubenswrapper[4735]: I1008 13:19:08.067486 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-scqw7" event={"ID":"983bfc3d-43e8-4190-854d-c3e9497754ae","Type":"ContainerStarted","Data":"48d56de238912e892289f50612c011e3423cc86b135a216f8b7f8dad26d525d0"} Oct 08 13:19:09 crc 
kubenswrapper[4735]: I1008 13:19:09.079916 4735 generic.go:334] "Generic (PLEG): container finished" podID="983bfc3d-43e8-4190-854d-c3e9497754ae" containerID="48d56de238912e892289f50612c011e3423cc86b135a216f8b7f8dad26d525d0" exitCode=0 Oct 08 13:19:09 crc kubenswrapper[4735]: I1008 13:19:09.079966 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-scqw7" event={"ID":"983bfc3d-43e8-4190-854d-c3e9497754ae","Type":"ContainerDied","Data":"48d56de238912e892289f50612c011e3423cc86b135a216f8b7f8dad26d525d0"} Oct 08 13:19:10 crc kubenswrapper[4735]: I1008 13:19:10.095919 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-scqw7" event={"ID":"983bfc3d-43e8-4190-854d-c3e9497754ae","Type":"ContainerStarted","Data":"efbbe5f4d7ef8e2079f8546a7a5af73ca3e07e7637c7ad3b860d2bc2f827b801"} Oct 08 13:19:10 crc kubenswrapper[4735]: I1008 13:19:10.113721 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-scqw7" podStartSLOduration=2.393749285 podStartE2EDuration="6.113699464s" podCreationTimestamp="2025-10-08 13:19:04 +0000 UTC" firstStartedPulling="2025-10-08 13:19:06.031767509 +0000 UTC m=+1395.840072053" lastFinishedPulling="2025-10-08 13:19:09.751717688 +0000 UTC m=+1399.560022232" observedRunningTime="2025-10-08 13:19:10.112725149 +0000 UTC m=+1399.921029713" watchObservedRunningTime="2025-10-08 13:19:10.113699464 +0000 UTC m=+1399.922004008" Oct 08 13:19:14 crc kubenswrapper[4735]: I1008 13:19:14.883468 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-scqw7" Oct 08 13:19:14 crc kubenswrapper[4735]: I1008 13:19:14.884058 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-scqw7" Oct 08 13:19:15 crc kubenswrapper[4735]: I1008 13:19:15.932394 4735 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-scqw7" podUID="983bfc3d-43e8-4190-854d-c3e9497754ae" containerName="registry-server" probeResult="failure" output=< Oct 08 13:19:15 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Oct 08 13:19:15 crc kubenswrapper[4735]: > Oct 08 13:19:17 crc kubenswrapper[4735]: I1008 13:19:17.962553 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Oct 08 13:19:17 crc kubenswrapper[4735]: I1008 13:19:17.963030 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="27af208c-711b-4423-bae1-91b55e84e14a" containerName="openstackclient" containerID="cri-o://ce728a0dd44fff56f85bc83d0ecfb0de2a06e0305143dfb8dc7aa43bf3d0dbe6" gracePeriod=2 Oct 08 13:19:17 crc kubenswrapper[4735]: I1008 13:19:17.983365 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Oct 08 13:19:18 crc kubenswrapper[4735]: I1008 13:19:18.497182 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 08 13:19:18 crc kubenswrapper[4735]: I1008 13:19:18.523402 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Oct 08 13:19:18 crc kubenswrapper[4735]: I1008 13:19:18.523653 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6" containerName="ovn-northd" containerID="cri-o://af561faafdd478b081c4107185420b4e9d3e3ef31f1f6fc09c0c8a499ec07cbe" gracePeriod=30 Oct 08 13:19:18 crc kubenswrapper[4735]: I1008 13:19:18.523689 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6" containerName="openstack-network-exporter" containerID="cri-o://14207a6e2a9596d194069fc196983081842e33d540bef5d9124e59a802e7d234" gracePeriod=30 Oct 08 13:19:18 crc kubenswrapper[4735]: I1008 
13:19:18.558156 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-d25cz"] Oct 08 13:19:18 crc kubenswrapper[4735]: I1008 13:19:18.575325 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-d25cz"] Oct 08 13:19:18 crc kubenswrapper[4735]: I1008 13:19:18.596143 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron936c-account-delete-hq47l"] Oct 08 13:19:18 crc kubenswrapper[4735]: E1008 13:19:18.596781 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27af208c-711b-4423-bae1-91b55e84e14a" containerName="openstackclient" Oct 08 13:19:18 crc kubenswrapper[4735]: I1008 13:19:18.596803 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="27af208c-711b-4423-bae1-91b55e84e14a" containerName="openstackclient" Oct 08 13:19:18 crc kubenswrapper[4735]: I1008 13:19:18.597026 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="27af208c-711b-4423-bae1-91b55e84e14a" containerName="openstackclient" Oct 08 13:19:18 crc kubenswrapper[4735]: I1008 13:19:18.597758 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron936c-account-delete-hq47l" Oct 08 13:19:18 crc kubenswrapper[4735]: E1008 13:19:18.600644 4735 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 08 13:19:18 crc kubenswrapper[4735]: E1008 13:19:18.600719 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/644f0099-ec9c-42af-bbe8-54fe5b603bb3-config-data podName:644f0099-ec9c-42af-bbe8-54fe5b603bb3 nodeName:}" failed. No retries permitted until 2025-10-08 13:19:19.100696465 +0000 UTC m=+1408.909001009 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/644f0099-ec9c-42af-bbe8-54fe5b603bb3-config-data") pod "rabbitmq-server-0" (UID: "644f0099-ec9c-42af-bbe8-54fe5b603bb3") : configmap "rabbitmq-config-data" not found Oct 08 13:19:18 crc kubenswrapper[4735]: I1008 13:19:18.611606 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron936c-account-delete-hq47l"] Oct 08 13:19:18 crc kubenswrapper[4735]: I1008 13:19:18.705405 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lf82\" (UniqueName: \"kubernetes.io/projected/f92cd291-9315-4b44-b318-6f029f253de6-kube-api-access-2lf82\") pod \"neutron936c-account-delete-hq47l\" (UID: \"f92cd291-9315-4b44-b318-6f029f253de6\") " pod="openstack/neutron936c-account-delete-hq47l" Oct 08 13:19:18 crc kubenswrapper[4735]: I1008 13:19:18.765293 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbicanc85c-account-delete-56lmd"] Oct 08 13:19:18 crc kubenswrapper[4735]: I1008 13:19:18.766828 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbicanc85c-account-delete-56lmd" Oct 08 13:19:18 crc kubenswrapper[4735]: I1008 13:19:18.778982 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbicanc85c-account-delete-56lmd"] Oct 08 13:19:18 crc kubenswrapper[4735]: I1008 13:19:18.798354 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-xjdpv"] Oct 08 13:19:18 crc kubenswrapper[4735]: I1008 13:19:18.798771 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-xjdpv" podUID="3e294ddd-2861-4cfb-97b5-d4283cc37e4e" containerName="openstack-network-exporter" containerID="cri-o://6876d906c666c75cba8b23303f73e98ddec735188d7cc7cbc4fd832d9f9fa680" gracePeriod=30 Oct 08 13:19:18 crc kubenswrapper[4735]: I1008 13:19:18.810499 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8db7\" (UniqueName: \"kubernetes.io/projected/0172640d-d560-4ef0-a7e4-b649ccfa92e4-kube-api-access-l8db7\") pod \"barbicanc85c-account-delete-56lmd\" (UID: \"0172640d-d560-4ef0-a7e4-b649ccfa92e4\") " pod="openstack/barbicanc85c-account-delete-56lmd" Oct 08 13:19:18 crc kubenswrapper[4735]: I1008 13:19:18.810792 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lf82\" (UniqueName: \"kubernetes.io/projected/f92cd291-9315-4b44-b318-6f029f253de6-kube-api-access-2lf82\") pod \"neutron936c-account-delete-hq47l\" (UID: \"f92cd291-9315-4b44-b318-6f029f253de6\") " pod="openstack/neutron936c-account-delete-hq47l" Oct 08 13:19:18 crc kubenswrapper[4735]: I1008 13:19:18.812107 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder4fb6-account-delete-dvvxd"] Oct 08 13:19:18 crc kubenswrapper[4735]: I1008 13:19:18.813515 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder4fb6-account-delete-dvvxd" Oct 08 13:19:18 crc kubenswrapper[4735]: I1008 13:19:18.831753 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder4fb6-account-delete-dvvxd"] Oct 08 13:19:18 crc kubenswrapper[4735]: I1008 13:19:18.846337 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 08 13:19:18 crc kubenswrapper[4735]: I1008 13:19:18.936619 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lf82\" (UniqueName: \"kubernetes.io/projected/f92cd291-9315-4b44-b318-6f029f253de6-kube-api-access-2lf82\") pod \"neutron936c-account-delete-hq47l\" (UID: \"f92cd291-9315-4b44-b318-6f029f253de6\") " pod="openstack/neutron936c-account-delete-hq47l" Oct 08 13:19:18 crc kubenswrapper[4735]: I1008 13:19:18.944056 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ovn-northd-0" podUID="e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6" containerName="ovn-northd" probeResult="failure" output=< Oct 08 13:19:18 crc kubenswrapper[4735]: 2025-10-08T13:19:18Z|00001|unixctl|WARN|failed to connect to /tmp/ovn-northd.1.ctl Oct 08 13:19:18 crc kubenswrapper[4735]: ovn-appctl: cannot connect to "/tmp/ovn-northd.1.ctl" (No such file or directory) Oct 08 13:19:18 crc kubenswrapper[4735]: 2025-10-08T13:19:18Z|00001|unixctl|WARN|failed to connect to /tmp/ovn-northd.1.ctl Oct 08 13:19:18 crc kubenswrapper[4735]: ovn-appctl: cannot connect to "/tmp/ovn-northd.1.ctl" (No such file or directory) Oct 08 13:19:18 crc kubenswrapper[4735]: > Oct 08 13:19:18 crc kubenswrapper[4735]: I1008 13:19:18.962406 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mt2sg\" (UniqueName: \"kubernetes.io/projected/4189bc5a-2f23-4b53-9df4-e97a65a7d8eb-kube-api-access-mt2sg\") pod \"cinder4fb6-account-delete-dvvxd\" (UID: \"4189bc5a-2f23-4b53-9df4-e97a65a7d8eb\") " 
pod="openstack/cinder4fb6-account-delete-dvvxd" Oct 08 13:19:18 crc kubenswrapper[4735]: I1008 13:19:18.963052 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8db7\" (UniqueName: \"kubernetes.io/projected/0172640d-d560-4ef0-a7e4-b649ccfa92e4-kube-api-access-l8db7\") pod \"barbicanc85c-account-delete-56lmd\" (UID: \"0172640d-d560-4ef0-a7e4-b649ccfa92e4\") " pod="openstack/barbicanc85c-account-delete-56lmd" Oct 08 13:19:19 crc kubenswrapper[4735]: I1008 13:19:19.106345 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron936c-account-delete-hq47l" Oct 08 13:19:19 crc kubenswrapper[4735]: I1008 13:19:19.106607 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mt2sg\" (UniqueName: \"kubernetes.io/projected/4189bc5a-2f23-4b53-9df4-e97a65a7d8eb-kube-api-access-mt2sg\") pod \"cinder4fb6-account-delete-dvvxd\" (UID: \"4189bc5a-2f23-4b53-9df4-e97a65a7d8eb\") " pod="openstack/cinder4fb6-account-delete-dvvxd" Oct 08 13:19:19 crc kubenswrapper[4735]: E1008 13:19:19.106640 4735 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 08 13:19:19 crc kubenswrapper[4735]: E1008 13:19:19.106716 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/644f0099-ec9c-42af-bbe8-54fe5b603bb3-config-data podName:644f0099-ec9c-42af-bbe8-54fe5b603bb3 nodeName:}" failed. No retries permitted until 2025-10-08 13:19:20.10669538 +0000 UTC m=+1409.914999984 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/644f0099-ec9c-42af-bbe8-54fe5b603bb3-config-data") pod "rabbitmq-server-0" (UID: "644f0099-ec9c-42af-bbe8-54fe5b603bb3") : configmap "rabbitmq-config-data" not found Oct 08 13:19:19 crc kubenswrapper[4735]: I1008 13:19:19.108802 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8db7\" (UniqueName: \"kubernetes.io/projected/0172640d-d560-4ef0-a7e4-b649ccfa92e4-kube-api-access-l8db7\") pod \"barbicanc85c-account-delete-56lmd\" (UID: \"0172640d-d560-4ef0-a7e4-b649ccfa92e4\") " pod="openstack/barbicanc85c-account-delete-56lmd" Oct 08 13:19:19 crc kubenswrapper[4735]: E1008 13:19:19.119051 4735 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 08 13:19:19 crc kubenswrapper[4735]: E1008 13:19:19.119163 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/01ad2819-cfc3-4416-8a27-ecb65c2fa366-config-data podName:01ad2819-cfc3-4416-8a27-ecb65c2fa366 nodeName:}" failed. No retries permitted until 2025-10-08 13:19:19.619123842 +0000 UTC m=+1409.427428466 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/01ad2819-cfc3-4416-8a27-ecb65c2fa366-config-data") pod "rabbitmq-cell1-server-0" (UID: "01ad2819-cfc3-4416-8a27-ecb65c2fa366") : configmap "rabbitmq-cell1-config-data" not found Oct 08 13:19:19 crc kubenswrapper[4735]: I1008 13:19:19.205545 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mt2sg\" (UniqueName: \"kubernetes.io/projected/4189bc5a-2f23-4b53-9df4-e97a65a7d8eb-kube-api-access-mt2sg\") pod \"cinder4fb6-account-delete-dvvxd\" (UID: \"4189bc5a-2f23-4b53-9df4-e97a65a7d8eb\") " pod="openstack/cinder4fb6-account-delete-dvvxd" Oct 08 13:19:19 crc kubenswrapper[4735]: I1008 13:19:19.279332 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fe64ebc-f229-4573-a67c-44c0d573c336" path="/var/lib/kubelet/pods/8fe64ebc-f229-4573-a67c-44c0d573c336/volumes" Oct 08 13:19:19 crc kubenswrapper[4735]: I1008 13:19:19.280049 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-pbkkt"] Oct 08 13:19:19 crc kubenswrapper[4735]: I1008 13:19:19.280091 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-9frwj"] Oct 08 13:19:19 crc kubenswrapper[4735]: I1008 13:19:19.327292 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement364a-account-delete-dg8rs"] Oct 08 13:19:19 crc kubenswrapper[4735]: I1008 13:19:19.328756 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement364a-account-delete-dg8rs" Oct 08 13:19:19 crc kubenswrapper[4735]: I1008 13:19:19.352655 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement364a-account-delete-dg8rs"] Oct 08 13:19:19 crc kubenswrapper[4735]: I1008 13:19:19.355275 4735 generic.go:334] "Generic (PLEG): container finished" podID="e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6" containerID="14207a6e2a9596d194069fc196983081842e33d540bef5d9124e59a802e7d234" exitCode=2 Oct 08 13:19:19 crc kubenswrapper[4735]: I1008 13:19:19.355339 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6","Type":"ContainerDied","Data":"14207a6e2a9596d194069fc196983081842e33d540bef5d9124e59a802e7d234"} Oct 08 13:19:19 crc kubenswrapper[4735]: I1008 13:19:19.363750 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance55cc-account-delete-cnr66"] Oct 08 13:19:19 crc kubenswrapper[4735]: I1008 13:19:19.365138 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance55cc-account-delete-cnr66" Oct 08 13:19:19 crc kubenswrapper[4735]: I1008 13:19:19.381264 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-xjdpv_3e294ddd-2861-4cfb-97b5-d4283cc37e4e/openstack-network-exporter/0.log" Oct 08 13:19:19 crc kubenswrapper[4735]: I1008 13:19:19.381288 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance55cc-account-delete-cnr66"] Oct 08 13:19:19 crc kubenswrapper[4735]: I1008 13:19:19.381319 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-xjdpv" event={"ID":"3e294ddd-2861-4cfb-97b5-d4283cc37e4e","Type":"ContainerDied","Data":"6876d906c666c75cba8b23303f73e98ddec735188d7cc7cbc4fd832d9f9fa680"} Oct 08 13:19:19 crc kubenswrapper[4735]: I1008 13:19:19.381319 4735 generic.go:334] "Generic (PLEG): container finished" podID="3e294ddd-2861-4cfb-97b5-d4283cc37e4e" containerID="6876d906c666c75cba8b23303f73e98ddec735188d7cc7cbc4fd832d9f9fa680" exitCode=2 Oct 08 13:19:19 crc kubenswrapper[4735]: I1008 13:19:19.391815 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-cdrp4"] Oct 08 13:19:19 crc kubenswrapper[4735]: I1008 13:19:19.399994 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbicanc85c-account-delete-56lmd" Oct 08 13:19:19 crc kubenswrapper[4735]: I1008 13:19:19.411807 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xf8zq\" (UniqueName: \"kubernetes.io/projected/6fe2bc1d-7e65-45ba-9d42-cae899859258-kube-api-access-xf8zq\") pod \"placement364a-account-delete-dg8rs\" (UID: \"6fe2bc1d-7e65-45ba-9d42-cae899859258\") " pod="openstack/placement364a-account-delete-dg8rs" Oct 08 13:19:19 crc kubenswrapper[4735]: I1008 13:19:19.413236 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-cdrp4"] Oct 08 13:19:19 crc kubenswrapper[4735]: I1008 13:19:19.438922 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novacell066dd-account-delete-wnksz"] Oct 08 13:19:19 crc kubenswrapper[4735]: I1008 13:19:19.448845 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novacell066dd-account-delete-wnksz" Oct 08 13:19:19 crc kubenswrapper[4735]: I1008 13:19:19.469970 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder4fb6-account-delete-dvvxd" Oct 08 13:19:19 crc kubenswrapper[4735]: I1008 13:19:19.470825 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-nplcf"] Oct 08 13:19:19 crc kubenswrapper[4735]: I1008 13:19:19.498503 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-nplcf"] Oct 08 13:19:19 crc kubenswrapper[4735]: I1008 13:19:19.517610 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novacell066dd-account-delete-wnksz"] Oct 08 13:19:19 crc kubenswrapper[4735]: I1008 13:19:19.522854 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xf8zq\" (UniqueName: \"kubernetes.io/projected/6fe2bc1d-7e65-45ba-9d42-cae899859258-kube-api-access-xf8zq\") pod \"placement364a-account-delete-dg8rs\" (UID: \"6fe2bc1d-7e65-45ba-9d42-cae899859258\") " pod="openstack/placement364a-account-delete-dg8rs" Oct 08 13:19:19 crc kubenswrapper[4735]: I1008 13:19:19.523046 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jchv\" (UniqueName: \"kubernetes.io/projected/f952e88e-fe68-4770-874c-fdee88d43aee-kube-api-access-2jchv\") pod \"glance55cc-account-delete-cnr66\" (UID: \"f952e88e-fe68-4770-874c-fdee88d43aee\") " pod="openstack/glance55cc-account-delete-cnr66" Oct 08 13:19:19 crc kubenswrapper[4735]: I1008 13:19:19.548228 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-fz5qt"] Oct 08 13:19:19 crc kubenswrapper[4735]: I1008 13:19:19.580323 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xf8zq\" (UniqueName: \"kubernetes.io/projected/6fe2bc1d-7e65-45ba-9d42-cae899859258-kube-api-access-xf8zq\") pod \"placement364a-account-delete-dg8rs\" (UID: \"6fe2bc1d-7e65-45ba-9d42-cae899859258\") " pod="openstack/placement364a-account-delete-dg8rs" Oct 08 13:19:19 crc 
kubenswrapper[4735]: I1008 13:19:19.605567 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-fz5qt"] Oct 08 13:19:19 crc kubenswrapper[4735]: I1008 13:19:19.618721 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-c8pnx"] Oct 08 13:19:19 crc kubenswrapper[4735]: I1008 13:19:19.640005 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jchv\" (UniqueName: \"kubernetes.io/projected/f952e88e-fe68-4770-874c-fdee88d43aee-kube-api-access-2jchv\") pod \"glance55cc-account-delete-cnr66\" (UID: \"f952e88e-fe68-4770-874c-fdee88d43aee\") " pod="openstack/glance55cc-account-delete-cnr66" Oct 08 13:19:19 crc kubenswrapper[4735]: I1008 13:19:19.640085 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hc8z2\" (UniqueName: \"kubernetes.io/projected/9d6e2da8-2dc0-4642-b216-218ed61cc813-kube-api-access-hc8z2\") pod \"novacell066dd-account-delete-wnksz\" (UID: \"9d6e2da8-2dc0-4642-b216-218ed61cc813\") " pod="openstack/novacell066dd-account-delete-wnksz" Oct 08 13:19:19 crc kubenswrapper[4735]: E1008 13:19:19.640159 4735 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 08 13:19:19 crc kubenswrapper[4735]: E1008 13:19:19.640234 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/01ad2819-cfc3-4416-8a27-ecb65c2fa366-config-data podName:01ad2819-cfc3-4416-8a27-ecb65c2fa366 nodeName:}" failed. No retries permitted until 2025-10-08 13:19:20.640216199 +0000 UTC m=+1410.448520743 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/01ad2819-cfc3-4416-8a27-ecb65c2fa366-config-data") pod "rabbitmq-cell1-server-0" (UID: "01ad2819-cfc3-4416-8a27-ecb65c2fa366") : configmap "rabbitmq-cell1-config-data" not found Oct 08 13:19:19 crc kubenswrapper[4735]: I1008 13:19:19.643290 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-c8pnx"] Oct 08 13:19:19 crc kubenswrapper[4735]: I1008 13:19:19.669577 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-nhxn5"] Oct 08 13:19:19 crc kubenswrapper[4735]: I1008 13:19:19.692849 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jchv\" (UniqueName: \"kubernetes.io/projected/f952e88e-fe68-4770-874c-fdee88d43aee-kube-api-access-2jchv\") pod \"glance55cc-account-delete-cnr66\" (UID: \"f952e88e-fe68-4770-874c-fdee88d43aee\") " pod="openstack/glance55cc-account-delete-cnr66" Oct 08 13:19:19 crc kubenswrapper[4735]: I1008 13:19:19.716312 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-nhxn5"] Oct 08 13:19:19 crc kubenswrapper[4735]: I1008 13:19:19.740090 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement364a-account-delete-dg8rs" Oct 08 13:19:19 crc kubenswrapper[4735]: I1008 13:19:19.742502 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hc8z2\" (UniqueName: \"kubernetes.io/projected/9d6e2da8-2dc0-4642-b216-218ed61cc813-kube-api-access-hc8z2\") pod \"novacell066dd-account-delete-wnksz\" (UID: \"9d6e2da8-2dc0-4642-b216-218ed61cc813\") " pod="openstack/novacell066dd-account-delete-wnksz" Oct 08 13:19:19 crc kubenswrapper[4735]: I1008 13:19:19.774740 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6" containerName="ovn-northd" probeResult="failure" output="command timed out" Oct 08 13:19:19 crc kubenswrapper[4735]: I1008 13:19:19.777613 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-xjdpv_3e294ddd-2861-4cfb-97b5-d4283cc37e4e/openstack-network-exporter/0.log" Oct 08 13:19:19 crc kubenswrapper[4735]: I1008 13:19:19.777689 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-xjdpv" Oct 08 13:19:19 crc kubenswrapper[4735]: I1008 13:19:19.780278 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hc8z2\" (UniqueName: \"kubernetes.io/projected/9d6e2da8-2dc0-4642-b216-218ed61cc813-kube-api-access-hc8z2\") pod \"novacell066dd-account-delete-wnksz\" (UID: \"9d6e2da8-2dc0-4642-b216-218ed61cc813\") " pod="openstack/novacell066dd-account-delete-wnksz" Oct 08 13:19:19 crc kubenswrapper[4735]: I1008 13:19:19.827279 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance55cc-account-delete-cnr66" Oct 08 13:19:19 crc kubenswrapper[4735]: I1008 13:19:19.877060 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-zkbts"] Oct 08 13:19:19 crc kubenswrapper[4735]: I1008 13:19:19.897590 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-zkbts"] Oct 08 13:19:19 crc kubenswrapper[4735]: I1008 13:19:19.909950 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 08 13:19:19 crc kubenswrapper[4735]: I1008 13:19:19.910843 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="647cfc89-264d-4616-ab81-64a7c9aa51c5" containerName="openstack-network-exporter" containerID="cri-o://a762e323d1f3da2b5f91ff63a2787457f4dc57f2b638bc08715406fafaabc87e" gracePeriod=300 Oct 08 13:19:19 crc kubenswrapper[4735]: I1008 13:19:19.916471 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/novacell066dd-account-delete-wnksz" Oct 08 13:19:19 crc kubenswrapper[4735]: E1008 13:19:19.917010 4735 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-pbkkt" message=< Oct 08 13:19:19 crc kubenswrapper[4735]: Exiting ovn-controller (1) [ OK ] Oct 08 13:19:19 crc kubenswrapper[4735]: > Oct 08 13:19:19 crc kubenswrapper[4735]: E1008 13:19:19.917036 4735 kuberuntime_container.go:691] "PreStop hook failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " pod="openstack/ovn-controller-pbkkt" podUID="73b19a4f-4794-49fc-bfd9-727618f11e7c" containerName="ovn-controller" containerID="cri-o://d30b6d8ff48994cf5b030b6578cb0bfd0a226d0320a05c48eaa37fda8c089985" Oct 08 13:19:19 crc kubenswrapper[4735]: I1008 13:19:19.917066 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-pbkkt" podUID="73b19a4f-4794-49fc-bfd9-727618f11e7c" containerName="ovn-controller" containerID="cri-o://d30b6d8ff48994cf5b030b6578cb0bfd0a226d0320a05c48eaa37fda8c089985" gracePeriod=30 Oct 08 13:19:19 crc kubenswrapper[4735]: I1008 13:19:19.949223 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e294ddd-2861-4cfb-97b5-d4283cc37e4e-metrics-certs-tls-certs\") pod \"3e294ddd-2861-4cfb-97b5-d4283cc37e4e\" (UID: \"3e294ddd-2861-4cfb-97b5-d4283cc37e4e\") " Oct 08 13:19:19 crc kubenswrapper[4735]: I1008 13:19:19.949390 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e294ddd-2861-4cfb-97b5-d4283cc37e4e-config\") pod \"3e294ddd-2861-4cfb-97b5-d4283cc37e4e\" (UID: 
\"3e294ddd-2861-4cfb-97b5-d4283cc37e4e\") " Oct 08 13:19:19 crc kubenswrapper[4735]: I1008 13:19:19.949492 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/3e294ddd-2861-4cfb-97b5-d4283cc37e4e-ovn-rundir\") pod \"3e294ddd-2861-4cfb-97b5-d4283cc37e4e\" (UID: \"3e294ddd-2861-4cfb-97b5-d4283cc37e4e\") " Oct 08 13:19:19 crc kubenswrapper[4735]: I1008 13:19:19.949518 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/3e294ddd-2861-4cfb-97b5-d4283cc37e4e-ovs-rundir\") pod \"3e294ddd-2861-4cfb-97b5-d4283cc37e4e\" (UID: \"3e294ddd-2861-4cfb-97b5-d4283cc37e4e\") " Oct 08 13:19:19 crc kubenswrapper[4735]: I1008 13:19:19.949568 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e294ddd-2861-4cfb-97b5-d4283cc37e4e-combined-ca-bundle\") pod \"3e294ddd-2861-4cfb-97b5-d4283cc37e4e\" (UID: \"3e294ddd-2861-4cfb-97b5-d4283cc37e4e\") " Oct 08 13:19:19 crc kubenswrapper[4735]: I1008 13:19:19.949680 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqnsl\" (UniqueName: \"kubernetes.io/projected/3e294ddd-2861-4cfb-97b5-d4283cc37e4e-kube-api-access-tqnsl\") pod \"3e294ddd-2861-4cfb-97b5-d4283cc37e4e\" (UID: \"3e294ddd-2861-4cfb-97b5-d4283cc37e4e\") " Oct 08 13:19:19 crc kubenswrapper[4735]: I1008 13:19:19.950148 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3e294ddd-2861-4cfb-97b5-d4283cc37e4e-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "3e294ddd-2861-4cfb-97b5-d4283cc37e4e" (UID: "3e294ddd-2861-4cfb-97b5-d4283cc37e4e"). InnerVolumeSpecName "ovn-rundir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 13:19:19 crc kubenswrapper[4735]: I1008 13:19:19.951542 4735 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/3e294ddd-2861-4cfb-97b5-d4283cc37e4e-ovn-rundir\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:19 crc kubenswrapper[4735]: I1008 13:19:19.952559 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3e294ddd-2861-4cfb-97b5-d4283cc37e4e-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "3e294ddd-2861-4cfb-97b5-d4283cc37e4e" (UID: "3e294ddd-2861-4cfb-97b5-d4283cc37e4e"). InnerVolumeSpecName "ovs-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 13:19:19 crc kubenswrapper[4735]: I1008 13:19:19.958303 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e294ddd-2861-4cfb-97b5-d4283cc37e4e-config" (OuterVolumeSpecName: "config") pod "3e294ddd-2861-4cfb-97b5-d4283cc37e4e" (UID: "3e294ddd-2861-4cfb-97b5-d4283cc37e4e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:19:19 crc kubenswrapper[4735]: I1008 13:19:19.963430 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d4d96bb9-7gpr6"] Oct 08 13:19:19 crc kubenswrapper[4735]: I1008 13:19:19.963793 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d4d96bb9-7gpr6" podUID="1f333daa-559f-421a-b0a9-fa1de98b72cf" containerName="dnsmasq-dns" containerID="cri-o://1c0e8f5f9cd61eb363c77155195f6cdd017ff1a8da9e953758ecbda931689b32" gracePeriod=10 Oct 08 13:19:19 crc kubenswrapper[4735]: I1008 13:19:19.987756 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e294ddd-2861-4cfb-97b5-d4283cc37e4e-kube-api-access-tqnsl" (OuterVolumeSpecName: "kube-api-access-tqnsl") pod "3e294ddd-2861-4cfb-97b5-d4283cc37e4e" (UID: "3e294ddd-2861-4cfb-97b5-d4283cc37e4e"). InnerVolumeSpecName "kube-api-access-tqnsl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.019680 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e294ddd-2861-4cfb-97b5-d4283cc37e4e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3e294ddd-2861-4cfb-97b5-d4283cc37e4e" (UID: "3e294ddd-2861-4cfb-97b5-d4283cc37e4e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.024973 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="647cfc89-264d-4616-ab81-64a7c9aa51c5" containerName="ovsdbserver-sb" containerID="cri-o://a1a74be40986832ce5bc77d779a7f7c5cc2c984405440c30e09b161c8492e341" gracePeriod=300 Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.034223 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5574db6b6f-94l7x"] Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.034541 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5574db6b6f-94l7x" podUID="74ecad81-3dfd-4f36-be79-699b6bd30e90" containerName="neutron-api" containerID="cri-o://640b5b25fef4ad0714f6e184d4c2f8a8cea54a0bf1b5cd1e9e4357dfb3d2f543" gracePeriod=30 Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.034705 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5574db6b6f-94l7x" podUID="74ecad81-3dfd-4f36-be79-699b6bd30e90" containerName="neutron-httpd" containerID="cri-o://b2ed1987ab45001110bf36a72fc3eb614ca1c3f8a88b1bff605a3d0de79aba7d" gracePeriod=30 Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.053464 4735 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/3e294ddd-2861-4cfb-97b5-d4283cc37e4e-ovs-rundir\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.053581 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e294ddd-2861-4cfb-97b5-d4283cc37e4e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.053595 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqnsl\" (UniqueName: 
\"kubernetes.io/projected/3e294ddd-2861-4cfb-97b5-d4283cc37e4e-kube-api-access-tqnsl\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.053607 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e294ddd-2861-4cfb-97b5-d4283cc37e4e-config\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.075708 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.076251 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="52f1329e-e0dc-4295-8dfc-92fae02065f2" containerName="openstack-network-exporter" containerID="cri-o://72400ada84317756c3507ee2deab0d0b9d034a28342458ff546cbf1cdfbc990b" gracePeriod=300 Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.112842 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.113616 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="22f4e804-ab3e-4858-87b4-84a310c21249" containerName="cinder-scheduler" containerID="cri-o://f976efe16a474e89b33e73cf876bd99fd72a4f29125be28a2094105456065a00" gracePeriod=30 Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.114550 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="22f4e804-ab3e-4858-87b4-84a310c21249" containerName="probe" containerID="cri-o://b7bc2b861ffd2f1d3a1ede5b87a767f230be0eb8b6d7f935ee21e11963a9f933" gracePeriod=30 Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.149700 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-9frwj" podUID="b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb" containerName="ovs-vswitchd" 
containerID="cri-o://dba547c713f3dc3fb4b7cd134966f3012d70fd6781682f3b57fcf8290868f773" gracePeriod=30 Oct 08 13:19:20 crc kubenswrapper[4735]: E1008 13:19:20.158986 4735 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 08 13:19:20 crc kubenswrapper[4735]: E1008 13:19:20.159102 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/644f0099-ec9c-42af-bbe8-54fe5b603bb3-config-data podName:644f0099-ec9c-42af-bbe8-54fe5b603bb3 nodeName:}" failed. No retries permitted until 2025-10-08 13:19:22.159076877 +0000 UTC m=+1411.967381421 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/644f0099-ec9c-42af-bbe8-54fe5b603bb3-config-data") pod "rabbitmq-server-0" (UID: "644f0099-ec9c-42af-bbe8-54fe5b603bb3") : configmap "rabbitmq-config-data" not found Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.161239 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-zfmvf"] Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.207908 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-zfmvf"] Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.212405 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e294ddd-2861-4cfb-97b5-d4283cc37e4e-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "3e294ddd-2861-4cfb-97b5-d4283cc37e4e" (UID: "3e294ddd-2861-4cfb-97b5-d4283cc37e4e"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.260203 4735 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e294ddd-2861-4cfb-97b5-d4283cc37e4e-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.292345 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-8555f7d884-2km9p"] Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.292963 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-8555f7d884-2km9p" podUID="70450186-9a74-4ba9-92f9-3f3a9626d0a0" containerName="placement-log" containerID="cri-o://c6fb989e32326dc473a47f896284b872f9d86ea0f3de39c8b6cff9255ec1e38f" gracePeriod=30 Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.293747 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-8555f7d884-2km9p" podUID="70450186-9a74-4ba9-92f9-3f3a9626d0a0" containerName="placement-api" containerID="cri-o://74e1faa845324974182d87cae1bcb7ea97976a1ed4553d25c3e02f8144d91a7f" gracePeriod=30 Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.311422 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="52f1329e-e0dc-4295-8dfc-92fae02065f2" containerName="ovsdbserver-nb" containerID="cri-o://57cbc0b90f3af7903b01552834d11c94781f0f718d8c5c7767e3069c71b6a736" gracePeriod=300 Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.364237 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.365171 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerName="account-server" 
containerID="cri-o://fee8281668fd0cef347a2cc4b0d529170a8a54372656a674cada64e7758b6de6" gracePeriod=30 Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.365659 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerName="container-updater" containerID="cri-o://f439ff20d2384d40dd148b0bc28e8bfd91c3d44ddcc97d2ceffaba6872bf825c" gracePeriod=30 Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.365781 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerName="container-auditor" containerID="cri-o://1e668315fdf2d5cc1e4b746212c9eadc171a4f54451b10f098411736e896e124" gracePeriod=30 Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.365820 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerName="swift-recon-cron" containerID="cri-o://a8268e5dce5d4e32fa0bca1c883910824568df9f3952c6f7d9f188760642a5cd" gracePeriod=30 Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.365851 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerName="container-replicator" containerID="cri-o://0b38bdb53ab1820c96e395176b2856e8ca2eb0cf0636872f34d1914a860cf3a9" gracePeriod=30 Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.365876 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerName="rsync" containerID="cri-o://dff90633bc3ad4ba06aede8cdacdcc6e496b31b0aade72c08da50cac8d87fc0b" gracePeriod=30 Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.365917 4735 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/swift-storage-0" podUID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerName="container-server" containerID="cri-o://33f430e8921d6f23db0786f1f9627335fe796e31b2a0bd9605c6ebfc3decb0c4" gracePeriod=30 Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.365935 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerName="object-expirer" containerID="cri-o://4ee9082bf58fd24e367fd1da09c29e111946f4c4faa922b9579b326f978119c9" gracePeriod=30 Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.365983 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerName="object-updater" containerID="cri-o://d5db6b982ae606222a36f5fcc188c4abda346b0f7655cfdfa4f6f7337c2397f4" gracePeriod=30 Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.365967 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerName="account-reaper" containerID="cri-o://318bd8def163d2bed4ab842bc210c31f81ec6fcdd4d540b67f27fd1cef63ad4c" gracePeriod=30 Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.366027 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerName="object-auditor" containerID="cri-o://237f4d6c051f7223b102aeaedaa673ab069d0f15f6817e4945c35de0b812fbff" gracePeriod=30 Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.366047 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerName="account-auditor" containerID="cri-o://d853b3073957b18a6b510a37923cf71788139823a9c69e0f74f142983721b87d" gracePeriod=30 Oct 08 13:19:20 crc kubenswrapper[4735]: 
I1008 13:19:20.366068 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerName="object-replicator" containerID="cri-o://2f40ca43ae662cc47bd56ee824d503ccec2b5ae9bf9c20513e8ce20464d2dd97" gracePeriod=30 Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.366120 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerName="account-replicator" containerID="cri-o://f07ef2651928239139875a266d70f61a17120a8e70c7d971632cdd042dda58f2" gracePeriod=30 Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.366124 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerName="object-server" containerID="cri-o://309e0783fbd98cb5e4632f3d4d0e2f6a9b58f92ad92a2d4004eaa1a6d7fae30b" gracePeriod=30 Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.469944 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-xjdpv_3e294ddd-2861-4cfb-97b5-d4283cc37e4e/openstack-network-exporter/0.log" Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.470157 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-xjdpv" event={"ID":"3e294ddd-2861-4cfb-97b5-d4283cc37e4e","Type":"ContainerDied","Data":"16ea78a34a39e88b5828bc6215331316866ec19e379f102c1f6c454c239b9353"} Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.470227 4735 scope.go:117] "RemoveContainer" containerID="6876d906c666c75cba8b23303f73e98ddec735188d7cc7cbc4fd832d9f9fa680" Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.470558 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-xjdpv" Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.509824 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d4d96bb9-7gpr6" event={"ID":"1f333daa-559f-421a-b0a9-fa1de98b72cf","Type":"ContainerDied","Data":"1c0e8f5f9cd61eb363c77155195f6cdd017ff1a8da9e953758ecbda931689b32"} Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.503278 4735 generic.go:334] "Generic (PLEG): container finished" podID="1f333daa-559f-421a-b0a9-fa1de98b72cf" containerID="1c0e8f5f9cd61eb363c77155195f6cdd017ff1a8da9e953758ecbda931689b32" exitCode=0 Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.550156 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_647cfc89-264d-4616-ab81-64a7c9aa51c5/ovsdbserver-sb/0.log" Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.550303 4735 generic.go:334] "Generic (PLEG): container finished" podID="647cfc89-264d-4616-ab81-64a7c9aa51c5" containerID="a762e323d1f3da2b5f91ff63a2787457f4dc57f2b638bc08715406fafaabc87e" exitCode=2 Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.550370 4735 generic.go:334] "Generic (PLEG): container finished" podID="647cfc89-264d-4616-ab81-64a7c9aa51c5" containerID="a1a74be40986832ce5bc77d779a7f7c5cc2c984405440c30e09b161c8492e341" exitCode=143 Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.550597 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"647cfc89-264d-4616-ab81-64a7c9aa51c5","Type":"ContainerDied","Data":"a762e323d1f3da2b5f91ff63a2787457f4dc57f2b638bc08715406fafaabc87e"} Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.550646 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"647cfc89-264d-4616-ab81-64a7c9aa51c5","Type":"ContainerDied","Data":"a1a74be40986832ce5bc77d779a7f7c5cc2c984405440c30e09b161c8492e341"} Oct 08 13:19:20 crc 
kubenswrapper[4735]: E1008 13:19:20.563070 4735 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Oct 08 13:19:20 crc kubenswrapper[4735]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Oct 08 13:19:20 crc kubenswrapper[4735]: + source /usr/local/bin/container-scripts/functions Oct 08 13:19:20 crc kubenswrapper[4735]: ++ OVNBridge=br-int Oct 08 13:19:20 crc kubenswrapper[4735]: ++ OVNRemote=tcp:localhost:6642 Oct 08 13:19:20 crc kubenswrapper[4735]: ++ OVNEncapType=geneve Oct 08 13:19:20 crc kubenswrapper[4735]: ++ OVNAvailabilityZones= Oct 08 13:19:20 crc kubenswrapper[4735]: ++ EnableChassisAsGateway=true Oct 08 13:19:20 crc kubenswrapper[4735]: ++ PhysicalNetworks= Oct 08 13:19:20 crc kubenswrapper[4735]: ++ OVNHostName= Oct 08 13:19:20 crc kubenswrapper[4735]: ++ DB_FILE=/etc/openvswitch/conf.db Oct 08 13:19:20 crc kubenswrapper[4735]: ++ ovs_dir=/var/lib/openvswitch Oct 08 13:19:20 crc kubenswrapper[4735]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Oct 08 13:19:20 crc kubenswrapper[4735]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Oct 08 13:19:20 crc kubenswrapper[4735]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 08 13:19:20 crc kubenswrapper[4735]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 08 13:19:20 crc kubenswrapper[4735]: + sleep 0.5 Oct 08 13:19:20 crc kubenswrapper[4735]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 08 13:19:20 crc kubenswrapper[4735]: + sleep 0.5 Oct 08 13:19:20 crc kubenswrapper[4735]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 08 13:19:20 crc kubenswrapper[4735]: + cleanup_ovsdb_server_semaphore Oct 08 13:19:20 crc kubenswrapper[4735]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 08 13:19:20 crc kubenswrapper[4735]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Oct 08 13:19:20 crc kubenswrapper[4735]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack/ovn-controller-ovs-9frwj" message=< Oct 08 13:19:20 crc kubenswrapper[4735]: Exiting ovsdb-server (5) [ OK ] Oct 08 13:19:20 crc kubenswrapper[4735]: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Oct 08 13:19:20 crc kubenswrapper[4735]: + source /usr/local/bin/container-scripts/functions Oct 08 13:19:20 crc kubenswrapper[4735]: ++ OVNBridge=br-int Oct 08 13:19:20 crc kubenswrapper[4735]: ++ OVNRemote=tcp:localhost:6642 Oct 08 13:19:20 crc kubenswrapper[4735]: ++ OVNEncapType=geneve Oct 08 13:19:20 crc kubenswrapper[4735]: ++ OVNAvailabilityZones= Oct 08 13:19:20 crc kubenswrapper[4735]: ++ EnableChassisAsGateway=true Oct 08 13:19:20 crc kubenswrapper[4735]: ++ PhysicalNetworks= Oct 08 13:19:20 crc kubenswrapper[4735]: ++ OVNHostName= Oct 08 13:19:20 crc kubenswrapper[4735]: ++ DB_FILE=/etc/openvswitch/conf.db Oct 08 13:19:20 crc kubenswrapper[4735]: ++ ovs_dir=/var/lib/openvswitch Oct 08 13:19:20 crc kubenswrapper[4735]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Oct 08 13:19:20 crc kubenswrapper[4735]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Oct 08 13:19:20 crc kubenswrapper[4735]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 08 13:19:20 crc kubenswrapper[4735]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 08 13:19:20 crc kubenswrapper[4735]: + sleep 0.5 Oct 08 13:19:20 crc kubenswrapper[4735]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 08 13:19:20 crc kubenswrapper[4735]: + sleep 0.5 Oct 08 13:19:20 crc kubenswrapper[4735]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 08 13:19:20 crc kubenswrapper[4735]: + cleanup_ovsdb_server_semaphore Oct 08 13:19:20 crc kubenswrapper[4735]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 08 13:19:20 crc kubenswrapper[4735]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Oct 08 13:19:20 crc kubenswrapper[4735]: > Oct 08 13:19:20 crc kubenswrapper[4735]: E1008 13:19:20.563120 4735 kuberuntime_container.go:691] "PreStop hook failed" err=< Oct 08 13:19:20 crc kubenswrapper[4735]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Oct 08 13:19:20 crc kubenswrapper[4735]: + source /usr/local/bin/container-scripts/functions Oct 08 13:19:20 crc kubenswrapper[4735]: ++ OVNBridge=br-int Oct 08 13:19:20 crc kubenswrapper[4735]: ++ OVNRemote=tcp:localhost:6642 Oct 08 13:19:20 crc kubenswrapper[4735]: ++ OVNEncapType=geneve Oct 08 13:19:20 crc kubenswrapper[4735]: ++ OVNAvailabilityZones= Oct 08 13:19:20 crc kubenswrapper[4735]: ++ EnableChassisAsGateway=true Oct 08 13:19:20 crc kubenswrapper[4735]: ++ PhysicalNetworks= Oct 08 13:19:20 crc kubenswrapper[4735]: ++ OVNHostName= Oct 08 13:19:20 crc kubenswrapper[4735]: ++ DB_FILE=/etc/openvswitch/conf.db Oct 08 13:19:20 crc kubenswrapper[4735]: ++ ovs_dir=/var/lib/openvswitch Oct 08 13:19:20 crc kubenswrapper[4735]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Oct 08 13:19:20 crc kubenswrapper[4735]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Oct 08 13:19:20 crc kubenswrapper[4735]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 08 13:19:20 crc kubenswrapper[4735]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 08 13:19:20 crc kubenswrapper[4735]: + sleep 0.5 Oct 08 13:19:20 crc kubenswrapper[4735]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 08 13:19:20 crc kubenswrapper[4735]: + sleep 0.5 Oct 08 13:19:20 crc kubenswrapper[4735]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 08 13:19:20 crc kubenswrapper[4735]: + cleanup_ovsdb_server_semaphore Oct 08 13:19:20 crc kubenswrapper[4735]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 08 13:19:20 crc kubenswrapper[4735]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Oct 08 13:19:20 crc kubenswrapper[4735]: > pod="openstack/ovn-controller-ovs-9frwj" podUID="b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb" containerName="ovsdb-server" containerID="cri-o://0c32b0561e314aa74f9602cc47a55588e33447b929f0779af7d6ee25ee8ff883" Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.563158 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-9frwj" podUID="b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb" containerName="ovsdb-server" containerID="cri-o://0c32b0561e314aa74f9602cc47a55588e33447b929f0779af7d6ee25ee8ff883" gracePeriod=29 Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.568851 4735 generic.go:334] "Generic (PLEG): container finished" podID="27af208c-711b-4423-bae1-91b55e84e14a" containerID="ce728a0dd44fff56f85bc83d0ecfb0de2a06e0305143dfb8dc7aa43bf3d0dbe6" exitCode=137 Oct 08 13:19:20 crc kubenswrapper[4735]: E1008 13:19:20.585788 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a1a74be40986832ce5bc77d779a7f7c5cc2c984405440c30e09b161c8492e341 is running failed: container process not found" containerID="a1a74be40986832ce5bc77d779a7f7c5cc2c984405440c30e09b161c8492e341" cmd=["/usr/bin/pidof","ovsdb-server"] Oct 08 13:19:20 crc 
kubenswrapper[4735]: I1008 13:19:20.586051 4735 generic.go:334] "Generic (PLEG): container finished" podID="73b19a4f-4794-49fc-bfd9-727618f11e7c" containerID="d30b6d8ff48994cf5b030b6578cb0bfd0a226d0320a05c48eaa37fda8c089985" exitCode=0 Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.586173 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-pbkkt" event={"ID":"73b19a4f-4794-49fc-bfd9-727618f11e7c","Type":"ContainerDied","Data":"d30b6d8ff48994cf5b030b6578cb0bfd0a226d0320a05c48eaa37fda8c089985"} Oct 08 13:19:20 crc kubenswrapper[4735]: E1008 13:19:20.586268 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a1a74be40986832ce5bc77d779a7f7c5cc2c984405440c30e09b161c8492e341 is running failed: container process not found" containerID="a1a74be40986832ce5bc77d779a7f7c5cc2c984405440c30e09b161c8492e341" cmd=["/usr/bin/pidof","ovsdb-server"] Oct 08 13:19:20 crc kubenswrapper[4735]: E1008 13:19:20.586737 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a1a74be40986832ce5bc77d779a7f7c5cc2c984405440c30e09b161c8492e341 is running failed: container process not found" containerID="a1a74be40986832ce5bc77d779a7f7c5cc2c984405440c30e09b161c8492e341" cmd=["/usr/bin/pidof","ovsdb-server"] Oct 08 13:19:20 crc kubenswrapper[4735]: E1008 13:19:20.586777 4735 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a1a74be40986832ce5bc77d779a7f7c5cc2c984405440c30e09b161c8492e341 is running failed: container process not found" probeType="Readiness" pod="openstack/ovsdbserver-sb-0" podUID="647cfc89-264d-4616-ab81-64a7c9aa51c5" containerName="ovsdbserver-sb" Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.598803 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-nb-0_52f1329e-e0dc-4295-8dfc-92fae02065f2/ovsdbserver-nb/0.log" Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.598841 4735 generic.go:334] "Generic (PLEG): container finished" podID="52f1329e-e0dc-4295-8dfc-92fae02065f2" containerID="72400ada84317756c3507ee2deab0d0b9d034a28342458ff546cbf1cdfbc990b" exitCode=2 Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.598869 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"52f1329e-e0dc-4295-8dfc-92fae02065f2","Type":"ContainerDied","Data":"72400ada84317756c3507ee2deab0d0b9d034a28342458ff546cbf1cdfbc990b"} Oct 08 13:19:20 crc kubenswrapper[4735]: E1008 13:19:20.702588 4735 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 08 13:19:20 crc kubenswrapper[4735]: E1008 13:19:20.702653 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/01ad2819-cfc3-4416-8a27-ecb65c2fa366-config-data podName:01ad2819-cfc3-4416-8a27-ecb65c2fa366 nodeName:}" failed. No retries permitted until 2025-10-08 13:19:22.702636785 +0000 UTC m=+1412.510941329 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/01ad2819-cfc3-4416-8a27-ecb65c2fa366-config-data") pod "rabbitmq-cell1-server-0" (UID: "01ad2819-cfc3-4416-8a27-ecb65c2fa366") : configmap "rabbitmq-cell1-config-data" not found Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.771201 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.771551 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2" containerName="cinder-api-log" containerID="cri-o://ef4967d1505370e0cd47289c67232c6f8769a456bdabc3b30bd020b8c2020a1e" gracePeriod=30 Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.772106 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2" containerName="cinder-api" containerID="cri-o://52bb4221516a6482adaee2563d79681bb7ba3599c014219c89382ac18144524b" gracePeriod=30 Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.837576 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.837782 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="c0cb7948-9a3d-40fa-b4e7-bb6b765af52f" containerName="glance-log" containerID="cri-o://00bb3e3d5baf3ca592174014e03b36040aa0b638cb8a1c2566220b5a3cb65e9b" gracePeriod=30 Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.838308 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="c0cb7948-9a3d-40fa-b4e7-bb6b765af52f" containerName="glance-httpd" containerID="cri-o://7cbf4814a5d8742756a2f0076ccbdc83fc3186c93b8940d1bc61f10728f5e789" gracePeriod=30 Oct 
08 13:19:20 crc kubenswrapper[4735]: W1008 13:19:20.838652 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4189bc5a_2f23_4b53_9df4_e97a65a7d8eb.slice/crio-8c6fca696eefb52e052a4ad3785a57e0494a1c06e43a5610d33e07d0c65557c9 WatchSource:0}: Error finding container 8c6fca696eefb52e052a4ad3785a57e0494a1c06e43a5610d33e07d0c65557c9: Status 404 returned error can't find the container with id 8c6fca696eefb52e052a4ad3785a57e0494a1c06e43a5610d33e07d0c65557c9 Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.954118 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2dd43efe-433d-4590-bcf9-3304cd6e32fb" path="/var/lib/kubelet/pods/2dd43efe-433d-4590-bcf9-3304cd6e32fb/volumes" Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.956179 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7386076f-50b6-4cc9-b774-da32d6add528" path="/var/lib/kubelet/pods/7386076f-50b6-4cc9-b774-da32d6add528/volumes" Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.956964 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c9b6bd7-6443-42da-a807-ff8fd5abbc8a" path="/var/lib/kubelet/pods/7c9b6bd7-6443-42da-a807-ff8fd5abbc8a/volumes" Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.958535 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f8ee345-a4ac-49aa-9eed-32a7cffb2790" path="/var/lib/kubelet/pods/8f8ee345-a4ac-49aa-9eed-32a7cffb2790/volumes" Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.961317 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb6559ca-82f8-4271-bb6b-47f154089579" path="/var/lib/kubelet/pods/cb6559ca-82f8-4271-bb6b-47f154089579/volumes" Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.963592 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0c034b6-95c9-42d6-8490-0d759b602d82" 
path="/var/lib/kubelet/pods/e0c034b6-95c9-42d6-8490-0d759b602d82/volumes" Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.964235 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee80dca5-9f7b-47ca-82e9-2c210a8c507d" path="/var/lib/kubelet/pods/ee80dca5-9f7b-47ca-82e9-2c210a8c507d/volumes" Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.965690 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.966381 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="1db01523-cd76-4a9d-8063-e024dd919879" containerName="nova-metadata-log" containerID="cri-o://6fcbdbcd2a5b227d1b93c5af337c1732e6685ef4e33f54e3bddf01cee27e6906" gracePeriod=30 Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.966899 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="1db01523-cd76-4a9d-8063-e024dd919879" containerName="nova-metadata-metadata" containerID="cri-o://418d5ac0c79b3bbb212fb99f2df15049ca6cc912c6272d8bf5adfab9f972e7cf" gracePeriod=30 Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.971901 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.982568 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.982810 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="97384a50-9f20-4cbf-8b02-afffe5ddc791" containerName="nova-api-log" containerID="cri-o://205355fb6f24a0939888db6199eeaa319bf35bb6b065a98fc683f3a28c8fa854" gracePeriod=30 Oct 08 13:19:20 crc kubenswrapper[4735]: I1008 13:19:20.982897 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" 
podUID="97384a50-9f20-4cbf-8b02-afffe5ddc791" containerName="nova-api-api" containerID="cri-o://d6ed228dfc5748efb601fafc99f157a758cb829f235af05f716ef363f89fcc14" gracePeriod=30 Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.000257 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.000845 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="b8c473e5-0132-4ee7-a14e-ada0780d358c" containerName="glance-httpd" containerID="cri-o://89675ae013264bf40ddb13e38257d27a7ce994542ac6e288c014f3d00f9d0b13" gracePeriod=30 Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.000470 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="b8c473e5-0132-4ee7-a14e-ada0780d358c" containerName="glance-log" containerID="cri-o://59fafd30b9822a732b940ca8ee5e9d58f1e7139b22b56b4a8c5b227e38ab23a0" gracePeriod=30 Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.021272 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.030429 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-9h9hg"] Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.038465 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-9h9hg"] Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.039814 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d4d96bb9-7gpr6" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.050197 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-6cc55c9557-96kwz"] Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.050645 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-6cc55c9557-96kwz" podUID="ada58119-3f96-4962-b214-fb2662c3d935" containerName="barbican-worker-log" containerID="cri-o://7d7c87d703a723cb263ee5030a259e665d68c85efbd777b9a5b704b5476ab749" gracePeriod=30 Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.050737 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-6cc55c9557-96kwz" podUID="ada58119-3f96-4962-b214-fb2662c3d935" containerName="barbican-worker" containerID="cri-o://6d64df6cc879792395fce1a8a219ef400d1f7cfe418d3ee5312fec24e31fc592" gracePeriod=30 Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.067470 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_647cfc89-264d-4616-ab81-64a7c9aa51c5/ovsdbserver-sb/0.log" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.067558 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.071939 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-pbkkt" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.077981 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-726a-account-create-5szhf"] Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.086312 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="644f0099-ec9c-42af-bbe8-54fe5b603bb3" containerName="rabbitmq" containerID="cri-o://898693adc5b91bf624ecec806c4b7f6fbc4f40eb308ffd7f57a26fe3a69b9356" gracePeriod=604800 Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.100186 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-726a-account-create-5szhf"] Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.107705 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-69dd69556d-pvx5r"] Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.107924 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-69dd69556d-pvx5r" podUID="9b111975-076c-4c7f-bcbd-ecef0b62936f" containerName="barbican-keystone-listener-log" containerID="cri-o://d5b88355cbab2ef2c4641f67a06c04f010a0ffe1ebf2da8598d0977f7187066a" gracePeriod=30 Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.108037 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-69dd69556d-pvx5r" podUID="9b111975-076c-4c7f-bcbd-ecef0b62936f" containerName="barbican-keystone-listener" containerID="cri-o://9f5626323fe1058e9c1fda7509e11efd3640670cee71f3ea2717d062bae876cf" gracePeriod=30 Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.115895 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1f333daa-559f-421a-b0a9-fa1de98b72cf-dns-swift-storage-0\") pod 
\"1f333daa-559f-421a-b0a9-fa1de98b72cf\" (UID: \"1f333daa-559f-421a-b0a9-fa1de98b72cf\") " Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.115961 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f333daa-559f-421a-b0a9-fa1de98b72cf-ovsdbserver-sb\") pod \"1f333daa-559f-421a-b0a9-fa1de98b72cf\" (UID: \"1f333daa-559f-421a-b0a9-fa1de98b72cf\") " Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.116025 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f333daa-559f-421a-b0a9-fa1de98b72cf-config\") pod \"1f333daa-559f-421a-b0a9-fa1de98b72cf\" (UID: \"1f333daa-559f-421a-b0a9-fa1de98b72cf\") " Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.116061 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f333daa-559f-421a-b0a9-fa1de98b72cf-dns-svc\") pod \"1f333daa-559f-421a-b0a9-fa1de98b72cf\" (UID: \"1f333daa-559f-421a-b0a9-fa1de98b72cf\") " Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.116145 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f333daa-559f-421a-b0a9-fa1de98b72cf-ovsdbserver-nb\") pod \"1f333daa-559f-421a-b0a9-fa1de98b72cf\" (UID: \"1f333daa-559f-421a-b0a9-fa1de98b72cf\") " Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.116229 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vf4b6\" (UniqueName: \"kubernetes.io/projected/1f333daa-559f-421a-b0a9-fa1de98b72cf-kube-api-access-vf4b6\") pod \"1f333daa-559f-421a-b0a9-fa1de98b72cf\" (UID: \"1f333daa-559f-421a-b0a9-fa1de98b72cf\") " Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.121124 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/barbican-db-create-pk6wn"] Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.140669 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f333daa-559f-421a-b0a9-fa1de98b72cf-kube-api-access-vf4b6" (OuterVolumeSpecName: "kube-api-access-vf4b6") pod "1f333daa-559f-421a-b0a9-fa1de98b72cf" (UID: "1f333daa-559f-421a-b0a9-fa1de98b72cf"). InnerVolumeSpecName "kube-api-access-vf4b6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.140764 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-c85c-account-create-sjbsn"] Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.149044 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-pk6wn"] Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.185418 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-c85c-account-create-sjbsn"] Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.186300 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.212916 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-cqm49"] Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.223154 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/73b19a4f-4794-49fc-bfd9-727618f11e7c-var-run-ovn\") pod \"73b19a4f-4794-49fc-bfd9-727618f11e7c\" (UID: \"73b19a4f-4794-49fc-bfd9-727618f11e7c\") " Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.223211 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/647cfc89-264d-4616-ab81-64a7c9aa51c5-scripts\") pod \"647cfc89-264d-4616-ab81-64a7c9aa51c5\" (UID: \"647cfc89-264d-4616-ab81-64a7c9aa51c5\") " Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.223239 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/647cfc89-264d-4616-ab81-64a7c9aa51c5-ovsdb-rundir\") pod \"647cfc89-264d-4616-ab81-64a7c9aa51c5\" (UID: \"647cfc89-264d-4616-ab81-64a7c9aa51c5\") " Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.223274 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"647cfc89-264d-4616-ab81-64a7c9aa51c5\" (UID: \"647cfc89-264d-4616-ab81-64a7c9aa51c5\") " Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.223921 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/647cfc89-264d-4616-ab81-64a7c9aa51c5-combined-ca-bundle\") pod \"647cfc89-264d-4616-ab81-64a7c9aa51c5\" (UID: \"647cfc89-264d-4616-ab81-64a7c9aa51c5\") " Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 
13:19:21.223978 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d852b\" (UniqueName: \"kubernetes.io/projected/647cfc89-264d-4616-ab81-64a7c9aa51c5-kube-api-access-d852b\") pod \"647cfc89-264d-4616-ab81-64a7c9aa51c5\" (UID: \"647cfc89-264d-4616-ab81-64a7c9aa51c5\") " Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.224004 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73b19a4f-4794-49fc-bfd9-727618f11e7c-combined-ca-bundle\") pod \"73b19a4f-4794-49fc-bfd9-727618f11e7c\" (UID: \"73b19a4f-4794-49fc-bfd9-727618f11e7c\") " Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.224063 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/647cfc89-264d-4616-ab81-64a7c9aa51c5-metrics-certs-tls-certs\") pod \"647cfc89-264d-4616-ab81-64a7c9aa51c5\" (UID: \"647cfc89-264d-4616-ab81-64a7c9aa51c5\") " Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.224099 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/73b19a4f-4794-49fc-bfd9-727618f11e7c-var-log-ovn\") pod \"73b19a4f-4794-49fc-bfd9-727618f11e7c\" (UID: \"73b19a4f-4794-49fc-bfd9-727618f11e7c\") " Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.224160 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/647cfc89-264d-4616-ab81-64a7c9aa51c5-config\") pod \"647cfc89-264d-4616-ab81-64a7c9aa51c5\" (UID: \"647cfc89-264d-4616-ab81-64a7c9aa51c5\") " Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.224247 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/73b19a4f-4794-49fc-bfd9-727618f11e7c-ovn-controller-tls-certs\") pod \"73b19a4f-4794-49fc-bfd9-727618f11e7c\" (UID: \"73b19a4f-4794-49fc-bfd9-727618f11e7c\") " Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.224276 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqbpj\" (UniqueName: \"kubernetes.io/projected/73b19a4f-4794-49fc-bfd9-727618f11e7c-kube-api-access-mqbpj\") pod \"73b19a4f-4794-49fc-bfd9-727618f11e7c\" (UID: \"73b19a4f-4794-49fc-bfd9-727618f11e7c\") " Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.224299 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/73b19a4f-4794-49fc-bfd9-727618f11e7c-scripts\") pod \"73b19a4f-4794-49fc-bfd9-727618f11e7c\" (UID: \"73b19a4f-4794-49fc-bfd9-727618f11e7c\") " Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.224329 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/73b19a4f-4794-49fc-bfd9-727618f11e7c-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "73b19a4f-4794-49fc-bfd9-727618f11e7c" (UID: "73b19a4f-4794-49fc-bfd9-727618f11e7c"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.224359 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/73b19a4f-4794-49fc-bfd9-727618f11e7c-var-run\") pod \"73b19a4f-4794-49fc-bfd9-727618f11e7c\" (UID: \"73b19a4f-4794-49fc-bfd9-727618f11e7c\") " Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.224392 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/73b19a4f-4794-49fc-bfd9-727618f11e7c-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "73b19a4f-4794-49fc-bfd9-727618f11e7c" (UID: "73b19a4f-4794-49fc-bfd9-727618f11e7c"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.224402 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/647cfc89-264d-4616-ab81-64a7c9aa51c5-ovsdbserver-sb-tls-certs\") pod \"647cfc89-264d-4616-ab81-64a7c9aa51c5\" (UID: \"647cfc89-264d-4616-ab81-64a7c9aa51c5\") " Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.224898 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/647cfc89-264d-4616-ab81-64a7c9aa51c5-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "647cfc89-264d-4616-ab81-64a7c9aa51c5" (UID: "647cfc89-264d-4616-ab81-64a7c9aa51c5"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.224962 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/647cfc89-264d-4616-ab81-64a7c9aa51c5-config" (OuterVolumeSpecName: "config") pod "647cfc89-264d-4616-ab81-64a7c9aa51c5" (UID: "647cfc89-264d-4616-ab81-64a7c9aa51c5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.225020 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/73b19a4f-4794-49fc-bfd9-727618f11e7c-var-run" (OuterVolumeSpecName: "var-run") pod "73b19a4f-4794-49fc-bfd9-727618f11e7c" (UID: "73b19a4f-4794-49fc-bfd9-727618f11e7c"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.225173 4735 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/73b19a4f-4794-49fc-bfd9-727618f11e7c-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.225188 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/647cfc89-264d-4616-ab81-64a7c9aa51c5-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.225198 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vf4b6\" (UniqueName: \"kubernetes.io/projected/1f333daa-559f-421a-b0a9-fa1de98b72cf-kube-api-access-vf4b6\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.225208 4735 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/73b19a4f-4794-49fc-bfd9-727618f11e7c-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.225216 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/647cfc89-264d-4616-ab81-64a7c9aa51c5-config\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.226226 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/73b19a4f-4794-49fc-bfd9-727618f11e7c-scripts" (OuterVolumeSpecName: "scripts") pod "73b19a4f-4794-49fc-bfd9-727618f11e7c" (UID: "73b19a4f-4794-49fc-bfd9-727618f11e7c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.228052 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/647cfc89-264d-4616-ab81-64a7c9aa51c5-scripts" (OuterVolumeSpecName: "scripts") pod "647cfc89-264d-4616-ab81-64a7c9aa51c5" (UID: "647cfc89-264d-4616-ab81-64a7c9aa51c5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.234343 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "647cfc89-264d-4616-ab81-64a7c9aa51c5" (UID: "647cfc89-264d-4616-ab81-64a7c9aa51c5"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.236592 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f333daa-559f-421a-b0a9-fa1de98b72cf-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1f333daa-559f-421a-b0a9-fa1de98b72cf" (UID: "1f333daa-559f-421a-b0a9-fa1de98b72cf"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.252976 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73b19a4f-4794-49fc-bfd9-727618f11e7c-kube-api-access-mqbpj" (OuterVolumeSpecName: "kube-api-access-mqbpj") pod "73b19a4f-4794-49fc-bfd9-727618f11e7c" (UID: "73b19a4f-4794-49fc-bfd9-727618f11e7c"). InnerVolumeSpecName "kube-api-access-mqbpj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.253391 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbicanc85c-account-delete-56lmd"] Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.260010 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-4fb6-account-create-wgj75"] Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.263280 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/647cfc89-264d-4616-ab81-64a7c9aa51c5-kube-api-access-d852b" (OuterVolumeSpecName: "kube-api-access-d852b") pod "647cfc89-264d-4616-ab81-64a7c9aa51c5" (UID: "647cfc89-264d-4616-ab81-64a7c9aa51c5"). InnerVolumeSpecName "kube-api-access-d852b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.267775 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="2868ea84-fef0-4728-9a62-e2ba97a9aae1" containerName="galera" containerID="cri-o://c0fd4ca14cbd2147c2ea9e79919ae0922f959bc15036da6d763f5b4768a6c66e" gracePeriod=30 Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.269992 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-cqm49"] Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.304251 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-qcsss"] Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.314609 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/647cfc89-264d-4616-ab81-64a7c9aa51c5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "647cfc89-264d-4616-ab81-64a7c9aa51c5" (UID: "647cfc89-264d-4616-ab81-64a7c9aa51c5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.327636 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/27af208c-711b-4423-bae1-91b55e84e14a-openstack-config\") pod \"27af208c-711b-4423-bae1-91b55e84e14a\" (UID: \"27af208c-711b-4423-bae1-91b55e84e14a\") " Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.327705 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27af208c-711b-4423-bae1-91b55e84e14a-combined-ca-bundle\") pod \"27af208c-711b-4423-bae1-91b55e84e14a\" (UID: \"27af208c-711b-4423-bae1-91b55e84e14a\") " Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.327936 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/27af208c-711b-4423-bae1-91b55e84e14a-openstack-config-secret\") pod \"27af208c-711b-4423-bae1-91b55e84e14a\" (UID: \"27af208c-711b-4423-bae1-91b55e84e14a\") " Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.327982 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m48xz\" (UniqueName: \"kubernetes.io/projected/27af208c-711b-4423-bae1-91b55e84e14a-kube-api-access-m48xz\") pod \"27af208c-711b-4423-bae1-91b55e84e14a\" (UID: \"27af208c-711b-4423-bae1-91b55e84e14a\") " Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.328621 4735 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1f333daa-559f-421a-b0a9-fa1de98b72cf-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.328649 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqbpj\" (UniqueName: 
\"kubernetes.io/projected/73b19a4f-4794-49fc-bfd9-727618f11e7c-kube-api-access-mqbpj\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.328664 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/73b19a4f-4794-49fc-bfd9-727618f11e7c-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.328677 4735 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/73b19a4f-4794-49fc-bfd9-727618f11e7c-var-run\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.328688 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/647cfc89-264d-4616-ab81-64a7c9aa51c5-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.328711 4735 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.328724 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/647cfc89-264d-4616-ab81-64a7c9aa51c5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.328735 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d852b\" (UniqueName: \"kubernetes.io/projected/647cfc89-264d-4616-ab81-64a7c9aa51c5-kube-api-access-d852b\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.334849 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-86a1-account-create-xgf9d"] Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.345678 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-api-86a1-account-create-xgf9d"] Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.358845 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder4fb6-account-delete-dvvxd"] Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.359857 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f333daa-559f-421a-b0a9-fa1de98b72cf-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1f333daa-559f-421a-b0a9-fa1de98b72cf" (UID: "1f333daa-559f-421a-b0a9-fa1de98b72cf"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.371933 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27af208c-711b-4423-bae1-91b55e84e14a-kube-api-access-m48xz" (OuterVolumeSpecName: "kube-api-access-m48xz") pod "27af208c-711b-4423-bae1-91b55e84e14a" (UID: "27af208c-711b-4423-bae1-91b55e84e14a"). InnerVolumeSpecName "kube-api-access-m48xz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.372011 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-qcsss"] Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.393113 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-4fb6-account-create-wgj75"] Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.405944 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron936c-account-delete-hq47l"] Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.415726 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73b19a4f-4794-49fc-bfd9-727618f11e7c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "73b19a4f-4794-49fc-bfd9-727618f11e7c" (UID: "73b19a4f-4794-49fc-bfd9-727618f11e7c"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.416155 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f333daa-559f-421a-b0a9-fa1de98b72cf-config" (OuterVolumeSpecName: "config") pod "1f333daa-559f-421a-b0a9-fa1de98b72cf" (UID: "1f333daa-559f-421a-b0a9-fa1de98b72cf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.417067 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-5vqkx"] Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.430826 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-364a-account-create-wfnvv"] Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.431926 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f333daa-559f-421a-b0a9-fa1de98b72cf-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.431962 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73b19a4f-4794-49fc-bfd9-727618f11e7c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.431973 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f333daa-559f-421a-b0a9-fa1de98b72cf-config\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.431985 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m48xz\" (UniqueName: \"kubernetes.io/projected/27af208c-711b-4423-bae1-91b55e84e14a-kube-api-access-m48xz\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.437233 4735 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-5vqkx"] Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.438709 4735 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.449083 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement364a-account-delete-dg8rs"] Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.456080 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-364a-account-create-wfnvv"] Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.481208 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f333daa-559f-421a-b0a9-fa1de98b72cf-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1f333daa-559f-421a-b0a9-fa1de98b72cf" (UID: "1f333daa-559f-421a-b0a9-fa1de98b72cf"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.482045 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f333daa-559f-421a-b0a9-fa1de98b72cf-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1f333daa-559f-421a-b0a9-fa1de98b72cf" (UID: "1f333daa-559f-421a-b0a9-fa1de98b72cf"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.494396 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-krzkw"] Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.535907 4735 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.535935 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f333daa-559f-421a-b0a9-fa1de98b72cf-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.535945 4735 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f333daa-559f-421a-b0a9-fa1de98b72cf-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.536197 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6dd5468f46-ccf7d"] Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.542721 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6dd5468f46-ccf7d" podUID="ca78485a-0626-420e-aa59-a9a4680df83c" containerName="barbican-api-log" containerID="cri-o://719ed4d6e6ca3be4b585ba42337fe3c6bf926775f7cb265506efe244531bac65" gracePeriod=30 Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.542862 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6dd5468f46-ccf7d" podUID="ca78485a-0626-420e-aa59-a9a4680df83c" containerName="barbican-api" containerID="cri-o://54ae09e0758dbced8f136de91ec860924f9d0a4a1aa8e35c4a62a56d25f7b734" gracePeriod=30 Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.575180 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/configmap/27af208c-711b-4423-bae1-91b55e84e14a-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "27af208c-711b-4423-bae1-91b55e84e14a" (UID: "27af208c-711b-4423-bae1-91b55e84e14a"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.587281 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_52f1329e-e0dc-4295-8dfc-92fae02065f2/ovsdbserver-nb/0.log" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.587423 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 08 13:19:21 crc kubenswrapper[4735]: E1008 13:19:21.599742 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c0fd4ca14cbd2147c2ea9e79919ae0922f959bc15036da6d763f5b4768a6c66e" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.599908 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-krzkw"] Oct 08 13:19:21 crc kubenswrapper[4735]: E1008 13:19:21.605977 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c0fd4ca14cbd2147c2ea9e79919ae0922f959bc15036da6d763f5b4768a6c66e" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Oct 08 13:19:21 crc kubenswrapper[4735]: E1008 13:19:21.612806 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="c0fd4ca14cbd2147c2ea9e79919ae0922f959bc15036da6d763f5b4768a6c66e" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Oct 08 13:19:21 crc kubenswrapper[4735]: E1008 13:19:21.612929 4735 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="2868ea84-fef0-4728-9a62-e2ba97a9aae1" containerName="galera" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.654185 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73b19a4f-4794-49fc-bfd9-727618f11e7c-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "73b19a4f-4794-49fc-bfd9-727618f11e7c" (UID: "73b19a4f-4794-49fc-bfd9-727618f11e7c"). InnerVolumeSpecName "ovn-controller-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.669142 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-55cc-account-create-cfvt9"] Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.670459 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/52f1329e-e0dc-4295-8dfc-92fae02065f2-ovsdbserver-nb-tls-certs\") pod \"52f1329e-e0dc-4295-8dfc-92fae02065f2\" (UID: \"52f1329e-e0dc-4295-8dfc-92fae02065f2\") " Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.670522 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/52f1329e-e0dc-4295-8dfc-92fae02065f2-metrics-certs-tls-certs\") pod \"52f1329e-e0dc-4295-8dfc-92fae02065f2\" (UID: \"52f1329e-e0dc-4295-8dfc-92fae02065f2\") " Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.670538 4735 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52f1329e-e0dc-4295-8dfc-92fae02065f2-combined-ca-bundle\") pod \"52f1329e-e0dc-4295-8dfc-92fae02065f2\" (UID: \"52f1329e-e0dc-4295-8dfc-92fae02065f2\") " Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.670560 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52f1329e-e0dc-4295-8dfc-92fae02065f2-config\") pod \"52f1329e-e0dc-4295-8dfc-92fae02065f2\" (UID: \"52f1329e-e0dc-4295-8dfc-92fae02065f2\") " Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.670630 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/52f1329e-e0dc-4295-8dfc-92fae02065f2-scripts\") pod \"52f1329e-e0dc-4295-8dfc-92fae02065f2\" (UID: \"52f1329e-e0dc-4295-8dfc-92fae02065f2\") " Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.670650 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/52f1329e-e0dc-4295-8dfc-92fae02065f2-ovsdb-rundir\") pod \"52f1329e-e0dc-4295-8dfc-92fae02065f2\" (UID: \"52f1329e-e0dc-4295-8dfc-92fae02065f2\") " Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.670671 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"52f1329e-e0dc-4295-8dfc-92fae02065f2\" (UID: \"52f1329e-e0dc-4295-8dfc-92fae02065f2\") " Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.670745 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdv8k\" (UniqueName: \"kubernetes.io/projected/52f1329e-e0dc-4295-8dfc-92fae02065f2-kube-api-access-pdv8k\") pod \"52f1329e-e0dc-4295-8dfc-92fae02065f2\" (UID: \"52f1329e-e0dc-4295-8dfc-92fae02065f2\") " Oct 
08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.670991 4735 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/73b19a4f-4794-49fc-bfd9-727618f11e7c-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.671003 4735 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/27af208c-711b-4423-bae1-91b55e84e14a-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.678544 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52f1329e-e0dc-4295-8dfc-92fae02065f2-config" (OuterVolumeSpecName: "config") pod "52f1329e-e0dc-4295-8dfc-92fae02065f2" (UID: "52f1329e-e0dc-4295-8dfc-92fae02065f2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.692833 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52f1329e-e0dc-4295-8dfc-92fae02065f2-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "52f1329e-e0dc-4295-8dfc-92fae02065f2" (UID: "52f1329e-e0dc-4295-8dfc-92fae02065f2"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.694850 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52f1329e-e0dc-4295-8dfc-92fae02065f2-scripts" (OuterVolumeSpecName: "scripts") pod "52f1329e-e0dc-4295-8dfc-92fae02065f2" (UID: "52f1329e-e0dc-4295-8dfc-92fae02065f2"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.702049 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52f1329e-e0dc-4295-8dfc-92fae02065f2-kube-api-access-pdv8k" (OuterVolumeSpecName: "kube-api-access-pdv8k") pod "52f1329e-e0dc-4295-8dfc-92fae02065f2" (UID: "52f1329e-e0dc-4295-8dfc-92fae02065f2"). InnerVolumeSpecName "kube-api-access-pdv8k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.704744 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/647cfc89-264d-4616-ab81-64a7c9aa51c5-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "647cfc89-264d-4616-ab81-64a7c9aa51c5" (UID: "647cfc89-264d-4616-ab81-64a7c9aa51c5"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.717412 4735 generic.go:334] "Generic (PLEG): container finished" podID="70450186-9a74-4ba9-92f9-3f3a9626d0a0" containerID="c6fb989e32326dc473a47f896284b872f9d86ea0f3de39c8b6cff9255ec1e38f" exitCode=143 Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.717571 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-55cc-account-create-cfvt9"] Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.717598 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8555f7d884-2km9p" event={"ID":"70450186-9a74-4ba9-92f9-3f3a9626d0a0","Type":"ContainerDied","Data":"c6fb989e32326dc473a47f896284b872f9d86ea0f3de39c8b6cff9255ec1e38f"} Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.720427 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance55cc-account-delete-cnr66"] Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.724344 4735 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "52f1329e-e0dc-4295-8dfc-92fae02065f2" (UID: "52f1329e-e0dc-4295-8dfc-92fae02065f2"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.727676 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-49lbk"] Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.730970 4735 generic.go:334] "Generic (PLEG): container finished" podID="9b111975-076c-4c7f-bcbd-ecef0b62936f" containerID="d5b88355cbab2ef2c4641f67a06c04f010a0ffe1ebf2da8598d0977f7187066a" exitCode=143 Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.731049 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-69dd69556d-pvx5r" event={"ID":"9b111975-076c-4c7f-bcbd-ecef0b62936f","Type":"ContainerDied","Data":"d5b88355cbab2ef2c4641f67a06c04f010a0ffe1ebf2da8598d0977f7187066a"} Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.732897 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbicanc85c-account-delete-56lmd" event={"ID":"0172640d-d560-4ef0-a7e4-b649ccfa92e4","Type":"ContainerStarted","Data":"3417eaa999269a1fec9d77454c9b363cdcab58e3009ef984c032ae02c37c88d0"} Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.736717 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-49lbk"] Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.739883 4735 generic.go:334] "Generic (PLEG): container finished" podID="2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2" containerID="ef4967d1505370e0cd47289c67232c6f8769a456bdabc3b30bd020b8c2020a1e" exitCode=143 Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.739963 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2","Type":"ContainerDied","Data":"ef4967d1505370e0cd47289c67232c6f8769a456bdabc3b30bd020b8c2020a1e"} Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.741394 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27af208c-711b-4423-bae1-91b55e84e14a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "27af208c-711b-4423-bae1-91b55e84e14a" (UID: "27af208c-711b-4423-bae1-91b55e84e14a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.741467 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-66dd-account-create-4q8hl"] Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.748177 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell066dd-account-delete-wnksz"] Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.751751 4735 generic.go:334] "Generic (PLEG): container finished" podID="ada58119-3f96-4962-b214-fb2662c3d935" containerID="7d7c87d703a723cb263ee5030a259e665d68c85efbd777b9a5b704b5476ab749" exitCode=143 Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.751809 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6cc55c9557-96kwz" event={"ID":"ada58119-3f96-4962-b214-fb2662c3d935","Type":"ContainerDied","Data":"7d7c87d703a723cb263ee5030a259e665d68c85efbd777b9a5b704b5476ab749"} Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.755277 4735 generic.go:334] "Generic (PLEG): container finished" podID="b8c473e5-0132-4ee7-a14e-ada0780d358c" containerID="59fafd30b9822a732b940ca8ee5e9d58f1e7139b22b56b4a8c5b227e38ab23a0" exitCode=143 Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.755331 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"b8c473e5-0132-4ee7-a14e-ada0780d358c","Type":"ContainerDied","Data":"59fafd30b9822a732b940ca8ee5e9d58f1e7139b22b56b4a8c5b227e38ab23a0"} Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.755359 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-66dd-account-create-4q8hl"] Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.761162 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_647cfc89-264d-4616-ab81-64a7c9aa51c5/ovsdbserver-sb/0.log" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.761251 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"647cfc89-264d-4616-ab81-64a7c9aa51c5","Type":"ContainerDied","Data":"61736668a98b5785470a3aacd4c66220bb840a6df0f0287741176be813cc958a"} Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.761305 4735 scope.go:117] "RemoveContainer" containerID="a762e323d1f3da2b5f91ff63a2787457f4dc57f2b638bc08715406fafaabc87e" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.761554 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.763019 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder4fb6-account-delete-dvvxd"] Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.773834 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27af208c-711b-4423-bae1-91b55e84e14a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.773868 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdv8k\" (UniqueName: \"kubernetes.io/projected/52f1329e-e0dc-4295-8dfc-92fae02065f2-kube-api-access-pdv8k\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.773881 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52f1329e-e0dc-4295-8dfc-92fae02065f2-config\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.773893 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/647cfc89-264d-4616-ab81-64a7c9aa51c5-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.773904 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/52f1329e-e0dc-4295-8dfc-92fae02065f2-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.773915 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/52f1329e-e0dc-4295-8dfc-92fae02065f2-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.773949 4735 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume 
\"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.775798 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.778419 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.786286 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-675db45d77-s6pvt"] Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.786732 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-675db45d77-s6pvt" podUID="5516bd19-016a-4d52-84c9-b3801e1841ca" containerName="proxy-httpd" containerID="cri-o://c41fba7730a1ea0efa3f14c62882c27ae55360beb38b9829009593be86e45cb8" gracePeriod=30 Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.786823 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-675db45d77-s6pvt" podUID="5516bd19-016a-4d52-84c9-b3801e1841ca" containerName="proxy-server" containerID="cri-o://eb379c44e7f01a76229af5fa811391aa9b991614e3a1cc0d30c5cec76be16709" gracePeriod=30 Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.806264 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.806549 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="31465d06-bf56-4173-95c1-3c13d776b1f9" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://e6bc8e996460d65d28494717119455e8c9a9619cda09617d493e57e6fb07bac3" gracePeriod=30 Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.807878 4735 generic.go:334] "Generic (PLEG): container finished" 
podID="74ecad81-3dfd-4f36-be79-699b6bd30e90" containerID="b2ed1987ab45001110bf36a72fc3eb614ca1c3f8a88b1bff605a3d0de79aba7d" exitCode=0 Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.807937 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5574db6b6f-94l7x" event={"ID":"74ecad81-3dfd-4f36-be79-699b6bd30e90","Type":"ContainerDied","Data":"b2ed1987ab45001110bf36a72fc3eb614ca1c3f8a88b1bff605a3d0de79aba7d"} Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.815969 4735 generic.go:334] "Generic (PLEG): container finished" podID="b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb" containerID="0c32b0561e314aa74f9602cc47a55588e33447b929f0779af7d6ee25ee8ff883" exitCode=0 Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.816020 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9frwj" event={"ID":"b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb","Type":"ContainerDied","Data":"0c32b0561e314aa74f9602cc47a55588e33447b929f0779af7d6ee25ee8ff883"} Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.817456 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-ccrws"] Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.819194 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder4fb6-account-delete-dvvxd" event={"ID":"4189bc5a-2f23-4b53-9df4-e97a65a7d8eb","Type":"ContainerStarted","Data":"8c6fca696eefb52e052a4ad3785a57e0494a1c06e43a5610d33e07d0c65557c9"} Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.821714 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron936c-account-delete-hq47l" event={"ID":"f92cd291-9315-4b44-b318-6f029f253de6","Type":"ContainerStarted","Data":"33686f02fc026bb542e0f66bc1e14a5c9d2d923213d11e51c7b35053507fd7c5"} Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.823147 4735 generic.go:334] "Generic (PLEG): container finished" podID="c0cb7948-9a3d-40fa-b4e7-bb6b765af52f" 
containerID="00bb3e3d5baf3ca592174014e03b36040aa0b638cb8a1c2566220b5a3cb65e9b" exitCode=143 Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.823179 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c0cb7948-9a3d-40fa-b4e7-bb6b765af52f","Type":"ContainerDied","Data":"00bb3e3d5baf3ca592174014e03b36040aa0b638cb8a1c2566220b5a3cb65e9b"} Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.829690 4735 generic.go:334] "Generic (PLEG): container finished" podID="22f4e804-ab3e-4858-87b4-84a310c21249" containerID="b7bc2b861ffd2f1d3a1ede5b87a767f230be0eb8b6d7f935ee21e11963a9f933" exitCode=0 Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.829744 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"22f4e804-ab3e-4858-87b4-84a310c21249","Type":"ContainerDied","Data":"b7bc2b861ffd2f1d3a1ede5b87a767f230be0eb8b6d7f935ee21e11963a9f933"} Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.834372 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-j7sl5"] Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.844355 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-ccrws"] Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.845408 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_52f1329e-e0dc-4295-8dfc-92fae02065f2/ovsdbserver-nb/0.log" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.845447 4735 generic.go:334] "Generic (PLEG): container finished" podID="52f1329e-e0dc-4295-8dfc-92fae02065f2" containerID="57cbc0b90f3af7903b01552834d11c94781f0f718d8c5c7767e3069c71b6a736" exitCode=143 Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.846195 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"52f1329e-e0dc-4295-8dfc-92fae02065f2","Type":"ContainerDied","Data":"57cbc0b90f3af7903b01552834d11c94781f0f718d8c5c7767e3069c71b6a736"} Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.846231 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"52f1329e-e0dc-4295-8dfc-92fae02065f2","Type":"ContainerDied","Data":"0bb524772fddde9f18801bfeaa6d2817ae97294471e32ba726b187a9ff636401"} Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.846303 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.857304 4735 generic.go:334] "Generic (PLEG): container finished" podID="1db01523-cd76-4a9d-8063-e024dd919879" containerID="6fcbdbcd2a5b227d1b93c5af337c1732e6685ef4e33f54e3bddf01cee27e6906" exitCode=143 Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.857399 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1db01523-cd76-4a9d-8063-e024dd919879","Type":"ContainerDied","Data":"6fcbdbcd2a5b227d1b93c5af337c1732e6685ef4e33f54e3bddf01cee27e6906"} Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.863527 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.863779 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="8953f639-e468-4f85-8093-6c29fa380d8a" containerName="nova-cell1-conductor-conductor" containerID="cri-o://fadbbd45ef112a85da8685a542a9f033c1f9da61e0a242aea8894a27eef0aecf" gracePeriod=30 Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.877395 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-j7sl5"] Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.877950 4735 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/dnsmasq-dns-6d4d96bb9-7gpr6" event={"ID":"1f333daa-559f-421a-b0a9-fa1de98b72cf","Type":"ContainerDied","Data":"708a497a7fe6cec61076ba788b81bcb2755d35d47993f250f92891d45b78b2b9"} Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.878012 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d4d96bb9-7gpr6" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.905377 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.908723 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="84fa39bb-f66a-49ed-960c-7598ec87854a" containerName="nova-cell0-conductor-conductor" containerID="cri-o://eb3a0242ded2d9f65f661e234e3d416b8c948c5e34c221a07c2a3c7ce662b2fb" gracePeriod=30 Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.909995 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-xjdpv"] Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.931602 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-xjdpv"] Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.940696 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbicanc85c-account-delete-56lmd"] Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.948767 4735 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.956065 4735 generic.go:334] "Generic (PLEG): container finished" podID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerID="dff90633bc3ad4ba06aede8cdacdcc6e496b31b0aade72c08da50cac8d87fc0b" exitCode=0 Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.956120 4735 generic.go:334] "Generic 
(PLEG): container finished" podID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerID="4ee9082bf58fd24e367fd1da09c29e111946f4c4faa922b9579b326f978119c9" exitCode=0 Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.956135 4735 generic.go:334] "Generic (PLEG): container finished" podID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerID="d5db6b982ae606222a36f5fcc188c4abda346b0f7655cfdfa4f6f7337c2397f4" exitCode=0 Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.956144 4735 generic.go:334] "Generic (PLEG): container finished" podID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerID="237f4d6c051f7223b102aeaedaa673ab069d0f15f6817e4945c35de0b812fbff" exitCode=0 Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.956153 4735 generic.go:334] "Generic (PLEG): container finished" podID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerID="2f40ca43ae662cc47bd56ee824d503ccec2b5ae9bf9c20513e8ce20464d2dd97" exitCode=0 Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.956160 4735 generic.go:334] "Generic (PLEG): container finished" podID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerID="309e0783fbd98cb5e4632f3d4d0e2f6a9b58f92ad92a2d4004eaa1a6d7fae30b" exitCode=0 Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.956168 4735 generic.go:334] "Generic (PLEG): container finished" podID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerID="f439ff20d2384d40dd148b0bc28e8bfd91c3d44ddcc97d2ceffaba6872bf825c" exitCode=0 Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.956175 4735 generic.go:334] "Generic (PLEG): container finished" podID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerID="1e668315fdf2d5cc1e4b746212c9eadc171a4f54451b10f098411736e896e124" exitCode=0 Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.956183 4735 generic.go:334] "Generic (PLEG): container finished" podID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerID="0b38bdb53ab1820c96e395176b2856e8ca2eb0cf0636872f34d1914a860cf3a9" exitCode=0 Oct 08 13:19:21 crc kubenswrapper[4735]: 
I1008 13:19:21.956191 4735 generic.go:334] "Generic (PLEG): container finished" podID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerID="33f430e8921d6f23db0786f1f9627335fe796e31b2a0bd9605c6ebfc3decb0c4" exitCode=0 Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.956198 4735 generic.go:334] "Generic (PLEG): container finished" podID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerID="318bd8def163d2bed4ab842bc210c31f81ec6fcdd4d540b67f27fd1cef63ad4c" exitCode=0 Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.956207 4735 generic.go:334] "Generic (PLEG): container finished" podID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerID="d853b3073957b18a6b510a37923cf71788139823a9c69e0f74f142983721b87d" exitCode=0 Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.956215 4735 generic.go:334] "Generic (PLEG): container finished" podID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerID="f07ef2651928239139875a266d70f61a17120a8e70c7d971632cdd042dda58f2" exitCode=0 Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.956223 4735 generic.go:334] "Generic (PLEG): container finished" podID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerID="fee8281668fd0cef347a2cc4b0d529170a8a54372656a674cada64e7758b6de6" exitCode=0 Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.956297 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dc4e9066-e538-4d60-9c5a-3beb689d67a4","Type":"ContainerDied","Data":"dff90633bc3ad4ba06aede8cdacdcc6e496b31b0aade72c08da50cac8d87fc0b"} Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.956327 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dc4e9066-e538-4d60-9c5a-3beb689d67a4","Type":"ContainerDied","Data":"4ee9082bf58fd24e367fd1da09c29e111946f4c4faa922b9579b326f978119c9"} Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.956341 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"dc4e9066-e538-4d60-9c5a-3beb689d67a4","Type":"ContainerDied","Data":"d5db6b982ae606222a36f5fcc188c4abda346b0f7655cfdfa4f6f7337c2397f4"} Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.956354 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dc4e9066-e538-4d60-9c5a-3beb689d67a4","Type":"ContainerDied","Data":"237f4d6c051f7223b102aeaedaa673ab069d0f15f6817e4945c35de0b812fbff"} Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.956365 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dc4e9066-e538-4d60-9c5a-3beb689d67a4","Type":"ContainerDied","Data":"2f40ca43ae662cc47bd56ee824d503ccec2b5ae9bf9c20513e8ce20464d2dd97"} Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.956376 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dc4e9066-e538-4d60-9c5a-3beb689d67a4","Type":"ContainerDied","Data":"309e0783fbd98cb5e4632f3d4d0e2f6a9b58f92ad92a2d4004eaa1a6d7fae30b"} Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.956387 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dc4e9066-e538-4d60-9c5a-3beb689d67a4","Type":"ContainerDied","Data":"f439ff20d2384d40dd148b0bc28e8bfd91c3d44ddcc97d2ceffaba6872bf825c"} Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.956397 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dc4e9066-e538-4d60-9c5a-3beb689d67a4","Type":"ContainerDied","Data":"1e668315fdf2d5cc1e4b746212c9eadc171a4f54451b10f098411736e896e124"} Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.956408 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dc4e9066-e538-4d60-9c5a-3beb689d67a4","Type":"ContainerDied","Data":"0b38bdb53ab1820c96e395176b2856e8ca2eb0cf0636872f34d1914a860cf3a9"} Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 
13:19:21.956418 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dc4e9066-e538-4d60-9c5a-3beb689d67a4","Type":"ContainerDied","Data":"33f430e8921d6f23db0786f1f9627335fe796e31b2a0bd9605c6ebfc3decb0c4"} Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.956428 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dc4e9066-e538-4d60-9c5a-3beb689d67a4","Type":"ContainerDied","Data":"318bd8def163d2bed4ab842bc210c31f81ec6fcdd4d540b67f27fd1cef63ad4c"} Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.956438 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dc4e9066-e538-4d60-9c5a-3beb689d67a4","Type":"ContainerDied","Data":"d853b3073957b18a6b510a37923cf71788139823a9c69e0f74f142983721b87d"} Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.956449 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dc4e9066-e538-4d60-9c5a-3beb689d67a4","Type":"ContainerDied","Data":"f07ef2651928239139875a266d70f61a17120a8e70c7d971632cdd042dda58f2"} Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.956461 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dc4e9066-e538-4d60-9c5a-3beb689d67a4","Type":"ContainerDied","Data":"fee8281668fd0cef347a2cc4b0d529170a8a54372656a674cada64e7758b6de6"} Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.967613 4735 generic.go:334] "Generic (PLEG): container finished" podID="97384a50-9f20-4cbf-8b02-afffe5ddc791" containerID="205355fb6f24a0939888db6199eeaa319bf35bb6b065a98fc683f3a28c8fa854" exitCode=143 Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.967677 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"97384a50-9f20-4cbf-8b02-afffe5ddc791","Type":"ContainerDied","Data":"205355fb6f24a0939888db6199eeaa319bf35bb6b065a98fc683f3a28c8fa854"} 
Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.974038 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-pbkkt" event={"ID":"73b19a4f-4794-49fc-bfd9-727618f11e7c","Type":"ContainerDied","Data":"6000cbfb80846d28ff664946af0c6e79f1d4bfc29b423770d8f89142dd6fe3d3"} Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.974147 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-pbkkt" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.980456 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.980759 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="53890bab-61cc-4f19-908a-33c085da8d33" containerName="nova-scheduler-scheduler" containerID="cri-o://7d7ad1478ae29b27029b1887e1c33fab874aabb21d8774256a3139c949c54c49" gracePeriod=30 Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.986708 4735 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:21 crc kubenswrapper[4735]: I1008 13:19:21.989949 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement364a-account-delete-dg8rs"] Oct 08 13:19:22 crc kubenswrapper[4735]: I1008 13:19:22.022682 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell066dd-account-delete-wnksz"] Oct 08 13:19:22 crc kubenswrapper[4735]: I1008 13:19:22.031816 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance55cc-account-delete-cnr66"] Oct 08 13:19:22 crc kubenswrapper[4735]: I1008 13:19:22.037850 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27af208c-711b-4423-bae1-91b55e84e14a-openstack-config-secret" (OuterVolumeSpecName: 
"openstack-config-secret") pod "27af208c-711b-4423-bae1-91b55e84e14a" (UID: "27af208c-711b-4423-bae1-91b55e84e14a"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:22 crc kubenswrapper[4735]: I1008 13:19:22.056682 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="01ad2819-cfc3-4416-8a27-ecb65c2fa366" containerName="rabbitmq" containerID="cri-o://18cff9aa331d4910576ccc409a19da98fba4e35b7118162bd1347fc1d99cf4af" gracePeriod=604800 Oct 08 13:19:22 crc kubenswrapper[4735]: I1008 13:19:22.102348 4735 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/27af208c-711b-4423-bae1-91b55e84e14a-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:22 crc kubenswrapper[4735]: I1008 13:19:22.130348 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/647cfc89-264d-4616-ab81-64a7c9aa51c5-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "647cfc89-264d-4616-ab81-64a7c9aa51c5" (UID: "647cfc89-264d-4616-ab81-64a7c9aa51c5"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:22 crc kubenswrapper[4735]: I1008 13:19:22.209709 4735 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/647cfc89-264d-4616-ab81-64a7c9aa51c5-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:22 crc kubenswrapper[4735]: E1008 13:19:22.209807 4735 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 08 13:19:22 crc kubenswrapper[4735]: E1008 13:19:22.209861 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/644f0099-ec9c-42af-bbe8-54fe5b603bb3-config-data podName:644f0099-ec9c-42af-bbe8-54fe5b603bb3 nodeName:}" failed. No retries permitted until 2025-10-08 13:19:26.209844489 +0000 UTC m=+1416.018149033 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/644f0099-ec9c-42af-bbe8-54fe5b603bb3-config-data") pod "rabbitmq-server-0" (UID: "644f0099-ec9c-42af-bbe8-54fe5b603bb3") : configmap "rabbitmq-config-data" not found Oct 08 13:19:22 crc kubenswrapper[4735]: I1008 13:19:22.214731 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52f1329e-e0dc-4295-8dfc-92fae02065f2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "52f1329e-e0dc-4295-8dfc-92fae02065f2" (UID: "52f1329e-e0dc-4295-8dfc-92fae02065f2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:22 crc kubenswrapper[4735]: I1008 13:19:22.312092 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52f1329e-e0dc-4295-8dfc-92fae02065f2-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "52f1329e-e0dc-4295-8dfc-92fae02065f2" (UID: "52f1329e-e0dc-4295-8dfc-92fae02065f2"). 
InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:22 crc kubenswrapper[4735]: I1008 13:19:22.313717 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/52f1329e-e0dc-4295-8dfc-92fae02065f2-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:22 crc kubenswrapper[4735]: I1008 13:19:22.313746 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52f1329e-e0dc-4295-8dfc-92fae02065f2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:22 crc kubenswrapper[4735]: I1008 13:19:22.357486 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52f1329e-e0dc-4295-8dfc-92fae02065f2-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "52f1329e-e0dc-4295-8dfc-92fae02065f2" (UID: "52f1329e-e0dc-4295-8dfc-92fae02065f2"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:22 crc kubenswrapper[4735]: E1008 13:19:22.367692 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0c32b0561e314aa74f9602cc47a55588e33447b929f0779af7d6ee25ee8ff883 is running failed: container process not found" containerID="0c32b0561e314aa74f9602cc47a55588e33447b929f0779af7d6ee25ee8ff883" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 08 13:19:22 crc kubenswrapper[4735]: E1008 13:19:22.368923 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0c32b0561e314aa74f9602cc47a55588e33447b929f0779af7d6ee25ee8ff883 is running failed: container process not found" containerID="0c32b0561e314aa74f9602cc47a55588e33447b929f0779af7d6ee25ee8ff883" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 08 13:19:22 crc kubenswrapper[4735]: E1008 13:19:22.370069 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dba547c713f3dc3fb4b7cd134966f3012d70fd6781682f3b57fcf8290868f773" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 08 13:19:22 crc kubenswrapper[4735]: E1008 13:19:22.370179 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0c32b0561e314aa74f9602cc47a55588e33447b929f0779af7d6ee25ee8ff883 is running failed: container process not found" containerID="0c32b0561e314aa74f9602cc47a55588e33447b929f0779af7d6ee25ee8ff883" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 08 13:19:22 crc kubenswrapper[4735]: E1008 13:19:22.370206 4735 prober.go:104] "Probe errored" err="rpc error: 
code = NotFound desc = container is not created or running: checking if PID of 0c32b0561e314aa74f9602cc47a55588e33447b929f0779af7d6ee25ee8ff883 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-9frwj" podUID="b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb" containerName="ovsdb-server" Oct 08 13:19:22 crc kubenswrapper[4735]: E1008 13:19:22.391404 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dba547c713f3dc3fb4b7cd134966f3012d70fd6781682f3b57fcf8290868f773" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 08 13:19:22 crc kubenswrapper[4735]: E1008 13:19:22.398074 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dba547c713f3dc3fb4b7cd134966f3012d70fd6781682f3b57fcf8290868f773" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 08 13:19:22 crc kubenswrapper[4735]: E1008 13:19:22.398188 4735 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-9frwj" podUID="b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb" containerName="ovs-vswitchd" Oct 08 13:19:22 crc kubenswrapper[4735]: I1008 13:19:22.417687 4735 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/52f1329e-e0dc-4295-8dfc-92fae02065f2-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:22 crc kubenswrapper[4735]: E1008 13:19:22.589363 4735 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4189bc5a_2f23_4b53_9df4_e97a65a7d8eb.slice/crio-conmon-f91e04b6239fa23dd178148df72ab2a06405051f7c6cb712411b935fbe3a9052.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5516bd19_016a_4d52_84c9_b3801e1841ca.slice/crio-conmon-eb379c44e7f01a76229af5fa811391aa9b991614e3a1cc0d30c5cec76be16709.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4189bc5a_2f23_4b53_9df4_e97a65a7d8eb.slice/crio-f91e04b6239fa23dd178148df72ab2a06405051f7c6cb712411b935fbe3a9052.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2868ea84_fef0_4728_9a62_e2ba97a9aae1.slice/crio-c0fd4ca14cbd2147c2ea9e79919ae0922f959bc15036da6d763f5b4768a6c66e.scope\": RecentStats: unable to find data in memory cache]" Oct 08 13:19:22 crc kubenswrapper[4735]: I1008 13:19:22.727179 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-cell1-novncproxy-0" podUID="31465d06-bf56-4173-95c1-3c13d776b1f9" containerName="nova-cell1-novncproxy-novncproxy" probeResult="failure" output="Get \"https://10.217.0.195:6080/vnc_lite.html\": dial tcp 10.217.0.195:6080: connect: connection refused" Oct 08 13:19:22 crc kubenswrapper[4735]: E1008 13:19:22.754104 4735 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 08 13:19:22 crc kubenswrapper[4735]: E1008 13:19:22.754177 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/01ad2819-cfc3-4416-8a27-ecb65c2fa366-config-data podName:01ad2819-cfc3-4416-8a27-ecb65c2fa366 nodeName:}" failed. No retries permitted until 2025-10-08 13:19:26.754159608 +0000 UTC m=+1416.562464152 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/01ad2819-cfc3-4416-8a27-ecb65c2fa366-config-data") pod "rabbitmq-cell1-server-0" (UID: "01ad2819-cfc3-4416-8a27-ecb65c2fa366") : configmap "rabbitmq-cell1-config-data" not found Oct 08 13:19:22 crc kubenswrapper[4735]: I1008 13:19:22.934862 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b3632f1-e8ac-49f8-8cfe-c54bcf7b86bc" path="/var/lib/kubelet/pods/0b3632f1-e8ac-49f8-8cfe-c54bcf7b86bc/volumes" Oct 08 13:19:22 crc kubenswrapper[4735]: I1008 13:19:22.935368 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e985c68-b9b2-4bb0-bf30-85abd2ea3c69" path="/var/lib/kubelet/pods/0e985c68-b9b2-4bb0-bf30-85abd2ea3c69/volumes" Oct 08 13:19:22 crc kubenswrapper[4735]: I1008 13:19:22.935834 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d4ee7a9-f521-442b-8c8e-354fbc7dbbda" path="/var/lib/kubelet/pods/1d4ee7a9-f521-442b-8c8e-354fbc7dbbda/volumes" Oct 08 13:19:22 crc kubenswrapper[4735]: I1008 13:19:22.936296 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20a67a90-368f-48f4-809b-6db81964983e" path="/var/lib/kubelet/pods/20a67a90-368f-48f4-809b-6db81964983e/volumes" Oct 08 13:19:22 crc kubenswrapper[4735]: I1008 13:19:22.947502 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27af208c-711b-4423-bae1-91b55e84e14a" path="/var/lib/kubelet/pods/27af208c-711b-4423-bae1-91b55e84e14a/volumes" Oct 08 13:19:22 crc kubenswrapper[4735]: I1008 13:19:22.947968 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c371503-0a4b-4fdb-90c8-49bd2bef645e" path="/var/lib/kubelet/pods/2c371503-0a4b-4fdb-90c8-49bd2bef645e/volumes" Oct 08 13:19:22 crc kubenswrapper[4735]: I1008 13:19:22.948711 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e294ddd-2861-4cfb-97b5-d4283cc37e4e" 
path="/var/lib/kubelet/pods/3e294ddd-2861-4cfb-97b5-d4283cc37e4e/volumes" Oct 08 13:19:22 crc kubenswrapper[4735]: I1008 13:19:22.949365 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bc2b0c9-48ad-4541-8f5e-71e6929a2aa8" path="/var/lib/kubelet/pods/5bc2b0c9-48ad-4541-8f5e-71e6929a2aa8/volumes" Oct 08 13:19:22 crc kubenswrapper[4735]: I1008 13:19:22.950727 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6056bb74-3465-4e20-8b68-877132ab11fb" path="/var/lib/kubelet/pods/6056bb74-3465-4e20-8b68-877132ab11fb/volumes" Oct 08 13:19:22 crc kubenswrapper[4735]: I1008 13:19:22.951244 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66bc99ab-4fa6-447b-8dbb-b612ed9e0036" path="/var/lib/kubelet/pods/66bc99ab-4fa6-447b-8dbb-b612ed9e0036/volumes" Oct 08 13:19:22 crc kubenswrapper[4735]: I1008 13:19:22.951769 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72461694-86b9-4133-9970-302416099748" path="/var/lib/kubelet/pods/72461694-86b9-4133-9970-302416099748/volumes" Oct 08 13:19:22 crc kubenswrapper[4735]: I1008 13:19:22.952899 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="826e70f1-e543-4a90-93ba-a671ee1767b2" path="/var/lib/kubelet/pods/826e70f1-e543-4a90-93ba-a671ee1767b2/volumes" Oct 08 13:19:22 crc kubenswrapper[4735]: I1008 13:19:22.953346 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85383db5-f6cc-433b-b8d9-7b5397aeadd4" path="/var/lib/kubelet/pods/85383db5-f6cc-433b-b8d9-7b5397aeadd4/volumes" Oct 08 13:19:22 crc kubenswrapper[4735]: I1008 13:19:22.953941 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b452547c-5bac-4df5-a0e6-d969416877fc" path="/var/lib/kubelet/pods/b452547c-5bac-4df5-a0e6-d969416877fc/volumes" Oct 08 13:19:22 crc kubenswrapper[4735]: I1008 13:19:22.954774 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5d95702-991f-4f45-94e3-45650c900b1a" 
path="/var/lib/kubelet/pods/b5d95702-991f-4f45-94e3-45650c900b1a/volumes" Oct 08 13:19:22 crc kubenswrapper[4735]: I1008 13:19:22.956098 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c138807f-c003-44cc-ae0e-2491ba8602df" path="/var/lib/kubelet/pods/c138807f-c003-44cc-ae0e-2491ba8602df/volumes" Oct 08 13:19:22 crc kubenswrapper[4735]: I1008 13:19:22.956634 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df3ac5b8-1365-49a1-8a4e-0fa11bd0cd83" path="/var/lib/kubelet/pods/df3ac5b8-1365-49a1-8a4e-0fa11bd0cd83/volumes" Oct 08 13:19:22 crc kubenswrapper[4735]: I1008 13:19:22.957132 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e203cdd5-6f7f-4e2e-9be1-3d467c6005f1" path="/var/lib/kubelet/pods/e203cdd5-6f7f-4e2e-9be1-3d467c6005f1/volumes" Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.016459 4735 generic.go:334] "Generic (PLEG): container finished" podID="ca78485a-0626-420e-aa59-a9a4680df83c" containerID="719ed4d6e6ca3be4b585ba42337fe3c6bf926775f7cb265506efe244531bac65" exitCode=143 Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.016633 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6dd5468f46-ccf7d" event={"ID":"ca78485a-0626-420e-aa59-a9a4680df83c","Type":"ContainerDied","Data":"719ed4d6e6ca3be4b585ba42337fe3c6bf926775f7cb265506efe244531bac65"} Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.045691 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement364a-account-delete-dg8rs" event={"ID":"6fe2bc1d-7e65-45ba-9d42-cae899859258","Type":"ContainerStarted","Data":"dd1f9def7f6ada112889bfee6873367609de38290aa5db020e0c8fdfbe703326"} Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.066236 4735 generic.go:334] "Generic (PLEG): container finished" podID="2868ea84-fef0-4728-9a62-e2ba97a9aae1" containerID="c0fd4ca14cbd2147c2ea9e79919ae0922f959bc15036da6d763f5b4768a6c66e" exitCode=0 Oct 08 13:19:23 crc 
kubenswrapper[4735]: I1008 13:19:23.066328 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"2868ea84-fef0-4728-9a62-e2ba97a9aae1","Type":"ContainerDied","Data":"c0fd4ca14cbd2147c2ea9e79919ae0922f959bc15036da6d763f5b4768a6c66e"} Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.068453 4735 generic.go:334] "Generic (PLEG): container finished" podID="8953f639-e468-4f85-8093-6c29fa380d8a" containerID="fadbbd45ef112a85da8685a542a9f033c1f9da61e0a242aea8894a27eef0aecf" exitCode=0 Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.068550 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"8953f639-e468-4f85-8093-6c29fa380d8a","Type":"ContainerDied","Data":"fadbbd45ef112a85da8685a542a9f033c1f9da61e0a242aea8894a27eef0aecf"} Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.072196 4735 generic.go:334] "Generic (PLEG): container finished" podID="4189bc5a-2f23-4b53-9df4-e97a65a7d8eb" containerID="f91e04b6239fa23dd178148df72ab2a06405051f7c6cb712411b935fbe3a9052" exitCode=0 Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.072264 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder4fb6-account-delete-dvvxd" event={"ID":"4189bc5a-2f23-4b53-9df4-e97a65a7d8eb","Type":"ContainerDied","Data":"f91e04b6239fa23dd178148df72ab2a06405051f7c6cb712411b935fbe3a9052"} Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.116348 4735 generic.go:334] "Generic (PLEG): container finished" podID="5516bd19-016a-4d52-84c9-b3801e1841ca" containerID="eb379c44e7f01a76229af5fa811391aa9b991614e3a1cc0d30c5cec76be16709" exitCode=0 Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.116388 4735 generic.go:334] "Generic (PLEG): container finished" podID="5516bd19-016a-4d52-84c9-b3801e1841ca" containerID="c41fba7730a1ea0efa3f14c62882c27ae55360beb38b9829009593be86e45cb8" exitCode=0 Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.116463 
4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-675db45d77-s6pvt" event={"ID":"5516bd19-016a-4d52-84c9-b3801e1841ca","Type":"ContainerDied","Data":"eb379c44e7f01a76229af5fa811391aa9b991614e3a1cc0d30c5cec76be16709"} Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.116510 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-675db45d77-s6pvt" event={"ID":"5516bd19-016a-4d52-84c9-b3801e1841ca","Type":"ContainerDied","Data":"c41fba7730a1ea0efa3f14c62882c27ae55360beb38b9829009593be86e45cb8"} Oct 08 13:19:23 crc kubenswrapper[4735]: E1008 13:19:23.126729 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7d7ad1478ae29b27029b1887e1c33fab874aabb21d8774256a3139c949c54c49 is running failed: container process not found" containerID="7d7ad1478ae29b27029b1887e1c33fab874aabb21d8774256a3139c949c54c49" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 08 13:19:23 crc kubenswrapper[4735]: E1008 13:19:23.129464 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7d7ad1478ae29b27029b1887e1c33fab874aabb21d8774256a3139c949c54c49 is running failed: container process not found" containerID="7d7ad1478ae29b27029b1887e1c33fab874aabb21d8774256a3139c949c54c49" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 08 13:19:23 crc kubenswrapper[4735]: E1008 13:19:23.130377 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7d7ad1478ae29b27029b1887e1c33fab874aabb21d8774256a3139c949c54c49 is running failed: container process not found" containerID="7d7ad1478ae29b27029b1887e1c33fab874aabb21d8774256a3139c949c54c49" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 08 13:19:23 crc kubenswrapper[4735]: E1008 
13:19:23.130408 4735 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7d7ad1478ae29b27029b1887e1c33fab874aabb21d8774256a3139c949c54c49 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="53890bab-61cc-4f19-908a-33c085da8d33" containerName="nova-scheduler-scheduler" Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.130659 4735 generic.go:334] "Generic (PLEG): container finished" podID="0172640d-d560-4ef0-a7e4-b649ccfa92e4" containerID="650fdbee91d8f7c999b8abf663b1b798642af3b4a56514b0fb884a65ddb1a06f" exitCode=0 Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.130845 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbicanc85c-account-delete-56lmd" event={"ID":"0172640d-d560-4ef0-a7e4-b649ccfa92e4","Type":"ContainerDied","Data":"650fdbee91d8f7c999b8abf663b1b798642af3b4a56514b0fb884a65ddb1a06f"} Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.136987 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance55cc-account-delete-cnr66" event={"ID":"f952e88e-fe68-4770-874c-fdee88d43aee","Type":"ContainerStarted","Data":"8dbec95c4ce8c28ba5659417c27c67fb59975963e9a422e3a204e96c0b171072"} Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.137842 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell066dd-account-delete-wnksz" event={"ID":"9d6e2da8-2dc0-4642-b216-218ed61cc813","Type":"ContainerStarted","Data":"f83e4017370d86c1326fb60e5ed66078b6d39a8c2ce5732684f5b16a17e01fc3"} Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.140267 4735 generic.go:334] "Generic (PLEG): container finished" podID="31465d06-bf56-4173-95c1-3c13d776b1f9" containerID="e6bc8e996460d65d28494717119455e8c9a9619cda09617d493e57e6fb07bac3" exitCode=0 Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.140321 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-novncproxy-0" event={"ID":"31465d06-bf56-4173-95c1-3c13d776b1f9","Type":"ContainerDied","Data":"e6bc8e996460d65d28494717119455e8c9a9619cda09617d493e57e6fb07bac3"} Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.157775 4735 generic.go:334] "Generic (PLEG): container finished" podID="84fa39bb-f66a-49ed-960c-7598ec87854a" containerID="eb3a0242ded2d9f65f661e234e3d416b8c948c5e34c221a07c2a3c7ce662b2fb" exitCode=0 Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.157883 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"84fa39bb-f66a-49ed-960c-7598ec87854a","Type":"ContainerDied","Data":"eb3a0242ded2d9f65f661e234e3d416b8c948c5e34c221a07c2a3c7ce662b2fb"} Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.161461 4735 generic.go:334] "Generic (PLEG): container finished" podID="f92cd291-9315-4b44-b318-6f029f253de6" containerID="12338dc1882c13a1adb79c38b8e613ffdac5d41d70ba8479854eb311be932cc8" exitCode=0 Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.161553 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron936c-account-delete-hq47l" event={"ID":"f92cd291-9315-4b44-b318-6f029f253de6","Type":"ContainerDied","Data":"12338dc1882c13a1adb79c38b8e613ffdac5d41d70ba8479854eb311be932cc8"} Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.164459 4735 generic.go:334] "Generic (PLEG): container finished" podID="ada58119-3f96-4962-b214-fb2662c3d935" containerID="6d64df6cc879792395fce1a8a219ef400d1f7cfe418d3ee5312fec24e31fc592" exitCode=0 Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.164561 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6cc55c9557-96kwz" event={"ID":"ada58119-3f96-4962-b214-fb2662c3d935","Type":"ContainerDied","Data":"6d64df6cc879792395fce1a8a219ef400d1f7cfe418d3ee5312fec24e31fc592"} Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.164593 4735 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/barbican-worker-6cc55c9557-96kwz" event={"ID":"ada58119-3f96-4962-b214-fb2662c3d935","Type":"ContainerDied","Data":"71f900631d93d0716e11248ae62d2fde6fcc02fcdbcce25da12c4a4c029488c5"} Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.164607 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71f900631d93d0716e11248ae62d2fde6fcc02fcdbcce25da12c4a4c029488c5" Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.201394 4735 scope.go:117] "RemoveContainer" containerID="a1a74be40986832ce5bc77d779a7f7c5cc2c984405440c30e09b161c8492e341" Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.268024 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6cc55c9557-96kwz" Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.293008 4735 scope.go:117] "RemoveContainer" containerID="ce728a0dd44fff56f85bc83d0ecfb0de2a06e0305143dfb8dc7aa43bf3d0dbe6" Oct 08 13:19:23 crc kubenswrapper[4735]: E1008 13:19:23.302303 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="af561faafdd478b081c4107185420b4e9d3e3ef31f1f6fc09c0c8a499ec07cbe" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 08 13:19:23 crc kubenswrapper[4735]: E1008 13:19:23.304032 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="af561faafdd478b081c4107185420b4e9d3e3ef31f1f6fc09c0c8a499ec07cbe" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 08 13:19:23 crc kubenswrapper[4735]: E1008 13:19:23.310246 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is 
stopping, stdout: , stderr: , exit code -1" containerID="af561faafdd478b081c4107185420b4e9d3e3ef31f1f6fc09c0c8a499ec07cbe" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 08 13:19:23 crc kubenswrapper[4735]: E1008 13:19:23.310288 4735 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6" containerName="ovn-northd" Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.338806 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.344429 4735 scope.go:117] "RemoveContainer" containerID="72400ada84317756c3507ee2deab0d0b9d034a28342458ff546cbf1cdfbc990b" Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.375231 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2868ea84-fef0-4728-9a62-e2ba97a9aae1-combined-ca-bundle\") pod \"2868ea84-fef0-4728-9a62-e2ba97a9aae1\" (UID: \"2868ea84-fef0-4728-9a62-e2ba97a9aae1\") " Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.375304 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ada58119-3f96-4962-b214-fb2662c3d935-logs\") pod \"ada58119-3f96-4962-b214-fb2662c3d935\" (UID: \"ada58119-3f96-4962-b214-fb2662c3d935\") " Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.375336 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2868ea84-fef0-4728-9a62-e2ba97a9aae1-operator-scripts\") pod \"2868ea84-fef0-4728-9a62-e2ba97a9aae1\" (UID: \"2868ea84-fef0-4728-9a62-e2ba97a9aae1\") " Oct 08 13:19:23 crc kubenswrapper[4735]: 
I1008 13:19:23.375429 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ada58119-3f96-4962-b214-fb2662c3d935-combined-ca-bundle\") pod \"ada58119-3f96-4962-b214-fb2662c3d935\" (UID: \"ada58119-3f96-4962-b214-fb2662c3d935\") " Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.375470 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/2868ea84-fef0-4728-9a62-e2ba97a9aae1-secrets\") pod \"2868ea84-fef0-4728-9a62-e2ba97a9aae1\" (UID: \"2868ea84-fef0-4728-9a62-e2ba97a9aae1\") " Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.375558 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2868ea84-fef0-4728-9a62-e2ba97a9aae1-kolla-config\") pod \"2868ea84-fef0-4728-9a62-e2ba97a9aae1\" (UID: \"2868ea84-fef0-4728-9a62-e2ba97a9aae1\") " Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.375600 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2868ea84-fef0-4728-9a62-e2ba97a9aae1-galera-tls-certs\") pod \"2868ea84-fef0-4728-9a62-e2ba97a9aae1\" (UID: \"2868ea84-fef0-4728-9a62-e2ba97a9aae1\") " Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.375628 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ada58119-3f96-4962-b214-fb2662c3d935-config-data-custom\") pod \"ada58119-3f96-4962-b214-fb2662c3d935\" (UID: \"ada58119-3f96-4962-b214-fb2662c3d935\") " Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.375700 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xftvs\" (UniqueName: \"kubernetes.io/projected/ada58119-3f96-4962-b214-fb2662c3d935-kube-api-access-xftvs\") pod 
\"ada58119-3f96-4962-b214-fb2662c3d935\" (UID: \"ada58119-3f96-4962-b214-fb2662c3d935\") " Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.375752 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"2868ea84-fef0-4728-9a62-e2ba97a9aae1\" (UID: \"2868ea84-fef0-4728-9a62-e2ba97a9aae1\") " Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.375778 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2868ea84-fef0-4728-9a62-e2ba97a9aae1-config-data-default\") pod \"2868ea84-fef0-4728-9a62-e2ba97a9aae1\" (UID: \"2868ea84-fef0-4728-9a62-e2ba97a9aae1\") " Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.375828 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75qpb\" (UniqueName: \"kubernetes.io/projected/2868ea84-fef0-4728-9a62-e2ba97a9aae1-kube-api-access-75qpb\") pod \"2868ea84-fef0-4728-9a62-e2ba97a9aae1\" (UID: \"2868ea84-fef0-4728-9a62-e2ba97a9aae1\") " Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.375861 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ada58119-3f96-4962-b214-fb2662c3d935-config-data\") pod \"ada58119-3f96-4962-b214-fb2662c3d935\" (UID: \"ada58119-3f96-4962-b214-fb2662c3d935\") " Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.375886 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2868ea84-fef0-4728-9a62-e2ba97a9aae1-config-data-generated\") pod \"2868ea84-fef0-4728-9a62-e2ba97a9aae1\" (UID: \"2868ea84-fef0-4728-9a62-e2ba97a9aae1\") " Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.375980 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/ada58119-3f96-4962-b214-fb2662c3d935-logs" (OuterVolumeSpecName: "logs") pod "ada58119-3f96-4962-b214-fb2662c3d935" (UID: "ada58119-3f96-4962-b214-fb2662c3d935"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.376064 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2868ea84-fef0-4728-9a62-e2ba97a9aae1-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "2868ea84-fef0-4728-9a62-e2ba97a9aae1" (UID: "2868ea84-fef0-4728-9a62-e2ba97a9aae1"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.376335 4735 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2868ea84-fef0-4728-9a62-e2ba97a9aae1-kolla-config\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.376349 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ada58119-3f96-4962-b214-fb2662c3d935-logs\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.376595 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2868ea84-fef0-4728-9a62-e2ba97a9aae1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2868ea84-fef0-4728-9a62-e2ba97a9aae1" (UID: "2868ea84-fef0-4728-9a62-e2ba97a9aae1"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.378521 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2868ea84-fef0-4728-9a62-e2ba97a9aae1-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "2868ea84-fef0-4728-9a62-e2ba97a9aae1" (UID: "2868ea84-fef0-4728-9a62-e2ba97a9aae1"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.380413 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2868ea84-fef0-4728-9a62-e2ba97a9aae1-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "2868ea84-fef0-4728-9a62-e2ba97a9aae1" (UID: "2868ea84-fef0-4728-9a62-e2ba97a9aae1"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.383457 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2868ea84-fef0-4728-9a62-e2ba97a9aae1-kube-api-access-75qpb" (OuterVolumeSpecName: "kube-api-access-75qpb") pod "2868ea84-fef0-4728-9a62-e2ba97a9aae1" (UID: "2868ea84-fef0-4728-9a62-e2ba97a9aae1"). InnerVolumeSpecName "kube-api-access-75qpb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.384314 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ada58119-3f96-4962-b214-fb2662c3d935-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ada58119-3f96-4962-b214-fb2662c3d935" (UID: "ada58119-3f96-4962-b214-fb2662c3d935"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.397461 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2868ea84-fef0-4728-9a62-e2ba97a9aae1-secrets" (OuterVolumeSpecName: "secrets") pod "2868ea84-fef0-4728-9a62-e2ba97a9aae1" (UID: "2868ea84-fef0-4728-9a62-e2ba97a9aae1"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.401557 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ada58119-3f96-4962-b214-fb2662c3d935-kube-api-access-xftvs" (OuterVolumeSpecName: "kube-api-access-xftvs") pod "ada58119-3f96-4962-b214-fb2662c3d935" (UID: "ada58119-3f96-4962-b214-fb2662c3d935"). InnerVolumeSpecName "kube-api-access-xftvs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:19:23 crc kubenswrapper[4735]: E1008 13:19:23.408112 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fadbbd45ef112a85da8685a542a9f033c1f9da61e0a242aea8894a27eef0aecf is running failed: container process not found" containerID="fadbbd45ef112a85da8685a542a9f033c1f9da61e0a242aea8894a27eef0aecf" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 08 13:19:23 crc kubenswrapper[4735]: E1008 13:19:23.410004 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fadbbd45ef112a85da8685a542a9f033c1f9da61e0a242aea8894a27eef0aecf is running failed: container process not found" containerID="fadbbd45ef112a85da8685a542a9f033c1f9da61e0a242aea8894a27eef0aecf" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 08 13:19:23 crc kubenswrapper[4735]: E1008 13:19:23.410467 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = 
NotFound desc = container is not created or running: checking if PID of fadbbd45ef112a85da8685a542a9f033c1f9da61e0a242aea8894a27eef0aecf is running failed: container process not found" containerID="fadbbd45ef112a85da8685a542a9f033c1f9da61e0a242aea8894a27eef0aecf" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 08 13:19:23 crc kubenswrapper[4735]: E1008 13:19:23.410515 4735 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fadbbd45ef112a85da8685a542a9f033c1f9da61e0a242aea8894a27eef0aecf is running failed: container process not found" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="8953f639-e468-4f85-8093-6c29fa380d8a" containerName="nova-cell1-conductor-conductor" Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.425878 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "mysql-db") pod "2868ea84-fef0-4728-9a62-e2ba97a9aae1" (UID: "2868ea84-fef0-4728-9a62-e2ba97a9aae1"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.434938 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.469003 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ada58119-3f96-4962-b214-fb2662c3d935-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ada58119-3f96-4962-b214-fb2662c3d935" (UID: "ada58119-3f96-4962-b214-fb2662c3d935"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.477731 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31465d06-bf56-4173-95c1-3c13d776b1f9-combined-ca-bundle\") pod \"31465d06-bf56-4173-95c1-3c13d776b1f9\" (UID: \"31465d06-bf56-4173-95c1-3c13d776b1f9\") " Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.477855 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/31465d06-bf56-4173-95c1-3c13d776b1f9-nova-novncproxy-tls-certs\") pod \"31465d06-bf56-4173-95c1-3c13d776b1f9\" (UID: \"31465d06-bf56-4173-95c1-3c13d776b1f9\") " Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.477917 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqrgf\" (UniqueName: \"kubernetes.io/projected/31465d06-bf56-4173-95c1-3c13d776b1f9-kube-api-access-jqrgf\") pod \"31465d06-bf56-4173-95c1-3c13d776b1f9\" (UID: \"31465d06-bf56-4173-95c1-3c13d776b1f9\") " Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.477954 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31465d06-bf56-4173-95c1-3c13d776b1f9-config-data\") pod \"31465d06-bf56-4173-95c1-3c13d776b1f9\" (UID: \"31465d06-bf56-4173-95c1-3c13d776b1f9\") " Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.478039 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/31465d06-bf56-4173-95c1-3c13d776b1f9-vencrypt-tls-certs\") pod \"31465d06-bf56-4173-95c1-3c13d776b1f9\" (UID: \"31465d06-bf56-4173-95c1-3c13d776b1f9\") " Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.478815 4735 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/2868ea84-fef0-4728-9a62-e2ba97a9aae1-operator-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.478831 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ada58119-3f96-4962-b214-fb2662c3d935-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.478840 4735 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/2868ea84-fef0-4728-9a62-e2ba97a9aae1-secrets\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.478849 4735 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ada58119-3f96-4962-b214-fb2662c3d935-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.478858 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xftvs\" (UniqueName: \"kubernetes.io/projected/ada58119-3f96-4962-b214-fb2662c3d935-kube-api-access-xftvs\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.478881 4735 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.478892 4735 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2868ea84-fef0-4728-9a62-e2ba97a9aae1-config-data-default\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.478903 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75qpb\" (UniqueName: \"kubernetes.io/projected/2868ea84-fef0-4728-9a62-e2ba97a9aae1-kube-api-access-75qpb\") on node 
\"crc\" DevicePath \"\"" Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.478913 4735 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2868ea84-fef0-4728-9a62-e2ba97a9aae1-config-data-generated\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.481555 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-675db45d77-s6pvt" Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.488072 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-pbkkt"] Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.495553 4735 scope.go:117] "RemoveContainer" containerID="57cbc0b90f3af7903b01552834d11c94781f0f718d8c5c7767e3069c71b6a736" Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.501353 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-pbkkt"] Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.508887 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31465d06-bf56-4173-95c1-3c13d776b1f9-kube-api-access-jqrgf" (OuterVolumeSpecName: "kube-api-access-jqrgf") pod "31465d06-bf56-4173-95c1-3c13d776b1f9" (UID: "31465d06-bf56-4173-95c1-3c13d776b1f9"). InnerVolumeSpecName "kube-api-access-jqrgf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.516633 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.546080 4735 scope.go:117] "RemoveContainer" containerID="72400ada84317756c3507ee2deab0d0b9d034a28342458ff546cbf1cdfbc990b" Oct 08 13:19:23 crc kubenswrapper[4735]: E1008 13:19:23.547665 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72400ada84317756c3507ee2deab0d0b9d034a28342458ff546cbf1cdfbc990b\": container with ID starting with 72400ada84317756c3507ee2deab0d0b9d034a28342458ff546cbf1cdfbc990b not found: ID does not exist" containerID="72400ada84317756c3507ee2deab0d0b9d034a28342458ff546cbf1cdfbc990b" Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.550417 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72400ada84317756c3507ee2deab0d0b9d034a28342458ff546cbf1cdfbc990b"} err="failed to get container status \"72400ada84317756c3507ee2deab0d0b9d034a28342458ff546cbf1cdfbc990b\": rpc error: code = NotFound desc = could not find container \"72400ada84317756c3507ee2deab0d0b9d034a28342458ff546cbf1cdfbc990b\": container with ID starting with 72400ada84317756c3507ee2deab0d0b9d034a28342458ff546cbf1cdfbc990b not found: ID does not exist" Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.551105 4735 scope.go:117] "RemoveContainer" containerID="57cbc0b90f3af7903b01552834d11c94781f0f718d8c5c7767e3069c71b6a736" Oct 08 13:19:23 crc kubenswrapper[4735]: E1008 13:19:23.552434 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57cbc0b90f3af7903b01552834d11c94781f0f718d8c5c7767e3069c71b6a736\": container with ID starting with 57cbc0b90f3af7903b01552834d11c94781f0f718d8c5c7767e3069c71b6a736 not found: ID does not exist" containerID="57cbc0b90f3af7903b01552834d11c94781f0f718d8c5c7767e3069c71b6a736" Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 
13:19:23.552612 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57cbc0b90f3af7903b01552834d11c94781f0f718d8c5c7767e3069c71b6a736"} err="failed to get container status \"57cbc0b90f3af7903b01552834d11c94781f0f718d8c5c7767e3069c71b6a736\": rpc error: code = NotFound desc = could not find container \"57cbc0b90f3af7903b01552834d11c94781f0f718d8c5c7767e3069c71b6a736\": container with ID starting with 57cbc0b90f3af7903b01552834d11c94781f0f718d8c5c7767e3069c71b6a736 not found: ID does not exist" Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.552738 4735 scope.go:117] "RemoveContainer" containerID="1c0e8f5f9cd61eb363c77155195f6cdd017ff1a8da9e953758ecbda931689b32" Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.562861 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d4d96bb9-7gpr6"] Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.584399 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5516bd19-016a-4d52-84c9-b3801e1841ca-public-tls-certs\") pod \"5516bd19-016a-4d52-84c9-b3801e1841ca\" (UID: \"5516bd19-016a-4d52-84c9-b3801e1841ca\") " Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.584544 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jthzs\" (UniqueName: \"kubernetes.io/projected/5516bd19-016a-4d52-84c9-b3801e1841ca-kube-api-access-jthzs\") pod \"5516bd19-016a-4d52-84c9-b3801e1841ca\" (UID: \"5516bd19-016a-4d52-84c9-b3801e1841ca\") " Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.584603 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8953f639-e468-4f85-8093-6c29fa380d8a-config-data\") pod \"8953f639-e468-4f85-8093-6c29fa380d8a\" (UID: \"8953f639-e468-4f85-8093-6c29fa380d8a\") " Oct 08 13:19:23 crc kubenswrapper[4735]: 
I1008 13:19:23.584743 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5516bd19-016a-4d52-84c9-b3801e1841ca-combined-ca-bundle\") pod \"5516bd19-016a-4d52-84c9-b3801e1841ca\" (UID: \"5516bd19-016a-4d52-84c9-b3801e1841ca\") " Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.584778 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5516bd19-016a-4d52-84c9-b3801e1841ca-run-httpd\") pod \"5516bd19-016a-4d52-84c9-b3801e1841ca\" (UID: \"5516bd19-016a-4d52-84c9-b3801e1841ca\") " Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.584933 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5516bd19-016a-4d52-84c9-b3801e1841ca-etc-swift\") pod \"5516bd19-016a-4d52-84c9-b3801e1841ca\" (UID: \"5516bd19-016a-4d52-84c9-b3801e1841ca\") " Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.585019 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5516bd19-016a-4d52-84c9-b3801e1841ca-internal-tls-certs\") pod \"5516bd19-016a-4d52-84c9-b3801e1841ca\" (UID: \"5516bd19-016a-4d52-84c9-b3801e1841ca\") " Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.585076 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtvbf\" (UniqueName: \"kubernetes.io/projected/8953f639-e468-4f85-8093-6c29fa380d8a-kube-api-access-vtvbf\") pod \"8953f639-e468-4f85-8093-6c29fa380d8a\" (UID: \"8953f639-e468-4f85-8093-6c29fa380d8a\") " Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.585102 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5516bd19-016a-4d52-84c9-b3801e1841ca-log-httpd\") pod 
\"5516bd19-016a-4d52-84c9-b3801e1841ca\" (UID: \"5516bd19-016a-4d52-84c9-b3801e1841ca\") " Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.585177 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5516bd19-016a-4d52-84c9-b3801e1841ca-config-data\") pod \"5516bd19-016a-4d52-84c9-b3801e1841ca\" (UID: \"5516bd19-016a-4d52-84c9-b3801e1841ca\") " Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.585208 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8953f639-e468-4f85-8093-6c29fa380d8a-combined-ca-bundle\") pod \"8953f639-e468-4f85-8093-6c29fa380d8a\" (UID: \"8953f639-e468-4f85-8093-6c29fa380d8a\") " Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.586047 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqrgf\" (UniqueName: \"kubernetes.io/projected/31465d06-bf56-4173-95c1-3c13d776b1f9-kube-api-access-jqrgf\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.595380 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5516bd19-016a-4d52-84c9-b3801e1841ca-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5516bd19-016a-4d52-84c9-b3801e1841ca" (UID: "5516bd19-016a-4d52-84c9-b3801e1841ca"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.597538 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2868ea84-fef0-4728-9a62-e2ba97a9aae1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2868ea84-fef0-4728-9a62-e2ba97a9aae1" (UID: "2868ea84-fef0-4728-9a62-e2ba97a9aae1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.602831 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5516bd19-016a-4d52-84c9-b3801e1841ca-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5516bd19-016a-4d52-84c9-b3801e1841ca" (UID: "5516bd19-016a-4d52-84c9-b3801e1841ca"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.604179 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d4d96bb9-7gpr6"] Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.618357 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.629696 4735 scope.go:117] "RemoveContainer" containerID="badbfb090802af15f374dcb0a7e9ec38f6812bb6da7f93af0cc1dcf2371d9a2c" Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.635375 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.643540 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-bl6gn"] Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.647859 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5516bd19-016a-4d52-84c9-b3801e1841ca-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "5516bd19-016a-4d52-84c9-b3801e1841ca" (UID: "5516bd19-016a-4d52-84c9-b3801e1841ca"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.649574 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-bl6gn"] Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.649579 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5516bd19-016a-4d52-84c9-b3801e1841ca-kube-api-access-jthzs" (OuterVolumeSpecName: "kube-api-access-jthzs") pod "5516bd19-016a-4d52-84c9-b3801e1841ca" (UID: "5516bd19-016a-4d52-84c9-b3801e1841ca"). InnerVolumeSpecName "kube-api-access-jthzs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.662395 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron936c-account-delete-hq47l"] Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.665243 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8953f639-e468-4f85-8093-6c29fa380d8a-kube-api-access-vtvbf" (OuterVolumeSpecName: "kube-api-access-vtvbf") pod "8953f639-e468-4f85-8093-6c29fa380d8a" (UID: "8953f639-e468-4f85-8093-6c29fa380d8a"). InnerVolumeSpecName "kube-api-access-vtvbf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.670584 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-936c-account-create-rt7rb"] Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.678093 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-936c-account-create-rt7rb"] Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.687397 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2868ea84-fef0-4728-9a62-e2ba97a9aae1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.687420 4735 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5516bd19-016a-4d52-84c9-b3801e1841ca-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.687429 4735 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5516bd19-016a-4d52-84c9-b3801e1841ca-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.687438 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vtvbf\" (UniqueName: \"kubernetes.io/projected/8953f639-e468-4f85-8093-6c29fa380d8a-kube-api-access-vtvbf\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.687446 4735 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5516bd19-016a-4d52-84c9-b3801e1841ca-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.687455 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jthzs\" (UniqueName: \"kubernetes.io/projected/5516bd19-016a-4d52-84c9-b3801e1841ca-kube-api-access-jthzs\") on node \"crc\" DevicePath 
\"\"" Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.691115 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.755317 4735 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.786406 4735 scope.go:117] "RemoveContainer" containerID="d30b6d8ff48994cf5b030b6578cb0bfd0a226d0320a05c48eaa37fda8c089985" Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.791809 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84fa39bb-f66a-49ed-960c-7598ec87854a-combined-ca-bundle\") pod \"84fa39bb-f66a-49ed-960c-7598ec87854a\" (UID: \"84fa39bb-f66a-49ed-960c-7598ec87854a\") " Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.792027 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hb57t\" (UniqueName: \"kubernetes.io/projected/84fa39bb-f66a-49ed-960c-7598ec87854a-kube-api-access-hb57t\") pod \"84fa39bb-f66a-49ed-960c-7598ec87854a\" (UID: \"84fa39bb-f66a-49ed-960c-7598ec87854a\") " Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.792178 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84fa39bb-f66a-49ed-960c-7598ec87854a-config-data\") pod \"84fa39bb-f66a-49ed-960c-7598ec87854a\" (UID: \"84fa39bb-f66a-49ed-960c-7598ec87854a\") " Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.792808 4735 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.808078 4735 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.808898 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3b1303e4-c7b5-456a-aba6-c4674b8bcfed" containerName="sg-core" containerID="cri-o://8f014700eb4b62d55701fa790058c2874eeee602afa6352080d545ba869c85bf" gracePeriod=30 Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.809008 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3b1303e4-c7b5-456a-aba6-c4674b8bcfed" containerName="proxy-httpd" containerID="cri-o://ca8e40a239770c1ac13452e47ffb1eb94e1acfb51391b3e3bf143cf183113523" gracePeriod=30 Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.809057 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3b1303e4-c7b5-456a-aba6-c4674b8bcfed" containerName="ceilometer-notification-agent" containerID="cri-o://b645519406061dad8c7f4d39599dd4ab3d8adc1c94d6ca4f85885caf6312b12a" gracePeriod=30 Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.809068 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3b1303e4-c7b5-456a-aba6-c4674b8bcfed" containerName="ceilometer-central-agent" containerID="cri-o://0817e0e9a63d415352253a5caafcd20e3b3968eec1f9e4e4f0c7f66ba7ae9ff8" gracePeriod=30 Oct 08 13:19:23 crc kubenswrapper[4735]: I1008 13:19:23.837969 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84fa39bb-f66a-49ed-960c-7598ec87854a-kube-api-access-hb57t" (OuterVolumeSpecName: "kube-api-access-hb57t") pod "84fa39bb-f66a-49ed-960c-7598ec87854a" (UID: "84fa39bb-f66a-49ed-960c-7598ec87854a"). InnerVolumeSpecName "kube-api-access-hb57t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:23.901899 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:23.902199 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="b8e2103e-dd2c-4e18-b66b-0555a1c8f43a" containerName="kube-state-metrics" containerID="cri-o://ff75cfda3a642f58adcba7bc55060219ea312fa63294148521fb3e2d8da99ab9" gracePeriod=30 Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:23.918199 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hb57t\" (UniqueName: \"kubernetes.io/projected/84fa39bb-f66a-49ed-960c-7598ec87854a-kube-api-access-hb57t\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:23.973759 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:23.974300 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="f6927451-d206-4948-a43e-5b4ee59dddcc" containerName="memcached" containerID="cri-o://9fd6286f631f71ba317c16f921eb7e0bc5df5010bec0980750f899b1396c8b6e" gracePeriod=30 Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.071748 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-8ht8g"] Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.093868 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-6cx9z"] Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.102178 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.174:8776/healthcheck\": read tcp 
10.217.0.2:33402->10.217.0.174:8776: read: connection reset by peer" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.109435 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-8ht8g"] Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.120439 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-6cx9z"] Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.130820 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-77f649d57f-mlpp4"] Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.131007 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-77f649d57f-mlpp4" podUID="477f295c-5e96-4f47-8caa-f7d36b074cdd" containerName="keystone-api" containerID="cri-o://682901b4afd22fdea54fc528f8f21909ffe8ed3d9dfc12a8092cefe00176a51f" gracePeriod=30 Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.151126 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.182894 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-4rmr9"] Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.206761 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-4rmr9"] Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.225343 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-4fa4-account-create-k5c9z"] Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.226100 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2868ea84-fef0-4728-9a62-e2ba97a9aae1-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "2868ea84-fef0-4728-9a62-e2ba97a9aae1" (UID: "2868ea84-fef0-4728-9a62-e2ba97a9aae1"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.226296 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbicanc85c-account-delete-56lmd" event={"ID":"0172640d-d560-4ef0-a7e4-b649ccfa92e4","Type":"ContainerDied","Data":"3417eaa999269a1fec9d77454c9b363cdcab58e3009ef984c032ae02c37c88d0"} Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.226342 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3417eaa999269a1fec9d77454c9b363cdcab58e3009ef984c032ae02c37c88d0" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.234923 4735 generic.go:334] "Generic (PLEG): container finished" podID="53890bab-61cc-4f19-908a-33c085da8d33" containerID="7d7ad1478ae29b27029b1887e1c33fab874aabb21d8774256a3139c949c54c49" exitCode=0 Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.235013 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"53890bab-61cc-4f19-908a-33c085da8d33","Type":"ContainerDied","Data":"7d7ad1478ae29b27029b1887e1c33fab874aabb21d8774256a3139c949c54c49"} Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.235044 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"53890bab-61cc-4f19-908a-33c085da8d33","Type":"ContainerDied","Data":"45587d9115eeb1cbcf8295e2c9ae350f24824252a2ca6332e888f3b57d8471af"} Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.235058 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45587d9115eeb1cbcf8295e2c9ae350f24824252a2ca6332e888f3b57d8471af" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.253644 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ada58119-3f96-4962-b214-fb2662c3d935-config-data" (OuterVolumeSpecName: "config-data") pod "ada58119-3f96-4962-b214-fb2662c3d935" (UID: 
"ada58119-3f96-4962-b214-fb2662c3d935"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.256860 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement364a-account-delete-dg8rs" podUID="6fe2bc1d-7e65-45ba-9d42-cae899859258" containerName="mariadb-account-delete" containerID="cri-o://4a51c20c2a4142a285ce95ca78c731663396a540ba1f9a28337f2f9d05073c41" gracePeriod=30 Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.274224 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"8953f639-e468-4f85-8093-6c29fa380d8a","Type":"ContainerDied","Data":"93dba9ad72f36fc0015ef57446883bf6d77cfc2be679a3477a88c6d950bfd50f"} Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.274345 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.279667 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-4fa4-account-create-k5c9z"] Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.280934 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31465d06-bf56-4173-95c1-3c13d776b1f9-config-data" (OuterVolumeSpecName: "config-data") pod "31465d06-bf56-4173-95c1-3c13d776b1f9" (UID: "31465d06-bf56-4173-95c1-3c13d776b1f9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.286811 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder4fb6-account-delete-dvvxd" event={"ID":"4189bc5a-2f23-4b53-9df4-e97a65a7d8eb","Type":"ContainerDied","Data":"8c6fca696eefb52e052a4ad3785a57e0494a1c06e43a5610d33e07d0c65557c9"} Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.286852 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c6fca696eefb52e052a4ad3785a57e0494a1c06e43a5610d33e07d0c65557c9" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.291589 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8953f639-e468-4f85-8093-6c29fa380d8a-config-data" (OuterVolumeSpecName: "config-data") pod "8953f639-e468-4f85-8093-6c29fa380d8a" (UID: "8953f639-e468-4f85-8093-6c29fa380d8a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.293487 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement364a-account-delete-dg8rs" podStartSLOduration=6.293452862 podStartE2EDuration="6.293452862s" podCreationTimestamp="2025-10-08 13:19:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 13:19:24.274955112 +0000 UTC m=+1414.083259666" watchObservedRunningTime="2025-10-08 13:19:24.293452862 +0000 UTC m=+1414.101757406" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.312117 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance55cc-account-delete-cnr66" event={"ID":"f952e88e-fe68-4770-874c-fdee88d43aee","Type":"ContainerStarted","Data":"ab9591fb677116af7fae955e0ecd4bc1b2ddd5209f1c6da40664145a2dcac0e5"} Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.312355 4735 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance55cc-account-delete-cnr66" podUID="f952e88e-fe68-4770-874c-fdee88d43aee" containerName="mariadb-account-delete" containerID="cri-o://ab9591fb677116af7fae955e0ecd4bc1b2ddd5209f1c6da40664145a2dcac0e5" gracePeriod=30 Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.312718 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31465d06-bf56-4173-95c1-3c13d776b1f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "31465d06-bf56-4173-95c1-3c13d776b1f9" (UID: "31465d06-bf56-4173-95c1-3c13d776b1f9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.325970 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell066dd-account-delete-wnksz" event={"ID":"9d6e2da8-2dc0-4642-b216-218ed61cc813","Type":"ContainerStarted","Data":"d70b201742b9a825e3e7cf33e617c61556b91f285f8bf4e0c3a1ee1d89743db4"} Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.328017 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31465d06-bf56-4173-95c1-3c13d776b1f9-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.328042 4735 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2868ea84-fef0-4728-9a62-e2ba97a9aae1-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.328056 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31465d06-bf56-4173-95c1-3c13d776b1f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.328067 4735 reconciler_common.go:293] "Volume detached for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/8953f639-e468-4f85-8093-6c29fa380d8a-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.328077 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ada58119-3f96-4962-b214-fb2662c3d935-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.328863 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"84fa39bb-f66a-49ed-960c-7598ec87854a","Type":"ContainerDied","Data":"65961e73fd2089c78d050c0a8560cbc4a4a014886313015a340f2a2690973e3b"} Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.328982 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.342734 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance55cc-account-delete-cnr66" podStartSLOduration=6.342710951 podStartE2EDuration="6.342710951s" podCreationTimestamp="2025-10-08 13:19:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 13:19:24.330992307 +0000 UTC m=+1414.139296861" watchObservedRunningTime="2025-10-08 13:19:24.342710951 +0000 UTC m=+1414.151015495" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.344429 4735 generic.go:334] "Generic (PLEG): container finished" podID="3b1303e4-c7b5-456a-aba6-c4674b8bcfed" containerID="8f014700eb4b62d55701fa790058c2874eeee602afa6352080d545ba869c85bf" exitCode=2 Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.344557 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"3b1303e4-c7b5-456a-aba6-c4674b8bcfed","Type":"ContainerDied","Data":"8f014700eb4b62d55701fa790058c2874eeee602afa6352080d545ba869c85bf"} Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.367346 4735 generic.go:334] "Generic (PLEG): container finished" podID="b8e2103e-dd2c-4e18-b66b-0555a1c8f43a" containerID="ff75cfda3a642f58adcba7bc55060219ea312fa63294148521fb3e2d8da99ab9" exitCode=2 Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.367431 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b8e2103e-dd2c-4e18-b66b-0555a1c8f43a","Type":"ContainerDied","Data":"ff75cfda3a642f58adcba7bc55060219ea312fa63294148521fb3e2d8da99ab9"} Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.389121 4735 generic.go:334] "Generic (PLEG): container finished" podID="70450186-9a74-4ba9-92f9-3f3a9626d0a0" containerID="74e1faa845324974182d87cae1bcb7ea97976a1ed4553d25c3e02f8144d91a7f" exitCode=0 Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.389228 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8555f7d884-2km9p" event={"ID":"70450186-9a74-4ba9-92f9-3f3a9626d0a0","Type":"ContainerDied","Data":"74e1faa845324974182d87cae1bcb7ea97976a1ed4553d25c3e02f8144d91a7f"} Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.392659 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="1db01523-cd76-4a9d-8063-e024dd919879" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.200:8775/\": read tcp 10.217.0.2:44100->10.217.0.200:8775: read: connection reset by peer" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.392664 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="1db01523-cd76-4a9d-8063-e024dd919879" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.200:8775/\": read tcp 10.217.0.2:44112->10.217.0.200:8775: 
read: connection reset by peer" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.399320 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5516bd19-016a-4d52-84c9-b3801e1841ca-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "5516bd19-016a-4d52-84c9-b3801e1841ca" (UID: "5516bd19-016a-4d52-84c9-b3801e1841ca"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.399592 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-675db45d77-s6pvt" event={"ID":"5516bd19-016a-4d52-84c9-b3801e1841ca","Type":"ContainerDied","Data":"17a3af001ccddc82de589584693b2639a7495ab66189f505abc7e05707f5a7d2"} Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.399725 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-675db45d77-s6pvt" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.421718 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84fa39bb-f66a-49ed-960c-7598ec87854a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "84fa39bb-f66a-49ed-960c-7598ec87854a" (UID: "84fa39bb-f66a-49ed-960c-7598ec87854a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.428250 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84fa39bb-f66a-49ed-960c-7598ec87854a-config-data" (OuterVolumeSpecName: "config-data") pod "84fa39bb-f66a-49ed-960c-7598ec87854a" (UID: "84fa39bb-f66a-49ed-960c-7598ec87854a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.428546 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5516bd19-016a-4d52-84c9-b3801e1841ca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5516bd19-016a-4d52-84c9-b3801e1841ca" (UID: "5516bd19-016a-4d52-84c9-b3801e1841ca"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.429151 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5516bd19-016a-4d52-84c9-b3801e1841ca-combined-ca-bundle\") pod \"5516bd19-016a-4d52-84c9-b3801e1841ca\" (UID: \"5516bd19-016a-4d52-84c9-b3801e1841ca\") " Oct 08 13:19:25 crc kubenswrapper[4735]: W1008 13:19:24.429654 4735 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/5516bd19-016a-4d52-84c9-b3801e1841ca/volumes/kubernetes.io~secret/combined-ca-bundle Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.429672 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84fa39bb-f66a-49ed-960c-7598ec87854a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.429679 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5516bd19-016a-4d52-84c9-b3801e1841ca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5516bd19-016a-4d52-84c9-b3801e1841ca" (UID: "5516bd19-016a-4d52-84c9-b3801e1841ca"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.429689 4735 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5516bd19-016a-4d52-84c9-b3801e1841ca-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.429698 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84fa39bb-f66a-49ed-960c-7598ec87854a-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.432252 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"31465d06-bf56-4173-95c1-3c13d776b1f9","Type":"ContainerDied","Data":"14e85f27e0b1cd9a858c207c3b1d54ff367ef60813c5473bf5978b40bd34aaf0"} Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.432342 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.434419 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8953f639-e468-4f85-8093-6c29fa380d8a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8953f639-e468-4f85-8093-6c29fa380d8a" (UID: "8953f639-e468-4f85-8093-6c29fa380d8a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.437623 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5516bd19-016a-4d52-84c9-b3801e1841ca-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "5516bd19-016a-4d52-84c9-b3801e1841ca" (UID: "5516bd19-016a-4d52-84c9-b3801e1841ca"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.441864 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"2868ea84-fef0-4728-9a62-e2ba97a9aae1","Type":"ContainerDied","Data":"f8cf2dd4fa24e9e5269420c461c2aa5154e09910f6f203aae725440d3928dc0e"} Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.441979 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.446366 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5516bd19-016a-4d52-84c9-b3801e1841ca-config-data" (OuterVolumeSpecName: "config-data") pod "5516bd19-016a-4d52-84c9-b3801e1841ca" (UID: "5516bd19-016a-4d52-84c9-b3801e1841ca"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.447698 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6cc55c9557-96kwz" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.478400 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31465d06-bf56-4173-95c1-3c13d776b1f9-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "31465d06-bf56-4173-95c1-3c13d776b1f9" (UID: "31465d06-bf56-4173-95c1-3c13d776b1f9"). InnerVolumeSpecName "vencrypt-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.511530 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="f2428d97-aa22-413f-ab58-5b077fdde95c" containerName="galera" containerID="cri-o://d79545f041b4f7e6f0e27be8cc9df0ec5c12046d7e3a2b69869c06d06a4d0a52" gracePeriod=30 Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.511663 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31465d06-bf56-4173-95c1-3c13d776b1f9-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "31465d06-bf56-4173-95c1-3c13d776b1f9" (UID: "31465d06-bf56-4173-95c1-3c13d776b1f9"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.532498 4735 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/31465d06-bf56-4173-95c1-3c13d776b1f9-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.532531 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5516bd19-016a-4d52-84c9-b3801e1841ca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.532540 4735 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/31465d06-bf56-4173-95c1-3c13d776b1f9-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.532548 4735 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5516bd19-016a-4d52-84c9-b3801e1841ca-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:25 crc 
kubenswrapper[4735]: I1008 13:19:24.532560 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5516bd19-016a-4d52-84c9-b3801e1841ca-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.532568 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8953f639-e468-4f85-8093-6c29fa380d8a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.789060 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6dd5468f46-ccf7d" podUID="ca78485a-0626-420e-aa59-a9a4680df83c" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.160:9311/healthcheck\": read tcp 10.217.0.2:57086->10.217.0.160:9311: read: connection reset by peer" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.789129 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6dd5468f46-ccf7d" podUID="ca78485a-0626-420e-aa59-a9a4680df83c" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.160:9311/healthcheck\": read tcp 10.217.0.2:57090->10.217.0.160:9311: read: connection reset by peer" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.880073 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.880121 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.908829 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08919c9d-5826-4fdd-b4ea-e3d5ef927cc2" path="/var/lib/kubelet/pods/08919c9d-5826-4fdd-b4ea-e3d5ef927cc2/volumes" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.909364 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d8f3c97-c358-4c3f-93fb-063e28ea1ccd" path="/var/lib/kubelet/pods/0d8f3c97-c358-4c3f-93fb-063e28ea1ccd/volumes" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.910006 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f333daa-559f-421a-b0a9-fa1de98b72cf" path="/var/lib/kubelet/pods/1f333daa-559f-421a-b0a9-fa1de98b72cf/volumes" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.911246 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52f1329e-e0dc-4295-8dfc-92fae02065f2" path="/var/lib/kubelet/pods/52f1329e-e0dc-4295-8dfc-92fae02065f2/volumes" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.911860 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="579bb7b5-6f36-4ea3-a772-eb749195cf8b" path="/var/lib/kubelet/pods/579bb7b5-6f36-4ea3-a772-eb749195cf8b/volumes" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.912413 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73b19a4f-4794-49fc-bfd9-727618f11e7c" path="/var/lib/kubelet/pods/73b19a4f-4794-49fc-bfd9-727618f11e7c/volumes" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.913634 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4" path="/var/lib/kubelet/pods/9b3f41c8-2dbe-4c57-8068-3db0ae0e44c4/volumes" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.914248 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="b264ac2b-df16-4ddb-94ee-9d346b374ec7" path="/var/lib/kubelet/pods/b264ac2b-df16-4ddb-94ee-9d346b374ec7/volumes" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:24.914753 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3a15a0b-b0a1-4781-be73-477726f4007c" path="/var/lib/kubelet/pods/e3a15a0b-b0a1-4781-be73-477726f4007c/volumes" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.335827 4735 scope.go:117] "RemoveContainer" containerID="fadbbd45ef112a85da8685a542a9f033c1f9da61e0a242aea8894a27eef0aecf" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.340891 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.410815 4735 scope.go:117] "RemoveContainer" containerID="eb3a0242ded2d9f65f661e234e3d416b8c948c5e34c221a07c2a3c7ce662b2fb" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.424358 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder4fb6-account-delete-dvvxd" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.435196 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbicanc85c-account-delete-56lmd" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.448042 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53890bab-61cc-4f19-908a-33c085da8d33-config-data\") pod \"53890bab-61cc-4f19-908a-33c085da8d33\" (UID: \"53890bab-61cc-4f19-908a-33c085da8d33\") " Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.448085 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6csk2\" (UniqueName: \"kubernetes.io/projected/53890bab-61cc-4f19-908a-33c085da8d33-kube-api-access-6csk2\") pod \"53890bab-61cc-4f19-908a-33c085da8d33\" (UID: \"53890bab-61cc-4f19-908a-33c085da8d33\") " Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.448197 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53890bab-61cc-4f19-908a-33c085da8d33-combined-ca-bundle\") pod \"53890bab-61cc-4f19-908a-33c085da8d33\" (UID: \"53890bab-61cc-4f19-908a-33c085da8d33\") " Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.449233 4735 scope.go:117] "RemoveContainer" containerID="eb379c44e7f01a76229af5fa811391aa9b991614e3a1cc0d30c5cec76be16709" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.496625 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53890bab-61cc-4f19-908a-33c085da8d33-kube-api-access-6csk2" (OuterVolumeSpecName: "kube-api-access-6csk2") pod "53890bab-61cc-4f19-908a-33c085da8d33" (UID: "53890bab-61cc-4f19-908a-33c085da8d33"). InnerVolumeSpecName "kube-api-access-6csk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.503254 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-8555f7d884-2km9p" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.503254 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.513625 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.524517 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53890bab-61cc-4f19-908a-33c085da8d33-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "53890bab-61cc-4f19-908a-33c085da8d33" (UID: "53890bab-61cc-4f19-908a-33c085da8d33"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.534885 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.539904 4735 generic.go:334] "Generic (PLEG): container finished" podID="f952e88e-fe68-4770-874c-fdee88d43aee" containerID="ab9591fb677116af7fae955e0ecd4bc1b2ddd5209f1c6da40664145a2dcac0e5" exitCode=1 Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.540067 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance55cc-account-delete-cnr66" event={"ID":"f952e88e-fe68-4770-874c-fdee88d43aee","Type":"ContainerDied","Data":"ab9591fb677116af7fae955e0ecd4bc1b2ddd5209f1c6da40664145a2dcac0e5"} Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.550394 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mt2sg\" (UniqueName: \"kubernetes.io/projected/4189bc5a-2f23-4b53-9df4-e97a65a7d8eb-kube-api-access-mt2sg\") pod \"4189bc5a-2f23-4b53-9df4-e97a65a7d8eb\" (UID: \"4189bc5a-2f23-4b53-9df4-e97a65a7d8eb\") " Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.550502 4735 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8db7\" (UniqueName: \"kubernetes.io/projected/0172640d-d560-4ef0-a7e4-b649ccfa92e4-kube-api-access-l8db7\") pod \"0172640d-d560-4ef0-a7e4-b649ccfa92e4\" (UID: \"0172640d-d560-4ef0-a7e4-b649ccfa92e4\") " Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.552749 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53890bab-61cc-4f19-908a-33c085da8d33-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.552785 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6csk2\" (UniqueName: \"kubernetes.io/projected/53890bab-61cc-4f19-908a-33c085da8d33-kube-api-access-6csk2\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.557430 4735 generic.go:334] "Generic (PLEG): container finished" podID="3b1303e4-c7b5-456a-aba6-c4674b8bcfed" containerID="ca8e40a239770c1ac13452e47ffb1eb94e1acfb51391b3e3bf143cf183113523" exitCode=0 Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.557468 4735 generic.go:334] "Generic (PLEG): container finished" podID="3b1303e4-c7b5-456a-aba6-c4674b8bcfed" containerID="0817e0e9a63d415352253a5caafcd20e3b3968eec1f9e4e4f0c7f66ba7ae9ff8" exitCode=0 Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.557543 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3b1303e4-c7b5-456a-aba6-c4674b8bcfed","Type":"ContainerDied","Data":"ca8e40a239770c1ac13452e47ffb1eb94e1acfb51391b3e3bf143cf183113523"} Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.557575 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3b1303e4-c7b5-456a-aba6-c4674b8bcfed","Type":"ContainerDied","Data":"0817e0e9a63d415352253a5caafcd20e3b3968eec1f9e4e4f0c7f66ba7ae9ff8"} Oct 08 13:19:25 crc kubenswrapper[4735]: 
I1008 13:19:25.564363 4735 generic.go:334] "Generic (PLEG): container finished" podID="2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2" containerID="52bb4221516a6482adaee2563d79681bb7ba3599c014219c89382ac18144524b" exitCode=0 Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.564630 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2","Type":"ContainerDied","Data":"52bb4221516a6482adaee2563d79681bb7ba3599c014219c89382ac18144524b"} Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.573907 4735 generic.go:334] "Generic (PLEG): container finished" podID="f6927451-d206-4948-a43e-5b4ee59dddcc" containerID="9fd6286f631f71ba317c16f921eb7e0bc5df5010bec0980750f899b1396c8b6e" exitCode=0 Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.574000 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"f6927451-d206-4948-a43e-5b4ee59dddcc","Type":"ContainerDied","Data":"9fd6286f631f71ba317c16f921eb7e0bc5df5010bec0980750f899b1396c8b6e"} Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.575285 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0172640d-d560-4ef0-a7e4-b649ccfa92e4-kube-api-access-l8db7" (OuterVolumeSpecName: "kube-api-access-l8db7") pod "0172640d-d560-4ef0-a7e4-b649ccfa92e4" (UID: "0172640d-d560-4ef0-a7e4-b649ccfa92e4"). InnerVolumeSpecName "kube-api-access-l8db7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.577322 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.577498 4735 generic.go:334] "Generic (PLEG): container finished" podID="1db01523-cd76-4a9d-8063-e024dd919879" containerID="418d5ac0c79b3bbb212fb99f2df15049ca6cc912c6272d8bf5adfab9f972e7cf" exitCode=0 Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.577565 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1db01523-cd76-4a9d-8063-e024dd919879","Type":"ContainerDied","Data":"418d5ac0c79b3bbb212fb99f2df15049ca6cc912c6272d8bf5adfab9f972e7cf"} Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.584866 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53890bab-61cc-4f19-908a-33c085da8d33-config-data" (OuterVolumeSpecName: "config-data") pod "53890bab-61cc-4f19-908a-33c085da8d33" (UID: "53890bab-61cc-4f19-908a-33c085da8d33"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.584889 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4189bc5a-2f23-4b53-9df4-e97a65a7d8eb-kube-api-access-mt2sg" (OuterVolumeSpecName: "kube-api-access-mt2sg") pod "4189bc5a-2f23-4b53-9df4-e97a65a7d8eb" (UID: "4189bc5a-2f23-4b53-9df4-e97a65a7d8eb"). InnerVolumeSpecName "kube-api-access-mt2sg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.585309 4735 generic.go:334] "Generic (PLEG): container finished" podID="9d6e2da8-2dc0-4642-b216-218ed61cc813" containerID="d70b201742b9a825e3e7cf33e617c61556b91f285f8bf4e0c3a1ee1d89743db4" exitCode=0 Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.585396 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell066dd-account-delete-wnksz" event={"ID":"9d6e2da8-2dc0-4642-b216-218ed61cc813","Type":"ContainerDied","Data":"d70b201742b9a825e3e7cf33e617c61556b91f285f8bf4e0c3a1ee1d89743db4"} Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.589764 4735 generic.go:334] "Generic (PLEG): container finished" podID="c0cb7948-9a3d-40fa-b4e7-bb6b765af52f" containerID="7cbf4814a5d8742756a2f0076ccbdc83fc3186c93b8940d1bc61f10728f5e789" exitCode=0 Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.589852 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c0cb7948-9a3d-40fa-b4e7-bb6b765af52f","Type":"ContainerDied","Data":"7cbf4814a5d8742756a2f0076ccbdc83fc3186c93b8940d1bc61f10728f5e789"} Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.590578 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-6cc55c9557-96kwz"] Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.593141 4735 generic.go:334] "Generic (PLEG): container finished" podID="6fe2bc1d-7e65-45ba-9d42-cae899859258" containerID="4a51c20c2a4142a285ce95ca78c731663396a540ba1f9a28337f2f9d05073c41" exitCode=1 Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.593219 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement364a-account-delete-dg8rs" event={"ID":"6fe2bc1d-7e65-45ba-9d42-cae899859258","Type":"ContainerDied","Data":"4a51c20c2a4142a285ce95ca78c731663396a540ba1f9a28337f2f9d05073c41"} Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 
13:19:25.602802 4735 generic.go:334] "Generic (PLEG): container finished" podID="b8c473e5-0132-4ee7-a14e-ada0780d358c" containerID="89675ae013264bf40ddb13e38257d27a7ce994542ac6e288c014f3d00f9d0b13" exitCode=0 Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.602916 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-6cc55c9557-96kwz"] Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.602948 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b8c473e5-0132-4ee7-a14e-ada0780d358c","Type":"ContainerDied","Data":"89675ae013264bf40ddb13e38257d27a7ce994542ac6e288c014f3d00f9d0b13"} Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.607040 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.613383 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8555f7d884-2km9p" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.613696 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8555f7d884-2km9p" event={"ID":"70450186-9a74-4ba9-92f9-3f3a9626d0a0","Type":"ContainerDied","Data":"4f54928f9a5c55b90c172f1c3b77d119b793f451d107666f7ef41d0a9de1c915"} Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.615365 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.629049 4735 generic.go:334] "Generic (PLEG): container finished" podID="97384a50-9f20-4cbf-8b02-afffe5ddc791" containerID="d6ed228dfc5748efb601fafc99f157a758cb829f235af05f716ef363f89fcc14" exitCode=0 Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.629167 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"97384a50-9f20-4cbf-8b02-afffe5ddc791","Type":"ContainerDied","Data":"d6ed228dfc5748efb601fafc99f157a758cb829f235af05f716ef363f89fcc14"} Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.639369 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-675db45d77-s6pvt"] Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.643881 4735 generic.go:334] "Generic (PLEG): container finished" podID="ca78485a-0626-420e-aa59-a9a4680df83c" containerID="54ae09e0758dbced8f136de91ec860924f9d0a4a1aa8e35c4a62a56d25f7b734" exitCode=0 Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.643995 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbicanc85c-account-delete-56lmd" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.645712 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.645768 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6dd5468f46-ccf7d" event={"ID":"ca78485a-0626-420e-aa59-a9a4680df83c","Type":"ContainerDied","Data":"54ae09e0758dbced8f136de91ec860924f9d0a4a1aa8e35c4a62a56d25f7b734"} Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.645903 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder4fb6-account-delete-dvvxd" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.655003 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70450186-9a74-4ba9-92f9-3f3a9626d0a0-scripts\") pod \"70450186-9a74-4ba9-92f9-3f3a9626d0a0\" (UID: \"70450186-9a74-4ba9-92f9-3f3a9626d0a0\") " Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.655156 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/70450186-9a74-4ba9-92f9-3f3a9626d0a0-internal-tls-certs\") pod \"70450186-9a74-4ba9-92f9-3f3a9626d0a0\" (UID: \"70450186-9a74-4ba9-92f9-3f3a9626d0a0\") " Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.655183 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70450186-9a74-4ba9-92f9-3f3a9626d0a0-config-data\") pod \"70450186-9a74-4ba9-92f9-3f3a9626d0a0\" (UID: \"70450186-9a74-4ba9-92f9-3f3a9626d0a0\") " Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.655224 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70450186-9a74-4ba9-92f9-3f3a9626d0a0-combined-ca-bundle\") pod \"70450186-9a74-4ba9-92f9-3f3a9626d0a0\" (UID: \"70450186-9a74-4ba9-92f9-3f3a9626d0a0\") " Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.655252 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdtgs\" (UniqueName: \"kubernetes.io/projected/70450186-9a74-4ba9-92f9-3f3a9626d0a0-kube-api-access-gdtgs\") pod \"70450186-9a74-4ba9-92f9-3f3a9626d0a0\" (UID: \"70450186-9a74-4ba9-92f9-3f3a9626d0a0\") " Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.655289 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/70450186-9a74-4ba9-92f9-3f3a9626d0a0-public-tls-certs\") pod \"70450186-9a74-4ba9-92f9-3f3a9626d0a0\" (UID: \"70450186-9a74-4ba9-92f9-3f3a9626d0a0\") " Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.655374 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70450186-9a74-4ba9-92f9-3f3a9626d0a0-logs\") pod \"70450186-9a74-4ba9-92f9-3f3a9626d0a0\" (UID: \"70450186-9a74-4ba9-92f9-3f3a9626d0a0\") " Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.655787 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53890bab-61cc-4f19-908a-33c085da8d33-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.655805 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mt2sg\" (UniqueName: \"kubernetes.io/projected/4189bc5a-2f23-4b53-9df4-e97a65a7d8eb-kube-api-access-mt2sg\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.655818 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8db7\" (UniqueName: \"kubernetes.io/projected/0172640d-d560-4ef0-a7e4-b649ccfa92e4-kube-api-access-l8db7\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.659688 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70450186-9a74-4ba9-92f9-3f3a9626d0a0-logs" (OuterVolumeSpecName: "logs") pod "70450186-9a74-4ba9-92f9-3f3a9626d0a0" (UID: "70450186-9a74-4ba9-92f9-3f3a9626d0a0"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.670860 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70450186-9a74-4ba9-92f9-3f3a9626d0a0-scripts" (OuterVolumeSpecName: "scripts") pod "70450186-9a74-4ba9-92f9-3f3a9626d0a0" (UID: "70450186-9a74-4ba9-92f9-3f3a9626d0a0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.671461 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70450186-9a74-4ba9-92f9-3f3a9626d0a0-kube-api-access-gdtgs" (OuterVolumeSpecName: "kube-api-access-gdtgs") pod "70450186-9a74-4ba9-92f9-3f3a9626d0a0" (UID: "70450186-9a74-4ba9-92f9-3f3a9626d0a0"). InnerVolumeSpecName "kube-api-access-gdtgs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.672687 4735 scope.go:117] "RemoveContainer" containerID="c41fba7730a1ea0efa3f14c62882c27ae55360beb38b9829009593be86e45cb8" Oct 08 13:19:25 crc kubenswrapper[4735]: E1008 13:19:25.700197 4735 kubelet_node_status.go:756] "Failed to set some node status fields" err="failed to validate nodeIP: route ip+net: no such network interface" node="crc" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.755702 4735 scope.go:117] "RemoveContainer" containerID="e6bc8e996460d65d28494717119455e8c9a9619cda09617d493e57e6fb07bac3" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.757018 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gdtgs\" (UniqueName: \"kubernetes.io/projected/70450186-9a74-4ba9-92f9-3f3a9626d0a0-kube-api-access-gdtgs\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.757040 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/70450186-9a74-4ba9-92f9-3f3a9626d0a0-logs\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.757050 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70450186-9a74-4ba9-92f9-3f3a9626d0a0-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.778082 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-675db45d77-s6pvt"] Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.801696 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.813310 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.830416 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder4fb6-account-delete-dvvxd"] Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.842043 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder4fb6-account-delete-dvvxd"] Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.865999 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.877503 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.897197 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70450186-9a74-4ba9-92f9-3f3a9626d0a0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "70450186-9a74-4ba9-92f9-3f3a9626d0a0" (UID: "70450186-9a74-4ba9-92f9-3f3a9626d0a0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.929965 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70450186-9a74-4ba9-92f9-3f3a9626d0a0-config-data" (OuterVolumeSpecName: "config-data") pod "70450186-9a74-4ba9-92f9-3f3a9626d0a0" (UID: "70450186-9a74-4ba9-92f9-3f3a9626d0a0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.952225 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-scqw7" podUID="983bfc3d-43e8-4190-854d-c3e9497754ae" containerName="registry-server" probeResult="failure" output=< Oct 08 13:19:25 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Oct 08 13:19:25 crc kubenswrapper[4735]: > Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.961742 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70450186-9a74-4ba9-92f9-3f3a9626d0a0-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.961770 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70450186-9a74-4ba9-92f9-3f3a9626d0a0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:25 crc kubenswrapper[4735]: I1008 13:19:25.975206 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70450186-9a74-4ba9-92f9-3f3a9626d0a0-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "70450186-9a74-4ba9-92f9-3f3a9626d0a0" (UID: "70450186-9a74-4ba9-92f9-3f3a9626d0a0"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.005054 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70450186-9a74-4ba9-92f9-3f3a9626d0a0-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "70450186-9a74-4ba9-92f9-3f3a9626d0a0" (UID: "70450186-9a74-4ba9-92f9-3f3a9626d0a0"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.063848 4735 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/70450186-9a74-4ba9-92f9-3f3a9626d0a0-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.063897 4735 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/70450186-9a74-4ba9-92f9-3f3a9626d0a0-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.112742 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.116534 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbicanc85c-account-delete-56lmd"] Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.131902 4735 scope.go:117] "RemoveContainer" containerID="c0fd4ca14cbd2147c2ea9e79919ae0922f959bc15036da6d763f5b4768a6c66e" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.132242 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.132855 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbicanc85c-account-delete-56lmd"] Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.148547 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.152548 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.163921 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.164497 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8e2103e-dd2c-4e18-b66b-0555a1c8f43a-combined-ca-bundle\") pod \"b8e2103e-dd2c-4e18-b66b-0555a1c8f43a\" (UID: \"b8e2103e-dd2c-4e18-b66b-0555a1c8f43a\") " Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.164556 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2-logs\") pod \"2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2\" (UID: \"2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2\") " Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.164600 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8e2103e-dd2c-4e18-b66b-0555a1c8f43a-kube-state-metrics-tls-certs\") pod \"b8e2103e-dd2c-4e18-b66b-0555a1c8f43a\" (UID: \"b8e2103e-dd2c-4e18-b66b-0555a1c8f43a\") " Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.164667 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-dmgjq\" (UniqueName: \"kubernetes.io/projected/b8e2103e-dd2c-4e18-b66b-0555a1c8f43a-kube-api-access-dmgjq\") pod \"b8e2103e-dd2c-4e18-b66b-0555a1c8f43a\" (UID: \"b8e2103e-dd2c-4e18-b66b-0555a1c8f43a\") " Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.164743 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2-config-data-custom\") pod \"2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2\" (UID: \"2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2\") " Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.164778 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2-etc-machine-id\") pod \"2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2\" (UID: \"2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2\") " Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.164859 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2-internal-tls-certs\") pod \"2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2\" (UID: \"2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2\") " Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.164904 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2-scripts\") pod \"2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2\" (UID: \"2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2\") " Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.164932 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2-public-tls-certs\") pod \"2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2\" (UID: \"2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2\") " Oct 08 
13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.164960 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2-combined-ca-bundle\") pod \"2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2\" (UID: \"2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2\") " Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.165012 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2-config-data\") pod \"2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2\" (UID: \"2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2\") " Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.165043 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5qjl\" (UniqueName: \"kubernetes.io/projected/2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2-kube-api-access-m5qjl\") pod \"2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2\" (UID: \"2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2\") " Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.165070 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/b8e2103e-dd2c-4e18-b66b-0555a1c8f43a-kube-state-metrics-tls-config\") pod \"b8e2103e-dd2c-4e18-b66b-0555a1c8f43a\" (UID: \"b8e2103e-dd2c-4e18-b66b-0555a1c8f43a\") " Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.167876 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2" (UID: "2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.177961 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron936c-account-delete-hq47l" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.180723 4735 scope.go:117] "RemoveContainer" containerID="7a5a1a14e53c9fb322ae90dae5d7e280ecbb3b21e0fceb0275319ed5fb89b446" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.180894 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2-scripts" (OuterVolumeSpecName: "scripts") pod "2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2" (UID: "2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.182684 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2-kube-api-access-m5qjl" (OuterVolumeSpecName: "kube-api-access-m5qjl") pod "2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2" (UID: "2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2"). InnerVolumeSpecName "kube-api-access-m5qjl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.184042 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8e2103e-dd2c-4e18-b66b-0555a1c8f43a-kube-api-access-dmgjq" (OuterVolumeSpecName: "kube-api-access-dmgjq") pod "b8e2103e-dd2c-4e18-b66b-0555a1c8f43a" (UID: "b8e2103e-dd2c-4e18-b66b-0555a1c8f43a"). InnerVolumeSpecName "kube-api-access-dmgjq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.187411 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2-logs" (OuterVolumeSpecName: "logs") pod "2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2" (UID: "2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.207597 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2" (UID: "2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.230442 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8e2103e-dd2c-4e18-b66b-0555a1c8f43a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b8e2103e-dd2c-4e18-b66b-0555a1c8f43a" (UID: "b8e2103e-dd2c-4e18-b66b-0555a1c8f43a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.236749 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6dd5468f46-ccf7d" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.238984 4735 scope.go:117] "RemoveContainer" containerID="74e1faa845324974182d87cae1bcb7ea97976a1ed4553d25c3e02f8144d91a7f" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.248681 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.258685 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement364a-account-delete-dg8rs" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.266277 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1db01523-cd76-4a9d-8063-e024dd919879-combined-ca-bundle\") pod \"1db01523-cd76-4a9d-8063-e024dd919879\" (UID: \"1db01523-cd76-4a9d-8063-e024dd919879\") " Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.266341 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1db01523-cd76-4a9d-8063-e024dd919879-logs\") pod \"1db01523-cd76-4a9d-8063-e024dd919879\" (UID: \"1db01523-cd76-4a9d-8063-e024dd919879\") " Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.266369 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ca78485a-0626-420e-aa59-a9a4680df83c-config-data-custom\") pod \"ca78485a-0626-420e-aa59-a9a4680df83c\" (UID: \"ca78485a-0626-420e-aa59-a9a4680df83c\") " Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.266385 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca78485a-0626-420e-aa59-a9a4680df83c-public-tls-certs\") pod \"ca78485a-0626-420e-aa59-a9a4680df83c\" (UID: \"ca78485a-0626-420e-aa59-a9a4680df83c\") " Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.266417 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lf82\" (UniqueName: \"kubernetes.io/projected/f92cd291-9315-4b44-b318-6f029f253de6-kube-api-access-2lf82\") pod \"f92cd291-9315-4b44-b318-6f029f253de6\" (UID: 
\"f92cd291-9315-4b44-b318-6f029f253de6\") " Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.268097 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1db01523-cd76-4a9d-8063-e024dd919879-logs" (OuterVolumeSpecName: "logs") pod "1db01523-cd76-4a9d-8063-e024dd919879" (UID: "1db01523-cd76-4a9d-8063-e024dd919879"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.279705 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca78485a-0626-420e-aa59-a9a4680df83c-internal-tls-certs\") pod \"ca78485a-0626-420e-aa59-a9a4680df83c\" (UID: \"ca78485a-0626-420e-aa59-a9a4680df83c\") " Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.279775 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/97384a50-9f20-4cbf-8b02-afffe5ddc791-internal-tls-certs\") pod \"97384a50-9f20-4cbf-8b02-afffe5ddc791\" (UID: \"97384a50-9f20-4cbf-8b02-afffe5ddc791\") " Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.279819 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1db01523-cd76-4a9d-8063-e024dd919879-nova-metadata-tls-certs\") pod \"1db01523-cd76-4a9d-8063-e024dd919879\" (UID: \"1db01523-cd76-4a9d-8063-e024dd919879\") " Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.279886 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0cb7948-9a3d-40fa-b4e7-bb6b765af52f-combined-ca-bundle\") pod \"c0cb7948-9a3d-40fa-b4e7-bb6b765af52f\" (UID: \"c0cb7948-9a3d-40fa-b4e7-bb6b765af52f\") " Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.279946 4735 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca78485a-0626-420e-aa59-a9a4680df83c-config-data\") pod \"ca78485a-0626-420e-aa59-a9a4680df83c\" (UID: \"ca78485a-0626-420e-aa59-a9a4680df83c\") " Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.280019 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbkv6\" (UniqueName: \"kubernetes.io/projected/c0cb7948-9a3d-40fa-b4e7-bb6b765af52f-kube-api-access-tbkv6\") pod \"c0cb7948-9a3d-40fa-b4e7-bb6b765af52f\" (UID: \"c0cb7948-9a3d-40fa-b4e7-bb6b765af52f\") " Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.280045 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0cb7948-9a3d-40fa-b4e7-bb6b765af52f-config-data\") pod \"c0cb7948-9a3d-40fa-b4e7-bb6b765af52f\" (UID: \"c0cb7948-9a3d-40fa-b4e7-bb6b765af52f\") " Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.280100 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7jr6\" (UniqueName: \"kubernetes.io/projected/ca78485a-0626-420e-aa59-a9a4680df83c-kube-api-access-k7jr6\") pod \"ca78485a-0626-420e-aa59-a9a4680df83c\" (UID: \"ca78485a-0626-420e-aa59-a9a4680df83c\") " Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.280133 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0cb7948-9a3d-40fa-b4e7-bb6b765af52f-public-tls-certs\") pod \"c0cb7948-9a3d-40fa-b4e7-bb6b765af52f\" (UID: \"c0cb7948-9a3d-40fa-b4e7-bb6b765af52f\") " Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.280210 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97384a50-9f20-4cbf-8b02-afffe5ddc791-logs\") pod \"97384a50-9f20-4cbf-8b02-afffe5ddc791\" (UID: 
\"97384a50-9f20-4cbf-8b02-afffe5ddc791\") " Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.280249 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5k48\" (UniqueName: \"kubernetes.io/projected/1db01523-cd76-4a9d-8063-e024dd919879-kube-api-access-g5k48\") pod \"1db01523-cd76-4a9d-8063-e024dd919879\" (UID: \"1db01523-cd76-4a9d-8063-e024dd919879\") " Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.280296 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97384a50-9f20-4cbf-8b02-afffe5ddc791-combined-ca-bundle\") pod \"97384a50-9f20-4cbf-8b02-afffe5ddc791\" (UID: \"97384a50-9f20-4cbf-8b02-afffe5ddc791\") " Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.280320 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82j5p\" (UniqueName: \"kubernetes.io/projected/97384a50-9f20-4cbf-8b02-afffe5ddc791-kube-api-access-82j5p\") pod \"97384a50-9f20-4cbf-8b02-afffe5ddc791\" (UID: \"97384a50-9f20-4cbf-8b02-afffe5ddc791\") " Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.280507 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c0cb7948-9a3d-40fa-b4e7-bb6b765af52f-httpd-run\") pod \"c0cb7948-9a3d-40fa-b4e7-bb6b765af52f\" (UID: \"c0cb7948-9a3d-40fa-b4e7-bb6b765af52f\") " Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.280551 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca78485a-0626-420e-aa59-a9a4680df83c-logs\") pod \"ca78485a-0626-420e-aa59-a9a4680df83c\" (UID: \"ca78485a-0626-420e-aa59-a9a4680df83c\") " Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.280599 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/97384a50-9f20-4cbf-8b02-afffe5ddc791-config-data\") pod \"97384a50-9f20-4cbf-8b02-afffe5ddc791\" (UID: \"97384a50-9f20-4cbf-8b02-afffe5ddc791\") " Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.280625 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0cb7948-9a3d-40fa-b4e7-bb6b765af52f-logs\") pod \"c0cb7948-9a3d-40fa-b4e7-bb6b765af52f\" (UID: \"c0cb7948-9a3d-40fa-b4e7-bb6b765af52f\") " Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.280839 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/97384a50-9f20-4cbf-8b02-afffe5ddc791-public-tls-certs\") pod \"97384a50-9f20-4cbf-8b02-afffe5ddc791\" (UID: \"97384a50-9f20-4cbf-8b02-afffe5ddc791\") " Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.280923 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0cb7948-9a3d-40fa-b4e7-bb6b765af52f-scripts\") pod \"c0cb7948-9a3d-40fa-b4e7-bb6b765af52f\" (UID: \"c0cb7948-9a3d-40fa-b4e7-bb6b765af52f\") " Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.281039 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1db01523-cd76-4a9d-8063-e024dd919879-config-data\") pod \"1db01523-cd76-4a9d-8063-e024dd919879\" (UID: \"1db01523-cd76-4a9d-8063-e024dd919879\") " Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.281096 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"c0cb7948-9a3d-40fa-b4e7-bb6b765af52f\" (UID: \"c0cb7948-9a3d-40fa-b4e7-bb6b765af52f\") " Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.281192 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca78485a-0626-420e-aa59-a9a4680df83c-combined-ca-bundle\") pod \"ca78485a-0626-420e-aa59-a9a4680df83c\" (UID: \"ca78485a-0626-420e-aa59-a9a4680df83c\") " Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.281596 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97384a50-9f20-4cbf-8b02-afffe5ddc791-logs" (OuterVolumeSpecName: "logs") pod "97384a50-9f20-4cbf-8b02-afffe5ddc791" (UID: "97384a50-9f20-4cbf-8b02-afffe5ddc791"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.283262 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.283284 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5qjl\" (UniqueName: \"kubernetes.io/projected/2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2-kube-api-access-m5qjl\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.283317 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8e2103e-dd2c-4e18-b66b-0555a1c8f43a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.283331 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2-logs\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.283341 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1db01523-cd76-4a9d-8063-e024dd919879-logs\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.283352 
4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dmgjq\" (UniqueName: \"kubernetes.io/projected/b8e2103e-dd2c-4e18-b66b-0555a1c8f43a-kube-api-access-dmgjq\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.283363 4735 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.283374 4735 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.283405 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97384a50-9f20-4cbf-8b02-afffe5ddc791-logs\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:26 crc kubenswrapper[4735]: E1008 13:19:26.283700 4735 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 08 13:19:26 crc kubenswrapper[4735]: E1008 13:19:26.283777 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/644f0099-ec9c-42af-bbe8-54fe5b603bb3-config-data podName:644f0099-ec9c-42af-bbe8-54fe5b603bb3 nodeName:}" failed. No retries permitted until 2025-10-08 13:19:34.283757506 +0000 UTC m=+1424.092062040 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/644f0099-ec9c-42af-bbe8-54fe5b603bb3-config-data") pod "rabbitmq-server-0" (UID: "644f0099-ec9c-42af-bbe8-54fe5b603bb3") : configmap "rabbitmq-config-data" not found Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.284650 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0cb7948-9a3d-40fa-b4e7-bb6b765af52f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c0cb7948-9a3d-40fa-b4e7-bb6b765af52f" (UID: "c0cb7948-9a3d-40fa-b4e7-bb6b765af52f"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.284780 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0cb7948-9a3d-40fa-b4e7-bb6b765af52f-logs" (OuterVolumeSpecName: "logs") pod "c0cb7948-9a3d-40fa-b4e7-bb6b765af52f" (UID: "c0cb7948-9a3d-40fa-b4e7-bb6b765af52f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.287195 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca78485a-0626-420e-aa59-a9a4680df83c-logs" (OuterVolumeSpecName: "logs") pod "ca78485a-0626-420e-aa59-a9a4680df83c" (UID: "ca78485a-0626-420e-aa59-a9a4680df83c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.290978 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2-config-data" (OuterVolumeSpecName: "config-data") pod "2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2" (UID: "2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.291236 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance55cc-account-delete-cnr66" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.293152 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2" (UID: "2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.295285 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca78485a-0626-420e-aa59-a9a4680df83c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ca78485a-0626-420e-aa59-a9a4680df83c" (UID: "ca78485a-0626-420e-aa59-a9a4680df83c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.296571 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0cb7948-9a3d-40fa-b4e7-bb6b765af52f-scripts" (OuterVolumeSpecName: "scripts") pod "c0cb7948-9a3d-40fa-b4e7-bb6b765af52f" (UID: "c0cb7948-9a3d-40fa-b4e7-bb6b765af52f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.297251 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f92cd291-9315-4b44-b318-6f029f253de6-kube-api-access-2lf82" (OuterVolumeSpecName: "kube-api-access-2lf82") pod "f92cd291-9315-4b44-b318-6f029f253de6" (UID: "f92cd291-9315-4b44-b318-6f029f253de6"). InnerVolumeSpecName "kube-api-access-2lf82". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.297294 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca78485a-0626-420e-aa59-a9a4680df83c-kube-api-access-k7jr6" (OuterVolumeSpecName: "kube-api-access-k7jr6") pod "ca78485a-0626-420e-aa59-a9a4680df83c" (UID: "ca78485a-0626-420e-aa59-a9a4680df83c"). InnerVolumeSpecName "kube-api-access-k7jr6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.298397 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-8555f7d884-2km9p"] Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.299385 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1db01523-cd76-4a9d-8063-e024dd919879-kube-api-access-g5k48" (OuterVolumeSpecName: "kube-api-access-g5k48") pod "1db01523-cd76-4a9d-8063-e024dd919879" (UID: "1db01523-cd76-4a9d-8063-e024dd919879"). InnerVolumeSpecName "kube-api-access-g5k48". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.300397 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0cb7948-9a3d-40fa-b4e7-bb6b765af52f-kube-api-access-tbkv6" (OuterVolumeSpecName: "kube-api-access-tbkv6") pod "c0cb7948-9a3d-40fa-b4e7-bb6b765af52f" (UID: "c0cb7948-9a3d-40fa-b4e7-bb6b765af52f"). InnerVolumeSpecName "kube-api-access-tbkv6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.316670 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8e2103e-dd2c-4e18-b66b-0555a1c8f43a-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "b8e2103e-dd2c-4e18-b66b-0555a1c8f43a" (UID: "b8e2103e-dd2c-4e18-b66b-0555a1c8f43a"). 
InnerVolumeSpecName "kube-state-metrics-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.318808 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-8555f7d884-2km9p"] Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.318944 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.330039 4735 scope.go:117] "RemoveContainer" containerID="c6fb989e32326dc473a47f896284b872f9d86ea0f3de39c8b6cff9255ec1e38f" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.330935 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97384a50-9f20-4cbf-8b02-afffe5ddc791-kube-api-access-82j5p" (OuterVolumeSpecName: "kube-api-access-82j5p") pod "97384a50-9f20-4cbf-8b02-afffe5ddc791" (UID: "97384a50-9f20-4cbf-8b02-afffe5ddc791"). InnerVolumeSpecName "kube-api-access-82j5p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.339958 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "c0cb7948-9a3d-40fa-b4e7-bb6b765af52f" (UID: "c0cb7948-9a3d-40fa-b4e7-bb6b765af52f"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.343201 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/novacell066dd-account-delete-wnksz" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.363581 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2" (UID: "2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.383990 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8c473e5-0132-4ee7-a14e-ada0780d358c-combined-ca-bundle\") pod \"b8c473e5-0132-4ee7-a14e-ada0780d358c\" (UID: \"b8c473e5-0132-4ee7-a14e-ada0780d358c\") " Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.384059 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f6927451-d206-4948-a43e-5b4ee59dddcc-config-data\") pod \"f6927451-d206-4948-a43e-5b4ee59dddcc\" (UID: \"f6927451-d206-4948-a43e-5b4ee59dddcc\") " Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.384094 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22vkt\" (UniqueName: \"kubernetes.io/projected/b8c473e5-0132-4ee7-a14e-ada0780d358c-kube-api-access-22vkt\") pod \"b8c473e5-0132-4ee7-a14e-ada0780d358c\" (UID: \"b8c473e5-0132-4ee7-a14e-ada0780d358c\") " Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.384558 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8c473e5-0132-4ee7-a14e-ada0780d358c-scripts\") pod \"b8c473e5-0132-4ee7-a14e-ada0780d358c\" (UID: \"b8c473e5-0132-4ee7-a14e-ada0780d358c\") " Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.384596 
4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8c473e5-0132-4ee7-a14e-ada0780d358c-config-data\") pod \"b8c473e5-0132-4ee7-a14e-ada0780d358c\" (UID: \"b8c473e5-0132-4ee7-a14e-ada0780d358c\") " Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.384632 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jchv\" (UniqueName: \"kubernetes.io/projected/f952e88e-fe68-4770-874c-fdee88d43aee-kube-api-access-2jchv\") pod \"f952e88e-fe68-4770-874c-fdee88d43aee\" (UID: \"f952e88e-fe68-4770-874c-fdee88d43aee\") " Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.384730 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xf8zq\" (UniqueName: \"kubernetes.io/projected/6fe2bc1d-7e65-45ba-9d42-cae899859258-kube-api-access-xf8zq\") pod \"6fe2bc1d-7e65-45ba-9d42-cae899859258\" (UID: \"6fe2bc1d-7e65-45ba-9d42-cae899859258\") " Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.384778 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b8c473e5-0132-4ee7-a14e-ada0780d358c-httpd-run\") pod \"b8c473e5-0132-4ee7-a14e-ada0780d358c\" (UID: \"b8c473e5-0132-4ee7-a14e-ada0780d358c\") " Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.384809 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f6927451-d206-4948-a43e-5b4ee59dddcc-kolla-config\") pod \"f6927451-d206-4948-a43e-5b4ee59dddcc\" (UID: \"f6927451-d206-4948-a43e-5b4ee59dddcc\") " Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.384838 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8c473e5-0132-4ee7-a14e-ada0780d358c-logs\") pod \"b8c473e5-0132-4ee7-a14e-ada0780d358c\" 
(UID: \"b8c473e5-0132-4ee7-a14e-ada0780d358c\") " Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.384869 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6927451-d206-4948-a43e-5b4ee59dddcc-memcached-tls-certs\") pod \"f6927451-d206-4948-a43e-5b4ee59dddcc\" (UID: \"f6927451-d206-4948-a43e-5b4ee59dddcc\") " Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.384920 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"b8c473e5-0132-4ee7-a14e-ada0780d358c\" (UID: \"b8c473e5-0132-4ee7-a14e-ada0780d358c\") " Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.384947 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6927451-d206-4948-a43e-5b4ee59dddcc-combined-ca-bundle\") pod \"f6927451-d206-4948-a43e-5b4ee59dddcc\" (UID: \"f6927451-d206-4948-a43e-5b4ee59dddcc\") " Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.384982 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8c473e5-0132-4ee7-a14e-ada0780d358c-internal-tls-certs\") pod \"b8c473e5-0132-4ee7-a14e-ada0780d358c\" (UID: \"b8c473e5-0132-4ee7-a14e-ada0780d358c\") " Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.384989 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6927451-d206-4948-a43e-5b4ee59dddcc-config-data" (OuterVolumeSpecName: "config-data") pod "f6927451-d206-4948-a43e-5b4ee59dddcc" (UID: "f6927451-d206-4948-a43e-5b4ee59dddcc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.385043 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qbrb\" (UniqueName: \"kubernetes.io/projected/f6927451-d206-4948-a43e-5b4ee59dddcc-kube-api-access-2qbrb\") pod \"f6927451-d206-4948-a43e-5b4ee59dddcc\" (UID: \"f6927451-d206-4948-a43e-5b4ee59dddcc\") " Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.385251 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8c473e5-0132-4ee7-a14e-ada0780d358c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "b8c473e5-0132-4ee7-a14e-ada0780d358c" (UID: "b8c473e5-0132-4ee7-a14e-ada0780d358c"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.386112 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f6927451-d206-4948-a43e-5b4ee59dddcc-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.386134 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tbkv6\" (UniqueName: \"kubernetes.io/projected/c0cb7948-9a3d-40fa-b4e7-bb6b765af52f-kube-api-access-tbkv6\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.386177 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7jr6\" (UniqueName: \"kubernetes.io/projected/ca78485a-0626-420e-aa59-a9a4680df83c-kube-api-access-k7jr6\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.386191 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5k48\" (UniqueName: \"kubernetes.io/projected/1db01523-cd76-4a9d-8063-e024dd919879-kube-api-access-g5k48\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:26 crc kubenswrapper[4735]: 
I1008 13:19:26.386204 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82j5p\" (UniqueName: \"kubernetes.io/projected/97384a50-9f20-4cbf-8b02-afffe5ddc791-kube-api-access-82j5p\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.386217 4735 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.386263 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.386276 4735 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c0cb7948-9a3d-40fa-b4e7-bb6b765af52f-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.386287 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca78485a-0626-420e-aa59-a9a4680df83c-logs\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.386297 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0cb7948-9a3d-40fa-b4e7-bb6b765af52f-logs\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.386367 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.386383 4735 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/b8c473e5-0132-4ee7-a14e-ada0780d358c-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.386424 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0cb7948-9a3d-40fa-b4e7-bb6b765af52f-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.386450 4735 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.386464 4735 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8e2103e-dd2c-4e18-b66b-0555a1c8f43a-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.386515 4735 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ca78485a-0626-420e-aa59-a9a4680df83c-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.386530 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lf82\" (UniqueName: \"kubernetes.io/projected/f92cd291-9315-4b44-b318-6f029f253de6-kube-api-access-2lf82\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.390951 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6927451-d206-4948-a43e-5b4ee59dddcc-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "f6927451-d206-4948-a43e-5b4ee59dddcc" (UID: "f6927451-d206-4948-a43e-5b4ee59dddcc"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.397002 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8c473e5-0132-4ee7-a14e-ada0780d358c-logs" (OuterVolumeSpecName: "logs") pod "b8c473e5-0132-4ee7-a14e-ada0780d358c" (UID: "b8c473e5-0132-4ee7-a14e-ada0780d358c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.401674 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8c473e5-0132-4ee7-a14e-ada0780d358c-scripts" (OuterVolumeSpecName: "scripts") pod "b8c473e5-0132-4ee7-a14e-ada0780d358c" (UID: "b8c473e5-0132-4ee7-a14e-ada0780d358c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.401772 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8e2103e-dd2c-4e18-b66b-0555a1c8f43a-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "b8e2103e-dd2c-4e18-b66b-0555a1c8f43a" (UID: "b8e2103e-dd2c-4e18-b66b-0555a1c8f43a"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.403789 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "b8c473e5-0132-4ee7-a14e-ada0780d358c" (UID: "b8c473e5-0132-4ee7-a14e-ada0780d358c"). InnerVolumeSpecName "local-storage04-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.405730 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6927451-d206-4948-a43e-5b4ee59dddcc-kube-api-access-2qbrb" (OuterVolumeSpecName: "kube-api-access-2qbrb") pod "f6927451-d206-4948-a43e-5b4ee59dddcc" (UID: "f6927451-d206-4948-a43e-5b4ee59dddcc"). InnerVolumeSpecName "kube-api-access-2qbrb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.405824 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f952e88e-fe68-4770-874c-fdee88d43aee-kube-api-access-2jchv" (OuterVolumeSpecName: "kube-api-access-2jchv") pod "f952e88e-fe68-4770-874c-fdee88d43aee" (UID: "f952e88e-fe68-4770-874c-fdee88d43aee"). InnerVolumeSpecName "kube-api-access-2jchv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.405853 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8c473e5-0132-4ee7-a14e-ada0780d358c-kube-api-access-22vkt" (OuterVolumeSpecName: "kube-api-access-22vkt") pod "b8c473e5-0132-4ee7-a14e-ada0780d358c" (UID: "b8c473e5-0132-4ee7-a14e-ada0780d358c"). InnerVolumeSpecName "kube-api-access-22vkt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.428876 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fe2bc1d-7e65-45ba-9d42-cae899859258-kube-api-access-xf8zq" (OuterVolumeSpecName: "kube-api-access-xf8zq") pod "6fe2bc1d-7e65-45ba-9d42-cae899859258" (UID: "6fe2bc1d-7e65-45ba-9d42-cae899859258"). InnerVolumeSpecName "kube-api-access-xf8zq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.449954 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97384a50-9f20-4cbf-8b02-afffe5ddc791-config-data" (OuterVolumeSpecName: "config-data") pod "97384a50-9f20-4cbf-8b02-afffe5ddc791" (UID: "97384a50-9f20-4cbf-8b02-afffe5ddc791"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.453013 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1db01523-cd76-4a9d-8063-e024dd919879-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1db01523-cd76-4a9d-8063-e024dd919879" (UID: "1db01523-cd76-4a9d-8063-e024dd919879"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.475267 4735 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.487597 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hc8z2\" (UniqueName: \"kubernetes.io/projected/9d6e2da8-2dc0-4642-b216-218ed61cc813-kube-api-access-hc8z2\") pod \"9d6e2da8-2dc0-4642-b216-218ed61cc813\" (UID: \"9d6e2da8-2dc0-4642-b216-218ed61cc813\") " Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.488316 4735 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.488363 4735 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.488380 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1db01523-cd76-4a9d-8063-e024dd919879-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.488393 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qbrb\" (UniqueName: \"kubernetes.io/projected/f6927451-d206-4948-a43e-5b4ee59dddcc-kube-api-access-2qbrb\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.488407 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22vkt\" (UniqueName: \"kubernetes.io/projected/b8c473e5-0132-4ee7-a14e-ada0780d358c-kube-api-access-22vkt\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.488420 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8c473e5-0132-4ee7-a14e-ada0780d358c-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.488435 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jchv\" (UniqueName: \"kubernetes.io/projected/f952e88e-fe68-4770-874c-fdee88d43aee-kube-api-access-2jchv\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.488446 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xf8zq\" (UniqueName: \"kubernetes.io/projected/6fe2bc1d-7e65-45ba-9d42-cae899859258-kube-api-access-xf8zq\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.488458 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97384a50-9f20-4cbf-8b02-afffe5ddc791-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:26 
crc kubenswrapper[4735]: I1008 13:19:26.488468 4735 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f6927451-d206-4948-a43e-5b4ee59dddcc-kolla-config\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.488498 4735 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/b8e2103e-dd2c-4e18-b66b-0555a1c8f43a-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.488511 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8c473e5-0132-4ee7-a14e-ada0780d358c-logs\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.497637 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d6e2da8-2dc0-4642-b216-218ed61cc813-kube-api-access-hc8z2" (OuterVolumeSpecName: "kube-api-access-hc8z2") pod "9d6e2da8-2dc0-4642-b216-218ed61cc813" (UID: "9d6e2da8-2dc0-4642-b216-218ed61cc813"). InnerVolumeSpecName "kube-api-access-hc8z2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.506563 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6927451-d206-4948-a43e-5b4ee59dddcc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f6927451-d206-4948-a43e-5b4ee59dddcc" (UID: "f6927451-d206-4948-a43e-5b4ee59dddcc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.513469 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6927451-d206-4948-a43e-5b4ee59dddcc-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "f6927451-d206-4948-a43e-5b4ee59dddcc" (UID: "f6927451-d206-4948-a43e-5b4ee59dddcc"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.513617 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0cb7948-9a3d-40fa-b4e7-bb6b765af52f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c0cb7948-9a3d-40fa-b4e7-bb6b765af52f" (UID: "c0cb7948-9a3d-40fa-b4e7-bb6b765af52f"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.518328 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97384a50-9f20-4cbf-8b02-afffe5ddc791-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "97384a50-9f20-4cbf-8b02-afffe5ddc791" (UID: "97384a50-9f20-4cbf-8b02-afffe5ddc791"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.519771 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8c473e5-0132-4ee7-a14e-ada0780d358c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b8c473e5-0132-4ee7-a14e-ada0780d358c" (UID: "b8c473e5-0132-4ee7-a14e-ada0780d358c"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.522244 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8c473e5-0132-4ee7-a14e-ada0780d358c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b8c473e5-0132-4ee7-a14e-ada0780d358c" (UID: "b8c473e5-0132-4ee7-a14e-ada0780d358c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.536175 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca78485a-0626-420e-aa59-a9a4680df83c-config-data" (OuterVolumeSpecName: "config-data") pod "ca78485a-0626-420e-aa59-a9a4680df83c" (UID: "ca78485a-0626-420e-aa59-a9a4680df83c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.541226 4735 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.543413 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2" (UID: "2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.549567 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0cb7948-9a3d-40fa-b4e7-bb6b765af52f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c0cb7948-9a3d-40fa-b4e7-bb6b765af52f" (UID: "c0cb7948-9a3d-40fa-b4e7-bb6b765af52f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.566398 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca78485a-0626-420e-aa59-a9a4680df83c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ca78485a-0626-420e-aa59-a9a4680df83c" (UID: "ca78485a-0626-420e-aa59-a9a4680df83c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.569294 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca78485a-0626-420e-aa59-a9a4680df83c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ca78485a-0626-420e-aa59-a9a4680df83c" (UID: "ca78485a-0626-420e-aa59-a9a4680df83c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.585102 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1db01523-cd76-4a9d-8063-e024dd919879-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "1db01523-cd76-4a9d-8063-e024dd919879" (UID: "1db01523-cd76-4a9d-8063-e024dd919879"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.587179 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1db01523-cd76-4a9d-8063-e024dd919879-config-data" (OuterVolumeSpecName: "config-data") pod "1db01523-cd76-4a9d-8063-e024dd919879" (UID: "1db01523-cd76-4a9d-8063-e024dd919879"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.590380 4735 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6927451-d206-4948-a43e-5b4ee59dddcc-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.590411 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1db01523-cd76-4a9d-8063-e024dd919879-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.590424 4735 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.590437 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6927451-d206-4948-a43e-5b4ee59dddcc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.590450 4735 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8c473e5-0132-4ee7-a14e-ada0780d358c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.590461 4735 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca78485a-0626-420e-aa59-a9a4680df83c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.593513 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca78485a-0626-420e-aa59-a9a4680df83c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ca78485a-0626-420e-aa59-a9a4680df83c" (UID: "ca78485a-0626-420e-aa59-a9a4680df83c"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.593560 4735 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca78485a-0626-420e-aa59-a9a4680df83c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.593607 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hc8z2\" (UniqueName: \"kubernetes.io/projected/9d6e2da8-2dc0-4642-b216-218ed61cc813-kube-api-access-hc8z2\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.593622 4735 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1db01523-cd76-4a9d-8063-e024dd919879-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.593633 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0cb7948-9a3d-40fa-b4e7-bb6b765af52f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.593644 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8c473e5-0132-4ee7-a14e-ada0780d358c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.593657 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca78485a-0626-420e-aa59-a9a4680df83c-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.593667 4735 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0cb7948-9a3d-40fa-b4e7-bb6b765af52f-public-tls-certs\") on node \"crc\" 
DevicePath \"\"" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.593676 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97384a50-9f20-4cbf-8b02-afffe5ddc791-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.593685 4735 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.596557 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0cb7948-9a3d-40fa-b4e7-bb6b765af52f-config-data" (OuterVolumeSpecName: "config-data") pod "c0cb7948-9a3d-40fa-b4e7-bb6b765af52f" (UID: "c0cb7948-9a3d-40fa-b4e7-bb6b765af52f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.602873 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97384a50-9f20-4cbf-8b02-afffe5ddc791-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "97384a50-9f20-4cbf-8b02-afffe5ddc791" (UID: "97384a50-9f20-4cbf-8b02-afffe5ddc791"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.604688 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97384a50-9f20-4cbf-8b02-afffe5ddc791-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "97384a50-9f20-4cbf-8b02-afffe5ddc791" (UID: "97384a50-9f20-4cbf-8b02-afffe5ddc791"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.616984 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8c473e5-0132-4ee7-a14e-ada0780d358c-config-data" (OuterVolumeSpecName: "config-data") pod "b8c473e5-0132-4ee7-a14e-ada0780d358c" (UID: "b8c473e5-0132-4ee7-a14e-ada0780d358c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.673082 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6/ovn-northd/0.log" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.673139 4735 generic.go:334] "Generic (PLEG): container finished" podID="e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6" containerID="af561faafdd478b081c4107185420b4e9d3e3ef31f1f6fc09c0c8a499ec07cbe" exitCode=139 Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.673232 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6","Type":"ContainerDied","Data":"af561faafdd478b081c4107185420b4e9d3e3ef31f1f6fc09c0c8a499ec07cbe"} Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.675987 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2","Type":"ContainerDied","Data":"6b73e013ac9ec7f447b8c835db5a8692a5faf8e441ae811092f34d1b4454c24f"} Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.676038 4735 scope.go:117] "RemoveContainer" containerID="52bb4221516a6482adaee2563d79681bb7ba3599c014219c89382ac18144524b" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.676163 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.685373 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell066dd-account-delete-wnksz" event={"ID":"9d6e2da8-2dc0-4642-b216-218ed61cc813","Type":"ContainerDied","Data":"f83e4017370d86c1326fb60e5ed66078b6d39a8c2ce5732684f5b16a17e01fc3"} Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.685516 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novacell066dd-account-delete-wnksz" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.697519 4735 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/97384a50-9f20-4cbf-8b02-afffe5ddc791-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.697550 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0cb7948-9a3d-40fa-b4e7-bb6b765af52f-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.697563 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8c473e5-0132-4ee7-a14e-ada0780d358c-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.697575 4735 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/97384a50-9f20-4cbf-8b02-afffe5ddc791-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.697586 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca78485a-0626-420e-aa59-a9a4680df83c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.699488 4735 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/neutron936c-account-delete-hq47l" event={"ID":"f92cd291-9315-4b44-b318-6f029f253de6","Type":"ContainerDied","Data":"33686f02fc026bb542e0f66bc1e14a5c9d2d923213d11e51c7b35053507fd7c5"} Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.699523 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33686f02fc026bb542e0f66bc1e14a5c9d2d923213d11e51c7b35053507fd7c5" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.699592 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron936c-account-delete-hq47l" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.712325 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c0cb7948-9a3d-40fa-b4e7-bb6b765af52f","Type":"ContainerDied","Data":"9b9000366a32ba84dbf78ae5b17fcafb2500b541ed982f02d42d0094ea6b4c80"} Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.712371 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.737678 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell066dd-account-delete-wnksz"] Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.738440 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6dd5468f46-ccf7d" event={"ID":"ca78485a-0626-420e-aa59-a9a4680df83c","Type":"ContainerDied","Data":"d1a52799ef805cf10d4e03ac9720bbab4f9edf303ad6ce305398173baa3512d9"} Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.738586 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6dd5468f46-ccf7d" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.746667 4735 scope.go:117] "RemoveContainer" containerID="ef4967d1505370e0cd47289c67232c6f8769a456bdabc3b30bd020b8c2020a1e" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.754739 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance55cc-account-delete-cnr66" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.754789 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance55cc-account-delete-cnr66" event={"ID":"f952e88e-fe68-4770-874c-fdee88d43aee","Type":"ContainerDied","Data":"8dbec95c4ce8c28ba5659417c27c67fb59975963e9a422e3a204e96c0b171072"} Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.757907 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novacell066dd-account-delete-wnksz"] Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.766520 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b8c473e5-0132-4ee7-a14e-ada0780d358c","Type":"ContainerDied","Data":"beb05430b2e4377c90fcb24d6c1b3d571a3fe48e638ced052b15421e516051f6"} Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.766690 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.771742 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.777625 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.778801 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"97384a50-9f20-4cbf-8b02-afffe5ddc791","Type":"ContainerDied","Data":"059d46787e8c02976e8aea98ad1dfa228b5445ba55076e35881403183aae18e4"} Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.781213 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.782800 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"f6927451-d206-4948-a43e-5b4ee59dddcc","Type":"ContainerDied","Data":"fe3d5c22dfc05728732f5d67d080c8fac6dee5ca6f6a923e86129ac6a4e105a0"} Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.782817 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.785100 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement364a-account-delete-dg8rs" event={"ID":"6fe2bc1d-7e65-45ba-9d42-cae899859258","Type":"ContainerDied","Data":"dd1f9def7f6ada112889bfee6873367609de38290aa5db020e0c8fdfbe703326"} Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.785125 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement364a-account-delete-dg8rs" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.795250 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b8e2103e-dd2c-4e18-b66b-0555a1c8f43a","Type":"ContainerDied","Data":"8f8229032075f44076ad9bdb292a5dd945022eff685ef4b721d0ec3c86908f4e"} Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.795341 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.796536 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 13:19:26 crc kubenswrapper[4735]: E1008 13:19:26.799179 4735 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 08 13:19:26 crc kubenswrapper[4735]: E1008 13:19:26.799236 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/01ad2819-cfc3-4416-8a27-ecb65c2fa366-config-data podName:01ad2819-cfc3-4416-8a27-ecb65c2fa366 nodeName:}" failed. No retries permitted until 2025-10-08 13:19:34.799220176 +0000 UTC m=+1424.607524720 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/01ad2819-cfc3-4416-8a27-ecb65c2fa366-config-data") pod "rabbitmq-cell1-server-0" (UID: "01ad2819-cfc3-4416-8a27-ecb65c2fa366") : configmap "rabbitmq-cell1-config-data" not found Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.799922 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1db01523-cd76-4a9d-8063-e024dd919879","Type":"ContainerDied","Data":"9ae6450959e862d9763d81e9fae2191a7b6b4f716035130d03d9f88e6d1c3490"} Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.800021 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.808786 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.911236 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0172640d-d560-4ef0-a7e4-b649ccfa92e4" path="/var/lib/kubelet/pods/0172640d-d560-4ef0-a7e4-b649ccfa92e4/volumes" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.912035 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2868ea84-fef0-4728-9a62-e2ba97a9aae1" path="/var/lib/kubelet/pods/2868ea84-fef0-4728-9a62-e2ba97a9aae1/volumes" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.913178 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2" path="/var/lib/kubelet/pods/2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2/volumes" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.915055 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31465d06-bf56-4173-95c1-3c13d776b1f9" path="/var/lib/kubelet/pods/31465d06-bf56-4173-95c1-3c13d776b1f9/volumes" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.915769 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4189bc5a-2f23-4b53-9df4-e97a65a7d8eb" path="/var/lib/kubelet/pods/4189bc5a-2f23-4b53-9df4-e97a65a7d8eb/volumes" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.916439 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53890bab-61cc-4f19-908a-33c085da8d33" path="/var/lib/kubelet/pods/53890bab-61cc-4f19-908a-33c085da8d33/volumes" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.917848 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5516bd19-016a-4d52-84c9-b3801e1841ca" path="/var/lib/kubelet/pods/5516bd19-016a-4d52-84c9-b3801e1841ca/volumes" Oct 08 13:19:26 crc 
kubenswrapper[4735]: I1008 13:19:26.918487 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70450186-9a74-4ba9-92f9-3f3a9626d0a0" path="/var/lib/kubelet/pods/70450186-9a74-4ba9-92f9-3f3a9626d0a0/volumes" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.919079 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84fa39bb-f66a-49ed-960c-7598ec87854a" path="/var/lib/kubelet/pods/84fa39bb-f66a-49ed-960c-7598ec87854a/volumes" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.920056 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8953f639-e468-4f85-8093-6c29fa380d8a" path="/var/lib/kubelet/pods/8953f639-e468-4f85-8093-6c29fa380d8a/volumes" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.920588 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d6e2da8-2dc0-4642-b216-218ed61cc813" path="/var/lib/kubelet/pods/9d6e2da8-2dc0-4642-b216-218ed61cc813/volumes" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.921076 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ada58119-3f96-4962-b214-fb2662c3d935" path="/var/lib/kubelet/pods/ada58119-3f96-4962-b214-fb2662c3d935/volumes" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.922125 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0cb7948-9a3d-40fa-b4e7-bb6b765af52f" path="/var/lib/kubelet/pods/c0cb7948-9a3d-40fa-b4e7-bb6b765af52f/volumes" Oct 08 13:19:26 crc kubenswrapper[4735]: I1008 13:19:26.979036 4735 scope.go:117] "RemoveContainer" containerID="d70b201742b9a825e3e7cf33e617c61556b91f285f8bf4e0c3a1ee1d89743db4" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.076769 4735 scope.go:117] "RemoveContainer" containerID="7cbf4814a5d8742756a2f0076ccbdc83fc3186c93b8940d1bc61f10728f5e789" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.113706 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-northd-0_e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6/ovn-northd/0.log" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.114032 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.164982 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron936c-account-delete-hq47l"] Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.167768 4735 scope.go:117] "RemoveContainer" containerID="00bb3e3d5baf3ca592174014e03b36040aa0b638cb8a1c2566220b5a3cb65e9b" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.178888 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron936c-account-delete-hq47l"] Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.191564 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.192832 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.210271 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6-config\") pod \"e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6\" (UID: \"e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6\") " Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.210370 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sshql\" (UniqueName: \"kubernetes.io/projected/e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6-kube-api-access-sshql\") pod \"e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6\" (UID: \"e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6\") " Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.210421 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6-ovn-northd-tls-certs\") pod \"e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6\" (UID: \"e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6\") " Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.210448 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6-ovn-rundir\") pod \"e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6\" (UID: \"e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6\") " Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.210570 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6-metrics-certs-tls-certs\") pod \"e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6\" (UID: \"e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6\") " Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.214661 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6-combined-ca-bundle\") pod \"e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6\" (UID: \"e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6\") " Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.214741 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6-scripts\") pod \"e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6\" (UID: \"e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6\") " Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.212349 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6-config" (OuterVolumeSpecName: "config") pod "e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6" (UID: "e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.215696 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6-scripts" (OuterVolumeSpecName: "scripts") pod "e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6" (UID: "e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.216177 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6" (UID: "e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6"). InnerVolumeSpecName "ovn-rundir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.217671 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6-kube-api-access-sshql" (OuterVolumeSpecName: "kube-api-access-sshql") pod "e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6" (UID: "e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6"). InnerVolumeSpecName "kube-api-access-sshql". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.218733 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.225446 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6dd5468f46-ccf7d"] Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.225632 4735 scope.go:117] "RemoveContainer" containerID="54ae09e0758dbced8f136de91ec860924f9d0a4a1aa8e35c4a62a56d25f7b734" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.236157 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6dd5468f46-ccf7d"] Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.253600 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-5574db6b6f-94l7x" podUID="74ecad81-3dfd-4f36-be79-699b6bd30e90" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.152:9696/\": dial tcp 10.217.0.152:9696: connect: connection refused" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.253834 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.261688 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.265974 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 
13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.276563 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.283048 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance55cc-account-delete-cnr66"] Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.284668 4735 scope.go:117] "RemoveContainer" containerID="719ed4d6e6ca3be4b585ba42337fe3c6bf926775f7cb265506efe244531bac65" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.299733 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6" (UID: "e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.303632 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6" (UID: "e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6"). InnerVolumeSpecName "ovn-northd-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.308196 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance55cc-account-delete-cnr66"] Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.313684 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement364a-account-delete-dg8rs"] Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.313762 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6" (UID: "e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.316409 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"f2428d97-aa22-413f-ab58-5b077fdde95c\" (UID: \"f2428d97-aa22-413f-ab58-5b077fdde95c\") " Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.316505 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f2428d97-aa22-413f-ab58-5b077fdde95c-config-data-generated\") pod \"f2428d97-aa22-413f-ab58-5b077fdde95c\" (UID: \"f2428d97-aa22-413f-ab58-5b077fdde95c\") " Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.316557 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2428d97-aa22-413f-ab58-5b077fdde95c-operator-scripts\") pod \"f2428d97-aa22-413f-ab58-5b077fdde95c\" (UID: \"f2428d97-aa22-413f-ab58-5b077fdde95c\") " Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.316587 4735 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-8kbl4\" (UniqueName: \"kubernetes.io/projected/f2428d97-aa22-413f-ab58-5b077fdde95c-kube-api-access-8kbl4\") pod \"f2428d97-aa22-413f-ab58-5b077fdde95c\" (UID: \"f2428d97-aa22-413f-ab58-5b077fdde95c\") " Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.316626 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f2428d97-aa22-413f-ab58-5b077fdde95c-kolla-config\") pod \"f2428d97-aa22-413f-ab58-5b077fdde95c\" (UID: \"f2428d97-aa22-413f-ab58-5b077fdde95c\") " Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.316681 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2428d97-aa22-413f-ab58-5b077fdde95c-combined-ca-bundle\") pod \"f2428d97-aa22-413f-ab58-5b077fdde95c\" (UID: \"f2428d97-aa22-413f-ab58-5b077fdde95c\") " Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.316759 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/f2428d97-aa22-413f-ab58-5b077fdde95c-secrets\") pod \"f2428d97-aa22-413f-ab58-5b077fdde95c\" (UID: \"f2428d97-aa22-413f-ab58-5b077fdde95c\") " Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.316813 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f2428d97-aa22-413f-ab58-5b077fdde95c-config-data-default\") pod \"f2428d97-aa22-413f-ab58-5b077fdde95c\" (UID: \"f2428d97-aa22-413f-ab58-5b077fdde95c\") " Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.316846 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2428d97-aa22-413f-ab58-5b077fdde95c-galera-tls-certs\") pod \"f2428d97-aa22-413f-ab58-5b077fdde95c\" (UID: 
\"f2428d97-aa22-413f-ab58-5b077fdde95c\") " Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.317199 4735 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.317218 4735 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6-ovn-rundir\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.317230 4735 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.317242 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.317253 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.317263 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6-config\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.317273 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sshql\" (UniqueName: \"kubernetes.io/projected/e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6-kube-api-access-sshql\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.318525 4735 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement364a-account-delete-dg8rs"] Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.319616 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2428d97-aa22-413f-ab58-5b077fdde95c-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "f2428d97-aa22-413f-ab58-5b077fdde95c" (UID: "f2428d97-aa22-413f-ab58-5b077fdde95c"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.321791 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2428d97-aa22-413f-ab58-5b077fdde95c-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "f2428d97-aa22-413f-ab58-5b077fdde95c" (UID: "f2428d97-aa22-413f-ab58-5b077fdde95c"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.321804 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2428d97-aa22-413f-ab58-5b077fdde95c-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "f2428d97-aa22-413f-ab58-5b077fdde95c" (UID: "f2428d97-aa22-413f-ab58-5b077fdde95c"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.321971 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2428d97-aa22-413f-ab58-5b077fdde95c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f2428d97-aa22-413f-ab58-5b077fdde95c" (UID: "f2428d97-aa22-413f-ab58-5b077fdde95c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.322061 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2428d97-aa22-413f-ab58-5b077fdde95c-kube-api-access-8kbl4" (OuterVolumeSpecName: "kube-api-access-8kbl4") pod "f2428d97-aa22-413f-ab58-5b077fdde95c" (UID: "f2428d97-aa22-413f-ab58-5b077fdde95c"). InnerVolumeSpecName "kube-api-access-8kbl4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.327141 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2428d97-aa22-413f-ab58-5b077fdde95c-secrets" (OuterVolumeSpecName: "secrets") pod "f2428d97-aa22-413f-ab58-5b077fdde95c" (UID: "f2428d97-aa22-413f-ab58-5b077fdde95c"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.329534 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.334622 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.337322 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "mysql-db") pod "f2428d97-aa22-413f-ab58-5b077fdde95c" (UID: "f2428d97-aa22-413f-ab58-5b077fdde95c"). InnerVolumeSpecName "local-storage11-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.339609 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.356862 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.357456 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2428d97-aa22-413f-ab58-5b077fdde95c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f2428d97-aa22-413f-ab58-5b077fdde95c" (UID: "f2428d97-aa22-413f-ab58-5b077fdde95c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.361611 4735 scope.go:117] "RemoveContainer" containerID="ab9591fb677116af7fae955e0ecd4bc1b2ddd5209f1c6da40664145a2dcac0e5" Oct 08 13:19:27 crc kubenswrapper[4735]: E1008 13:19:27.365876 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0c32b0561e314aa74f9602cc47a55588e33447b929f0779af7d6ee25ee8ff883 is running failed: container process not found" containerID="0c32b0561e314aa74f9602cc47a55588e33447b929f0779af7d6ee25ee8ff883" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 08 13:19:27 crc kubenswrapper[4735]: E1008 13:19:27.366215 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0c32b0561e314aa74f9602cc47a55588e33447b929f0779af7d6ee25ee8ff883 is running failed: container process not found" containerID="0c32b0561e314aa74f9602cc47a55588e33447b929f0779af7d6ee25ee8ff883" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 08 13:19:27 crc kubenswrapper[4735]: E1008 13:19:27.366654 4735 
log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dba547c713f3dc3fb4b7cd134966f3012d70fd6781682f3b57fcf8290868f773" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 08 13:19:27 crc kubenswrapper[4735]: E1008 13:19:27.366929 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0c32b0561e314aa74f9602cc47a55588e33447b929f0779af7d6ee25ee8ff883 is running failed: container process not found" containerID="0c32b0561e314aa74f9602cc47a55588e33447b929f0779af7d6ee25ee8ff883" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 08 13:19:27 crc kubenswrapper[4735]: E1008 13:19:27.366977 4735 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0c32b0561e314aa74f9602cc47a55588e33447b929f0779af7d6ee25ee8ff883 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-9frwj" podUID="b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb" containerName="ovsdb-server" Oct 08 13:19:27 crc kubenswrapper[4735]: E1008 13:19:27.368509 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dba547c713f3dc3fb4b7cd134966f3012d70fd6781682f3b57fcf8290868f773" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 08 13:19:27 crc kubenswrapper[4735]: E1008 13:19:27.370572 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dba547c713f3dc3fb4b7cd134966f3012d70fd6781682f3b57fcf8290868f773" 
cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 08 13:19:27 crc kubenswrapper[4735]: E1008 13:19:27.370622 4735 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-9frwj" podUID="b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb" containerName="ovs-vswitchd" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.375841 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2428d97-aa22-413f-ab58-5b077fdde95c-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "f2428d97-aa22-413f-ab58-5b077fdde95c" (UID: "f2428d97-aa22-413f-ab58-5b077fdde95c"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.418277 4735 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f2428d97-aa22-413f-ab58-5b077fdde95c-kolla-config\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.418310 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2428d97-aa22-413f-ab58-5b077fdde95c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.418319 4735 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/f2428d97-aa22-413f-ab58-5b077fdde95c-secrets\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.418328 4735 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f2428d97-aa22-413f-ab58-5b077fdde95c-config-data-default\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 
13:19:27.418336 4735 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2428d97-aa22-413f-ab58-5b077fdde95c-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.418360 4735 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.418369 4735 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f2428d97-aa22-413f-ab58-5b077fdde95c-config-data-generated\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.418378 4735 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2428d97-aa22-413f-ab58-5b077fdde95c-operator-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.418386 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8kbl4\" (UniqueName: \"kubernetes.io/projected/f2428d97-aa22-413f-ab58-5b077fdde95c-kube-api-access-8kbl4\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.447378 4735 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.450728 4735 scope.go:117] "RemoveContainer" containerID="89675ae013264bf40ddb13e38257d27a7ce994542ac6e288c014f3d00f9d0b13" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.477363 4735 scope.go:117] "RemoveContainer" containerID="59fafd30b9822a732b940ca8ee5e9d58f1e7139b22b56b4a8c5b227e38ab23a0" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.495751 4735 scope.go:117] "RemoveContainer" 
containerID="d6ed228dfc5748efb601fafc99f157a758cb829f235af05f716ef363f89fcc14" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.523292 4735 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.547457 4735 scope.go:117] "RemoveContainer" containerID="205355fb6f24a0939888db6199eeaa319bf35bb6b065a98fc683f3a28c8fa854" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.583448 4735 scope.go:117] "RemoveContainer" containerID="9fd6286f631f71ba317c16f921eb7e0bc5df5010bec0980750f899b1396c8b6e" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.624199 4735 scope.go:117] "RemoveContainer" containerID="4a51c20c2a4142a285ce95ca78c731663396a540ba1f9a28337f2f9d05073c41" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.649819 4735 scope.go:117] "RemoveContainer" containerID="ff75cfda3a642f58adcba7bc55060219ea312fa63294148521fb3e2d8da99ab9" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.650401 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.693731 4735 scope.go:117] "RemoveContainer" containerID="418d5ac0c79b3bbb212fb99f2df15049ca6cc912c6272d8bf5adfab9f972e7cf" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.726446 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"644f0099-ec9c-42af-bbe8-54fe5b603bb3\" (UID: \"644f0099-ec9c-42af-bbe8-54fe5b603bb3\") " Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.726521 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/644f0099-ec9c-42af-bbe8-54fe5b603bb3-erlang-cookie-secret\") pod \"644f0099-ec9c-42af-bbe8-54fe5b603bb3\" (UID: \"644f0099-ec9c-42af-bbe8-54fe5b603bb3\") " Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.726583 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/644f0099-ec9c-42af-bbe8-54fe5b603bb3-pod-info\") pod \"644f0099-ec9c-42af-bbe8-54fe5b603bb3\" (UID: \"644f0099-ec9c-42af-bbe8-54fe5b603bb3\") " Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.726613 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/644f0099-ec9c-42af-bbe8-54fe5b603bb3-server-conf\") pod \"644f0099-ec9c-42af-bbe8-54fe5b603bb3\" (UID: \"644f0099-ec9c-42af-bbe8-54fe5b603bb3\") " Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.726637 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/644f0099-ec9c-42af-bbe8-54fe5b603bb3-rabbitmq-plugins\") pod \"644f0099-ec9c-42af-bbe8-54fe5b603bb3\" (UID: \"644f0099-ec9c-42af-bbe8-54fe5b603bb3\") " Oct 08 13:19:27 crc 
kubenswrapper[4735]: I1008 13:19:27.726660 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/644f0099-ec9c-42af-bbe8-54fe5b603bb3-rabbitmq-confd\") pod \"644f0099-ec9c-42af-bbe8-54fe5b603bb3\" (UID: \"644f0099-ec9c-42af-bbe8-54fe5b603bb3\") " Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.726730 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/644f0099-ec9c-42af-bbe8-54fe5b603bb3-config-data\") pod \"644f0099-ec9c-42af-bbe8-54fe5b603bb3\" (UID: \"644f0099-ec9c-42af-bbe8-54fe5b603bb3\") " Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.727520 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/644f0099-ec9c-42af-bbe8-54fe5b603bb3-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "644f0099-ec9c-42af-bbe8-54fe5b603bb3" (UID: "644f0099-ec9c-42af-bbe8-54fe5b603bb3"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.729852 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/644f0099-ec9c-42af-bbe8-54fe5b603bb3-rabbitmq-tls\") pod \"644f0099-ec9c-42af-bbe8-54fe5b603bb3\" (UID: \"644f0099-ec9c-42af-bbe8-54fe5b603bb3\") " Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.729909 4735 scope.go:117] "RemoveContainer" containerID="6fcbdbcd2a5b227d1b93c5af337c1732e6685ef4e33f54e3bddf01cee27e6906" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.729920 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/644f0099-ec9c-42af-bbe8-54fe5b603bb3-rabbitmq-erlang-cookie\") pod \"644f0099-ec9c-42af-bbe8-54fe5b603bb3\" (UID: \"644f0099-ec9c-42af-bbe8-54fe5b603bb3\") " Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.730771 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t82v4\" (UniqueName: \"kubernetes.io/projected/644f0099-ec9c-42af-bbe8-54fe5b603bb3-kube-api-access-t82v4\") pod \"644f0099-ec9c-42af-bbe8-54fe5b603bb3\" (UID: \"644f0099-ec9c-42af-bbe8-54fe5b603bb3\") " Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.730358 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/644f0099-ec9c-42af-bbe8-54fe5b603bb3-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "644f0099-ec9c-42af-bbe8-54fe5b603bb3" (UID: "644f0099-ec9c-42af-bbe8-54fe5b603bb3"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.730810 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/644f0099-ec9c-42af-bbe8-54fe5b603bb3-plugins-conf\") pod \"644f0099-ec9c-42af-bbe8-54fe5b603bb3\" (UID: \"644f0099-ec9c-42af-bbe8-54fe5b603bb3\") " Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.731718 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/644f0099-ec9c-42af-bbe8-54fe5b603bb3-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "644f0099-ec9c-42af-bbe8-54fe5b603bb3" (UID: "644f0099-ec9c-42af-bbe8-54fe5b603bb3"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.733187 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "persistence") pod "644f0099-ec9c-42af-bbe8-54fe5b603bb3" (UID: "644f0099-ec9c-42af-bbe8-54fe5b603bb3"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.733416 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/644f0099-ec9c-42af-bbe8-54fe5b603bb3-pod-info" (OuterVolumeSpecName: "pod-info") pod "644f0099-ec9c-42af-bbe8-54fe5b603bb3" (UID: "644f0099-ec9c-42af-bbe8-54fe5b603bb3"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.733529 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/644f0099-ec9c-42af-bbe8-54fe5b603bb3-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "644f0099-ec9c-42af-bbe8-54fe5b603bb3" (UID: "644f0099-ec9c-42af-bbe8-54fe5b603bb3"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.733706 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/644f0099-ec9c-42af-bbe8-54fe5b603bb3-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "644f0099-ec9c-42af-bbe8-54fe5b603bb3" (UID: "644f0099-ec9c-42af-bbe8-54fe5b603bb3"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.739083 4735 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/644f0099-ec9c-42af-bbe8-54fe5b603bb3-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.739112 4735 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/644f0099-ec9c-42af-bbe8-54fe5b603bb3-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.739123 4735 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/644f0099-ec9c-42af-bbe8-54fe5b603bb3-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.739145 4735 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on 
node \"crc\" " Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.739154 4735 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/644f0099-ec9c-42af-bbe8-54fe5b603bb3-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.739163 4735 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/644f0099-ec9c-42af-bbe8-54fe5b603bb3-pod-info\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.739171 4735 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/644f0099-ec9c-42af-bbe8-54fe5b603bb3-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.754724 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/644f0099-ec9c-42af-bbe8-54fe5b603bb3-config-data" (OuterVolumeSpecName: "config-data") pod "644f0099-ec9c-42af-bbe8-54fe5b603bb3" (UID: "644f0099-ec9c-42af-bbe8-54fe5b603bb3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.756862 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/644f0099-ec9c-42af-bbe8-54fe5b603bb3-kube-api-access-t82v4" (OuterVolumeSpecName: "kube-api-access-t82v4") pod "644f0099-ec9c-42af-bbe8-54fe5b603bb3" (UID: "644f0099-ec9c-42af-bbe8-54fe5b603bb3"). InnerVolumeSpecName "kube-api-access-t82v4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.766882 4735 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.770972 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/644f0099-ec9c-42af-bbe8-54fe5b603bb3-server-conf" (OuterVolumeSpecName: "server-conf") pod "644f0099-ec9c-42af-bbe8-54fe5b603bb3" (UID: "644f0099-ec9c-42af-bbe8-54fe5b603bb3"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.799227 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-77f649d57f-mlpp4" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.827225 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6/ovn-northd/0.log" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.827402 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6","Type":"ContainerDied","Data":"aacf215356088f4b69f6cb6c70b074703759a1862de3e2ca53addb55f251d2c3"} Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.827445 4735 scope.go:117] "RemoveContainer" containerID="14207a6e2a9596d194069fc196983081842e33d540bef5d9124e59a802e7d234" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.827575 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.840089 4735 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/644f0099-ec9c-42af-bbe8-54fe5b603bb3-server-conf\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.840118 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/644f0099-ec9c-42af-bbe8-54fe5b603bb3-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.840127 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t82v4\" (UniqueName: \"kubernetes.io/projected/644f0099-ec9c-42af-bbe8-54fe5b603bb3-kube-api-access-t82v4\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.840137 4735 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.840290 4735 generic.go:334] "Generic (PLEG): container finished" podID="477f295c-5e96-4f47-8caa-f7d36b074cdd" containerID="682901b4afd22fdea54fc528f8f21909ffe8ed3d9dfc12a8092cefe00176a51f" exitCode=0 Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.840349 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-77f649d57f-mlpp4" event={"ID":"477f295c-5e96-4f47-8caa-f7d36b074cdd","Type":"ContainerDied","Data":"682901b4afd22fdea54fc528f8f21909ffe8ed3d9dfc12a8092cefe00176a51f"} Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.840378 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-77f649d57f-mlpp4" event={"ID":"477f295c-5e96-4f47-8caa-f7d36b074cdd","Type":"ContainerDied","Data":"62d5911bcbac7b32b1863f013daed4d8870e040d748ea093af68ec1c437e8771"} Oct 08 13:19:27 crc 
kubenswrapper[4735]: I1008 13:19:27.840441 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-77f649d57f-mlpp4" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.844634 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/644f0099-ec9c-42af-bbe8-54fe5b603bb3-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "644f0099-ec9c-42af-bbe8-54fe5b603bb3" (UID: "644f0099-ec9c-42af-bbe8-54fe5b603bb3"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.853585 4735 generic.go:334] "Generic (PLEG): container finished" podID="f2428d97-aa22-413f-ab58-5b077fdde95c" containerID="d79545f041b4f7e6f0e27be8cc9df0ec5c12046d7e3a2b69869c06d06a4d0a52" exitCode=0 Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.853659 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"f2428d97-aa22-413f-ab58-5b077fdde95c","Type":"ContainerDied","Data":"d79545f041b4f7e6f0e27be8cc9df0ec5c12046d7e3a2b69869c06d06a4d0a52"} Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.853686 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"f2428d97-aa22-413f-ab58-5b077fdde95c","Type":"ContainerDied","Data":"4c4be7c93ac7be1aaa6951deaf23df23d213e01d8665f74d96003e4f40f780f6"} Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.853691 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.863964 4735 generic.go:334] "Generic (PLEG): container finished" podID="644f0099-ec9c-42af-bbe8-54fe5b603bb3" containerID="898693adc5b91bf624ecec806c4b7f6fbc4f40eb308ffd7f57a26fe3a69b9356" exitCode=0 Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.864015 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"644f0099-ec9c-42af-bbe8-54fe5b603bb3","Type":"ContainerDied","Data":"898693adc5b91bf624ecec806c4b7f6fbc4f40eb308ffd7f57a26fe3a69b9356"} Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.864034 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"644f0099-ec9c-42af-bbe8-54fe5b603bb3","Type":"ContainerDied","Data":"25bf65159126ff4da297a51df6ff72a110c317913e73192603856a8dc3503a86"} Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.864081 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.923864 4735 scope.go:117] "RemoveContainer" containerID="af561faafdd478b081c4107185420b4e9d3e3ef31f1f6fc09c0c8a499ec07cbe" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.927489 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.937248 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"] Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.942955 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/477f295c-5e96-4f47-8caa-f7d36b074cdd-config-data\") pod \"477f295c-5e96-4f47-8caa-f7d36b074cdd\" (UID: \"477f295c-5e96-4f47-8caa-f7d36b074cdd\") " Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.943009 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdz6w\" (UniqueName: \"kubernetes.io/projected/477f295c-5e96-4f47-8caa-f7d36b074cdd-kube-api-access-tdz6w\") pod \"477f295c-5e96-4f47-8caa-f7d36b074cdd\" (UID: \"477f295c-5e96-4f47-8caa-f7d36b074cdd\") " Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.943071 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/477f295c-5e96-4f47-8caa-f7d36b074cdd-scripts\") pod \"477f295c-5e96-4f47-8caa-f7d36b074cdd\" (UID: \"477f295c-5e96-4f47-8caa-f7d36b074cdd\") " Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.943122 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/477f295c-5e96-4f47-8caa-f7d36b074cdd-combined-ca-bundle\") pod \"477f295c-5e96-4f47-8caa-f7d36b074cdd\" (UID: \"477f295c-5e96-4f47-8caa-f7d36b074cdd\") " Oct 08 13:19:27 crc kubenswrapper[4735]: 
I1008 13:19:27.943214 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/477f295c-5e96-4f47-8caa-f7d36b074cdd-public-tls-certs\") pod \"477f295c-5e96-4f47-8caa-f7d36b074cdd\" (UID: \"477f295c-5e96-4f47-8caa-f7d36b074cdd\") " Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.943247 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/477f295c-5e96-4f47-8caa-f7d36b074cdd-fernet-keys\") pod \"477f295c-5e96-4f47-8caa-f7d36b074cdd\" (UID: \"477f295c-5e96-4f47-8caa-f7d36b074cdd\") " Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.943310 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/477f295c-5e96-4f47-8caa-f7d36b074cdd-credential-keys\") pod \"477f295c-5e96-4f47-8caa-f7d36b074cdd\" (UID: \"477f295c-5e96-4f47-8caa-f7d36b074cdd\") " Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.943346 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/477f295c-5e96-4f47-8caa-f7d36b074cdd-internal-tls-certs\") pod \"477f295c-5e96-4f47-8caa-f7d36b074cdd\" (UID: \"477f295c-5e96-4f47-8caa-f7d36b074cdd\") " Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.943653 4735 scope.go:117] "RemoveContainer" containerID="682901b4afd22fdea54fc528f8f21909ffe8ed3d9dfc12a8092cefe00176a51f" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.943742 4735 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/644f0099-ec9c-42af-bbe8-54fe5b603bb3-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.947183 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/477f295c-5e96-4f47-8caa-f7d36b074cdd-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "477f295c-5e96-4f47-8caa-f7d36b074cdd" (UID: "477f295c-5e96-4f47-8caa-f7d36b074cdd"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.952364 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/477f295c-5e96-4f47-8caa-f7d36b074cdd-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "477f295c-5e96-4f47-8caa-f7d36b074cdd" (UID: "477f295c-5e96-4f47-8caa-f7d36b074cdd"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.952830 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.956076 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/477f295c-5e96-4f47-8caa-f7d36b074cdd-scripts" (OuterVolumeSpecName: "scripts") pod "477f295c-5e96-4f47-8caa-f7d36b074cdd" (UID: "477f295c-5e96-4f47-8caa-f7d36b074cdd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.959005 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/477f295c-5e96-4f47-8caa-f7d36b074cdd-kube-api-access-tdz6w" (OuterVolumeSpecName: "kube-api-access-tdz6w") pod "477f295c-5e96-4f47-8caa-f7d36b074cdd" (UID: "477f295c-5e96-4f47-8caa-f7d36b074cdd"). InnerVolumeSpecName "kube-api-access-tdz6w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.966241 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.968104 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.973967 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.978004 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/477f295c-5e96-4f47-8caa-f7d36b074cdd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "477f295c-5e96-4f47-8caa-f7d36b074cdd" (UID: "477f295c-5e96-4f47-8caa-f7d36b074cdd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.983547 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/477f295c-5e96-4f47-8caa-f7d36b074cdd-config-data" (OuterVolumeSpecName: "config-data") pod "477f295c-5e96-4f47-8caa-f7d36b074cdd" (UID: "477f295c-5e96-4f47-8caa-f7d36b074cdd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.990315 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/477f295c-5e96-4f47-8caa-f7d36b074cdd-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "477f295c-5e96-4f47-8caa-f7d36b074cdd" (UID: "477f295c-5e96-4f47-8caa-f7d36b074cdd"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:27 crc kubenswrapper[4735]: I1008 13:19:27.992314 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/477f295c-5e96-4f47-8caa-f7d36b074cdd-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "477f295c-5e96-4f47-8caa-f7d36b074cdd" (UID: "477f295c-5e96-4f47-8caa-f7d36b074cdd"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:28 crc kubenswrapper[4735]: I1008 13:19:28.044993 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/477f295c-5e96-4f47-8caa-f7d36b074cdd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:28 crc kubenswrapper[4735]: I1008 13:19:28.045030 4735 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/477f295c-5e96-4f47-8caa-f7d36b074cdd-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:28 crc kubenswrapper[4735]: I1008 13:19:28.045051 4735 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/477f295c-5e96-4f47-8caa-f7d36b074cdd-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:28 crc kubenswrapper[4735]: I1008 13:19:28.045062 4735 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/477f295c-5e96-4f47-8caa-f7d36b074cdd-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:28 crc kubenswrapper[4735]: I1008 13:19:28.045076 4735 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/477f295c-5e96-4f47-8caa-f7d36b074cdd-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:28 crc kubenswrapper[4735]: I1008 13:19:28.045114 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/477f295c-5e96-4f47-8caa-f7d36b074cdd-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:28 crc kubenswrapper[4735]: I1008 13:19:28.045127 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tdz6w\" (UniqueName: \"kubernetes.io/projected/477f295c-5e96-4f47-8caa-f7d36b074cdd-kube-api-access-tdz6w\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:28 crc kubenswrapper[4735]: I1008 13:19:28.045142 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/477f295c-5e96-4f47-8caa-f7d36b074cdd-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:28 crc kubenswrapper[4735]: I1008 13:19:28.070502 4735 scope.go:117] "RemoveContainer" containerID="682901b4afd22fdea54fc528f8f21909ffe8ed3d9dfc12a8092cefe00176a51f" Oct 08 13:19:28 crc kubenswrapper[4735]: E1008 13:19:28.070907 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"682901b4afd22fdea54fc528f8f21909ffe8ed3d9dfc12a8092cefe00176a51f\": container with ID starting with 682901b4afd22fdea54fc528f8f21909ffe8ed3d9dfc12a8092cefe00176a51f not found: ID does not exist" containerID="682901b4afd22fdea54fc528f8f21909ffe8ed3d9dfc12a8092cefe00176a51f" Oct 08 13:19:28 crc kubenswrapper[4735]: I1008 13:19:28.071000 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"682901b4afd22fdea54fc528f8f21909ffe8ed3d9dfc12a8092cefe00176a51f"} err="failed to get container status \"682901b4afd22fdea54fc528f8f21909ffe8ed3d9dfc12a8092cefe00176a51f\": rpc error: code = NotFound desc = could not find container \"682901b4afd22fdea54fc528f8f21909ffe8ed3d9dfc12a8092cefe00176a51f\": container with ID starting with 682901b4afd22fdea54fc528f8f21909ffe8ed3d9dfc12a8092cefe00176a51f not found: ID does not exist" Oct 08 13:19:28 crc kubenswrapper[4735]: I1008 13:19:28.071091 4735 scope.go:117] "RemoveContainer" 
containerID="d79545f041b4f7e6f0e27be8cc9df0ec5c12046d7e3a2b69869c06d06a4d0a52" Oct 08 13:19:28 crc kubenswrapper[4735]: I1008 13:19:28.101491 4735 scope.go:117] "RemoveContainer" containerID="57d3f0e874e75cf6c5615fcd3798a1da6cbf40f7346cfb72e851d90feb4d8746" Oct 08 13:19:28 crc kubenswrapper[4735]: I1008 13:19:28.127892 4735 scope.go:117] "RemoveContainer" containerID="d79545f041b4f7e6f0e27be8cc9df0ec5c12046d7e3a2b69869c06d06a4d0a52" Oct 08 13:19:28 crc kubenswrapper[4735]: E1008 13:19:28.128330 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d79545f041b4f7e6f0e27be8cc9df0ec5c12046d7e3a2b69869c06d06a4d0a52\": container with ID starting with d79545f041b4f7e6f0e27be8cc9df0ec5c12046d7e3a2b69869c06d06a4d0a52 not found: ID does not exist" containerID="d79545f041b4f7e6f0e27be8cc9df0ec5c12046d7e3a2b69869c06d06a4d0a52" Oct 08 13:19:28 crc kubenswrapper[4735]: I1008 13:19:28.128366 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d79545f041b4f7e6f0e27be8cc9df0ec5c12046d7e3a2b69869c06d06a4d0a52"} err="failed to get container status \"d79545f041b4f7e6f0e27be8cc9df0ec5c12046d7e3a2b69869c06d06a4d0a52\": rpc error: code = NotFound desc = could not find container \"d79545f041b4f7e6f0e27be8cc9df0ec5c12046d7e3a2b69869c06d06a4d0a52\": container with ID starting with d79545f041b4f7e6f0e27be8cc9df0ec5c12046d7e3a2b69869c06d06a4d0a52 not found: ID does not exist" Oct 08 13:19:28 crc kubenswrapper[4735]: I1008 13:19:28.128392 4735 scope.go:117] "RemoveContainer" containerID="57d3f0e874e75cf6c5615fcd3798a1da6cbf40f7346cfb72e851d90feb4d8746" Oct 08 13:19:28 crc kubenswrapper[4735]: E1008 13:19:28.128689 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57d3f0e874e75cf6c5615fcd3798a1da6cbf40f7346cfb72e851d90feb4d8746\": container with ID starting with 
57d3f0e874e75cf6c5615fcd3798a1da6cbf40f7346cfb72e851d90feb4d8746 not found: ID does not exist" containerID="57d3f0e874e75cf6c5615fcd3798a1da6cbf40f7346cfb72e851d90feb4d8746" Oct 08 13:19:28 crc kubenswrapper[4735]: I1008 13:19:28.128813 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57d3f0e874e75cf6c5615fcd3798a1da6cbf40f7346cfb72e851d90feb4d8746"} err="failed to get container status \"57d3f0e874e75cf6c5615fcd3798a1da6cbf40f7346cfb72e851d90feb4d8746\": rpc error: code = NotFound desc = could not find container \"57d3f0e874e75cf6c5615fcd3798a1da6cbf40f7346cfb72e851d90feb4d8746\": container with ID starting with 57d3f0e874e75cf6c5615fcd3798a1da6cbf40f7346cfb72e851d90feb4d8746 not found: ID does not exist" Oct 08 13:19:28 crc kubenswrapper[4735]: I1008 13:19:28.128898 4735 scope.go:117] "RemoveContainer" containerID="898693adc5b91bf624ecec806c4b7f6fbc4f40eb308ffd7f57a26fe3a69b9356" Oct 08 13:19:28 crc kubenswrapper[4735]: I1008 13:19:28.146965 4735 scope.go:117] "RemoveContainer" containerID="08e5f43c73b04b459407c1ed506e3e085061ed6f53fe97fad46fc15c5a943ee0" Oct 08 13:19:28 crc kubenswrapper[4735]: I1008 13:19:28.175624 4735 scope.go:117] "RemoveContainer" containerID="898693adc5b91bf624ecec806c4b7f6fbc4f40eb308ffd7f57a26fe3a69b9356" Oct 08 13:19:28 crc kubenswrapper[4735]: E1008 13:19:28.176124 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"898693adc5b91bf624ecec806c4b7f6fbc4f40eb308ffd7f57a26fe3a69b9356\": container with ID starting with 898693adc5b91bf624ecec806c4b7f6fbc4f40eb308ffd7f57a26fe3a69b9356 not found: ID does not exist" containerID="898693adc5b91bf624ecec806c4b7f6fbc4f40eb308ffd7f57a26fe3a69b9356" Oct 08 13:19:28 crc kubenswrapper[4735]: I1008 13:19:28.176167 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"898693adc5b91bf624ecec806c4b7f6fbc4f40eb308ffd7f57a26fe3a69b9356"} 
err="failed to get container status \"898693adc5b91bf624ecec806c4b7f6fbc4f40eb308ffd7f57a26fe3a69b9356\": rpc error: code = NotFound desc = could not find container \"898693adc5b91bf624ecec806c4b7f6fbc4f40eb308ffd7f57a26fe3a69b9356\": container with ID starting with 898693adc5b91bf624ecec806c4b7f6fbc4f40eb308ffd7f57a26fe3a69b9356 not found: ID does not exist" Oct 08 13:19:28 crc kubenswrapper[4735]: I1008 13:19:28.176192 4735 scope.go:117] "RemoveContainer" containerID="08e5f43c73b04b459407c1ed506e3e085061ed6f53fe97fad46fc15c5a943ee0" Oct 08 13:19:28 crc kubenswrapper[4735]: E1008 13:19:28.176528 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08e5f43c73b04b459407c1ed506e3e085061ed6f53fe97fad46fc15c5a943ee0\": container with ID starting with 08e5f43c73b04b459407c1ed506e3e085061ed6f53fe97fad46fc15c5a943ee0 not found: ID does not exist" containerID="08e5f43c73b04b459407c1ed506e3e085061ed6f53fe97fad46fc15c5a943ee0" Oct 08 13:19:28 crc kubenswrapper[4735]: I1008 13:19:28.176547 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08e5f43c73b04b459407c1ed506e3e085061ed6f53fe97fad46fc15c5a943ee0"} err="failed to get container status \"08e5f43c73b04b459407c1ed506e3e085061ed6f53fe97fad46fc15c5a943ee0\": rpc error: code = NotFound desc = could not find container \"08e5f43c73b04b459407c1ed506e3e085061ed6f53fe97fad46fc15c5a943ee0\": container with ID starting with 08e5f43c73b04b459407c1ed506e3e085061ed6f53fe97fad46fc15c5a943ee0 not found: ID does not exist" Oct 08 13:19:28 crc kubenswrapper[4735]: I1008 13:19:28.177641 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-77f649d57f-mlpp4"] Oct 08 13:19:28 crc kubenswrapper[4735]: I1008 13:19:28.184720 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-77f649d57f-mlpp4"] Oct 08 13:19:28 crc kubenswrapper[4735]: I1008 13:19:28.544332 4735 
prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="01ad2819-cfc3-4416-8a27-ecb65c2fa366" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.103:5671: connect: connection refused" Oct 08 13:19:28 crc kubenswrapper[4735]: I1008 13:19:28.922422 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1db01523-cd76-4a9d-8063-e024dd919879" path="/var/lib/kubelet/pods/1db01523-cd76-4a9d-8063-e024dd919879/volumes" Oct 08 13:19:28 crc kubenswrapper[4735]: I1008 13:19:28.922982 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="477f295c-5e96-4f47-8caa-f7d36b074cdd" path="/var/lib/kubelet/pods/477f295c-5e96-4f47-8caa-f7d36b074cdd/volumes" Oct 08 13:19:28 crc kubenswrapper[4735]: I1008 13:19:28.923629 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="644f0099-ec9c-42af-bbe8-54fe5b603bb3" path="/var/lib/kubelet/pods/644f0099-ec9c-42af-bbe8-54fe5b603bb3/volumes" Oct 08 13:19:28 crc kubenswrapper[4735]: I1008 13:19:28.927142 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 13:19:28 crc kubenswrapper[4735]: I1008 13:19:28.929943 4735 generic.go:334] "Generic (PLEG): container finished" podID="01ad2819-cfc3-4416-8a27-ecb65c2fa366" containerID="18cff9aa331d4910576ccc409a19da98fba4e35b7118162bd1347fc1d99cf4af" exitCode=0 Oct 08 13:19:28 crc kubenswrapper[4735]: I1008 13:19:28.930145 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fe2bc1d-7e65-45ba-9d42-cae899859258" path="/var/lib/kubelet/pods/6fe2bc1d-7e65-45ba-9d42-cae899859258/volumes" Oct 08 13:19:28 crc kubenswrapper[4735]: I1008 13:19:28.930669 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97384a50-9f20-4cbf-8b02-afffe5ddc791" path="/var/lib/kubelet/pods/97384a50-9f20-4cbf-8b02-afffe5ddc791/volumes" Oct 08 13:19:28 crc kubenswrapper[4735]: I1008 13:19:28.931261 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8c473e5-0132-4ee7-a14e-ada0780d358c" path="/var/lib/kubelet/pods/b8c473e5-0132-4ee7-a14e-ada0780d358c/volumes" Oct 08 13:19:28 crc kubenswrapper[4735]: I1008 13:19:28.932392 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8e2103e-dd2c-4e18-b66b-0555a1c8f43a" path="/var/lib/kubelet/pods/b8e2103e-dd2c-4e18-b66b-0555a1c8f43a/volumes" Oct 08 13:19:28 crc kubenswrapper[4735]: I1008 13:19:28.934094 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca78485a-0626-420e-aa59-a9a4680df83c" path="/var/lib/kubelet/pods/ca78485a-0626-420e-aa59-a9a4680df83c/volumes" Oct 08 13:19:28 crc kubenswrapper[4735]: I1008 13:19:28.934686 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6" path="/var/lib/kubelet/pods/e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6/volumes" Oct 08 13:19:28 crc kubenswrapper[4735]: I1008 13:19:28.938244 4735 generic.go:334] "Generic (PLEG): container finished" podID="9b111975-076c-4c7f-bcbd-ecef0b62936f" 
containerID="9f5626323fe1058e9c1fda7509e11efd3640670cee71f3ea2717d062bae876cf" exitCode=0 Oct 08 13:19:28 crc kubenswrapper[4735]: I1008 13:19:28.941209 4735 generic.go:334] "Generic (PLEG): container finished" podID="3b1303e4-c7b5-456a-aba6-c4674b8bcfed" containerID="b645519406061dad8c7f4d39599dd4ab3d8adc1c94d6ca4f85885caf6312b12a" exitCode=0 Oct 08 13:19:28 crc kubenswrapper[4735]: I1008 13:19:28.941288 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 13:19:28 crc kubenswrapper[4735]: I1008 13:19:28.943208 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2428d97-aa22-413f-ab58-5b077fdde95c" path="/var/lib/kubelet/pods/f2428d97-aa22-413f-ab58-5b077fdde95c/volumes" Oct 08 13:19:28 crc kubenswrapper[4735]: I1008 13:19:28.943802 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6927451-d206-4948-a43e-5b4ee59dddcc" path="/var/lib/kubelet/pods/f6927451-d206-4948-a43e-5b4ee59dddcc/volumes" Oct 08 13:19:28 crc kubenswrapper[4735]: I1008 13:19:28.944211 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f92cd291-9315-4b44-b318-6f029f253de6" path="/var/lib/kubelet/pods/f92cd291-9315-4b44-b318-6f029f253de6/volumes" Oct 08 13:19:28 crc kubenswrapper[4735]: I1008 13:19:28.945777 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f952e88e-fe68-4770-874c-fdee88d43aee" path="/var/lib/kubelet/pods/f952e88e-fe68-4770-874c-fdee88d43aee/volumes" Oct 08 13:19:28 crc kubenswrapper[4735]: I1008 13:19:28.946219 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"01ad2819-cfc3-4416-8a27-ecb65c2fa366","Type":"ContainerDied","Data":"18cff9aa331d4910576ccc409a19da98fba4e35b7118162bd1347fc1d99cf4af"} Oct 08 13:19:28 crc kubenswrapper[4735]: I1008 13:19:28.946250 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-keystone-listener-69dd69556d-pvx5r" event={"ID":"9b111975-076c-4c7f-bcbd-ecef0b62936f","Type":"ContainerDied","Data":"9f5626323fe1058e9c1fda7509e11efd3640670cee71f3ea2717d062bae876cf"} Oct 08 13:19:28 crc kubenswrapper[4735]: I1008 13:19:28.946267 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3b1303e4-c7b5-456a-aba6-c4674b8bcfed","Type":"ContainerDied","Data":"b645519406061dad8c7f4d39599dd4ab3d8adc1c94d6ca4f85885caf6312b12a"} Oct 08 13:19:28 crc kubenswrapper[4735]: I1008 13:19:28.946288 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3b1303e4-c7b5-456a-aba6-c4674b8bcfed","Type":"ContainerDied","Data":"7668d77282c181473f5eb368ac24c8d61b06b230a0b7cf3fc2ba2ce2e0efa3dc"} Oct 08 13:19:28 crc kubenswrapper[4735]: I1008 13:19:28.946308 4735 scope.go:117] "RemoveContainer" containerID="ca8e40a239770c1ac13452e47ffb1eb94e1acfb51391b3e3bf143cf183113523" Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.018906 4735 scope.go:117] "RemoveContainer" containerID="8f014700eb4b62d55701fa790058c2874eeee602afa6352080d545ba869c85bf" Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.061429 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3b1303e4-c7b5-456a-aba6-c4674b8bcfed-sg-core-conf-yaml\") pod \"3b1303e4-c7b5-456a-aba6-c4674b8bcfed\" (UID: \"3b1303e4-c7b5-456a-aba6-c4674b8bcfed\") " Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.061547 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52dqk\" (UniqueName: \"kubernetes.io/projected/3b1303e4-c7b5-456a-aba6-c4674b8bcfed-kube-api-access-52dqk\") pod \"3b1303e4-c7b5-456a-aba6-c4674b8bcfed\" (UID: \"3b1303e4-c7b5-456a-aba6-c4674b8bcfed\") " Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.061669 4735 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b1303e4-c7b5-456a-aba6-c4674b8bcfed-log-httpd\") pod \"3b1303e4-c7b5-456a-aba6-c4674b8bcfed\" (UID: \"3b1303e4-c7b5-456a-aba6-c4674b8bcfed\") " Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.061712 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b1303e4-c7b5-456a-aba6-c4674b8bcfed-ceilometer-tls-certs\") pod \"3b1303e4-c7b5-456a-aba6-c4674b8bcfed\" (UID: \"3b1303e4-c7b5-456a-aba6-c4674b8bcfed\") " Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.061759 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b1303e4-c7b5-456a-aba6-c4674b8bcfed-combined-ca-bundle\") pod \"3b1303e4-c7b5-456a-aba6-c4674b8bcfed\" (UID: \"3b1303e4-c7b5-456a-aba6-c4674b8bcfed\") " Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.061778 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b1303e4-c7b5-456a-aba6-c4674b8bcfed-run-httpd\") pod \"3b1303e4-c7b5-456a-aba6-c4674b8bcfed\" (UID: \"3b1303e4-c7b5-456a-aba6-c4674b8bcfed\") " Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.061798 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b1303e4-c7b5-456a-aba6-c4674b8bcfed-scripts\") pod \"3b1303e4-c7b5-456a-aba6-c4674b8bcfed\" (UID: \"3b1303e4-c7b5-456a-aba6-c4674b8bcfed\") " Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.061828 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b1303e4-c7b5-456a-aba6-c4674b8bcfed-config-data\") pod \"3b1303e4-c7b5-456a-aba6-c4674b8bcfed\" (UID: \"3b1303e4-c7b5-456a-aba6-c4674b8bcfed\") " 
Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.083629 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b1303e4-c7b5-456a-aba6-c4674b8bcfed-kube-api-access-52dqk" (OuterVolumeSpecName: "kube-api-access-52dqk") pod "3b1303e4-c7b5-456a-aba6-c4674b8bcfed" (UID: "3b1303e4-c7b5-456a-aba6-c4674b8bcfed"). InnerVolumeSpecName "kube-api-access-52dqk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.084028 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b1303e4-c7b5-456a-aba6-c4674b8bcfed-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3b1303e4-c7b5-456a-aba6-c4674b8bcfed" (UID: "3b1303e4-c7b5-456a-aba6-c4674b8bcfed"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.084872 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b1303e4-c7b5-456a-aba6-c4674b8bcfed-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3b1303e4-c7b5-456a-aba6-c4674b8bcfed" (UID: "3b1303e4-c7b5-456a-aba6-c4674b8bcfed"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.085060 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b1303e4-c7b5-456a-aba6-c4674b8bcfed-scripts" (OuterVolumeSpecName: "scripts") pod "3b1303e4-c7b5-456a-aba6-c4674b8bcfed" (UID: "3b1303e4-c7b5-456a-aba6-c4674b8bcfed"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.163159 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52dqk\" (UniqueName: \"kubernetes.io/projected/3b1303e4-c7b5-456a-aba6-c4674b8bcfed-kube-api-access-52dqk\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.163186 4735 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b1303e4-c7b5-456a-aba6-c4674b8bcfed-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.163195 4735 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b1303e4-c7b5-456a-aba6-c4674b8bcfed-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.163203 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b1303e4-c7b5-456a-aba6-c4674b8bcfed-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.163514 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-69dd69556d-pvx5r" Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.165857 4735 scope.go:117] "RemoveContainer" containerID="b645519406061dad8c7f4d39599dd4ab3d8adc1c94d6ca4f85885caf6312b12a" Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.200793 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b1303e4-c7b5-456a-aba6-c4674b8bcfed-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3b1303e4-c7b5-456a-aba6-c4674b8bcfed" (UID: "3b1303e4-c7b5-456a-aba6-c4674b8bcfed"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.206796 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b1303e4-c7b5-456a-aba6-c4674b8bcfed-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "3b1303e4-c7b5-456a-aba6-c4674b8bcfed" (UID: "3b1303e4-c7b5-456a-aba6-c4674b8bcfed"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.207033 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b1303e4-c7b5-456a-aba6-c4674b8bcfed-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3b1303e4-c7b5-456a-aba6-c4674b8bcfed" (UID: "3b1303e4-c7b5-456a-aba6-c4674b8bcfed"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.244045 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b1303e4-c7b5-456a-aba6-c4674b8bcfed-config-data" (OuterVolumeSpecName: "config-data") pod "3b1303e4-c7b5-456a-aba6-c4674b8bcfed" (UID: "3b1303e4-c7b5-456a-aba6-c4674b8bcfed"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.244356 4735 scope.go:117] "RemoveContainer" containerID="0817e0e9a63d415352253a5caafcd20e3b3968eec1f9e4e4f0c7f66ba7ae9ff8" Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.264376 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b111975-076c-4c7f-bcbd-ecef0b62936f-logs\") pod \"9b111975-076c-4c7f-bcbd-ecef0b62936f\" (UID: \"9b111975-076c-4c7f-bcbd-ecef0b62936f\") " Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.264431 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfj9r\" (UniqueName: \"kubernetes.io/projected/9b111975-076c-4c7f-bcbd-ecef0b62936f-kube-api-access-kfj9r\") pod \"9b111975-076c-4c7f-bcbd-ecef0b62936f\" (UID: \"9b111975-076c-4c7f-bcbd-ecef0b62936f\") " Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.264991 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b111975-076c-4c7f-bcbd-ecef0b62936f-config-data\") pod \"9b111975-076c-4c7f-bcbd-ecef0b62936f\" (UID: \"9b111975-076c-4c7f-bcbd-ecef0b62936f\") " Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.265062 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9b111975-076c-4c7f-bcbd-ecef0b62936f-config-data-custom\") pod \"9b111975-076c-4c7f-bcbd-ecef0b62936f\" (UID: \"9b111975-076c-4c7f-bcbd-ecef0b62936f\") " Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.265087 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b111975-076c-4c7f-bcbd-ecef0b62936f-combined-ca-bundle\") pod \"9b111975-076c-4c7f-bcbd-ecef0b62936f\" (UID: \"9b111975-076c-4c7f-bcbd-ecef0b62936f\") " 
Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.265166 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b111975-076c-4c7f-bcbd-ecef0b62936f-logs" (OuterVolumeSpecName: "logs") pod "9b111975-076c-4c7f-bcbd-ecef0b62936f" (UID: "9b111975-076c-4c7f-bcbd-ecef0b62936f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.265548 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b111975-076c-4c7f-bcbd-ecef0b62936f-logs\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.265564 4735 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b1303e4-c7b5-456a-aba6-c4674b8bcfed-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.265577 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b1303e4-c7b5-456a-aba6-c4674b8bcfed-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.265586 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b1303e4-c7b5-456a-aba6-c4674b8bcfed-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.265594 4735 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3b1303e4-c7b5-456a-aba6-c4674b8bcfed-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.268365 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b111975-076c-4c7f-bcbd-ecef0b62936f-kube-api-access-kfj9r" (OuterVolumeSpecName: 
"kube-api-access-kfj9r") pod "9b111975-076c-4c7f-bcbd-ecef0b62936f" (UID: "9b111975-076c-4c7f-bcbd-ecef0b62936f"). InnerVolumeSpecName "kube-api-access-kfj9r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.274573 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b111975-076c-4c7f-bcbd-ecef0b62936f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9b111975-076c-4c7f-bcbd-ecef0b62936f" (UID: "9b111975-076c-4c7f-bcbd-ecef0b62936f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.285726 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b111975-076c-4c7f-bcbd-ecef0b62936f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9b111975-076c-4c7f-bcbd-ecef0b62936f" (UID: "9b111975-076c-4c7f-bcbd-ecef0b62936f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.297148 4735 scope.go:117] "RemoveContainer" containerID="ca8e40a239770c1ac13452e47ffb1eb94e1acfb51391b3e3bf143cf183113523" Oct 08 13:19:29 crc kubenswrapper[4735]: E1008 13:19:29.299796 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca8e40a239770c1ac13452e47ffb1eb94e1acfb51391b3e3bf143cf183113523\": container with ID starting with ca8e40a239770c1ac13452e47ffb1eb94e1acfb51391b3e3bf143cf183113523 not found: ID does not exist" containerID="ca8e40a239770c1ac13452e47ffb1eb94e1acfb51391b3e3bf143cf183113523" Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.299968 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca8e40a239770c1ac13452e47ffb1eb94e1acfb51391b3e3bf143cf183113523"} err="failed to get container status \"ca8e40a239770c1ac13452e47ffb1eb94e1acfb51391b3e3bf143cf183113523\": rpc error: code = NotFound desc = could not find container \"ca8e40a239770c1ac13452e47ffb1eb94e1acfb51391b3e3bf143cf183113523\": container with ID starting with ca8e40a239770c1ac13452e47ffb1eb94e1acfb51391b3e3bf143cf183113523 not found: ID does not exist" Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.300087 4735 scope.go:117] "RemoveContainer" containerID="8f014700eb4b62d55701fa790058c2874eeee602afa6352080d545ba869c85bf" Oct 08 13:19:29 crc kubenswrapper[4735]: E1008 13:19:29.300575 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f014700eb4b62d55701fa790058c2874eeee602afa6352080d545ba869c85bf\": container with ID starting with 8f014700eb4b62d55701fa790058c2874eeee602afa6352080d545ba869c85bf not found: ID does not exist" containerID="8f014700eb4b62d55701fa790058c2874eeee602afa6352080d545ba869c85bf" Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.300673 
4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f014700eb4b62d55701fa790058c2874eeee602afa6352080d545ba869c85bf"} err="failed to get container status \"8f014700eb4b62d55701fa790058c2874eeee602afa6352080d545ba869c85bf\": rpc error: code = NotFound desc = could not find container \"8f014700eb4b62d55701fa790058c2874eeee602afa6352080d545ba869c85bf\": container with ID starting with 8f014700eb4b62d55701fa790058c2874eeee602afa6352080d545ba869c85bf not found: ID does not exist" Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.300795 4735 scope.go:117] "RemoveContainer" containerID="b645519406061dad8c7f4d39599dd4ab3d8adc1c94d6ca4f85885caf6312b12a" Oct 08 13:19:29 crc kubenswrapper[4735]: E1008 13:19:29.301281 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b645519406061dad8c7f4d39599dd4ab3d8adc1c94d6ca4f85885caf6312b12a\": container with ID starting with b645519406061dad8c7f4d39599dd4ab3d8adc1c94d6ca4f85885caf6312b12a not found: ID does not exist" containerID="b645519406061dad8c7f4d39599dd4ab3d8adc1c94d6ca4f85885caf6312b12a" Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.301320 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b645519406061dad8c7f4d39599dd4ab3d8adc1c94d6ca4f85885caf6312b12a"} err="failed to get container status \"b645519406061dad8c7f4d39599dd4ab3d8adc1c94d6ca4f85885caf6312b12a\": rpc error: code = NotFound desc = could not find container \"b645519406061dad8c7f4d39599dd4ab3d8adc1c94d6ca4f85885caf6312b12a\": container with ID starting with b645519406061dad8c7f4d39599dd4ab3d8adc1c94d6ca4f85885caf6312b12a not found: ID does not exist" Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.301348 4735 scope.go:117] "RemoveContainer" containerID="0817e0e9a63d415352253a5caafcd20e3b3968eec1f9e4e4f0c7f66ba7ae9ff8" Oct 08 13:19:29 crc kubenswrapper[4735]: E1008 
13:19:29.301738 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0817e0e9a63d415352253a5caafcd20e3b3968eec1f9e4e4f0c7f66ba7ae9ff8\": container with ID starting with 0817e0e9a63d415352253a5caafcd20e3b3968eec1f9e4e4f0c7f66ba7ae9ff8 not found: ID does not exist" containerID="0817e0e9a63d415352253a5caafcd20e3b3968eec1f9e4e4f0c7f66ba7ae9ff8" Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.302187 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0817e0e9a63d415352253a5caafcd20e3b3968eec1f9e4e4f0c7f66ba7ae9ff8"} err="failed to get container status \"0817e0e9a63d415352253a5caafcd20e3b3968eec1f9e4e4f0c7f66ba7ae9ff8\": rpc error: code = NotFound desc = could not find container \"0817e0e9a63d415352253a5caafcd20e3b3968eec1f9e4e4f0c7f66ba7ae9ff8\": container with ID starting with 0817e0e9a63d415352253a5caafcd20e3b3968eec1f9e4e4f0c7f66ba7ae9ff8 not found: ID does not exist" Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.303007 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b111975-076c-4c7f-bcbd-ecef0b62936f-config-data" (OuterVolumeSpecName: "config-data") pod "9b111975-076c-4c7f-bcbd-ecef0b62936f" (UID: "9b111975-076c-4c7f-bcbd-ecef0b62936f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.320726 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.367079 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfj9r\" (UniqueName: \"kubernetes.io/projected/9b111975-076c-4c7f-bcbd-ecef0b62936f-kube-api-access-kfj9r\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.367118 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b111975-076c-4c7f-bcbd-ecef0b62936f-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.367131 4735 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9b111975-076c-4c7f-bcbd-ecef0b62936f-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.367143 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b111975-076c-4c7f-bcbd-ecef0b62936f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.468489 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/01ad2819-cfc3-4416-8a27-ecb65c2fa366-rabbitmq-tls\") pod \"01ad2819-cfc3-4416-8a27-ecb65c2fa366\" (UID: \"01ad2819-cfc3-4416-8a27-ecb65c2fa366\") " Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.468538 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/01ad2819-cfc3-4416-8a27-ecb65c2fa366-plugins-conf\") pod \"01ad2819-cfc3-4416-8a27-ecb65c2fa366\" (UID: \"01ad2819-cfc3-4416-8a27-ecb65c2fa366\") " Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.468597 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/01ad2819-cfc3-4416-8a27-ecb65c2fa366-rabbitmq-erlang-cookie\") pod \"01ad2819-cfc3-4416-8a27-ecb65c2fa366\" (UID: \"01ad2819-cfc3-4416-8a27-ecb65c2fa366\") " Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.468655 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/01ad2819-cfc3-4416-8a27-ecb65c2fa366-config-data\") pod \"01ad2819-cfc3-4416-8a27-ecb65c2fa366\" (UID: \"01ad2819-cfc3-4416-8a27-ecb65c2fa366\") " Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.468677 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"01ad2819-cfc3-4416-8a27-ecb65c2fa366\" (UID: \"01ad2819-cfc3-4416-8a27-ecb65c2fa366\") " Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.468701 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/01ad2819-cfc3-4416-8a27-ecb65c2fa366-rabbitmq-plugins\") pod \"01ad2819-cfc3-4416-8a27-ecb65c2fa366\" (UID: \"01ad2819-cfc3-4416-8a27-ecb65c2fa366\") " Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.468727 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkx52\" (UniqueName: \"kubernetes.io/projected/01ad2819-cfc3-4416-8a27-ecb65c2fa366-kube-api-access-bkx52\") pod \"01ad2819-cfc3-4416-8a27-ecb65c2fa366\" (UID: \"01ad2819-cfc3-4416-8a27-ecb65c2fa366\") " Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.468769 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/01ad2819-cfc3-4416-8a27-ecb65c2fa366-pod-info\") pod \"01ad2819-cfc3-4416-8a27-ecb65c2fa366\" (UID: \"01ad2819-cfc3-4416-8a27-ecb65c2fa366\") " Oct 08 13:19:29 crc 
kubenswrapper[4735]: I1008 13:19:29.468802 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/01ad2819-cfc3-4416-8a27-ecb65c2fa366-server-conf\") pod \"01ad2819-cfc3-4416-8a27-ecb65c2fa366\" (UID: \"01ad2819-cfc3-4416-8a27-ecb65c2fa366\") " Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.468832 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/01ad2819-cfc3-4416-8a27-ecb65c2fa366-erlang-cookie-secret\") pod \"01ad2819-cfc3-4416-8a27-ecb65c2fa366\" (UID: \"01ad2819-cfc3-4416-8a27-ecb65c2fa366\") " Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.468870 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/01ad2819-cfc3-4416-8a27-ecb65c2fa366-rabbitmq-confd\") pod \"01ad2819-cfc3-4416-8a27-ecb65c2fa366\" (UID: \"01ad2819-cfc3-4416-8a27-ecb65c2fa366\") " Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.470318 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ad2819-cfc3-4416-8a27-ecb65c2fa366-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "01ad2819-cfc3-4416-8a27-ecb65c2fa366" (UID: "01ad2819-cfc3-4416-8a27-ecb65c2fa366"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.470684 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01ad2819-cfc3-4416-8a27-ecb65c2fa366-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "01ad2819-cfc3-4416-8a27-ecb65c2fa366" (UID: "01ad2819-cfc3-4416-8a27-ecb65c2fa366"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.470784 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01ad2819-cfc3-4416-8a27-ecb65c2fa366-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "01ad2819-cfc3-4416-8a27-ecb65c2fa366" (UID: "01ad2819-cfc3-4416-8a27-ecb65c2fa366"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.481676 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "01ad2819-cfc3-4416-8a27-ecb65c2fa366" (UID: "01ad2819-cfc3-4416-8a27-ecb65c2fa366"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.481675 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/01ad2819-cfc3-4416-8a27-ecb65c2fa366-pod-info" (OuterVolumeSpecName: "pod-info") pod "01ad2819-cfc3-4416-8a27-ecb65c2fa366" (UID: "01ad2819-cfc3-4416-8a27-ecb65c2fa366"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.481675 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ad2819-cfc3-4416-8a27-ecb65c2fa366-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "01ad2819-cfc3-4416-8a27-ecb65c2fa366" (UID: "01ad2819-cfc3-4416-8a27-ecb65c2fa366"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.481736 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ad2819-cfc3-4416-8a27-ecb65c2fa366-kube-api-access-bkx52" (OuterVolumeSpecName: "kube-api-access-bkx52") pod "01ad2819-cfc3-4416-8a27-ecb65c2fa366" (UID: "01ad2819-cfc3-4416-8a27-ecb65c2fa366"). InnerVolumeSpecName "kube-api-access-bkx52". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.481772 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ad2819-cfc3-4416-8a27-ecb65c2fa366-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "01ad2819-cfc3-4416-8a27-ecb65c2fa366" (UID: "01ad2819-cfc3-4416-8a27-ecb65c2fa366"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.495908 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ad2819-cfc3-4416-8a27-ecb65c2fa366-config-data" (OuterVolumeSpecName: "config-data") pod "01ad2819-cfc3-4416-8a27-ecb65c2fa366" (UID: "01ad2819-cfc3-4416-8a27-ecb65c2fa366"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.522537 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ad2819-cfc3-4416-8a27-ecb65c2fa366-server-conf" (OuterVolumeSpecName: "server-conf") pod "01ad2819-cfc3-4416-8a27-ecb65c2fa366" (UID: "01ad2819-cfc3-4416-8a27-ecb65c2fa366"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.562195 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ad2819-cfc3-4416-8a27-ecb65c2fa366-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "01ad2819-cfc3-4416-8a27-ecb65c2fa366" (UID: "01ad2819-cfc3-4416-8a27-ecb65c2fa366"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.570121 4735 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/01ad2819-cfc3-4416-8a27-ecb65c2fa366-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.570141 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/01ad2819-cfc3-4416-8a27-ecb65c2fa366-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.570686 4735 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.570714 4735 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/01ad2819-cfc3-4416-8a27-ecb65c2fa366-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.570747 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bkx52\" (UniqueName: \"kubernetes.io/projected/01ad2819-cfc3-4416-8a27-ecb65c2fa366-kube-api-access-bkx52\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.570759 4735 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/01ad2819-cfc3-4416-8a27-ecb65c2fa366-pod-info\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.570770 4735 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/01ad2819-cfc3-4416-8a27-ecb65c2fa366-server-conf\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.570778 4735 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/01ad2819-cfc3-4416-8a27-ecb65c2fa366-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.570787 4735 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/01ad2819-cfc3-4416-8a27-ecb65c2fa366-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.570794 4735 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/01ad2819-cfc3-4416-8a27-ecb65c2fa366-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.570802 4735 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/01ad2819-cfc3-4416-8a27-ecb65c2fa366-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.584927 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.592822 4735 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.600058 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 08 13:19:29 crc 
kubenswrapper[4735]: I1008 13:19:29.672439 4735 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.983735 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"01ad2819-cfc3-4416-8a27-ecb65c2fa366","Type":"ContainerDied","Data":"b9112f5686bf87b44a22cb2a26b7ecaabb68baa979c8ffde3f8994321551ccf2"} Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.983786 4735 scope.go:117] "RemoveContainer" containerID="18cff9aa331d4910576ccc409a19da98fba4e35b7118162bd1347fc1d99cf4af" Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.983959 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.987627 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-69dd69556d-pvx5r" Oct 08 13:19:29 crc kubenswrapper[4735]: I1008 13:19:29.989003 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-69dd69556d-pvx5r" event={"ID":"9b111975-076c-4c7f-bcbd-ecef0b62936f","Type":"ContainerDied","Data":"f112d9a58052e441123ebba81af5d1f30ab0a8da3f60511d56153347d6a0720c"} Oct 08 13:19:30 crc kubenswrapper[4735]: I1008 13:19:30.012409 4735 scope.go:117] "RemoveContainer" containerID="8b5489cc563b906768d49040d0dc9d8ee683e0a1a8c2ad6e11328dcf2b18d053" Oct 08 13:19:30 crc kubenswrapper[4735]: I1008 13:19:30.033337 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 08 13:19:30 crc kubenswrapper[4735]: I1008 13:19:30.050666 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 08 13:19:30 crc kubenswrapper[4735]: I1008 13:19:30.067393 4735 scope.go:117] "RemoveContainer" containerID="9f5626323fe1058e9c1fda7509e11efd3640670cee71f3ea2717d062bae876cf" Oct 08 13:19:30 crc kubenswrapper[4735]: I1008 13:19:30.068201 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-69dd69556d-pvx5r"] Oct 08 13:19:30 crc kubenswrapper[4735]: I1008 13:19:30.076934 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-69dd69556d-pvx5r"] Oct 08 13:19:30 crc kubenswrapper[4735]: I1008 13:19:30.088799 4735 scope.go:117] "RemoveContainer" containerID="d5b88355cbab2ef2c4641f67a06c04f010a0ffe1ebf2da8598d0977f7187066a" Oct 08 13:19:30 crc kubenswrapper[4735]: I1008 13:19:30.907665 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ad2819-cfc3-4416-8a27-ecb65c2fa366" path="/var/lib/kubelet/pods/01ad2819-cfc3-4416-8a27-ecb65c2fa366/volumes" Oct 08 13:19:30 crc kubenswrapper[4735]: I1008 13:19:30.909343 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="3b1303e4-c7b5-456a-aba6-c4674b8bcfed" path="/var/lib/kubelet/pods/3b1303e4-c7b5-456a-aba6-c4674b8bcfed/volumes" Oct 08 13:19:30 crc kubenswrapper[4735]: I1008 13:19:30.910794 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b111975-076c-4c7f-bcbd-ecef0b62936f" path="/var/lib/kubelet/pods/9b111975-076c-4c7f-bcbd-ecef0b62936f/volumes" Oct 08 13:19:31 crc kubenswrapper[4735]: I1008 13:19:31.018246 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/kube-state-metrics-0" podUID="b8e2103e-dd2c-4e18-b66b-0555a1c8f43a" containerName="kube-state-metrics" probeResult="failure" output="Get \"https://10.217.0.171:8081/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 08 13:19:31 crc kubenswrapper[4735]: I1008 13:19:31.755291 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5574db6b6f-94l7x" Oct 08 13:19:31 crc kubenswrapper[4735]: I1008 13:19:31.933012 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/74ecad81-3dfd-4f36-be79-699b6bd30e90-ovndb-tls-certs\") pod \"74ecad81-3dfd-4f36-be79-699b6bd30e90\" (UID: \"74ecad81-3dfd-4f36-be79-699b6bd30e90\") " Oct 08 13:19:31 crc kubenswrapper[4735]: I1008 13:19:31.933076 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/74ecad81-3dfd-4f36-be79-699b6bd30e90-config\") pod \"74ecad81-3dfd-4f36-be79-699b6bd30e90\" (UID: \"74ecad81-3dfd-4f36-be79-699b6bd30e90\") " Oct 08 13:19:31 crc kubenswrapper[4735]: I1008 13:19:31.933135 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74ecad81-3dfd-4f36-be79-699b6bd30e90-combined-ca-bundle\") pod \"74ecad81-3dfd-4f36-be79-699b6bd30e90\" (UID: 
\"74ecad81-3dfd-4f36-be79-699b6bd30e90\") " Oct 08 13:19:31 crc kubenswrapper[4735]: I1008 13:19:31.933211 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/74ecad81-3dfd-4f36-be79-699b6bd30e90-public-tls-certs\") pod \"74ecad81-3dfd-4f36-be79-699b6bd30e90\" (UID: \"74ecad81-3dfd-4f36-be79-699b6bd30e90\") " Oct 08 13:19:31 crc kubenswrapper[4735]: I1008 13:19:31.933241 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/74ecad81-3dfd-4f36-be79-699b6bd30e90-internal-tls-certs\") pod \"74ecad81-3dfd-4f36-be79-699b6bd30e90\" (UID: \"74ecad81-3dfd-4f36-be79-699b6bd30e90\") " Oct 08 13:19:31 crc kubenswrapper[4735]: I1008 13:19:31.933272 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjfqt\" (UniqueName: \"kubernetes.io/projected/74ecad81-3dfd-4f36-be79-699b6bd30e90-kube-api-access-rjfqt\") pod \"74ecad81-3dfd-4f36-be79-699b6bd30e90\" (UID: \"74ecad81-3dfd-4f36-be79-699b6bd30e90\") " Oct 08 13:19:31 crc kubenswrapper[4735]: I1008 13:19:31.933303 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/74ecad81-3dfd-4f36-be79-699b6bd30e90-httpd-config\") pod \"74ecad81-3dfd-4f36-be79-699b6bd30e90\" (UID: \"74ecad81-3dfd-4f36-be79-699b6bd30e90\") " Oct 08 13:19:31 crc kubenswrapper[4735]: I1008 13:19:31.949647 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74ecad81-3dfd-4f36-be79-699b6bd30e90-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "74ecad81-3dfd-4f36-be79-699b6bd30e90" (UID: "74ecad81-3dfd-4f36-be79-699b6bd30e90"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:31 crc kubenswrapper[4735]: I1008 13:19:31.949680 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74ecad81-3dfd-4f36-be79-699b6bd30e90-kube-api-access-rjfqt" (OuterVolumeSpecName: "kube-api-access-rjfqt") pod "74ecad81-3dfd-4f36-be79-699b6bd30e90" (UID: "74ecad81-3dfd-4f36-be79-699b6bd30e90"). InnerVolumeSpecName "kube-api-access-rjfqt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:19:31 crc kubenswrapper[4735]: I1008 13:19:31.980627 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74ecad81-3dfd-4f36-be79-699b6bd30e90-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "74ecad81-3dfd-4f36-be79-699b6bd30e90" (UID: "74ecad81-3dfd-4f36-be79-699b6bd30e90"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:31 crc kubenswrapper[4735]: I1008 13:19:31.988812 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74ecad81-3dfd-4f36-be79-699b6bd30e90-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "74ecad81-3dfd-4f36-be79-699b6bd30e90" (UID: "74ecad81-3dfd-4f36-be79-699b6bd30e90"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:32 crc kubenswrapper[4735]: I1008 13:19:32.008664 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74ecad81-3dfd-4f36-be79-699b6bd30e90-config" (OuterVolumeSpecName: "config") pod "74ecad81-3dfd-4f36-be79-699b6bd30e90" (UID: "74ecad81-3dfd-4f36-be79-699b6bd30e90"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:32 crc kubenswrapper[4735]: I1008 13:19:32.032760 4735 generic.go:334] "Generic (PLEG): container finished" podID="74ecad81-3dfd-4f36-be79-699b6bd30e90" containerID="640b5b25fef4ad0714f6e184d4c2f8a8cea54a0bf1b5cd1e9e4357dfb3d2f543" exitCode=0 Oct 08 13:19:32 crc kubenswrapper[4735]: I1008 13:19:32.032810 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5574db6b6f-94l7x" event={"ID":"74ecad81-3dfd-4f36-be79-699b6bd30e90","Type":"ContainerDied","Data":"640b5b25fef4ad0714f6e184d4c2f8a8cea54a0bf1b5cd1e9e4357dfb3d2f543"} Oct 08 13:19:32 crc kubenswrapper[4735]: I1008 13:19:32.032841 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5574db6b6f-94l7x" event={"ID":"74ecad81-3dfd-4f36-be79-699b6bd30e90","Type":"ContainerDied","Data":"3ab3e7a6b765a6de193c1b1dea7e1331e5f6be760afeb37f0a1e52c4bc595d44"} Oct 08 13:19:32 crc kubenswrapper[4735]: I1008 13:19:32.032862 4735 scope.go:117] "RemoveContainer" containerID="b2ed1987ab45001110bf36a72fc3eb614ca1c3f8a88b1bff605a3d0de79aba7d" Oct 08 13:19:32 crc kubenswrapper[4735]: I1008 13:19:32.032966 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5574db6b6f-94l7x" Oct 08 13:19:32 crc kubenswrapper[4735]: I1008 13:19:32.035302 4735 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/74ecad81-3dfd-4f36-be79-699b6bd30e90-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:32 crc kubenswrapper[4735]: I1008 13:19:32.035331 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/74ecad81-3dfd-4f36-be79-699b6bd30e90-config\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:32 crc kubenswrapper[4735]: I1008 13:19:32.035344 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74ecad81-3dfd-4f36-be79-699b6bd30e90-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:32 crc kubenswrapper[4735]: I1008 13:19:32.035359 4735 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/74ecad81-3dfd-4f36-be79-699b6bd30e90-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:32 crc kubenswrapper[4735]: I1008 13:19:32.035370 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjfqt\" (UniqueName: \"kubernetes.io/projected/74ecad81-3dfd-4f36-be79-699b6bd30e90-kube-api-access-rjfqt\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:32 crc kubenswrapper[4735]: E1008 13:19:32.066774 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74ecad81-3dfd-4f36-be79-699b6bd30e90-ovndb-tls-certs podName:74ecad81-3dfd-4f36-be79-699b6bd30e90 nodeName:}" failed. No retries permitted until 2025-10-08 13:19:32.566739976 +0000 UTC m=+1422.375044520 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "ovndb-tls-certs" (UniqueName: "kubernetes.io/secret/74ecad81-3dfd-4f36-be79-699b6bd30e90-ovndb-tls-certs") pod "74ecad81-3dfd-4f36-be79-699b6bd30e90" (UID: "74ecad81-3dfd-4f36-be79-699b6bd30e90") : error deleting /var/lib/kubelet/pods/74ecad81-3dfd-4f36-be79-699b6bd30e90/volume-subpaths: remove /var/lib/kubelet/pods/74ecad81-3dfd-4f36-be79-699b6bd30e90/volume-subpaths: no such file or directory Oct 08 13:19:32 crc kubenswrapper[4735]: I1008 13:19:32.072622 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74ecad81-3dfd-4f36-be79-699b6bd30e90-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "74ecad81-3dfd-4f36-be79-699b6bd30e90" (UID: "74ecad81-3dfd-4f36-be79-699b6bd30e90"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:32 crc kubenswrapper[4735]: I1008 13:19:32.076951 4735 scope.go:117] "RemoveContainer" containerID="640b5b25fef4ad0714f6e184d4c2f8a8cea54a0bf1b5cd1e9e4357dfb3d2f543" Oct 08 13:19:32 crc kubenswrapper[4735]: I1008 13:19:32.112658 4735 scope.go:117] "RemoveContainer" containerID="b2ed1987ab45001110bf36a72fc3eb614ca1c3f8a88b1bff605a3d0de79aba7d" Oct 08 13:19:32 crc kubenswrapper[4735]: E1008 13:19:32.113205 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2ed1987ab45001110bf36a72fc3eb614ca1c3f8a88b1bff605a3d0de79aba7d\": container with ID starting with b2ed1987ab45001110bf36a72fc3eb614ca1c3f8a88b1bff605a3d0de79aba7d not found: ID does not exist" containerID="b2ed1987ab45001110bf36a72fc3eb614ca1c3f8a88b1bff605a3d0de79aba7d" Oct 08 13:19:32 crc kubenswrapper[4735]: I1008 13:19:32.113240 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2ed1987ab45001110bf36a72fc3eb614ca1c3f8a88b1bff605a3d0de79aba7d"} err="failed to get container status 
\"b2ed1987ab45001110bf36a72fc3eb614ca1c3f8a88b1bff605a3d0de79aba7d\": rpc error: code = NotFound desc = could not find container \"b2ed1987ab45001110bf36a72fc3eb614ca1c3f8a88b1bff605a3d0de79aba7d\": container with ID starting with b2ed1987ab45001110bf36a72fc3eb614ca1c3f8a88b1bff605a3d0de79aba7d not found: ID does not exist" Oct 08 13:19:32 crc kubenswrapper[4735]: I1008 13:19:32.113263 4735 scope.go:117] "RemoveContainer" containerID="640b5b25fef4ad0714f6e184d4c2f8a8cea54a0bf1b5cd1e9e4357dfb3d2f543" Oct 08 13:19:32 crc kubenswrapper[4735]: E1008 13:19:32.113696 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"640b5b25fef4ad0714f6e184d4c2f8a8cea54a0bf1b5cd1e9e4357dfb3d2f543\": container with ID starting with 640b5b25fef4ad0714f6e184d4c2f8a8cea54a0bf1b5cd1e9e4357dfb3d2f543 not found: ID does not exist" containerID="640b5b25fef4ad0714f6e184d4c2f8a8cea54a0bf1b5cd1e9e4357dfb3d2f543" Oct 08 13:19:32 crc kubenswrapper[4735]: I1008 13:19:32.113740 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"640b5b25fef4ad0714f6e184d4c2f8a8cea54a0bf1b5cd1e9e4357dfb3d2f543"} err="failed to get container status \"640b5b25fef4ad0714f6e184d4c2f8a8cea54a0bf1b5cd1e9e4357dfb3d2f543\": rpc error: code = NotFound desc = could not find container \"640b5b25fef4ad0714f6e184d4c2f8a8cea54a0bf1b5cd1e9e4357dfb3d2f543\": container with ID starting with 640b5b25fef4ad0714f6e184d4c2f8a8cea54a0bf1b5cd1e9e4357dfb3d2f543 not found: ID does not exist" Oct 08 13:19:32 crc kubenswrapper[4735]: I1008 13:19:32.136507 4735 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/74ecad81-3dfd-4f36-be79-699b6bd30e90-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:32 crc kubenswrapper[4735]: E1008 13:19:32.363733 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = 
container is not created or running: checking if PID of 0c32b0561e314aa74f9602cc47a55588e33447b929f0779af7d6ee25ee8ff883 is running failed: container process not found" containerID="0c32b0561e314aa74f9602cc47a55588e33447b929f0779af7d6ee25ee8ff883" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 08 13:19:32 crc kubenswrapper[4735]: E1008 13:19:32.364126 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0c32b0561e314aa74f9602cc47a55588e33447b929f0779af7d6ee25ee8ff883 is running failed: container process not found" containerID="0c32b0561e314aa74f9602cc47a55588e33447b929f0779af7d6ee25ee8ff883" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 08 13:19:32 crc kubenswrapper[4735]: E1008 13:19:32.364513 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0c32b0561e314aa74f9602cc47a55588e33447b929f0779af7d6ee25ee8ff883 is running failed: container process not found" containerID="0c32b0561e314aa74f9602cc47a55588e33447b929f0779af7d6ee25ee8ff883" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 08 13:19:32 crc kubenswrapper[4735]: E1008 13:19:32.364548 4735 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0c32b0561e314aa74f9602cc47a55588e33447b929f0779af7d6ee25ee8ff883 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-9frwj" podUID="b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb" containerName="ovsdb-server" Oct 08 13:19:32 crc kubenswrapper[4735]: E1008 13:19:32.365143 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="dba547c713f3dc3fb4b7cd134966f3012d70fd6781682f3b57fcf8290868f773" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 08 13:19:32 crc kubenswrapper[4735]: E1008 13:19:32.368469 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dba547c713f3dc3fb4b7cd134966f3012d70fd6781682f3b57fcf8290868f773" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 08 13:19:32 crc kubenswrapper[4735]: E1008 13:19:32.370197 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dba547c713f3dc3fb4b7cd134966f3012d70fd6781682f3b57fcf8290868f773" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 08 13:19:32 crc kubenswrapper[4735]: E1008 13:19:32.370259 4735 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-9frwj" podUID="b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb" containerName="ovs-vswitchd" Oct 08 13:19:32 crc kubenswrapper[4735]: I1008 13:19:32.643840 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/74ecad81-3dfd-4f36-be79-699b6bd30e90-ovndb-tls-certs\") pod \"74ecad81-3dfd-4f36-be79-699b6bd30e90\" (UID: \"74ecad81-3dfd-4f36-be79-699b6bd30e90\") " Oct 08 13:19:32 crc kubenswrapper[4735]: I1008 13:19:32.647089 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74ecad81-3dfd-4f36-be79-699b6bd30e90-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "74ecad81-3dfd-4f36-be79-699b6bd30e90" (UID: 
"74ecad81-3dfd-4f36-be79-699b6bd30e90"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:32 crc kubenswrapper[4735]: I1008 13:19:32.745068 4735 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/74ecad81-3dfd-4f36-be79-699b6bd30e90-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:32 crc kubenswrapper[4735]: I1008 13:19:32.954781 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5574db6b6f-94l7x"] Oct 08 13:19:32 crc kubenswrapper[4735]: I1008 13:19:32.960775 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5574db6b6f-94l7x"] Oct 08 13:19:34 crc kubenswrapper[4735]: I1008 13:19:34.903774 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74ecad81-3dfd-4f36-be79-699b6bd30e90" path="/var/lib/kubelet/pods/74ecad81-3dfd-4f36-be79-699b6bd30e90/volumes" Oct 08 13:19:35 crc kubenswrapper[4735]: I1008 13:19:35.919991 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-scqw7" podUID="983bfc3d-43e8-4190-854d-c3e9497754ae" containerName="registry-server" probeResult="failure" output=< Oct 08 13:19:35 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Oct 08 13:19:35 crc kubenswrapper[4735]: > Oct 08 13:19:37 crc kubenswrapper[4735]: E1008 13:19:37.364198 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0c32b0561e314aa74f9602cc47a55588e33447b929f0779af7d6ee25ee8ff883 is running failed: container process not found" containerID="0c32b0561e314aa74f9602cc47a55588e33447b929f0779af7d6ee25ee8ff883" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 08 13:19:37 crc kubenswrapper[4735]: E1008 13:19:37.365217 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: 
code = NotFound desc = container is not created or running: checking if PID of 0c32b0561e314aa74f9602cc47a55588e33447b929f0779af7d6ee25ee8ff883 is running failed: container process not found" containerID="0c32b0561e314aa74f9602cc47a55588e33447b929f0779af7d6ee25ee8ff883" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 08 13:19:37 crc kubenswrapper[4735]: E1008 13:19:37.365229 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dba547c713f3dc3fb4b7cd134966f3012d70fd6781682f3b57fcf8290868f773" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 08 13:19:37 crc kubenswrapper[4735]: E1008 13:19:37.365567 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0c32b0561e314aa74f9602cc47a55588e33447b929f0779af7d6ee25ee8ff883 is running failed: container process not found" containerID="0c32b0561e314aa74f9602cc47a55588e33447b929f0779af7d6ee25ee8ff883" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 08 13:19:37 crc kubenswrapper[4735]: E1008 13:19:37.365638 4735 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0c32b0561e314aa74f9602cc47a55588e33447b929f0779af7d6ee25ee8ff883 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-9frwj" podUID="b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb" containerName="ovsdb-server" Oct 08 13:19:37 crc kubenswrapper[4735]: E1008 13:19:37.367817 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="dba547c713f3dc3fb4b7cd134966f3012d70fd6781682f3b57fcf8290868f773" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 08 13:19:37 crc kubenswrapper[4735]: E1008 13:19:37.369219 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dba547c713f3dc3fb4b7cd134966f3012d70fd6781682f3b57fcf8290868f773" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 08 13:19:37 crc kubenswrapper[4735]: E1008 13:19:37.369264 4735 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-9frwj" podUID="b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb" containerName="ovs-vswitchd" Oct 08 13:19:42 crc kubenswrapper[4735]: E1008 13:19:42.363733 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0c32b0561e314aa74f9602cc47a55588e33447b929f0779af7d6ee25ee8ff883 is running failed: container process not found" containerID="0c32b0561e314aa74f9602cc47a55588e33447b929f0779af7d6ee25ee8ff883" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 08 13:19:42 crc kubenswrapper[4735]: E1008 13:19:42.365620 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0c32b0561e314aa74f9602cc47a55588e33447b929f0779af7d6ee25ee8ff883 is running failed: container process not found" containerID="0c32b0561e314aa74f9602cc47a55588e33447b929f0779af7d6ee25ee8ff883" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 08 13:19:42 crc kubenswrapper[4735]: E1008 13:19:42.365961 4735 log.go:32] "ExecSync cmd from runtime service failed" 
err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0c32b0561e314aa74f9602cc47a55588e33447b929f0779af7d6ee25ee8ff883 is running failed: container process not found" containerID="0c32b0561e314aa74f9602cc47a55588e33447b929f0779af7d6ee25ee8ff883" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 08 13:19:42 crc kubenswrapper[4735]: E1008 13:19:42.366011 4735 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0c32b0561e314aa74f9602cc47a55588e33447b929f0779af7d6ee25ee8ff883 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-9frwj" podUID="b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb" containerName="ovsdb-server" Oct 08 13:19:42 crc kubenswrapper[4735]: E1008 13:19:42.366766 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dba547c713f3dc3fb4b7cd134966f3012d70fd6781682f3b57fcf8290868f773" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 08 13:19:42 crc kubenswrapper[4735]: E1008 13:19:42.368983 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dba547c713f3dc3fb4b7cd134966f3012d70fd6781682f3b57fcf8290868f773" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 08 13:19:42 crc kubenswrapper[4735]: E1008 13:19:42.371810 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dba547c713f3dc3fb4b7cd134966f3012d70fd6781682f3b57fcf8290868f773" 
cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 08 13:19:42 crc kubenswrapper[4735]: E1008 13:19:42.371903 4735 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-9frwj" podUID="b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb" containerName="ovs-vswitchd" Oct 08 13:19:44 crc kubenswrapper[4735]: I1008 13:19:44.929952 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-scqw7" Oct 08 13:19:44 crc kubenswrapper[4735]: I1008 13:19:44.975551 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-scqw7" Oct 08 13:19:45 crc kubenswrapper[4735]: I1008 13:19:45.167367 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-scqw7"] Oct 08 13:19:46 crc kubenswrapper[4735]: I1008 13:19:46.150876 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-scqw7" podUID="983bfc3d-43e8-4190-854d-c3e9497754ae" containerName="registry-server" containerID="cri-o://efbbe5f4d7ef8e2079f8546a7a5af73ca3e07e7637c7ad3b860d2bc2f827b801" gracePeriod=2 Oct 08 13:19:46 crc kubenswrapper[4735]: I1008 13:19:46.583857 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-scqw7" Oct 08 13:19:46 crc kubenswrapper[4735]: I1008 13:19:46.755837 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nljj5\" (UniqueName: \"kubernetes.io/projected/983bfc3d-43e8-4190-854d-c3e9497754ae-kube-api-access-nljj5\") pod \"983bfc3d-43e8-4190-854d-c3e9497754ae\" (UID: \"983bfc3d-43e8-4190-854d-c3e9497754ae\") " Oct 08 13:19:46 crc kubenswrapper[4735]: I1008 13:19:46.756429 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/983bfc3d-43e8-4190-854d-c3e9497754ae-utilities\") pod \"983bfc3d-43e8-4190-854d-c3e9497754ae\" (UID: \"983bfc3d-43e8-4190-854d-c3e9497754ae\") " Oct 08 13:19:46 crc kubenswrapper[4735]: I1008 13:19:46.756610 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/983bfc3d-43e8-4190-854d-c3e9497754ae-catalog-content\") pod \"983bfc3d-43e8-4190-854d-c3e9497754ae\" (UID: \"983bfc3d-43e8-4190-854d-c3e9497754ae\") " Oct 08 13:19:46 crc kubenswrapper[4735]: I1008 13:19:46.757122 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/983bfc3d-43e8-4190-854d-c3e9497754ae-utilities" (OuterVolumeSpecName: "utilities") pod "983bfc3d-43e8-4190-854d-c3e9497754ae" (UID: "983bfc3d-43e8-4190-854d-c3e9497754ae"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:19:46 crc kubenswrapper[4735]: I1008 13:19:46.765934 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/983bfc3d-43e8-4190-854d-c3e9497754ae-kube-api-access-nljj5" (OuterVolumeSpecName: "kube-api-access-nljj5") pod "983bfc3d-43e8-4190-854d-c3e9497754ae" (UID: "983bfc3d-43e8-4190-854d-c3e9497754ae"). InnerVolumeSpecName "kube-api-access-nljj5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:19:46 crc kubenswrapper[4735]: I1008 13:19:46.837552 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/983bfc3d-43e8-4190-854d-c3e9497754ae-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "983bfc3d-43e8-4190-854d-c3e9497754ae" (UID: "983bfc3d-43e8-4190-854d-c3e9497754ae"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:19:46 crc kubenswrapper[4735]: I1008 13:19:46.858412 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nljj5\" (UniqueName: \"kubernetes.io/projected/983bfc3d-43e8-4190-854d-c3e9497754ae-kube-api-access-nljj5\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:46 crc kubenswrapper[4735]: I1008 13:19:46.858469 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/983bfc3d-43e8-4190-854d-c3e9497754ae-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:46 crc kubenswrapper[4735]: I1008 13:19:46.858510 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/983bfc3d-43e8-4190-854d-c3e9497754ae-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:47 crc kubenswrapper[4735]: I1008 13:19:47.161110 4735 generic.go:334] "Generic (PLEG): container finished" podID="983bfc3d-43e8-4190-854d-c3e9497754ae" containerID="efbbe5f4d7ef8e2079f8546a7a5af73ca3e07e7637c7ad3b860d2bc2f827b801" exitCode=0 Oct 08 13:19:47 crc kubenswrapper[4735]: I1008 13:19:47.161151 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-scqw7" event={"ID":"983bfc3d-43e8-4190-854d-c3e9497754ae","Type":"ContainerDied","Data":"efbbe5f4d7ef8e2079f8546a7a5af73ca3e07e7637c7ad3b860d2bc2f827b801"} Oct 08 13:19:47 crc kubenswrapper[4735]: I1008 13:19:47.161178 4735 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-scqw7" event={"ID":"983bfc3d-43e8-4190-854d-c3e9497754ae","Type":"ContainerDied","Data":"83bbfe9e7bb0bd72ec8c4cc7813c747f2b038c3e4781ed72b0ba4adfbe289405"} Oct 08 13:19:47 crc kubenswrapper[4735]: I1008 13:19:47.161196 4735 scope.go:117] "RemoveContainer" containerID="efbbe5f4d7ef8e2079f8546a7a5af73ca3e07e7637c7ad3b860d2bc2f827b801" Oct 08 13:19:47 crc kubenswrapper[4735]: I1008 13:19:47.161245 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-scqw7" Oct 08 13:19:47 crc kubenswrapper[4735]: I1008 13:19:47.189316 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-scqw7"] Oct 08 13:19:47 crc kubenswrapper[4735]: I1008 13:19:47.189694 4735 scope.go:117] "RemoveContainer" containerID="48d56de238912e892289f50612c011e3423cc86b135a216f8b7f8dad26d525d0" Oct 08 13:19:47 crc kubenswrapper[4735]: I1008 13:19:47.196056 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-scqw7"] Oct 08 13:19:47 crc kubenswrapper[4735]: I1008 13:19:47.217672 4735 scope.go:117] "RemoveContainer" containerID="77dd9834a2ce81f7f56ee522b42602b20ac5813d7d59e4b6b4b162e4aaea8dcf" Oct 08 13:19:47 crc kubenswrapper[4735]: I1008 13:19:47.239434 4735 scope.go:117] "RemoveContainer" containerID="efbbe5f4d7ef8e2079f8546a7a5af73ca3e07e7637c7ad3b860d2bc2f827b801" Oct 08 13:19:47 crc kubenswrapper[4735]: E1008 13:19:47.240052 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efbbe5f4d7ef8e2079f8546a7a5af73ca3e07e7637c7ad3b860d2bc2f827b801\": container with ID starting with efbbe5f4d7ef8e2079f8546a7a5af73ca3e07e7637c7ad3b860d2bc2f827b801 not found: ID does not exist" containerID="efbbe5f4d7ef8e2079f8546a7a5af73ca3e07e7637c7ad3b860d2bc2f827b801" Oct 08 13:19:47 crc kubenswrapper[4735]: I1008 13:19:47.240106 4735 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efbbe5f4d7ef8e2079f8546a7a5af73ca3e07e7637c7ad3b860d2bc2f827b801"} err="failed to get container status \"efbbe5f4d7ef8e2079f8546a7a5af73ca3e07e7637c7ad3b860d2bc2f827b801\": rpc error: code = NotFound desc = could not find container \"efbbe5f4d7ef8e2079f8546a7a5af73ca3e07e7637c7ad3b860d2bc2f827b801\": container with ID starting with efbbe5f4d7ef8e2079f8546a7a5af73ca3e07e7637c7ad3b860d2bc2f827b801 not found: ID does not exist" Oct 08 13:19:47 crc kubenswrapper[4735]: I1008 13:19:47.240133 4735 scope.go:117] "RemoveContainer" containerID="48d56de238912e892289f50612c011e3423cc86b135a216f8b7f8dad26d525d0" Oct 08 13:19:47 crc kubenswrapper[4735]: E1008 13:19:47.240655 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48d56de238912e892289f50612c011e3423cc86b135a216f8b7f8dad26d525d0\": container with ID starting with 48d56de238912e892289f50612c011e3423cc86b135a216f8b7f8dad26d525d0 not found: ID does not exist" containerID="48d56de238912e892289f50612c011e3423cc86b135a216f8b7f8dad26d525d0" Oct 08 13:19:47 crc kubenswrapper[4735]: I1008 13:19:47.240696 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48d56de238912e892289f50612c011e3423cc86b135a216f8b7f8dad26d525d0"} err="failed to get container status \"48d56de238912e892289f50612c011e3423cc86b135a216f8b7f8dad26d525d0\": rpc error: code = NotFound desc = could not find container \"48d56de238912e892289f50612c011e3423cc86b135a216f8b7f8dad26d525d0\": container with ID starting with 48d56de238912e892289f50612c011e3423cc86b135a216f8b7f8dad26d525d0 not found: ID does not exist" Oct 08 13:19:47 crc kubenswrapper[4735]: I1008 13:19:47.240722 4735 scope.go:117] "RemoveContainer" containerID="77dd9834a2ce81f7f56ee522b42602b20ac5813d7d59e4b6b4b162e4aaea8dcf" Oct 08 13:19:47 crc kubenswrapper[4735]: E1008 
13:19:47.240985 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77dd9834a2ce81f7f56ee522b42602b20ac5813d7d59e4b6b4b162e4aaea8dcf\": container with ID starting with 77dd9834a2ce81f7f56ee522b42602b20ac5813d7d59e4b6b4b162e4aaea8dcf not found: ID does not exist" containerID="77dd9834a2ce81f7f56ee522b42602b20ac5813d7d59e4b6b4b162e4aaea8dcf" Oct 08 13:19:47 crc kubenswrapper[4735]: I1008 13:19:47.241016 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77dd9834a2ce81f7f56ee522b42602b20ac5813d7d59e4b6b4b162e4aaea8dcf"} err="failed to get container status \"77dd9834a2ce81f7f56ee522b42602b20ac5813d7d59e4b6b4b162e4aaea8dcf\": rpc error: code = NotFound desc = could not find container \"77dd9834a2ce81f7f56ee522b42602b20ac5813d7d59e4b6b4b162e4aaea8dcf\": container with ID starting with 77dd9834a2ce81f7f56ee522b42602b20ac5813d7d59e4b6b4b162e4aaea8dcf not found: ID does not exist" Oct 08 13:19:47 crc kubenswrapper[4735]: E1008 13:19:47.363108 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0c32b0561e314aa74f9602cc47a55588e33447b929f0779af7d6ee25ee8ff883 is running failed: container process not found" containerID="0c32b0561e314aa74f9602cc47a55588e33447b929f0779af7d6ee25ee8ff883" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 08 13:19:47 crc kubenswrapper[4735]: E1008 13:19:47.363529 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0c32b0561e314aa74f9602cc47a55588e33447b929f0779af7d6ee25ee8ff883 is running failed: container process not found" containerID="0c32b0561e314aa74f9602cc47a55588e33447b929f0779af7d6ee25ee8ff883" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 08 13:19:47 crc 
kubenswrapper[4735]: E1008 13:19:47.363922 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0c32b0561e314aa74f9602cc47a55588e33447b929f0779af7d6ee25ee8ff883 is running failed: container process not found" containerID="0c32b0561e314aa74f9602cc47a55588e33447b929f0779af7d6ee25ee8ff883" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 08 13:19:47 crc kubenswrapper[4735]: E1008 13:19:47.363957 4735 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0c32b0561e314aa74f9602cc47a55588e33447b929f0779af7d6ee25ee8ff883 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-9frwj" podUID="b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb" containerName="ovsdb-server" Oct 08 13:19:47 crc kubenswrapper[4735]: E1008 13:19:47.365063 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dba547c713f3dc3fb4b7cd134966f3012d70fd6781682f3b57fcf8290868f773" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 08 13:19:47 crc kubenswrapper[4735]: E1008 13:19:47.368175 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dba547c713f3dc3fb4b7cd134966f3012d70fd6781682f3b57fcf8290868f773" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 08 13:19:47 crc kubenswrapper[4735]: E1008 13:19:47.371667 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="dba547c713f3dc3fb4b7cd134966f3012d70fd6781682f3b57fcf8290868f773" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 08 13:19:47 crc kubenswrapper[4735]: E1008 13:19:47.371749 4735 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-9frwj" podUID="b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb" containerName="ovs-vswitchd" Oct 08 13:19:48 crc kubenswrapper[4735]: I1008 13:19:48.904598 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="983bfc3d-43e8-4190-854d-c3e9497754ae" path="/var/lib/kubelet/pods/983bfc3d-43e8-4190-854d-c3e9497754ae/volumes" Oct 08 13:19:50 crc kubenswrapper[4735]: I1008 13:19:50.200972 4735 generic.go:334] "Generic (PLEG): container finished" podID="22f4e804-ab3e-4858-87b4-84a310c21249" containerID="f976efe16a474e89b33e73cf876bd99fd72a4f29125be28a2094105456065a00" exitCode=137 Oct 08 13:19:50 crc kubenswrapper[4735]: I1008 13:19:50.201078 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"22f4e804-ab3e-4858-87b4-84a310c21249","Type":"ContainerDied","Data":"f976efe16a474e89b33e73cf876bd99fd72a4f29125be28a2094105456065a00"} Oct 08 13:19:50 crc kubenswrapper[4735]: I1008 13:19:50.569017 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 08 13:19:50 crc kubenswrapper[4735]: I1008 13:19:50.620566 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22f4e804-ab3e-4858-87b4-84a310c21249-scripts\") pod \"22f4e804-ab3e-4858-87b4-84a310c21249\" (UID: \"22f4e804-ab3e-4858-87b4-84a310c21249\") " Oct 08 13:19:50 crc kubenswrapper[4735]: I1008 13:19:50.620634 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22f4e804-ab3e-4858-87b4-84a310c21249-combined-ca-bundle\") pod \"22f4e804-ab3e-4858-87b4-84a310c21249\" (UID: \"22f4e804-ab3e-4858-87b4-84a310c21249\") " Oct 08 13:19:50 crc kubenswrapper[4735]: I1008 13:19:50.620663 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/22f4e804-ab3e-4858-87b4-84a310c21249-etc-machine-id\") pod \"22f4e804-ab3e-4858-87b4-84a310c21249\" (UID: \"22f4e804-ab3e-4858-87b4-84a310c21249\") " Oct 08 13:19:50 crc kubenswrapper[4735]: I1008 13:19:50.620702 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d27nt\" (UniqueName: \"kubernetes.io/projected/22f4e804-ab3e-4858-87b4-84a310c21249-kube-api-access-d27nt\") pod \"22f4e804-ab3e-4858-87b4-84a310c21249\" (UID: \"22f4e804-ab3e-4858-87b4-84a310c21249\") " Oct 08 13:19:50 crc kubenswrapper[4735]: I1008 13:19:50.620730 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22f4e804-ab3e-4858-87b4-84a310c21249-config-data\") pod \"22f4e804-ab3e-4858-87b4-84a310c21249\" (UID: \"22f4e804-ab3e-4858-87b4-84a310c21249\") " Oct 08 13:19:50 crc kubenswrapper[4735]: I1008 13:19:50.620763 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/22f4e804-ab3e-4858-87b4-84a310c21249-config-data-custom\") pod \"22f4e804-ab3e-4858-87b4-84a310c21249\" (UID: \"22f4e804-ab3e-4858-87b4-84a310c21249\") " Oct 08 13:19:50 crc kubenswrapper[4735]: I1008 13:19:50.621568 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/22f4e804-ab3e-4858-87b4-84a310c21249-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "22f4e804-ab3e-4858-87b4-84a310c21249" (UID: "22f4e804-ab3e-4858-87b4-84a310c21249"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 13:19:50 crc kubenswrapper[4735]: I1008 13:19:50.628698 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22f4e804-ab3e-4858-87b4-84a310c21249-scripts" (OuterVolumeSpecName: "scripts") pod "22f4e804-ab3e-4858-87b4-84a310c21249" (UID: "22f4e804-ab3e-4858-87b4-84a310c21249"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:50 crc kubenswrapper[4735]: I1008 13:19:50.629656 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22f4e804-ab3e-4858-87b4-84a310c21249-kube-api-access-d27nt" (OuterVolumeSpecName: "kube-api-access-d27nt") pod "22f4e804-ab3e-4858-87b4-84a310c21249" (UID: "22f4e804-ab3e-4858-87b4-84a310c21249"). InnerVolumeSpecName "kube-api-access-d27nt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:19:50 crc kubenswrapper[4735]: I1008 13:19:50.644777 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22f4e804-ab3e-4858-87b4-84a310c21249-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "22f4e804-ab3e-4858-87b4-84a310c21249" (UID: "22f4e804-ab3e-4858-87b4-84a310c21249"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:50 crc kubenswrapper[4735]: I1008 13:19:50.669754 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22f4e804-ab3e-4858-87b4-84a310c21249-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "22f4e804-ab3e-4858-87b4-84a310c21249" (UID: "22f4e804-ab3e-4858-87b4-84a310c21249"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:50 crc kubenswrapper[4735]: I1008 13:19:50.686770 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-9frwj_b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb/ovs-vswitchd/0.log" Oct 08 13:19:50 crc kubenswrapper[4735]: I1008 13:19:50.688131 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-9frwj" Oct 08 13:19:50 crc kubenswrapper[4735]: I1008 13:19:50.721923 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb-var-run\") pod \"b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb\" (UID: \"b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb\") " Oct 08 13:19:50 crc kubenswrapper[4735]: I1008 13:19:50.722026 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb-etc-ovs\") pod \"b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb\" (UID: \"b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb\") " Oct 08 13:19:50 crc kubenswrapper[4735]: I1008 13:19:50.722066 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb-var-run" (OuterVolumeSpecName: "var-run") pod "b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb" (UID: "b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 13:19:50 crc kubenswrapper[4735]: I1008 13:19:50.722092 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb-var-log\") pod \"b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb\" (UID: \"b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb\") " Oct 08 13:19:50 crc kubenswrapper[4735]: I1008 13:19:50.722127 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb" (UID: "b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb"). InnerVolumeSpecName "etc-ovs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 13:19:50 crc kubenswrapper[4735]: I1008 13:19:50.722168 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nq95g\" (UniqueName: \"kubernetes.io/projected/b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb-kube-api-access-nq95g\") pod \"b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb\" (UID: \"b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb\") " Oct 08 13:19:50 crc kubenswrapper[4735]: I1008 13:19:50.722189 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb-var-lib\") pod \"b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb\" (UID: \"b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb\") " Oct 08 13:19:50 crc kubenswrapper[4735]: I1008 13:19:50.722221 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb-var-log" (OuterVolumeSpecName: "var-log") pod "b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb" (UID: "b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 13:19:50 crc kubenswrapper[4735]: I1008 13:19:50.722265 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb-scripts\") pod \"b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb\" (UID: \"b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb\") " Oct 08 13:19:50 crc kubenswrapper[4735]: I1008 13:19:50.722552 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22f4e804-ab3e-4858-87b4-84a310c21249-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:50 crc kubenswrapper[4735]: I1008 13:19:50.722596 4735 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb-var-run\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:50 crc kubenswrapper[4735]: I1008 13:19:50.722609 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22f4e804-ab3e-4858-87b4-84a310c21249-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:50 crc kubenswrapper[4735]: I1008 13:19:50.722636 4735 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/22f4e804-ab3e-4858-87b4-84a310c21249-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:50 crc kubenswrapper[4735]: I1008 13:19:50.722669 4735 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb-etc-ovs\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:50 crc kubenswrapper[4735]: I1008 13:19:50.722680 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d27nt\" (UniqueName: \"kubernetes.io/projected/22f4e804-ab3e-4858-87b4-84a310c21249-kube-api-access-d27nt\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:50 crc 
kubenswrapper[4735]: I1008 13:19:50.722690 4735 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb-var-log\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:50 crc kubenswrapper[4735]: I1008 13:19:50.722701 4735 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/22f4e804-ab3e-4858-87b4-84a310c21249-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:50 crc kubenswrapper[4735]: I1008 13:19:50.722968 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb-var-lib" (OuterVolumeSpecName: "var-lib") pod "b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb" (UID: "b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb"). InnerVolumeSpecName "var-lib". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 13:19:50 crc kubenswrapper[4735]: I1008 13:19:50.724314 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb-scripts" (OuterVolumeSpecName: "scripts") pod "b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb" (UID: "b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:19:50 crc kubenswrapper[4735]: I1008 13:19:50.726264 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb-kube-api-access-nq95g" (OuterVolumeSpecName: "kube-api-access-nq95g") pod "b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb" (UID: "b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb"). InnerVolumeSpecName "kube-api-access-nq95g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:19:50 crc kubenswrapper[4735]: I1008 13:19:50.727363 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22f4e804-ab3e-4858-87b4-84a310c21249-config-data" (OuterVolumeSpecName: "config-data") pod "22f4e804-ab3e-4858-87b4-84a310c21249" (UID: "22f4e804-ab3e-4858-87b4-84a310c21249"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:19:50 crc kubenswrapper[4735]: I1008 13:19:50.785353 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 08 13:19:50 crc kubenswrapper[4735]: I1008 13:19:50.823571 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:50 crc kubenswrapper[4735]: I1008 13:19:50.823600 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22f4e804-ab3e-4858-87b4-84a310c21249-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:50 crc kubenswrapper[4735]: I1008 13:19:50.823609 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nq95g\" (UniqueName: \"kubernetes.io/projected/b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb-kube-api-access-nq95g\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:50 crc kubenswrapper[4735]: I1008 13:19:50.823618 4735 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb-var-lib\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:50 crc kubenswrapper[4735]: I1008 13:19:50.924012 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b767n\" (UniqueName: \"kubernetes.io/projected/dc4e9066-e538-4d60-9c5a-3beb689d67a4-kube-api-access-b767n\") pod 
\"dc4e9066-e538-4d60-9c5a-3beb689d67a4\" (UID: \"dc4e9066-e538-4d60-9c5a-3beb689d67a4\") " Oct 08 13:19:50 crc kubenswrapper[4735]: I1008 13:19:50.924088 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"dc4e9066-e538-4d60-9c5a-3beb689d67a4\" (UID: \"dc4e9066-e538-4d60-9c5a-3beb689d67a4\") " Oct 08 13:19:50 crc kubenswrapper[4735]: I1008 13:19:50.924118 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/dc4e9066-e538-4d60-9c5a-3beb689d67a4-etc-swift\") pod \"dc4e9066-e538-4d60-9c5a-3beb689d67a4\" (UID: \"dc4e9066-e538-4d60-9c5a-3beb689d67a4\") " Oct 08 13:19:50 crc kubenswrapper[4735]: I1008 13:19:50.924177 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/dc4e9066-e538-4d60-9c5a-3beb689d67a4-cache\") pod \"dc4e9066-e538-4d60-9c5a-3beb689d67a4\" (UID: \"dc4e9066-e538-4d60-9c5a-3beb689d67a4\") " Oct 08 13:19:50 crc kubenswrapper[4735]: I1008 13:19:50.924214 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/dc4e9066-e538-4d60-9c5a-3beb689d67a4-lock\") pod \"dc4e9066-e538-4d60-9c5a-3beb689d67a4\" (UID: \"dc4e9066-e538-4d60-9c5a-3beb689d67a4\") " Oct 08 13:19:50 crc kubenswrapper[4735]: I1008 13:19:50.924858 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc4e9066-e538-4d60-9c5a-3beb689d67a4-cache" (OuterVolumeSpecName: "cache") pod "dc4e9066-e538-4d60-9c5a-3beb689d67a4" (UID: "dc4e9066-e538-4d60-9c5a-3beb689d67a4"). InnerVolumeSpecName "cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:19:50 crc kubenswrapper[4735]: I1008 13:19:50.924866 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc4e9066-e538-4d60-9c5a-3beb689d67a4-lock" (OuterVolumeSpecName: "lock") pod "dc4e9066-e538-4d60-9c5a-3beb689d67a4" (UID: "dc4e9066-e538-4d60-9c5a-3beb689d67a4"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:19:50 crc kubenswrapper[4735]: I1008 13:19:50.927493 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc4e9066-e538-4d60-9c5a-3beb689d67a4-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "dc4e9066-e538-4d60-9c5a-3beb689d67a4" (UID: "dc4e9066-e538-4d60-9c5a-3beb689d67a4"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:19:50 crc kubenswrapper[4735]: I1008 13:19:50.927539 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "swift") pod "dc4e9066-e538-4d60-9c5a-3beb689d67a4" (UID: "dc4e9066-e538-4d60-9c5a-3beb689d67a4"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 08 13:19:50 crc kubenswrapper[4735]: I1008 13:19:50.927736 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc4e9066-e538-4d60-9c5a-3beb689d67a4-kube-api-access-b767n" (OuterVolumeSpecName: "kube-api-access-b767n") pod "dc4e9066-e538-4d60-9c5a-3beb689d67a4" (UID: "dc4e9066-e538-4d60-9c5a-3beb689d67a4"). InnerVolumeSpecName "kube-api-access-b767n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.028007 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b767n\" (UniqueName: \"kubernetes.io/projected/dc4e9066-e538-4d60-9c5a-3beb689d67a4-kube-api-access-b767n\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.028082 4735 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.028097 4735 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/dc4e9066-e538-4d60-9c5a-3beb689d67a4-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.028109 4735 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/dc4e9066-e538-4d60-9c5a-3beb689d67a4-cache\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.028123 4735 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/dc4e9066-e538-4d60-9c5a-3beb689d67a4-lock\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.043297 4735 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.130010 4735 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.214320 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"22f4e804-ab3e-4858-87b4-84a310c21249","Type":"ContainerDied","Data":"9bd4613a433c56bf34cefa6d3b8d95de96342f543309ed08011b6cc7148e8c82"} Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.214381 4735 scope.go:117] "RemoveContainer" containerID="b7bc2b861ffd2f1d3a1ede5b87a767f230be0eb8b6d7f935ee21e11963a9f933" Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.214387 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.225269 4735 generic.go:334] "Generic (PLEG): container finished" podID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerID="a8268e5dce5d4e32fa0bca1c883910824568df9f3952c6f7d9f188760642a5cd" exitCode=137 Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.225358 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dc4e9066-e538-4d60-9c5a-3beb689d67a4","Type":"ContainerDied","Data":"a8268e5dce5d4e32fa0bca1c883910824568df9f3952c6f7d9f188760642a5cd"} Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.225384 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dc4e9066-e538-4d60-9c5a-3beb689d67a4","Type":"ContainerDied","Data":"79c4d8dddaa9cd10fa75a654df1c84c66f8a311d7a0d6eb6b62b00c0d47c8031"} Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.225510 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.236097 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.238282 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-9frwj_b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb/ovs-vswitchd/0.log" Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.240505 4735 generic.go:334] "Generic (PLEG): container finished" podID="b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb" containerID="dba547c713f3dc3fb4b7cd134966f3012d70fd6781682f3b57fcf8290868f773" exitCode=137 Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.240551 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9frwj" event={"ID":"b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb","Type":"ContainerDied","Data":"dba547c713f3dc3fb4b7cd134966f3012d70fd6781682f3b57fcf8290868f773"} Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.240581 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9frwj" event={"ID":"b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb","Type":"ContainerDied","Data":"f22c7151017feeaffeab02436b73af72e45d237482cf4e79a60f12d816e2c1a8"} Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.240847 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-9frwj" Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.248133 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.252437 4735 scope.go:117] "RemoveContainer" containerID="f976efe16a474e89b33e73cf876bd99fd72a4f29125be28a2094105456065a00" Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.268851 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-9frwj"] Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.276155 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-9frwj"] Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.280843 4735 scope.go:117] "RemoveContainer" containerID="a8268e5dce5d4e32fa0bca1c883910824568df9f3952c6f7d9f188760642a5cd" Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.288092 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.298586 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.304620 4735 scope.go:117] "RemoveContainer" containerID="dff90633bc3ad4ba06aede8cdacdcc6e496b31b0aade72c08da50cac8d87fc0b" Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.321277 4735 scope.go:117] "RemoveContainer" containerID="4ee9082bf58fd24e367fd1da09c29e111946f4c4faa922b9579b326f978119c9" Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.343019 4735 scope.go:117] "RemoveContainer" containerID="d5db6b982ae606222a36f5fcc188c4abda346b0f7655cfdfa4f6f7337c2397f4" Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.364416 4735 scope.go:117] "RemoveContainer" containerID="237f4d6c051f7223b102aeaedaa673ab069d0f15f6817e4945c35de0b812fbff" Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.385406 4735 scope.go:117] 
"RemoveContainer" containerID="2f40ca43ae662cc47bd56ee824d503ccec2b5ae9bf9c20513e8ce20464d2dd97" Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.401596 4735 scope.go:117] "RemoveContainer" containerID="309e0783fbd98cb5e4632f3d4d0e2f6a9b58f92ad92a2d4004eaa1a6d7fae30b" Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.420137 4735 scope.go:117] "RemoveContainer" containerID="f439ff20d2384d40dd148b0bc28e8bfd91c3d44ddcc97d2ceffaba6872bf825c" Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.444115 4735 scope.go:117] "RemoveContainer" containerID="1e668315fdf2d5cc1e4b746212c9eadc171a4f54451b10f098411736e896e124" Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.466140 4735 scope.go:117] "RemoveContainer" containerID="0b38bdb53ab1820c96e395176b2856e8ca2eb0cf0636872f34d1914a860cf3a9" Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.487779 4735 scope.go:117] "RemoveContainer" containerID="33f430e8921d6f23db0786f1f9627335fe796e31b2a0bd9605c6ebfc3decb0c4" Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.505662 4735 scope.go:117] "RemoveContainer" containerID="318bd8def163d2bed4ab842bc210c31f81ec6fcdd4d540b67f27fd1cef63ad4c" Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.526336 4735 scope.go:117] "RemoveContainer" containerID="d853b3073957b18a6b510a37923cf71788139823a9c69e0f74f142983721b87d" Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.551175 4735 scope.go:117] "RemoveContainer" containerID="f07ef2651928239139875a266d70f61a17120a8e70c7d971632cdd042dda58f2" Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.573414 4735 scope.go:117] "RemoveContainer" containerID="fee8281668fd0cef347a2cc4b0d529170a8a54372656a674cada64e7758b6de6" Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.596391 4735 scope.go:117] "RemoveContainer" containerID="a8268e5dce5d4e32fa0bca1c883910824568df9f3952c6f7d9f188760642a5cd" Oct 08 13:19:51 crc kubenswrapper[4735]: E1008 13:19:51.596671 4735 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"a8268e5dce5d4e32fa0bca1c883910824568df9f3952c6f7d9f188760642a5cd\": container with ID starting with a8268e5dce5d4e32fa0bca1c883910824568df9f3952c6f7d9f188760642a5cd not found: ID does not exist" containerID="a8268e5dce5d4e32fa0bca1c883910824568df9f3952c6f7d9f188760642a5cd" Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.596703 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8268e5dce5d4e32fa0bca1c883910824568df9f3952c6f7d9f188760642a5cd"} err="failed to get container status \"a8268e5dce5d4e32fa0bca1c883910824568df9f3952c6f7d9f188760642a5cd\": rpc error: code = NotFound desc = could not find container \"a8268e5dce5d4e32fa0bca1c883910824568df9f3952c6f7d9f188760642a5cd\": container with ID starting with a8268e5dce5d4e32fa0bca1c883910824568df9f3952c6f7d9f188760642a5cd not found: ID does not exist" Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.596722 4735 scope.go:117] "RemoveContainer" containerID="dff90633bc3ad4ba06aede8cdacdcc6e496b31b0aade72c08da50cac8d87fc0b" Oct 08 13:19:51 crc kubenswrapper[4735]: E1008 13:19:51.596970 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dff90633bc3ad4ba06aede8cdacdcc6e496b31b0aade72c08da50cac8d87fc0b\": container with ID starting with dff90633bc3ad4ba06aede8cdacdcc6e496b31b0aade72c08da50cac8d87fc0b not found: ID does not exist" containerID="dff90633bc3ad4ba06aede8cdacdcc6e496b31b0aade72c08da50cac8d87fc0b" Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.596997 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dff90633bc3ad4ba06aede8cdacdcc6e496b31b0aade72c08da50cac8d87fc0b"} err="failed to get container status \"dff90633bc3ad4ba06aede8cdacdcc6e496b31b0aade72c08da50cac8d87fc0b\": rpc error: code = NotFound desc = could not find container 
\"dff90633bc3ad4ba06aede8cdacdcc6e496b31b0aade72c08da50cac8d87fc0b\": container with ID starting with dff90633bc3ad4ba06aede8cdacdcc6e496b31b0aade72c08da50cac8d87fc0b not found: ID does not exist" Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.597013 4735 scope.go:117] "RemoveContainer" containerID="4ee9082bf58fd24e367fd1da09c29e111946f4c4faa922b9579b326f978119c9" Oct 08 13:19:51 crc kubenswrapper[4735]: E1008 13:19:51.597213 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ee9082bf58fd24e367fd1da09c29e111946f4c4faa922b9579b326f978119c9\": container with ID starting with 4ee9082bf58fd24e367fd1da09c29e111946f4c4faa922b9579b326f978119c9 not found: ID does not exist" containerID="4ee9082bf58fd24e367fd1da09c29e111946f4c4faa922b9579b326f978119c9" Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.597232 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ee9082bf58fd24e367fd1da09c29e111946f4c4faa922b9579b326f978119c9"} err="failed to get container status \"4ee9082bf58fd24e367fd1da09c29e111946f4c4faa922b9579b326f978119c9\": rpc error: code = NotFound desc = could not find container \"4ee9082bf58fd24e367fd1da09c29e111946f4c4faa922b9579b326f978119c9\": container with ID starting with 4ee9082bf58fd24e367fd1da09c29e111946f4c4faa922b9579b326f978119c9 not found: ID does not exist" Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.597247 4735 scope.go:117] "RemoveContainer" containerID="d5db6b982ae606222a36f5fcc188c4abda346b0f7655cfdfa4f6f7337c2397f4" Oct 08 13:19:51 crc kubenswrapper[4735]: E1008 13:19:51.597867 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5db6b982ae606222a36f5fcc188c4abda346b0f7655cfdfa4f6f7337c2397f4\": container with ID starting with d5db6b982ae606222a36f5fcc188c4abda346b0f7655cfdfa4f6f7337c2397f4 not found: ID does not exist" 
containerID="d5db6b982ae606222a36f5fcc188c4abda346b0f7655cfdfa4f6f7337c2397f4" Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.597893 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5db6b982ae606222a36f5fcc188c4abda346b0f7655cfdfa4f6f7337c2397f4"} err="failed to get container status \"d5db6b982ae606222a36f5fcc188c4abda346b0f7655cfdfa4f6f7337c2397f4\": rpc error: code = NotFound desc = could not find container \"d5db6b982ae606222a36f5fcc188c4abda346b0f7655cfdfa4f6f7337c2397f4\": container with ID starting with d5db6b982ae606222a36f5fcc188c4abda346b0f7655cfdfa4f6f7337c2397f4 not found: ID does not exist" Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.597909 4735 scope.go:117] "RemoveContainer" containerID="237f4d6c051f7223b102aeaedaa673ab069d0f15f6817e4945c35de0b812fbff" Oct 08 13:19:51 crc kubenswrapper[4735]: E1008 13:19:51.598127 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"237f4d6c051f7223b102aeaedaa673ab069d0f15f6817e4945c35de0b812fbff\": container with ID starting with 237f4d6c051f7223b102aeaedaa673ab069d0f15f6817e4945c35de0b812fbff not found: ID does not exist" containerID="237f4d6c051f7223b102aeaedaa673ab069d0f15f6817e4945c35de0b812fbff" Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.598153 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"237f4d6c051f7223b102aeaedaa673ab069d0f15f6817e4945c35de0b812fbff"} err="failed to get container status \"237f4d6c051f7223b102aeaedaa673ab069d0f15f6817e4945c35de0b812fbff\": rpc error: code = NotFound desc = could not find container \"237f4d6c051f7223b102aeaedaa673ab069d0f15f6817e4945c35de0b812fbff\": container with ID starting with 237f4d6c051f7223b102aeaedaa673ab069d0f15f6817e4945c35de0b812fbff not found: ID does not exist" Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.598169 4735 scope.go:117] 
"RemoveContainer" containerID="2f40ca43ae662cc47bd56ee824d503ccec2b5ae9bf9c20513e8ce20464d2dd97" Oct 08 13:19:51 crc kubenswrapper[4735]: E1008 13:19:51.598430 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f40ca43ae662cc47bd56ee824d503ccec2b5ae9bf9c20513e8ce20464d2dd97\": container with ID starting with 2f40ca43ae662cc47bd56ee824d503ccec2b5ae9bf9c20513e8ce20464d2dd97 not found: ID does not exist" containerID="2f40ca43ae662cc47bd56ee824d503ccec2b5ae9bf9c20513e8ce20464d2dd97" Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.598455 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f40ca43ae662cc47bd56ee824d503ccec2b5ae9bf9c20513e8ce20464d2dd97"} err="failed to get container status \"2f40ca43ae662cc47bd56ee824d503ccec2b5ae9bf9c20513e8ce20464d2dd97\": rpc error: code = NotFound desc = could not find container \"2f40ca43ae662cc47bd56ee824d503ccec2b5ae9bf9c20513e8ce20464d2dd97\": container with ID starting with 2f40ca43ae662cc47bd56ee824d503ccec2b5ae9bf9c20513e8ce20464d2dd97 not found: ID does not exist" Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.598470 4735 scope.go:117] "RemoveContainer" containerID="309e0783fbd98cb5e4632f3d4d0e2f6a9b58f92ad92a2d4004eaa1a6d7fae30b" Oct 08 13:19:51 crc kubenswrapper[4735]: E1008 13:19:51.598790 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"309e0783fbd98cb5e4632f3d4d0e2f6a9b58f92ad92a2d4004eaa1a6d7fae30b\": container with ID starting with 309e0783fbd98cb5e4632f3d4d0e2f6a9b58f92ad92a2d4004eaa1a6d7fae30b not found: ID does not exist" containerID="309e0783fbd98cb5e4632f3d4d0e2f6a9b58f92ad92a2d4004eaa1a6d7fae30b" Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.598818 4735 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"309e0783fbd98cb5e4632f3d4d0e2f6a9b58f92ad92a2d4004eaa1a6d7fae30b"} err="failed to get container status \"309e0783fbd98cb5e4632f3d4d0e2f6a9b58f92ad92a2d4004eaa1a6d7fae30b\": rpc error: code = NotFound desc = could not find container \"309e0783fbd98cb5e4632f3d4d0e2f6a9b58f92ad92a2d4004eaa1a6d7fae30b\": container with ID starting with 309e0783fbd98cb5e4632f3d4d0e2f6a9b58f92ad92a2d4004eaa1a6d7fae30b not found: ID does not exist" Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.598836 4735 scope.go:117] "RemoveContainer" containerID="f439ff20d2384d40dd148b0bc28e8bfd91c3d44ddcc97d2ceffaba6872bf825c" Oct 08 13:19:51 crc kubenswrapper[4735]: E1008 13:19:51.599068 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f439ff20d2384d40dd148b0bc28e8bfd91c3d44ddcc97d2ceffaba6872bf825c\": container with ID starting with f439ff20d2384d40dd148b0bc28e8bfd91c3d44ddcc97d2ceffaba6872bf825c not found: ID does not exist" containerID="f439ff20d2384d40dd148b0bc28e8bfd91c3d44ddcc97d2ceffaba6872bf825c" Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.599094 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f439ff20d2384d40dd148b0bc28e8bfd91c3d44ddcc97d2ceffaba6872bf825c"} err="failed to get container status \"f439ff20d2384d40dd148b0bc28e8bfd91c3d44ddcc97d2ceffaba6872bf825c\": rpc error: code = NotFound desc = could not find container \"f439ff20d2384d40dd148b0bc28e8bfd91c3d44ddcc97d2ceffaba6872bf825c\": container with ID starting with f439ff20d2384d40dd148b0bc28e8bfd91c3d44ddcc97d2ceffaba6872bf825c not found: ID does not exist" Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.599112 4735 scope.go:117] "RemoveContainer" containerID="1e668315fdf2d5cc1e4b746212c9eadc171a4f54451b10f098411736e896e124" Oct 08 13:19:51 crc kubenswrapper[4735]: E1008 13:19:51.599284 4735 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"1e668315fdf2d5cc1e4b746212c9eadc171a4f54451b10f098411736e896e124\": container with ID starting with 1e668315fdf2d5cc1e4b746212c9eadc171a4f54451b10f098411736e896e124 not found: ID does not exist" containerID="1e668315fdf2d5cc1e4b746212c9eadc171a4f54451b10f098411736e896e124" Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.599308 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e668315fdf2d5cc1e4b746212c9eadc171a4f54451b10f098411736e896e124"} err="failed to get container status \"1e668315fdf2d5cc1e4b746212c9eadc171a4f54451b10f098411736e896e124\": rpc error: code = NotFound desc = could not find container \"1e668315fdf2d5cc1e4b746212c9eadc171a4f54451b10f098411736e896e124\": container with ID starting with 1e668315fdf2d5cc1e4b746212c9eadc171a4f54451b10f098411736e896e124 not found: ID does not exist" Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.599323 4735 scope.go:117] "RemoveContainer" containerID="0b38bdb53ab1820c96e395176b2856e8ca2eb0cf0636872f34d1914a860cf3a9" Oct 08 13:19:51 crc kubenswrapper[4735]: E1008 13:19:51.599507 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b38bdb53ab1820c96e395176b2856e8ca2eb0cf0636872f34d1914a860cf3a9\": container with ID starting with 0b38bdb53ab1820c96e395176b2856e8ca2eb0cf0636872f34d1914a860cf3a9 not found: ID does not exist" containerID="0b38bdb53ab1820c96e395176b2856e8ca2eb0cf0636872f34d1914a860cf3a9" Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.599533 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b38bdb53ab1820c96e395176b2856e8ca2eb0cf0636872f34d1914a860cf3a9"} err="failed to get container status \"0b38bdb53ab1820c96e395176b2856e8ca2eb0cf0636872f34d1914a860cf3a9\": rpc error: code = NotFound desc = could not find container 
\"0b38bdb53ab1820c96e395176b2856e8ca2eb0cf0636872f34d1914a860cf3a9\": container with ID starting with 0b38bdb53ab1820c96e395176b2856e8ca2eb0cf0636872f34d1914a860cf3a9 not found: ID does not exist" Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.599549 4735 scope.go:117] "RemoveContainer" containerID="33f430e8921d6f23db0786f1f9627335fe796e31b2a0bd9605c6ebfc3decb0c4" Oct 08 13:19:51 crc kubenswrapper[4735]: E1008 13:19:51.599793 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33f430e8921d6f23db0786f1f9627335fe796e31b2a0bd9605c6ebfc3decb0c4\": container with ID starting with 33f430e8921d6f23db0786f1f9627335fe796e31b2a0bd9605c6ebfc3decb0c4 not found: ID does not exist" containerID="33f430e8921d6f23db0786f1f9627335fe796e31b2a0bd9605c6ebfc3decb0c4" Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.599818 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33f430e8921d6f23db0786f1f9627335fe796e31b2a0bd9605c6ebfc3decb0c4"} err="failed to get container status \"33f430e8921d6f23db0786f1f9627335fe796e31b2a0bd9605c6ebfc3decb0c4\": rpc error: code = NotFound desc = could not find container \"33f430e8921d6f23db0786f1f9627335fe796e31b2a0bd9605c6ebfc3decb0c4\": container with ID starting with 33f430e8921d6f23db0786f1f9627335fe796e31b2a0bd9605c6ebfc3decb0c4 not found: ID does not exist" Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.599879 4735 scope.go:117] "RemoveContainer" containerID="318bd8def163d2bed4ab842bc210c31f81ec6fcdd4d540b67f27fd1cef63ad4c" Oct 08 13:19:51 crc kubenswrapper[4735]: E1008 13:19:51.600095 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"318bd8def163d2bed4ab842bc210c31f81ec6fcdd4d540b67f27fd1cef63ad4c\": container with ID starting with 318bd8def163d2bed4ab842bc210c31f81ec6fcdd4d540b67f27fd1cef63ad4c not found: ID does not exist" 
containerID="318bd8def163d2bed4ab842bc210c31f81ec6fcdd4d540b67f27fd1cef63ad4c" Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.600121 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"318bd8def163d2bed4ab842bc210c31f81ec6fcdd4d540b67f27fd1cef63ad4c"} err="failed to get container status \"318bd8def163d2bed4ab842bc210c31f81ec6fcdd4d540b67f27fd1cef63ad4c\": rpc error: code = NotFound desc = could not find container \"318bd8def163d2bed4ab842bc210c31f81ec6fcdd4d540b67f27fd1cef63ad4c\": container with ID starting with 318bd8def163d2bed4ab842bc210c31f81ec6fcdd4d540b67f27fd1cef63ad4c not found: ID does not exist" Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.600137 4735 scope.go:117] "RemoveContainer" containerID="d853b3073957b18a6b510a37923cf71788139823a9c69e0f74f142983721b87d" Oct 08 13:19:51 crc kubenswrapper[4735]: E1008 13:19:51.600341 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d853b3073957b18a6b510a37923cf71788139823a9c69e0f74f142983721b87d\": container with ID starting with d853b3073957b18a6b510a37923cf71788139823a9c69e0f74f142983721b87d not found: ID does not exist" containerID="d853b3073957b18a6b510a37923cf71788139823a9c69e0f74f142983721b87d" Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.600368 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d853b3073957b18a6b510a37923cf71788139823a9c69e0f74f142983721b87d"} err="failed to get container status \"d853b3073957b18a6b510a37923cf71788139823a9c69e0f74f142983721b87d\": rpc error: code = NotFound desc = could not find container \"d853b3073957b18a6b510a37923cf71788139823a9c69e0f74f142983721b87d\": container with ID starting with d853b3073957b18a6b510a37923cf71788139823a9c69e0f74f142983721b87d not found: ID does not exist" Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.600385 4735 scope.go:117] 
"RemoveContainer" containerID="f07ef2651928239139875a266d70f61a17120a8e70c7d971632cdd042dda58f2" Oct 08 13:19:51 crc kubenswrapper[4735]: E1008 13:19:51.600590 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f07ef2651928239139875a266d70f61a17120a8e70c7d971632cdd042dda58f2\": container with ID starting with f07ef2651928239139875a266d70f61a17120a8e70c7d971632cdd042dda58f2 not found: ID does not exist" containerID="f07ef2651928239139875a266d70f61a17120a8e70c7d971632cdd042dda58f2" Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.600614 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f07ef2651928239139875a266d70f61a17120a8e70c7d971632cdd042dda58f2"} err="failed to get container status \"f07ef2651928239139875a266d70f61a17120a8e70c7d971632cdd042dda58f2\": rpc error: code = NotFound desc = could not find container \"f07ef2651928239139875a266d70f61a17120a8e70c7d971632cdd042dda58f2\": container with ID starting with f07ef2651928239139875a266d70f61a17120a8e70c7d971632cdd042dda58f2 not found: ID does not exist" Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.600629 4735 scope.go:117] "RemoveContainer" containerID="fee8281668fd0cef347a2cc4b0d529170a8a54372656a674cada64e7758b6de6" Oct 08 13:19:51 crc kubenswrapper[4735]: E1008 13:19:51.600949 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fee8281668fd0cef347a2cc4b0d529170a8a54372656a674cada64e7758b6de6\": container with ID starting with fee8281668fd0cef347a2cc4b0d529170a8a54372656a674cada64e7758b6de6 not found: ID does not exist" containerID="fee8281668fd0cef347a2cc4b0d529170a8a54372656a674cada64e7758b6de6" Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.600968 4735 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"fee8281668fd0cef347a2cc4b0d529170a8a54372656a674cada64e7758b6de6"} err="failed to get container status \"fee8281668fd0cef347a2cc4b0d529170a8a54372656a674cada64e7758b6de6\": rpc error: code = NotFound desc = could not find container \"fee8281668fd0cef347a2cc4b0d529170a8a54372656a674cada64e7758b6de6\": container with ID starting with fee8281668fd0cef347a2cc4b0d529170a8a54372656a674cada64e7758b6de6 not found: ID does not exist" Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.600982 4735 scope.go:117] "RemoveContainer" containerID="dba547c713f3dc3fb4b7cd134966f3012d70fd6781682f3b57fcf8290868f773" Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.621275 4735 scope.go:117] "RemoveContainer" containerID="0c32b0561e314aa74f9602cc47a55588e33447b929f0779af7d6ee25ee8ff883" Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.649992 4735 scope.go:117] "RemoveContainer" containerID="679f2e8556db1c2268b078a0be0be92aab3d54643a41d63dd4c9ad894c440cee" Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.674749 4735 scope.go:117] "RemoveContainer" containerID="dba547c713f3dc3fb4b7cd134966f3012d70fd6781682f3b57fcf8290868f773" Oct 08 13:19:51 crc kubenswrapper[4735]: E1008 13:19:51.675269 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dba547c713f3dc3fb4b7cd134966f3012d70fd6781682f3b57fcf8290868f773\": container with ID starting with dba547c713f3dc3fb4b7cd134966f3012d70fd6781682f3b57fcf8290868f773 not found: ID does not exist" containerID="dba547c713f3dc3fb4b7cd134966f3012d70fd6781682f3b57fcf8290868f773" Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.675305 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dba547c713f3dc3fb4b7cd134966f3012d70fd6781682f3b57fcf8290868f773"} err="failed to get container status \"dba547c713f3dc3fb4b7cd134966f3012d70fd6781682f3b57fcf8290868f773\": rpc error: code = 
NotFound desc = could not find container \"dba547c713f3dc3fb4b7cd134966f3012d70fd6781682f3b57fcf8290868f773\": container with ID starting with dba547c713f3dc3fb4b7cd134966f3012d70fd6781682f3b57fcf8290868f773 not found: ID does not exist" Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.675331 4735 scope.go:117] "RemoveContainer" containerID="0c32b0561e314aa74f9602cc47a55588e33447b929f0779af7d6ee25ee8ff883" Oct 08 13:19:51 crc kubenswrapper[4735]: E1008 13:19:51.675734 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c32b0561e314aa74f9602cc47a55588e33447b929f0779af7d6ee25ee8ff883\": container with ID starting with 0c32b0561e314aa74f9602cc47a55588e33447b929f0779af7d6ee25ee8ff883 not found: ID does not exist" containerID="0c32b0561e314aa74f9602cc47a55588e33447b929f0779af7d6ee25ee8ff883" Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.675756 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c32b0561e314aa74f9602cc47a55588e33447b929f0779af7d6ee25ee8ff883"} err="failed to get container status \"0c32b0561e314aa74f9602cc47a55588e33447b929f0779af7d6ee25ee8ff883\": rpc error: code = NotFound desc = could not find container \"0c32b0561e314aa74f9602cc47a55588e33447b929f0779af7d6ee25ee8ff883\": container with ID starting with 0c32b0561e314aa74f9602cc47a55588e33447b929f0779af7d6ee25ee8ff883 not found: ID does not exist" Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.675770 4735 scope.go:117] "RemoveContainer" containerID="679f2e8556db1c2268b078a0be0be92aab3d54643a41d63dd4c9ad894c440cee" Oct 08 13:19:51 crc kubenswrapper[4735]: E1008 13:19:51.676018 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"679f2e8556db1c2268b078a0be0be92aab3d54643a41d63dd4c9ad894c440cee\": container with ID starting with 
679f2e8556db1c2268b078a0be0be92aab3d54643a41d63dd4c9ad894c440cee not found: ID does not exist" containerID="679f2e8556db1c2268b078a0be0be92aab3d54643a41d63dd4c9ad894c440cee" Oct 08 13:19:51 crc kubenswrapper[4735]: I1008 13:19:51.676035 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"679f2e8556db1c2268b078a0be0be92aab3d54643a41d63dd4c9ad894c440cee"} err="failed to get container status \"679f2e8556db1c2268b078a0be0be92aab3d54643a41d63dd4c9ad894c440cee\": rpc error: code = NotFound desc = could not find container \"679f2e8556db1c2268b078a0be0be92aab3d54643a41d63dd4c9ad894c440cee\": container with ID starting with 679f2e8556db1c2268b078a0be0be92aab3d54643a41d63dd4c9ad894c440cee not found: ID does not exist" Oct 08 13:19:52 crc kubenswrapper[4735]: I1008 13:19:52.903502 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22f4e804-ab3e-4858-87b4-84a310c21249" path="/var/lib/kubelet/pods/22f4e804-ab3e-4858-87b4-84a310c21249/volumes" Oct 08 13:19:52 crc kubenswrapper[4735]: I1008 13:19:52.904535 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb" path="/var/lib/kubelet/pods/b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb/volumes" Oct 08 13:19:52 crc kubenswrapper[4735]: I1008 13:19:52.905186 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" path="/var/lib/kubelet/pods/dc4e9066-e538-4d60-9c5a-3beb689d67a4/volumes" Oct 08 13:19:53 crc kubenswrapper[4735]: I1008 13:19:53.249657 4735 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod27af208c-711b-4423-bae1-91b55e84e14a"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod27af208c-711b-4423-bae1-91b55e84e14a] : Timed out while waiting for systemd to remove kubepods-besteffort-pod27af208c_711b_4423_bae1_91b55e84e14a.slice" Oct 08 13:19:53 crc kubenswrapper[4735]: 
E1008 13:19:53.249722 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod27af208c-711b-4423-bae1-91b55e84e14a] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod27af208c-711b-4423-bae1-91b55e84e14a] : Timed out while waiting for systemd to remove kubepods-besteffort-pod27af208c_711b_4423_bae1_91b55e84e14a.slice" pod="openstack/openstackclient" podUID="27af208c-711b-4423-bae1-91b55e84e14a" Oct 08 13:19:53 crc kubenswrapper[4735]: I1008 13:19:53.252808 4735 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod647cfc89-264d-4616-ab81-64a7c9aa51c5"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod647cfc89-264d-4616-ab81-64a7c9aa51c5] : Timed out while waiting for systemd to remove kubepods-besteffort-pod647cfc89_264d_4616_ab81_64a7c9aa51c5.slice" Oct 08 13:19:53 crc kubenswrapper[4735]: E1008 13:19:53.252954 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod647cfc89-264d-4616-ab81-64a7c9aa51c5] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod647cfc89-264d-4616-ab81-64a7c9aa51c5] : Timed out while waiting for systemd to remove kubepods-besteffort-pod647cfc89_264d_4616_ab81_64a7c9aa51c5.slice" pod="openstack/ovsdbserver-sb-0" podUID="647cfc89-264d-4616-ab81-64a7c9aa51c5" Oct 08 13:19:53 crc kubenswrapper[4735]: I1008 13:19:53.265940 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 08 13:19:53 crc kubenswrapper[4735]: I1008 13:19:53.265960 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 08 13:19:53 crc kubenswrapper[4735]: I1008 13:19:53.294039 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 08 13:19:53 crc kubenswrapper[4735]: I1008 13:19:53.301211 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 08 13:19:54 crc kubenswrapper[4735]: I1008 13:19:54.878514 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 13:19:54 crc kubenswrapper[4735]: I1008 13:19:54.878794 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 13:19:54 crc kubenswrapper[4735]: I1008 13:19:54.903625 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="647cfc89-264d-4616-ab81-64a7c9aa51c5" path="/var/lib/kubelet/pods/647cfc89-264d-4616-ab81-64a7c9aa51c5/volumes" Oct 08 13:20:24 crc kubenswrapper[4735]: I1008 13:20:24.879114 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 13:20:24 crc kubenswrapper[4735]: I1008 13:20:24.879776 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 13:20:24 crc kubenswrapper[4735]: I1008 13:20:24.879845 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" Oct 08 13:20:24 crc kubenswrapper[4735]: I1008 13:20:24.880565 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"271e91019953de2464d625764fc5ce2f322cb5494932f22012a17b3cd3952c78"} pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 13:20:24 crc kubenswrapper[4735]: I1008 13:20:24.880634 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" containerID="cri-o://271e91019953de2464d625764fc5ce2f322cb5494932f22012a17b3cd3952c78" gracePeriod=600 Oct 08 13:20:25 crc kubenswrapper[4735]: I1008 13:20:25.564235 4735 generic.go:334] "Generic (PLEG): container finished" podID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerID="271e91019953de2464d625764fc5ce2f322cb5494932f22012a17b3cd3952c78" exitCode=0 Oct 08 13:20:25 crc kubenswrapper[4735]: I1008 13:20:25.564302 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" event={"ID":"2dfd0474-f64a-4d3b-9990-df182d079b10","Type":"ContainerDied","Data":"271e91019953de2464d625764fc5ce2f322cb5494932f22012a17b3cd3952c78"} Oct 08 13:20:25 crc kubenswrapper[4735]: I1008 13:20:25.564762 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" 
event={"ID":"2dfd0474-f64a-4d3b-9990-df182d079b10","Type":"ContainerStarted","Data":"e875648415936f7b8301549fb9b938d29cca1f9c87a5286e73a78506e797f56a"} Oct 08 13:20:25 crc kubenswrapper[4735]: I1008 13:20:25.564808 4735 scope.go:117] "RemoveContainer" containerID="9f7b3e9d44554f3c931f516096c49ab670e3d36bde045f3e21679319bf30124a" Oct 08 13:21:04 crc kubenswrapper[4735]: I1008 13:21:04.802767 4735 scope.go:117] "RemoveContainer" containerID="18ecea8e5b37f608600ac08acead0be77a04c7cd2fcb23d381bcbe859074add1" Oct 08 13:21:04 crc kubenswrapper[4735]: I1008 13:21:04.837309 4735 scope.go:117] "RemoveContainer" containerID="213b4bd5d68f294cd07d89000b52b41d82b1fab369579e70e28047513a003eba" Oct 08 13:21:04 crc kubenswrapper[4735]: I1008 13:21:04.863296 4735 scope.go:117] "RemoveContainer" containerID="2216655515a74b06489f85b13335788b90aae03c4ad2036f42b36ac4d920c85b" Oct 08 13:21:04 crc kubenswrapper[4735]: I1008 13:21:04.899587 4735 scope.go:117] "RemoveContainer" containerID="8ad590cd163cd0b833533bdba0e94515d2b83d9f6ef232c27857edbcdd0d8dc7" Oct 08 13:21:04 crc kubenswrapper[4735]: I1008 13:21:04.937218 4735 scope.go:117] "RemoveContainer" containerID="6d1206de05e8eb6d4246b908ca40c1044b37559d02183b92feb25fb16c4e9316" Oct 08 13:21:04 crc kubenswrapper[4735]: I1008 13:21:04.963908 4735 scope.go:117] "RemoveContainer" containerID="bbca8dc5ac414db94afe3dc18ae340927ca57130f3e94f5435899ebab773806e" Oct 08 13:21:04 crc kubenswrapper[4735]: I1008 13:21:04.994781 4735 scope.go:117] "RemoveContainer" containerID="33ec5f1c7804a57807089942addba2fa4640a3d316649a488561eedd5c819134" Oct 08 13:21:05 crc kubenswrapper[4735]: I1008 13:21:05.014009 4735 scope.go:117] "RemoveContainer" containerID="bf84946c5afe4f0e453aa213bd2807af5b2ca49428a3592a40e601015d6e8947" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.891751 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mbbhw"] Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.892804 
4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="647cfc89-264d-4616-ab81-64a7c9aa51c5" containerName="openstack-network-exporter" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.892827 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="647cfc89-264d-4616-ab81-64a7c9aa51c5" containerName="openstack-network-exporter" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.892844 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70450186-9a74-4ba9-92f9-3f3a9626d0a0" containerName="placement-log" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.892857 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="70450186-9a74-4ba9-92f9-3f3a9626d0a0" containerName="placement-log" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.892877 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb" containerName="ovsdb-server" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.892889 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb" containerName="ovsdb-server" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.892910 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerName="container-updater" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.892921 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerName="container-updater" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.892939 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74ecad81-3dfd-4f36-be79-699b6bd30e90" containerName="neutron-httpd" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.892949 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="74ecad81-3dfd-4f36-be79-699b6bd30e90" containerName="neutron-httpd" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.892968 
4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4189bc5a-2f23-4b53-9df4-e97a65a7d8eb" containerName="mariadb-account-delete" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.892979 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="4189bc5a-2f23-4b53-9df4-e97a65a7d8eb" containerName="mariadb-account-delete" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.892992 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74ecad81-3dfd-4f36-be79-699b6bd30e90" containerName="neutron-api" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.893004 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="74ecad81-3dfd-4f36-be79-699b6bd30e90" containerName="neutron-api" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.893022 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b1303e4-c7b5-456a-aba6-c4674b8bcfed" containerName="ceilometer-central-agent" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.893032 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b1303e4-c7b5-456a-aba6-c4674b8bcfed" containerName="ceilometer-central-agent" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.893051 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2428d97-aa22-413f-ab58-5b077fdde95c" containerName="mysql-bootstrap" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.893062 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2428d97-aa22-413f-ab58-5b077fdde95c" containerName="mysql-bootstrap" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.893073 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerName="object-server" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.893083 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerName="object-server" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 
13:21:22.893105 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6927451-d206-4948-a43e-5b4ee59dddcc" containerName="memcached" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.893118 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6927451-d206-4948-a43e-5b4ee59dddcc" containerName="memcached" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.893139 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2428d97-aa22-413f-ab58-5b077fdde95c" containerName="galera" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.893149 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2428d97-aa22-413f-ab58-5b077fdde95c" containerName="galera" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.893172 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ada58119-3f96-4962-b214-fb2662c3d935" containerName="barbican-worker" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.893182 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="ada58119-3f96-4962-b214-fb2662c3d935" containerName="barbican-worker" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.893202 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f92cd291-9315-4b44-b318-6f029f253de6" containerName="mariadb-account-delete" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.893213 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f92cd291-9315-4b44-b318-6f029f253de6" containerName="mariadb-account-delete" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.893230 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerName="container-auditor" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.893241 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerName="container-auditor" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.893260 4735 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01ad2819-cfc3-4416-8a27-ecb65c2fa366" containerName="rabbitmq" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.893272 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="01ad2819-cfc3-4416-8a27-ecb65c2fa366" containerName="rabbitmq" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.893288 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca78485a-0626-420e-aa59-a9a4680df83c" containerName="barbican-api" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.893299 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca78485a-0626-420e-aa59-a9a4680df83c" containerName="barbican-api" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.893319 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ada58119-3f96-4962-b214-fb2662c3d935" containerName="barbican-worker-log" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.893329 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="ada58119-3f96-4962-b214-fb2662c3d935" containerName="barbican-worker-log" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.893345 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f333daa-559f-421a-b0a9-fa1de98b72cf" containerName="dnsmasq-dns" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.893355 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f333daa-559f-421a-b0a9-fa1de98b72cf" containerName="dnsmasq-dns" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.893376 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e294ddd-2861-4cfb-97b5-d4283cc37e4e" containerName="openstack-network-exporter" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.893387 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e294ddd-2861-4cfb-97b5-d4283cc37e4e" containerName="openstack-network-exporter" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.893408 4735 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52f1329e-e0dc-4295-8dfc-92fae02065f2" containerName="ovsdbserver-nb" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.893418 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="52f1329e-e0dc-4295-8dfc-92fae02065f2" containerName="ovsdbserver-nb" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.893439 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22f4e804-ab3e-4858-87b4-84a310c21249" containerName="probe" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.893450 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="22f4e804-ab3e-4858-87b4-84a310c21249" containerName="probe" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.893469 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8c473e5-0132-4ee7-a14e-ada0780d358c" containerName="glance-httpd" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.893501 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8c473e5-0132-4ee7-a14e-ada0780d358c" containerName="glance-httpd" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.893522 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b1303e4-c7b5-456a-aba6-c4674b8bcfed" containerName="sg-core" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.893532 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b1303e4-c7b5-456a-aba6-c4674b8bcfed" containerName="sg-core" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.893554 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53890bab-61cc-4f19-908a-33c085da8d33" containerName="nova-scheduler-scheduler" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.893566 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="53890bab-61cc-4f19-908a-33c085da8d33" containerName="nova-scheduler-scheduler" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.893581 4735 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="97384a50-9f20-4cbf-8b02-afffe5ddc791" containerName="nova-api-api" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.893591 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="97384a50-9f20-4cbf-8b02-afffe5ddc791" containerName="nova-api-api" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.893614 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b111975-076c-4c7f-bcbd-ecef0b62936f" containerName="barbican-keystone-listener" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.893625 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b111975-076c-4c7f-bcbd-ecef0b62936f" containerName="barbican-keystone-listener" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.893645 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8e2103e-dd2c-4e18-b66b-0555a1c8f43a" containerName="kube-state-metrics" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.893656 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8e2103e-dd2c-4e18-b66b-0555a1c8f43a" containerName="kube-state-metrics" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.893676 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2" containerName="cinder-api" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.893688 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2" containerName="cinder-api" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.893710 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01ad2819-cfc3-4416-8a27-ecb65c2fa366" containerName="setup-container" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.893720 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="01ad2819-cfc3-4416-8a27-ecb65c2fa366" containerName="setup-container" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.893738 4735 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="5516bd19-016a-4d52-84c9-b3801e1841ca" containerName="proxy-server" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.893748 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5516bd19-016a-4d52-84c9-b3801e1841ca" containerName="proxy-server" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.893760 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerName="object-auditor" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.893771 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerName="object-auditor" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.893789 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerName="object-expirer" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.893800 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerName="object-expirer" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.893816 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb" containerName="ovsdb-server-init" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.893826 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb" containerName="ovsdb-server-init" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.893847 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0cb7948-9a3d-40fa-b4e7-bb6b765af52f" containerName="glance-httpd" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.893858 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0cb7948-9a3d-40fa-b4e7-bb6b765af52f" containerName="glance-httpd" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.893880 4735 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ca78485a-0626-420e-aa59-a9a4680df83c" containerName="barbican-api-log" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.893891 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca78485a-0626-420e-aa59-a9a4680df83c" containerName="barbican-api-log" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.893911 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6" containerName="openstack-network-exporter" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.893923 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6" containerName="openstack-network-exporter" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.893943 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerName="object-replicator" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.893954 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerName="object-replicator" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.893970 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerName="object-updater" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.893981 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerName="object-updater" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.893994 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="644f0099-ec9c-42af-bbe8-54fe5b603bb3" containerName="setup-container" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.894005 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="644f0099-ec9c-42af-bbe8-54fe5b603bb3" containerName="setup-container" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.894021 4735 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="9d6e2da8-2dc0-4642-b216-218ed61cc813" containerName="mariadb-account-delete" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.894031 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d6e2da8-2dc0-4642-b216-218ed61cc813" containerName="mariadb-account-delete" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.894047 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2868ea84-fef0-4728-9a62-e2ba97a9aae1" containerName="mysql-bootstrap" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.894057 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="2868ea84-fef0-4728-9a62-e2ba97a9aae1" containerName="mysql-bootstrap" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.894075 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1db01523-cd76-4a9d-8063-e024dd919879" containerName="nova-metadata-metadata" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.894086 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="1db01523-cd76-4a9d-8063-e024dd919879" containerName="nova-metadata-metadata" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.894105 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="477f295c-5e96-4f47-8caa-f7d36b074cdd" containerName="keystone-api" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.894116 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="477f295c-5e96-4f47-8caa-f7d36b074cdd" containerName="keystone-api" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.894150 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52f1329e-e0dc-4295-8dfc-92fae02065f2" containerName="openstack-network-exporter" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.894162 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="52f1329e-e0dc-4295-8dfc-92fae02065f2" containerName="openstack-network-exporter" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.894176 4735 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b1303e4-c7b5-456a-aba6-c4674b8bcfed" containerName="ceilometer-notification-agent" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.894187 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b1303e4-c7b5-456a-aba6-c4674b8bcfed" containerName="ceilometer-notification-agent" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.894205 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="983bfc3d-43e8-4190-854d-c3e9497754ae" containerName="extract-utilities" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.894214 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="983bfc3d-43e8-4190-854d-c3e9497754ae" containerName="extract-utilities" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.894225 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerName="account-server" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.894234 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerName="account-server" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.894248 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerName="rsync" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.894255 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerName="rsync" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.894268 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73b19a4f-4794-49fc-bfd9-727618f11e7c" containerName="ovn-controller" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.894276 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="73b19a4f-4794-49fc-bfd9-727618f11e7c" containerName="ovn-controller" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.894286 4735 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84fa39bb-f66a-49ed-960c-7598ec87854a" containerName="nova-cell0-conductor-conductor" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.894294 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="84fa39bb-f66a-49ed-960c-7598ec87854a" containerName="nova-cell0-conductor-conductor" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.894305 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97384a50-9f20-4cbf-8b02-afffe5ddc791" containerName="nova-api-log" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.894313 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="97384a50-9f20-4cbf-8b02-afffe5ddc791" containerName="nova-api-log" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.894323 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerName="container-replicator" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.894331 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerName="container-replicator" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.894349 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerName="account-auditor" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.894357 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerName="account-auditor" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.894372 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2" containerName="cinder-api-log" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.894380 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2" containerName="cinder-api-log" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 
13:21:22.894395 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22f4e804-ab3e-4858-87b4-84a310c21249" containerName="cinder-scheduler" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.894403 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="22f4e804-ab3e-4858-87b4-84a310c21249" containerName="cinder-scheduler" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.894414 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0cb7948-9a3d-40fa-b4e7-bb6b765af52f" containerName="glance-log" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.894423 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0cb7948-9a3d-40fa-b4e7-bb6b765af52f" containerName="glance-log" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.894437 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8c473e5-0132-4ee7-a14e-ada0780d358c" containerName="glance-log" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.894445 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8c473e5-0132-4ee7-a14e-ada0780d358c" containerName="glance-log" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.894454 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb" containerName="ovs-vswitchd" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.894462 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb" containerName="ovs-vswitchd" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.894471 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2868ea84-fef0-4728-9a62-e2ba97a9aae1" containerName="galera" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.894560 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="2868ea84-fef0-4728-9a62-e2ba97a9aae1" containerName="galera" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.894575 4735 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerName="swift-recon-cron" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.894584 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerName="swift-recon-cron" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.894595 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f333daa-559f-421a-b0a9-fa1de98b72cf" containerName="init" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.894603 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f333daa-559f-421a-b0a9-fa1de98b72cf" containerName="init" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.894617 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerName="account-reaper" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.894625 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerName="account-reaper" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.894640 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b111975-076c-4c7f-bcbd-ecef0b62936f" containerName="barbican-keystone-listener-log" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.894650 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b111975-076c-4c7f-bcbd-ecef0b62936f" containerName="barbican-keystone-listener-log" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.894664 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31465d06-bf56-4173-95c1-3c13d776b1f9" containerName="nova-cell1-novncproxy-novncproxy" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.894675 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="31465d06-bf56-4173-95c1-3c13d776b1f9" containerName="nova-cell1-novncproxy-novncproxy" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 
13:21:22.894691 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b1303e4-c7b5-456a-aba6-c4674b8bcfed" containerName="proxy-httpd" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.894702 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b1303e4-c7b5-456a-aba6-c4674b8bcfed" containerName="proxy-httpd" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.894713 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="983bfc3d-43e8-4190-854d-c3e9497754ae" containerName="registry-server" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.894724 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="983bfc3d-43e8-4190-854d-c3e9497754ae" containerName="registry-server" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.894746 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="644f0099-ec9c-42af-bbe8-54fe5b603bb3" containerName="rabbitmq" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.894754 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="644f0099-ec9c-42af-bbe8-54fe5b603bb3" containerName="rabbitmq" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.894767 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70450186-9a74-4ba9-92f9-3f3a9626d0a0" containerName="placement-api" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.894776 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="70450186-9a74-4ba9-92f9-3f3a9626d0a0" containerName="placement-api" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.894792 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="983bfc3d-43e8-4190-854d-c3e9497754ae" containerName="extract-content" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.894801 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="983bfc3d-43e8-4190-854d-c3e9497754ae" containerName="extract-content" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.894815 4735 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="f952e88e-fe68-4770-874c-fdee88d43aee" containerName="mariadb-account-delete" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.894824 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f952e88e-fe68-4770-874c-fdee88d43aee" containerName="mariadb-account-delete" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.894835 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5516bd19-016a-4d52-84c9-b3801e1841ca" containerName="proxy-httpd" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.894844 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5516bd19-016a-4d52-84c9-b3801e1841ca" containerName="proxy-httpd" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.894857 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerName="account-replicator" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.894865 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerName="account-replicator" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.894878 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0172640d-d560-4ef0-a7e4-b649ccfa92e4" containerName="mariadb-account-delete" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.894887 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="0172640d-d560-4ef0-a7e4-b649ccfa92e4" containerName="mariadb-account-delete" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.894899 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8953f639-e468-4f85-8093-6c29fa380d8a" containerName="nova-cell1-conductor-conductor" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.894907 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="8953f639-e468-4f85-8093-6c29fa380d8a" containerName="nova-cell1-conductor-conductor" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 
13:21:22.894917 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="647cfc89-264d-4616-ab81-64a7c9aa51c5" containerName="ovsdbserver-sb" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.894925 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="647cfc89-264d-4616-ab81-64a7c9aa51c5" containerName="ovsdbserver-sb" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.894934 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6" containerName="ovn-northd" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.894942 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6" containerName="ovn-northd" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.894952 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1db01523-cd76-4a9d-8063-e024dd919879" containerName="nova-metadata-log" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.894960 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="1db01523-cd76-4a9d-8063-e024dd919879" containerName="nova-metadata-log" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.894974 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerName="container-server" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.894983 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerName="container-server" Oct 08 13:21:22 crc kubenswrapper[4735]: E1008 13:21:22.894994 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fe2bc1d-7e65-45ba-9d42-cae899859258" containerName="mariadb-account-delete" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.895002 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fe2bc1d-7e65-45ba-9d42-cae899859258" containerName="mariadb-account-delete" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 
13:21:22.895188 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="644f0099-ec9c-42af-bbe8-54fe5b603bb3" containerName="rabbitmq" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.895202 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8e2103e-dd2c-4e18-b66b-0555a1c8f43a" containerName="kube-state-metrics" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.895218 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="22f4e804-ab3e-4858-87b4-84a310c21249" containerName="probe" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.895230 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca78485a-0626-420e-aa59-a9a4680df83c" containerName="barbican-api-log" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.895246 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2" containerName="cinder-api-log" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.895262 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e294ddd-2861-4cfb-97b5-d4283cc37e4e" containerName="openstack-network-exporter" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.895272 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8c473e5-0132-4ee7-a14e-ada0780d358c" containerName="glance-httpd" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.895285 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8c473e5-0132-4ee7-a14e-ada0780d358c" containerName="glance-log" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.895297 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="01ad2819-cfc3-4416-8a27-ecb65c2fa366" containerName="rabbitmq" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.895305 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerName="container-server" Oct 08 13:21:22 crc 
kubenswrapper[4735]: I1008 13:21:22.895316 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="97384a50-9f20-4cbf-8b02-afffe5ddc791" containerName="nova-api-api" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.895329 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerName="object-server" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.895341 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="ada58119-3f96-4962-b214-fb2662c3d935" containerName="barbican-worker" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.895349 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="84fa39bb-f66a-49ed-960c-7598ec87854a" containerName="nova-cell0-conductor-conductor" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.895359 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="f92cd291-9315-4b44-b318-6f029f253de6" containerName="mariadb-account-delete" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.895371 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="8953f639-e468-4f85-8093-6c29fa380d8a" containerName="nova-cell1-conductor-conductor" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.895423 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="2868ea84-fef0-4728-9a62-e2ba97a9aae1" containerName="galera" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.895433 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerName="account-server" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.895445 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="ada58119-3f96-4962-b214-fb2662c3d935" containerName="barbican-worker-log" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.895453 4735 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="647cfc89-264d-4616-ab81-64a7c9aa51c5" containerName="openstack-network-exporter" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.895462 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f333daa-559f-421a-b0a9-fa1de98b72cf" containerName="dnsmasq-dns" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.895490 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b1303e4-c7b5-456a-aba6-c4674b8bcfed" containerName="ceilometer-central-agent" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.895504 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb" containerName="ovsdb-server" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.895516 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="70450186-9a74-4ba9-92f9-3f3a9626d0a0" containerName="placement-log" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.895526 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="477f295c-5e96-4f47-8caa-f7d36b074cdd" containerName="keystone-api" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.895538 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0cb7948-9a3d-40fa-b4e7-bb6b765af52f" containerName="glance-httpd" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.895546 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="22f4e804-ab3e-4858-87b4-84a310c21249" containerName="cinder-scheduler" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.895561 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerName="swift-recon-cron" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.895575 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="4189bc5a-2f23-4b53-9df4-e97a65a7d8eb" containerName="mariadb-account-delete" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.895588 4735 
memory_manager.go:354] "RemoveStaleState removing state" podUID="9b111975-076c-4c7f-bcbd-ecef0b62936f" containerName="barbican-keystone-listener" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.895602 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerName="object-updater" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.895613 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="983bfc3d-43e8-4190-854d-c3e9497754ae" containerName="registry-server" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.895623 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="53890bab-61cc-4f19-908a-33c085da8d33" containerName="nova-scheduler-scheduler" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.895632 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="31465d06-bf56-4173-95c1-3c13d776b1f9" containerName="nova-cell1-novncproxy-novncproxy" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.895645 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="0172640d-d560-4ef0-a7e4-b649ccfa92e4" containerName="mariadb-account-delete" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.895657 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerName="account-reaper" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.895668 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="5516bd19-016a-4d52-84c9-b3801e1841ca" containerName="proxy-httpd" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.895683 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="74ecad81-3dfd-4f36-be79-699b6bd30e90" containerName="neutron-httpd" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.895693 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" 
containerName="container-replicator" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.895702 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerName="object-auditor" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.895712 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="70450186-9a74-4ba9-92f9-3f3a9626d0a0" containerName="placement-api" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.895729 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerName="object-expirer" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.895754 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerName="container-updater" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.895772 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerName="rsync" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.895785 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="97384a50-9f20-4cbf-8b02-afffe5ddc791" containerName="nova-api-log" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.895800 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerName="object-replicator" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.895818 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0cb7948-9a3d-40fa-b4e7-bb6b765af52f" containerName="glance-log" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.895832 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b1303e4-c7b5-456a-aba6-c4674b8bcfed" containerName="ceilometer-notification-agent" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.895846 4735 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="1db01523-cd76-4a9d-8063-e024dd919879" containerName="nova-metadata-log" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.895856 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bc218ce-b4bf-4477-afaa-ff9e78a7c1d2" containerName="cinder-api" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.895867 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b1303e4-c7b5-456a-aba6-c4674b8bcfed" containerName="proxy-httpd" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.895885 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca78485a-0626-420e-aa59-a9a4680df83c" containerName="barbican-api" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.895900 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2428d97-aa22-413f-ab58-5b077fdde95c" containerName="galera" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.895916 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6" containerName="openstack-network-exporter" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.895929 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="74ecad81-3dfd-4f36-be79-699b6bd30e90" containerName="neutron-api" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.895942 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="647cfc89-264d-4616-ab81-64a7c9aa51c5" containerName="ovsdbserver-sb" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.895960 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="5516bd19-016a-4d52-84c9-b3801e1841ca" containerName="proxy-server" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.895974 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="73b19a4f-4794-49fc-bfd9-727618f11e7c" containerName="ovn-controller" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.895983 4735 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="52f1329e-e0dc-4295-8dfc-92fae02065f2" containerName="openstack-network-exporter" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.895993 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fe2bc1d-7e65-45ba-9d42-cae899859258" containerName="mariadb-account-delete" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.896003 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3d3f8ec-fa4f-4d33-8a33-eac3edce38eb" containerName="ovs-vswitchd" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.896016 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b1303e4-c7b5-456a-aba6-c4674b8bcfed" containerName="sg-core" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.896028 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="52f1329e-e0dc-4295-8dfc-92fae02065f2" containerName="ovsdbserver-nb" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.896037 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5a1e0f9-279f-4cb5-9e9f-fd91fc39c3a6" containerName="ovn-northd" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.896050 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b111975-076c-4c7f-bcbd-ecef0b62936f" containerName="barbican-keystone-listener-log" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.896061 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="f952e88e-fe68-4770-874c-fdee88d43aee" containerName="mariadb-account-delete" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.896072 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerName="account-replicator" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.896081 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d6e2da8-2dc0-4642-b216-218ed61cc813" containerName="mariadb-account-delete" Oct 08 13:21:22 crc kubenswrapper[4735]: 
I1008 13:21:22.896091 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerName="container-auditor" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.896102 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="1db01523-cd76-4a9d-8063-e024dd919879" containerName="nova-metadata-metadata" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.896115 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc4e9066-e538-4d60-9c5a-3beb689d67a4" containerName="account-auditor" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.896125 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6927451-d206-4948-a43e-5b4ee59dddcc" containerName="memcached" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.897550 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mbbhw" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.913184 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mbbhw"] Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.992346 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/016286f0-c81e-403e-a3b7-b839f7e41f78-utilities\") pod \"certified-operators-mbbhw\" (UID: \"016286f0-c81e-403e-a3b7-b839f7e41f78\") " pod="openshift-marketplace/certified-operators-mbbhw" Oct 08 13:21:22 crc kubenswrapper[4735]: I1008 13:21:22.992398 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdwrg\" (UniqueName: \"kubernetes.io/projected/016286f0-c81e-403e-a3b7-b839f7e41f78-kube-api-access-hdwrg\") pod \"certified-operators-mbbhw\" (UID: \"016286f0-c81e-403e-a3b7-b839f7e41f78\") " pod="openshift-marketplace/certified-operators-mbbhw" Oct 08 13:21:22 crc 
kubenswrapper[4735]: I1008 13:21:22.992546 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/016286f0-c81e-403e-a3b7-b839f7e41f78-catalog-content\") pod \"certified-operators-mbbhw\" (UID: \"016286f0-c81e-403e-a3b7-b839f7e41f78\") " pod="openshift-marketplace/certified-operators-mbbhw" Oct 08 13:21:23 crc kubenswrapper[4735]: I1008 13:21:23.094637 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/016286f0-c81e-403e-a3b7-b839f7e41f78-utilities\") pod \"certified-operators-mbbhw\" (UID: \"016286f0-c81e-403e-a3b7-b839f7e41f78\") " pod="openshift-marketplace/certified-operators-mbbhw" Oct 08 13:21:23 crc kubenswrapper[4735]: I1008 13:21:23.094702 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdwrg\" (UniqueName: \"kubernetes.io/projected/016286f0-c81e-403e-a3b7-b839f7e41f78-kube-api-access-hdwrg\") pod \"certified-operators-mbbhw\" (UID: \"016286f0-c81e-403e-a3b7-b839f7e41f78\") " pod="openshift-marketplace/certified-operators-mbbhw" Oct 08 13:21:23 crc kubenswrapper[4735]: I1008 13:21:23.094770 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/016286f0-c81e-403e-a3b7-b839f7e41f78-catalog-content\") pod \"certified-operators-mbbhw\" (UID: \"016286f0-c81e-403e-a3b7-b839f7e41f78\") " pod="openshift-marketplace/certified-operators-mbbhw" Oct 08 13:21:23 crc kubenswrapper[4735]: I1008 13:21:23.095301 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/016286f0-c81e-403e-a3b7-b839f7e41f78-catalog-content\") pod \"certified-operators-mbbhw\" (UID: \"016286f0-c81e-403e-a3b7-b839f7e41f78\") " pod="openshift-marketplace/certified-operators-mbbhw" Oct 08 13:21:23 crc 
kubenswrapper[4735]: I1008 13:21:23.095588 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/016286f0-c81e-403e-a3b7-b839f7e41f78-utilities\") pod \"certified-operators-mbbhw\" (UID: \"016286f0-c81e-403e-a3b7-b839f7e41f78\") " pod="openshift-marketplace/certified-operators-mbbhw" Oct 08 13:21:23 crc kubenswrapper[4735]: I1008 13:21:23.116223 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdwrg\" (UniqueName: \"kubernetes.io/projected/016286f0-c81e-403e-a3b7-b839f7e41f78-kube-api-access-hdwrg\") pod \"certified-operators-mbbhw\" (UID: \"016286f0-c81e-403e-a3b7-b839f7e41f78\") " pod="openshift-marketplace/certified-operators-mbbhw" Oct 08 13:21:23 crc kubenswrapper[4735]: I1008 13:21:23.218932 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mbbhw" Oct 08 13:21:23 crc kubenswrapper[4735]: I1008 13:21:23.729313 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mbbhw"] Oct 08 13:21:24 crc kubenswrapper[4735]: I1008 13:21:24.054020 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mbbhw" event={"ID":"016286f0-c81e-403e-a3b7-b839f7e41f78","Type":"ContainerStarted","Data":"b64cfdc4c3ea137f3e978394f83d6f0bb4768fc3dc2ee8d60a80ccce6dc423b9"} Oct 08 13:21:24 crc kubenswrapper[4735]: I1008 13:21:24.054084 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mbbhw" event={"ID":"016286f0-c81e-403e-a3b7-b839f7e41f78","Type":"ContainerStarted","Data":"64427ad696b5c7d283eb49c1fd892fba677ab2e4bda8d1ee1f985967a80b0bf7"} Oct 08 13:21:25 crc kubenswrapper[4735]: I1008 13:21:25.064248 4735 generic.go:334] "Generic (PLEG): container finished" podID="016286f0-c81e-403e-a3b7-b839f7e41f78" 
containerID="b64cfdc4c3ea137f3e978394f83d6f0bb4768fc3dc2ee8d60a80ccce6dc423b9" exitCode=0 Oct 08 13:21:25 crc kubenswrapper[4735]: I1008 13:21:25.064575 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mbbhw" event={"ID":"016286f0-c81e-403e-a3b7-b839f7e41f78","Type":"ContainerDied","Data":"b64cfdc4c3ea137f3e978394f83d6f0bb4768fc3dc2ee8d60a80ccce6dc423b9"} Oct 08 13:21:26 crc kubenswrapper[4735]: I1008 13:21:26.076094 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mbbhw" event={"ID":"016286f0-c81e-403e-a3b7-b839f7e41f78","Type":"ContainerStarted","Data":"e23820e0ba5596e64b3b02008553adbe97b067c89dd07ad815a8d5feee2107d4"} Oct 08 13:21:29 crc kubenswrapper[4735]: I1008 13:21:29.105186 4735 generic.go:334] "Generic (PLEG): container finished" podID="016286f0-c81e-403e-a3b7-b839f7e41f78" containerID="e23820e0ba5596e64b3b02008553adbe97b067c89dd07ad815a8d5feee2107d4" exitCode=0 Oct 08 13:21:29 crc kubenswrapper[4735]: I1008 13:21:29.105269 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mbbhw" event={"ID":"016286f0-c81e-403e-a3b7-b839f7e41f78","Type":"ContainerDied","Data":"e23820e0ba5596e64b3b02008553adbe97b067c89dd07ad815a8d5feee2107d4"} Oct 08 13:21:30 crc kubenswrapper[4735]: I1008 13:21:30.116195 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mbbhw" event={"ID":"016286f0-c81e-403e-a3b7-b839f7e41f78","Type":"ContainerStarted","Data":"1f6752706c90404f08bf09291c2e3f752f0d54feaf21d9dad355fff1e38fcd90"} Oct 08 13:21:30 crc kubenswrapper[4735]: I1008 13:21:30.135620 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mbbhw" podStartSLOduration=3.5348914750000002 podStartE2EDuration="8.135599482s" podCreationTimestamp="2025-10-08 13:21:22 +0000 UTC" firstStartedPulling="2025-10-08 
13:21:25.0661712 +0000 UTC m=+1534.874475744" lastFinishedPulling="2025-10-08 13:21:29.666879207 +0000 UTC m=+1539.475183751" observedRunningTime="2025-10-08 13:21:30.134689229 +0000 UTC m=+1539.942993773" watchObservedRunningTime="2025-10-08 13:21:30.135599482 +0000 UTC m=+1539.943904036" Oct 08 13:21:33 crc kubenswrapper[4735]: I1008 13:21:33.219552 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mbbhw" Oct 08 13:21:33 crc kubenswrapper[4735]: I1008 13:21:33.219908 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mbbhw" Oct 08 13:21:33 crc kubenswrapper[4735]: I1008 13:21:33.258554 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mbbhw" Oct 08 13:21:34 crc kubenswrapper[4735]: I1008 13:21:34.194365 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mbbhw" Oct 08 13:21:34 crc kubenswrapper[4735]: I1008 13:21:34.248894 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mbbhw"] Oct 08 13:21:36 crc kubenswrapper[4735]: I1008 13:21:36.170068 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mbbhw" podUID="016286f0-c81e-403e-a3b7-b839f7e41f78" containerName="registry-server" containerID="cri-o://1f6752706c90404f08bf09291c2e3f752f0d54feaf21d9dad355fff1e38fcd90" gracePeriod=2 Oct 08 13:21:36 crc kubenswrapper[4735]: I1008 13:21:36.677337 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mbbhw" Oct 08 13:21:36 crc kubenswrapper[4735]: I1008 13:21:36.876518 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hdwrg\" (UniqueName: \"kubernetes.io/projected/016286f0-c81e-403e-a3b7-b839f7e41f78-kube-api-access-hdwrg\") pod \"016286f0-c81e-403e-a3b7-b839f7e41f78\" (UID: \"016286f0-c81e-403e-a3b7-b839f7e41f78\") " Oct 08 13:21:36 crc kubenswrapper[4735]: I1008 13:21:36.876600 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/016286f0-c81e-403e-a3b7-b839f7e41f78-utilities\") pod \"016286f0-c81e-403e-a3b7-b839f7e41f78\" (UID: \"016286f0-c81e-403e-a3b7-b839f7e41f78\") " Oct 08 13:21:36 crc kubenswrapper[4735]: I1008 13:21:36.876652 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/016286f0-c81e-403e-a3b7-b839f7e41f78-catalog-content\") pod \"016286f0-c81e-403e-a3b7-b839f7e41f78\" (UID: \"016286f0-c81e-403e-a3b7-b839f7e41f78\") " Oct 08 13:21:36 crc kubenswrapper[4735]: I1008 13:21:36.878084 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/016286f0-c81e-403e-a3b7-b839f7e41f78-utilities" (OuterVolumeSpecName: "utilities") pod "016286f0-c81e-403e-a3b7-b839f7e41f78" (UID: "016286f0-c81e-403e-a3b7-b839f7e41f78"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:21:36 crc kubenswrapper[4735]: I1008 13:21:36.885709 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/016286f0-c81e-403e-a3b7-b839f7e41f78-kube-api-access-hdwrg" (OuterVolumeSpecName: "kube-api-access-hdwrg") pod "016286f0-c81e-403e-a3b7-b839f7e41f78" (UID: "016286f0-c81e-403e-a3b7-b839f7e41f78"). InnerVolumeSpecName "kube-api-access-hdwrg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:21:36 crc kubenswrapper[4735]: I1008 13:21:36.930410 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/016286f0-c81e-403e-a3b7-b839f7e41f78-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "016286f0-c81e-403e-a3b7-b839f7e41f78" (UID: "016286f0-c81e-403e-a3b7-b839f7e41f78"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:21:36 crc kubenswrapper[4735]: I1008 13:21:36.977931 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hdwrg\" (UniqueName: \"kubernetes.io/projected/016286f0-c81e-403e-a3b7-b839f7e41f78-kube-api-access-hdwrg\") on node \"crc\" DevicePath \"\"" Oct 08 13:21:36 crc kubenswrapper[4735]: I1008 13:21:36.977964 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/016286f0-c81e-403e-a3b7-b839f7e41f78-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 13:21:36 crc kubenswrapper[4735]: I1008 13:21:36.977973 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/016286f0-c81e-403e-a3b7-b839f7e41f78-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 13:21:37 crc kubenswrapper[4735]: I1008 13:21:37.180950 4735 generic.go:334] "Generic (PLEG): container finished" podID="016286f0-c81e-403e-a3b7-b839f7e41f78" containerID="1f6752706c90404f08bf09291c2e3f752f0d54feaf21d9dad355fff1e38fcd90" exitCode=0 Oct 08 13:21:37 crc kubenswrapper[4735]: I1008 13:21:37.180991 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mbbhw" event={"ID":"016286f0-c81e-403e-a3b7-b839f7e41f78","Type":"ContainerDied","Data":"1f6752706c90404f08bf09291c2e3f752f0d54feaf21d9dad355fff1e38fcd90"} Oct 08 13:21:37 crc kubenswrapper[4735]: I1008 13:21:37.181003 4735 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mbbhw" Oct 08 13:21:37 crc kubenswrapper[4735]: I1008 13:21:37.181014 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mbbhw" event={"ID":"016286f0-c81e-403e-a3b7-b839f7e41f78","Type":"ContainerDied","Data":"64427ad696b5c7d283eb49c1fd892fba677ab2e4bda8d1ee1f985967a80b0bf7"} Oct 08 13:21:37 crc kubenswrapper[4735]: I1008 13:21:37.181030 4735 scope.go:117] "RemoveContainer" containerID="1f6752706c90404f08bf09291c2e3f752f0d54feaf21d9dad355fff1e38fcd90" Oct 08 13:21:37 crc kubenswrapper[4735]: I1008 13:21:37.204390 4735 scope.go:117] "RemoveContainer" containerID="e23820e0ba5596e64b3b02008553adbe97b067c89dd07ad815a8d5feee2107d4" Oct 08 13:21:37 crc kubenswrapper[4735]: I1008 13:21:37.231931 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mbbhw"] Oct 08 13:21:37 crc kubenswrapper[4735]: I1008 13:21:37.235387 4735 scope.go:117] "RemoveContainer" containerID="b64cfdc4c3ea137f3e978394f83d6f0bb4768fc3dc2ee8d60a80ccce6dc423b9" Oct 08 13:21:37 crc kubenswrapper[4735]: I1008 13:21:37.237625 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mbbhw"] Oct 08 13:21:37 crc kubenswrapper[4735]: I1008 13:21:37.257588 4735 scope.go:117] "RemoveContainer" containerID="1f6752706c90404f08bf09291c2e3f752f0d54feaf21d9dad355fff1e38fcd90" Oct 08 13:21:37 crc kubenswrapper[4735]: E1008 13:21:37.258085 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f6752706c90404f08bf09291c2e3f752f0d54feaf21d9dad355fff1e38fcd90\": container with ID starting with 1f6752706c90404f08bf09291c2e3f752f0d54feaf21d9dad355fff1e38fcd90 not found: ID does not exist" containerID="1f6752706c90404f08bf09291c2e3f752f0d54feaf21d9dad355fff1e38fcd90" Oct 08 13:21:37 crc kubenswrapper[4735]: I1008 13:21:37.258189 
4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f6752706c90404f08bf09291c2e3f752f0d54feaf21d9dad355fff1e38fcd90"} err="failed to get container status \"1f6752706c90404f08bf09291c2e3f752f0d54feaf21d9dad355fff1e38fcd90\": rpc error: code = NotFound desc = could not find container \"1f6752706c90404f08bf09291c2e3f752f0d54feaf21d9dad355fff1e38fcd90\": container with ID starting with 1f6752706c90404f08bf09291c2e3f752f0d54feaf21d9dad355fff1e38fcd90 not found: ID does not exist" Oct 08 13:21:37 crc kubenswrapper[4735]: I1008 13:21:37.258266 4735 scope.go:117] "RemoveContainer" containerID="e23820e0ba5596e64b3b02008553adbe97b067c89dd07ad815a8d5feee2107d4" Oct 08 13:21:37 crc kubenswrapper[4735]: E1008 13:21:37.258880 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e23820e0ba5596e64b3b02008553adbe97b067c89dd07ad815a8d5feee2107d4\": container with ID starting with e23820e0ba5596e64b3b02008553adbe97b067c89dd07ad815a8d5feee2107d4 not found: ID does not exist" containerID="e23820e0ba5596e64b3b02008553adbe97b067c89dd07ad815a8d5feee2107d4" Oct 08 13:21:37 crc kubenswrapper[4735]: I1008 13:21:37.258961 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e23820e0ba5596e64b3b02008553adbe97b067c89dd07ad815a8d5feee2107d4"} err="failed to get container status \"e23820e0ba5596e64b3b02008553adbe97b067c89dd07ad815a8d5feee2107d4\": rpc error: code = NotFound desc = could not find container \"e23820e0ba5596e64b3b02008553adbe97b067c89dd07ad815a8d5feee2107d4\": container with ID starting with e23820e0ba5596e64b3b02008553adbe97b067c89dd07ad815a8d5feee2107d4 not found: ID does not exist" Oct 08 13:21:37 crc kubenswrapper[4735]: I1008 13:21:37.259027 4735 scope.go:117] "RemoveContainer" containerID="b64cfdc4c3ea137f3e978394f83d6f0bb4768fc3dc2ee8d60a80ccce6dc423b9" Oct 08 13:21:37 crc kubenswrapper[4735]: E1008 
13:21:37.259775 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b64cfdc4c3ea137f3e978394f83d6f0bb4768fc3dc2ee8d60a80ccce6dc423b9\": container with ID starting with b64cfdc4c3ea137f3e978394f83d6f0bb4768fc3dc2ee8d60a80ccce6dc423b9 not found: ID does not exist" containerID="b64cfdc4c3ea137f3e978394f83d6f0bb4768fc3dc2ee8d60a80ccce6dc423b9" Oct 08 13:21:37 crc kubenswrapper[4735]: I1008 13:21:37.259815 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b64cfdc4c3ea137f3e978394f83d6f0bb4768fc3dc2ee8d60a80ccce6dc423b9"} err="failed to get container status \"b64cfdc4c3ea137f3e978394f83d6f0bb4768fc3dc2ee8d60a80ccce6dc423b9\": rpc error: code = NotFound desc = could not find container \"b64cfdc4c3ea137f3e978394f83d6f0bb4768fc3dc2ee8d60a80ccce6dc423b9\": container with ID starting with b64cfdc4c3ea137f3e978394f83d6f0bb4768fc3dc2ee8d60a80ccce6dc423b9 not found: ID does not exist" Oct 08 13:21:38 crc kubenswrapper[4735]: I1008 13:21:38.907745 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="016286f0-c81e-403e-a3b7-b839f7e41f78" path="/var/lib/kubelet/pods/016286f0-c81e-403e-a3b7-b839f7e41f78/volumes" Oct 08 13:22:05 crc kubenswrapper[4735]: I1008 13:22:05.152361 4735 scope.go:117] "RemoveContainer" containerID="273e0d1ce3c1b0a4083ae611ee6898246e1579f6035cac87c7041c620acb2e9a" Oct 08 13:22:05 crc kubenswrapper[4735]: I1008 13:22:05.206035 4735 scope.go:117] "RemoveContainer" containerID="bba8d83bf71746e7c9b712ac5f41c4a3c7f8f79408257bfec81776b07434b105" Oct 08 13:22:05 crc kubenswrapper[4735]: I1008 13:22:05.236106 4735 scope.go:117] "RemoveContainer" containerID="a1c40defa3c248eddcb567968509785acce19b04b60e353a43058057f878389b" Oct 08 13:22:05 crc kubenswrapper[4735]: I1008 13:22:05.264527 4735 scope.go:117] "RemoveContainer" containerID="ad46567e2456e5a5fb2080fb61bdaa993642c8e9b3b19d2bc7ea50152d6ea11f" Oct 08 13:22:05 crc 
kubenswrapper[4735]: I1008 13:22:05.281001 4735 scope.go:117] "RemoveContainer" containerID="d44500aa9959aa2af8745d6e474b68bd1fe99246472ccf2f9cfe86cbabc06b44" Oct 08 13:22:05 crc kubenswrapper[4735]: I1008 13:22:05.304846 4735 scope.go:117] "RemoveContainer" containerID="d8c14d818fd4ac6e52acfc43062590356f1e962dabe5c30592a2eb3027552ffb" Oct 08 13:22:05 crc kubenswrapper[4735]: I1008 13:22:05.325724 4735 scope.go:117] "RemoveContainer" containerID="d9517d4b5ff0275fae51dae0a8e59801f758f590db88101f461635ef9b151716" Oct 08 13:22:05 crc kubenswrapper[4735]: I1008 13:22:05.344945 4735 scope.go:117] "RemoveContainer" containerID="0dbfa050b749ea1a57d90f160065f4738b25279a7a23eb2307f4e32ca745f837" Oct 08 13:22:05 crc kubenswrapper[4735]: I1008 13:22:05.371190 4735 scope.go:117] "RemoveContainer" containerID="ba1c205cb24ae21c686b973dc27c7541bf68b8cd19fdea9dbc395f5136814148" Oct 08 13:22:05 crc kubenswrapper[4735]: I1008 13:22:05.406555 4735 scope.go:117] "RemoveContainer" containerID="9394c36d6bc15b7aa1942d29aa09ff824a385998e4098ce00e974bd466bdecfa" Oct 08 13:22:05 crc kubenswrapper[4735]: I1008 13:22:05.438224 4735 scope.go:117] "RemoveContainer" containerID="dfdd55aeb2694674bcacaeeb6387ca20d7fdd7bb448499e8470ad6f6af7c1dbc" Oct 08 13:22:05 crc kubenswrapper[4735]: I1008 13:22:05.452983 4735 scope.go:117] "RemoveContainer" containerID="331ca35792c751b799fe82a15c0ba601de01652826733a0f1bc7a6370b61c05c" Oct 08 13:22:05 crc kubenswrapper[4735]: I1008 13:22:05.472017 4735 scope.go:117] "RemoveContainer" containerID="b4a38bdd8c366bbada82e4050eef0f1ba2e019fa6882c001d2e9d49612679698" Oct 08 13:22:54 crc kubenswrapper[4735]: I1008 13:22:54.878614 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 13:22:54 crc kubenswrapper[4735]: I1008 
13:22:54.879285 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 13:23:05 crc kubenswrapper[4735]: I1008 13:23:05.679184 4735 scope.go:117] "RemoveContainer" containerID="caf2689e62a63c0e957ec5e45b184662c220651a79a995c9d7efc5643cfa216b" Oct 08 13:23:05 crc kubenswrapper[4735]: I1008 13:23:05.702331 4735 scope.go:117] "RemoveContainer" containerID="6868ac5109c85f569ae6a98b468fee4259afd8e4d4e83f57546cf9baa70e2595" Oct 08 13:23:05 crc kubenswrapper[4735]: I1008 13:23:05.736595 4735 scope.go:117] "RemoveContainer" containerID="96ea68b8faaa1a0dac49635d32f8379baed6ab70cff4b930e0003c6cce8a448f" Oct 08 13:23:05 crc kubenswrapper[4735]: I1008 13:23:05.765740 4735 scope.go:117] "RemoveContainer" containerID="7d7c87d703a723cb263ee5030a259e665d68c85efbd777b9a5b704b5476ab749" Oct 08 13:23:05 crc kubenswrapper[4735]: I1008 13:23:05.802559 4735 scope.go:117] "RemoveContainer" containerID="64dfaa064ce1bff941b3220dbad48d2a7c10dcdf642bb16be664f91279669cb6" Oct 08 13:23:05 crc kubenswrapper[4735]: I1008 13:23:05.825694 4735 scope.go:117] "RemoveContainer" containerID="6d64df6cc879792395fce1a8a219ef400d1f7cfe418d3ee5312fec24e31fc592" Oct 08 13:23:05 crc kubenswrapper[4735]: I1008 13:23:05.849555 4735 scope.go:117] "RemoveContainer" containerID="ec5c9443ed714cead8c92fc82a4368ded3008d6ad808a146f6e44d78449d3dce" Oct 08 13:23:05 crc kubenswrapper[4735]: I1008 13:23:05.865081 4735 scope.go:117] "RemoveContainer" containerID="f878591b24e1383c6d3748a96ead59b9731b8220a95ec68982db1a4c5a813e03" Oct 08 13:23:05 crc kubenswrapper[4735]: I1008 13:23:05.904390 4735 scope.go:117] "RemoveContainer" containerID="291cb29e3cba296e0d831cdae9bfa80b48fc271ca1f0dd6ea086ecdd10b45c37" Oct 08 13:23:24 crc kubenswrapper[4735]: 
I1008 13:23:24.879685 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 13:23:24 crc kubenswrapper[4735]: I1008 13:23:24.880298 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 13:23:54 crc kubenswrapper[4735]: I1008 13:23:54.878911 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 13:23:54 crc kubenswrapper[4735]: I1008 13:23:54.879562 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 13:23:54 crc kubenswrapper[4735]: I1008 13:23:54.879623 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" Oct 08 13:23:54 crc kubenswrapper[4735]: I1008 13:23:54.880320 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e875648415936f7b8301549fb9b938d29cca1f9c87a5286e73a78506e797f56a"} 
pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 13:23:54 crc kubenswrapper[4735]: I1008 13:23:54.880389 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" containerID="cri-o://e875648415936f7b8301549fb9b938d29cca1f9c87a5286e73a78506e797f56a" gracePeriod=600 Oct 08 13:23:55 crc kubenswrapper[4735]: E1008 13:23:55.002124 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:23:55 crc kubenswrapper[4735]: I1008 13:23:55.332953 4735 generic.go:334] "Generic (PLEG): container finished" podID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerID="e875648415936f7b8301549fb9b938d29cca1f9c87a5286e73a78506e797f56a" exitCode=0 Oct 08 13:23:55 crc kubenswrapper[4735]: I1008 13:23:55.333002 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" event={"ID":"2dfd0474-f64a-4d3b-9990-df182d079b10","Type":"ContainerDied","Data":"e875648415936f7b8301549fb9b938d29cca1f9c87a5286e73a78506e797f56a"} Oct 08 13:23:55 crc kubenswrapper[4735]: I1008 13:23:55.333038 4735 scope.go:117] "RemoveContainer" containerID="271e91019953de2464d625764fc5ce2f322cb5494932f22012a17b3cd3952c78" Oct 08 13:23:55 crc kubenswrapper[4735]: I1008 13:23:55.333652 4735 scope.go:117] "RemoveContainer" containerID="e875648415936f7b8301549fb9b938d29cca1f9c87a5286e73a78506e797f56a" Oct 
08 13:23:55 crc kubenswrapper[4735]: E1008 13:23:55.334007 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:24:06 crc kubenswrapper[4735]: I1008 13:24:06.013015 4735 scope.go:117] "RemoveContainer" containerID="384094abefaa07d55b60c391c4f6b9a89bb4c01f9a5256e31cc3311408cadb4b" Oct 08 13:24:06 crc kubenswrapper[4735]: I1008 13:24:06.041525 4735 scope.go:117] "RemoveContainer" containerID="f37619629254693188af3bf21fb4741a55f07777900aae9d0cdad72ae36cb38d" Oct 08 13:24:06 crc kubenswrapper[4735]: I1008 13:24:06.059021 4735 scope.go:117] "RemoveContainer" containerID="c8f2657ed8007a1620dd173bcf75075ec5b0c928943946ed6787c2bfb93faaa4" Oct 08 13:24:06 crc kubenswrapper[4735]: I1008 13:24:06.089367 4735 scope.go:117] "RemoveContainer" containerID="14d921e0baa7ca18db075fc0465d68042c4943f3d81357b3ee8a1af87fd6d390" Oct 08 13:24:06 crc kubenswrapper[4735]: I1008 13:24:06.125753 4735 scope.go:117] "RemoveContainer" containerID="fe0b50aebd53217e466df2d09bfb549c0d6a2577d3c2470fded5a0c5db88732e" Oct 08 13:24:06 crc kubenswrapper[4735]: I1008 13:24:06.156582 4735 scope.go:117] "RemoveContainer" containerID="bfdc2bbddf5a559da3c78a8cb972c55f12ec5e9b3375226be6e18094289c9ea4" Oct 08 13:24:08 crc kubenswrapper[4735]: I1008 13:24:08.894958 4735 scope.go:117] "RemoveContainer" containerID="e875648415936f7b8301549fb9b938d29cca1f9c87a5286e73a78506e797f56a" Oct 08 13:24:08 crc kubenswrapper[4735]: E1008 13:24:08.895532 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:24:20 crc kubenswrapper[4735]: I1008 13:24:20.899841 4735 scope.go:117] "RemoveContainer" containerID="e875648415936f7b8301549fb9b938d29cca1f9c87a5286e73a78506e797f56a" Oct 08 13:24:20 crc kubenswrapper[4735]: E1008 13:24:20.900650 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:24:32 crc kubenswrapper[4735]: I1008 13:24:32.894838 4735 scope.go:117] "RemoveContainer" containerID="e875648415936f7b8301549fb9b938d29cca1f9c87a5286e73a78506e797f56a" Oct 08 13:24:32 crc kubenswrapper[4735]: E1008 13:24:32.895724 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:24:47 crc kubenswrapper[4735]: I1008 13:24:47.895422 4735 scope.go:117] "RemoveContainer" containerID="e875648415936f7b8301549fb9b938d29cca1f9c87a5286e73a78506e797f56a" Oct 08 13:24:47 crc kubenswrapper[4735]: E1008 13:24:47.896146 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:24:59 crc kubenswrapper[4735]: I1008 13:24:59.894990 4735 scope.go:117] "RemoveContainer" containerID="e875648415936f7b8301549fb9b938d29cca1f9c87a5286e73a78506e797f56a" Oct 08 13:24:59 crc kubenswrapper[4735]: E1008 13:24:59.895680 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:25:06 crc kubenswrapper[4735]: I1008 13:25:06.240294 4735 scope.go:117] "RemoveContainer" containerID="7d7ad1478ae29b27029b1887e1c33fab874aabb21d8774256a3139c949c54c49" Oct 08 13:25:06 crc kubenswrapper[4735]: I1008 13:25:06.257917 4735 scope.go:117] "RemoveContainer" containerID="4687666d9fb769e55d9e2662909018e3d99681ac811ca1ac37fcba325c3e95ca" Oct 08 13:25:13 crc kubenswrapper[4735]: I1008 13:25:13.894312 4735 scope.go:117] "RemoveContainer" containerID="e875648415936f7b8301549fb9b938d29cca1f9c87a5286e73a78506e797f56a" Oct 08 13:25:13 crc kubenswrapper[4735]: E1008 13:25:13.895588 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:25:25 crc 
kubenswrapper[4735]: I1008 13:25:25.895463 4735 scope.go:117] "RemoveContainer" containerID="e875648415936f7b8301549fb9b938d29cca1f9c87a5286e73a78506e797f56a" Oct 08 13:25:25 crc kubenswrapper[4735]: E1008 13:25:25.896404 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:25:40 crc kubenswrapper[4735]: I1008 13:25:40.900757 4735 scope.go:117] "RemoveContainer" containerID="e875648415936f7b8301549fb9b938d29cca1f9c87a5286e73a78506e797f56a" Oct 08 13:25:40 crc kubenswrapper[4735]: E1008 13:25:40.901607 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:25:54 crc kubenswrapper[4735]: I1008 13:25:54.894427 4735 scope.go:117] "RemoveContainer" containerID="e875648415936f7b8301549fb9b938d29cca1f9c87a5286e73a78506e797f56a" Oct 08 13:25:54 crc kubenswrapper[4735]: E1008 13:25:54.895247 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 
08 13:26:05 crc kubenswrapper[4735]: I1008 13:26:05.895018 4735 scope.go:117] "RemoveContainer" containerID="e875648415936f7b8301549fb9b938d29cca1f9c87a5286e73a78506e797f56a" Oct 08 13:26:05 crc kubenswrapper[4735]: E1008 13:26:05.896210 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:26:06 crc kubenswrapper[4735]: I1008 13:26:06.374132 4735 scope.go:117] "RemoveContainer" containerID="12338dc1882c13a1adb79c38b8e613ffdac5d41d70ba8479854eb311be932cc8" Oct 08 13:26:06 crc kubenswrapper[4735]: I1008 13:26:06.410870 4735 scope.go:117] "RemoveContainer" containerID="f91e04b6239fa23dd178148df72ab2a06405051f7c6cb712411b935fbe3a9052" Oct 08 13:26:06 crc kubenswrapper[4735]: I1008 13:26:06.429305 4735 scope.go:117] "RemoveContainer" containerID="650fdbee91d8f7c999b8abf663b1b798642af3b4a56514b0fb884a65ddb1a06f" Oct 08 13:26:20 crc kubenswrapper[4735]: I1008 13:26:20.911604 4735 scope.go:117] "RemoveContainer" containerID="e875648415936f7b8301549fb9b938d29cca1f9c87a5286e73a78506e797f56a" Oct 08 13:26:20 crc kubenswrapper[4735]: E1008 13:26:20.912348 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:26:33 crc kubenswrapper[4735]: I1008 13:26:33.894849 4735 scope.go:117] "RemoveContainer" 
containerID="e875648415936f7b8301549fb9b938d29cca1f9c87a5286e73a78506e797f56a" Oct 08 13:26:33 crc kubenswrapper[4735]: E1008 13:26:33.895559 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:26:37 crc kubenswrapper[4735]: I1008 13:26:37.756821 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-z27b5"] Oct 08 13:26:37 crc kubenswrapper[4735]: E1008 13:26:37.757372 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="016286f0-c81e-403e-a3b7-b839f7e41f78" containerName="registry-server" Oct 08 13:26:37 crc kubenswrapper[4735]: I1008 13:26:37.757387 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="016286f0-c81e-403e-a3b7-b839f7e41f78" containerName="registry-server" Oct 08 13:26:37 crc kubenswrapper[4735]: E1008 13:26:37.757411 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="016286f0-c81e-403e-a3b7-b839f7e41f78" containerName="extract-content" Oct 08 13:26:37 crc kubenswrapper[4735]: I1008 13:26:37.757419 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="016286f0-c81e-403e-a3b7-b839f7e41f78" containerName="extract-content" Oct 08 13:26:37 crc kubenswrapper[4735]: E1008 13:26:37.757435 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="016286f0-c81e-403e-a3b7-b839f7e41f78" containerName="extract-utilities" Oct 08 13:26:37 crc kubenswrapper[4735]: I1008 13:26:37.757442 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="016286f0-c81e-403e-a3b7-b839f7e41f78" containerName="extract-utilities" Oct 08 13:26:37 crc kubenswrapper[4735]: I1008 13:26:37.757888 
4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="016286f0-c81e-403e-a3b7-b839f7e41f78" containerName="registry-server" Oct 08 13:26:37 crc kubenswrapper[4735]: I1008 13:26:37.758998 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z27b5" Oct 08 13:26:37 crc kubenswrapper[4735]: I1008 13:26:37.769645 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z27b5"] Oct 08 13:26:37 crc kubenswrapper[4735]: I1008 13:26:37.793705 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dbqm\" (UniqueName: \"kubernetes.io/projected/82edd02f-5f52-4296-843b-edf563200302-kube-api-access-7dbqm\") pod \"redhat-marketplace-z27b5\" (UID: \"82edd02f-5f52-4296-843b-edf563200302\") " pod="openshift-marketplace/redhat-marketplace-z27b5" Oct 08 13:26:37 crc kubenswrapper[4735]: I1008 13:26:37.793840 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82edd02f-5f52-4296-843b-edf563200302-utilities\") pod \"redhat-marketplace-z27b5\" (UID: \"82edd02f-5f52-4296-843b-edf563200302\") " pod="openshift-marketplace/redhat-marketplace-z27b5" Oct 08 13:26:37 crc kubenswrapper[4735]: I1008 13:26:37.794033 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82edd02f-5f52-4296-843b-edf563200302-catalog-content\") pod \"redhat-marketplace-z27b5\" (UID: \"82edd02f-5f52-4296-843b-edf563200302\") " pod="openshift-marketplace/redhat-marketplace-z27b5" Oct 08 13:26:37 crc kubenswrapper[4735]: I1008 13:26:37.894608 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/82edd02f-5f52-4296-843b-edf563200302-catalog-content\") pod \"redhat-marketplace-z27b5\" (UID: \"82edd02f-5f52-4296-843b-edf563200302\") " pod="openshift-marketplace/redhat-marketplace-z27b5" Oct 08 13:26:37 crc kubenswrapper[4735]: I1008 13:26:37.894683 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dbqm\" (UniqueName: \"kubernetes.io/projected/82edd02f-5f52-4296-843b-edf563200302-kube-api-access-7dbqm\") pod \"redhat-marketplace-z27b5\" (UID: \"82edd02f-5f52-4296-843b-edf563200302\") " pod="openshift-marketplace/redhat-marketplace-z27b5" Oct 08 13:26:37 crc kubenswrapper[4735]: I1008 13:26:37.894706 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82edd02f-5f52-4296-843b-edf563200302-utilities\") pod \"redhat-marketplace-z27b5\" (UID: \"82edd02f-5f52-4296-843b-edf563200302\") " pod="openshift-marketplace/redhat-marketplace-z27b5" Oct 08 13:26:37 crc kubenswrapper[4735]: I1008 13:26:37.895163 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82edd02f-5f52-4296-843b-edf563200302-catalog-content\") pod \"redhat-marketplace-z27b5\" (UID: \"82edd02f-5f52-4296-843b-edf563200302\") " pod="openshift-marketplace/redhat-marketplace-z27b5" Oct 08 13:26:37 crc kubenswrapper[4735]: I1008 13:26:37.895174 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82edd02f-5f52-4296-843b-edf563200302-utilities\") pod \"redhat-marketplace-z27b5\" (UID: \"82edd02f-5f52-4296-843b-edf563200302\") " pod="openshift-marketplace/redhat-marketplace-z27b5" Oct 08 13:26:37 crc kubenswrapper[4735]: I1008 13:26:37.919146 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dbqm\" (UniqueName: 
\"kubernetes.io/projected/82edd02f-5f52-4296-843b-edf563200302-kube-api-access-7dbqm\") pod \"redhat-marketplace-z27b5\" (UID: \"82edd02f-5f52-4296-843b-edf563200302\") " pod="openshift-marketplace/redhat-marketplace-z27b5" Oct 08 13:26:37 crc kubenswrapper[4735]: I1008 13:26:37.953569 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-plw4g"] Oct 08 13:26:37 crc kubenswrapper[4735]: I1008 13:26:37.954982 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-plw4g" Oct 08 13:26:37 crc kubenswrapper[4735]: I1008 13:26:37.974261 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-plw4g"] Oct 08 13:26:37 crc kubenswrapper[4735]: I1008 13:26:37.996288 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68ht8\" (UniqueName: \"kubernetes.io/projected/b3046b2f-2ce4-473c-a76d-8931d1640182-kube-api-access-68ht8\") pod \"community-operators-plw4g\" (UID: \"b3046b2f-2ce4-473c-a76d-8931d1640182\") " pod="openshift-marketplace/community-operators-plw4g" Oct 08 13:26:37 crc kubenswrapper[4735]: I1008 13:26:37.996346 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3046b2f-2ce4-473c-a76d-8931d1640182-catalog-content\") pod \"community-operators-plw4g\" (UID: \"b3046b2f-2ce4-473c-a76d-8931d1640182\") " pod="openshift-marketplace/community-operators-plw4g" Oct 08 13:26:37 crc kubenswrapper[4735]: I1008 13:26:37.996455 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3046b2f-2ce4-473c-a76d-8931d1640182-utilities\") pod \"community-operators-plw4g\" (UID: \"b3046b2f-2ce4-473c-a76d-8931d1640182\") " 
pod="openshift-marketplace/community-operators-plw4g" Oct 08 13:26:38 crc kubenswrapper[4735]: I1008 13:26:38.088079 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z27b5" Oct 08 13:26:38 crc kubenswrapper[4735]: I1008 13:26:38.098412 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68ht8\" (UniqueName: \"kubernetes.io/projected/b3046b2f-2ce4-473c-a76d-8931d1640182-kube-api-access-68ht8\") pod \"community-operators-plw4g\" (UID: \"b3046b2f-2ce4-473c-a76d-8931d1640182\") " pod="openshift-marketplace/community-operators-plw4g" Oct 08 13:26:38 crc kubenswrapper[4735]: I1008 13:26:38.098458 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3046b2f-2ce4-473c-a76d-8931d1640182-catalog-content\") pod \"community-operators-plw4g\" (UID: \"b3046b2f-2ce4-473c-a76d-8931d1640182\") " pod="openshift-marketplace/community-operators-plw4g" Oct 08 13:26:38 crc kubenswrapper[4735]: I1008 13:26:38.098590 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3046b2f-2ce4-473c-a76d-8931d1640182-utilities\") pod \"community-operators-plw4g\" (UID: \"b3046b2f-2ce4-473c-a76d-8931d1640182\") " pod="openshift-marketplace/community-operators-plw4g" Oct 08 13:26:38 crc kubenswrapper[4735]: I1008 13:26:38.099181 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3046b2f-2ce4-473c-a76d-8931d1640182-catalog-content\") pod \"community-operators-plw4g\" (UID: \"b3046b2f-2ce4-473c-a76d-8931d1640182\") " pod="openshift-marketplace/community-operators-plw4g" Oct 08 13:26:38 crc kubenswrapper[4735]: I1008 13:26:38.099277 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/b3046b2f-2ce4-473c-a76d-8931d1640182-utilities\") pod \"community-operators-plw4g\" (UID: \"b3046b2f-2ce4-473c-a76d-8931d1640182\") " pod="openshift-marketplace/community-operators-plw4g" Oct 08 13:26:38 crc kubenswrapper[4735]: I1008 13:26:38.118925 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68ht8\" (UniqueName: \"kubernetes.io/projected/b3046b2f-2ce4-473c-a76d-8931d1640182-kube-api-access-68ht8\") pod \"community-operators-plw4g\" (UID: \"b3046b2f-2ce4-473c-a76d-8931d1640182\") " pod="openshift-marketplace/community-operators-plw4g" Oct 08 13:26:38 crc kubenswrapper[4735]: I1008 13:26:38.277941 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-plw4g" Oct 08 13:26:38 crc kubenswrapper[4735]: I1008 13:26:38.591803 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z27b5"] Oct 08 13:26:38 crc kubenswrapper[4735]: I1008 13:26:38.792651 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-plw4g"] Oct 08 13:26:38 crc kubenswrapper[4735]: W1008 13:26:38.808664 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3046b2f_2ce4_473c_a76d_8931d1640182.slice/crio-dfd4f60284fa961dd46d326fc900adba0656d3428d9366aadce44ee120120367 WatchSource:0}: Error finding container dfd4f60284fa961dd46d326fc900adba0656d3428d9366aadce44ee120120367: Status 404 returned error can't find the container with id dfd4f60284fa961dd46d326fc900adba0656d3428d9366aadce44ee120120367 Oct 08 13:26:39 crc kubenswrapper[4735]: I1008 13:26:39.594754 4735 generic.go:334] "Generic (PLEG): container finished" podID="82edd02f-5f52-4296-843b-edf563200302" containerID="e37c18d1cd98a99965e517d7dd26fd6e603920df01698e08d8f480f7e3195d41" exitCode=0 Oct 08 13:26:39 crc kubenswrapper[4735]: I1008 
13:26:39.594818 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z27b5" event={"ID":"82edd02f-5f52-4296-843b-edf563200302","Type":"ContainerDied","Data":"e37c18d1cd98a99965e517d7dd26fd6e603920df01698e08d8f480f7e3195d41"} Oct 08 13:26:39 crc kubenswrapper[4735]: I1008 13:26:39.594848 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z27b5" event={"ID":"82edd02f-5f52-4296-843b-edf563200302","Type":"ContainerStarted","Data":"2fe98f73fd971b5fbd992700521a08c9991888ee8e6c6a6f54fca09bd1dfe4dd"} Oct 08 13:26:39 crc kubenswrapper[4735]: I1008 13:26:39.596447 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 08 13:26:39 crc kubenswrapper[4735]: I1008 13:26:39.598191 4735 generic.go:334] "Generic (PLEG): container finished" podID="b3046b2f-2ce4-473c-a76d-8931d1640182" containerID="fd8fef1ab974a4cf2103e129a9a50aef4bfd864871f032760e461cc85c3eb093" exitCode=0 Oct 08 13:26:39 crc kubenswrapper[4735]: I1008 13:26:39.598231 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-plw4g" event={"ID":"b3046b2f-2ce4-473c-a76d-8931d1640182","Type":"ContainerDied","Data":"fd8fef1ab974a4cf2103e129a9a50aef4bfd864871f032760e461cc85c3eb093"} Oct 08 13:26:39 crc kubenswrapper[4735]: I1008 13:26:39.598254 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-plw4g" event={"ID":"b3046b2f-2ce4-473c-a76d-8931d1640182","Type":"ContainerStarted","Data":"dfd4f60284fa961dd46d326fc900adba0656d3428d9366aadce44ee120120367"} Oct 08 13:26:40 crc kubenswrapper[4735]: I1008 13:26:40.609077 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-plw4g" event={"ID":"b3046b2f-2ce4-473c-a76d-8931d1640182","Type":"ContainerStarted","Data":"e4ab73f7baf38469712266e9564c0f5222abcb1282fedb6f9bb897cf4c2f062a"} Oct 08 
13:26:40 crc kubenswrapper[4735]: I1008 13:26:40.611413 4735 generic.go:334] "Generic (PLEG): container finished" podID="82edd02f-5f52-4296-843b-edf563200302" containerID="006b5a96be254bb8ff4d5389f91629e0c7707198470de87da7eeaa559a3f0a4c" exitCode=0 Oct 08 13:26:40 crc kubenswrapper[4735]: I1008 13:26:40.611523 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z27b5" event={"ID":"82edd02f-5f52-4296-843b-edf563200302","Type":"ContainerDied","Data":"006b5a96be254bb8ff4d5389f91629e0c7707198470de87da7eeaa559a3f0a4c"} Oct 08 13:26:41 crc kubenswrapper[4735]: I1008 13:26:41.620267 4735 generic.go:334] "Generic (PLEG): container finished" podID="b3046b2f-2ce4-473c-a76d-8931d1640182" containerID="e4ab73f7baf38469712266e9564c0f5222abcb1282fedb6f9bb897cf4c2f062a" exitCode=0 Oct 08 13:26:41 crc kubenswrapper[4735]: I1008 13:26:41.620384 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-plw4g" event={"ID":"b3046b2f-2ce4-473c-a76d-8931d1640182","Type":"ContainerDied","Data":"e4ab73f7baf38469712266e9564c0f5222abcb1282fedb6f9bb897cf4c2f062a"} Oct 08 13:26:41 crc kubenswrapper[4735]: I1008 13:26:41.624405 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z27b5" event={"ID":"82edd02f-5f52-4296-843b-edf563200302","Type":"ContainerStarted","Data":"1bf465fb4c4fcbc6efa26bf34063d142634d521cdf78a33f2f66f26810b6db1c"} Oct 08 13:26:41 crc kubenswrapper[4735]: I1008 13:26:41.666684 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-z27b5" podStartSLOduration=3.203282687 podStartE2EDuration="4.666662594s" podCreationTimestamp="2025-10-08 13:26:37 +0000 UTC" firstStartedPulling="2025-10-08 13:26:39.596249421 +0000 UTC m=+1849.404553965" lastFinishedPulling="2025-10-08 13:26:41.059629328 +0000 UTC m=+1850.867933872" observedRunningTime="2025-10-08 13:26:41.660211808 +0000 UTC 
m=+1851.468516382" watchObservedRunningTime="2025-10-08 13:26:41.666662594 +0000 UTC m=+1851.474967148" Oct 08 13:26:42 crc kubenswrapper[4735]: I1008 13:26:42.633276 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-plw4g" event={"ID":"b3046b2f-2ce4-473c-a76d-8931d1640182","Type":"ContainerStarted","Data":"74193436ef1a773164170935729549bcb0d2ea8f2b2c7f515a8a39304e4d3616"} Oct 08 13:26:42 crc kubenswrapper[4735]: I1008 13:26:42.659900 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-plw4g" podStartSLOduration=3.176070216 podStartE2EDuration="5.659883794s" podCreationTimestamp="2025-10-08 13:26:37 +0000 UTC" firstStartedPulling="2025-10-08 13:26:39.599623998 +0000 UTC m=+1849.407928542" lastFinishedPulling="2025-10-08 13:26:42.083437576 +0000 UTC m=+1851.891742120" observedRunningTime="2025-10-08 13:26:42.657531973 +0000 UTC m=+1852.465836517" watchObservedRunningTime="2025-10-08 13:26:42.659883794 +0000 UTC m=+1852.468188338" Oct 08 13:26:47 crc kubenswrapper[4735]: I1008 13:26:47.906638 4735 scope.go:117] "RemoveContainer" containerID="e875648415936f7b8301549fb9b938d29cca1f9c87a5286e73a78506e797f56a" Oct 08 13:26:47 crc kubenswrapper[4735]: E1008 13:26:47.907205 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:26:48 crc kubenswrapper[4735]: I1008 13:26:48.089515 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-z27b5" Oct 08 13:26:48 crc kubenswrapper[4735]: I1008 13:26:48.089595 4735 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-z27b5" Oct 08 13:26:48 crc kubenswrapper[4735]: I1008 13:26:48.151400 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-z27b5" Oct 08 13:26:48 crc kubenswrapper[4735]: I1008 13:26:48.279610 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-plw4g" Oct 08 13:26:48 crc kubenswrapper[4735]: I1008 13:26:48.279682 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-plw4g" Oct 08 13:26:48 crc kubenswrapper[4735]: I1008 13:26:48.326388 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-plw4g" Oct 08 13:26:48 crc kubenswrapper[4735]: I1008 13:26:48.724135 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-plw4g" Oct 08 13:26:48 crc kubenswrapper[4735]: I1008 13:26:48.724523 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-z27b5" Oct 08 13:26:49 crc kubenswrapper[4735]: I1008 13:26:49.983182 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-plw4g"] Oct 08 13:26:50 crc kubenswrapper[4735]: I1008 13:26:50.693522 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-plw4g" podUID="b3046b2f-2ce4-473c-a76d-8931d1640182" containerName="registry-server" containerID="cri-o://74193436ef1a773164170935729549bcb0d2ea8f2b2c7f515a8a39304e4d3616" gracePeriod=2 Oct 08 13:26:50 crc kubenswrapper[4735]: I1008 13:26:50.980320 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z27b5"] Oct 08 13:26:50 
crc kubenswrapper[4735]: I1008 13:26:50.981017 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-z27b5" podUID="82edd02f-5f52-4296-843b-edf563200302" containerName="registry-server" containerID="cri-o://1bf465fb4c4fcbc6efa26bf34063d142634d521cdf78a33f2f66f26810b6db1c" gracePeriod=2 Oct 08 13:26:51 crc kubenswrapper[4735]: I1008 13:26:51.065724 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-plw4g" Oct 08 13:26:51 crc kubenswrapper[4735]: I1008 13:26:51.094558 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3046b2f-2ce4-473c-a76d-8931d1640182-utilities\") pod \"b3046b2f-2ce4-473c-a76d-8931d1640182\" (UID: \"b3046b2f-2ce4-473c-a76d-8931d1640182\") " Oct 08 13:26:51 crc kubenswrapper[4735]: I1008 13:26:51.094625 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68ht8\" (UniqueName: \"kubernetes.io/projected/b3046b2f-2ce4-473c-a76d-8931d1640182-kube-api-access-68ht8\") pod \"b3046b2f-2ce4-473c-a76d-8931d1640182\" (UID: \"b3046b2f-2ce4-473c-a76d-8931d1640182\") " Oct 08 13:26:51 crc kubenswrapper[4735]: I1008 13:26:51.094665 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3046b2f-2ce4-473c-a76d-8931d1640182-catalog-content\") pod \"b3046b2f-2ce4-473c-a76d-8931d1640182\" (UID: \"b3046b2f-2ce4-473c-a76d-8931d1640182\") " Oct 08 13:26:51 crc kubenswrapper[4735]: I1008 13:26:51.096432 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3046b2f-2ce4-473c-a76d-8931d1640182-utilities" (OuterVolumeSpecName: "utilities") pod "b3046b2f-2ce4-473c-a76d-8931d1640182" (UID: "b3046b2f-2ce4-473c-a76d-8931d1640182"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:26:51 crc kubenswrapper[4735]: I1008 13:26:51.101126 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3046b2f-2ce4-473c-a76d-8931d1640182-kube-api-access-68ht8" (OuterVolumeSpecName: "kube-api-access-68ht8") pod "b3046b2f-2ce4-473c-a76d-8931d1640182" (UID: "b3046b2f-2ce4-473c-a76d-8931d1640182"). InnerVolumeSpecName "kube-api-access-68ht8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:26:51 crc kubenswrapper[4735]: I1008 13:26:51.196385 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3046b2f-2ce4-473c-a76d-8931d1640182-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 13:26:51 crc kubenswrapper[4735]: I1008 13:26:51.196418 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68ht8\" (UniqueName: \"kubernetes.io/projected/b3046b2f-2ce4-473c-a76d-8931d1640182-kube-api-access-68ht8\") on node \"crc\" DevicePath \"\"" Oct 08 13:26:51 crc kubenswrapper[4735]: I1008 13:26:51.617952 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3046b2f-2ce4-473c-a76d-8931d1640182-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b3046b2f-2ce4-473c-a76d-8931d1640182" (UID: "b3046b2f-2ce4-473c-a76d-8931d1640182"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:26:51 crc kubenswrapper[4735]: I1008 13:26:51.704203 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3046b2f-2ce4-473c-a76d-8931d1640182-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 13:26:51 crc kubenswrapper[4735]: I1008 13:26:51.705137 4735 generic.go:334] "Generic (PLEG): container finished" podID="82edd02f-5f52-4296-843b-edf563200302" containerID="1bf465fb4c4fcbc6efa26bf34063d142634d521cdf78a33f2f66f26810b6db1c" exitCode=0 Oct 08 13:26:51 crc kubenswrapper[4735]: I1008 13:26:51.705153 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z27b5" event={"ID":"82edd02f-5f52-4296-843b-edf563200302","Type":"ContainerDied","Data":"1bf465fb4c4fcbc6efa26bf34063d142634d521cdf78a33f2f66f26810b6db1c"} Oct 08 13:26:51 crc kubenswrapper[4735]: I1008 13:26:51.707105 4735 generic.go:334] "Generic (PLEG): container finished" podID="b3046b2f-2ce4-473c-a76d-8931d1640182" containerID="74193436ef1a773164170935729549bcb0d2ea8f2b2c7f515a8a39304e4d3616" exitCode=0 Oct 08 13:26:51 crc kubenswrapper[4735]: I1008 13:26:51.707123 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-plw4g" event={"ID":"b3046b2f-2ce4-473c-a76d-8931d1640182","Type":"ContainerDied","Data":"74193436ef1a773164170935729549bcb0d2ea8f2b2c7f515a8a39304e4d3616"} Oct 08 13:26:51 crc kubenswrapper[4735]: I1008 13:26:51.707137 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-plw4g" event={"ID":"b3046b2f-2ce4-473c-a76d-8931d1640182","Type":"ContainerDied","Data":"dfd4f60284fa961dd46d326fc900adba0656d3428d9366aadce44ee120120367"} Oct 08 13:26:51 crc kubenswrapper[4735]: I1008 13:26:51.707152 4735 scope.go:117] "RemoveContainer" containerID="74193436ef1a773164170935729549bcb0d2ea8f2b2c7f515a8a39304e4d3616" Oct 08 13:26:51 crc 
kubenswrapper[4735]: I1008 13:26:51.707157 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-plw4g" Oct 08 13:26:51 crc kubenswrapper[4735]: I1008 13:26:51.736784 4735 scope.go:117] "RemoveContainer" containerID="e4ab73f7baf38469712266e9564c0f5222abcb1282fedb6f9bb897cf4c2f062a" Oct 08 13:26:51 crc kubenswrapper[4735]: I1008 13:26:51.737857 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-plw4g"] Oct 08 13:26:51 crc kubenswrapper[4735]: I1008 13:26:51.745684 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-plw4g"] Oct 08 13:26:51 crc kubenswrapper[4735]: I1008 13:26:51.771696 4735 scope.go:117] "RemoveContainer" containerID="fd8fef1ab974a4cf2103e129a9a50aef4bfd864871f032760e461cc85c3eb093" Oct 08 13:26:51 crc kubenswrapper[4735]: I1008 13:26:51.806666 4735 scope.go:117] "RemoveContainer" containerID="74193436ef1a773164170935729549bcb0d2ea8f2b2c7f515a8a39304e4d3616" Oct 08 13:26:51 crc kubenswrapper[4735]: E1008 13:26:51.807038 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74193436ef1a773164170935729549bcb0d2ea8f2b2c7f515a8a39304e4d3616\": container with ID starting with 74193436ef1a773164170935729549bcb0d2ea8f2b2c7f515a8a39304e4d3616 not found: ID does not exist" containerID="74193436ef1a773164170935729549bcb0d2ea8f2b2c7f515a8a39304e4d3616" Oct 08 13:26:51 crc kubenswrapper[4735]: I1008 13:26:51.807065 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74193436ef1a773164170935729549bcb0d2ea8f2b2c7f515a8a39304e4d3616"} err="failed to get container status \"74193436ef1a773164170935729549bcb0d2ea8f2b2c7f515a8a39304e4d3616\": rpc error: code = NotFound desc = could not find container \"74193436ef1a773164170935729549bcb0d2ea8f2b2c7f515a8a39304e4d3616\": 
container with ID starting with 74193436ef1a773164170935729549bcb0d2ea8f2b2c7f515a8a39304e4d3616 not found: ID does not exist" Oct 08 13:26:51 crc kubenswrapper[4735]: I1008 13:26:51.807084 4735 scope.go:117] "RemoveContainer" containerID="e4ab73f7baf38469712266e9564c0f5222abcb1282fedb6f9bb897cf4c2f062a" Oct 08 13:26:51 crc kubenswrapper[4735]: E1008 13:26:51.807370 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4ab73f7baf38469712266e9564c0f5222abcb1282fedb6f9bb897cf4c2f062a\": container with ID starting with e4ab73f7baf38469712266e9564c0f5222abcb1282fedb6f9bb897cf4c2f062a not found: ID does not exist" containerID="e4ab73f7baf38469712266e9564c0f5222abcb1282fedb6f9bb897cf4c2f062a" Oct 08 13:26:51 crc kubenswrapper[4735]: I1008 13:26:51.807386 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4ab73f7baf38469712266e9564c0f5222abcb1282fedb6f9bb897cf4c2f062a"} err="failed to get container status \"e4ab73f7baf38469712266e9564c0f5222abcb1282fedb6f9bb897cf4c2f062a\": rpc error: code = NotFound desc = could not find container \"e4ab73f7baf38469712266e9564c0f5222abcb1282fedb6f9bb897cf4c2f062a\": container with ID starting with e4ab73f7baf38469712266e9564c0f5222abcb1282fedb6f9bb897cf4c2f062a not found: ID does not exist" Oct 08 13:26:51 crc kubenswrapper[4735]: I1008 13:26:51.807399 4735 scope.go:117] "RemoveContainer" containerID="fd8fef1ab974a4cf2103e129a9a50aef4bfd864871f032760e461cc85c3eb093" Oct 08 13:26:51 crc kubenswrapper[4735]: E1008 13:26:51.807848 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd8fef1ab974a4cf2103e129a9a50aef4bfd864871f032760e461cc85c3eb093\": container with ID starting with fd8fef1ab974a4cf2103e129a9a50aef4bfd864871f032760e461cc85c3eb093 not found: ID does not exist" 
containerID="fd8fef1ab974a4cf2103e129a9a50aef4bfd864871f032760e461cc85c3eb093" Oct 08 13:26:51 crc kubenswrapper[4735]: I1008 13:26:51.807866 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd8fef1ab974a4cf2103e129a9a50aef4bfd864871f032760e461cc85c3eb093"} err="failed to get container status \"fd8fef1ab974a4cf2103e129a9a50aef4bfd864871f032760e461cc85c3eb093\": rpc error: code = NotFound desc = could not find container \"fd8fef1ab974a4cf2103e129a9a50aef4bfd864871f032760e461cc85c3eb093\": container with ID starting with fd8fef1ab974a4cf2103e129a9a50aef4bfd864871f032760e461cc85c3eb093 not found: ID does not exist" Oct 08 13:26:51 crc kubenswrapper[4735]: I1008 13:26:51.892423 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z27b5" Oct 08 13:26:52 crc kubenswrapper[4735]: I1008 13:26:52.007289 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82edd02f-5f52-4296-843b-edf563200302-utilities\") pod \"82edd02f-5f52-4296-843b-edf563200302\" (UID: \"82edd02f-5f52-4296-843b-edf563200302\") " Oct 08 13:26:52 crc kubenswrapper[4735]: I1008 13:26:52.007342 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82edd02f-5f52-4296-843b-edf563200302-catalog-content\") pod \"82edd02f-5f52-4296-843b-edf563200302\" (UID: \"82edd02f-5f52-4296-843b-edf563200302\") " Oct 08 13:26:52 crc kubenswrapper[4735]: I1008 13:26:52.008147 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dbqm\" (UniqueName: \"kubernetes.io/projected/82edd02f-5f52-4296-843b-edf563200302-kube-api-access-7dbqm\") pod \"82edd02f-5f52-4296-843b-edf563200302\" (UID: \"82edd02f-5f52-4296-843b-edf563200302\") " Oct 08 13:26:52 crc kubenswrapper[4735]: I1008 
13:26:52.008356 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82edd02f-5f52-4296-843b-edf563200302-utilities" (OuterVolumeSpecName: "utilities") pod "82edd02f-5f52-4296-843b-edf563200302" (UID: "82edd02f-5f52-4296-843b-edf563200302"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:26:52 crc kubenswrapper[4735]: I1008 13:26:52.008735 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82edd02f-5f52-4296-843b-edf563200302-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 13:26:52 crc kubenswrapper[4735]: I1008 13:26:52.012815 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82edd02f-5f52-4296-843b-edf563200302-kube-api-access-7dbqm" (OuterVolumeSpecName: "kube-api-access-7dbqm") pod "82edd02f-5f52-4296-843b-edf563200302" (UID: "82edd02f-5f52-4296-843b-edf563200302"). InnerVolumeSpecName "kube-api-access-7dbqm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:26:52 crc kubenswrapper[4735]: I1008 13:26:52.021812 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82edd02f-5f52-4296-843b-edf563200302-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "82edd02f-5f52-4296-843b-edf563200302" (UID: "82edd02f-5f52-4296-843b-edf563200302"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:26:52 crc kubenswrapper[4735]: I1008 13:26:52.109888 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82edd02f-5f52-4296-843b-edf563200302-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 13:26:52 crc kubenswrapper[4735]: I1008 13:26:52.109924 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dbqm\" (UniqueName: \"kubernetes.io/projected/82edd02f-5f52-4296-843b-edf563200302-kube-api-access-7dbqm\") on node \"crc\" DevicePath \"\"" Oct 08 13:26:52 crc kubenswrapper[4735]: I1008 13:26:52.716471 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z27b5" event={"ID":"82edd02f-5f52-4296-843b-edf563200302","Type":"ContainerDied","Data":"2fe98f73fd971b5fbd992700521a08c9991888ee8e6c6a6f54fca09bd1dfe4dd"} Oct 08 13:26:52 crc kubenswrapper[4735]: I1008 13:26:52.716543 4735 scope.go:117] "RemoveContainer" containerID="1bf465fb4c4fcbc6efa26bf34063d142634d521cdf78a33f2f66f26810b6db1c" Oct 08 13:26:52 crc kubenswrapper[4735]: I1008 13:26:52.716541 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z27b5" Oct 08 13:26:52 crc kubenswrapper[4735]: I1008 13:26:52.732892 4735 scope.go:117] "RemoveContainer" containerID="006b5a96be254bb8ff4d5389f91629e0c7707198470de87da7eeaa559a3f0a4c" Oct 08 13:26:52 crc kubenswrapper[4735]: I1008 13:26:52.742572 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z27b5"] Oct 08 13:26:52 crc kubenswrapper[4735]: I1008 13:26:52.749153 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-z27b5"] Oct 08 13:26:52 crc kubenswrapper[4735]: I1008 13:26:52.758450 4735 scope.go:117] "RemoveContainer" containerID="e37c18d1cd98a99965e517d7dd26fd6e603920df01698e08d8f480f7e3195d41" Oct 08 13:26:52 crc kubenswrapper[4735]: I1008 13:26:52.904193 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82edd02f-5f52-4296-843b-edf563200302" path="/var/lib/kubelet/pods/82edd02f-5f52-4296-843b-edf563200302/volumes" Oct 08 13:26:52 crc kubenswrapper[4735]: I1008 13:26:52.905127 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3046b2f-2ce4-473c-a76d-8931d1640182" path="/var/lib/kubelet/pods/b3046b2f-2ce4-473c-a76d-8931d1640182/volumes" Oct 08 13:27:02 crc kubenswrapper[4735]: I1008 13:27:02.895244 4735 scope.go:117] "RemoveContainer" containerID="e875648415936f7b8301549fb9b938d29cca1f9c87a5286e73a78506e797f56a" Oct 08 13:27:02 crc kubenswrapper[4735]: E1008 13:27:02.896010 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:27:17 crc kubenswrapper[4735]: I1008 
13:27:17.894831 4735 scope.go:117] "RemoveContainer" containerID="e875648415936f7b8301549fb9b938d29cca1f9c87a5286e73a78506e797f56a" Oct 08 13:27:17 crc kubenswrapper[4735]: E1008 13:27:17.895455 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:27:31 crc kubenswrapper[4735]: I1008 13:27:31.894466 4735 scope.go:117] "RemoveContainer" containerID="e875648415936f7b8301549fb9b938d29cca1f9c87a5286e73a78506e797f56a" Oct 08 13:27:31 crc kubenswrapper[4735]: E1008 13:27:31.895121 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:27:43 crc kubenswrapper[4735]: I1008 13:27:43.895428 4735 scope.go:117] "RemoveContainer" containerID="e875648415936f7b8301549fb9b938d29cca1f9c87a5286e73a78506e797f56a" Oct 08 13:27:43 crc kubenswrapper[4735]: E1008 13:27:43.896311 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:27:57 crc 
kubenswrapper[4735]: I1008 13:27:57.895284 4735 scope.go:117] "RemoveContainer" containerID="e875648415936f7b8301549fb9b938d29cca1f9c87a5286e73a78506e797f56a" Oct 08 13:27:57 crc kubenswrapper[4735]: E1008 13:27:57.897057 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:28:10 crc kubenswrapper[4735]: I1008 13:28:10.899691 4735 scope.go:117] "RemoveContainer" containerID="e875648415936f7b8301549fb9b938d29cca1f9c87a5286e73a78506e797f56a" Oct 08 13:28:10 crc kubenswrapper[4735]: E1008 13:28:10.900363 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:28:24 crc kubenswrapper[4735]: I1008 13:28:24.895040 4735 scope.go:117] "RemoveContainer" containerID="e875648415936f7b8301549fb9b938d29cca1f9c87a5286e73a78506e797f56a" Oct 08 13:28:24 crc kubenswrapper[4735]: E1008 13:28:24.895749 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 
08 13:28:38 crc kubenswrapper[4735]: I1008 13:28:38.894831 4735 scope.go:117] "RemoveContainer" containerID="e875648415936f7b8301549fb9b938d29cca1f9c87a5286e73a78506e797f56a" Oct 08 13:28:38 crc kubenswrapper[4735]: E1008 13:28:38.895658 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:28:50 crc kubenswrapper[4735]: I1008 13:28:50.902667 4735 scope.go:117] "RemoveContainer" containerID="e875648415936f7b8301549fb9b938d29cca1f9c87a5286e73a78506e797f56a" Oct 08 13:28:50 crc kubenswrapper[4735]: E1008 13:28:50.903936 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:29:03 crc kubenswrapper[4735]: I1008 13:29:03.895507 4735 scope.go:117] "RemoveContainer" containerID="e875648415936f7b8301549fb9b938d29cca1f9c87a5286e73a78506e797f56a" Oct 08 13:29:04 crc kubenswrapper[4735]: I1008 13:29:04.722611 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" event={"ID":"2dfd0474-f64a-4d3b-9990-df182d079b10","Type":"ContainerStarted","Data":"0d22801e37fea2d9c0b97338603164ed45026eabc3d3be49c34db8bc170436d5"} Oct 08 13:30:00 crc kubenswrapper[4735]: I1008 13:30:00.146856 4735 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29332170-gh5m7"] Oct 08 13:30:00 crc kubenswrapper[4735]: E1008 13:30:00.147834 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82edd02f-5f52-4296-843b-edf563200302" containerName="registry-server" Oct 08 13:30:00 crc kubenswrapper[4735]: I1008 13:30:00.147853 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="82edd02f-5f52-4296-843b-edf563200302" containerName="registry-server" Oct 08 13:30:00 crc kubenswrapper[4735]: E1008 13:30:00.147869 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3046b2f-2ce4-473c-a76d-8931d1640182" containerName="registry-server" Oct 08 13:30:00 crc kubenswrapper[4735]: I1008 13:30:00.147876 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3046b2f-2ce4-473c-a76d-8931d1640182" containerName="registry-server" Oct 08 13:30:00 crc kubenswrapper[4735]: E1008 13:30:00.147897 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82edd02f-5f52-4296-843b-edf563200302" containerName="extract-content" Oct 08 13:30:00 crc kubenswrapper[4735]: I1008 13:30:00.147906 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="82edd02f-5f52-4296-843b-edf563200302" containerName="extract-content" Oct 08 13:30:00 crc kubenswrapper[4735]: E1008 13:30:00.147918 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82edd02f-5f52-4296-843b-edf563200302" containerName="extract-utilities" Oct 08 13:30:00 crc kubenswrapper[4735]: I1008 13:30:00.147927 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="82edd02f-5f52-4296-843b-edf563200302" containerName="extract-utilities" Oct 08 13:30:00 crc kubenswrapper[4735]: E1008 13:30:00.147948 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3046b2f-2ce4-473c-a76d-8931d1640182" containerName="extract-content" Oct 08 13:30:00 crc kubenswrapper[4735]: I1008 13:30:00.147958 4735 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b3046b2f-2ce4-473c-a76d-8931d1640182" containerName="extract-content" Oct 08 13:30:00 crc kubenswrapper[4735]: E1008 13:30:00.147974 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3046b2f-2ce4-473c-a76d-8931d1640182" containerName="extract-utilities" Oct 08 13:30:00 crc kubenswrapper[4735]: I1008 13:30:00.147981 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3046b2f-2ce4-473c-a76d-8931d1640182" containerName="extract-utilities" Oct 08 13:30:00 crc kubenswrapper[4735]: I1008 13:30:00.148152 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="82edd02f-5f52-4296-843b-edf563200302" containerName="registry-server" Oct 08 13:30:00 crc kubenswrapper[4735]: I1008 13:30:00.148186 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3046b2f-2ce4-473c-a76d-8931d1640182" containerName="registry-server" Oct 08 13:30:00 crc kubenswrapper[4735]: I1008 13:30:00.148864 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332170-gh5m7" Oct 08 13:30:00 crc kubenswrapper[4735]: I1008 13:30:00.151866 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 08 13:30:00 crc kubenswrapper[4735]: I1008 13:30:00.152090 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 08 13:30:00 crc kubenswrapper[4735]: I1008 13:30:00.156553 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332170-gh5m7"] Oct 08 13:30:00 crc kubenswrapper[4735]: I1008 13:30:00.271587 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/339b07cc-0402-457f-bb34-54b687ecab27-config-volume\") pod 
\"collect-profiles-29332170-gh5m7\" (UID: \"339b07cc-0402-457f-bb34-54b687ecab27\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332170-gh5m7" Oct 08 13:30:00 crc kubenswrapper[4735]: I1008 13:30:00.271685 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8bzh\" (UniqueName: \"kubernetes.io/projected/339b07cc-0402-457f-bb34-54b687ecab27-kube-api-access-d8bzh\") pod \"collect-profiles-29332170-gh5m7\" (UID: \"339b07cc-0402-457f-bb34-54b687ecab27\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332170-gh5m7" Oct 08 13:30:00 crc kubenswrapper[4735]: I1008 13:30:00.271723 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/339b07cc-0402-457f-bb34-54b687ecab27-secret-volume\") pod \"collect-profiles-29332170-gh5m7\" (UID: \"339b07cc-0402-457f-bb34-54b687ecab27\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332170-gh5m7" Oct 08 13:30:00 crc kubenswrapper[4735]: I1008 13:30:00.373525 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/339b07cc-0402-457f-bb34-54b687ecab27-config-volume\") pod \"collect-profiles-29332170-gh5m7\" (UID: \"339b07cc-0402-457f-bb34-54b687ecab27\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332170-gh5m7" Oct 08 13:30:00 crc kubenswrapper[4735]: I1008 13:30:00.373617 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8bzh\" (UniqueName: \"kubernetes.io/projected/339b07cc-0402-457f-bb34-54b687ecab27-kube-api-access-d8bzh\") pod \"collect-profiles-29332170-gh5m7\" (UID: \"339b07cc-0402-457f-bb34-54b687ecab27\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332170-gh5m7" Oct 08 13:30:00 crc kubenswrapper[4735]: I1008 13:30:00.373655 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/339b07cc-0402-457f-bb34-54b687ecab27-secret-volume\") pod \"collect-profiles-29332170-gh5m7\" (UID: \"339b07cc-0402-457f-bb34-54b687ecab27\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332170-gh5m7" Oct 08 13:30:00 crc kubenswrapper[4735]: I1008 13:30:00.374992 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/339b07cc-0402-457f-bb34-54b687ecab27-config-volume\") pod \"collect-profiles-29332170-gh5m7\" (UID: \"339b07cc-0402-457f-bb34-54b687ecab27\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332170-gh5m7" Oct 08 13:30:00 crc kubenswrapper[4735]: I1008 13:30:00.383387 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/339b07cc-0402-457f-bb34-54b687ecab27-secret-volume\") pod \"collect-profiles-29332170-gh5m7\" (UID: \"339b07cc-0402-457f-bb34-54b687ecab27\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332170-gh5m7" Oct 08 13:30:00 crc kubenswrapper[4735]: I1008 13:30:00.390800 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8bzh\" (UniqueName: \"kubernetes.io/projected/339b07cc-0402-457f-bb34-54b687ecab27-kube-api-access-d8bzh\") pod \"collect-profiles-29332170-gh5m7\" (UID: \"339b07cc-0402-457f-bb34-54b687ecab27\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332170-gh5m7" Oct 08 13:30:00 crc kubenswrapper[4735]: I1008 13:30:00.481221 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332170-gh5m7" Oct 08 13:30:00 crc kubenswrapper[4735]: I1008 13:30:00.904542 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332170-gh5m7"] Oct 08 13:30:01 crc kubenswrapper[4735]: I1008 13:30:01.158286 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332170-gh5m7" event={"ID":"339b07cc-0402-457f-bb34-54b687ecab27","Type":"ContainerStarted","Data":"d1017f850ef74255e647c8fe7b81efd3452772d20c91d0d1577ab0b8a45967b5"} Oct 08 13:30:01 crc kubenswrapper[4735]: I1008 13:30:01.158635 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332170-gh5m7" event={"ID":"339b07cc-0402-457f-bb34-54b687ecab27","Type":"ContainerStarted","Data":"b692e13bf5c7f8e388d2175e874fa14569d41c6df6defa5eb7739089417c52e7"} Oct 08 13:30:01 crc kubenswrapper[4735]: I1008 13:30:01.175945 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29332170-gh5m7" podStartSLOduration=1.175927113 podStartE2EDuration="1.175927113s" podCreationTimestamp="2025-10-08 13:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 13:30:01.171248843 +0000 UTC m=+2050.979553387" watchObservedRunningTime="2025-10-08 13:30:01.175927113 +0000 UTC m=+2050.984231657" Oct 08 13:30:02 crc kubenswrapper[4735]: I1008 13:30:02.169033 4735 generic.go:334] "Generic (PLEG): container finished" podID="339b07cc-0402-457f-bb34-54b687ecab27" containerID="d1017f850ef74255e647c8fe7b81efd3452772d20c91d0d1577ab0b8a45967b5" exitCode=0 Oct 08 13:30:02 crc kubenswrapper[4735]: I1008 13:30:02.169090 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29332170-gh5m7" event={"ID":"339b07cc-0402-457f-bb34-54b687ecab27","Type":"ContainerDied","Data":"d1017f850ef74255e647c8fe7b81efd3452772d20c91d0d1577ab0b8a45967b5"} Oct 08 13:30:03 crc kubenswrapper[4735]: I1008 13:30:03.477949 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332170-gh5m7" Oct 08 13:30:03 crc kubenswrapper[4735]: I1008 13:30:03.626224 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/339b07cc-0402-457f-bb34-54b687ecab27-secret-volume\") pod \"339b07cc-0402-457f-bb34-54b687ecab27\" (UID: \"339b07cc-0402-457f-bb34-54b687ecab27\") " Oct 08 13:30:03 crc kubenswrapper[4735]: I1008 13:30:03.626316 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/339b07cc-0402-457f-bb34-54b687ecab27-config-volume\") pod \"339b07cc-0402-457f-bb34-54b687ecab27\" (UID: \"339b07cc-0402-457f-bb34-54b687ecab27\") " Oct 08 13:30:03 crc kubenswrapper[4735]: I1008 13:30:03.626354 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8bzh\" (UniqueName: \"kubernetes.io/projected/339b07cc-0402-457f-bb34-54b687ecab27-kube-api-access-d8bzh\") pod \"339b07cc-0402-457f-bb34-54b687ecab27\" (UID: \"339b07cc-0402-457f-bb34-54b687ecab27\") " Oct 08 13:30:03 crc kubenswrapper[4735]: I1008 13:30:03.627239 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/339b07cc-0402-457f-bb34-54b687ecab27-config-volume" (OuterVolumeSpecName: "config-volume") pod "339b07cc-0402-457f-bb34-54b687ecab27" (UID: "339b07cc-0402-457f-bb34-54b687ecab27"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:30:03 crc kubenswrapper[4735]: I1008 13:30:03.631651 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/339b07cc-0402-457f-bb34-54b687ecab27-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "339b07cc-0402-457f-bb34-54b687ecab27" (UID: "339b07cc-0402-457f-bb34-54b687ecab27"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:30:03 crc kubenswrapper[4735]: I1008 13:30:03.634680 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/339b07cc-0402-457f-bb34-54b687ecab27-kube-api-access-d8bzh" (OuterVolumeSpecName: "kube-api-access-d8bzh") pod "339b07cc-0402-457f-bb34-54b687ecab27" (UID: "339b07cc-0402-457f-bb34-54b687ecab27"). InnerVolumeSpecName "kube-api-access-d8bzh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:30:03 crc kubenswrapper[4735]: I1008 13:30:03.727697 4735 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/339b07cc-0402-457f-bb34-54b687ecab27-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 08 13:30:03 crc kubenswrapper[4735]: I1008 13:30:03.728038 4735 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/339b07cc-0402-457f-bb34-54b687ecab27-config-volume\") on node \"crc\" DevicePath \"\"" Oct 08 13:30:03 crc kubenswrapper[4735]: I1008 13:30:03.728055 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8bzh\" (UniqueName: \"kubernetes.io/projected/339b07cc-0402-457f-bb34-54b687ecab27-kube-api-access-d8bzh\") on node \"crc\" DevicePath \"\"" Oct 08 13:30:04 crc kubenswrapper[4735]: I1008 13:30:04.186927 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332170-gh5m7" 
event={"ID":"339b07cc-0402-457f-bb34-54b687ecab27","Type":"ContainerDied","Data":"b692e13bf5c7f8e388d2175e874fa14569d41c6df6defa5eb7739089417c52e7"} Oct 08 13:30:04 crc kubenswrapper[4735]: I1008 13:30:04.186965 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b692e13bf5c7f8e388d2175e874fa14569d41c6df6defa5eb7739089417c52e7" Oct 08 13:30:04 crc kubenswrapper[4735]: I1008 13:30:04.186994 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332170-gh5m7" Oct 08 13:30:04 crc kubenswrapper[4735]: I1008 13:30:04.239918 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332125-6dxj9"] Oct 08 13:30:04 crc kubenswrapper[4735]: I1008 13:30:04.245981 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332125-6dxj9"] Oct 08 13:30:04 crc kubenswrapper[4735]: I1008 13:30:04.907430 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93cce0d5-b158-4619-95f3-eaa50b8d9af4" path="/var/lib/kubelet/pods/93cce0d5-b158-4619-95f3-eaa50b8d9af4/volumes" Oct 08 13:30:05 crc kubenswrapper[4735]: I1008 13:30:05.251806 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5gh5p"] Oct 08 13:30:05 crc kubenswrapper[4735]: E1008 13:30:05.252574 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="339b07cc-0402-457f-bb34-54b687ecab27" containerName="collect-profiles" Oct 08 13:30:05 crc kubenswrapper[4735]: I1008 13:30:05.252608 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="339b07cc-0402-457f-bb34-54b687ecab27" containerName="collect-profiles" Oct 08 13:30:05 crc kubenswrapper[4735]: I1008 13:30:05.252849 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="339b07cc-0402-457f-bb34-54b687ecab27" containerName="collect-profiles" Oct 
08 13:30:05 crc kubenswrapper[4735]: I1008 13:30:05.254368 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5gh5p" Oct 08 13:30:05 crc kubenswrapper[4735]: I1008 13:30:05.267210 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5gh5p"] Oct 08 13:30:05 crc kubenswrapper[4735]: I1008 13:30:05.353723 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58dv7\" (UniqueName: \"kubernetes.io/projected/2b6c2094-5577-4091-94e2-19d9524db4de-kube-api-access-58dv7\") pod \"redhat-operators-5gh5p\" (UID: \"2b6c2094-5577-4091-94e2-19d9524db4de\") " pod="openshift-marketplace/redhat-operators-5gh5p" Oct 08 13:30:05 crc kubenswrapper[4735]: I1008 13:30:05.353778 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b6c2094-5577-4091-94e2-19d9524db4de-catalog-content\") pod \"redhat-operators-5gh5p\" (UID: \"2b6c2094-5577-4091-94e2-19d9524db4de\") " pod="openshift-marketplace/redhat-operators-5gh5p" Oct 08 13:30:05 crc kubenswrapper[4735]: I1008 13:30:05.353815 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b6c2094-5577-4091-94e2-19d9524db4de-utilities\") pod \"redhat-operators-5gh5p\" (UID: \"2b6c2094-5577-4091-94e2-19d9524db4de\") " pod="openshift-marketplace/redhat-operators-5gh5p" Oct 08 13:30:05 crc kubenswrapper[4735]: I1008 13:30:05.454764 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58dv7\" (UniqueName: \"kubernetes.io/projected/2b6c2094-5577-4091-94e2-19d9524db4de-kube-api-access-58dv7\") pod \"redhat-operators-5gh5p\" (UID: \"2b6c2094-5577-4091-94e2-19d9524db4de\") " pod="openshift-marketplace/redhat-operators-5gh5p" Oct 08 
13:30:05 crc kubenswrapper[4735]: I1008 13:30:05.454826 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b6c2094-5577-4091-94e2-19d9524db4de-catalog-content\") pod \"redhat-operators-5gh5p\" (UID: \"2b6c2094-5577-4091-94e2-19d9524db4de\") " pod="openshift-marketplace/redhat-operators-5gh5p" Oct 08 13:30:05 crc kubenswrapper[4735]: I1008 13:30:05.454868 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b6c2094-5577-4091-94e2-19d9524db4de-utilities\") pod \"redhat-operators-5gh5p\" (UID: \"2b6c2094-5577-4091-94e2-19d9524db4de\") " pod="openshift-marketplace/redhat-operators-5gh5p" Oct 08 13:30:05 crc kubenswrapper[4735]: I1008 13:30:05.455535 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b6c2094-5577-4091-94e2-19d9524db4de-utilities\") pod \"redhat-operators-5gh5p\" (UID: \"2b6c2094-5577-4091-94e2-19d9524db4de\") " pod="openshift-marketplace/redhat-operators-5gh5p" Oct 08 13:30:05 crc kubenswrapper[4735]: I1008 13:30:05.455687 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b6c2094-5577-4091-94e2-19d9524db4de-catalog-content\") pod \"redhat-operators-5gh5p\" (UID: \"2b6c2094-5577-4091-94e2-19d9524db4de\") " pod="openshift-marketplace/redhat-operators-5gh5p" Oct 08 13:30:05 crc kubenswrapper[4735]: I1008 13:30:05.477439 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58dv7\" (UniqueName: \"kubernetes.io/projected/2b6c2094-5577-4091-94e2-19d9524db4de-kube-api-access-58dv7\") pod \"redhat-operators-5gh5p\" (UID: \"2b6c2094-5577-4091-94e2-19d9524db4de\") " pod="openshift-marketplace/redhat-operators-5gh5p" Oct 08 13:30:05 crc kubenswrapper[4735]: I1008 13:30:05.573486 4735 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5gh5p" Oct 08 13:30:05 crc kubenswrapper[4735]: I1008 13:30:05.834829 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5gh5p"] Oct 08 13:30:06 crc kubenswrapper[4735]: I1008 13:30:06.202867 4735 generic.go:334] "Generic (PLEG): container finished" podID="2b6c2094-5577-4091-94e2-19d9524db4de" containerID="a5c63b902c611bc0210e2e424e37270039a623ab64f71e4b1095bab8431fe8d5" exitCode=0 Oct 08 13:30:06 crc kubenswrapper[4735]: I1008 13:30:06.202915 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5gh5p" event={"ID":"2b6c2094-5577-4091-94e2-19d9524db4de","Type":"ContainerDied","Data":"a5c63b902c611bc0210e2e424e37270039a623ab64f71e4b1095bab8431fe8d5"} Oct 08 13:30:06 crc kubenswrapper[4735]: I1008 13:30:06.202939 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5gh5p" event={"ID":"2b6c2094-5577-4091-94e2-19d9524db4de","Type":"ContainerStarted","Data":"ed5abc7577be6ae6952419bcf26b4063f7fc3550efc1deeb38ddea53045d8072"} Oct 08 13:30:06 crc kubenswrapper[4735]: I1008 13:30:06.573240 4735 scope.go:117] "RemoveContainer" containerID="5389c3006c64b00e2fd3278755a075217ee591a2311848de5bb48a47583d90b7" Oct 08 13:30:08 crc kubenswrapper[4735]: I1008 13:30:08.217806 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5gh5p" event={"ID":"2b6c2094-5577-4091-94e2-19d9524db4de","Type":"ContainerStarted","Data":"842677a536786ff59b36484e765e9fe246fa1a65e7bc5a49058e8b30b902393c"} Oct 08 13:30:09 crc kubenswrapper[4735]: I1008 13:30:09.230720 4735 generic.go:334] "Generic (PLEG): container finished" podID="2b6c2094-5577-4091-94e2-19d9524db4de" containerID="842677a536786ff59b36484e765e9fe246fa1a65e7bc5a49058e8b30b902393c" exitCode=0 Oct 08 13:30:09 crc kubenswrapper[4735]: I1008 13:30:09.230781 4735 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5gh5p" event={"ID":"2b6c2094-5577-4091-94e2-19d9524db4de","Type":"ContainerDied","Data":"842677a536786ff59b36484e765e9fe246fa1a65e7bc5a49058e8b30b902393c"} Oct 08 13:30:10 crc kubenswrapper[4735]: I1008 13:30:10.240870 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5gh5p" event={"ID":"2b6c2094-5577-4091-94e2-19d9524db4de","Type":"ContainerStarted","Data":"c38a38b60668a8a8c6b3d83666eade291dcdd3d77a6860f70643498def1fe515"} Oct 08 13:30:10 crc kubenswrapper[4735]: I1008 13:30:10.261895 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5gh5p" podStartSLOduration=1.560392 podStartE2EDuration="5.261872491s" podCreationTimestamp="2025-10-08 13:30:05 +0000 UTC" firstStartedPulling="2025-10-08 13:30:06.204626883 +0000 UTC m=+2056.012931417" lastFinishedPulling="2025-10-08 13:30:09.906107364 +0000 UTC m=+2059.714411908" observedRunningTime="2025-10-08 13:30:10.261015049 +0000 UTC m=+2060.069319613" watchObservedRunningTime="2025-10-08 13:30:10.261872491 +0000 UTC m=+2060.070177035" Oct 08 13:30:15 crc kubenswrapper[4735]: I1008 13:30:15.573927 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5gh5p" Oct 08 13:30:15 crc kubenswrapper[4735]: I1008 13:30:15.574498 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5gh5p" Oct 08 13:30:15 crc kubenswrapper[4735]: I1008 13:30:15.616015 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5gh5p" Oct 08 13:30:16 crc kubenswrapper[4735]: I1008 13:30:16.323687 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5gh5p" Oct 08 13:30:16 crc kubenswrapper[4735]: I1008 
13:30:16.371125 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5gh5p"] Oct 08 13:30:18 crc kubenswrapper[4735]: I1008 13:30:18.296384 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5gh5p" podUID="2b6c2094-5577-4091-94e2-19d9524db4de" containerName="registry-server" containerID="cri-o://c38a38b60668a8a8c6b3d83666eade291dcdd3d77a6860f70643498def1fe515" gracePeriod=2 Oct 08 13:30:19 crc kubenswrapper[4735]: I1008 13:30:19.799766 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5gh5p" Oct 08 13:30:19 crc kubenswrapper[4735]: I1008 13:30:19.856677 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58dv7\" (UniqueName: \"kubernetes.io/projected/2b6c2094-5577-4091-94e2-19d9524db4de-kube-api-access-58dv7\") pod \"2b6c2094-5577-4091-94e2-19d9524db4de\" (UID: \"2b6c2094-5577-4091-94e2-19d9524db4de\") " Oct 08 13:30:19 crc kubenswrapper[4735]: I1008 13:30:19.856797 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b6c2094-5577-4091-94e2-19d9524db4de-utilities\") pod \"2b6c2094-5577-4091-94e2-19d9524db4de\" (UID: \"2b6c2094-5577-4091-94e2-19d9524db4de\") " Oct 08 13:30:19 crc kubenswrapper[4735]: I1008 13:30:19.856845 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b6c2094-5577-4091-94e2-19d9524db4de-catalog-content\") pod \"2b6c2094-5577-4091-94e2-19d9524db4de\" (UID: \"2b6c2094-5577-4091-94e2-19d9524db4de\") " Oct 08 13:30:19 crc kubenswrapper[4735]: I1008 13:30:19.857663 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b6c2094-5577-4091-94e2-19d9524db4de-utilities" (OuterVolumeSpecName: 
"utilities") pod "2b6c2094-5577-4091-94e2-19d9524db4de" (UID: "2b6c2094-5577-4091-94e2-19d9524db4de"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:30:19 crc kubenswrapper[4735]: I1008 13:30:19.868600 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b6c2094-5577-4091-94e2-19d9524db4de-kube-api-access-58dv7" (OuterVolumeSpecName: "kube-api-access-58dv7") pod "2b6c2094-5577-4091-94e2-19d9524db4de" (UID: "2b6c2094-5577-4091-94e2-19d9524db4de"). InnerVolumeSpecName "kube-api-access-58dv7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:30:19 crc kubenswrapper[4735]: I1008 13:30:19.944013 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b6c2094-5577-4091-94e2-19d9524db4de-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2b6c2094-5577-4091-94e2-19d9524db4de" (UID: "2b6c2094-5577-4091-94e2-19d9524db4de"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:30:19 crc kubenswrapper[4735]: I1008 13:30:19.958736 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58dv7\" (UniqueName: \"kubernetes.io/projected/2b6c2094-5577-4091-94e2-19d9524db4de-kube-api-access-58dv7\") on node \"crc\" DevicePath \"\"" Oct 08 13:30:19 crc kubenswrapper[4735]: I1008 13:30:19.958773 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b6c2094-5577-4091-94e2-19d9524db4de-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 13:30:19 crc kubenswrapper[4735]: I1008 13:30:19.958785 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b6c2094-5577-4091-94e2-19d9524db4de-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 13:30:20 crc kubenswrapper[4735]: I1008 13:30:20.312289 4735 generic.go:334] "Generic (PLEG): container finished" podID="2b6c2094-5577-4091-94e2-19d9524db4de" containerID="c38a38b60668a8a8c6b3d83666eade291dcdd3d77a6860f70643498def1fe515" exitCode=0 Oct 08 13:30:20 crc kubenswrapper[4735]: I1008 13:30:20.312389 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5gh5p" event={"ID":"2b6c2094-5577-4091-94e2-19d9524db4de","Type":"ContainerDied","Data":"c38a38b60668a8a8c6b3d83666eade291dcdd3d77a6860f70643498def1fe515"} Oct 08 13:30:20 crc kubenswrapper[4735]: I1008 13:30:20.312435 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5gh5p" event={"ID":"2b6c2094-5577-4091-94e2-19d9524db4de","Type":"ContainerDied","Data":"ed5abc7577be6ae6952419bcf26b4063f7fc3550efc1deeb38ddea53045d8072"} Oct 08 13:30:20 crc kubenswrapper[4735]: I1008 13:30:20.312461 4735 scope.go:117] "RemoveContainer" containerID="c38a38b60668a8a8c6b3d83666eade291dcdd3d77a6860f70643498def1fe515" Oct 08 13:30:20 crc kubenswrapper[4735]: I1008 13:30:20.312665 
4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5gh5p" Oct 08 13:30:20 crc kubenswrapper[4735]: I1008 13:30:20.355397 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5gh5p"] Oct 08 13:30:20 crc kubenswrapper[4735]: I1008 13:30:20.361193 4735 scope.go:117] "RemoveContainer" containerID="842677a536786ff59b36484e765e9fe246fa1a65e7bc5a49058e8b30b902393c" Oct 08 13:30:20 crc kubenswrapper[4735]: I1008 13:30:20.362990 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5gh5p"] Oct 08 13:30:20 crc kubenswrapper[4735]: I1008 13:30:20.378383 4735 scope.go:117] "RemoveContainer" containerID="a5c63b902c611bc0210e2e424e37270039a623ab64f71e4b1095bab8431fe8d5" Oct 08 13:30:20 crc kubenswrapper[4735]: I1008 13:30:20.411278 4735 scope.go:117] "RemoveContainer" containerID="c38a38b60668a8a8c6b3d83666eade291dcdd3d77a6860f70643498def1fe515" Oct 08 13:30:20 crc kubenswrapper[4735]: E1008 13:30:20.412030 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c38a38b60668a8a8c6b3d83666eade291dcdd3d77a6860f70643498def1fe515\": container with ID starting with c38a38b60668a8a8c6b3d83666eade291dcdd3d77a6860f70643498def1fe515 not found: ID does not exist" containerID="c38a38b60668a8a8c6b3d83666eade291dcdd3d77a6860f70643498def1fe515" Oct 08 13:30:20 crc kubenswrapper[4735]: I1008 13:30:20.412067 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c38a38b60668a8a8c6b3d83666eade291dcdd3d77a6860f70643498def1fe515"} err="failed to get container status \"c38a38b60668a8a8c6b3d83666eade291dcdd3d77a6860f70643498def1fe515\": rpc error: code = NotFound desc = could not find container \"c38a38b60668a8a8c6b3d83666eade291dcdd3d77a6860f70643498def1fe515\": container with ID starting with 
c38a38b60668a8a8c6b3d83666eade291dcdd3d77a6860f70643498def1fe515 not found: ID does not exist" Oct 08 13:30:20 crc kubenswrapper[4735]: I1008 13:30:20.412091 4735 scope.go:117] "RemoveContainer" containerID="842677a536786ff59b36484e765e9fe246fa1a65e7bc5a49058e8b30b902393c" Oct 08 13:30:20 crc kubenswrapper[4735]: E1008 13:30:20.412394 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"842677a536786ff59b36484e765e9fe246fa1a65e7bc5a49058e8b30b902393c\": container with ID starting with 842677a536786ff59b36484e765e9fe246fa1a65e7bc5a49058e8b30b902393c not found: ID does not exist" containerID="842677a536786ff59b36484e765e9fe246fa1a65e7bc5a49058e8b30b902393c" Oct 08 13:30:20 crc kubenswrapper[4735]: I1008 13:30:20.412423 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"842677a536786ff59b36484e765e9fe246fa1a65e7bc5a49058e8b30b902393c"} err="failed to get container status \"842677a536786ff59b36484e765e9fe246fa1a65e7bc5a49058e8b30b902393c\": rpc error: code = NotFound desc = could not find container \"842677a536786ff59b36484e765e9fe246fa1a65e7bc5a49058e8b30b902393c\": container with ID starting with 842677a536786ff59b36484e765e9fe246fa1a65e7bc5a49058e8b30b902393c not found: ID does not exist" Oct 08 13:30:20 crc kubenswrapper[4735]: I1008 13:30:20.412443 4735 scope.go:117] "RemoveContainer" containerID="a5c63b902c611bc0210e2e424e37270039a623ab64f71e4b1095bab8431fe8d5" Oct 08 13:30:20 crc kubenswrapper[4735]: E1008 13:30:20.412815 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5c63b902c611bc0210e2e424e37270039a623ab64f71e4b1095bab8431fe8d5\": container with ID starting with a5c63b902c611bc0210e2e424e37270039a623ab64f71e4b1095bab8431fe8d5 not found: ID does not exist" containerID="a5c63b902c611bc0210e2e424e37270039a623ab64f71e4b1095bab8431fe8d5" Oct 08 13:30:20 crc 
kubenswrapper[4735]: I1008 13:30:20.412839 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5c63b902c611bc0210e2e424e37270039a623ab64f71e4b1095bab8431fe8d5"} err="failed to get container status \"a5c63b902c611bc0210e2e424e37270039a623ab64f71e4b1095bab8431fe8d5\": rpc error: code = NotFound desc = could not find container \"a5c63b902c611bc0210e2e424e37270039a623ab64f71e4b1095bab8431fe8d5\": container with ID starting with a5c63b902c611bc0210e2e424e37270039a623ab64f71e4b1095bab8431fe8d5 not found: ID does not exist" Oct 08 13:30:20 crc kubenswrapper[4735]: I1008 13:30:20.905969 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b6c2094-5577-4091-94e2-19d9524db4de" path="/var/lib/kubelet/pods/2b6c2094-5577-4091-94e2-19d9524db4de/volumes" Oct 08 13:31:24 crc kubenswrapper[4735]: I1008 13:31:24.878544 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 13:31:24 crc kubenswrapper[4735]: I1008 13:31:24.879112 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 13:31:46 crc kubenswrapper[4735]: I1008 13:31:46.961676 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rrq6x"] Oct 08 13:31:46 crc kubenswrapper[4735]: E1008 13:31:46.962463 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b6c2094-5577-4091-94e2-19d9524db4de" containerName="extract-utilities" Oct 08 13:31:46 crc kubenswrapper[4735]: I1008 
13:31:46.962493 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b6c2094-5577-4091-94e2-19d9524db4de" containerName="extract-utilities" Oct 08 13:31:46 crc kubenswrapper[4735]: E1008 13:31:46.962504 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b6c2094-5577-4091-94e2-19d9524db4de" containerName="registry-server" Oct 08 13:31:46 crc kubenswrapper[4735]: I1008 13:31:46.962512 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b6c2094-5577-4091-94e2-19d9524db4de" containerName="registry-server" Oct 08 13:31:46 crc kubenswrapper[4735]: E1008 13:31:46.962527 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b6c2094-5577-4091-94e2-19d9524db4de" containerName="extract-content" Oct 08 13:31:46 crc kubenswrapper[4735]: I1008 13:31:46.962533 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b6c2094-5577-4091-94e2-19d9524db4de" containerName="extract-content" Oct 08 13:31:46 crc kubenswrapper[4735]: I1008 13:31:46.962687 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b6c2094-5577-4091-94e2-19d9524db4de" containerName="registry-server" Oct 08 13:31:46 crc kubenswrapper[4735]: I1008 13:31:46.967162 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rrq6x" Oct 08 13:31:46 crc kubenswrapper[4735]: I1008 13:31:46.973114 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rrq6x"] Oct 08 13:31:47 crc kubenswrapper[4735]: I1008 13:31:47.051707 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxqp4\" (UniqueName: \"kubernetes.io/projected/8d70ee3a-74c8-44e6-99d1-b558d7315963-kube-api-access-fxqp4\") pod \"certified-operators-rrq6x\" (UID: \"8d70ee3a-74c8-44e6-99d1-b558d7315963\") " pod="openshift-marketplace/certified-operators-rrq6x" Oct 08 13:31:47 crc kubenswrapper[4735]: I1008 13:31:47.051765 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d70ee3a-74c8-44e6-99d1-b558d7315963-utilities\") pod \"certified-operators-rrq6x\" (UID: \"8d70ee3a-74c8-44e6-99d1-b558d7315963\") " pod="openshift-marketplace/certified-operators-rrq6x" Oct 08 13:31:47 crc kubenswrapper[4735]: I1008 13:31:47.051877 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d70ee3a-74c8-44e6-99d1-b558d7315963-catalog-content\") pod \"certified-operators-rrq6x\" (UID: \"8d70ee3a-74c8-44e6-99d1-b558d7315963\") " pod="openshift-marketplace/certified-operators-rrq6x" Oct 08 13:31:47 crc kubenswrapper[4735]: I1008 13:31:47.152442 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxqp4\" (UniqueName: \"kubernetes.io/projected/8d70ee3a-74c8-44e6-99d1-b558d7315963-kube-api-access-fxqp4\") pod \"certified-operators-rrq6x\" (UID: \"8d70ee3a-74c8-44e6-99d1-b558d7315963\") " pod="openshift-marketplace/certified-operators-rrq6x" Oct 08 13:31:47 crc kubenswrapper[4735]: I1008 13:31:47.152565 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d70ee3a-74c8-44e6-99d1-b558d7315963-utilities\") pod \"certified-operators-rrq6x\" (UID: \"8d70ee3a-74c8-44e6-99d1-b558d7315963\") " pod="openshift-marketplace/certified-operators-rrq6x" Oct 08 13:31:47 crc kubenswrapper[4735]: I1008 13:31:47.152626 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d70ee3a-74c8-44e6-99d1-b558d7315963-catalog-content\") pod \"certified-operators-rrq6x\" (UID: \"8d70ee3a-74c8-44e6-99d1-b558d7315963\") " pod="openshift-marketplace/certified-operators-rrq6x" Oct 08 13:31:47 crc kubenswrapper[4735]: I1008 13:31:47.153058 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d70ee3a-74c8-44e6-99d1-b558d7315963-catalog-content\") pod \"certified-operators-rrq6x\" (UID: \"8d70ee3a-74c8-44e6-99d1-b558d7315963\") " pod="openshift-marketplace/certified-operators-rrq6x" Oct 08 13:31:47 crc kubenswrapper[4735]: I1008 13:31:47.153100 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d70ee3a-74c8-44e6-99d1-b558d7315963-utilities\") pod \"certified-operators-rrq6x\" (UID: \"8d70ee3a-74c8-44e6-99d1-b558d7315963\") " pod="openshift-marketplace/certified-operators-rrq6x" Oct 08 13:31:47 crc kubenswrapper[4735]: I1008 13:31:47.174768 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxqp4\" (UniqueName: \"kubernetes.io/projected/8d70ee3a-74c8-44e6-99d1-b558d7315963-kube-api-access-fxqp4\") pod \"certified-operators-rrq6x\" (UID: \"8d70ee3a-74c8-44e6-99d1-b558d7315963\") " pod="openshift-marketplace/certified-operators-rrq6x" Oct 08 13:31:47 crc kubenswrapper[4735]: I1008 13:31:47.297678 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rrq6x" Oct 08 13:31:47 crc kubenswrapper[4735]: I1008 13:31:47.774787 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rrq6x"] Oct 08 13:31:47 crc kubenswrapper[4735]: I1008 13:31:47.969770 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rrq6x" event={"ID":"8d70ee3a-74c8-44e6-99d1-b558d7315963","Type":"ContainerStarted","Data":"5505ddf88d9cac821f93d8cde38cfedcf3da0dd1d3f1d8555cb8cea54fe6d89f"} Oct 08 13:31:48 crc kubenswrapper[4735]: I1008 13:31:48.981956 4735 generic.go:334] "Generic (PLEG): container finished" podID="8d70ee3a-74c8-44e6-99d1-b558d7315963" containerID="2994a30a9434cfcd1d31f0f1f0fd73f4d9a4a8eb4521c21970416448ce1f9ea4" exitCode=0 Oct 08 13:31:48 crc kubenswrapper[4735]: I1008 13:31:48.981997 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rrq6x" event={"ID":"8d70ee3a-74c8-44e6-99d1-b558d7315963","Type":"ContainerDied","Data":"2994a30a9434cfcd1d31f0f1f0fd73f4d9a4a8eb4521c21970416448ce1f9ea4"} Oct 08 13:31:48 crc kubenswrapper[4735]: I1008 13:31:48.985458 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 08 13:31:54 crc kubenswrapper[4735]: I1008 13:31:54.021310 4735 generic.go:334] "Generic (PLEG): container finished" podID="8d70ee3a-74c8-44e6-99d1-b558d7315963" containerID="5a53b43f869c92b12d0295f8c7b7f6be3936dd0ca5f99d7fc82efd2fabf84b95" exitCode=0 Oct 08 13:31:54 crc kubenswrapper[4735]: I1008 13:31:54.021434 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rrq6x" event={"ID":"8d70ee3a-74c8-44e6-99d1-b558d7315963","Type":"ContainerDied","Data":"5a53b43f869c92b12d0295f8c7b7f6be3936dd0ca5f99d7fc82efd2fabf84b95"} Oct 08 13:31:54 crc kubenswrapper[4735]: I1008 13:31:54.878943 4735 patch_prober.go:28] interesting 
pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 13:31:54 crc kubenswrapper[4735]: I1008 13:31:54.879037 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 13:31:55 crc kubenswrapper[4735]: I1008 13:31:55.035419 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rrq6x" event={"ID":"8d70ee3a-74c8-44e6-99d1-b558d7315963","Type":"ContainerStarted","Data":"773be8b907c6ee1d600d7dea366a7a2be8c7ce737d63df6e2242dff0f969b205"} Oct 08 13:31:55 crc kubenswrapper[4735]: I1008 13:31:55.062411 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rrq6x" podStartSLOduration=3.620753477 podStartE2EDuration="9.062392563s" podCreationTimestamp="2025-10-08 13:31:46 +0000 UTC" firstStartedPulling="2025-10-08 13:31:48.98518826 +0000 UTC m=+2158.793492804" lastFinishedPulling="2025-10-08 13:31:54.426827346 +0000 UTC m=+2164.235131890" observedRunningTime="2025-10-08 13:31:55.053170777 +0000 UTC m=+2164.861475341" watchObservedRunningTime="2025-10-08 13:31:55.062392563 +0000 UTC m=+2164.870697107" Oct 08 13:31:57 crc kubenswrapper[4735]: I1008 13:31:57.298100 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rrq6x" Oct 08 13:31:57 crc kubenswrapper[4735]: I1008 13:31:57.298381 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rrq6x" Oct 08 13:31:57 
crc kubenswrapper[4735]: I1008 13:31:57.343348 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rrq6x" Oct 08 13:32:07 crc kubenswrapper[4735]: I1008 13:32:07.363008 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rrq6x" Oct 08 13:32:07 crc kubenswrapper[4735]: I1008 13:32:07.435900 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rrq6x"] Oct 08 13:32:07 crc kubenswrapper[4735]: I1008 13:32:07.471804 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-48k4v"] Oct 08 13:32:07 crc kubenswrapper[4735]: I1008 13:32:07.472124 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-48k4v" podUID="96042caa-f4e2-4a05-af44-8883a06394de" containerName="registry-server" containerID="cri-o://79d58ad9d943eda3cee489443e428d1da451a99b820b7161f642b42d0772d78a" gracePeriod=2 Oct 08 13:32:07 crc kubenswrapper[4735]: I1008 13:32:07.871980 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-48k4v" Oct 08 13:32:08 crc kubenswrapper[4735]: I1008 13:32:08.067885 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96042caa-f4e2-4a05-af44-8883a06394de-catalog-content\") pod \"96042caa-f4e2-4a05-af44-8883a06394de\" (UID: \"96042caa-f4e2-4a05-af44-8883a06394de\") " Oct 08 13:32:08 crc kubenswrapper[4735]: I1008 13:32:08.067976 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96042caa-f4e2-4a05-af44-8883a06394de-utilities\") pod \"96042caa-f4e2-4a05-af44-8883a06394de\" (UID: \"96042caa-f4e2-4a05-af44-8883a06394de\") " Oct 08 13:32:08 crc kubenswrapper[4735]: I1008 13:32:08.068071 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpx2t\" (UniqueName: \"kubernetes.io/projected/96042caa-f4e2-4a05-af44-8883a06394de-kube-api-access-mpx2t\") pod \"96042caa-f4e2-4a05-af44-8883a06394de\" (UID: \"96042caa-f4e2-4a05-af44-8883a06394de\") " Oct 08 13:32:08 crc kubenswrapper[4735]: I1008 13:32:08.068693 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96042caa-f4e2-4a05-af44-8883a06394de-utilities" (OuterVolumeSpecName: "utilities") pod "96042caa-f4e2-4a05-af44-8883a06394de" (UID: "96042caa-f4e2-4a05-af44-8883a06394de"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:32:08 crc kubenswrapper[4735]: I1008 13:32:08.083240 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96042caa-f4e2-4a05-af44-8883a06394de-kube-api-access-mpx2t" (OuterVolumeSpecName: "kube-api-access-mpx2t") pod "96042caa-f4e2-4a05-af44-8883a06394de" (UID: "96042caa-f4e2-4a05-af44-8883a06394de"). InnerVolumeSpecName "kube-api-access-mpx2t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:32:08 crc kubenswrapper[4735]: I1008 13:32:08.129809 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96042caa-f4e2-4a05-af44-8883a06394de-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "96042caa-f4e2-4a05-af44-8883a06394de" (UID: "96042caa-f4e2-4a05-af44-8883a06394de"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:32:08 crc kubenswrapper[4735]: I1008 13:32:08.149708 4735 generic.go:334] "Generic (PLEG): container finished" podID="96042caa-f4e2-4a05-af44-8883a06394de" containerID="79d58ad9d943eda3cee489443e428d1da451a99b820b7161f642b42d0772d78a" exitCode=0 Oct 08 13:32:08 crc kubenswrapper[4735]: I1008 13:32:08.149790 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-48k4v" Oct 08 13:32:08 crc kubenswrapper[4735]: I1008 13:32:08.150011 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-48k4v" event={"ID":"96042caa-f4e2-4a05-af44-8883a06394de","Type":"ContainerDied","Data":"79d58ad9d943eda3cee489443e428d1da451a99b820b7161f642b42d0772d78a"} Oct 08 13:32:08 crc kubenswrapper[4735]: I1008 13:32:08.150383 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-48k4v" event={"ID":"96042caa-f4e2-4a05-af44-8883a06394de","Type":"ContainerDied","Data":"513d3f5f811d1f23b5625c987b7069132080d95f8d260cc3d04ba9f60c3f7a03"} Oct 08 13:32:08 crc kubenswrapper[4735]: I1008 13:32:08.150583 4735 scope.go:117] "RemoveContainer" containerID="79d58ad9d943eda3cee489443e428d1da451a99b820b7161f642b42d0772d78a" Oct 08 13:32:08 crc kubenswrapper[4735]: I1008 13:32:08.171137 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mpx2t\" (UniqueName: 
\"kubernetes.io/projected/96042caa-f4e2-4a05-af44-8883a06394de-kube-api-access-mpx2t\") on node \"crc\" DevicePath \"\"" Oct 08 13:32:08 crc kubenswrapper[4735]: I1008 13:32:08.171163 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96042caa-f4e2-4a05-af44-8883a06394de-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 13:32:08 crc kubenswrapper[4735]: I1008 13:32:08.171176 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96042caa-f4e2-4a05-af44-8883a06394de-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 13:32:08 crc kubenswrapper[4735]: I1008 13:32:08.174450 4735 scope.go:117] "RemoveContainer" containerID="705e78c208e70e10b473ad5e457818244cf518cc5d5c39b00b8049cadc94890c" Oct 08 13:32:08 crc kubenswrapper[4735]: I1008 13:32:08.188943 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-48k4v"] Oct 08 13:32:08 crc kubenswrapper[4735]: I1008 13:32:08.199256 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-48k4v"] Oct 08 13:32:08 crc kubenswrapper[4735]: I1008 13:32:08.212257 4735 scope.go:117] "RemoveContainer" containerID="20296ca5143a0bce79c8e46e0e920bbdecee7bbe4ac1065aaf38b8cd75ca7be5" Oct 08 13:32:08 crc kubenswrapper[4735]: I1008 13:32:08.232589 4735 scope.go:117] "RemoveContainer" containerID="79d58ad9d943eda3cee489443e428d1da451a99b820b7161f642b42d0772d78a" Oct 08 13:32:08 crc kubenswrapper[4735]: E1008 13:32:08.232889 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79d58ad9d943eda3cee489443e428d1da451a99b820b7161f642b42d0772d78a\": container with ID starting with 79d58ad9d943eda3cee489443e428d1da451a99b820b7161f642b42d0772d78a not found: ID does not exist" containerID="79d58ad9d943eda3cee489443e428d1da451a99b820b7161f642b42d0772d78a" 
Oct 08 13:32:08 crc kubenswrapper[4735]: I1008 13:32:08.232926 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79d58ad9d943eda3cee489443e428d1da451a99b820b7161f642b42d0772d78a"} err="failed to get container status \"79d58ad9d943eda3cee489443e428d1da451a99b820b7161f642b42d0772d78a\": rpc error: code = NotFound desc = could not find container \"79d58ad9d943eda3cee489443e428d1da451a99b820b7161f642b42d0772d78a\": container with ID starting with 79d58ad9d943eda3cee489443e428d1da451a99b820b7161f642b42d0772d78a not found: ID does not exist" Oct 08 13:32:08 crc kubenswrapper[4735]: I1008 13:32:08.232953 4735 scope.go:117] "RemoveContainer" containerID="705e78c208e70e10b473ad5e457818244cf518cc5d5c39b00b8049cadc94890c" Oct 08 13:32:08 crc kubenswrapper[4735]: E1008 13:32:08.233188 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"705e78c208e70e10b473ad5e457818244cf518cc5d5c39b00b8049cadc94890c\": container with ID starting with 705e78c208e70e10b473ad5e457818244cf518cc5d5c39b00b8049cadc94890c not found: ID does not exist" containerID="705e78c208e70e10b473ad5e457818244cf518cc5d5c39b00b8049cadc94890c" Oct 08 13:32:08 crc kubenswrapper[4735]: I1008 13:32:08.233215 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"705e78c208e70e10b473ad5e457818244cf518cc5d5c39b00b8049cadc94890c"} err="failed to get container status \"705e78c208e70e10b473ad5e457818244cf518cc5d5c39b00b8049cadc94890c\": rpc error: code = NotFound desc = could not find container \"705e78c208e70e10b473ad5e457818244cf518cc5d5c39b00b8049cadc94890c\": container with ID starting with 705e78c208e70e10b473ad5e457818244cf518cc5d5c39b00b8049cadc94890c not found: ID does not exist" Oct 08 13:32:08 crc kubenswrapper[4735]: I1008 13:32:08.233234 4735 scope.go:117] "RemoveContainer" 
containerID="20296ca5143a0bce79c8e46e0e920bbdecee7bbe4ac1065aaf38b8cd75ca7be5" Oct 08 13:32:08 crc kubenswrapper[4735]: E1008 13:32:08.233552 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20296ca5143a0bce79c8e46e0e920bbdecee7bbe4ac1065aaf38b8cd75ca7be5\": container with ID starting with 20296ca5143a0bce79c8e46e0e920bbdecee7bbe4ac1065aaf38b8cd75ca7be5 not found: ID does not exist" containerID="20296ca5143a0bce79c8e46e0e920bbdecee7bbe4ac1065aaf38b8cd75ca7be5" Oct 08 13:32:08 crc kubenswrapper[4735]: I1008 13:32:08.233574 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20296ca5143a0bce79c8e46e0e920bbdecee7bbe4ac1065aaf38b8cd75ca7be5"} err="failed to get container status \"20296ca5143a0bce79c8e46e0e920bbdecee7bbe4ac1065aaf38b8cd75ca7be5\": rpc error: code = NotFound desc = could not find container \"20296ca5143a0bce79c8e46e0e920bbdecee7bbe4ac1065aaf38b8cd75ca7be5\": container with ID starting with 20296ca5143a0bce79c8e46e0e920bbdecee7bbe4ac1065aaf38b8cd75ca7be5 not found: ID does not exist" Oct 08 13:32:08 crc kubenswrapper[4735]: I1008 13:32:08.905600 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96042caa-f4e2-4a05-af44-8883a06394de" path="/var/lib/kubelet/pods/96042caa-f4e2-4a05-af44-8883a06394de/volumes" Oct 08 13:32:24 crc kubenswrapper[4735]: I1008 13:32:24.878366 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 13:32:24 crc kubenswrapper[4735]: I1008 13:32:24.879187 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 13:32:24 crc kubenswrapper[4735]: I1008 13:32:24.879254 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" Oct 08 13:32:24 crc kubenswrapper[4735]: I1008 13:32:24.880332 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0d22801e37fea2d9c0b97338603164ed45026eabc3d3be49c34db8bc170436d5"} pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 13:32:24 crc kubenswrapper[4735]: I1008 13:32:24.880460 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" containerID="cri-o://0d22801e37fea2d9c0b97338603164ed45026eabc3d3be49c34db8bc170436d5" gracePeriod=600 Oct 08 13:32:25 crc kubenswrapper[4735]: I1008 13:32:25.290301 4735 generic.go:334] "Generic (PLEG): container finished" podID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerID="0d22801e37fea2d9c0b97338603164ed45026eabc3d3be49c34db8bc170436d5" exitCode=0 Oct 08 13:32:25 crc kubenswrapper[4735]: I1008 13:32:25.290778 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" event={"ID":"2dfd0474-f64a-4d3b-9990-df182d079b10","Type":"ContainerDied","Data":"0d22801e37fea2d9c0b97338603164ed45026eabc3d3be49c34db8bc170436d5"} Oct 08 13:32:25 crc kubenswrapper[4735]: I1008 13:32:25.290831 4735 scope.go:117] "RemoveContainer" containerID="e875648415936f7b8301549fb9b938d29cca1f9c87a5286e73a78506e797f56a" Oct 08 13:32:26 crc 
kubenswrapper[4735]: I1008 13:32:26.299876 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" event={"ID":"2dfd0474-f64a-4d3b-9990-df182d079b10","Type":"ContainerStarted","Data":"f64dba97715cec77e8565589615359f392f2525762d14d1d5d732144ada2c2c2"} Oct 08 13:34:54 crc kubenswrapper[4735]: I1008 13:34:54.878518 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 13:34:54 crc kubenswrapper[4735]: I1008 13:34:54.879064 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 13:35:24 crc kubenswrapper[4735]: I1008 13:35:24.878042 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 13:35:24 crc kubenswrapper[4735]: I1008 13:35:24.878647 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 13:35:54 crc kubenswrapper[4735]: I1008 13:35:54.878996 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 13:35:54 crc kubenswrapper[4735]: I1008 13:35:54.879629 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 13:35:54 crc kubenswrapper[4735]: I1008 13:35:54.879680 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" Oct 08 13:35:54 crc kubenswrapper[4735]: I1008 13:35:54.880338 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f64dba97715cec77e8565589615359f392f2525762d14d1d5d732144ada2c2c2"} pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 13:35:54 crc kubenswrapper[4735]: I1008 13:35:54.880487 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" containerID="cri-o://f64dba97715cec77e8565589615359f392f2525762d14d1d5d732144ada2c2c2" gracePeriod=600 Oct 08 13:35:55 crc kubenswrapper[4735]: E1008 13:35:55.005621 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:35:55 crc kubenswrapper[4735]: I1008 13:35:55.846751 4735 generic.go:334] "Generic (PLEG): container finished" podID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerID="f64dba97715cec77e8565589615359f392f2525762d14d1d5d732144ada2c2c2" exitCode=0 Oct 08 13:35:55 crc kubenswrapper[4735]: I1008 13:35:55.846795 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" event={"ID":"2dfd0474-f64a-4d3b-9990-df182d079b10","Type":"ContainerDied","Data":"f64dba97715cec77e8565589615359f392f2525762d14d1d5d732144ada2c2c2"} Oct 08 13:35:55 crc kubenswrapper[4735]: I1008 13:35:55.846826 4735 scope.go:117] "RemoveContainer" containerID="0d22801e37fea2d9c0b97338603164ed45026eabc3d3be49c34db8bc170436d5" Oct 08 13:35:55 crc kubenswrapper[4735]: I1008 13:35:55.847469 4735 scope.go:117] "RemoveContainer" containerID="f64dba97715cec77e8565589615359f392f2525762d14d1d5d732144ada2c2c2" Oct 08 13:35:55 crc kubenswrapper[4735]: E1008 13:35:55.847860 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:36:06 crc kubenswrapper[4735]: I1008 13:36:06.894642 4735 scope.go:117] "RemoveContainer" containerID="f64dba97715cec77e8565589615359f392f2525762d14d1d5d732144ada2c2c2" Oct 08 13:36:06 crc kubenswrapper[4735]: E1008 13:36:06.896127 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:36:20 crc kubenswrapper[4735]: I1008 13:36:20.899815 4735 scope.go:117] "RemoveContainer" containerID="f64dba97715cec77e8565589615359f392f2525762d14d1d5d732144ada2c2c2" Oct 08 13:36:20 crc kubenswrapper[4735]: E1008 13:36:20.900714 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:36:34 crc kubenswrapper[4735]: I1008 13:36:34.895040 4735 scope.go:117] "RemoveContainer" containerID="f64dba97715cec77e8565589615359f392f2525762d14d1d5d732144ada2c2c2" Oct 08 13:36:34 crc kubenswrapper[4735]: E1008 13:36:34.896067 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:36:39 crc kubenswrapper[4735]: I1008 13:36:39.186452 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-m5tjz"] Oct 08 13:36:39 crc kubenswrapper[4735]: E1008 13:36:39.187037 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96042caa-f4e2-4a05-af44-8883a06394de" containerName="extract-content" Oct 08 13:36:39 crc 
kubenswrapper[4735]: I1008 13:36:39.187051 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="96042caa-f4e2-4a05-af44-8883a06394de" containerName="extract-content" Oct 08 13:36:39 crc kubenswrapper[4735]: E1008 13:36:39.187074 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96042caa-f4e2-4a05-af44-8883a06394de" containerName="extract-utilities" Oct 08 13:36:39 crc kubenswrapper[4735]: I1008 13:36:39.187081 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="96042caa-f4e2-4a05-af44-8883a06394de" containerName="extract-utilities" Oct 08 13:36:39 crc kubenswrapper[4735]: E1008 13:36:39.187097 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96042caa-f4e2-4a05-af44-8883a06394de" containerName="registry-server" Oct 08 13:36:39 crc kubenswrapper[4735]: I1008 13:36:39.187103 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="96042caa-f4e2-4a05-af44-8883a06394de" containerName="registry-server" Oct 08 13:36:39 crc kubenswrapper[4735]: I1008 13:36:39.187242 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="96042caa-f4e2-4a05-af44-8883a06394de" containerName="registry-server" Oct 08 13:36:39 crc kubenswrapper[4735]: I1008 13:36:39.188275 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m5tjz" Oct 08 13:36:39 crc kubenswrapper[4735]: I1008 13:36:39.208537 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m5tjz"] Oct 08 13:36:39 crc kubenswrapper[4735]: I1008 13:36:39.371087 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7ac83b6-6129-4bf5-ad9f-c8f879edc7ec-catalog-content\") pod \"redhat-marketplace-m5tjz\" (UID: \"e7ac83b6-6129-4bf5-ad9f-c8f879edc7ec\") " pod="openshift-marketplace/redhat-marketplace-m5tjz" Oct 08 13:36:39 crc kubenswrapper[4735]: I1008 13:36:39.371155 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qq27\" (UniqueName: \"kubernetes.io/projected/e7ac83b6-6129-4bf5-ad9f-c8f879edc7ec-kube-api-access-9qq27\") pod \"redhat-marketplace-m5tjz\" (UID: \"e7ac83b6-6129-4bf5-ad9f-c8f879edc7ec\") " pod="openshift-marketplace/redhat-marketplace-m5tjz" Oct 08 13:36:39 crc kubenswrapper[4735]: I1008 13:36:39.371174 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7ac83b6-6129-4bf5-ad9f-c8f879edc7ec-utilities\") pod \"redhat-marketplace-m5tjz\" (UID: \"e7ac83b6-6129-4bf5-ad9f-c8f879edc7ec\") " pod="openshift-marketplace/redhat-marketplace-m5tjz" Oct 08 13:36:39 crc kubenswrapper[4735]: I1008 13:36:39.471918 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7ac83b6-6129-4bf5-ad9f-c8f879edc7ec-catalog-content\") pod \"redhat-marketplace-m5tjz\" (UID: \"e7ac83b6-6129-4bf5-ad9f-c8f879edc7ec\") " pod="openshift-marketplace/redhat-marketplace-m5tjz" Oct 08 13:36:39 crc kubenswrapper[4735]: I1008 13:36:39.471989 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-9qq27\" (UniqueName: \"kubernetes.io/projected/e7ac83b6-6129-4bf5-ad9f-c8f879edc7ec-kube-api-access-9qq27\") pod \"redhat-marketplace-m5tjz\" (UID: \"e7ac83b6-6129-4bf5-ad9f-c8f879edc7ec\") " pod="openshift-marketplace/redhat-marketplace-m5tjz" Oct 08 13:36:39 crc kubenswrapper[4735]: I1008 13:36:39.472014 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7ac83b6-6129-4bf5-ad9f-c8f879edc7ec-utilities\") pod \"redhat-marketplace-m5tjz\" (UID: \"e7ac83b6-6129-4bf5-ad9f-c8f879edc7ec\") " pod="openshift-marketplace/redhat-marketplace-m5tjz" Oct 08 13:36:39 crc kubenswrapper[4735]: I1008 13:36:39.472669 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7ac83b6-6129-4bf5-ad9f-c8f879edc7ec-utilities\") pod \"redhat-marketplace-m5tjz\" (UID: \"e7ac83b6-6129-4bf5-ad9f-c8f879edc7ec\") " pod="openshift-marketplace/redhat-marketplace-m5tjz" Oct 08 13:36:39 crc kubenswrapper[4735]: I1008 13:36:39.472691 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7ac83b6-6129-4bf5-ad9f-c8f879edc7ec-catalog-content\") pod \"redhat-marketplace-m5tjz\" (UID: \"e7ac83b6-6129-4bf5-ad9f-c8f879edc7ec\") " pod="openshift-marketplace/redhat-marketplace-m5tjz" Oct 08 13:36:39 crc kubenswrapper[4735]: I1008 13:36:39.495133 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qq27\" (UniqueName: \"kubernetes.io/projected/e7ac83b6-6129-4bf5-ad9f-c8f879edc7ec-kube-api-access-9qq27\") pod \"redhat-marketplace-m5tjz\" (UID: \"e7ac83b6-6129-4bf5-ad9f-c8f879edc7ec\") " pod="openshift-marketplace/redhat-marketplace-m5tjz" Oct 08 13:36:39 crc kubenswrapper[4735]: I1008 13:36:39.513777 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m5tjz" Oct 08 13:36:39 crc kubenswrapper[4735]: I1008 13:36:39.940041 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m5tjz"] Oct 08 13:36:40 crc kubenswrapper[4735]: I1008 13:36:40.190303 4735 generic.go:334] "Generic (PLEG): container finished" podID="e7ac83b6-6129-4bf5-ad9f-c8f879edc7ec" containerID="e47d49f465e5ac99d0fb4d8ef6933f2aa6d784fe057a8a2f0ae9ba24221458b4" exitCode=0 Oct 08 13:36:40 crc kubenswrapper[4735]: I1008 13:36:40.190347 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m5tjz" event={"ID":"e7ac83b6-6129-4bf5-ad9f-c8f879edc7ec","Type":"ContainerDied","Data":"e47d49f465e5ac99d0fb4d8ef6933f2aa6d784fe057a8a2f0ae9ba24221458b4"} Oct 08 13:36:40 crc kubenswrapper[4735]: I1008 13:36:40.190373 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m5tjz" event={"ID":"e7ac83b6-6129-4bf5-ad9f-c8f879edc7ec","Type":"ContainerStarted","Data":"5208ce898b4c4263fa2159b76cd3eb262258d7992e00ea297c31b380cee06948"} Oct 08 13:36:41 crc kubenswrapper[4735]: I1008 13:36:41.196290 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m5tjz" event={"ID":"e7ac83b6-6129-4bf5-ad9f-c8f879edc7ec","Type":"ContainerStarted","Data":"9b5df24ee64572c71e4b35755c7efc143cc53a03d78933add99f7e4a2e75ff8a"} Oct 08 13:36:42 crc kubenswrapper[4735]: I1008 13:36:42.205219 4735 generic.go:334] "Generic (PLEG): container finished" podID="e7ac83b6-6129-4bf5-ad9f-c8f879edc7ec" containerID="9b5df24ee64572c71e4b35755c7efc143cc53a03d78933add99f7e4a2e75ff8a" exitCode=0 Oct 08 13:36:42 crc kubenswrapper[4735]: I1008 13:36:42.205470 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m5tjz" 
event={"ID":"e7ac83b6-6129-4bf5-ad9f-c8f879edc7ec","Type":"ContainerDied","Data":"9b5df24ee64572c71e4b35755c7efc143cc53a03d78933add99f7e4a2e75ff8a"} Oct 08 13:36:43 crc kubenswrapper[4735]: I1008 13:36:43.213401 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m5tjz" event={"ID":"e7ac83b6-6129-4bf5-ad9f-c8f879edc7ec","Type":"ContainerStarted","Data":"192fd100efa4061f0a2fdaebcd662490d4d645a47cc7468b6ced50573d61e416"} Oct 08 13:36:43 crc kubenswrapper[4735]: I1008 13:36:43.232405 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-m5tjz" podStartSLOduration=1.7783333620000001 podStartE2EDuration="4.232385747s" podCreationTimestamp="2025-10-08 13:36:39 +0000 UTC" firstStartedPulling="2025-10-08 13:36:40.19223515 +0000 UTC m=+2450.000539704" lastFinishedPulling="2025-10-08 13:36:42.646287535 +0000 UTC m=+2452.454592089" observedRunningTime="2025-10-08 13:36:43.227696486 +0000 UTC m=+2453.036001050" watchObservedRunningTime="2025-10-08 13:36:43.232385747 +0000 UTC m=+2453.040690291" Oct 08 13:36:47 crc kubenswrapper[4735]: I1008 13:36:47.895841 4735 scope.go:117] "RemoveContainer" containerID="f64dba97715cec77e8565589615359f392f2525762d14d1d5d732144ada2c2c2" Oct 08 13:36:47 crc kubenswrapper[4735]: E1008 13:36:47.897303 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:36:49 crc kubenswrapper[4735]: I1008 13:36:49.513877 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-m5tjz" Oct 08 13:36:49 crc 
kubenswrapper[4735]: I1008 13:36:49.514164 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-m5tjz" Oct 08 13:36:49 crc kubenswrapper[4735]: I1008 13:36:49.560518 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-m5tjz" Oct 08 13:36:50 crc kubenswrapper[4735]: I1008 13:36:50.299061 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-m5tjz" Oct 08 13:36:50 crc kubenswrapper[4735]: I1008 13:36:50.359661 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m5tjz"] Oct 08 13:36:52 crc kubenswrapper[4735]: I1008 13:36:52.265640 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-m5tjz" podUID="e7ac83b6-6129-4bf5-ad9f-c8f879edc7ec" containerName="registry-server" containerID="cri-o://192fd100efa4061f0a2fdaebcd662490d4d645a47cc7468b6ced50573d61e416" gracePeriod=2 Oct 08 13:36:52 crc kubenswrapper[4735]: E1008 13:36:52.337377 4735 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7ac83b6_6129_4bf5_ad9f_c8f879edc7ec.slice/crio-conmon-192fd100efa4061f0a2fdaebcd662490d4d645a47cc7468b6ced50573d61e416.scope\": RecentStats: unable to find data in memory cache]" Oct 08 13:36:52 crc kubenswrapper[4735]: I1008 13:36:52.657852 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m5tjz" Oct 08 13:36:52 crc kubenswrapper[4735]: I1008 13:36:52.776605 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7ac83b6-6129-4bf5-ad9f-c8f879edc7ec-utilities\") pod \"e7ac83b6-6129-4bf5-ad9f-c8f879edc7ec\" (UID: \"e7ac83b6-6129-4bf5-ad9f-c8f879edc7ec\") " Oct 08 13:36:52 crc kubenswrapper[4735]: I1008 13:36:52.776729 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7ac83b6-6129-4bf5-ad9f-c8f879edc7ec-catalog-content\") pod \"e7ac83b6-6129-4bf5-ad9f-c8f879edc7ec\" (UID: \"e7ac83b6-6129-4bf5-ad9f-c8f879edc7ec\") " Oct 08 13:36:52 crc kubenswrapper[4735]: I1008 13:36:52.776829 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qq27\" (UniqueName: \"kubernetes.io/projected/e7ac83b6-6129-4bf5-ad9f-c8f879edc7ec-kube-api-access-9qq27\") pod \"e7ac83b6-6129-4bf5-ad9f-c8f879edc7ec\" (UID: \"e7ac83b6-6129-4bf5-ad9f-c8f879edc7ec\") " Oct 08 13:36:52 crc kubenswrapper[4735]: I1008 13:36:52.778436 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7ac83b6-6129-4bf5-ad9f-c8f879edc7ec-utilities" (OuterVolumeSpecName: "utilities") pod "e7ac83b6-6129-4bf5-ad9f-c8f879edc7ec" (UID: "e7ac83b6-6129-4bf5-ad9f-c8f879edc7ec"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:36:52 crc kubenswrapper[4735]: I1008 13:36:52.784686 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7ac83b6-6129-4bf5-ad9f-c8f879edc7ec-kube-api-access-9qq27" (OuterVolumeSpecName: "kube-api-access-9qq27") pod "e7ac83b6-6129-4bf5-ad9f-c8f879edc7ec" (UID: "e7ac83b6-6129-4bf5-ad9f-c8f879edc7ec"). InnerVolumeSpecName "kube-api-access-9qq27". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:36:52 crc kubenswrapper[4735]: I1008 13:36:52.792502 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7ac83b6-6129-4bf5-ad9f-c8f879edc7ec-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e7ac83b6-6129-4bf5-ad9f-c8f879edc7ec" (UID: "e7ac83b6-6129-4bf5-ad9f-c8f879edc7ec"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:36:52 crc kubenswrapper[4735]: I1008 13:36:52.878855 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qq27\" (UniqueName: \"kubernetes.io/projected/e7ac83b6-6129-4bf5-ad9f-c8f879edc7ec-kube-api-access-9qq27\") on node \"crc\" DevicePath \"\"" Oct 08 13:36:52 crc kubenswrapper[4735]: I1008 13:36:52.878929 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7ac83b6-6129-4bf5-ad9f-c8f879edc7ec-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 13:36:52 crc kubenswrapper[4735]: I1008 13:36:52.878940 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7ac83b6-6129-4bf5-ad9f-c8f879edc7ec-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 13:36:53 crc kubenswrapper[4735]: I1008 13:36:53.278316 4735 generic.go:334] "Generic (PLEG): container finished" podID="e7ac83b6-6129-4bf5-ad9f-c8f879edc7ec" containerID="192fd100efa4061f0a2fdaebcd662490d4d645a47cc7468b6ced50573d61e416" exitCode=0 Oct 08 13:36:53 crc kubenswrapper[4735]: I1008 13:36:53.278401 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m5tjz" event={"ID":"e7ac83b6-6129-4bf5-ad9f-c8f879edc7ec","Type":"ContainerDied","Data":"192fd100efa4061f0a2fdaebcd662490d4d645a47cc7468b6ced50573d61e416"} Oct 08 13:36:53 crc kubenswrapper[4735]: I1008 13:36:53.278452 4735 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-m5tjz" event={"ID":"e7ac83b6-6129-4bf5-ad9f-c8f879edc7ec","Type":"ContainerDied","Data":"5208ce898b4c4263fa2159b76cd3eb262258d7992e00ea297c31b380cee06948"} Oct 08 13:36:53 crc kubenswrapper[4735]: I1008 13:36:53.278561 4735 scope.go:117] "RemoveContainer" containerID="192fd100efa4061f0a2fdaebcd662490d4d645a47cc7468b6ced50573d61e416" Oct 08 13:36:53 crc kubenswrapper[4735]: I1008 13:36:53.278705 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m5tjz" Oct 08 13:36:53 crc kubenswrapper[4735]: I1008 13:36:53.302595 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m5tjz"] Oct 08 13:36:53 crc kubenswrapper[4735]: I1008 13:36:53.305023 4735 scope.go:117] "RemoveContainer" containerID="9b5df24ee64572c71e4b35755c7efc143cc53a03d78933add99f7e4a2e75ff8a" Oct 08 13:36:53 crc kubenswrapper[4735]: I1008 13:36:53.308229 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-m5tjz"] Oct 08 13:36:53 crc kubenswrapper[4735]: I1008 13:36:53.324730 4735 scope.go:117] "RemoveContainer" containerID="e47d49f465e5ac99d0fb4d8ef6933f2aa6d784fe057a8a2f0ae9ba24221458b4" Oct 08 13:36:53 crc kubenswrapper[4735]: I1008 13:36:53.346249 4735 scope.go:117] "RemoveContainer" containerID="192fd100efa4061f0a2fdaebcd662490d4d645a47cc7468b6ced50573d61e416" Oct 08 13:36:53 crc kubenswrapper[4735]: E1008 13:36:53.346817 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"192fd100efa4061f0a2fdaebcd662490d4d645a47cc7468b6ced50573d61e416\": container with ID starting with 192fd100efa4061f0a2fdaebcd662490d4d645a47cc7468b6ced50573d61e416 not found: ID does not exist" containerID="192fd100efa4061f0a2fdaebcd662490d4d645a47cc7468b6ced50573d61e416" Oct 08 13:36:53 crc kubenswrapper[4735]: I1008 13:36:53.346866 4735 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"192fd100efa4061f0a2fdaebcd662490d4d645a47cc7468b6ced50573d61e416"} err="failed to get container status \"192fd100efa4061f0a2fdaebcd662490d4d645a47cc7468b6ced50573d61e416\": rpc error: code = NotFound desc = could not find container \"192fd100efa4061f0a2fdaebcd662490d4d645a47cc7468b6ced50573d61e416\": container with ID starting with 192fd100efa4061f0a2fdaebcd662490d4d645a47cc7468b6ced50573d61e416 not found: ID does not exist" Oct 08 13:36:53 crc kubenswrapper[4735]: I1008 13:36:53.346899 4735 scope.go:117] "RemoveContainer" containerID="9b5df24ee64572c71e4b35755c7efc143cc53a03d78933add99f7e4a2e75ff8a" Oct 08 13:36:53 crc kubenswrapper[4735]: E1008 13:36:53.347192 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b5df24ee64572c71e4b35755c7efc143cc53a03d78933add99f7e4a2e75ff8a\": container with ID starting with 9b5df24ee64572c71e4b35755c7efc143cc53a03d78933add99f7e4a2e75ff8a not found: ID does not exist" containerID="9b5df24ee64572c71e4b35755c7efc143cc53a03d78933add99f7e4a2e75ff8a" Oct 08 13:36:53 crc kubenswrapper[4735]: I1008 13:36:53.347231 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b5df24ee64572c71e4b35755c7efc143cc53a03d78933add99f7e4a2e75ff8a"} err="failed to get container status \"9b5df24ee64572c71e4b35755c7efc143cc53a03d78933add99f7e4a2e75ff8a\": rpc error: code = NotFound desc = could not find container \"9b5df24ee64572c71e4b35755c7efc143cc53a03d78933add99f7e4a2e75ff8a\": container with ID starting with 9b5df24ee64572c71e4b35755c7efc143cc53a03d78933add99f7e4a2e75ff8a not found: ID does not exist" Oct 08 13:36:53 crc kubenswrapper[4735]: I1008 13:36:53.347254 4735 scope.go:117] "RemoveContainer" containerID="e47d49f465e5ac99d0fb4d8ef6933f2aa6d784fe057a8a2f0ae9ba24221458b4" Oct 08 13:36:53 crc kubenswrapper[4735]: E1008 
13:36:53.347612 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e47d49f465e5ac99d0fb4d8ef6933f2aa6d784fe057a8a2f0ae9ba24221458b4\": container with ID starting with e47d49f465e5ac99d0fb4d8ef6933f2aa6d784fe057a8a2f0ae9ba24221458b4 not found: ID does not exist" containerID="e47d49f465e5ac99d0fb4d8ef6933f2aa6d784fe057a8a2f0ae9ba24221458b4" Oct 08 13:36:53 crc kubenswrapper[4735]: I1008 13:36:53.347650 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e47d49f465e5ac99d0fb4d8ef6933f2aa6d784fe057a8a2f0ae9ba24221458b4"} err="failed to get container status \"e47d49f465e5ac99d0fb4d8ef6933f2aa6d784fe057a8a2f0ae9ba24221458b4\": rpc error: code = NotFound desc = could not find container \"e47d49f465e5ac99d0fb4d8ef6933f2aa6d784fe057a8a2f0ae9ba24221458b4\": container with ID starting with e47d49f465e5ac99d0fb4d8ef6933f2aa6d784fe057a8a2f0ae9ba24221458b4 not found: ID does not exist" Oct 08 13:36:54 crc kubenswrapper[4735]: I1008 13:36:54.905382 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7ac83b6-6129-4bf5-ad9f-c8f879edc7ec" path="/var/lib/kubelet/pods/e7ac83b6-6129-4bf5-ad9f-c8f879edc7ec/volumes" Oct 08 13:36:58 crc kubenswrapper[4735]: I1008 13:36:58.895738 4735 scope.go:117] "RemoveContainer" containerID="f64dba97715cec77e8565589615359f392f2525762d14d1d5d732144ada2c2c2" Oct 08 13:36:58 crc kubenswrapper[4735]: E1008 13:36:58.897526 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:37:09 crc kubenswrapper[4735]: I1008 13:37:09.894892 
4735 scope.go:117] "RemoveContainer" containerID="f64dba97715cec77e8565589615359f392f2525762d14d1d5d732144ada2c2c2" Oct 08 13:37:09 crc kubenswrapper[4735]: E1008 13:37:09.895678 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:37:22 crc kubenswrapper[4735]: I1008 13:37:22.895072 4735 scope.go:117] "RemoveContainer" containerID="f64dba97715cec77e8565589615359f392f2525762d14d1d5d732144ada2c2c2" Oct 08 13:37:22 crc kubenswrapper[4735]: E1008 13:37:22.895911 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:37:36 crc kubenswrapper[4735]: I1008 13:37:36.894789 4735 scope.go:117] "RemoveContainer" containerID="f64dba97715cec77e8565589615359f392f2525762d14d1d5d732144ada2c2c2" Oct 08 13:37:36 crc kubenswrapper[4735]: E1008 13:37:36.895694 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:37:50 crc kubenswrapper[4735]: I1008 
13:37:50.898461 4735 scope.go:117] "RemoveContainer" containerID="f64dba97715cec77e8565589615359f392f2525762d14d1d5d732144ada2c2c2" Oct 08 13:37:50 crc kubenswrapper[4735]: E1008 13:37:50.899397 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:38:03 crc kubenswrapper[4735]: I1008 13:38:03.900367 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mvxv2"] Oct 08 13:38:03 crc kubenswrapper[4735]: E1008 13:38:03.901430 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7ac83b6-6129-4bf5-ad9f-c8f879edc7ec" containerName="extract-utilities" Oct 08 13:38:03 crc kubenswrapper[4735]: I1008 13:38:03.901447 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7ac83b6-6129-4bf5-ad9f-c8f879edc7ec" containerName="extract-utilities" Oct 08 13:38:03 crc kubenswrapper[4735]: E1008 13:38:03.901505 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7ac83b6-6129-4bf5-ad9f-c8f879edc7ec" containerName="extract-content" Oct 08 13:38:03 crc kubenswrapper[4735]: I1008 13:38:03.901518 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7ac83b6-6129-4bf5-ad9f-c8f879edc7ec" containerName="extract-content" Oct 08 13:38:03 crc kubenswrapper[4735]: E1008 13:38:03.901536 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7ac83b6-6129-4bf5-ad9f-c8f879edc7ec" containerName="registry-server" Oct 08 13:38:03 crc kubenswrapper[4735]: I1008 13:38:03.901545 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7ac83b6-6129-4bf5-ad9f-c8f879edc7ec" containerName="registry-server" Oct 08 
13:38:03 crc kubenswrapper[4735]: I1008 13:38:03.901728 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7ac83b6-6129-4bf5-ad9f-c8f879edc7ec" containerName="registry-server" Oct 08 13:38:03 crc kubenswrapper[4735]: I1008 13:38:03.903087 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mvxv2" Oct 08 13:38:03 crc kubenswrapper[4735]: I1008 13:38:03.909257 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mvxv2"] Oct 08 13:38:04 crc kubenswrapper[4735]: I1008 13:38:04.028661 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e93fa047-e66e-4aae-99c6-0dcdc13cc7e0-catalog-content\") pod \"community-operators-mvxv2\" (UID: \"e93fa047-e66e-4aae-99c6-0dcdc13cc7e0\") " pod="openshift-marketplace/community-operators-mvxv2" Oct 08 13:38:04 crc kubenswrapper[4735]: I1008 13:38:04.028714 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e93fa047-e66e-4aae-99c6-0dcdc13cc7e0-utilities\") pod \"community-operators-mvxv2\" (UID: \"e93fa047-e66e-4aae-99c6-0dcdc13cc7e0\") " pod="openshift-marketplace/community-operators-mvxv2" Oct 08 13:38:04 crc kubenswrapper[4735]: I1008 13:38:04.028884 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8xgc\" (UniqueName: \"kubernetes.io/projected/e93fa047-e66e-4aae-99c6-0dcdc13cc7e0-kube-api-access-q8xgc\") pod \"community-operators-mvxv2\" (UID: \"e93fa047-e66e-4aae-99c6-0dcdc13cc7e0\") " pod="openshift-marketplace/community-operators-mvxv2" Oct 08 13:38:04 crc kubenswrapper[4735]: I1008 13:38:04.130056 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8xgc\" (UniqueName: 
\"kubernetes.io/projected/e93fa047-e66e-4aae-99c6-0dcdc13cc7e0-kube-api-access-q8xgc\") pod \"community-operators-mvxv2\" (UID: \"e93fa047-e66e-4aae-99c6-0dcdc13cc7e0\") " pod="openshift-marketplace/community-operators-mvxv2" Oct 08 13:38:04 crc kubenswrapper[4735]: I1008 13:38:04.130113 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e93fa047-e66e-4aae-99c6-0dcdc13cc7e0-catalog-content\") pod \"community-operators-mvxv2\" (UID: \"e93fa047-e66e-4aae-99c6-0dcdc13cc7e0\") " pod="openshift-marketplace/community-operators-mvxv2" Oct 08 13:38:04 crc kubenswrapper[4735]: I1008 13:38:04.130145 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e93fa047-e66e-4aae-99c6-0dcdc13cc7e0-utilities\") pod \"community-operators-mvxv2\" (UID: \"e93fa047-e66e-4aae-99c6-0dcdc13cc7e0\") " pod="openshift-marketplace/community-operators-mvxv2" Oct 08 13:38:04 crc kubenswrapper[4735]: I1008 13:38:04.130651 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e93fa047-e66e-4aae-99c6-0dcdc13cc7e0-catalog-content\") pod \"community-operators-mvxv2\" (UID: \"e93fa047-e66e-4aae-99c6-0dcdc13cc7e0\") " pod="openshift-marketplace/community-operators-mvxv2" Oct 08 13:38:04 crc kubenswrapper[4735]: I1008 13:38:04.130780 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e93fa047-e66e-4aae-99c6-0dcdc13cc7e0-utilities\") pod \"community-operators-mvxv2\" (UID: \"e93fa047-e66e-4aae-99c6-0dcdc13cc7e0\") " pod="openshift-marketplace/community-operators-mvxv2" Oct 08 13:38:04 crc kubenswrapper[4735]: I1008 13:38:04.149519 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8xgc\" (UniqueName: 
\"kubernetes.io/projected/e93fa047-e66e-4aae-99c6-0dcdc13cc7e0-kube-api-access-q8xgc\") pod \"community-operators-mvxv2\" (UID: \"e93fa047-e66e-4aae-99c6-0dcdc13cc7e0\") " pod="openshift-marketplace/community-operators-mvxv2" Oct 08 13:38:04 crc kubenswrapper[4735]: I1008 13:38:04.234142 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mvxv2" Oct 08 13:38:04 crc kubenswrapper[4735]: I1008 13:38:04.524813 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mvxv2"] Oct 08 13:38:04 crc kubenswrapper[4735]: I1008 13:38:04.813103 4735 generic.go:334] "Generic (PLEG): container finished" podID="e93fa047-e66e-4aae-99c6-0dcdc13cc7e0" containerID="70b7b48d138850c8b537a95d4e379690a2a28546f7597be538a4b0f2eab803d7" exitCode=0 Oct 08 13:38:04 crc kubenswrapper[4735]: I1008 13:38:04.813154 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mvxv2" event={"ID":"e93fa047-e66e-4aae-99c6-0dcdc13cc7e0","Type":"ContainerDied","Data":"70b7b48d138850c8b537a95d4e379690a2a28546f7597be538a4b0f2eab803d7"} Oct 08 13:38:04 crc kubenswrapper[4735]: I1008 13:38:04.813194 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mvxv2" event={"ID":"e93fa047-e66e-4aae-99c6-0dcdc13cc7e0","Type":"ContainerStarted","Data":"e0e6bf2047db7959f4a378cd4f597e07cd960022db163ddb96b435736b65b741"} Oct 08 13:38:04 crc kubenswrapper[4735]: I1008 13:38:04.814630 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 08 13:38:05 crc kubenswrapper[4735]: I1008 13:38:05.820738 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mvxv2" event={"ID":"e93fa047-e66e-4aae-99c6-0dcdc13cc7e0","Type":"ContainerStarted","Data":"021eaea02cf744caa59839d5866df2e33fe43516a56b71f7031e2d431733e96a"} Oct 08 13:38:05 
crc kubenswrapper[4735]: I1008 13:38:05.894760 4735 scope.go:117] "RemoveContainer" containerID="f64dba97715cec77e8565589615359f392f2525762d14d1d5d732144ada2c2c2" Oct 08 13:38:05 crc kubenswrapper[4735]: E1008 13:38:05.894965 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:38:06 crc kubenswrapper[4735]: I1008 13:38:06.831056 4735 generic.go:334] "Generic (PLEG): container finished" podID="e93fa047-e66e-4aae-99c6-0dcdc13cc7e0" containerID="021eaea02cf744caa59839d5866df2e33fe43516a56b71f7031e2d431733e96a" exitCode=0 Oct 08 13:38:06 crc kubenswrapper[4735]: I1008 13:38:06.831186 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mvxv2" event={"ID":"e93fa047-e66e-4aae-99c6-0dcdc13cc7e0","Type":"ContainerDied","Data":"021eaea02cf744caa59839d5866df2e33fe43516a56b71f7031e2d431733e96a"} Oct 08 13:38:07 crc kubenswrapper[4735]: I1008 13:38:07.842325 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mvxv2" event={"ID":"e93fa047-e66e-4aae-99c6-0dcdc13cc7e0","Type":"ContainerStarted","Data":"6d861db2b9e8595864f0cdfbc7073193a108bb09c6c0bcc45444bcd8f4130518"} Oct 08 13:38:07 crc kubenswrapper[4735]: I1008 13:38:07.864921 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mvxv2" podStartSLOduration=2.287742173 podStartE2EDuration="4.864903483s" podCreationTimestamp="2025-10-08 13:38:03 +0000 UTC" firstStartedPulling="2025-10-08 13:38:04.814411231 +0000 UTC m=+2534.622715775" lastFinishedPulling="2025-10-08 13:38:07.391572541 
+0000 UTC m=+2537.199877085" observedRunningTime="2025-10-08 13:38:07.859055073 +0000 UTC m=+2537.667359627" watchObservedRunningTime="2025-10-08 13:38:07.864903483 +0000 UTC m=+2537.673208027" Oct 08 13:38:14 crc kubenswrapper[4735]: I1008 13:38:14.234654 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mvxv2" Oct 08 13:38:14 crc kubenswrapper[4735]: I1008 13:38:14.235286 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mvxv2" Oct 08 13:38:14 crc kubenswrapper[4735]: I1008 13:38:14.292964 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mvxv2" Oct 08 13:38:14 crc kubenswrapper[4735]: I1008 13:38:14.935881 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mvxv2" Oct 08 13:38:14 crc kubenswrapper[4735]: I1008 13:38:14.992567 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mvxv2"] Oct 08 13:38:16 crc kubenswrapper[4735]: I1008 13:38:16.905153 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mvxv2" podUID="e93fa047-e66e-4aae-99c6-0dcdc13cc7e0" containerName="registry-server" containerID="cri-o://6d861db2b9e8595864f0cdfbc7073193a108bb09c6c0bcc45444bcd8f4130518" gracePeriod=2 Oct 08 13:38:17 crc kubenswrapper[4735]: I1008 13:38:17.302134 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mvxv2" Oct 08 13:38:17 crc kubenswrapper[4735]: I1008 13:38:17.417701 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8xgc\" (UniqueName: \"kubernetes.io/projected/e93fa047-e66e-4aae-99c6-0dcdc13cc7e0-kube-api-access-q8xgc\") pod \"e93fa047-e66e-4aae-99c6-0dcdc13cc7e0\" (UID: \"e93fa047-e66e-4aae-99c6-0dcdc13cc7e0\") " Oct 08 13:38:17 crc kubenswrapper[4735]: I1008 13:38:17.417853 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e93fa047-e66e-4aae-99c6-0dcdc13cc7e0-utilities\") pod \"e93fa047-e66e-4aae-99c6-0dcdc13cc7e0\" (UID: \"e93fa047-e66e-4aae-99c6-0dcdc13cc7e0\") " Oct 08 13:38:17 crc kubenswrapper[4735]: I1008 13:38:17.417906 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e93fa047-e66e-4aae-99c6-0dcdc13cc7e0-catalog-content\") pod \"e93fa047-e66e-4aae-99c6-0dcdc13cc7e0\" (UID: \"e93fa047-e66e-4aae-99c6-0dcdc13cc7e0\") " Oct 08 13:38:17 crc kubenswrapper[4735]: I1008 13:38:17.419096 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e93fa047-e66e-4aae-99c6-0dcdc13cc7e0-utilities" (OuterVolumeSpecName: "utilities") pod "e93fa047-e66e-4aae-99c6-0dcdc13cc7e0" (UID: "e93fa047-e66e-4aae-99c6-0dcdc13cc7e0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:38:17 crc kubenswrapper[4735]: I1008 13:38:17.423075 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e93fa047-e66e-4aae-99c6-0dcdc13cc7e0-kube-api-access-q8xgc" (OuterVolumeSpecName: "kube-api-access-q8xgc") pod "e93fa047-e66e-4aae-99c6-0dcdc13cc7e0" (UID: "e93fa047-e66e-4aae-99c6-0dcdc13cc7e0"). InnerVolumeSpecName "kube-api-access-q8xgc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:38:17 crc kubenswrapper[4735]: I1008 13:38:17.469341 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e93fa047-e66e-4aae-99c6-0dcdc13cc7e0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e93fa047-e66e-4aae-99c6-0dcdc13cc7e0" (UID: "e93fa047-e66e-4aae-99c6-0dcdc13cc7e0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:38:17 crc kubenswrapper[4735]: I1008 13:38:17.519603 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e93fa047-e66e-4aae-99c6-0dcdc13cc7e0-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 13:38:17 crc kubenswrapper[4735]: I1008 13:38:17.519644 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e93fa047-e66e-4aae-99c6-0dcdc13cc7e0-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 13:38:17 crc kubenswrapper[4735]: I1008 13:38:17.519659 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8xgc\" (UniqueName: \"kubernetes.io/projected/e93fa047-e66e-4aae-99c6-0dcdc13cc7e0-kube-api-access-q8xgc\") on node \"crc\" DevicePath \"\"" Oct 08 13:38:17 crc kubenswrapper[4735]: I1008 13:38:17.920405 4735 generic.go:334] "Generic (PLEG): container finished" podID="e93fa047-e66e-4aae-99c6-0dcdc13cc7e0" containerID="6d861db2b9e8595864f0cdfbc7073193a108bb09c6c0bcc45444bcd8f4130518" exitCode=0 Oct 08 13:38:17 crc kubenswrapper[4735]: I1008 13:38:17.920467 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mvxv2" event={"ID":"e93fa047-e66e-4aae-99c6-0dcdc13cc7e0","Type":"ContainerDied","Data":"6d861db2b9e8595864f0cdfbc7073193a108bb09c6c0bcc45444bcd8f4130518"} Oct 08 13:38:17 crc kubenswrapper[4735]: I1008 13:38:17.920585 4735 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-mvxv2" event={"ID":"e93fa047-e66e-4aae-99c6-0dcdc13cc7e0","Type":"ContainerDied","Data":"e0e6bf2047db7959f4a378cd4f597e07cd960022db163ddb96b435736b65b741"} Oct 08 13:38:17 crc kubenswrapper[4735]: I1008 13:38:17.920637 4735 scope.go:117] "RemoveContainer" containerID="6d861db2b9e8595864f0cdfbc7073193a108bb09c6c0bcc45444bcd8f4130518" Oct 08 13:38:17 crc kubenswrapper[4735]: I1008 13:38:17.920525 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mvxv2" Oct 08 13:38:17 crc kubenswrapper[4735]: I1008 13:38:17.962800 4735 scope.go:117] "RemoveContainer" containerID="021eaea02cf744caa59839d5866df2e33fe43516a56b71f7031e2d431733e96a" Oct 08 13:38:17 crc kubenswrapper[4735]: I1008 13:38:17.979926 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mvxv2"] Oct 08 13:38:17 crc kubenswrapper[4735]: I1008 13:38:17.987655 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mvxv2"] Oct 08 13:38:17 crc kubenswrapper[4735]: I1008 13:38:17.990463 4735 scope.go:117] "RemoveContainer" containerID="70b7b48d138850c8b537a95d4e379690a2a28546f7597be538a4b0f2eab803d7" Oct 08 13:38:18 crc kubenswrapper[4735]: I1008 13:38:18.023030 4735 scope.go:117] "RemoveContainer" containerID="6d861db2b9e8595864f0cdfbc7073193a108bb09c6c0bcc45444bcd8f4130518" Oct 08 13:38:18 crc kubenswrapper[4735]: E1008 13:38:18.023469 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d861db2b9e8595864f0cdfbc7073193a108bb09c6c0bcc45444bcd8f4130518\": container with ID starting with 6d861db2b9e8595864f0cdfbc7073193a108bb09c6c0bcc45444bcd8f4130518 not found: ID does not exist" containerID="6d861db2b9e8595864f0cdfbc7073193a108bb09c6c0bcc45444bcd8f4130518" Oct 08 13:38:18 crc kubenswrapper[4735]: I1008 
13:38:18.023521 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d861db2b9e8595864f0cdfbc7073193a108bb09c6c0bcc45444bcd8f4130518"} err="failed to get container status \"6d861db2b9e8595864f0cdfbc7073193a108bb09c6c0bcc45444bcd8f4130518\": rpc error: code = NotFound desc = could not find container \"6d861db2b9e8595864f0cdfbc7073193a108bb09c6c0bcc45444bcd8f4130518\": container with ID starting with 6d861db2b9e8595864f0cdfbc7073193a108bb09c6c0bcc45444bcd8f4130518 not found: ID does not exist" Oct 08 13:38:18 crc kubenswrapper[4735]: I1008 13:38:18.023545 4735 scope.go:117] "RemoveContainer" containerID="021eaea02cf744caa59839d5866df2e33fe43516a56b71f7031e2d431733e96a" Oct 08 13:38:18 crc kubenswrapper[4735]: E1008 13:38:18.023863 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"021eaea02cf744caa59839d5866df2e33fe43516a56b71f7031e2d431733e96a\": container with ID starting with 021eaea02cf744caa59839d5866df2e33fe43516a56b71f7031e2d431733e96a not found: ID does not exist" containerID="021eaea02cf744caa59839d5866df2e33fe43516a56b71f7031e2d431733e96a" Oct 08 13:38:18 crc kubenswrapper[4735]: I1008 13:38:18.023915 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"021eaea02cf744caa59839d5866df2e33fe43516a56b71f7031e2d431733e96a"} err="failed to get container status \"021eaea02cf744caa59839d5866df2e33fe43516a56b71f7031e2d431733e96a\": rpc error: code = NotFound desc = could not find container \"021eaea02cf744caa59839d5866df2e33fe43516a56b71f7031e2d431733e96a\": container with ID starting with 021eaea02cf744caa59839d5866df2e33fe43516a56b71f7031e2d431733e96a not found: ID does not exist" Oct 08 13:38:18 crc kubenswrapper[4735]: I1008 13:38:18.023944 4735 scope.go:117] "RemoveContainer" containerID="70b7b48d138850c8b537a95d4e379690a2a28546f7597be538a4b0f2eab803d7" Oct 08 13:38:18 crc 
kubenswrapper[4735]: E1008 13:38:18.024250 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70b7b48d138850c8b537a95d4e379690a2a28546f7597be538a4b0f2eab803d7\": container with ID starting with 70b7b48d138850c8b537a95d4e379690a2a28546f7597be538a4b0f2eab803d7 not found: ID does not exist" containerID="70b7b48d138850c8b537a95d4e379690a2a28546f7597be538a4b0f2eab803d7" Oct 08 13:38:18 crc kubenswrapper[4735]: I1008 13:38:18.024284 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70b7b48d138850c8b537a95d4e379690a2a28546f7597be538a4b0f2eab803d7"} err="failed to get container status \"70b7b48d138850c8b537a95d4e379690a2a28546f7597be538a4b0f2eab803d7\": rpc error: code = NotFound desc = could not find container \"70b7b48d138850c8b537a95d4e379690a2a28546f7597be538a4b0f2eab803d7\": container with ID starting with 70b7b48d138850c8b537a95d4e379690a2a28546f7597be538a4b0f2eab803d7 not found: ID does not exist" Oct 08 13:38:18 crc kubenswrapper[4735]: I1008 13:38:18.902799 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e93fa047-e66e-4aae-99c6-0dcdc13cc7e0" path="/var/lib/kubelet/pods/e93fa047-e66e-4aae-99c6-0dcdc13cc7e0/volumes" Oct 08 13:38:19 crc kubenswrapper[4735]: I1008 13:38:19.894977 4735 scope.go:117] "RemoveContainer" containerID="f64dba97715cec77e8565589615359f392f2525762d14d1d5d732144ada2c2c2" Oct 08 13:38:19 crc kubenswrapper[4735]: E1008 13:38:19.895454 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:38:31 crc 
kubenswrapper[4735]: I1008 13:38:31.894680 4735 scope.go:117] "RemoveContainer" containerID="f64dba97715cec77e8565589615359f392f2525762d14d1d5d732144ada2c2c2" Oct 08 13:38:31 crc kubenswrapper[4735]: E1008 13:38:31.896651 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:38:45 crc kubenswrapper[4735]: I1008 13:38:45.894915 4735 scope.go:117] "RemoveContainer" containerID="f64dba97715cec77e8565589615359f392f2525762d14d1d5d732144ada2c2c2" Oct 08 13:38:45 crc kubenswrapper[4735]: E1008 13:38:45.897034 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:39:00 crc kubenswrapper[4735]: I1008 13:39:00.906117 4735 scope.go:117] "RemoveContainer" containerID="f64dba97715cec77e8565589615359f392f2525762d14d1d5d732144ada2c2c2" Oct 08 13:39:00 crc kubenswrapper[4735]: E1008 13:39:00.906927 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 
08 13:39:12 crc kubenswrapper[4735]: I1008 13:39:12.894852 4735 scope.go:117] "RemoveContainer" containerID="f64dba97715cec77e8565589615359f392f2525762d14d1d5d732144ada2c2c2" Oct 08 13:39:12 crc kubenswrapper[4735]: E1008 13:39:12.895448 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:39:25 crc kubenswrapper[4735]: I1008 13:39:25.894416 4735 scope.go:117] "RemoveContainer" containerID="f64dba97715cec77e8565589615359f392f2525762d14d1d5d732144ada2c2c2" Oct 08 13:39:25 crc kubenswrapper[4735]: E1008 13:39:25.895204 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:39:36 crc kubenswrapper[4735]: I1008 13:39:36.895194 4735 scope.go:117] "RemoveContainer" containerID="f64dba97715cec77e8565589615359f392f2525762d14d1d5d732144ada2c2c2" Oct 08 13:39:36 crc kubenswrapper[4735]: E1008 13:39:36.896253 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" 
podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:39:47 crc kubenswrapper[4735]: I1008 13:39:47.894731 4735 scope.go:117] "RemoveContainer" containerID="f64dba97715cec77e8565589615359f392f2525762d14d1d5d732144ada2c2c2" Oct 08 13:39:47 crc kubenswrapper[4735]: E1008 13:39:47.895825 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:40:01 crc kubenswrapper[4735]: I1008 13:40:01.894439 4735 scope.go:117] "RemoveContainer" containerID="f64dba97715cec77e8565589615359f392f2525762d14d1d5d732144ada2c2c2" Oct 08 13:40:01 crc kubenswrapper[4735]: E1008 13:40:01.895230 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:40:13 crc kubenswrapper[4735]: I1008 13:40:13.894647 4735 scope.go:117] "RemoveContainer" containerID="f64dba97715cec77e8565589615359f392f2525762d14d1d5d732144ada2c2c2" Oct 08 13:40:13 crc kubenswrapper[4735]: E1008 13:40:13.895442 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:40:28 crc kubenswrapper[4735]: I1008 13:40:28.894966 4735 scope.go:117] "RemoveContainer" containerID="f64dba97715cec77e8565589615359f392f2525762d14d1d5d732144ada2c2c2" Oct 08 13:40:28 crc kubenswrapper[4735]: E1008 13:40:28.895717 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:40:34 crc kubenswrapper[4735]: I1008 13:40:34.325843 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pftd7"] Oct 08 13:40:34 crc kubenswrapper[4735]: E1008 13:40:34.326444 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e93fa047-e66e-4aae-99c6-0dcdc13cc7e0" containerName="extract-content" Oct 08 13:40:34 crc kubenswrapper[4735]: I1008 13:40:34.326457 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="e93fa047-e66e-4aae-99c6-0dcdc13cc7e0" containerName="extract-content" Oct 08 13:40:34 crc kubenswrapper[4735]: E1008 13:40:34.326466 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e93fa047-e66e-4aae-99c6-0dcdc13cc7e0" containerName="extract-utilities" Oct 08 13:40:34 crc kubenswrapper[4735]: I1008 13:40:34.326488 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="e93fa047-e66e-4aae-99c6-0dcdc13cc7e0" containerName="extract-utilities" Oct 08 13:40:34 crc kubenswrapper[4735]: E1008 13:40:34.326501 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e93fa047-e66e-4aae-99c6-0dcdc13cc7e0" containerName="registry-server" Oct 08 13:40:34 crc kubenswrapper[4735]: 
I1008 13:40:34.326507 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="e93fa047-e66e-4aae-99c6-0dcdc13cc7e0" containerName="registry-server" Oct 08 13:40:34 crc kubenswrapper[4735]: I1008 13:40:34.326676 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="e93fa047-e66e-4aae-99c6-0dcdc13cc7e0" containerName="registry-server" Oct 08 13:40:34 crc kubenswrapper[4735]: I1008 13:40:34.327715 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pftd7" Oct 08 13:40:34 crc kubenswrapper[4735]: I1008 13:40:34.334267 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pftd7"] Oct 08 13:40:34 crc kubenswrapper[4735]: I1008 13:40:34.422607 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/555ea53b-e14f-450c-adc5-5f427176a48e-catalog-content\") pod \"redhat-operators-pftd7\" (UID: \"555ea53b-e14f-450c-adc5-5f427176a48e\") " pod="openshift-marketplace/redhat-operators-pftd7" Oct 08 13:40:34 crc kubenswrapper[4735]: I1008 13:40:34.422662 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/555ea53b-e14f-450c-adc5-5f427176a48e-utilities\") pod \"redhat-operators-pftd7\" (UID: \"555ea53b-e14f-450c-adc5-5f427176a48e\") " pod="openshift-marketplace/redhat-operators-pftd7" Oct 08 13:40:34 crc kubenswrapper[4735]: I1008 13:40:34.422714 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x92z5\" (UniqueName: \"kubernetes.io/projected/555ea53b-e14f-450c-adc5-5f427176a48e-kube-api-access-x92z5\") pod \"redhat-operators-pftd7\" (UID: \"555ea53b-e14f-450c-adc5-5f427176a48e\") " pod="openshift-marketplace/redhat-operators-pftd7" Oct 08 13:40:34 crc kubenswrapper[4735]: I1008 
13:40:34.524302 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/555ea53b-e14f-450c-adc5-5f427176a48e-catalog-content\") pod \"redhat-operators-pftd7\" (UID: \"555ea53b-e14f-450c-adc5-5f427176a48e\") " pod="openshift-marketplace/redhat-operators-pftd7" Oct 08 13:40:34 crc kubenswrapper[4735]: I1008 13:40:34.524351 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/555ea53b-e14f-450c-adc5-5f427176a48e-catalog-content\") pod \"redhat-operators-pftd7\" (UID: \"555ea53b-e14f-450c-adc5-5f427176a48e\") " pod="openshift-marketplace/redhat-operators-pftd7" Oct 08 13:40:34 crc kubenswrapper[4735]: I1008 13:40:34.524398 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/555ea53b-e14f-450c-adc5-5f427176a48e-utilities\") pod \"redhat-operators-pftd7\" (UID: \"555ea53b-e14f-450c-adc5-5f427176a48e\") " pod="openshift-marketplace/redhat-operators-pftd7" Oct 08 13:40:34 crc kubenswrapper[4735]: I1008 13:40:34.524713 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/555ea53b-e14f-450c-adc5-5f427176a48e-utilities\") pod \"redhat-operators-pftd7\" (UID: \"555ea53b-e14f-450c-adc5-5f427176a48e\") " pod="openshift-marketplace/redhat-operators-pftd7" Oct 08 13:40:34 crc kubenswrapper[4735]: I1008 13:40:34.524760 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x92z5\" (UniqueName: \"kubernetes.io/projected/555ea53b-e14f-450c-adc5-5f427176a48e-kube-api-access-x92z5\") pod \"redhat-operators-pftd7\" (UID: \"555ea53b-e14f-450c-adc5-5f427176a48e\") " pod="openshift-marketplace/redhat-operators-pftd7" Oct 08 13:40:34 crc kubenswrapper[4735]: I1008 13:40:34.542594 4735 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-x92z5\" (UniqueName: \"kubernetes.io/projected/555ea53b-e14f-450c-adc5-5f427176a48e-kube-api-access-x92z5\") pod \"redhat-operators-pftd7\" (UID: \"555ea53b-e14f-450c-adc5-5f427176a48e\") " pod="openshift-marketplace/redhat-operators-pftd7" Oct 08 13:40:34 crc kubenswrapper[4735]: I1008 13:40:34.647907 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pftd7" Oct 08 13:40:35 crc kubenswrapper[4735]: I1008 13:40:35.121373 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pftd7"] Oct 08 13:40:35 crc kubenswrapper[4735]: I1008 13:40:35.981598 4735 generic.go:334] "Generic (PLEG): container finished" podID="555ea53b-e14f-450c-adc5-5f427176a48e" containerID="18942e0a619d675e57f42f79ec575a6001c25fb8a9bc29576cf7fcbd3aa59624" exitCode=0 Oct 08 13:40:35 crc kubenswrapper[4735]: I1008 13:40:35.981730 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pftd7" event={"ID":"555ea53b-e14f-450c-adc5-5f427176a48e","Type":"ContainerDied","Data":"18942e0a619d675e57f42f79ec575a6001c25fb8a9bc29576cf7fcbd3aa59624"} Oct 08 13:40:35 crc kubenswrapper[4735]: I1008 13:40:35.981892 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pftd7" event={"ID":"555ea53b-e14f-450c-adc5-5f427176a48e","Type":"ContainerStarted","Data":"6fbc63239ccf6c7cef6ae72ef3465b1f83909a31c0dd1d07dbeab6efa7a454d4"} Oct 08 13:40:37 crc kubenswrapper[4735]: I1008 13:40:37.997750 4735 generic.go:334] "Generic (PLEG): container finished" podID="555ea53b-e14f-450c-adc5-5f427176a48e" containerID="2e67a1761eb00357592e72dbb6bbdac6eb02508ed9e2fc6449f0a7a82a95c05c" exitCode=0 Oct 08 13:40:37 crc kubenswrapper[4735]: I1008 13:40:37.997823 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pftd7" 
event={"ID":"555ea53b-e14f-450c-adc5-5f427176a48e","Type":"ContainerDied","Data":"2e67a1761eb00357592e72dbb6bbdac6eb02508ed9e2fc6449f0a7a82a95c05c"} Oct 08 13:40:39 crc kubenswrapper[4735]: I1008 13:40:39.008642 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pftd7" event={"ID":"555ea53b-e14f-450c-adc5-5f427176a48e","Type":"ContainerStarted","Data":"a6dcfd64d033e6d23729209c2429de79aa697f734223fdd24443a6b135d691ff"} Oct 08 13:40:39 crc kubenswrapper[4735]: I1008 13:40:39.028645 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pftd7" podStartSLOduration=2.202789618 podStartE2EDuration="5.028626541s" podCreationTimestamp="2025-10-08 13:40:34 +0000 UTC" firstStartedPulling="2025-10-08 13:40:35.983863524 +0000 UTC m=+2685.792168108" lastFinishedPulling="2025-10-08 13:40:38.809700487 +0000 UTC m=+2688.618005031" observedRunningTime="2025-10-08 13:40:39.026331813 +0000 UTC m=+2688.834636367" watchObservedRunningTime="2025-10-08 13:40:39.028626541 +0000 UTC m=+2688.836931085" Oct 08 13:40:41 crc kubenswrapper[4735]: I1008 13:40:41.894538 4735 scope.go:117] "RemoveContainer" containerID="f64dba97715cec77e8565589615359f392f2525762d14d1d5d732144ada2c2c2" Oct 08 13:40:41 crc kubenswrapper[4735]: E1008 13:40:41.896709 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:40:44 crc kubenswrapper[4735]: I1008 13:40:44.648535 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pftd7" Oct 08 13:40:44 crc 
kubenswrapper[4735]: I1008 13:40:44.648902 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pftd7" Oct 08 13:40:44 crc kubenswrapper[4735]: I1008 13:40:44.694456 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pftd7" Oct 08 13:40:45 crc kubenswrapper[4735]: I1008 13:40:45.091424 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pftd7" Oct 08 13:40:45 crc kubenswrapper[4735]: I1008 13:40:45.141169 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pftd7"] Oct 08 13:40:47 crc kubenswrapper[4735]: I1008 13:40:47.070210 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pftd7" podUID="555ea53b-e14f-450c-adc5-5f427176a48e" containerName="registry-server" containerID="cri-o://a6dcfd64d033e6d23729209c2429de79aa697f734223fdd24443a6b135d691ff" gracePeriod=2 Oct 08 13:40:47 crc kubenswrapper[4735]: I1008 13:40:47.460045 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pftd7" Oct 08 13:40:47 crc kubenswrapper[4735]: I1008 13:40:47.512788 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/555ea53b-e14f-450c-adc5-5f427176a48e-utilities\") pod \"555ea53b-e14f-450c-adc5-5f427176a48e\" (UID: \"555ea53b-e14f-450c-adc5-5f427176a48e\") " Oct 08 13:40:47 crc kubenswrapper[4735]: I1008 13:40:47.512837 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/555ea53b-e14f-450c-adc5-5f427176a48e-catalog-content\") pod \"555ea53b-e14f-450c-adc5-5f427176a48e\" (UID: \"555ea53b-e14f-450c-adc5-5f427176a48e\") " Oct 08 13:40:47 crc kubenswrapper[4735]: I1008 13:40:47.512857 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x92z5\" (UniqueName: \"kubernetes.io/projected/555ea53b-e14f-450c-adc5-5f427176a48e-kube-api-access-x92z5\") pod \"555ea53b-e14f-450c-adc5-5f427176a48e\" (UID: \"555ea53b-e14f-450c-adc5-5f427176a48e\") " Oct 08 13:40:47 crc kubenswrapper[4735]: I1008 13:40:47.513651 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/555ea53b-e14f-450c-adc5-5f427176a48e-utilities" (OuterVolumeSpecName: "utilities") pod "555ea53b-e14f-450c-adc5-5f427176a48e" (UID: "555ea53b-e14f-450c-adc5-5f427176a48e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:40:47 crc kubenswrapper[4735]: I1008 13:40:47.522691 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/555ea53b-e14f-450c-adc5-5f427176a48e-kube-api-access-x92z5" (OuterVolumeSpecName: "kube-api-access-x92z5") pod "555ea53b-e14f-450c-adc5-5f427176a48e" (UID: "555ea53b-e14f-450c-adc5-5f427176a48e"). InnerVolumeSpecName "kube-api-access-x92z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:40:47 crc kubenswrapper[4735]: I1008 13:40:47.614441 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/555ea53b-e14f-450c-adc5-5f427176a48e-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 13:40:47 crc kubenswrapper[4735]: I1008 13:40:47.614526 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x92z5\" (UniqueName: \"kubernetes.io/projected/555ea53b-e14f-450c-adc5-5f427176a48e-kube-api-access-x92z5\") on node \"crc\" DevicePath \"\"" Oct 08 13:40:48 crc kubenswrapper[4735]: I1008 13:40:48.080062 4735 generic.go:334] "Generic (PLEG): container finished" podID="555ea53b-e14f-450c-adc5-5f427176a48e" containerID="a6dcfd64d033e6d23729209c2429de79aa697f734223fdd24443a6b135d691ff" exitCode=0 Oct 08 13:40:48 crc kubenswrapper[4735]: I1008 13:40:48.080133 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pftd7" event={"ID":"555ea53b-e14f-450c-adc5-5f427176a48e","Type":"ContainerDied","Data":"a6dcfd64d033e6d23729209c2429de79aa697f734223fdd24443a6b135d691ff"} Oct 08 13:40:48 crc kubenswrapper[4735]: I1008 13:40:48.080172 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pftd7" Oct 08 13:40:48 crc kubenswrapper[4735]: I1008 13:40:48.080188 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pftd7" event={"ID":"555ea53b-e14f-450c-adc5-5f427176a48e","Type":"ContainerDied","Data":"6fbc63239ccf6c7cef6ae72ef3465b1f83909a31c0dd1d07dbeab6efa7a454d4"} Oct 08 13:40:48 crc kubenswrapper[4735]: I1008 13:40:48.080212 4735 scope.go:117] "RemoveContainer" containerID="a6dcfd64d033e6d23729209c2429de79aa697f734223fdd24443a6b135d691ff" Oct 08 13:40:48 crc kubenswrapper[4735]: I1008 13:40:48.096341 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/555ea53b-e14f-450c-adc5-5f427176a48e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "555ea53b-e14f-450c-adc5-5f427176a48e" (UID: "555ea53b-e14f-450c-adc5-5f427176a48e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:40:48 crc kubenswrapper[4735]: I1008 13:40:48.101632 4735 scope.go:117] "RemoveContainer" containerID="2e67a1761eb00357592e72dbb6bbdac6eb02508ed9e2fc6449f0a7a82a95c05c" Oct 08 13:40:48 crc kubenswrapper[4735]: I1008 13:40:48.121334 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/555ea53b-e14f-450c-adc5-5f427176a48e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 13:40:48 crc kubenswrapper[4735]: I1008 13:40:48.125228 4735 scope.go:117] "RemoveContainer" containerID="18942e0a619d675e57f42f79ec575a6001c25fb8a9bc29576cf7fcbd3aa59624" Oct 08 13:40:48 crc kubenswrapper[4735]: I1008 13:40:48.150719 4735 scope.go:117] "RemoveContainer" containerID="a6dcfd64d033e6d23729209c2429de79aa697f734223fdd24443a6b135d691ff" Oct 08 13:40:48 crc kubenswrapper[4735]: E1008 13:40:48.154022 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"a6dcfd64d033e6d23729209c2429de79aa697f734223fdd24443a6b135d691ff\": container with ID starting with a6dcfd64d033e6d23729209c2429de79aa697f734223fdd24443a6b135d691ff not found: ID does not exist" containerID="a6dcfd64d033e6d23729209c2429de79aa697f734223fdd24443a6b135d691ff" Oct 08 13:40:48 crc kubenswrapper[4735]: I1008 13:40:48.154069 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6dcfd64d033e6d23729209c2429de79aa697f734223fdd24443a6b135d691ff"} err="failed to get container status \"a6dcfd64d033e6d23729209c2429de79aa697f734223fdd24443a6b135d691ff\": rpc error: code = NotFound desc = could not find container \"a6dcfd64d033e6d23729209c2429de79aa697f734223fdd24443a6b135d691ff\": container with ID starting with a6dcfd64d033e6d23729209c2429de79aa697f734223fdd24443a6b135d691ff not found: ID does not exist" Oct 08 13:40:48 crc kubenswrapper[4735]: I1008 13:40:48.154096 4735 scope.go:117] "RemoveContainer" containerID="2e67a1761eb00357592e72dbb6bbdac6eb02508ed9e2fc6449f0a7a82a95c05c" Oct 08 13:40:48 crc kubenswrapper[4735]: E1008 13:40:48.154508 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e67a1761eb00357592e72dbb6bbdac6eb02508ed9e2fc6449f0a7a82a95c05c\": container with ID starting with 2e67a1761eb00357592e72dbb6bbdac6eb02508ed9e2fc6449f0a7a82a95c05c not found: ID does not exist" containerID="2e67a1761eb00357592e72dbb6bbdac6eb02508ed9e2fc6449f0a7a82a95c05c" Oct 08 13:40:48 crc kubenswrapper[4735]: I1008 13:40:48.154567 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e67a1761eb00357592e72dbb6bbdac6eb02508ed9e2fc6449f0a7a82a95c05c"} err="failed to get container status \"2e67a1761eb00357592e72dbb6bbdac6eb02508ed9e2fc6449f0a7a82a95c05c\": rpc error: code = NotFound desc = could not find container \"2e67a1761eb00357592e72dbb6bbdac6eb02508ed9e2fc6449f0a7a82a95c05c\": 
container with ID starting with 2e67a1761eb00357592e72dbb6bbdac6eb02508ed9e2fc6449f0a7a82a95c05c not found: ID does not exist" Oct 08 13:40:48 crc kubenswrapper[4735]: I1008 13:40:48.154597 4735 scope.go:117] "RemoveContainer" containerID="18942e0a619d675e57f42f79ec575a6001c25fb8a9bc29576cf7fcbd3aa59624" Oct 08 13:40:48 crc kubenswrapper[4735]: E1008 13:40:48.154936 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18942e0a619d675e57f42f79ec575a6001c25fb8a9bc29576cf7fcbd3aa59624\": container with ID starting with 18942e0a619d675e57f42f79ec575a6001c25fb8a9bc29576cf7fcbd3aa59624 not found: ID does not exist" containerID="18942e0a619d675e57f42f79ec575a6001c25fb8a9bc29576cf7fcbd3aa59624" Oct 08 13:40:48 crc kubenswrapper[4735]: I1008 13:40:48.154970 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18942e0a619d675e57f42f79ec575a6001c25fb8a9bc29576cf7fcbd3aa59624"} err="failed to get container status \"18942e0a619d675e57f42f79ec575a6001c25fb8a9bc29576cf7fcbd3aa59624\": rpc error: code = NotFound desc = could not find container \"18942e0a619d675e57f42f79ec575a6001c25fb8a9bc29576cf7fcbd3aa59624\": container with ID starting with 18942e0a619d675e57f42f79ec575a6001c25fb8a9bc29576cf7fcbd3aa59624 not found: ID does not exist" Oct 08 13:40:48 crc kubenswrapper[4735]: I1008 13:40:48.414678 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pftd7"] Oct 08 13:40:48 crc kubenswrapper[4735]: I1008 13:40:48.420060 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pftd7"] Oct 08 13:40:48 crc kubenswrapper[4735]: I1008 13:40:48.905145 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="555ea53b-e14f-450c-adc5-5f427176a48e" path="/var/lib/kubelet/pods/555ea53b-e14f-450c-adc5-5f427176a48e/volumes" Oct 08 13:40:53 crc kubenswrapper[4735]: 
I1008 13:40:53.895193 4735 scope.go:117] "RemoveContainer" containerID="f64dba97715cec77e8565589615359f392f2525762d14d1d5d732144ada2c2c2" Oct 08 13:40:53 crc kubenswrapper[4735]: E1008 13:40:53.895910 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:41:05 crc kubenswrapper[4735]: I1008 13:41:05.894932 4735 scope.go:117] "RemoveContainer" containerID="f64dba97715cec77e8565589615359f392f2525762d14d1d5d732144ada2c2c2" Oct 08 13:41:06 crc kubenswrapper[4735]: I1008 13:41:06.204990 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" event={"ID":"2dfd0474-f64a-4d3b-9990-df182d079b10","Type":"ContainerStarted","Data":"e5dcfca1afb18b5f593ef936112711da3b9ca005ba0d5c5666cf063014f22582"} Oct 08 13:43:24 crc kubenswrapper[4735]: I1008 13:43:24.878359 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 13:43:24 crc kubenswrapper[4735]: I1008 13:43:24.878893 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 13:43:54 crc kubenswrapper[4735]: I1008 13:43:54.878591 4735 patch_prober.go:28] interesting 
pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 13:43:54 crc kubenswrapper[4735]: I1008 13:43:54.879127 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 13:44:24 crc kubenswrapper[4735]: I1008 13:44:24.878536 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 13:44:24 crc kubenswrapper[4735]: I1008 13:44:24.878963 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 13:44:24 crc kubenswrapper[4735]: I1008 13:44:24.879009 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" Oct 08 13:44:24 crc kubenswrapper[4735]: I1008 13:44:24.879580 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e5dcfca1afb18b5f593ef936112711da3b9ca005ba0d5c5666cf063014f22582"} pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" containerMessage="Container machine-config-daemon failed 
liveness probe, will be restarted" Oct 08 13:44:24 crc kubenswrapper[4735]: I1008 13:44:24.879628 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" containerID="cri-o://e5dcfca1afb18b5f593ef936112711da3b9ca005ba0d5c5666cf063014f22582" gracePeriod=600 Oct 08 13:44:25 crc kubenswrapper[4735]: I1008 13:44:25.681750 4735 generic.go:334] "Generic (PLEG): container finished" podID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerID="e5dcfca1afb18b5f593ef936112711da3b9ca005ba0d5c5666cf063014f22582" exitCode=0 Oct 08 13:44:25 crc kubenswrapper[4735]: I1008 13:44:25.681825 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" event={"ID":"2dfd0474-f64a-4d3b-9990-df182d079b10","Type":"ContainerDied","Data":"e5dcfca1afb18b5f593ef936112711da3b9ca005ba0d5c5666cf063014f22582"} Oct 08 13:44:25 crc kubenswrapper[4735]: I1008 13:44:25.682138 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" event={"ID":"2dfd0474-f64a-4d3b-9990-df182d079b10","Type":"ContainerStarted","Data":"4d1ca381527767d6fb57f050b8e50899b4a43d3d083a6f0f021d28e2ef77e261"} Oct 08 13:44:25 crc kubenswrapper[4735]: I1008 13:44:25.682164 4735 scope.go:117] "RemoveContainer" containerID="f64dba97715cec77e8565589615359f392f2525762d14d1d5d732144ada2c2c2" Oct 08 13:45:00 crc kubenswrapper[4735]: I1008 13:45:00.189703 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332185-8q22q"] Oct 08 13:45:00 crc kubenswrapper[4735]: E1008 13:45:00.190621 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="555ea53b-e14f-450c-adc5-5f427176a48e" containerName="registry-server" Oct 08 13:45:00 crc kubenswrapper[4735]: I1008 13:45:00.190638 4735 
state_mem.go:107] "Deleted CPUSet assignment" podUID="555ea53b-e14f-450c-adc5-5f427176a48e" containerName="registry-server" Oct 08 13:45:00 crc kubenswrapper[4735]: E1008 13:45:00.190659 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="555ea53b-e14f-450c-adc5-5f427176a48e" containerName="extract-utilities" Oct 08 13:45:00 crc kubenswrapper[4735]: I1008 13:45:00.190665 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="555ea53b-e14f-450c-adc5-5f427176a48e" containerName="extract-utilities" Oct 08 13:45:00 crc kubenswrapper[4735]: E1008 13:45:00.190682 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="555ea53b-e14f-450c-adc5-5f427176a48e" containerName="extract-content" Oct 08 13:45:00 crc kubenswrapper[4735]: I1008 13:45:00.190690 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="555ea53b-e14f-450c-adc5-5f427176a48e" containerName="extract-content" Oct 08 13:45:00 crc kubenswrapper[4735]: I1008 13:45:00.190834 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="555ea53b-e14f-450c-adc5-5f427176a48e" containerName="registry-server" Oct 08 13:45:00 crc kubenswrapper[4735]: I1008 13:45:00.191382 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332185-8q22q" Oct 08 13:45:00 crc kubenswrapper[4735]: I1008 13:45:00.194020 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 08 13:45:00 crc kubenswrapper[4735]: I1008 13:45:00.194105 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 08 13:45:00 crc kubenswrapper[4735]: I1008 13:45:00.200883 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332185-8q22q"] Oct 08 13:45:00 crc kubenswrapper[4735]: I1008 13:45:00.225562 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/29d927ff-0031-4725-a576-01a9f4cf5fa3-config-volume\") pod \"collect-profiles-29332185-8q22q\" (UID: \"29d927ff-0031-4725-a576-01a9f4cf5fa3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332185-8q22q" Oct 08 13:45:00 crc kubenswrapper[4735]: I1008 13:45:00.225657 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/29d927ff-0031-4725-a576-01a9f4cf5fa3-secret-volume\") pod \"collect-profiles-29332185-8q22q\" (UID: \"29d927ff-0031-4725-a576-01a9f4cf5fa3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332185-8q22q" Oct 08 13:45:00 crc kubenswrapper[4735]: I1008 13:45:00.225750 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgwgr\" (UniqueName: \"kubernetes.io/projected/29d927ff-0031-4725-a576-01a9f4cf5fa3-kube-api-access-kgwgr\") pod \"collect-profiles-29332185-8q22q\" (UID: \"29d927ff-0031-4725-a576-01a9f4cf5fa3\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29332185-8q22q" Oct 08 13:45:00 crc kubenswrapper[4735]: I1008 13:45:00.327006 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgwgr\" (UniqueName: \"kubernetes.io/projected/29d927ff-0031-4725-a576-01a9f4cf5fa3-kube-api-access-kgwgr\") pod \"collect-profiles-29332185-8q22q\" (UID: \"29d927ff-0031-4725-a576-01a9f4cf5fa3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332185-8q22q" Oct 08 13:45:00 crc kubenswrapper[4735]: I1008 13:45:00.327095 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/29d927ff-0031-4725-a576-01a9f4cf5fa3-config-volume\") pod \"collect-profiles-29332185-8q22q\" (UID: \"29d927ff-0031-4725-a576-01a9f4cf5fa3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332185-8q22q" Oct 08 13:45:00 crc kubenswrapper[4735]: I1008 13:45:00.327134 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/29d927ff-0031-4725-a576-01a9f4cf5fa3-secret-volume\") pod \"collect-profiles-29332185-8q22q\" (UID: \"29d927ff-0031-4725-a576-01a9f4cf5fa3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332185-8q22q" Oct 08 13:45:00 crc kubenswrapper[4735]: I1008 13:45:00.331377 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/29d927ff-0031-4725-a576-01a9f4cf5fa3-config-volume\") pod \"collect-profiles-29332185-8q22q\" (UID: \"29d927ff-0031-4725-a576-01a9f4cf5fa3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332185-8q22q" Oct 08 13:45:00 crc kubenswrapper[4735]: I1008 13:45:00.332775 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/29d927ff-0031-4725-a576-01a9f4cf5fa3-secret-volume\") pod \"collect-profiles-29332185-8q22q\" (UID: \"29d927ff-0031-4725-a576-01a9f4cf5fa3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332185-8q22q" Oct 08 13:45:00 crc kubenswrapper[4735]: I1008 13:45:00.342662 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgwgr\" (UniqueName: \"kubernetes.io/projected/29d927ff-0031-4725-a576-01a9f4cf5fa3-kube-api-access-kgwgr\") pod \"collect-profiles-29332185-8q22q\" (UID: \"29d927ff-0031-4725-a576-01a9f4cf5fa3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332185-8q22q" Oct 08 13:45:00 crc kubenswrapper[4735]: I1008 13:45:00.510028 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332185-8q22q" Oct 08 13:45:00 crc kubenswrapper[4735]: I1008 13:45:00.926092 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332185-8q22q"] Oct 08 13:45:00 crc kubenswrapper[4735]: I1008 13:45:00.955679 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332185-8q22q" event={"ID":"29d927ff-0031-4725-a576-01a9f4cf5fa3","Type":"ContainerStarted","Data":"75da70efc3d8b6290faa9eecd695731882aef28725063fa31083821ac3e9f635"} Oct 08 13:45:01 crc kubenswrapper[4735]: I1008 13:45:01.971641 4735 generic.go:334] "Generic (PLEG): container finished" podID="29d927ff-0031-4725-a576-01a9f4cf5fa3" containerID="d0eed91976ed4bbff74fdaf27b347df9f7c2448c4ddefb25b4adb99d96d9e574" exitCode=0 Oct 08 13:45:01 crc kubenswrapper[4735]: I1008 13:45:01.971704 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332185-8q22q" 
event={"ID":"29d927ff-0031-4725-a576-01a9f4cf5fa3","Type":"ContainerDied","Data":"d0eed91976ed4bbff74fdaf27b347df9f7c2448c4ddefb25b4adb99d96d9e574"} Oct 08 13:45:03 crc kubenswrapper[4735]: I1008 13:45:03.221680 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332185-8q22q" Oct 08 13:45:03 crc kubenswrapper[4735]: I1008 13:45:03.366821 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgwgr\" (UniqueName: \"kubernetes.io/projected/29d927ff-0031-4725-a576-01a9f4cf5fa3-kube-api-access-kgwgr\") pod \"29d927ff-0031-4725-a576-01a9f4cf5fa3\" (UID: \"29d927ff-0031-4725-a576-01a9f4cf5fa3\") " Oct 08 13:45:03 crc kubenswrapper[4735]: I1008 13:45:03.366990 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/29d927ff-0031-4725-a576-01a9f4cf5fa3-secret-volume\") pod \"29d927ff-0031-4725-a576-01a9f4cf5fa3\" (UID: \"29d927ff-0031-4725-a576-01a9f4cf5fa3\") " Oct 08 13:45:03 crc kubenswrapper[4735]: I1008 13:45:03.367010 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/29d927ff-0031-4725-a576-01a9f4cf5fa3-config-volume\") pod \"29d927ff-0031-4725-a576-01a9f4cf5fa3\" (UID: \"29d927ff-0031-4725-a576-01a9f4cf5fa3\") " Oct 08 13:45:03 crc kubenswrapper[4735]: I1008 13:45:03.368074 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29d927ff-0031-4725-a576-01a9f4cf5fa3-config-volume" (OuterVolumeSpecName: "config-volume") pod "29d927ff-0031-4725-a576-01a9f4cf5fa3" (UID: "29d927ff-0031-4725-a576-01a9f4cf5fa3"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 13:45:03 crc kubenswrapper[4735]: I1008 13:45:03.372626 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29d927ff-0031-4725-a576-01a9f4cf5fa3-kube-api-access-kgwgr" (OuterVolumeSpecName: "kube-api-access-kgwgr") pod "29d927ff-0031-4725-a576-01a9f4cf5fa3" (UID: "29d927ff-0031-4725-a576-01a9f4cf5fa3"). InnerVolumeSpecName "kube-api-access-kgwgr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:45:03 crc kubenswrapper[4735]: I1008 13:45:03.372814 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29d927ff-0031-4725-a576-01a9f4cf5fa3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "29d927ff-0031-4725-a576-01a9f4cf5fa3" (UID: "29d927ff-0031-4725-a576-01a9f4cf5fa3"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 13:45:03 crc kubenswrapper[4735]: I1008 13:45:03.468392 4735 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/29d927ff-0031-4725-a576-01a9f4cf5fa3-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 08 13:45:03 crc kubenswrapper[4735]: I1008 13:45:03.468439 4735 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/29d927ff-0031-4725-a576-01a9f4cf5fa3-config-volume\") on node \"crc\" DevicePath \"\"" Oct 08 13:45:03 crc kubenswrapper[4735]: I1008 13:45:03.468454 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kgwgr\" (UniqueName: \"kubernetes.io/projected/29d927ff-0031-4725-a576-01a9f4cf5fa3-kube-api-access-kgwgr\") on node \"crc\" DevicePath \"\"" Oct 08 13:45:03 crc kubenswrapper[4735]: I1008 13:45:03.988945 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332185-8q22q" 
event={"ID":"29d927ff-0031-4725-a576-01a9f4cf5fa3","Type":"ContainerDied","Data":"75da70efc3d8b6290faa9eecd695731882aef28725063fa31083821ac3e9f635"} Oct 08 13:45:03 crc kubenswrapper[4735]: I1008 13:45:03.988992 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75da70efc3d8b6290faa9eecd695731882aef28725063fa31083821ac3e9f635" Oct 08 13:45:03 crc kubenswrapper[4735]: I1008 13:45:03.989015 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332185-8q22q" Oct 08 13:45:04 crc kubenswrapper[4735]: I1008 13:45:04.293836 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332140-nbw48"] Oct 08 13:45:04 crc kubenswrapper[4735]: I1008 13:45:04.298894 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332140-nbw48"] Oct 08 13:45:04 crc kubenswrapper[4735]: I1008 13:45:04.905473 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30287a9c-9d15-434b-95a4-549d6cbe491b" path="/var/lib/kubelet/pods/30287a9c-9d15-434b-95a4-549d6cbe491b/volumes" Oct 08 13:45:06 crc kubenswrapper[4735]: I1008 13:45:06.905185 4735 scope.go:117] "RemoveContainer" containerID="13f11ef6a7c873775b11fefc2a1cc60906dbaf3cb69cbea566f5676133120596" Oct 08 13:46:47 crc kubenswrapper[4735]: I1008 13:46:47.853232 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7fcx2"] Oct 08 13:46:47 crc kubenswrapper[4735]: E1008 13:46:47.854044 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29d927ff-0031-4725-a576-01a9f4cf5fa3" containerName="collect-profiles" Oct 08 13:46:47 crc kubenswrapper[4735]: I1008 13:46:47.854055 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="29d927ff-0031-4725-a576-01a9f4cf5fa3" containerName="collect-profiles" Oct 08 13:46:47 crc 
kubenswrapper[4735]: I1008 13:46:47.854226 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="29d927ff-0031-4725-a576-01a9f4cf5fa3" containerName="collect-profiles" Oct 08 13:46:47 crc kubenswrapper[4735]: I1008 13:46:47.855394 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7fcx2" Oct 08 13:46:47 crc kubenswrapper[4735]: I1008 13:46:47.873771 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7fcx2"] Oct 08 13:46:48 crc kubenswrapper[4735]: I1008 13:46:48.008236 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54ct4\" (UniqueName: \"kubernetes.io/projected/96b4e5bd-a7d9-475f-9faa-8de07ec439bc-kube-api-access-54ct4\") pod \"certified-operators-7fcx2\" (UID: \"96b4e5bd-a7d9-475f-9faa-8de07ec439bc\") " pod="openshift-marketplace/certified-operators-7fcx2" Oct 08 13:46:48 crc kubenswrapper[4735]: I1008 13:46:48.008328 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96b4e5bd-a7d9-475f-9faa-8de07ec439bc-utilities\") pod \"certified-operators-7fcx2\" (UID: \"96b4e5bd-a7d9-475f-9faa-8de07ec439bc\") " pod="openshift-marketplace/certified-operators-7fcx2" Oct 08 13:46:48 crc kubenswrapper[4735]: I1008 13:46:48.008396 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96b4e5bd-a7d9-475f-9faa-8de07ec439bc-catalog-content\") pod \"certified-operators-7fcx2\" (UID: \"96b4e5bd-a7d9-475f-9faa-8de07ec439bc\") " pod="openshift-marketplace/certified-operators-7fcx2" Oct 08 13:46:48 crc kubenswrapper[4735]: I1008 13:46:48.110305 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54ct4\" (UniqueName: 
\"kubernetes.io/projected/96b4e5bd-a7d9-475f-9faa-8de07ec439bc-kube-api-access-54ct4\") pod \"certified-operators-7fcx2\" (UID: \"96b4e5bd-a7d9-475f-9faa-8de07ec439bc\") " pod="openshift-marketplace/certified-operators-7fcx2" Oct 08 13:46:48 crc kubenswrapper[4735]: I1008 13:46:48.110385 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96b4e5bd-a7d9-475f-9faa-8de07ec439bc-utilities\") pod \"certified-operators-7fcx2\" (UID: \"96b4e5bd-a7d9-475f-9faa-8de07ec439bc\") " pod="openshift-marketplace/certified-operators-7fcx2" Oct 08 13:46:48 crc kubenswrapper[4735]: I1008 13:46:48.110431 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96b4e5bd-a7d9-475f-9faa-8de07ec439bc-catalog-content\") pod \"certified-operators-7fcx2\" (UID: \"96b4e5bd-a7d9-475f-9faa-8de07ec439bc\") " pod="openshift-marketplace/certified-operators-7fcx2" Oct 08 13:46:48 crc kubenswrapper[4735]: I1008 13:46:48.111013 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96b4e5bd-a7d9-475f-9faa-8de07ec439bc-utilities\") pod \"certified-operators-7fcx2\" (UID: \"96b4e5bd-a7d9-475f-9faa-8de07ec439bc\") " pod="openshift-marketplace/certified-operators-7fcx2" Oct 08 13:46:48 crc kubenswrapper[4735]: I1008 13:46:48.111049 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96b4e5bd-a7d9-475f-9faa-8de07ec439bc-catalog-content\") pod \"certified-operators-7fcx2\" (UID: \"96b4e5bd-a7d9-475f-9faa-8de07ec439bc\") " pod="openshift-marketplace/certified-operators-7fcx2" Oct 08 13:46:48 crc kubenswrapper[4735]: I1008 13:46:48.130943 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54ct4\" (UniqueName: 
\"kubernetes.io/projected/96b4e5bd-a7d9-475f-9faa-8de07ec439bc-kube-api-access-54ct4\") pod \"certified-operators-7fcx2\" (UID: \"96b4e5bd-a7d9-475f-9faa-8de07ec439bc\") " pod="openshift-marketplace/certified-operators-7fcx2" Oct 08 13:46:48 crc kubenswrapper[4735]: I1008 13:46:48.179231 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7fcx2" Oct 08 13:46:48 crc kubenswrapper[4735]: I1008 13:46:48.447293 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7fcx2"] Oct 08 13:46:48 crc kubenswrapper[4735]: I1008 13:46:48.786992 4735 generic.go:334] "Generic (PLEG): container finished" podID="96b4e5bd-a7d9-475f-9faa-8de07ec439bc" containerID="21a488c6dbc8dbd2c0e84616d9e83f90a0bb5057169aee38a0ba2b8042edb635" exitCode=0 Oct 08 13:46:48 crc kubenswrapper[4735]: I1008 13:46:48.787050 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7fcx2" event={"ID":"96b4e5bd-a7d9-475f-9faa-8de07ec439bc","Type":"ContainerDied","Data":"21a488c6dbc8dbd2c0e84616d9e83f90a0bb5057169aee38a0ba2b8042edb635"} Oct 08 13:46:48 crc kubenswrapper[4735]: I1008 13:46:48.787078 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7fcx2" event={"ID":"96b4e5bd-a7d9-475f-9faa-8de07ec439bc","Type":"ContainerStarted","Data":"532b5a360116463515dce5a596f776309635e03c0747f2e675b787485103324f"} Oct 08 13:46:48 crc kubenswrapper[4735]: I1008 13:46:48.788655 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 08 13:46:49 crc kubenswrapper[4735]: I1008 13:46:49.795314 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7fcx2" event={"ID":"96b4e5bd-a7d9-475f-9faa-8de07ec439bc","Type":"ContainerStarted","Data":"22de76dc9a76e0da11db59c72041c9dac2f8ec708a77635f9ea506f646d18e04"} Oct 08 13:46:50 
crc kubenswrapper[4735]: I1008 13:46:50.280558 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6xzzp"] Oct 08 13:46:50 crc kubenswrapper[4735]: I1008 13:46:50.282561 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6xzzp" Oct 08 13:46:50 crc kubenswrapper[4735]: I1008 13:46:50.293251 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6xzzp"] Oct 08 13:46:50 crc kubenswrapper[4735]: I1008 13:46:50.445943 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85377996-bd55-4791-b107-249fca95468a-catalog-content\") pod \"redhat-marketplace-6xzzp\" (UID: \"85377996-bd55-4791-b107-249fca95468a\") " pod="openshift-marketplace/redhat-marketplace-6xzzp" Oct 08 13:46:50 crc kubenswrapper[4735]: I1008 13:46:50.446308 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jvrr\" (UniqueName: \"kubernetes.io/projected/85377996-bd55-4791-b107-249fca95468a-kube-api-access-6jvrr\") pod \"redhat-marketplace-6xzzp\" (UID: \"85377996-bd55-4791-b107-249fca95468a\") " pod="openshift-marketplace/redhat-marketplace-6xzzp" Oct 08 13:46:50 crc kubenswrapper[4735]: I1008 13:46:50.446450 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85377996-bd55-4791-b107-249fca95468a-utilities\") pod \"redhat-marketplace-6xzzp\" (UID: \"85377996-bd55-4791-b107-249fca95468a\") " pod="openshift-marketplace/redhat-marketplace-6xzzp" Oct 08 13:46:50 crc kubenswrapper[4735]: I1008 13:46:50.547827 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jvrr\" (UniqueName: 
\"kubernetes.io/projected/85377996-bd55-4791-b107-249fca95468a-kube-api-access-6jvrr\") pod \"redhat-marketplace-6xzzp\" (UID: \"85377996-bd55-4791-b107-249fca95468a\") " pod="openshift-marketplace/redhat-marketplace-6xzzp" Oct 08 13:46:50 crc kubenswrapper[4735]: I1008 13:46:50.547902 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85377996-bd55-4791-b107-249fca95468a-utilities\") pod \"redhat-marketplace-6xzzp\" (UID: \"85377996-bd55-4791-b107-249fca95468a\") " pod="openshift-marketplace/redhat-marketplace-6xzzp" Oct 08 13:46:50 crc kubenswrapper[4735]: I1008 13:46:50.547975 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85377996-bd55-4791-b107-249fca95468a-catalog-content\") pod \"redhat-marketplace-6xzzp\" (UID: \"85377996-bd55-4791-b107-249fca95468a\") " pod="openshift-marketplace/redhat-marketplace-6xzzp" Oct 08 13:46:50 crc kubenswrapper[4735]: I1008 13:46:50.548460 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85377996-bd55-4791-b107-249fca95468a-catalog-content\") pod \"redhat-marketplace-6xzzp\" (UID: \"85377996-bd55-4791-b107-249fca95468a\") " pod="openshift-marketplace/redhat-marketplace-6xzzp" Oct 08 13:46:50 crc kubenswrapper[4735]: I1008 13:46:50.548961 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85377996-bd55-4791-b107-249fca95468a-utilities\") pod \"redhat-marketplace-6xzzp\" (UID: \"85377996-bd55-4791-b107-249fca95468a\") " pod="openshift-marketplace/redhat-marketplace-6xzzp" Oct 08 13:46:50 crc kubenswrapper[4735]: I1008 13:46:50.569081 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jvrr\" (UniqueName: 
\"kubernetes.io/projected/85377996-bd55-4791-b107-249fca95468a-kube-api-access-6jvrr\") pod \"redhat-marketplace-6xzzp\" (UID: \"85377996-bd55-4791-b107-249fca95468a\") " pod="openshift-marketplace/redhat-marketplace-6xzzp" Oct 08 13:46:50 crc kubenswrapper[4735]: I1008 13:46:50.606157 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6xzzp" Oct 08 13:46:50 crc kubenswrapper[4735]: I1008 13:46:50.803339 4735 generic.go:334] "Generic (PLEG): container finished" podID="96b4e5bd-a7d9-475f-9faa-8de07ec439bc" containerID="22de76dc9a76e0da11db59c72041c9dac2f8ec708a77635f9ea506f646d18e04" exitCode=0 Oct 08 13:46:50 crc kubenswrapper[4735]: I1008 13:46:50.803382 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7fcx2" event={"ID":"96b4e5bd-a7d9-475f-9faa-8de07ec439bc","Type":"ContainerDied","Data":"22de76dc9a76e0da11db59c72041c9dac2f8ec708a77635f9ea506f646d18e04"} Oct 08 13:46:51 crc kubenswrapper[4735]: I1008 13:46:51.013464 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6xzzp"] Oct 08 13:46:51 crc kubenswrapper[4735]: I1008 13:46:51.811990 4735 generic.go:334] "Generic (PLEG): container finished" podID="85377996-bd55-4791-b107-249fca95468a" containerID="ff1d7ab4b9501aa8b0fa3032962a33e11c3c77dc311fae5e23ca3bfc1c065a87" exitCode=0 Oct 08 13:46:51 crc kubenswrapper[4735]: I1008 13:46:51.812092 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6xzzp" event={"ID":"85377996-bd55-4791-b107-249fca95468a","Type":"ContainerDied","Data":"ff1d7ab4b9501aa8b0fa3032962a33e11c3c77dc311fae5e23ca3bfc1c065a87"} Oct 08 13:46:51 crc kubenswrapper[4735]: I1008 13:46:51.812373 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6xzzp" 
event={"ID":"85377996-bd55-4791-b107-249fca95468a","Type":"ContainerStarted","Data":"89774127ab58c25dc46a313dd313d578ac2440f83345dc7dff4f30c6b9a79aa6"} Oct 08 13:46:51 crc kubenswrapper[4735]: I1008 13:46:51.814521 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7fcx2" event={"ID":"96b4e5bd-a7d9-475f-9faa-8de07ec439bc","Type":"ContainerStarted","Data":"83f0bed15e10b86c8571d55f3c07f75c47311d5fdec8afb3574355127d766990"} Oct 08 13:46:51 crc kubenswrapper[4735]: I1008 13:46:51.847495 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7fcx2" podStartSLOduration=2.421661657 podStartE2EDuration="4.84746118s" podCreationTimestamp="2025-10-08 13:46:47 +0000 UTC" firstStartedPulling="2025-10-08 13:46:48.788364946 +0000 UTC m=+3058.596669480" lastFinishedPulling="2025-10-08 13:46:51.214164459 +0000 UTC m=+3061.022469003" observedRunningTime="2025-10-08 13:46:51.844007182 +0000 UTC m=+3061.652311736" watchObservedRunningTime="2025-10-08 13:46:51.84746118 +0000 UTC m=+3061.655765714" Oct 08 13:46:53 crc kubenswrapper[4735]: I1008 13:46:53.830915 4735 generic.go:334] "Generic (PLEG): container finished" podID="85377996-bd55-4791-b107-249fca95468a" containerID="638444c5e57c36d95311d66f96e3caba6ce161a87b017ad26b236141504fd6d8" exitCode=0 Oct 08 13:46:53 crc kubenswrapper[4735]: I1008 13:46:53.830982 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6xzzp" event={"ID":"85377996-bd55-4791-b107-249fca95468a","Type":"ContainerDied","Data":"638444c5e57c36d95311d66f96e3caba6ce161a87b017ad26b236141504fd6d8"} Oct 08 13:46:54 crc kubenswrapper[4735]: I1008 13:46:54.842643 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6xzzp" 
event={"ID":"85377996-bd55-4791-b107-249fca95468a","Type":"ContainerStarted","Data":"9ef7664446f244612a3288e6405ce395126ce8672c1d0703e7d77bcf6ebd435d"} Oct 08 13:46:54 crc kubenswrapper[4735]: I1008 13:46:54.866611 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6xzzp" podStartSLOduration=2.385260066 podStartE2EDuration="4.866586961s" podCreationTimestamp="2025-10-08 13:46:50 +0000 UTC" firstStartedPulling="2025-10-08 13:46:51.814091156 +0000 UTC m=+3061.622395700" lastFinishedPulling="2025-10-08 13:46:54.295418051 +0000 UTC m=+3064.103722595" observedRunningTime="2025-10-08 13:46:54.859296004 +0000 UTC m=+3064.667600588" watchObservedRunningTime="2025-10-08 13:46:54.866586961 +0000 UTC m=+3064.674891515" Oct 08 13:46:54 crc kubenswrapper[4735]: I1008 13:46:54.878702 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 13:46:54 crc kubenswrapper[4735]: I1008 13:46:54.878803 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 13:46:58 crc kubenswrapper[4735]: I1008 13:46:58.179296 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7fcx2" Oct 08 13:46:58 crc kubenswrapper[4735]: I1008 13:46:58.179769 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7fcx2" Oct 08 13:46:58 crc kubenswrapper[4735]: I1008 13:46:58.230591 4735 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7fcx2" Oct 08 13:46:58 crc kubenswrapper[4735]: I1008 13:46:58.931352 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7fcx2" Oct 08 13:46:58 crc kubenswrapper[4735]: I1008 13:46:58.978184 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7fcx2"] Oct 08 13:47:00 crc kubenswrapper[4735]: I1008 13:47:00.606527 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6xzzp" Oct 08 13:47:00 crc kubenswrapper[4735]: I1008 13:47:00.606888 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6xzzp" Oct 08 13:47:00 crc kubenswrapper[4735]: I1008 13:47:00.679949 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6xzzp" Oct 08 13:47:00 crc kubenswrapper[4735]: I1008 13:47:00.893020 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7fcx2" podUID="96b4e5bd-a7d9-475f-9faa-8de07ec439bc" containerName="registry-server" containerID="cri-o://83f0bed15e10b86c8571d55f3c07f75c47311d5fdec8afb3574355127d766990" gracePeriod=2 Oct 08 13:47:00 crc kubenswrapper[4735]: I1008 13:47:00.943251 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6xzzp" Oct 08 13:47:01 crc kubenswrapper[4735]: I1008 13:47:01.323534 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7fcx2" Oct 08 13:47:01 crc kubenswrapper[4735]: I1008 13:47:01.505312 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54ct4\" (UniqueName: \"kubernetes.io/projected/96b4e5bd-a7d9-475f-9faa-8de07ec439bc-kube-api-access-54ct4\") pod \"96b4e5bd-a7d9-475f-9faa-8de07ec439bc\" (UID: \"96b4e5bd-a7d9-475f-9faa-8de07ec439bc\") " Oct 08 13:47:01 crc kubenswrapper[4735]: I1008 13:47:01.505489 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96b4e5bd-a7d9-475f-9faa-8de07ec439bc-utilities\") pod \"96b4e5bd-a7d9-475f-9faa-8de07ec439bc\" (UID: \"96b4e5bd-a7d9-475f-9faa-8de07ec439bc\") " Oct 08 13:47:01 crc kubenswrapper[4735]: I1008 13:47:01.505532 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96b4e5bd-a7d9-475f-9faa-8de07ec439bc-catalog-content\") pod \"96b4e5bd-a7d9-475f-9faa-8de07ec439bc\" (UID: \"96b4e5bd-a7d9-475f-9faa-8de07ec439bc\") " Oct 08 13:47:01 crc kubenswrapper[4735]: I1008 13:47:01.506666 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96b4e5bd-a7d9-475f-9faa-8de07ec439bc-utilities" (OuterVolumeSpecName: "utilities") pod "96b4e5bd-a7d9-475f-9faa-8de07ec439bc" (UID: "96b4e5bd-a7d9-475f-9faa-8de07ec439bc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:47:01 crc kubenswrapper[4735]: I1008 13:47:01.514116 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b4e5bd-a7d9-475f-9faa-8de07ec439bc-kube-api-access-54ct4" (OuterVolumeSpecName: "kube-api-access-54ct4") pod "96b4e5bd-a7d9-475f-9faa-8de07ec439bc" (UID: "96b4e5bd-a7d9-475f-9faa-8de07ec439bc"). InnerVolumeSpecName "kube-api-access-54ct4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:47:01 crc kubenswrapper[4735]: I1008 13:47:01.604918 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96b4e5bd-a7d9-475f-9faa-8de07ec439bc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "96b4e5bd-a7d9-475f-9faa-8de07ec439bc" (UID: "96b4e5bd-a7d9-475f-9faa-8de07ec439bc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:47:01 crc kubenswrapper[4735]: I1008 13:47:01.606990 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96b4e5bd-a7d9-475f-9faa-8de07ec439bc-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 13:47:01 crc kubenswrapper[4735]: I1008 13:47:01.607031 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96b4e5bd-a7d9-475f-9faa-8de07ec439bc-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 13:47:01 crc kubenswrapper[4735]: I1008 13:47:01.607045 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54ct4\" (UniqueName: \"kubernetes.io/projected/96b4e5bd-a7d9-475f-9faa-8de07ec439bc-kube-api-access-54ct4\") on node \"crc\" DevicePath \"\"" Oct 08 13:47:01 crc kubenswrapper[4735]: I1008 13:47:01.907954 4735 generic.go:334] "Generic (PLEG): container finished" podID="96b4e5bd-a7d9-475f-9faa-8de07ec439bc" containerID="83f0bed15e10b86c8571d55f3c07f75c47311d5fdec8afb3574355127d766990" exitCode=0 Oct 08 13:47:01 crc kubenswrapper[4735]: I1008 13:47:01.908036 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7fcx2" event={"ID":"96b4e5bd-a7d9-475f-9faa-8de07ec439bc","Type":"ContainerDied","Data":"83f0bed15e10b86c8571d55f3c07f75c47311d5fdec8afb3574355127d766990"} Oct 08 13:47:01 crc kubenswrapper[4735]: I1008 13:47:01.908084 4735 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7fcx2" Oct 08 13:47:01 crc kubenswrapper[4735]: I1008 13:47:01.908107 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7fcx2" event={"ID":"96b4e5bd-a7d9-475f-9faa-8de07ec439bc","Type":"ContainerDied","Data":"532b5a360116463515dce5a596f776309635e03c0747f2e675b787485103324f"} Oct 08 13:47:01 crc kubenswrapper[4735]: I1008 13:47:01.908143 4735 scope.go:117] "RemoveContainer" containerID="83f0bed15e10b86c8571d55f3c07f75c47311d5fdec8afb3574355127d766990" Oct 08 13:47:01 crc kubenswrapper[4735]: I1008 13:47:01.938729 4735 scope.go:117] "RemoveContainer" containerID="22de76dc9a76e0da11db59c72041c9dac2f8ec708a77635f9ea506f646d18e04" Oct 08 13:47:01 crc kubenswrapper[4735]: I1008 13:47:01.955686 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7fcx2"] Oct 08 13:47:01 crc kubenswrapper[4735]: I1008 13:47:01.963160 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7fcx2"] Oct 08 13:47:01 crc kubenswrapper[4735]: I1008 13:47:01.963385 4735 scope.go:117] "RemoveContainer" containerID="21a488c6dbc8dbd2c0e84616d9e83f90a0bb5057169aee38a0ba2b8042edb635" Oct 08 13:47:01 crc kubenswrapper[4735]: I1008 13:47:01.996953 4735 scope.go:117] "RemoveContainer" containerID="83f0bed15e10b86c8571d55f3c07f75c47311d5fdec8afb3574355127d766990" Oct 08 13:47:01 crc kubenswrapper[4735]: E1008 13:47:01.997662 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83f0bed15e10b86c8571d55f3c07f75c47311d5fdec8afb3574355127d766990\": container with ID starting with 83f0bed15e10b86c8571d55f3c07f75c47311d5fdec8afb3574355127d766990 not found: ID does not exist" containerID="83f0bed15e10b86c8571d55f3c07f75c47311d5fdec8afb3574355127d766990" Oct 08 13:47:01 crc kubenswrapper[4735]: I1008 13:47:01.997736 
4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83f0bed15e10b86c8571d55f3c07f75c47311d5fdec8afb3574355127d766990"} err="failed to get container status \"83f0bed15e10b86c8571d55f3c07f75c47311d5fdec8afb3574355127d766990\": rpc error: code = NotFound desc = could not find container \"83f0bed15e10b86c8571d55f3c07f75c47311d5fdec8afb3574355127d766990\": container with ID starting with 83f0bed15e10b86c8571d55f3c07f75c47311d5fdec8afb3574355127d766990 not found: ID does not exist" Oct 08 13:47:01 crc kubenswrapper[4735]: I1008 13:47:01.997778 4735 scope.go:117] "RemoveContainer" containerID="22de76dc9a76e0da11db59c72041c9dac2f8ec708a77635f9ea506f646d18e04" Oct 08 13:47:01 crc kubenswrapper[4735]: E1008 13:47:01.998299 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22de76dc9a76e0da11db59c72041c9dac2f8ec708a77635f9ea506f646d18e04\": container with ID starting with 22de76dc9a76e0da11db59c72041c9dac2f8ec708a77635f9ea506f646d18e04 not found: ID does not exist" containerID="22de76dc9a76e0da11db59c72041c9dac2f8ec708a77635f9ea506f646d18e04" Oct 08 13:47:01 crc kubenswrapper[4735]: I1008 13:47:01.998349 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22de76dc9a76e0da11db59c72041c9dac2f8ec708a77635f9ea506f646d18e04"} err="failed to get container status \"22de76dc9a76e0da11db59c72041c9dac2f8ec708a77635f9ea506f646d18e04\": rpc error: code = NotFound desc = could not find container \"22de76dc9a76e0da11db59c72041c9dac2f8ec708a77635f9ea506f646d18e04\": container with ID starting with 22de76dc9a76e0da11db59c72041c9dac2f8ec708a77635f9ea506f646d18e04 not found: ID does not exist" Oct 08 13:47:01 crc kubenswrapper[4735]: I1008 13:47:01.998384 4735 scope.go:117] "RemoveContainer" containerID="21a488c6dbc8dbd2c0e84616d9e83f90a0bb5057169aee38a0ba2b8042edb635" Oct 08 13:47:01 crc kubenswrapper[4735]: E1008 
13:47:01.999044 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21a488c6dbc8dbd2c0e84616d9e83f90a0bb5057169aee38a0ba2b8042edb635\": container with ID starting with 21a488c6dbc8dbd2c0e84616d9e83f90a0bb5057169aee38a0ba2b8042edb635 not found: ID does not exist" containerID="21a488c6dbc8dbd2c0e84616d9e83f90a0bb5057169aee38a0ba2b8042edb635" Oct 08 13:47:01 crc kubenswrapper[4735]: I1008 13:47:01.999072 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21a488c6dbc8dbd2c0e84616d9e83f90a0bb5057169aee38a0ba2b8042edb635"} err="failed to get container status \"21a488c6dbc8dbd2c0e84616d9e83f90a0bb5057169aee38a0ba2b8042edb635\": rpc error: code = NotFound desc = could not find container \"21a488c6dbc8dbd2c0e84616d9e83f90a0bb5057169aee38a0ba2b8042edb635\": container with ID starting with 21a488c6dbc8dbd2c0e84616d9e83f90a0bb5057169aee38a0ba2b8042edb635 not found: ID does not exist" Oct 08 13:47:02 crc kubenswrapper[4735]: I1008 13:47:02.269655 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6xzzp"] Oct 08 13:47:02 crc kubenswrapper[4735]: I1008 13:47:02.906144 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b4e5bd-a7d9-475f-9faa-8de07ec439bc" path="/var/lib/kubelet/pods/96b4e5bd-a7d9-475f-9faa-8de07ec439bc/volumes" Oct 08 13:47:02 crc kubenswrapper[4735]: I1008 13:47:02.917086 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6xzzp" podUID="85377996-bd55-4791-b107-249fca95468a" containerName="registry-server" containerID="cri-o://9ef7664446f244612a3288e6405ce395126ce8672c1d0703e7d77bcf6ebd435d" gracePeriod=2 Oct 08 13:47:03 crc kubenswrapper[4735]: I1008 13:47:03.800399 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6xzzp" Oct 08 13:47:03 crc kubenswrapper[4735]: I1008 13:47:03.924783 4735 generic.go:334] "Generic (PLEG): container finished" podID="85377996-bd55-4791-b107-249fca95468a" containerID="9ef7664446f244612a3288e6405ce395126ce8672c1d0703e7d77bcf6ebd435d" exitCode=0 Oct 08 13:47:03 crc kubenswrapper[4735]: I1008 13:47:03.924828 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6xzzp" event={"ID":"85377996-bd55-4791-b107-249fca95468a","Type":"ContainerDied","Data":"9ef7664446f244612a3288e6405ce395126ce8672c1d0703e7d77bcf6ebd435d"} Oct 08 13:47:03 crc kubenswrapper[4735]: I1008 13:47:03.924862 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6xzzp" event={"ID":"85377996-bd55-4791-b107-249fca95468a","Type":"ContainerDied","Data":"89774127ab58c25dc46a313dd313d578ac2440f83345dc7dff4f30c6b9a79aa6"} Oct 08 13:47:03 crc kubenswrapper[4735]: I1008 13:47:03.924881 4735 scope.go:117] "RemoveContainer" containerID="9ef7664446f244612a3288e6405ce395126ce8672c1d0703e7d77bcf6ebd435d" Oct 08 13:47:03 crc kubenswrapper[4735]: I1008 13:47:03.925011 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6xzzp" Oct 08 13:47:03 crc kubenswrapper[4735]: I1008 13:47:03.944130 4735 scope.go:117] "RemoveContainer" containerID="638444c5e57c36d95311d66f96e3caba6ce161a87b017ad26b236141504fd6d8" Oct 08 13:47:03 crc kubenswrapper[4735]: I1008 13:47:03.958887 4735 scope.go:117] "RemoveContainer" containerID="ff1d7ab4b9501aa8b0fa3032962a33e11c3c77dc311fae5e23ca3bfc1c065a87" Oct 08 13:47:03 crc kubenswrapper[4735]: I1008 13:47:03.959106 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85377996-bd55-4791-b107-249fca95468a-utilities\") pod \"85377996-bd55-4791-b107-249fca95468a\" (UID: \"85377996-bd55-4791-b107-249fca95468a\") " Oct 08 13:47:03 crc kubenswrapper[4735]: I1008 13:47:03.959200 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85377996-bd55-4791-b107-249fca95468a-catalog-content\") pod \"85377996-bd55-4791-b107-249fca95468a\" (UID: \"85377996-bd55-4791-b107-249fca95468a\") " Oct 08 13:47:03 crc kubenswrapper[4735]: I1008 13:47:03.959283 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jvrr\" (UniqueName: \"kubernetes.io/projected/85377996-bd55-4791-b107-249fca95468a-kube-api-access-6jvrr\") pod \"85377996-bd55-4791-b107-249fca95468a\" (UID: \"85377996-bd55-4791-b107-249fca95468a\") " Oct 08 13:47:03 crc kubenswrapper[4735]: I1008 13:47:03.960058 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85377996-bd55-4791-b107-249fca95468a-utilities" (OuterVolumeSpecName: "utilities") pod "85377996-bd55-4791-b107-249fca95468a" (UID: "85377996-bd55-4791-b107-249fca95468a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:47:03 crc kubenswrapper[4735]: I1008 13:47:03.968831 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85377996-bd55-4791-b107-249fca95468a-kube-api-access-6jvrr" (OuterVolumeSpecName: "kube-api-access-6jvrr") pod "85377996-bd55-4791-b107-249fca95468a" (UID: "85377996-bd55-4791-b107-249fca95468a"). InnerVolumeSpecName "kube-api-access-6jvrr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:47:03 crc kubenswrapper[4735]: I1008 13:47:03.972649 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85377996-bd55-4791-b107-249fca95468a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "85377996-bd55-4791-b107-249fca95468a" (UID: "85377996-bd55-4791-b107-249fca95468a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:47:04 crc kubenswrapper[4735]: I1008 13:47:04.005666 4735 scope.go:117] "RemoveContainer" containerID="9ef7664446f244612a3288e6405ce395126ce8672c1d0703e7d77bcf6ebd435d" Oct 08 13:47:04 crc kubenswrapper[4735]: E1008 13:47:04.006121 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ef7664446f244612a3288e6405ce395126ce8672c1d0703e7d77bcf6ebd435d\": container with ID starting with 9ef7664446f244612a3288e6405ce395126ce8672c1d0703e7d77bcf6ebd435d not found: ID does not exist" containerID="9ef7664446f244612a3288e6405ce395126ce8672c1d0703e7d77bcf6ebd435d" Oct 08 13:47:04 crc kubenswrapper[4735]: I1008 13:47:04.006181 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ef7664446f244612a3288e6405ce395126ce8672c1d0703e7d77bcf6ebd435d"} err="failed to get container status \"9ef7664446f244612a3288e6405ce395126ce8672c1d0703e7d77bcf6ebd435d\": rpc error: code = NotFound desc = could not find 
container \"9ef7664446f244612a3288e6405ce395126ce8672c1d0703e7d77bcf6ebd435d\": container with ID starting with 9ef7664446f244612a3288e6405ce395126ce8672c1d0703e7d77bcf6ebd435d not found: ID does not exist" Oct 08 13:47:04 crc kubenswrapper[4735]: I1008 13:47:04.006215 4735 scope.go:117] "RemoveContainer" containerID="638444c5e57c36d95311d66f96e3caba6ce161a87b017ad26b236141504fd6d8" Oct 08 13:47:04 crc kubenswrapper[4735]: E1008 13:47:04.006742 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"638444c5e57c36d95311d66f96e3caba6ce161a87b017ad26b236141504fd6d8\": container with ID starting with 638444c5e57c36d95311d66f96e3caba6ce161a87b017ad26b236141504fd6d8 not found: ID does not exist" containerID="638444c5e57c36d95311d66f96e3caba6ce161a87b017ad26b236141504fd6d8" Oct 08 13:47:04 crc kubenswrapper[4735]: I1008 13:47:04.006770 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"638444c5e57c36d95311d66f96e3caba6ce161a87b017ad26b236141504fd6d8"} err="failed to get container status \"638444c5e57c36d95311d66f96e3caba6ce161a87b017ad26b236141504fd6d8\": rpc error: code = NotFound desc = could not find container \"638444c5e57c36d95311d66f96e3caba6ce161a87b017ad26b236141504fd6d8\": container with ID starting with 638444c5e57c36d95311d66f96e3caba6ce161a87b017ad26b236141504fd6d8 not found: ID does not exist" Oct 08 13:47:04 crc kubenswrapper[4735]: I1008 13:47:04.006787 4735 scope.go:117] "RemoveContainer" containerID="ff1d7ab4b9501aa8b0fa3032962a33e11c3c77dc311fae5e23ca3bfc1c065a87" Oct 08 13:47:04 crc kubenswrapper[4735]: E1008 13:47:04.007039 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff1d7ab4b9501aa8b0fa3032962a33e11c3c77dc311fae5e23ca3bfc1c065a87\": container with ID starting with ff1d7ab4b9501aa8b0fa3032962a33e11c3c77dc311fae5e23ca3bfc1c065a87 not found: ID does 
not exist" containerID="ff1d7ab4b9501aa8b0fa3032962a33e11c3c77dc311fae5e23ca3bfc1c065a87" Oct 08 13:47:04 crc kubenswrapper[4735]: I1008 13:47:04.007079 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff1d7ab4b9501aa8b0fa3032962a33e11c3c77dc311fae5e23ca3bfc1c065a87"} err="failed to get container status \"ff1d7ab4b9501aa8b0fa3032962a33e11c3c77dc311fae5e23ca3bfc1c065a87\": rpc error: code = NotFound desc = could not find container \"ff1d7ab4b9501aa8b0fa3032962a33e11c3c77dc311fae5e23ca3bfc1c065a87\": container with ID starting with ff1d7ab4b9501aa8b0fa3032962a33e11c3c77dc311fae5e23ca3bfc1c065a87 not found: ID does not exist" Oct 08 13:47:04 crc kubenswrapper[4735]: I1008 13:47:04.061280 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85377996-bd55-4791-b107-249fca95468a-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 13:47:04 crc kubenswrapper[4735]: I1008 13:47:04.061334 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85377996-bd55-4791-b107-249fca95468a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 13:47:04 crc kubenswrapper[4735]: I1008 13:47:04.061358 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jvrr\" (UniqueName: \"kubernetes.io/projected/85377996-bd55-4791-b107-249fca95468a-kube-api-access-6jvrr\") on node \"crc\" DevicePath \"\"" Oct 08 13:47:04 crc kubenswrapper[4735]: I1008 13:47:04.256208 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6xzzp"] Oct 08 13:47:04 crc kubenswrapper[4735]: I1008 13:47:04.261424 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6xzzp"] Oct 08 13:47:04 crc kubenswrapper[4735]: I1008 13:47:04.906701 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="85377996-bd55-4791-b107-249fca95468a" path="/var/lib/kubelet/pods/85377996-bd55-4791-b107-249fca95468a/volumes" Oct 08 13:47:24 crc kubenswrapper[4735]: I1008 13:47:24.878371 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 13:47:24 crc kubenswrapper[4735]: I1008 13:47:24.880213 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 13:47:54 crc kubenswrapper[4735]: I1008 13:47:54.878198 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 13:47:54 crc kubenswrapper[4735]: I1008 13:47:54.878722 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 13:47:54 crc kubenswrapper[4735]: I1008 13:47:54.878766 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" Oct 08 13:47:54 crc kubenswrapper[4735]: I1008 13:47:54.879346 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"4d1ca381527767d6fb57f050b8e50899b4a43d3d083a6f0f021d28e2ef77e261"} pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 13:47:54 crc kubenswrapper[4735]: I1008 13:47:54.879400 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" containerID="cri-o://4d1ca381527767d6fb57f050b8e50899b4a43d3d083a6f0f021d28e2ef77e261" gracePeriod=600 Oct 08 13:47:55 crc kubenswrapper[4735]: E1008 13:47:55.000015 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:47:55 crc kubenswrapper[4735]: I1008 13:47:55.280060 4735 generic.go:334] "Generic (PLEG): container finished" podID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerID="4d1ca381527767d6fb57f050b8e50899b4a43d3d083a6f0f021d28e2ef77e261" exitCode=0 Oct 08 13:47:55 crc kubenswrapper[4735]: I1008 13:47:55.280105 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" event={"ID":"2dfd0474-f64a-4d3b-9990-df182d079b10","Type":"ContainerDied","Data":"4d1ca381527767d6fb57f050b8e50899b4a43d3d083a6f0f021d28e2ef77e261"} Oct 08 13:47:55 crc kubenswrapper[4735]: I1008 13:47:55.280137 4735 scope.go:117] "RemoveContainer" containerID="e5dcfca1afb18b5f593ef936112711da3b9ca005ba0d5c5666cf063014f22582" Oct 08 13:47:55 crc kubenswrapper[4735]: I1008 13:47:55.280606 4735 
scope.go:117] "RemoveContainer" containerID="4d1ca381527767d6fb57f050b8e50899b4a43d3d083a6f0f021d28e2ef77e261" Oct 08 13:47:55 crc kubenswrapper[4735]: E1008 13:47:55.280844 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:48:08 crc kubenswrapper[4735]: I1008 13:48:08.895108 4735 scope.go:117] "RemoveContainer" containerID="4d1ca381527767d6fb57f050b8e50899b4a43d3d083a6f0f021d28e2ef77e261" Oct 08 13:48:08 crc kubenswrapper[4735]: E1008 13:48:08.898202 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:48:19 crc kubenswrapper[4735]: I1008 13:48:19.894898 4735 scope.go:117] "RemoveContainer" containerID="4d1ca381527767d6fb57f050b8e50899b4a43d3d083a6f0f021d28e2ef77e261" Oct 08 13:48:19 crc kubenswrapper[4735]: E1008 13:48:19.895662 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:48:33 crc kubenswrapper[4735]: I1008 
13:48:33.895040 4735 scope.go:117] "RemoveContainer" containerID="4d1ca381527767d6fb57f050b8e50899b4a43d3d083a6f0f021d28e2ef77e261" Oct 08 13:48:33 crc kubenswrapper[4735]: E1008 13:48:33.895792 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:48:48 crc kubenswrapper[4735]: I1008 13:48:48.894658 4735 scope.go:117] "RemoveContainer" containerID="4d1ca381527767d6fb57f050b8e50899b4a43d3d083a6f0f021d28e2ef77e261" Oct 08 13:48:48 crc kubenswrapper[4735]: E1008 13:48:48.895623 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:49:03 crc kubenswrapper[4735]: I1008 13:49:03.895136 4735 scope.go:117] "RemoveContainer" containerID="4d1ca381527767d6fb57f050b8e50899b4a43d3d083a6f0f021d28e2ef77e261" Oct 08 13:49:03 crc kubenswrapper[4735]: E1008 13:49:03.897908 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:49:17 crc 
kubenswrapper[4735]: I1008 13:49:17.895371 4735 scope.go:117] "RemoveContainer" containerID="4d1ca381527767d6fb57f050b8e50899b4a43d3d083a6f0f021d28e2ef77e261" Oct 08 13:49:17 crc kubenswrapper[4735]: E1008 13:49:17.896189 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:49:30 crc kubenswrapper[4735]: I1008 13:49:30.900451 4735 scope.go:117] "RemoveContainer" containerID="4d1ca381527767d6fb57f050b8e50899b4a43d3d083a6f0f021d28e2ef77e261" Oct 08 13:49:30 crc kubenswrapper[4735]: E1008 13:49:30.901563 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:49:42 crc kubenswrapper[4735]: I1008 13:49:42.897003 4735 scope.go:117] "RemoveContainer" containerID="4d1ca381527767d6fb57f050b8e50899b4a43d3d083a6f0f021d28e2ef77e261" Oct 08 13:49:42 crc kubenswrapper[4735]: E1008 13:49:42.897766 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 
08 13:49:48 crc kubenswrapper[4735]: I1008 13:49:48.887068 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hzvrq"] Oct 08 13:49:48 crc kubenswrapper[4735]: E1008 13:49:48.888197 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85377996-bd55-4791-b107-249fca95468a" containerName="extract-utilities" Oct 08 13:49:48 crc kubenswrapper[4735]: I1008 13:49:48.888234 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="85377996-bd55-4791-b107-249fca95468a" containerName="extract-utilities" Oct 08 13:49:48 crc kubenswrapper[4735]: E1008 13:49:48.888265 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96b4e5bd-a7d9-475f-9faa-8de07ec439bc" containerName="extract-content" Oct 08 13:49:48 crc kubenswrapper[4735]: I1008 13:49:48.888281 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="96b4e5bd-a7d9-475f-9faa-8de07ec439bc" containerName="extract-content" Oct 08 13:49:48 crc kubenswrapper[4735]: E1008 13:49:48.888310 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96b4e5bd-a7d9-475f-9faa-8de07ec439bc" containerName="extract-utilities" Oct 08 13:49:48 crc kubenswrapper[4735]: I1008 13:49:48.888328 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="96b4e5bd-a7d9-475f-9faa-8de07ec439bc" containerName="extract-utilities" Oct 08 13:49:48 crc kubenswrapper[4735]: E1008 13:49:48.888350 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96b4e5bd-a7d9-475f-9faa-8de07ec439bc" containerName="registry-server" Oct 08 13:49:48 crc kubenswrapper[4735]: I1008 13:49:48.888366 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="96b4e5bd-a7d9-475f-9faa-8de07ec439bc" containerName="registry-server" Oct 08 13:49:48 crc kubenswrapper[4735]: E1008 13:49:48.888419 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85377996-bd55-4791-b107-249fca95468a" containerName="registry-server" Oct 08 13:49:48 crc 
kubenswrapper[4735]: I1008 13:49:48.888433 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="85377996-bd55-4791-b107-249fca95468a" containerName="registry-server" Oct 08 13:49:48 crc kubenswrapper[4735]: E1008 13:49:48.888454 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85377996-bd55-4791-b107-249fca95468a" containerName="extract-content" Oct 08 13:49:48 crc kubenswrapper[4735]: I1008 13:49:48.888466 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="85377996-bd55-4791-b107-249fca95468a" containerName="extract-content" Oct 08 13:49:48 crc kubenswrapper[4735]: I1008 13:49:48.888758 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="85377996-bd55-4791-b107-249fca95468a" containerName="registry-server" Oct 08 13:49:48 crc kubenswrapper[4735]: I1008 13:49:48.888778 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="96b4e5bd-a7d9-475f-9faa-8de07ec439bc" containerName="registry-server" Oct 08 13:49:48 crc kubenswrapper[4735]: I1008 13:49:48.890898 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hzvrq" Oct 08 13:49:48 crc kubenswrapper[4735]: I1008 13:49:48.921363 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hzvrq"] Oct 08 13:49:48 crc kubenswrapper[4735]: I1008 13:49:48.988880 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0b15269-ff15-4850-bfc4-f0299cdf4819-utilities\") pod \"community-operators-hzvrq\" (UID: \"e0b15269-ff15-4850-bfc4-f0299cdf4819\") " pod="openshift-marketplace/community-operators-hzvrq" Oct 08 13:49:48 crc kubenswrapper[4735]: I1008 13:49:48.989838 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5z97\" (UniqueName: \"kubernetes.io/projected/e0b15269-ff15-4850-bfc4-f0299cdf4819-kube-api-access-m5z97\") pod \"community-operators-hzvrq\" (UID: \"e0b15269-ff15-4850-bfc4-f0299cdf4819\") " pod="openshift-marketplace/community-operators-hzvrq" Oct 08 13:49:48 crc kubenswrapper[4735]: I1008 13:49:48.990159 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0b15269-ff15-4850-bfc4-f0299cdf4819-catalog-content\") pod \"community-operators-hzvrq\" (UID: \"e0b15269-ff15-4850-bfc4-f0299cdf4819\") " pod="openshift-marketplace/community-operators-hzvrq" Oct 08 13:49:49 crc kubenswrapper[4735]: I1008 13:49:49.092222 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0b15269-ff15-4850-bfc4-f0299cdf4819-utilities\") pod \"community-operators-hzvrq\" (UID: \"e0b15269-ff15-4850-bfc4-f0299cdf4819\") " pod="openshift-marketplace/community-operators-hzvrq" Oct 08 13:49:49 crc kubenswrapper[4735]: I1008 13:49:49.092292 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-m5z97\" (UniqueName: \"kubernetes.io/projected/e0b15269-ff15-4850-bfc4-f0299cdf4819-kube-api-access-m5z97\") pod \"community-operators-hzvrq\" (UID: \"e0b15269-ff15-4850-bfc4-f0299cdf4819\") " pod="openshift-marketplace/community-operators-hzvrq" Oct 08 13:49:49 crc kubenswrapper[4735]: I1008 13:49:49.092326 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0b15269-ff15-4850-bfc4-f0299cdf4819-catalog-content\") pod \"community-operators-hzvrq\" (UID: \"e0b15269-ff15-4850-bfc4-f0299cdf4819\") " pod="openshift-marketplace/community-operators-hzvrq" Oct 08 13:49:49 crc kubenswrapper[4735]: I1008 13:49:49.092750 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0b15269-ff15-4850-bfc4-f0299cdf4819-utilities\") pod \"community-operators-hzvrq\" (UID: \"e0b15269-ff15-4850-bfc4-f0299cdf4819\") " pod="openshift-marketplace/community-operators-hzvrq" Oct 08 13:49:49 crc kubenswrapper[4735]: I1008 13:49:49.092827 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0b15269-ff15-4850-bfc4-f0299cdf4819-catalog-content\") pod \"community-operators-hzvrq\" (UID: \"e0b15269-ff15-4850-bfc4-f0299cdf4819\") " pod="openshift-marketplace/community-operators-hzvrq" Oct 08 13:49:49 crc kubenswrapper[4735]: I1008 13:49:49.110930 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5z97\" (UniqueName: \"kubernetes.io/projected/e0b15269-ff15-4850-bfc4-f0299cdf4819-kube-api-access-m5z97\") pod \"community-operators-hzvrq\" (UID: \"e0b15269-ff15-4850-bfc4-f0299cdf4819\") " pod="openshift-marketplace/community-operators-hzvrq" Oct 08 13:49:49 crc kubenswrapper[4735]: I1008 13:49:49.229750 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hzvrq" Oct 08 13:49:49 crc kubenswrapper[4735]: I1008 13:49:49.689902 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hzvrq"] Oct 08 13:49:50 crc kubenswrapper[4735]: I1008 13:49:50.164694 4735 generic.go:334] "Generic (PLEG): container finished" podID="e0b15269-ff15-4850-bfc4-f0299cdf4819" containerID="0a4f84f0be1378f39d5e45ef2ef8c7c38c0534539df67993ac3834664405c1a9" exitCode=0 Oct 08 13:49:50 crc kubenswrapper[4735]: I1008 13:49:50.164747 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hzvrq" event={"ID":"e0b15269-ff15-4850-bfc4-f0299cdf4819","Type":"ContainerDied","Data":"0a4f84f0be1378f39d5e45ef2ef8c7c38c0534539df67993ac3834664405c1a9"} Oct 08 13:49:50 crc kubenswrapper[4735]: I1008 13:49:50.164984 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hzvrq" event={"ID":"e0b15269-ff15-4850-bfc4-f0299cdf4819","Type":"ContainerStarted","Data":"afce835bf5239d9ff3ef295da2cc8f0f66af52a8d446fed5caf41adef1eae2d7"} Oct 08 13:49:52 crc kubenswrapper[4735]: I1008 13:49:52.185260 4735 generic.go:334] "Generic (PLEG): container finished" podID="e0b15269-ff15-4850-bfc4-f0299cdf4819" containerID="c8494fba79a0ba76eb69a6752cb67fdcc26816db2762671019e64b163eb34278" exitCode=0 Oct 08 13:49:52 crc kubenswrapper[4735]: I1008 13:49:52.185371 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hzvrq" event={"ID":"e0b15269-ff15-4850-bfc4-f0299cdf4819","Type":"ContainerDied","Data":"c8494fba79a0ba76eb69a6752cb67fdcc26816db2762671019e64b163eb34278"} Oct 08 13:49:53 crc kubenswrapper[4735]: I1008 13:49:53.193732 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hzvrq" 
event={"ID":"e0b15269-ff15-4850-bfc4-f0299cdf4819","Type":"ContainerStarted","Data":"7f0a8711a385fd6a032c7bac4dbf54769d797aaa691e4f05f95b79f3d3b69b62"} Oct 08 13:49:53 crc kubenswrapper[4735]: I1008 13:49:53.221148 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hzvrq" podStartSLOduration=2.791978223 podStartE2EDuration="5.221130005s" podCreationTimestamp="2025-10-08 13:49:48 +0000 UTC" firstStartedPulling="2025-10-08 13:49:50.165885425 +0000 UTC m=+3239.974189969" lastFinishedPulling="2025-10-08 13:49:52.595037207 +0000 UTC m=+3242.403341751" observedRunningTime="2025-10-08 13:49:53.217750738 +0000 UTC m=+3243.026055292" watchObservedRunningTime="2025-10-08 13:49:53.221130005 +0000 UTC m=+3243.029434549" Oct 08 13:49:53 crc kubenswrapper[4735]: I1008 13:49:53.895517 4735 scope.go:117] "RemoveContainer" containerID="4d1ca381527767d6fb57f050b8e50899b4a43d3d083a6f0f021d28e2ef77e261" Oct 08 13:49:53 crc kubenswrapper[4735]: E1008 13:49:53.895729 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:49:59 crc kubenswrapper[4735]: I1008 13:49:59.230386 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hzvrq" Oct 08 13:49:59 crc kubenswrapper[4735]: I1008 13:49:59.231240 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hzvrq" Oct 08 13:49:59 crc kubenswrapper[4735]: I1008 13:49:59.272399 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-hzvrq" Oct 08 13:50:00 crc kubenswrapper[4735]: I1008 13:50:00.311570 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hzvrq" Oct 08 13:50:00 crc kubenswrapper[4735]: I1008 13:50:00.366868 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hzvrq"] Oct 08 13:50:02 crc kubenswrapper[4735]: I1008 13:50:02.261990 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hzvrq" podUID="e0b15269-ff15-4850-bfc4-f0299cdf4819" containerName="registry-server" containerID="cri-o://7f0a8711a385fd6a032c7bac4dbf54769d797aaa691e4f05f95b79f3d3b69b62" gracePeriod=2 Oct 08 13:50:03 crc kubenswrapper[4735]: I1008 13:50:03.235189 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hzvrq" Oct 08 13:50:03 crc kubenswrapper[4735]: I1008 13:50:03.270441 4735 generic.go:334] "Generic (PLEG): container finished" podID="e0b15269-ff15-4850-bfc4-f0299cdf4819" containerID="7f0a8711a385fd6a032c7bac4dbf54769d797aaa691e4f05f95b79f3d3b69b62" exitCode=0 Oct 08 13:50:03 crc kubenswrapper[4735]: I1008 13:50:03.270486 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hzvrq" event={"ID":"e0b15269-ff15-4850-bfc4-f0299cdf4819","Type":"ContainerDied","Data":"7f0a8711a385fd6a032c7bac4dbf54769d797aaa691e4f05f95b79f3d3b69b62"} Oct 08 13:50:03 crc kubenswrapper[4735]: I1008 13:50:03.270547 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hzvrq" event={"ID":"e0b15269-ff15-4850-bfc4-f0299cdf4819","Type":"ContainerDied","Data":"afce835bf5239d9ff3ef295da2cc8f0f66af52a8d446fed5caf41adef1eae2d7"} Oct 08 13:50:03 crc kubenswrapper[4735]: I1008 13:50:03.270565 4735 scope.go:117] "RemoveContainer" 
containerID="7f0a8711a385fd6a032c7bac4dbf54769d797aaa691e4f05f95b79f3d3b69b62" Oct 08 13:50:03 crc kubenswrapper[4735]: I1008 13:50:03.270560 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hzvrq" Oct 08 13:50:03 crc kubenswrapper[4735]: I1008 13:50:03.286077 4735 scope.go:117] "RemoveContainer" containerID="c8494fba79a0ba76eb69a6752cb67fdcc26816db2762671019e64b163eb34278" Oct 08 13:50:03 crc kubenswrapper[4735]: I1008 13:50:03.306364 4735 scope.go:117] "RemoveContainer" containerID="0a4f84f0be1378f39d5e45ef2ef8c7c38c0534539df67993ac3834664405c1a9" Oct 08 13:50:03 crc kubenswrapper[4735]: I1008 13:50:03.334653 4735 scope.go:117] "RemoveContainer" containerID="7f0a8711a385fd6a032c7bac4dbf54769d797aaa691e4f05f95b79f3d3b69b62" Oct 08 13:50:03 crc kubenswrapper[4735]: E1008 13:50:03.335118 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f0a8711a385fd6a032c7bac4dbf54769d797aaa691e4f05f95b79f3d3b69b62\": container with ID starting with 7f0a8711a385fd6a032c7bac4dbf54769d797aaa691e4f05f95b79f3d3b69b62 not found: ID does not exist" containerID="7f0a8711a385fd6a032c7bac4dbf54769d797aaa691e4f05f95b79f3d3b69b62" Oct 08 13:50:03 crc kubenswrapper[4735]: I1008 13:50:03.335195 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f0a8711a385fd6a032c7bac4dbf54769d797aaa691e4f05f95b79f3d3b69b62"} err="failed to get container status \"7f0a8711a385fd6a032c7bac4dbf54769d797aaa691e4f05f95b79f3d3b69b62\": rpc error: code = NotFound desc = could not find container \"7f0a8711a385fd6a032c7bac4dbf54769d797aaa691e4f05f95b79f3d3b69b62\": container with ID starting with 7f0a8711a385fd6a032c7bac4dbf54769d797aaa691e4f05f95b79f3d3b69b62 not found: ID does not exist" Oct 08 13:50:03 crc kubenswrapper[4735]: I1008 13:50:03.335235 4735 scope.go:117] "RemoveContainer" 
containerID="c8494fba79a0ba76eb69a6752cb67fdcc26816db2762671019e64b163eb34278" Oct 08 13:50:03 crc kubenswrapper[4735]: E1008 13:50:03.335694 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8494fba79a0ba76eb69a6752cb67fdcc26816db2762671019e64b163eb34278\": container with ID starting with c8494fba79a0ba76eb69a6752cb67fdcc26816db2762671019e64b163eb34278 not found: ID does not exist" containerID="c8494fba79a0ba76eb69a6752cb67fdcc26816db2762671019e64b163eb34278" Oct 08 13:50:03 crc kubenswrapper[4735]: I1008 13:50:03.335741 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8494fba79a0ba76eb69a6752cb67fdcc26816db2762671019e64b163eb34278"} err="failed to get container status \"c8494fba79a0ba76eb69a6752cb67fdcc26816db2762671019e64b163eb34278\": rpc error: code = NotFound desc = could not find container \"c8494fba79a0ba76eb69a6752cb67fdcc26816db2762671019e64b163eb34278\": container with ID starting with c8494fba79a0ba76eb69a6752cb67fdcc26816db2762671019e64b163eb34278 not found: ID does not exist" Oct 08 13:50:03 crc kubenswrapper[4735]: I1008 13:50:03.335774 4735 scope.go:117] "RemoveContainer" containerID="0a4f84f0be1378f39d5e45ef2ef8c7c38c0534539df67993ac3834664405c1a9" Oct 08 13:50:03 crc kubenswrapper[4735]: E1008 13:50:03.336089 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a4f84f0be1378f39d5e45ef2ef8c7c38c0534539df67993ac3834664405c1a9\": container with ID starting with 0a4f84f0be1378f39d5e45ef2ef8c7c38c0534539df67993ac3834664405c1a9 not found: ID does not exist" containerID="0a4f84f0be1378f39d5e45ef2ef8c7c38c0534539df67993ac3834664405c1a9" Oct 08 13:50:03 crc kubenswrapper[4735]: I1008 13:50:03.336125 4735 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0a4f84f0be1378f39d5e45ef2ef8c7c38c0534539df67993ac3834664405c1a9"} err="failed to get container status \"0a4f84f0be1378f39d5e45ef2ef8c7c38c0534539df67993ac3834664405c1a9\": rpc error: code = NotFound desc = could not find container \"0a4f84f0be1378f39d5e45ef2ef8c7c38c0534539df67993ac3834664405c1a9\": container with ID starting with 0a4f84f0be1378f39d5e45ef2ef8c7c38c0534539df67993ac3834664405c1a9 not found: ID does not exist" Oct 08 13:50:03 crc kubenswrapper[4735]: I1008 13:50:03.406287 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0b15269-ff15-4850-bfc4-f0299cdf4819-catalog-content\") pod \"e0b15269-ff15-4850-bfc4-f0299cdf4819\" (UID: \"e0b15269-ff15-4850-bfc4-f0299cdf4819\") " Oct 08 13:50:03 crc kubenswrapper[4735]: I1008 13:50:03.406378 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5z97\" (UniqueName: \"kubernetes.io/projected/e0b15269-ff15-4850-bfc4-f0299cdf4819-kube-api-access-m5z97\") pod \"e0b15269-ff15-4850-bfc4-f0299cdf4819\" (UID: \"e0b15269-ff15-4850-bfc4-f0299cdf4819\") " Oct 08 13:50:03 crc kubenswrapper[4735]: I1008 13:50:03.406422 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0b15269-ff15-4850-bfc4-f0299cdf4819-utilities\") pod \"e0b15269-ff15-4850-bfc4-f0299cdf4819\" (UID: \"e0b15269-ff15-4850-bfc4-f0299cdf4819\") " Oct 08 13:50:03 crc kubenswrapper[4735]: I1008 13:50:03.407360 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0b15269-ff15-4850-bfc4-f0299cdf4819-utilities" (OuterVolumeSpecName: "utilities") pod "e0b15269-ff15-4850-bfc4-f0299cdf4819" (UID: "e0b15269-ff15-4850-bfc4-f0299cdf4819"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:50:03 crc kubenswrapper[4735]: I1008 13:50:03.412630 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0b15269-ff15-4850-bfc4-f0299cdf4819-kube-api-access-m5z97" (OuterVolumeSpecName: "kube-api-access-m5z97") pod "e0b15269-ff15-4850-bfc4-f0299cdf4819" (UID: "e0b15269-ff15-4850-bfc4-f0299cdf4819"). InnerVolumeSpecName "kube-api-access-m5z97". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:50:03 crc kubenswrapper[4735]: I1008 13:50:03.458893 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0b15269-ff15-4850-bfc4-f0299cdf4819-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e0b15269-ff15-4850-bfc4-f0299cdf4819" (UID: "e0b15269-ff15-4850-bfc4-f0299cdf4819"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:50:03 crc kubenswrapper[4735]: I1008 13:50:03.508365 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0b15269-ff15-4850-bfc4-f0299cdf4819-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 13:50:03 crc kubenswrapper[4735]: I1008 13:50:03.508733 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5z97\" (UniqueName: \"kubernetes.io/projected/e0b15269-ff15-4850-bfc4-f0299cdf4819-kube-api-access-m5z97\") on node \"crc\" DevicePath \"\"" Oct 08 13:50:03 crc kubenswrapper[4735]: I1008 13:50:03.508836 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0b15269-ff15-4850-bfc4-f0299cdf4819-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 13:50:03 crc kubenswrapper[4735]: I1008 13:50:03.618847 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hzvrq"] Oct 08 13:50:03 crc kubenswrapper[4735]: I1008 
13:50:03.623421 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hzvrq"] Oct 08 13:50:04 crc kubenswrapper[4735]: I1008 13:50:04.903682 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0b15269-ff15-4850-bfc4-f0299cdf4819" path="/var/lib/kubelet/pods/e0b15269-ff15-4850-bfc4-f0299cdf4819/volumes" Oct 08 13:50:08 crc kubenswrapper[4735]: I1008 13:50:08.894889 4735 scope.go:117] "RemoveContainer" containerID="4d1ca381527767d6fb57f050b8e50899b4a43d3d083a6f0f021d28e2ef77e261" Oct 08 13:50:08 crc kubenswrapper[4735]: E1008 13:50:08.895552 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:50:19 crc kubenswrapper[4735]: I1008 13:50:19.895043 4735 scope.go:117] "RemoveContainer" containerID="4d1ca381527767d6fb57f050b8e50899b4a43d3d083a6f0f021d28e2ef77e261" Oct 08 13:50:19 crc kubenswrapper[4735]: E1008 13:50:19.896914 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:50:30 crc kubenswrapper[4735]: I1008 13:50:30.901270 4735 scope.go:117] "RemoveContainer" containerID="4d1ca381527767d6fb57f050b8e50899b4a43d3d083a6f0f021d28e2ef77e261" Oct 08 13:50:30 crc kubenswrapper[4735]: E1008 13:50:30.902240 4735 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:50:43 crc kubenswrapper[4735]: I1008 13:50:43.895202 4735 scope.go:117] "RemoveContainer" containerID="4d1ca381527767d6fb57f050b8e50899b4a43d3d083a6f0f021d28e2ef77e261" Oct 08 13:50:43 crc kubenswrapper[4735]: E1008 13:50:43.896503 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:50:55 crc kubenswrapper[4735]: I1008 13:50:55.894660 4735 scope.go:117] "RemoveContainer" containerID="4d1ca381527767d6fb57f050b8e50899b4a43d3d083a6f0f021d28e2ef77e261" Oct 08 13:50:55 crc kubenswrapper[4735]: E1008 13:50:55.895535 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:50:56 crc kubenswrapper[4735]: I1008 13:50:56.538184 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9wlwm"] Oct 08 13:50:56 crc kubenswrapper[4735]: E1008 13:50:56.538498 4735 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="e0b15269-ff15-4850-bfc4-f0299cdf4819" containerName="extract-content" Oct 08 13:50:56 crc kubenswrapper[4735]: I1008 13:50:56.538511 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0b15269-ff15-4850-bfc4-f0299cdf4819" containerName="extract-content" Oct 08 13:50:56 crc kubenswrapper[4735]: E1008 13:50:56.538521 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0b15269-ff15-4850-bfc4-f0299cdf4819" containerName="registry-server" Oct 08 13:50:56 crc kubenswrapper[4735]: I1008 13:50:56.538528 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0b15269-ff15-4850-bfc4-f0299cdf4819" containerName="registry-server" Oct 08 13:50:56 crc kubenswrapper[4735]: E1008 13:50:56.538554 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0b15269-ff15-4850-bfc4-f0299cdf4819" containerName="extract-utilities" Oct 08 13:50:56 crc kubenswrapper[4735]: I1008 13:50:56.538561 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0b15269-ff15-4850-bfc4-f0299cdf4819" containerName="extract-utilities" Oct 08 13:50:56 crc kubenswrapper[4735]: I1008 13:50:56.538701 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0b15269-ff15-4850-bfc4-f0299cdf4819" containerName="registry-server" Oct 08 13:50:56 crc kubenswrapper[4735]: I1008 13:50:56.540744 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9wlwm" Oct 08 13:50:56 crc kubenswrapper[4735]: I1008 13:50:56.553444 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9wlwm"] Oct 08 13:50:56 crc kubenswrapper[4735]: I1008 13:50:56.671589 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-842np\" (UniqueName: \"kubernetes.io/projected/01097d93-fcb4-49a7-8c1e-86486affc196-kube-api-access-842np\") pod \"redhat-operators-9wlwm\" (UID: \"01097d93-fcb4-49a7-8c1e-86486affc196\") " pod="openshift-marketplace/redhat-operators-9wlwm" Oct 08 13:50:56 crc kubenswrapper[4735]: I1008 13:50:56.671675 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01097d93-fcb4-49a7-8c1e-86486affc196-utilities\") pod \"redhat-operators-9wlwm\" (UID: \"01097d93-fcb4-49a7-8c1e-86486affc196\") " pod="openshift-marketplace/redhat-operators-9wlwm" Oct 08 13:50:56 crc kubenswrapper[4735]: I1008 13:50:56.671752 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01097d93-fcb4-49a7-8c1e-86486affc196-catalog-content\") pod \"redhat-operators-9wlwm\" (UID: \"01097d93-fcb4-49a7-8c1e-86486affc196\") " pod="openshift-marketplace/redhat-operators-9wlwm" Oct 08 13:50:56 crc kubenswrapper[4735]: I1008 13:50:56.773156 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01097d93-fcb4-49a7-8c1e-86486affc196-utilities\") pod \"redhat-operators-9wlwm\" (UID: \"01097d93-fcb4-49a7-8c1e-86486affc196\") " pod="openshift-marketplace/redhat-operators-9wlwm" Oct 08 13:50:56 crc kubenswrapper[4735]: I1008 13:50:56.773495 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01097d93-fcb4-49a7-8c1e-86486affc196-catalog-content\") pod \"redhat-operators-9wlwm\" (UID: \"01097d93-fcb4-49a7-8c1e-86486affc196\") " pod="openshift-marketplace/redhat-operators-9wlwm" Oct 08 13:50:56 crc kubenswrapper[4735]: I1008 13:50:56.773628 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01097d93-fcb4-49a7-8c1e-86486affc196-utilities\") pod \"redhat-operators-9wlwm\" (UID: \"01097d93-fcb4-49a7-8c1e-86486affc196\") " pod="openshift-marketplace/redhat-operators-9wlwm" Oct 08 13:50:56 crc kubenswrapper[4735]: I1008 13:50:56.773629 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-842np\" (UniqueName: \"kubernetes.io/projected/01097d93-fcb4-49a7-8c1e-86486affc196-kube-api-access-842np\") pod \"redhat-operators-9wlwm\" (UID: \"01097d93-fcb4-49a7-8c1e-86486affc196\") " pod="openshift-marketplace/redhat-operators-9wlwm" Oct 08 13:50:56 crc kubenswrapper[4735]: I1008 13:50:56.774047 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01097d93-fcb4-49a7-8c1e-86486affc196-catalog-content\") pod \"redhat-operators-9wlwm\" (UID: \"01097d93-fcb4-49a7-8c1e-86486affc196\") " pod="openshift-marketplace/redhat-operators-9wlwm" Oct 08 13:50:56 crc kubenswrapper[4735]: I1008 13:50:56.794606 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-842np\" (UniqueName: \"kubernetes.io/projected/01097d93-fcb4-49a7-8c1e-86486affc196-kube-api-access-842np\") pod \"redhat-operators-9wlwm\" (UID: \"01097d93-fcb4-49a7-8c1e-86486affc196\") " pod="openshift-marketplace/redhat-operators-9wlwm" Oct 08 13:50:56 crc kubenswrapper[4735]: I1008 13:50:56.865244 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9wlwm" Oct 08 13:50:57 crc kubenswrapper[4735]: I1008 13:50:57.279082 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9wlwm"] Oct 08 13:50:57 crc kubenswrapper[4735]: I1008 13:50:57.682864 4735 generic.go:334] "Generic (PLEG): container finished" podID="01097d93-fcb4-49a7-8c1e-86486affc196" containerID="a9a67355a3d8079c371e2a3fb755e6c208e75cbd0867b5a00f4162ad3d2f1d65" exitCode=0 Oct 08 13:50:57 crc kubenswrapper[4735]: I1008 13:50:57.682938 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9wlwm" event={"ID":"01097d93-fcb4-49a7-8c1e-86486affc196","Type":"ContainerDied","Data":"a9a67355a3d8079c371e2a3fb755e6c208e75cbd0867b5a00f4162ad3d2f1d65"} Oct 08 13:50:57 crc kubenswrapper[4735]: I1008 13:50:57.683176 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9wlwm" event={"ID":"01097d93-fcb4-49a7-8c1e-86486affc196","Type":"ContainerStarted","Data":"55bb5578b68f578cc2937fad03bc303dea0a806ca1a383641ee94b7744fd2e82"} Oct 08 13:50:59 crc kubenswrapper[4735]: I1008 13:50:59.700241 4735 generic.go:334] "Generic (PLEG): container finished" podID="01097d93-fcb4-49a7-8c1e-86486affc196" containerID="51e32f6dcb9c8f22c6b0c11a2927bcceaefcb5556286b6924c6d7a5fc4ae778c" exitCode=0 Oct 08 13:50:59 crc kubenswrapper[4735]: I1008 13:50:59.701668 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9wlwm" event={"ID":"01097d93-fcb4-49a7-8c1e-86486affc196","Type":"ContainerDied","Data":"51e32f6dcb9c8f22c6b0c11a2927bcceaefcb5556286b6924c6d7a5fc4ae778c"} Oct 08 13:51:00 crc kubenswrapper[4735]: I1008 13:51:00.713349 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9wlwm" 
event={"ID":"01097d93-fcb4-49a7-8c1e-86486affc196","Type":"ContainerStarted","Data":"4f1f2d210ef6b51e24f0e95315c42c755f04a2b9aaaa2a0d8d0c3bbce4f8c083"} Oct 08 13:51:00 crc kubenswrapper[4735]: I1008 13:51:00.737109 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9wlwm" podStartSLOduration=2.207054396 podStartE2EDuration="4.737091242s" podCreationTimestamp="2025-10-08 13:50:56 +0000 UTC" firstStartedPulling="2025-10-08 13:50:57.684011097 +0000 UTC m=+3307.492315641" lastFinishedPulling="2025-10-08 13:51:00.214047943 +0000 UTC m=+3310.022352487" observedRunningTime="2025-10-08 13:51:00.732093244 +0000 UTC m=+3310.540397798" watchObservedRunningTime="2025-10-08 13:51:00.737091242 +0000 UTC m=+3310.545395796" Oct 08 13:51:06 crc kubenswrapper[4735]: I1008 13:51:06.866285 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9wlwm" Oct 08 13:51:06 crc kubenswrapper[4735]: I1008 13:51:06.866648 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9wlwm" Oct 08 13:51:06 crc kubenswrapper[4735]: I1008 13:51:06.914736 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9wlwm" Oct 08 13:51:07 crc kubenswrapper[4735]: I1008 13:51:07.799118 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9wlwm" Oct 08 13:51:07 crc kubenswrapper[4735]: I1008 13:51:07.839168 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9wlwm"] Oct 08 13:51:09 crc kubenswrapper[4735]: I1008 13:51:09.775980 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9wlwm" podUID="01097d93-fcb4-49a7-8c1e-86486affc196" containerName="registry-server" 
containerID="cri-o://4f1f2d210ef6b51e24f0e95315c42c755f04a2b9aaaa2a0d8d0c3bbce4f8c083" gracePeriod=2 Oct 08 13:51:10 crc kubenswrapper[4735]: I1008 13:51:10.659857 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9wlwm" Oct 08 13:51:10 crc kubenswrapper[4735]: I1008 13:51:10.767733 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01097d93-fcb4-49a7-8c1e-86486affc196-catalog-content\") pod \"01097d93-fcb4-49a7-8c1e-86486affc196\" (UID: \"01097d93-fcb4-49a7-8c1e-86486affc196\") " Oct 08 13:51:10 crc kubenswrapper[4735]: I1008 13:51:10.767874 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01097d93-fcb4-49a7-8c1e-86486affc196-utilities\") pod \"01097d93-fcb4-49a7-8c1e-86486affc196\" (UID: \"01097d93-fcb4-49a7-8c1e-86486affc196\") " Oct 08 13:51:10 crc kubenswrapper[4735]: I1008 13:51:10.767915 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-842np\" (UniqueName: \"kubernetes.io/projected/01097d93-fcb4-49a7-8c1e-86486affc196-kube-api-access-842np\") pod \"01097d93-fcb4-49a7-8c1e-86486affc196\" (UID: \"01097d93-fcb4-49a7-8c1e-86486affc196\") " Oct 08 13:51:10 crc kubenswrapper[4735]: I1008 13:51:10.769151 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01097d93-fcb4-49a7-8c1e-86486affc196-utilities" (OuterVolumeSpecName: "utilities") pod "01097d93-fcb4-49a7-8c1e-86486affc196" (UID: "01097d93-fcb4-49a7-8c1e-86486affc196"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:51:10 crc kubenswrapper[4735]: I1008 13:51:10.774790 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01097d93-fcb4-49a7-8c1e-86486affc196-kube-api-access-842np" (OuterVolumeSpecName: "kube-api-access-842np") pod "01097d93-fcb4-49a7-8c1e-86486affc196" (UID: "01097d93-fcb4-49a7-8c1e-86486affc196"). InnerVolumeSpecName "kube-api-access-842np". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:51:10 crc kubenswrapper[4735]: I1008 13:51:10.783197 4735 generic.go:334] "Generic (PLEG): container finished" podID="01097d93-fcb4-49a7-8c1e-86486affc196" containerID="4f1f2d210ef6b51e24f0e95315c42c755f04a2b9aaaa2a0d8d0c3bbce4f8c083" exitCode=0 Oct 08 13:51:10 crc kubenswrapper[4735]: I1008 13:51:10.783277 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9wlwm" Oct 08 13:51:10 crc kubenswrapper[4735]: I1008 13:51:10.784525 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9wlwm" event={"ID":"01097d93-fcb4-49a7-8c1e-86486affc196","Type":"ContainerDied","Data":"4f1f2d210ef6b51e24f0e95315c42c755f04a2b9aaaa2a0d8d0c3bbce4f8c083"} Oct 08 13:51:10 crc kubenswrapper[4735]: I1008 13:51:10.784663 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9wlwm" event={"ID":"01097d93-fcb4-49a7-8c1e-86486affc196","Type":"ContainerDied","Data":"55bb5578b68f578cc2937fad03bc303dea0a806ca1a383641ee94b7744fd2e82"} Oct 08 13:51:10 crc kubenswrapper[4735]: I1008 13:51:10.784749 4735 scope.go:117] "RemoveContainer" containerID="4f1f2d210ef6b51e24f0e95315c42c755f04a2b9aaaa2a0d8d0c3bbce4f8c083" Oct 08 13:51:10 crc kubenswrapper[4735]: I1008 13:51:10.820530 4735 scope.go:117] "RemoveContainer" containerID="51e32f6dcb9c8f22c6b0c11a2927bcceaefcb5556286b6924c6d7a5fc4ae778c" Oct 08 13:51:10 crc 
kubenswrapper[4735]: I1008 13:51:10.835788 4735 scope.go:117] "RemoveContainer" containerID="a9a67355a3d8079c371e2a3fb755e6c208e75cbd0867b5a00f4162ad3d2f1d65" Oct 08 13:51:10 crc kubenswrapper[4735]: I1008 13:51:10.856323 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01097d93-fcb4-49a7-8c1e-86486affc196-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "01097d93-fcb4-49a7-8c1e-86486affc196" (UID: "01097d93-fcb4-49a7-8c1e-86486affc196"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:51:10 crc kubenswrapper[4735]: I1008 13:51:10.865566 4735 scope.go:117] "RemoveContainer" containerID="4f1f2d210ef6b51e24f0e95315c42c755f04a2b9aaaa2a0d8d0c3bbce4f8c083" Oct 08 13:51:10 crc kubenswrapper[4735]: E1008 13:51:10.866055 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f1f2d210ef6b51e24f0e95315c42c755f04a2b9aaaa2a0d8d0c3bbce4f8c083\": container with ID starting with 4f1f2d210ef6b51e24f0e95315c42c755f04a2b9aaaa2a0d8d0c3bbce4f8c083 not found: ID does not exist" containerID="4f1f2d210ef6b51e24f0e95315c42c755f04a2b9aaaa2a0d8d0c3bbce4f8c083" Oct 08 13:51:10 crc kubenswrapper[4735]: I1008 13:51:10.866120 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f1f2d210ef6b51e24f0e95315c42c755f04a2b9aaaa2a0d8d0c3bbce4f8c083"} err="failed to get container status \"4f1f2d210ef6b51e24f0e95315c42c755f04a2b9aaaa2a0d8d0c3bbce4f8c083\": rpc error: code = NotFound desc = could not find container \"4f1f2d210ef6b51e24f0e95315c42c755f04a2b9aaaa2a0d8d0c3bbce4f8c083\": container with ID starting with 4f1f2d210ef6b51e24f0e95315c42c755f04a2b9aaaa2a0d8d0c3bbce4f8c083 not found: ID does not exist" Oct 08 13:51:10 crc kubenswrapper[4735]: I1008 13:51:10.866163 4735 scope.go:117] "RemoveContainer" 
containerID="51e32f6dcb9c8f22c6b0c11a2927bcceaefcb5556286b6924c6d7a5fc4ae778c" Oct 08 13:51:10 crc kubenswrapper[4735]: E1008 13:51:10.866682 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51e32f6dcb9c8f22c6b0c11a2927bcceaefcb5556286b6924c6d7a5fc4ae778c\": container with ID starting with 51e32f6dcb9c8f22c6b0c11a2927bcceaefcb5556286b6924c6d7a5fc4ae778c not found: ID does not exist" containerID="51e32f6dcb9c8f22c6b0c11a2927bcceaefcb5556286b6924c6d7a5fc4ae778c" Oct 08 13:51:10 crc kubenswrapper[4735]: I1008 13:51:10.866712 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51e32f6dcb9c8f22c6b0c11a2927bcceaefcb5556286b6924c6d7a5fc4ae778c"} err="failed to get container status \"51e32f6dcb9c8f22c6b0c11a2927bcceaefcb5556286b6924c6d7a5fc4ae778c\": rpc error: code = NotFound desc = could not find container \"51e32f6dcb9c8f22c6b0c11a2927bcceaefcb5556286b6924c6d7a5fc4ae778c\": container with ID starting with 51e32f6dcb9c8f22c6b0c11a2927bcceaefcb5556286b6924c6d7a5fc4ae778c not found: ID does not exist" Oct 08 13:51:10 crc kubenswrapper[4735]: I1008 13:51:10.866731 4735 scope.go:117] "RemoveContainer" containerID="a9a67355a3d8079c371e2a3fb755e6c208e75cbd0867b5a00f4162ad3d2f1d65" Oct 08 13:51:10 crc kubenswrapper[4735]: E1008 13:51:10.867078 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9a67355a3d8079c371e2a3fb755e6c208e75cbd0867b5a00f4162ad3d2f1d65\": container with ID starting with a9a67355a3d8079c371e2a3fb755e6c208e75cbd0867b5a00f4162ad3d2f1d65 not found: ID does not exist" containerID="a9a67355a3d8079c371e2a3fb755e6c208e75cbd0867b5a00f4162ad3d2f1d65" Oct 08 13:51:10 crc kubenswrapper[4735]: I1008 13:51:10.867125 4735 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a9a67355a3d8079c371e2a3fb755e6c208e75cbd0867b5a00f4162ad3d2f1d65"} err="failed to get container status \"a9a67355a3d8079c371e2a3fb755e6c208e75cbd0867b5a00f4162ad3d2f1d65\": rpc error: code = NotFound desc = could not find container \"a9a67355a3d8079c371e2a3fb755e6c208e75cbd0867b5a00f4162ad3d2f1d65\": container with ID starting with a9a67355a3d8079c371e2a3fb755e6c208e75cbd0867b5a00f4162ad3d2f1d65 not found: ID does not exist" Oct 08 13:51:10 crc kubenswrapper[4735]: I1008 13:51:10.869242 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01097d93-fcb4-49a7-8c1e-86486affc196-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 13:51:10 crc kubenswrapper[4735]: I1008 13:51:10.869271 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-842np\" (UniqueName: \"kubernetes.io/projected/01097d93-fcb4-49a7-8c1e-86486affc196-kube-api-access-842np\") on node \"crc\" DevicePath \"\"" Oct 08 13:51:10 crc kubenswrapper[4735]: I1008 13:51:10.869284 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01097d93-fcb4-49a7-8c1e-86486affc196-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 13:51:10 crc kubenswrapper[4735]: I1008 13:51:10.898455 4735 scope.go:117] "RemoveContainer" containerID="4d1ca381527767d6fb57f050b8e50899b4a43d3d083a6f0f021d28e2ef77e261" Oct 08 13:51:10 crc kubenswrapper[4735]: E1008 13:51:10.898707 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:51:11 crc kubenswrapper[4735]: I1008 
13:51:11.110213 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9wlwm"] Oct 08 13:51:11 crc kubenswrapper[4735]: I1008 13:51:11.116293 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9wlwm"] Oct 08 13:51:12 crc kubenswrapper[4735]: I1008 13:51:12.911301 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01097d93-fcb4-49a7-8c1e-86486affc196" path="/var/lib/kubelet/pods/01097d93-fcb4-49a7-8c1e-86486affc196/volumes" Oct 08 13:51:23 crc kubenswrapper[4735]: I1008 13:51:23.895142 4735 scope.go:117] "RemoveContainer" containerID="4d1ca381527767d6fb57f050b8e50899b4a43d3d083a6f0f021d28e2ef77e261" Oct 08 13:51:23 crc kubenswrapper[4735]: E1008 13:51:23.896189 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:51:38 crc kubenswrapper[4735]: I1008 13:51:38.895064 4735 scope.go:117] "RemoveContainer" containerID="4d1ca381527767d6fb57f050b8e50899b4a43d3d083a6f0f021d28e2ef77e261" Oct 08 13:51:38 crc kubenswrapper[4735]: E1008 13:51:38.895825 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:51:53 crc kubenswrapper[4735]: I1008 13:51:53.896344 4735 scope.go:117] "RemoveContainer" 
containerID="4d1ca381527767d6fb57f050b8e50899b4a43d3d083a6f0f021d28e2ef77e261" Oct 08 13:51:53 crc kubenswrapper[4735]: E1008 13:51:53.897147 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:52:05 crc kubenswrapper[4735]: I1008 13:52:05.894666 4735 scope.go:117] "RemoveContainer" containerID="4d1ca381527767d6fb57f050b8e50899b4a43d3d083a6f0f021d28e2ef77e261" Oct 08 13:52:05 crc kubenswrapper[4735]: E1008 13:52:05.895439 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:52:19 crc kubenswrapper[4735]: I1008 13:52:19.894868 4735 scope.go:117] "RemoveContainer" containerID="4d1ca381527767d6fb57f050b8e50899b4a43d3d083a6f0f021d28e2ef77e261" Oct 08 13:52:19 crc kubenswrapper[4735]: E1008 13:52:19.895596 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:52:31 crc kubenswrapper[4735]: I1008 13:52:31.895540 4735 scope.go:117] 
"RemoveContainer" containerID="4d1ca381527767d6fb57f050b8e50899b4a43d3d083a6f0f021d28e2ef77e261" Oct 08 13:52:31 crc kubenswrapper[4735]: E1008 13:52:31.896551 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:52:44 crc kubenswrapper[4735]: I1008 13:52:44.894534 4735 scope.go:117] "RemoveContainer" containerID="4d1ca381527767d6fb57f050b8e50899b4a43d3d083a6f0f021d28e2ef77e261" Oct 08 13:52:44 crc kubenswrapper[4735]: E1008 13:52:44.895332 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:52:58 crc kubenswrapper[4735]: I1008 13:52:58.894298 4735 scope.go:117] "RemoveContainer" containerID="4d1ca381527767d6fb57f050b8e50899b4a43d3d083a6f0f021d28e2ef77e261" Oct 08 13:52:59 crc kubenswrapper[4735]: I1008 13:52:59.589305 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" event={"ID":"2dfd0474-f64a-4d3b-9990-df182d079b10","Type":"ContainerStarted","Data":"66adade894154f5148594423540bbb9f089da97db2df2ad92330740d1619ac44"} Oct 08 13:55:24 crc kubenswrapper[4735]: I1008 13:55:24.878907 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness 
probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 13:55:24 crc kubenswrapper[4735]: I1008 13:55:24.879923 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 13:55:54 crc kubenswrapper[4735]: I1008 13:55:54.878181 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 13:55:54 crc kubenswrapper[4735]: I1008 13:55:54.878934 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 13:56:24 crc kubenswrapper[4735]: I1008 13:56:24.878278 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 13:56:24 crc kubenswrapper[4735]: I1008 13:56:24.880780 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" Oct 08 13:56:24 crc kubenswrapper[4735]: I1008 13:56:24.880875 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" Oct 08 13:56:24 crc kubenswrapper[4735]: I1008 13:56:24.881577 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"66adade894154f5148594423540bbb9f089da97db2df2ad92330740d1619ac44"} pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 13:56:24 crc kubenswrapper[4735]: I1008 13:56:24.881640 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" containerID="cri-o://66adade894154f5148594423540bbb9f089da97db2df2ad92330740d1619ac44" gracePeriod=600 Oct 08 13:56:25 crc kubenswrapper[4735]: I1008 13:56:25.111561 4735 generic.go:334] "Generic (PLEG): container finished" podID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerID="66adade894154f5148594423540bbb9f089da97db2df2ad92330740d1619ac44" exitCode=0 Oct 08 13:56:25 crc kubenswrapper[4735]: I1008 13:56:25.111628 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" event={"ID":"2dfd0474-f64a-4d3b-9990-df182d079b10","Type":"ContainerDied","Data":"66adade894154f5148594423540bbb9f089da97db2df2ad92330740d1619ac44"} Oct 08 13:56:25 crc kubenswrapper[4735]: I1008 13:56:25.111915 4735 scope.go:117] "RemoveContainer" containerID="4d1ca381527767d6fb57f050b8e50899b4a43d3d083a6f0f021d28e2ef77e261" Oct 08 13:56:26 crc kubenswrapper[4735]: I1008 13:56:26.120967 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" event={"ID":"2dfd0474-f64a-4d3b-9990-df182d079b10","Type":"ContainerStarted","Data":"1b85e2fef173dcccb95dff7094d06daa381225d3e83968f8698fb1f4a0882dc8"} Oct 08 13:56:56 crc kubenswrapper[4735]: I1008 13:56:56.279546 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9s7mc"] Oct 08 13:56:56 crc kubenswrapper[4735]: E1008 13:56:56.280433 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01097d93-fcb4-49a7-8c1e-86486affc196" containerName="registry-server" Oct 08 13:56:56 crc kubenswrapper[4735]: I1008 13:56:56.280449 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="01097d93-fcb4-49a7-8c1e-86486affc196" containerName="registry-server" Oct 08 13:56:56 crc kubenswrapper[4735]: E1008 13:56:56.280535 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01097d93-fcb4-49a7-8c1e-86486affc196" containerName="extract-content" Oct 08 13:56:56 crc kubenswrapper[4735]: I1008 13:56:56.280545 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="01097d93-fcb4-49a7-8c1e-86486affc196" containerName="extract-content" Oct 08 13:56:56 crc kubenswrapper[4735]: E1008 13:56:56.280596 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01097d93-fcb4-49a7-8c1e-86486affc196" containerName="extract-utilities" Oct 08 13:56:56 crc kubenswrapper[4735]: I1008 13:56:56.280605 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="01097d93-fcb4-49a7-8c1e-86486affc196" containerName="extract-utilities" Oct 08 13:56:56 crc kubenswrapper[4735]: I1008 13:56:56.280777 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="01097d93-fcb4-49a7-8c1e-86486affc196" containerName="registry-server" Oct 08 13:56:56 crc kubenswrapper[4735]: I1008 13:56:56.281838 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9s7mc" Oct 08 13:56:56 crc kubenswrapper[4735]: I1008 13:56:56.295737 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9s7mc"] Oct 08 13:56:56 crc kubenswrapper[4735]: I1008 13:56:56.480247 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1e7eec9-2ab8-4741-abd1-8e8cec0302b4-utilities\") pod \"redhat-marketplace-9s7mc\" (UID: \"e1e7eec9-2ab8-4741-abd1-8e8cec0302b4\") " pod="openshift-marketplace/redhat-marketplace-9s7mc" Oct 08 13:56:56 crc kubenswrapper[4735]: I1008 13:56:56.480295 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dq4w7\" (UniqueName: \"kubernetes.io/projected/e1e7eec9-2ab8-4741-abd1-8e8cec0302b4-kube-api-access-dq4w7\") pod \"redhat-marketplace-9s7mc\" (UID: \"e1e7eec9-2ab8-4741-abd1-8e8cec0302b4\") " pod="openshift-marketplace/redhat-marketplace-9s7mc" Oct 08 13:56:56 crc kubenswrapper[4735]: I1008 13:56:56.480341 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1e7eec9-2ab8-4741-abd1-8e8cec0302b4-catalog-content\") pod \"redhat-marketplace-9s7mc\" (UID: \"e1e7eec9-2ab8-4741-abd1-8e8cec0302b4\") " pod="openshift-marketplace/redhat-marketplace-9s7mc" Oct 08 13:56:56 crc kubenswrapper[4735]: I1008 13:56:56.581674 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1e7eec9-2ab8-4741-abd1-8e8cec0302b4-utilities\") pod \"redhat-marketplace-9s7mc\" (UID: \"e1e7eec9-2ab8-4741-abd1-8e8cec0302b4\") " pod="openshift-marketplace/redhat-marketplace-9s7mc" Oct 08 13:56:56 crc kubenswrapper[4735]: I1008 13:56:56.581738 4735 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-dq4w7\" (UniqueName: \"kubernetes.io/projected/e1e7eec9-2ab8-4741-abd1-8e8cec0302b4-kube-api-access-dq4w7\") pod \"redhat-marketplace-9s7mc\" (UID: \"e1e7eec9-2ab8-4741-abd1-8e8cec0302b4\") " pod="openshift-marketplace/redhat-marketplace-9s7mc" Oct 08 13:56:56 crc kubenswrapper[4735]: I1008 13:56:56.581790 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1e7eec9-2ab8-4741-abd1-8e8cec0302b4-catalog-content\") pod \"redhat-marketplace-9s7mc\" (UID: \"e1e7eec9-2ab8-4741-abd1-8e8cec0302b4\") " pod="openshift-marketplace/redhat-marketplace-9s7mc" Oct 08 13:56:56 crc kubenswrapper[4735]: I1008 13:56:56.582493 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1e7eec9-2ab8-4741-abd1-8e8cec0302b4-catalog-content\") pod \"redhat-marketplace-9s7mc\" (UID: \"e1e7eec9-2ab8-4741-abd1-8e8cec0302b4\") " pod="openshift-marketplace/redhat-marketplace-9s7mc" Oct 08 13:56:56 crc kubenswrapper[4735]: I1008 13:56:56.582493 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1e7eec9-2ab8-4741-abd1-8e8cec0302b4-utilities\") pod \"redhat-marketplace-9s7mc\" (UID: \"e1e7eec9-2ab8-4741-abd1-8e8cec0302b4\") " pod="openshift-marketplace/redhat-marketplace-9s7mc" Oct 08 13:56:56 crc kubenswrapper[4735]: I1008 13:56:56.603527 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dq4w7\" (UniqueName: \"kubernetes.io/projected/e1e7eec9-2ab8-4741-abd1-8e8cec0302b4-kube-api-access-dq4w7\") pod \"redhat-marketplace-9s7mc\" (UID: \"e1e7eec9-2ab8-4741-abd1-8e8cec0302b4\") " pod="openshift-marketplace/redhat-marketplace-9s7mc" Oct 08 13:56:56 crc kubenswrapper[4735]: I1008 13:56:56.899069 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9s7mc" Oct 08 13:56:57 crc kubenswrapper[4735]: I1008 13:56:57.352574 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9s7mc"] Oct 08 13:56:58 crc kubenswrapper[4735]: I1008 13:56:58.358282 4735 generic.go:334] "Generic (PLEG): container finished" podID="e1e7eec9-2ab8-4741-abd1-8e8cec0302b4" containerID="e2b31f71166a88345d8baff00f65c2e63b918f791d676f6a4f06576aef988448" exitCode=0 Oct 08 13:56:58 crc kubenswrapper[4735]: I1008 13:56:58.358340 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9s7mc" event={"ID":"e1e7eec9-2ab8-4741-abd1-8e8cec0302b4","Type":"ContainerDied","Data":"e2b31f71166a88345d8baff00f65c2e63b918f791d676f6a4f06576aef988448"} Oct 08 13:56:58 crc kubenswrapper[4735]: I1008 13:56:58.358629 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9s7mc" event={"ID":"e1e7eec9-2ab8-4741-abd1-8e8cec0302b4","Type":"ContainerStarted","Data":"ec3f7d27c2e4bf7f856689a3668e521f79fbe1d26b58b87cc8111f7f320d3b4d"} Oct 08 13:56:58 crc kubenswrapper[4735]: I1008 13:56:58.360887 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 08 13:57:00 crc kubenswrapper[4735]: I1008 13:57:00.386619 4735 generic.go:334] "Generic (PLEG): container finished" podID="e1e7eec9-2ab8-4741-abd1-8e8cec0302b4" containerID="b4d20edfe60014c9f2c07ad3edaeeb431100bd82c3d8f92278ae2823d98430fc" exitCode=0 Oct 08 13:57:00 crc kubenswrapper[4735]: I1008 13:57:00.387139 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9s7mc" event={"ID":"e1e7eec9-2ab8-4741-abd1-8e8cec0302b4","Type":"ContainerDied","Data":"b4d20edfe60014c9f2c07ad3edaeeb431100bd82c3d8f92278ae2823d98430fc"} Oct 08 13:57:01 crc kubenswrapper[4735]: I1008 13:57:01.397639 4735 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-marketplace-9s7mc" event={"ID":"e1e7eec9-2ab8-4741-abd1-8e8cec0302b4","Type":"ContainerStarted","Data":"8b6cbe12c19ea981f19a9d7a26c54a4a0c2e8568abf0c3929392660fb08d3c07"} Oct 08 13:57:01 crc kubenswrapper[4735]: I1008 13:57:01.417522 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9s7mc" podStartSLOduration=2.970021657 podStartE2EDuration="5.417501432s" podCreationTimestamp="2025-10-08 13:56:56 +0000 UTC" firstStartedPulling="2025-10-08 13:56:58.360641027 +0000 UTC m=+3668.168945571" lastFinishedPulling="2025-10-08 13:57:00.808120802 +0000 UTC m=+3670.616425346" observedRunningTime="2025-10-08 13:57:01.415261364 +0000 UTC m=+3671.223565928" watchObservedRunningTime="2025-10-08 13:57:01.417501432 +0000 UTC m=+3671.225805966" Oct 08 13:57:06 crc kubenswrapper[4735]: I1008 13:57:06.911545 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9s7mc" Oct 08 13:57:06 crc kubenswrapper[4735]: I1008 13:57:06.911612 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9s7mc" Oct 08 13:57:06 crc kubenswrapper[4735]: I1008 13:57:06.952720 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9s7mc" Oct 08 13:57:07 crc kubenswrapper[4735]: I1008 13:57:07.496632 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9s7mc" Oct 08 13:57:07 crc kubenswrapper[4735]: I1008 13:57:07.544329 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9s7mc"] Oct 08 13:57:09 crc kubenswrapper[4735]: I1008 13:57:09.455110 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9s7mc" 
podUID="e1e7eec9-2ab8-4741-abd1-8e8cec0302b4" containerName="registry-server" containerID="cri-o://8b6cbe12c19ea981f19a9d7a26c54a4a0c2e8568abf0c3929392660fb08d3c07" gracePeriod=2 Oct 08 13:57:09 crc kubenswrapper[4735]: I1008 13:57:09.851300 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9s7mc" Oct 08 13:57:09 crc kubenswrapper[4735]: I1008 13:57:09.872077 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1e7eec9-2ab8-4741-abd1-8e8cec0302b4-catalog-content\") pod \"e1e7eec9-2ab8-4741-abd1-8e8cec0302b4\" (UID: \"e1e7eec9-2ab8-4741-abd1-8e8cec0302b4\") " Oct 08 13:57:09 crc kubenswrapper[4735]: I1008 13:57:09.872193 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dq4w7\" (UniqueName: \"kubernetes.io/projected/e1e7eec9-2ab8-4741-abd1-8e8cec0302b4-kube-api-access-dq4w7\") pod \"e1e7eec9-2ab8-4741-abd1-8e8cec0302b4\" (UID: \"e1e7eec9-2ab8-4741-abd1-8e8cec0302b4\") " Oct 08 13:57:09 crc kubenswrapper[4735]: I1008 13:57:09.872247 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1e7eec9-2ab8-4741-abd1-8e8cec0302b4-utilities\") pod \"e1e7eec9-2ab8-4741-abd1-8e8cec0302b4\" (UID: \"e1e7eec9-2ab8-4741-abd1-8e8cec0302b4\") " Oct 08 13:57:09 crc kubenswrapper[4735]: I1008 13:57:09.874007 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1e7eec9-2ab8-4741-abd1-8e8cec0302b4-utilities" (OuterVolumeSpecName: "utilities") pod "e1e7eec9-2ab8-4741-abd1-8e8cec0302b4" (UID: "e1e7eec9-2ab8-4741-abd1-8e8cec0302b4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:57:09 crc kubenswrapper[4735]: I1008 13:57:09.879330 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1e7eec9-2ab8-4741-abd1-8e8cec0302b4-kube-api-access-dq4w7" (OuterVolumeSpecName: "kube-api-access-dq4w7") pod "e1e7eec9-2ab8-4741-abd1-8e8cec0302b4" (UID: "e1e7eec9-2ab8-4741-abd1-8e8cec0302b4"). InnerVolumeSpecName "kube-api-access-dq4w7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:57:09 crc kubenswrapper[4735]: I1008 13:57:09.893556 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1e7eec9-2ab8-4741-abd1-8e8cec0302b4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e1e7eec9-2ab8-4741-abd1-8e8cec0302b4" (UID: "e1e7eec9-2ab8-4741-abd1-8e8cec0302b4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:57:09 crc kubenswrapper[4735]: I1008 13:57:09.974574 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1e7eec9-2ab8-4741-abd1-8e8cec0302b4-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 13:57:09 crc kubenswrapper[4735]: I1008 13:57:09.974619 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dq4w7\" (UniqueName: \"kubernetes.io/projected/e1e7eec9-2ab8-4741-abd1-8e8cec0302b4-kube-api-access-dq4w7\") on node \"crc\" DevicePath \"\"" Oct 08 13:57:09 crc kubenswrapper[4735]: I1008 13:57:09.974636 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1e7eec9-2ab8-4741-abd1-8e8cec0302b4-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 13:57:10 crc kubenswrapper[4735]: I1008 13:57:10.463732 4735 generic.go:334] "Generic (PLEG): container finished" podID="e1e7eec9-2ab8-4741-abd1-8e8cec0302b4" 
containerID="8b6cbe12c19ea981f19a9d7a26c54a4a0c2e8568abf0c3929392660fb08d3c07" exitCode=0 Oct 08 13:57:10 crc kubenswrapper[4735]: I1008 13:57:10.463784 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9s7mc" event={"ID":"e1e7eec9-2ab8-4741-abd1-8e8cec0302b4","Type":"ContainerDied","Data":"8b6cbe12c19ea981f19a9d7a26c54a4a0c2e8568abf0c3929392660fb08d3c07"} Oct 08 13:57:10 crc kubenswrapper[4735]: I1008 13:57:10.463817 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9s7mc" event={"ID":"e1e7eec9-2ab8-4741-abd1-8e8cec0302b4","Type":"ContainerDied","Data":"ec3f7d27c2e4bf7f856689a3668e521f79fbe1d26b58b87cc8111f7f320d3b4d"} Oct 08 13:57:10 crc kubenswrapper[4735]: I1008 13:57:10.463839 4735 scope.go:117] "RemoveContainer" containerID="8b6cbe12c19ea981f19a9d7a26c54a4a0c2e8568abf0c3929392660fb08d3c07" Oct 08 13:57:10 crc kubenswrapper[4735]: I1008 13:57:10.463835 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9s7mc" Oct 08 13:57:10 crc kubenswrapper[4735]: I1008 13:57:10.502599 4735 scope.go:117] "RemoveContainer" containerID="b4d20edfe60014c9f2c07ad3edaeeb431100bd82c3d8f92278ae2823d98430fc" Oct 08 13:57:10 crc kubenswrapper[4735]: I1008 13:57:10.507810 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9s7mc"] Oct 08 13:57:10 crc kubenswrapper[4735]: I1008 13:57:10.513591 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9s7mc"] Oct 08 13:57:10 crc kubenswrapper[4735]: I1008 13:57:10.519402 4735 scope.go:117] "RemoveContainer" containerID="e2b31f71166a88345d8baff00f65c2e63b918f791d676f6a4f06576aef988448" Oct 08 13:57:10 crc kubenswrapper[4735]: I1008 13:57:10.540416 4735 scope.go:117] "RemoveContainer" containerID="8b6cbe12c19ea981f19a9d7a26c54a4a0c2e8568abf0c3929392660fb08d3c07" Oct 08 13:57:10 crc kubenswrapper[4735]: E1008 13:57:10.540766 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b6cbe12c19ea981f19a9d7a26c54a4a0c2e8568abf0c3929392660fb08d3c07\": container with ID starting with 8b6cbe12c19ea981f19a9d7a26c54a4a0c2e8568abf0c3929392660fb08d3c07 not found: ID does not exist" containerID="8b6cbe12c19ea981f19a9d7a26c54a4a0c2e8568abf0c3929392660fb08d3c07" Oct 08 13:57:10 crc kubenswrapper[4735]: I1008 13:57:10.540819 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b6cbe12c19ea981f19a9d7a26c54a4a0c2e8568abf0c3929392660fb08d3c07"} err="failed to get container status \"8b6cbe12c19ea981f19a9d7a26c54a4a0c2e8568abf0c3929392660fb08d3c07\": rpc error: code = NotFound desc = could not find container \"8b6cbe12c19ea981f19a9d7a26c54a4a0c2e8568abf0c3929392660fb08d3c07\": container with ID starting with 8b6cbe12c19ea981f19a9d7a26c54a4a0c2e8568abf0c3929392660fb08d3c07 not found: 
ID does not exist" Oct 08 13:57:10 crc kubenswrapper[4735]: I1008 13:57:10.540840 4735 scope.go:117] "RemoveContainer" containerID="b4d20edfe60014c9f2c07ad3edaeeb431100bd82c3d8f92278ae2823d98430fc" Oct 08 13:57:10 crc kubenswrapper[4735]: E1008 13:57:10.541185 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4d20edfe60014c9f2c07ad3edaeeb431100bd82c3d8f92278ae2823d98430fc\": container with ID starting with b4d20edfe60014c9f2c07ad3edaeeb431100bd82c3d8f92278ae2823d98430fc not found: ID does not exist" containerID="b4d20edfe60014c9f2c07ad3edaeeb431100bd82c3d8f92278ae2823d98430fc" Oct 08 13:57:10 crc kubenswrapper[4735]: I1008 13:57:10.541237 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4d20edfe60014c9f2c07ad3edaeeb431100bd82c3d8f92278ae2823d98430fc"} err="failed to get container status \"b4d20edfe60014c9f2c07ad3edaeeb431100bd82c3d8f92278ae2823d98430fc\": rpc error: code = NotFound desc = could not find container \"b4d20edfe60014c9f2c07ad3edaeeb431100bd82c3d8f92278ae2823d98430fc\": container with ID starting with b4d20edfe60014c9f2c07ad3edaeeb431100bd82c3d8f92278ae2823d98430fc not found: ID does not exist" Oct 08 13:57:10 crc kubenswrapper[4735]: I1008 13:57:10.541264 4735 scope.go:117] "RemoveContainer" containerID="e2b31f71166a88345d8baff00f65c2e63b918f791d676f6a4f06576aef988448" Oct 08 13:57:10 crc kubenswrapper[4735]: E1008 13:57:10.541567 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2b31f71166a88345d8baff00f65c2e63b918f791d676f6a4f06576aef988448\": container with ID starting with e2b31f71166a88345d8baff00f65c2e63b918f791d676f6a4f06576aef988448 not found: ID does not exist" containerID="e2b31f71166a88345d8baff00f65c2e63b918f791d676f6a4f06576aef988448" Oct 08 13:57:10 crc kubenswrapper[4735]: I1008 13:57:10.541602 4735 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2b31f71166a88345d8baff00f65c2e63b918f791d676f6a4f06576aef988448"} err="failed to get container status \"e2b31f71166a88345d8baff00f65c2e63b918f791d676f6a4f06576aef988448\": rpc error: code = NotFound desc = could not find container \"e2b31f71166a88345d8baff00f65c2e63b918f791d676f6a4f06576aef988448\": container with ID starting with e2b31f71166a88345d8baff00f65c2e63b918f791d676f6a4f06576aef988448 not found: ID does not exist" Oct 08 13:57:10 crc kubenswrapper[4735]: I1008 13:57:10.904412 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1e7eec9-2ab8-4741-abd1-8e8cec0302b4" path="/var/lib/kubelet/pods/e1e7eec9-2ab8-4741-abd1-8e8cec0302b4/volumes" Oct 08 13:57:38 crc kubenswrapper[4735]: I1008 13:57:38.741867 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-s8hnr"] Oct 08 13:57:38 crc kubenswrapper[4735]: E1008 13:57:38.742848 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1e7eec9-2ab8-4741-abd1-8e8cec0302b4" containerName="extract-content" Oct 08 13:57:38 crc kubenswrapper[4735]: I1008 13:57:38.742868 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1e7eec9-2ab8-4741-abd1-8e8cec0302b4" containerName="extract-content" Oct 08 13:57:38 crc kubenswrapper[4735]: E1008 13:57:38.742918 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1e7eec9-2ab8-4741-abd1-8e8cec0302b4" containerName="extract-utilities" Oct 08 13:57:38 crc kubenswrapper[4735]: I1008 13:57:38.742931 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1e7eec9-2ab8-4741-abd1-8e8cec0302b4" containerName="extract-utilities" Oct 08 13:57:38 crc kubenswrapper[4735]: E1008 13:57:38.742957 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1e7eec9-2ab8-4741-abd1-8e8cec0302b4" containerName="registry-server" Oct 08 13:57:38 crc kubenswrapper[4735]: I1008 13:57:38.742970 4735 
state_mem.go:107] "Deleted CPUSet assignment" podUID="e1e7eec9-2ab8-4741-abd1-8e8cec0302b4" containerName="registry-server" Oct 08 13:57:38 crc kubenswrapper[4735]: I1008 13:57:38.743231 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1e7eec9-2ab8-4741-abd1-8e8cec0302b4" containerName="registry-server" Oct 08 13:57:38 crc kubenswrapper[4735]: I1008 13:57:38.745025 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s8hnr" Oct 08 13:57:38 crc kubenswrapper[4735]: I1008 13:57:38.759242 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s8hnr"] Oct 08 13:57:38 crc kubenswrapper[4735]: I1008 13:57:38.879534 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be9de640-cb77-44f5-9d3e-06b95b7007a3-utilities\") pod \"certified-operators-s8hnr\" (UID: \"be9de640-cb77-44f5-9d3e-06b95b7007a3\") " pod="openshift-marketplace/certified-operators-s8hnr" Oct 08 13:57:38 crc kubenswrapper[4735]: I1008 13:57:38.879589 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be9de640-cb77-44f5-9d3e-06b95b7007a3-catalog-content\") pod \"certified-operators-s8hnr\" (UID: \"be9de640-cb77-44f5-9d3e-06b95b7007a3\") " pod="openshift-marketplace/certified-operators-s8hnr" Oct 08 13:57:38 crc kubenswrapper[4735]: I1008 13:57:38.879756 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnxt2\" (UniqueName: \"kubernetes.io/projected/be9de640-cb77-44f5-9d3e-06b95b7007a3-kube-api-access-wnxt2\") pod \"certified-operators-s8hnr\" (UID: \"be9de640-cb77-44f5-9d3e-06b95b7007a3\") " pod="openshift-marketplace/certified-operators-s8hnr" Oct 08 13:57:38 crc kubenswrapper[4735]: I1008 
13:57:38.981672 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be9de640-cb77-44f5-9d3e-06b95b7007a3-utilities\") pod \"certified-operators-s8hnr\" (UID: \"be9de640-cb77-44f5-9d3e-06b95b7007a3\") " pod="openshift-marketplace/certified-operators-s8hnr" Oct 08 13:57:38 crc kubenswrapper[4735]: I1008 13:57:38.982163 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be9de640-cb77-44f5-9d3e-06b95b7007a3-catalog-content\") pod \"certified-operators-s8hnr\" (UID: \"be9de640-cb77-44f5-9d3e-06b95b7007a3\") " pod="openshift-marketplace/certified-operators-s8hnr" Oct 08 13:57:38 crc kubenswrapper[4735]: I1008 13:57:38.982382 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be9de640-cb77-44f5-9d3e-06b95b7007a3-utilities\") pod \"certified-operators-s8hnr\" (UID: \"be9de640-cb77-44f5-9d3e-06b95b7007a3\") " pod="openshift-marketplace/certified-operators-s8hnr" Oct 08 13:57:38 crc kubenswrapper[4735]: I1008 13:57:38.982585 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be9de640-cb77-44f5-9d3e-06b95b7007a3-catalog-content\") pod \"certified-operators-s8hnr\" (UID: \"be9de640-cb77-44f5-9d3e-06b95b7007a3\") " pod="openshift-marketplace/certified-operators-s8hnr" Oct 08 13:57:38 crc kubenswrapper[4735]: I1008 13:57:38.982749 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnxt2\" (UniqueName: \"kubernetes.io/projected/be9de640-cb77-44f5-9d3e-06b95b7007a3-kube-api-access-wnxt2\") pod \"certified-operators-s8hnr\" (UID: \"be9de640-cb77-44f5-9d3e-06b95b7007a3\") " pod="openshift-marketplace/certified-operators-s8hnr" Oct 08 13:57:39 crc kubenswrapper[4735]: I1008 13:57:39.006062 4735 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnxt2\" (UniqueName: \"kubernetes.io/projected/be9de640-cb77-44f5-9d3e-06b95b7007a3-kube-api-access-wnxt2\") pod \"certified-operators-s8hnr\" (UID: \"be9de640-cb77-44f5-9d3e-06b95b7007a3\") " pod="openshift-marketplace/certified-operators-s8hnr" Oct 08 13:57:39 crc kubenswrapper[4735]: I1008 13:57:39.075505 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s8hnr" Oct 08 13:57:39 crc kubenswrapper[4735]: I1008 13:57:39.566199 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s8hnr"] Oct 08 13:57:39 crc kubenswrapper[4735]: I1008 13:57:39.681370 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s8hnr" event={"ID":"be9de640-cb77-44f5-9d3e-06b95b7007a3","Type":"ContainerStarted","Data":"3688fecd3fda230c9a2d1eb0a8db2e1c2274cdb255f74fff25936d6e1e2307b4"} Oct 08 13:57:40 crc kubenswrapper[4735]: I1008 13:57:40.691083 4735 generic.go:334] "Generic (PLEG): container finished" podID="be9de640-cb77-44f5-9d3e-06b95b7007a3" containerID="f322f47a642ca51e42c4737abe3960054998bd3d7025fe40eaa820448ee4e65f" exitCode=0 Oct 08 13:57:40 crc kubenswrapper[4735]: I1008 13:57:40.691131 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s8hnr" event={"ID":"be9de640-cb77-44f5-9d3e-06b95b7007a3","Type":"ContainerDied","Data":"f322f47a642ca51e42c4737abe3960054998bd3d7025fe40eaa820448ee4e65f"} Oct 08 13:57:41 crc kubenswrapper[4735]: I1008 13:57:41.700746 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s8hnr" event={"ID":"be9de640-cb77-44f5-9d3e-06b95b7007a3","Type":"ContainerStarted","Data":"f101a45e1c26ce5a0e856c6af02424a87d41333a464ccb5f8aac1a76db9aa6f0"} Oct 08 13:57:42 crc kubenswrapper[4735]: I1008 13:57:42.725791 4735 
generic.go:334] "Generic (PLEG): container finished" podID="be9de640-cb77-44f5-9d3e-06b95b7007a3" containerID="f101a45e1c26ce5a0e856c6af02424a87d41333a464ccb5f8aac1a76db9aa6f0" exitCode=0 Oct 08 13:57:42 crc kubenswrapper[4735]: I1008 13:57:42.725863 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s8hnr" event={"ID":"be9de640-cb77-44f5-9d3e-06b95b7007a3","Type":"ContainerDied","Data":"f101a45e1c26ce5a0e856c6af02424a87d41333a464ccb5f8aac1a76db9aa6f0"} Oct 08 13:57:43 crc kubenswrapper[4735]: I1008 13:57:43.737293 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s8hnr" event={"ID":"be9de640-cb77-44f5-9d3e-06b95b7007a3","Type":"ContainerStarted","Data":"d43eb808b0b281f55899089acb98de02e6b3e1d4f3434e37df03af7e4503c988"} Oct 08 13:57:43 crc kubenswrapper[4735]: I1008 13:57:43.762152 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-s8hnr" podStartSLOduration=3.191268567 podStartE2EDuration="5.762131052s" podCreationTimestamp="2025-10-08 13:57:38 +0000 UTC" firstStartedPulling="2025-10-08 13:57:40.692496369 +0000 UTC m=+3710.500800913" lastFinishedPulling="2025-10-08 13:57:43.263358854 +0000 UTC m=+3713.071663398" observedRunningTime="2025-10-08 13:57:43.756816705 +0000 UTC m=+3713.565121259" watchObservedRunningTime="2025-10-08 13:57:43.762131052 +0000 UTC m=+3713.570435596" Oct 08 13:57:49 crc kubenswrapper[4735]: I1008 13:57:49.076629 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-s8hnr" Oct 08 13:57:49 crc kubenswrapper[4735]: I1008 13:57:49.077207 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-s8hnr" Oct 08 13:57:49 crc kubenswrapper[4735]: I1008 13:57:49.148867 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-s8hnr" Oct 08 13:57:49 crc kubenswrapper[4735]: I1008 13:57:49.830588 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-s8hnr" Oct 08 13:57:49 crc kubenswrapper[4735]: I1008 13:57:49.872842 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s8hnr"] Oct 08 13:57:51 crc kubenswrapper[4735]: I1008 13:57:51.803756 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-s8hnr" podUID="be9de640-cb77-44f5-9d3e-06b95b7007a3" containerName="registry-server" containerID="cri-o://d43eb808b0b281f55899089acb98de02e6b3e1d4f3434e37df03af7e4503c988" gracePeriod=2 Oct 08 13:57:52 crc kubenswrapper[4735]: I1008 13:57:52.171614 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s8hnr" Oct 08 13:57:52 crc kubenswrapper[4735]: I1008 13:57:52.311131 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnxt2\" (UniqueName: \"kubernetes.io/projected/be9de640-cb77-44f5-9d3e-06b95b7007a3-kube-api-access-wnxt2\") pod \"be9de640-cb77-44f5-9d3e-06b95b7007a3\" (UID: \"be9de640-cb77-44f5-9d3e-06b95b7007a3\") " Oct 08 13:57:52 crc kubenswrapper[4735]: I1008 13:57:52.311235 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be9de640-cb77-44f5-9d3e-06b95b7007a3-catalog-content\") pod \"be9de640-cb77-44f5-9d3e-06b95b7007a3\" (UID: \"be9de640-cb77-44f5-9d3e-06b95b7007a3\") " Oct 08 13:57:52 crc kubenswrapper[4735]: I1008 13:57:52.311299 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be9de640-cb77-44f5-9d3e-06b95b7007a3-utilities\") pod 
\"be9de640-cb77-44f5-9d3e-06b95b7007a3\" (UID: \"be9de640-cb77-44f5-9d3e-06b95b7007a3\") " Oct 08 13:57:52 crc kubenswrapper[4735]: I1008 13:57:52.313124 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be9de640-cb77-44f5-9d3e-06b95b7007a3-utilities" (OuterVolumeSpecName: "utilities") pod "be9de640-cb77-44f5-9d3e-06b95b7007a3" (UID: "be9de640-cb77-44f5-9d3e-06b95b7007a3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:57:52 crc kubenswrapper[4735]: I1008 13:57:52.321330 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be9de640-cb77-44f5-9d3e-06b95b7007a3-kube-api-access-wnxt2" (OuterVolumeSpecName: "kube-api-access-wnxt2") pod "be9de640-cb77-44f5-9d3e-06b95b7007a3" (UID: "be9de640-cb77-44f5-9d3e-06b95b7007a3"). InnerVolumeSpecName "kube-api-access-wnxt2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 13:57:52 crc kubenswrapper[4735]: I1008 13:57:52.383327 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be9de640-cb77-44f5-9d3e-06b95b7007a3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "be9de640-cb77-44f5-9d3e-06b95b7007a3" (UID: "be9de640-cb77-44f5-9d3e-06b95b7007a3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 13:57:52 crc kubenswrapper[4735]: I1008 13:57:52.413087 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be9de640-cb77-44f5-9d3e-06b95b7007a3-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 13:57:52 crc kubenswrapper[4735]: I1008 13:57:52.413117 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be9de640-cb77-44f5-9d3e-06b95b7007a3-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 13:57:52 crc kubenswrapper[4735]: I1008 13:57:52.413130 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnxt2\" (UniqueName: \"kubernetes.io/projected/be9de640-cb77-44f5-9d3e-06b95b7007a3-kube-api-access-wnxt2\") on node \"crc\" DevicePath \"\"" Oct 08 13:57:52 crc kubenswrapper[4735]: I1008 13:57:52.815887 4735 generic.go:334] "Generic (PLEG): container finished" podID="be9de640-cb77-44f5-9d3e-06b95b7007a3" containerID="d43eb808b0b281f55899089acb98de02e6b3e1d4f3434e37df03af7e4503c988" exitCode=0 Oct 08 13:57:52 crc kubenswrapper[4735]: I1008 13:57:52.815943 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s8hnr" event={"ID":"be9de640-cb77-44f5-9d3e-06b95b7007a3","Type":"ContainerDied","Data":"d43eb808b0b281f55899089acb98de02e6b3e1d4f3434e37df03af7e4503c988"} Oct 08 13:57:52 crc kubenswrapper[4735]: I1008 13:57:52.816021 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s8hnr" event={"ID":"be9de640-cb77-44f5-9d3e-06b95b7007a3","Type":"ContainerDied","Data":"3688fecd3fda230c9a2d1eb0a8db2e1c2274cdb255f74fff25936d6e1e2307b4"} Oct 08 13:57:52 crc kubenswrapper[4735]: I1008 13:57:52.816025 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s8hnr" Oct 08 13:57:52 crc kubenswrapper[4735]: I1008 13:57:52.816043 4735 scope.go:117] "RemoveContainer" containerID="d43eb808b0b281f55899089acb98de02e6b3e1d4f3434e37df03af7e4503c988" Oct 08 13:57:52 crc kubenswrapper[4735]: I1008 13:57:52.840986 4735 scope.go:117] "RemoveContainer" containerID="f101a45e1c26ce5a0e856c6af02424a87d41333a464ccb5f8aac1a76db9aa6f0" Oct 08 13:57:52 crc kubenswrapper[4735]: I1008 13:57:52.874235 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s8hnr"] Oct 08 13:57:52 crc kubenswrapper[4735]: I1008 13:57:52.882643 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-s8hnr"] Oct 08 13:57:52 crc kubenswrapper[4735]: I1008 13:57:52.882750 4735 scope.go:117] "RemoveContainer" containerID="f322f47a642ca51e42c4737abe3960054998bd3d7025fe40eaa820448ee4e65f" Oct 08 13:57:52 crc kubenswrapper[4735]: I1008 13:57:52.906105 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be9de640-cb77-44f5-9d3e-06b95b7007a3" path="/var/lib/kubelet/pods/be9de640-cb77-44f5-9d3e-06b95b7007a3/volumes" Oct 08 13:57:52 crc kubenswrapper[4735]: I1008 13:57:52.913215 4735 scope.go:117] "RemoveContainer" containerID="d43eb808b0b281f55899089acb98de02e6b3e1d4f3434e37df03af7e4503c988" Oct 08 13:57:52 crc kubenswrapper[4735]: E1008 13:57:52.913573 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d43eb808b0b281f55899089acb98de02e6b3e1d4f3434e37df03af7e4503c988\": container with ID starting with d43eb808b0b281f55899089acb98de02e6b3e1d4f3434e37df03af7e4503c988 not found: ID does not exist" containerID="d43eb808b0b281f55899089acb98de02e6b3e1d4f3434e37df03af7e4503c988" Oct 08 13:57:52 crc kubenswrapper[4735]: I1008 13:57:52.913620 4735 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d43eb808b0b281f55899089acb98de02e6b3e1d4f3434e37df03af7e4503c988"} err="failed to get container status \"d43eb808b0b281f55899089acb98de02e6b3e1d4f3434e37df03af7e4503c988\": rpc error: code = NotFound desc = could not find container \"d43eb808b0b281f55899089acb98de02e6b3e1d4f3434e37df03af7e4503c988\": container with ID starting with d43eb808b0b281f55899089acb98de02e6b3e1d4f3434e37df03af7e4503c988 not found: ID does not exist" Oct 08 13:57:52 crc kubenswrapper[4735]: I1008 13:57:52.913653 4735 scope.go:117] "RemoveContainer" containerID="f101a45e1c26ce5a0e856c6af02424a87d41333a464ccb5f8aac1a76db9aa6f0" Oct 08 13:57:52 crc kubenswrapper[4735]: E1008 13:57:52.914215 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f101a45e1c26ce5a0e856c6af02424a87d41333a464ccb5f8aac1a76db9aa6f0\": container with ID starting with f101a45e1c26ce5a0e856c6af02424a87d41333a464ccb5f8aac1a76db9aa6f0 not found: ID does not exist" containerID="f101a45e1c26ce5a0e856c6af02424a87d41333a464ccb5f8aac1a76db9aa6f0" Oct 08 13:57:52 crc kubenswrapper[4735]: I1008 13:57:52.914269 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f101a45e1c26ce5a0e856c6af02424a87d41333a464ccb5f8aac1a76db9aa6f0"} err="failed to get container status \"f101a45e1c26ce5a0e856c6af02424a87d41333a464ccb5f8aac1a76db9aa6f0\": rpc error: code = NotFound desc = could not find container \"f101a45e1c26ce5a0e856c6af02424a87d41333a464ccb5f8aac1a76db9aa6f0\": container with ID starting with f101a45e1c26ce5a0e856c6af02424a87d41333a464ccb5f8aac1a76db9aa6f0 not found: ID does not exist" Oct 08 13:57:52 crc kubenswrapper[4735]: I1008 13:57:52.914317 4735 scope.go:117] "RemoveContainer" containerID="f322f47a642ca51e42c4737abe3960054998bd3d7025fe40eaa820448ee4e65f" Oct 08 13:57:52 crc kubenswrapper[4735]: E1008 13:57:52.914751 4735 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"f322f47a642ca51e42c4737abe3960054998bd3d7025fe40eaa820448ee4e65f\": container with ID starting with f322f47a642ca51e42c4737abe3960054998bd3d7025fe40eaa820448ee4e65f not found: ID does not exist" containerID="f322f47a642ca51e42c4737abe3960054998bd3d7025fe40eaa820448ee4e65f" Oct 08 13:57:52 crc kubenswrapper[4735]: I1008 13:57:52.914817 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f322f47a642ca51e42c4737abe3960054998bd3d7025fe40eaa820448ee4e65f"} err="failed to get container status \"f322f47a642ca51e42c4737abe3960054998bd3d7025fe40eaa820448ee4e65f\": rpc error: code = NotFound desc = could not find container \"f322f47a642ca51e42c4737abe3960054998bd3d7025fe40eaa820448ee4e65f\": container with ID starting with f322f47a642ca51e42c4737abe3960054998bd3d7025fe40eaa820448ee4e65f not found: ID does not exist" Oct 08 13:58:54 crc kubenswrapper[4735]: I1008 13:58:54.879003 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 13:58:54 crc kubenswrapper[4735]: I1008 13:58:54.881562 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 13:59:24 crc kubenswrapper[4735]: I1008 13:59:24.878112 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Oct 08 13:59:24 crc kubenswrapper[4735]: I1008 13:59:24.878851 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 13:59:54 crc kubenswrapper[4735]: I1008 13:59:54.878927 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 13:59:54 crc kubenswrapper[4735]: I1008 13:59:54.879625 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 13:59:54 crc kubenswrapper[4735]: I1008 13:59:54.879683 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" Oct 08 13:59:54 crc kubenswrapper[4735]: I1008 13:59:54.880338 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1b85e2fef173dcccb95dff7094d06daa381225d3e83968f8698fb1f4a0882dc8"} pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 13:59:54 crc kubenswrapper[4735]: I1008 13:59:54.880401 4735 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" containerID="cri-o://1b85e2fef173dcccb95dff7094d06daa381225d3e83968f8698fb1f4a0882dc8" gracePeriod=600 Oct 08 13:59:55 crc kubenswrapper[4735]: E1008 13:59:55.000940 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 13:59:55 crc kubenswrapper[4735]: I1008 13:59:55.732072 4735 generic.go:334] "Generic (PLEG): container finished" podID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerID="1b85e2fef173dcccb95dff7094d06daa381225d3e83968f8698fb1f4a0882dc8" exitCode=0 Oct 08 13:59:55 crc kubenswrapper[4735]: I1008 13:59:55.732150 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" event={"ID":"2dfd0474-f64a-4d3b-9990-df182d079b10","Type":"ContainerDied","Data":"1b85e2fef173dcccb95dff7094d06daa381225d3e83968f8698fb1f4a0882dc8"} Oct 08 13:59:55 crc kubenswrapper[4735]: I1008 13:59:55.732195 4735 scope.go:117] "RemoveContainer" containerID="66adade894154f5148594423540bbb9f089da97db2df2ad92330740d1619ac44" Oct 08 13:59:55 crc kubenswrapper[4735]: I1008 13:59:55.733195 4735 scope.go:117] "RemoveContainer" containerID="1b85e2fef173dcccb95dff7094d06daa381225d3e83968f8698fb1f4a0882dc8" Oct 08 13:59:55 crc kubenswrapper[4735]: E1008 13:59:55.733592 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:00:00 crc kubenswrapper[4735]: I1008 14:00:00.152217 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332200-s77gq"] Oct 08 14:00:00 crc kubenswrapper[4735]: E1008 14:00:00.153167 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be9de640-cb77-44f5-9d3e-06b95b7007a3" containerName="extract-utilities" Oct 08 14:00:00 crc kubenswrapper[4735]: I1008 14:00:00.153183 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="be9de640-cb77-44f5-9d3e-06b95b7007a3" containerName="extract-utilities" Oct 08 14:00:00 crc kubenswrapper[4735]: E1008 14:00:00.153199 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be9de640-cb77-44f5-9d3e-06b95b7007a3" containerName="extract-content" Oct 08 14:00:00 crc kubenswrapper[4735]: I1008 14:00:00.153207 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="be9de640-cb77-44f5-9d3e-06b95b7007a3" containerName="extract-content" Oct 08 14:00:00 crc kubenswrapper[4735]: E1008 14:00:00.153230 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be9de640-cb77-44f5-9d3e-06b95b7007a3" containerName="registry-server" Oct 08 14:00:00 crc kubenswrapper[4735]: I1008 14:00:00.153237 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="be9de640-cb77-44f5-9d3e-06b95b7007a3" containerName="registry-server" Oct 08 14:00:00 crc kubenswrapper[4735]: I1008 14:00:00.153409 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="be9de640-cb77-44f5-9d3e-06b95b7007a3" containerName="registry-server" Oct 08 14:00:00 crc kubenswrapper[4735]: I1008 14:00:00.154023 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332200-s77gq" Oct 08 14:00:00 crc kubenswrapper[4735]: I1008 14:00:00.158935 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 08 14:00:00 crc kubenswrapper[4735]: I1008 14:00:00.159070 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 08 14:00:00 crc kubenswrapper[4735]: I1008 14:00:00.165726 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332200-s77gq"] Oct 08 14:00:00 crc kubenswrapper[4735]: I1008 14:00:00.313949 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpqf5\" (UniqueName: \"kubernetes.io/projected/36157948-f34a-4697-91c1-09f731003616-kube-api-access-cpqf5\") pod \"collect-profiles-29332200-s77gq\" (UID: \"36157948-f34a-4697-91c1-09f731003616\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332200-s77gq" Oct 08 14:00:00 crc kubenswrapper[4735]: I1008 14:00:00.314132 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/36157948-f34a-4697-91c1-09f731003616-secret-volume\") pod \"collect-profiles-29332200-s77gq\" (UID: \"36157948-f34a-4697-91c1-09f731003616\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332200-s77gq" Oct 08 14:00:00 crc kubenswrapper[4735]: I1008 14:00:00.314205 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/36157948-f34a-4697-91c1-09f731003616-config-volume\") pod \"collect-profiles-29332200-s77gq\" (UID: \"36157948-f34a-4697-91c1-09f731003616\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29332200-s77gq" Oct 08 14:00:00 crc kubenswrapper[4735]: I1008 14:00:00.415159 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpqf5\" (UniqueName: \"kubernetes.io/projected/36157948-f34a-4697-91c1-09f731003616-kube-api-access-cpqf5\") pod \"collect-profiles-29332200-s77gq\" (UID: \"36157948-f34a-4697-91c1-09f731003616\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332200-s77gq" Oct 08 14:00:00 crc kubenswrapper[4735]: I1008 14:00:00.415225 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/36157948-f34a-4697-91c1-09f731003616-secret-volume\") pod \"collect-profiles-29332200-s77gq\" (UID: \"36157948-f34a-4697-91c1-09f731003616\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332200-s77gq" Oct 08 14:00:00 crc kubenswrapper[4735]: I1008 14:00:00.415263 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/36157948-f34a-4697-91c1-09f731003616-config-volume\") pod \"collect-profiles-29332200-s77gq\" (UID: \"36157948-f34a-4697-91c1-09f731003616\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332200-s77gq" Oct 08 14:00:00 crc kubenswrapper[4735]: I1008 14:00:00.416266 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/36157948-f34a-4697-91c1-09f731003616-config-volume\") pod \"collect-profiles-29332200-s77gq\" (UID: \"36157948-f34a-4697-91c1-09f731003616\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332200-s77gq" Oct 08 14:00:00 crc kubenswrapper[4735]: I1008 14:00:00.425422 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/36157948-f34a-4697-91c1-09f731003616-secret-volume\") pod \"collect-profiles-29332200-s77gq\" (UID: \"36157948-f34a-4697-91c1-09f731003616\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332200-s77gq" Oct 08 14:00:00 crc kubenswrapper[4735]: I1008 14:00:00.432753 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpqf5\" (UniqueName: \"kubernetes.io/projected/36157948-f34a-4697-91c1-09f731003616-kube-api-access-cpqf5\") pod \"collect-profiles-29332200-s77gq\" (UID: \"36157948-f34a-4697-91c1-09f731003616\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332200-s77gq" Oct 08 14:00:00 crc kubenswrapper[4735]: I1008 14:00:00.525676 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332200-s77gq" Oct 08 14:00:00 crc kubenswrapper[4735]: I1008 14:00:00.964392 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332200-s77gq"] Oct 08 14:00:01 crc kubenswrapper[4735]: I1008 14:00:01.780933 4735 generic.go:334] "Generic (PLEG): container finished" podID="36157948-f34a-4697-91c1-09f731003616" containerID="463c9555e2173c60b80f50c599b3f4f1ef26920c900a0893e0472e7225a9f48d" exitCode=0 Oct 08 14:00:01 crc kubenswrapper[4735]: I1008 14:00:01.781067 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332200-s77gq" event={"ID":"36157948-f34a-4697-91c1-09f731003616","Type":"ContainerDied","Data":"463c9555e2173c60b80f50c599b3f4f1ef26920c900a0893e0472e7225a9f48d"} Oct 08 14:00:01 crc kubenswrapper[4735]: I1008 14:00:01.781278 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332200-s77gq" 
event={"ID":"36157948-f34a-4697-91c1-09f731003616","Type":"ContainerStarted","Data":"869dd50764d95b6f47f19834c1e593ed6cf332596f6301b32af3910172b0aa75"} Oct 08 14:00:03 crc kubenswrapper[4735]: I1008 14:00:03.049688 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332200-s77gq" Oct 08 14:00:03 crc kubenswrapper[4735]: I1008 14:00:03.055619 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/36157948-f34a-4697-91c1-09f731003616-secret-volume\") pod \"36157948-f34a-4697-91c1-09f731003616\" (UID: \"36157948-f34a-4697-91c1-09f731003616\") " Oct 08 14:00:03 crc kubenswrapper[4735]: I1008 14:00:03.055739 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/36157948-f34a-4697-91c1-09f731003616-config-volume\") pod \"36157948-f34a-4697-91c1-09f731003616\" (UID: \"36157948-f34a-4697-91c1-09f731003616\") " Oct 08 14:00:03 crc kubenswrapper[4735]: I1008 14:00:03.055799 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpqf5\" (UniqueName: \"kubernetes.io/projected/36157948-f34a-4697-91c1-09f731003616-kube-api-access-cpqf5\") pod \"36157948-f34a-4697-91c1-09f731003616\" (UID: \"36157948-f34a-4697-91c1-09f731003616\") " Oct 08 14:00:03 crc kubenswrapper[4735]: I1008 14:00:03.056551 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36157948-f34a-4697-91c1-09f731003616-config-volume" (OuterVolumeSpecName: "config-volume") pod "36157948-f34a-4697-91c1-09f731003616" (UID: "36157948-f34a-4697-91c1-09f731003616"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:00:03 crc kubenswrapper[4735]: I1008 14:00:03.061386 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36157948-f34a-4697-91c1-09f731003616-kube-api-access-cpqf5" (OuterVolumeSpecName: "kube-api-access-cpqf5") pod "36157948-f34a-4697-91c1-09f731003616" (UID: "36157948-f34a-4697-91c1-09f731003616"). InnerVolumeSpecName "kube-api-access-cpqf5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:00:03 crc kubenswrapper[4735]: I1008 14:00:03.062027 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36157948-f34a-4697-91c1-09f731003616-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "36157948-f34a-4697-91c1-09f731003616" (UID: "36157948-f34a-4697-91c1-09f731003616"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:00:03 crc kubenswrapper[4735]: I1008 14:00:03.157941 4735 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/36157948-f34a-4697-91c1-09f731003616-config-volume\") on node \"crc\" DevicePath \"\"" Oct 08 14:00:03 crc kubenswrapper[4735]: I1008 14:00:03.157984 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpqf5\" (UniqueName: \"kubernetes.io/projected/36157948-f34a-4697-91c1-09f731003616-kube-api-access-cpqf5\") on node \"crc\" DevicePath \"\"" Oct 08 14:00:03 crc kubenswrapper[4735]: I1008 14:00:03.158001 4735 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/36157948-f34a-4697-91c1-09f731003616-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 08 14:00:03 crc kubenswrapper[4735]: I1008 14:00:03.794837 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332200-s77gq" 
event={"ID":"36157948-f34a-4697-91c1-09f731003616","Type":"ContainerDied","Data":"869dd50764d95b6f47f19834c1e593ed6cf332596f6301b32af3910172b0aa75"} Oct 08 14:00:03 crc kubenswrapper[4735]: I1008 14:00:03.794878 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="869dd50764d95b6f47f19834c1e593ed6cf332596f6301b32af3910172b0aa75" Oct 08 14:00:03 crc kubenswrapper[4735]: I1008 14:00:03.794890 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332200-s77gq" Oct 08 14:00:04 crc kubenswrapper[4735]: I1008 14:00:04.123339 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332155-s58r2"] Oct 08 14:00:04 crc kubenswrapper[4735]: I1008 14:00:04.127867 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332155-s58r2"] Oct 08 14:00:04 crc kubenswrapper[4735]: I1008 14:00:04.903148 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9dd65ea9-bdfb-4d1d-b09e-816693069b07" path="/var/lib/kubelet/pods/9dd65ea9-bdfb-4d1d-b09e-816693069b07/volumes" Oct 08 14:00:06 crc kubenswrapper[4735]: I1008 14:00:06.894895 4735 scope.go:117] "RemoveContainer" containerID="1b85e2fef173dcccb95dff7094d06daa381225d3e83968f8698fb1f4a0882dc8" Oct 08 14:00:06 crc kubenswrapper[4735]: E1008 14:00:06.895246 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:00:07 crc kubenswrapper[4735]: I1008 14:00:07.206931 4735 scope.go:117] "RemoveContainer" 
containerID="272e4b3b7ef5916e8d77733e1bc56a06e768dc5c9767d56714b089a41fe76265" Oct 08 14:00:19 crc kubenswrapper[4735]: I1008 14:00:19.895198 4735 scope.go:117] "RemoveContainer" containerID="1b85e2fef173dcccb95dff7094d06daa381225d3e83968f8698fb1f4a0882dc8" Oct 08 14:00:19 crc kubenswrapper[4735]: E1008 14:00:19.895914 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:00:34 crc kubenswrapper[4735]: I1008 14:00:34.895153 4735 scope.go:117] "RemoveContainer" containerID="1b85e2fef173dcccb95dff7094d06daa381225d3e83968f8698fb1f4a0882dc8" Oct 08 14:00:34 crc kubenswrapper[4735]: E1008 14:00:34.896748 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:00:49 crc kubenswrapper[4735]: I1008 14:00:49.894065 4735 scope.go:117] "RemoveContainer" containerID="1b85e2fef173dcccb95dff7094d06daa381225d3e83968f8698fb1f4a0882dc8" Oct 08 14:00:49 crc kubenswrapper[4735]: E1008 14:00:49.895963 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:01:02 crc kubenswrapper[4735]: I1008 14:01:02.895015 4735 scope.go:117] "RemoveContainer" containerID="1b85e2fef173dcccb95dff7094d06daa381225d3e83968f8698fb1f4a0882dc8" Oct 08 14:01:02 crc kubenswrapper[4735]: E1008 14:01:02.895833 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:01:16 crc kubenswrapper[4735]: I1008 14:01:16.895493 4735 scope.go:117] "RemoveContainer" containerID="1b85e2fef173dcccb95dff7094d06daa381225d3e83968f8698fb1f4a0882dc8" Oct 08 14:01:16 crc kubenswrapper[4735]: E1008 14:01:16.896190 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:01:31 crc kubenswrapper[4735]: I1008 14:01:31.895398 4735 scope.go:117] "RemoveContainer" containerID="1b85e2fef173dcccb95dff7094d06daa381225d3e83968f8698fb1f4a0882dc8" Oct 08 14:01:31 crc kubenswrapper[4735]: E1008 14:01:31.896697 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:01:46 crc kubenswrapper[4735]: I1008 14:01:46.895658 4735 scope.go:117] "RemoveContainer" containerID="1b85e2fef173dcccb95dff7094d06daa381225d3e83968f8698fb1f4a0882dc8" Oct 08 14:01:46 crc kubenswrapper[4735]: E1008 14:01:46.896306 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:01:59 crc kubenswrapper[4735]: I1008 14:01:59.894655 4735 scope.go:117] "RemoveContainer" containerID="1b85e2fef173dcccb95dff7094d06daa381225d3e83968f8698fb1f4a0882dc8" Oct 08 14:01:59 crc kubenswrapper[4735]: E1008 14:01:59.895386 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:02:10 crc kubenswrapper[4735]: I1008 14:02:10.898681 4735 scope.go:117] "RemoveContainer" containerID="1b85e2fef173dcccb95dff7094d06daa381225d3e83968f8698fb1f4a0882dc8" Oct 08 14:02:10 crc kubenswrapper[4735]: E1008 14:02:10.899370 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:02:22 crc kubenswrapper[4735]: I1008 14:02:22.897182 4735 scope.go:117] "RemoveContainer" containerID="1b85e2fef173dcccb95dff7094d06daa381225d3e83968f8698fb1f4a0882dc8" Oct 08 14:02:22 crc kubenswrapper[4735]: E1008 14:02:22.897994 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:02:36 crc kubenswrapper[4735]: I1008 14:02:36.894649 4735 scope.go:117] "RemoveContainer" containerID="1b85e2fef173dcccb95dff7094d06daa381225d3e83968f8698fb1f4a0882dc8" Oct 08 14:02:36 crc kubenswrapper[4735]: E1008 14:02:36.895686 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:02:48 crc kubenswrapper[4735]: I1008 14:02:48.894944 4735 scope.go:117] "RemoveContainer" containerID="1b85e2fef173dcccb95dff7094d06daa381225d3e83968f8698fb1f4a0882dc8" Oct 08 14:02:48 crc kubenswrapper[4735]: E1008 14:02:48.895735 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:02:56 crc kubenswrapper[4735]: I1008 14:02:56.664622 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-l74dp"] Oct 08 14:02:56 crc kubenswrapper[4735]: E1008 14:02:56.665550 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36157948-f34a-4697-91c1-09f731003616" containerName="collect-profiles" Oct 08 14:02:56 crc kubenswrapper[4735]: I1008 14:02:56.665566 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="36157948-f34a-4697-91c1-09f731003616" containerName="collect-profiles" Oct 08 14:02:56 crc kubenswrapper[4735]: I1008 14:02:56.665753 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="36157948-f34a-4697-91c1-09f731003616" containerName="collect-profiles" Oct 08 14:02:56 crc kubenswrapper[4735]: I1008 14:02:56.668276 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l74dp" Oct 08 14:02:56 crc kubenswrapper[4735]: I1008 14:02:56.681615 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l74dp"] Oct 08 14:02:56 crc kubenswrapper[4735]: I1008 14:02:56.764447 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8feb0d6-8290-40e4-8b57-60891a4d8649-catalog-content\") pod \"community-operators-l74dp\" (UID: \"e8feb0d6-8290-40e4-8b57-60891a4d8649\") " pod="openshift-marketplace/community-operators-l74dp" Oct 08 14:02:56 crc kubenswrapper[4735]: I1008 14:02:56.764582 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8feb0d6-8290-40e4-8b57-60891a4d8649-utilities\") pod \"community-operators-l74dp\" (UID: \"e8feb0d6-8290-40e4-8b57-60891a4d8649\") " pod="openshift-marketplace/community-operators-l74dp" Oct 08 14:02:56 crc kubenswrapper[4735]: I1008 14:02:56.764628 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h89rv\" (UniqueName: \"kubernetes.io/projected/e8feb0d6-8290-40e4-8b57-60891a4d8649-kube-api-access-h89rv\") pod \"community-operators-l74dp\" (UID: \"e8feb0d6-8290-40e4-8b57-60891a4d8649\") " pod="openshift-marketplace/community-operators-l74dp" Oct 08 14:02:56 crc kubenswrapper[4735]: I1008 14:02:56.866227 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8feb0d6-8290-40e4-8b57-60891a4d8649-catalog-content\") pod \"community-operators-l74dp\" (UID: \"e8feb0d6-8290-40e4-8b57-60891a4d8649\") " pod="openshift-marketplace/community-operators-l74dp" Oct 08 14:02:56 crc kubenswrapper[4735]: I1008 14:02:56.866321 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8feb0d6-8290-40e4-8b57-60891a4d8649-utilities\") pod \"community-operators-l74dp\" (UID: \"e8feb0d6-8290-40e4-8b57-60891a4d8649\") " pod="openshift-marketplace/community-operators-l74dp" Oct 08 14:02:56 crc kubenswrapper[4735]: I1008 14:02:56.866348 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h89rv\" (UniqueName: \"kubernetes.io/projected/e8feb0d6-8290-40e4-8b57-60891a4d8649-kube-api-access-h89rv\") pod \"community-operators-l74dp\" (UID: \"e8feb0d6-8290-40e4-8b57-60891a4d8649\") " pod="openshift-marketplace/community-operators-l74dp" Oct 08 14:02:56 crc kubenswrapper[4735]: I1008 14:02:56.866880 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8feb0d6-8290-40e4-8b57-60891a4d8649-catalog-content\") pod \"community-operators-l74dp\" (UID: \"e8feb0d6-8290-40e4-8b57-60891a4d8649\") " pod="openshift-marketplace/community-operators-l74dp" Oct 08 14:02:56 crc kubenswrapper[4735]: I1008 14:02:56.867039 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8feb0d6-8290-40e4-8b57-60891a4d8649-utilities\") pod \"community-operators-l74dp\" (UID: \"e8feb0d6-8290-40e4-8b57-60891a4d8649\") " pod="openshift-marketplace/community-operators-l74dp" Oct 08 14:02:56 crc kubenswrapper[4735]: I1008 14:02:56.885455 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h89rv\" (UniqueName: \"kubernetes.io/projected/e8feb0d6-8290-40e4-8b57-60891a4d8649-kube-api-access-h89rv\") pod \"community-operators-l74dp\" (UID: \"e8feb0d6-8290-40e4-8b57-60891a4d8649\") " pod="openshift-marketplace/community-operators-l74dp" Oct 08 14:02:56 crc kubenswrapper[4735]: I1008 14:02:56.985832 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l74dp" Oct 08 14:02:57 crc kubenswrapper[4735]: I1008 14:02:57.460105 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l74dp"] Oct 08 14:02:58 crc kubenswrapper[4735]: I1008 14:02:58.144290 4735 generic.go:334] "Generic (PLEG): container finished" podID="e8feb0d6-8290-40e4-8b57-60891a4d8649" containerID="ffee806b9d2a69f13076b9a6a4811785413da0ce4dbbd5a4f352daf065b9cd9e" exitCode=0 Oct 08 14:02:58 crc kubenswrapper[4735]: I1008 14:02:58.144635 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l74dp" event={"ID":"e8feb0d6-8290-40e4-8b57-60891a4d8649","Type":"ContainerDied","Data":"ffee806b9d2a69f13076b9a6a4811785413da0ce4dbbd5a4f352daf065b9cd9e"} Oct 08 14:02:58 crc kubenswrapper[4735]: I1008 14:02:58.144669 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l74dp" event={"ID":"e8feb0d6-8290-40e4-8b57-60891a4d8649","Type":"ContainerStarted","Data":"bff2e55b7d595ef7b1db5169c56d224e5e54d5ceb1baa9cd706eb2e2702df443"} Oct 08 14:02:58 crc kubenswrapper[4735]: I1008 14:02:58.147264 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 08 14:02:59 crc kubenswrapper[4735]: I1008 14:02:59.072052 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4krmc"] Oct 08 14:02:59 crc kubenswrapper[4735]: I1008 14:02:59.073610 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4krmc" Oct 08 14:02:59 crc kubenswrapper[4735]: I1008 14:02:59.080238 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4krmc"] Oct 08 14:02:59 crc kubenswrapper[4735]: I1008 14:02:59.111791 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/911cfef2-58be-47e0-bd73-c87f31296d38-utilities\") pod \"redhat-operators-4krmc\" (UID: \"911cfef2-58be-47e0-bd73-c87f31296d38\") " pod="openshift-marketplace/redhat-operators-4krmc" Oct 08 14:02:59 crc kubenswrapper[4735]: I1008 14:02:59.111875 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/911cfef2-58be-47e0-bd73-c87f31296d38-catalog-content\") pod \"redhat-operators-4krmc\" (UID: \"911cfef2-58be-47e0-bd73-c87f31296d38\") " pod="openshift-marketplace/redhat-operators-4krmc" Oct 08 14:02:59 crc kubenswrapper[4735]: I1008 14:02:59.111962 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vp5c\" (UniqueName: \"kubernetes.io/projected/911cfef2-58be-47e0-bd73-c87f31296d38-kube-api-access-8vp5c\") pod \"redhat-operators-4krmc\" (UID: \"911cfef2-58be-47e0-bd73-c87f31296d38\") " pod="openshift-marketplace/redhat-operators-4krmc" Oct 08 14:02:59 crc kubenswrapper[4735]: I1008 14:02:59.157178 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l74dp" event={"ID":"e8feb0d6-8290-40e4-8b57-60891a4d8649","Type":"ContainerStarted","Data":"4c27b93912a77a36a2e8eaa17f89f9771cc8242b5551b951bb9c359dc52d968f"} Oct 08 14:02:59 crc kubenswrapper[4735]: I1008 14:02:59.213785 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/911cfef2-58be-47e0-bd73-c87f31296d38-utilities\") pod \"redhat-operators-4krmc\" (UID: \"911cfef2-58be-47e0-bd73-c87f31296d38\") " pod="openshift-marketplace/redhat-operators-4krmc" Oct 08 14:02:59 crc kubenswrapper[4735]: I1008 14:02:59.214096 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/911cfef2-58be-47e0-bd73-c87f31296d38-catalog-content\") pod \"redhat-operators-4krmc\" (UID: \"911cfef2-58be-47e0-bd73-c87f31296d38\") " pod="openshift-marketplace/redhat-operators-4krmc" Oct 08 14:02:59 crc kubenswrapper[4735]: I1008 14:02:59.214205 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vp5c\" (UniqueName: \"kubernetes.io/projected/911cfef2-58be-47e0-bd73-c87f31296d38-kube-api-access-8vp5c\") pod \"redhat-operators-4krmc\" (UID: \"911cfef2-58be-47e0-bd73-c87f31296d38\") " pod="openshift-marketplace/redhat-operators-4krmc" Oct 08 14:02:59 crc kubenswrapper[4735]: I1008 14:02:59.214691 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/911cfef2-58be-47e0-bd73-c87f31296d38-utilities\") pod \"redhat-operators-4krmc\" (UID: \"911cfef2-58be-47e0-bd73-c87f31296d38\") " pod="openshift-marketplace/redhat-operators-4krmc" Oct 08 14:02:59 crc kubenswrapper[4735]: I1008 14:02:59.214765 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/911cfef2-58be-47e0-bd73-c87f31296d38-catalog-content\") pod \"redhat-operators-4krmc\" (UID: \"911cfef2-58be-47e0-bd73-c87f31296d38\") " pod="openshift-marketplace/redhat-operators-4krmc" Oct 08 14:02:59 crc kubenswrapper[4735]: I1008 14:02:59.235568 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vp5c\" (UniqueName: 
\"kubernetes.io/projected/911cfef2-58be-47e0-bd73-c87f31296d38-kube-api-access-8vp5c\") pod \"redhat-operators-4krmc\" (UID: \"911cfef2-58be-47e0-bd73-c87f31296d38\") " pod="openshift-marketplace/redhat-operators-4krmc" Oct 08 14:02:59 crc kubenswrapper[4735]: I1008 14:02:59.399392 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4krmc" Oct 08 14:02:59 crc kubenswrapper[4735]: I1008 14:02:59.894881 4735 scope.go:117] "RemoveContainer" containerID="1b85e2fef173dcccb95dff7094d06daa381225d3e83968f8698fb1f4a0882dc8" Oct 08 14:02:59 crc kubenswrapper[4735]: E1008 14:02:59.895662 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:03:00 crc kubenswrapper[4735]: I1008 14:03:00.119983 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4krmc"] Oct 08 14:03:00 crc kubenswrapper[4735]: W1008 14:03:00.128950 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod911cfef2_58be_47e0_bd73_c87f31296d38.slice/crio-de856f69e040f9d2858fd5024ed135f81396cbc87122e81bc480bc044a03777e WatchSource:0}: Error finding container de856f69e040f9d2858fd5024ed135f81396cbc87122e81bc480bc044a03777e: Status 404 returned error can't find the container with id de856f69e040f9d2858fd5024ed135f81396cbc87122e81bc480bc044a03777e Oct 08 14:03:00 crc kubenswrapper[4735]: I1008 14:03:00.165716 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4krmc" 
event={"ID":"911cfef2-58be-47e0-bd73-c87f31296d38","Type":"ContainerStarted","Data":"de856f69e040f9d2858fd5024ed135f81396cbc87122e81bc480bc044a03777e"} Oct 08 14:03:00 crc kubenswrapper[4735]: I1008 14:03:00.168263 4735 generic.go:334] "Generic (PLEG): container finished" podID="e8feb0d6-8290-40e4-8b57-60891a4d8649" containerID="4c27b93912a77a36a2e8eaa17f89f9771cc8242b5551b951bb9c359dc52d968f" exitCode=0 Oct 08 14:03:00 crc kubenswrapper[4735]: I1008 14:03:00.168308 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l74dp" event={"ID":"e8feb0d6-8290-40e4-8b57-60891a4d8649","Type":"ContainerDied","Data":"4c27b93912a77a36a2e8eaa17f89f9771cc8242b5551b951bb9c359dc52d968f"} Oct 08 14:03:01 crc kubenswrapper[4735]: I1008 14:03:01.179402 4735 generic.go:334] "Generic (PLEG): container finished" podID="911cfef2-58be-47e0-bd73-c87f31296d38" containerID="25a107a0fea1a6878e7daf92ec08ad21fe95a8f2ca67d183760039b779b39fed" exitCode=0 Oct 08 14:03:01 crc kubenswrapper[4735]: I1008 14:03:01.179497 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4krmc" event={"ID":"911cfef2-58be-47e0-bd73-c87f31296d38","Type":"ContainerDied","Data":"25a107a0fea1a6878e7daf92ec08ad21fe95a8f2ca67d183760039b779b39fed"} Oct 08 14:03:03 crc kubenswrapper[4735]: I1008 14:03:03.198322 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l74dp" event={"ID":"e8feb0d6-8290-40e4-8b57-60891a4d8649","Type":"ContainerStarted","Data":"a8d90d44cd98c77a82c288e2961e06d43840ad7ec3bbc64c8881502d465d1381"} Oct 08 14:03:03 crc kubenswrapper[4735]: I1008 14:03:03.221545 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-l74dp" podStartSLOduration=3.016144365 podStartE2EDuration="7.221527589s" podCreationTimestamp="2025-10-08 14:02:56 +0000 UTC" firstStartedPulling="2025-10-08 14:02:58.146926431 +0000 UTC 
m=+4027.955230985" lastFinishedPulling="2025-10-08 14:03:02.352309665 +0000 UTC m=+4032.160614209" observedRunningTime="2025-10-08 14:03:03.221357394 +0000 UTC m=+4033.029661948" watchObservedRunningTime="2025-10-08 14:03:03.221527589 +0000 UTC m=+4033.029832133" Oct 08 14:03:06 crc kubenswrapper[4735]: I1008 14:03:06.986071 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-l74dp" Oct 08 14:03:06 crc kubenswrapper[4735]: I1008 14:03:06.986395 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-l74dp" Oct 08 14:03:07 crc kubenswrapper[4735]: I1008 14:03:07.027267 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-l74dp" Oct 08 14:03:07 crc kubenswrapper[4735]: I1008 14:03:07.270149 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-l74dp" Oct 08 14:03:07 crc kubenswrapper[4735]: I1008 14:03:07.454748 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l74dp"] Oct 08 14:03:09 crc kubenswrapper[4735]: I1008 14:03:09.240920 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4krmc" event={"ID":"911cfef2-58be-47e0-bd73-c87f31296d38","Type":"ContainerStarted","Data":"208d7656f7fc6cfe4c89f1bd8ce2530fc0fe1f742f86279feea8b824aa926307"} Oct 08 14:03:09 crc kubenswrapper[4735]: I1008 14:03:09.241052 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-l74dp" podUID="e8feb0d6-8290-40e4-8b57-60891a4d8649" containerName="registry-server" containerID="cri-o://a8d90d44cd98c77a82c288e2961e06d43840ad7ec3bbc64c8881502d465d1381" gracePeriod=2 Oct 08 14:03:09 crc kubenswrapper[4735]: I1008 14:03:09.610589 4735 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-l74dp" Oct 08 14:03:09 crc kubenswrapper[4735]: I1008 14:03:09.755453 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h89rv\" (UniqueName: \"kubernetes.io/projected/e8feb0d6-8290-40e4-8b57-60891a4d8649-kube-api-access-h89rv\") pod \"e8feb0d6-8290-40e4-8b57-60891a4d8649\" (UID: \"e8feb0d6-8290-40e4-8b57-60891a4d8649\") " Oct 08 14:03:09 crc kubenswrapper[4735]: I1008 14:03:09.755564 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8feb0d6-8290-40e4-8b57-60891a4d8649-utilities\") pod \"e8feb0d6-8290-40e4-8b57-60891a4d8649\" (UID: \"e8feb0d6-8290-40e4-8b57-60891a4d8649\") " Oct 08 14:03:09 crc kubenswrapper[4735]: I1008 14:03:09.755624 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8feb0d6-8290-40e4-8b57-60891a4d8649-catalog-content\") pod \"e8feb0d6-8290-40e4-8b57-60891a4d8649\" (UID: \"e8feb0d6-8290-40e4-8b57-60891a4d8649\") " Oct 08 14:03:09 crc kubenswrapper[4735]: I1008 14:03:09.757341 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8feb0d6-8290-40e4-8b57-60891a4d8649-utilities" (OuterVolumeSpecName: "utilities") pod "e8feb0d6-8290-40e4-8b57-60891a4d8649" (UID: "e8feb0d6-8290-40e4-8b57-60891a4d8649"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 14:03:09 crc kubenswrapper[4735]: I1008 14:03:09.764767 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8feb0d6-8290-40e4-8b57-60891a4d8649-kube-api-access-h89rv" (OuterVolumeSpecName: "kube-api-access-h89rv") pod "e8feb0d6-8290-40e4-8b57-60891a4d8649" (UID: "e8feb0d6-8290-40e4-8b57-60891a4d8649"). InnerVolumeSpecName "kube-api-access-h89rv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:03:09 crc kubenswrapper[4735]: I1008 14:03:09.803422 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8feb0d6-8290-40e4-8b57-60891a4d8649-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e8feb0d6-8290-40e4-8b57-60891a4d8649" (UID: "e8feb0d6-8290-40e4-8b57-60891a4d8649"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 14:03:09 crc kubenswrapper[4735]: I1008 14:03:09.857221 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h89rv\" (UniqueName: \"kubernetes.io/projected/e8feb0d6-8290-40e4-8b57-60891a4d8649-kube-api-access-h89rv\") on node \"crc\" DevicePath \"\"" Oct 08 14:03:09 crc kubenswrapper[4735]: I1008 14:03:09.857287 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8feb0d6-8290-40e4-8b57-60891a4d8649-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 14:03:09 crc kubenswrapper[4735]: I1008 14:03:09.857304 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8feb0d6-8290-40e4-8b57-60891a4d8649-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 14:03:10 crc kubenswrapper[4735]: I1008 14:03:10.254956 4735 generic.go:334] "Generic (PLEG): container finished" podID="911cfef2-58be-47e0-bd73-c87f31296d38" containerID="208d7656f7fc6cfe4c89f1bd8ce2530fc0fe1f742f86279feea8b824aa926307" exitCode=0 Oct 08 14:03:10 crc kubenswrapper[4735]: I1008 14:03:10.255065 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4krmc" event={"ID":"911cfef2-58be-47e0-bd73-c87f31296d38","Type":"ContainerDied","Data":"208d7656f7fc6cfe4c89f1bd8ce2530fc0fe1f742f86279feea8b824aa926307"} Oct 08 14:03:10 crc kubenswrapper[4735]: I1008 14:03:10.259658 4735 generic.go:334] "Generic (PLEG): container 
finished" podID="e8feb0d6-8290-40e4-8b57-60891a4d8649" containerID="a8d90d44cd98c77a82c288e2961e06d43840ad7ec3bbc64c8881502d465d1381" exitCode=0 Oct 08 14:03:10 crc kubenswrapper[4735]: I1008 14:03:10.259706 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l74dp" event={"ID":"e8feb0d6-8290-40e4-8b57-60891a4d8649","Type":"ContainerDied","Data":"a8d90d44cd98c77a82c288e2961e06d43840ad7ec3bbc64c8881502d465d1381"} Oct 08 14:03:10 crc kubenswrapper[4735]: I1008 14:03:10.259741 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l74dp" event={"ID":"e8feb0d6-8290-40e4-8b57-60891a4d8649","Type":"ContainerDied","Data":"bff2e55b7d595ef7b1db5169c56d224e5e54d5ceb1baa9cd706eb2e2702df443"} Oct 08 14:03:10 crc kubenswrapper[4735]: I1008 14:03:10.259774 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l74dp" Oct 08 14:03:10 crc kubenswrapper[4735]: I1008 14:03:10.259783 4735 scope.go:117] "RemoveContainer" containerID="a8d90d44cd98c77a82c288e2961e06d43840ad7ec3bbc64c8881502d465d1381" Oct 08 14:03:10 crc kubenswrapper[4735]: I1008 14:03:10.284832 4735 scope.go:117] "RemoveContainer" containerID="4c27b93912a77a36a2e8eaa17f89f9771cc8242b5551b951bb9c359dc52d968f" Oct 08 14:03:10 crc kubenswrapper[4735]: I1008 14:03:10.305316 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l74dp"] Oct 08 14:03:10 crc kubenswrapper[4735]: I1008 14:03:10.314829 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-l74dp"] Oct 08 14:03:10 crc kubenswrapper[4735]: I1008 14:03:10.319212 4735 scope.go:117] "RemoveContainer" containerID="ffee806b9d2a69f13076b9a6a4811785413da0ce4dbbd5a4f352daf065b9cd9e" Oct 08 14:03:10 crc kubenswrapper[4735]: I1008 14:03:10.340128 4735 scope.go:117] "RemoveContainer" 
containerID="a8d90d44cd98c77a82c288e2961e06d43840ad7ec3bbc64c8881502d465d1381" Oct 08 14:03:10 crc kubenswrapper[4735]: E1008 14:03:10.340590 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8d90d44cd98c77a82c288e2961e06d43840ad7ec3bbc64c8881502d465d1381\": container with ID starting with a8d90d44cd98c77a82c288e2961e06d43840ad7ec3bbc64c8881502d465d1381 not found: ID does not exist" containerID="a8d90d44cd98c77a82c288e2961e06d43840ad7ec3bbc64c8881502d465d1381" Oct 08 14:03:10 crc kubenswrapper[4735]: I1008 14:03:10.340637 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8d90d44cd98c77a82c288e2961e06d43840ad7ec3bbc64c8881502d465d1381"} err="failed to get container status \"a8d90d44cd98c77a82c288e2961e06d43840ad7ec3bbc64c8881502d465d1381\": rpc error: code = NotFound desc = could not find container \"a8d90d44cd98c77a82c288e2961e06d43840ad7ec3bbc64c8881502d465d1381\": container with ID starting with a8d90d44cd98c77a82c288e2961e06d43840ad7ec3bbc64c8881502d465d1381 not found: ID does not exist" Oct 08 14:03:10 crc kubenswrapper[4735]: I1008 14:03:10.340662 4735 scope.go:117] "RemoveContainer" containerID="4c27b93912a77a36a2e8eaa17f89f9771cc8242b5551b951bb9c359dc52d968f" Oct 08 14:03:10 crc kubenswrapper[4735]: E1008 14:03:10.341180 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c27b93912a77a36a2e8eaa17f89f9771cc8242b5551b951bb9c359dc52d968f\": container with ID starting with 4c27b93912a77a36a2e8eaa17f89f9771cc8242b5551b951bb9c359dc52d968f not found: ID does not exist" containerID="4c27b93912a77a36a2e8eaa17f89f9771cc8242b5551b951bb9c359dc52d968f" Oct 08 14:03:10 crc kubenswrapper[4735]: I1008 14:03:10.341229 4735 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4c27b93912a77a36a2e8eaa17f89f9771cc8242b5551b951bb9c359dc52d968f"} err="failed to get container status \"4c27b93912a77a36a2e8eaa17f89f9771cc8242b5551b951bb9c359dc52d968f\": rpc error: code = NotFound desc = could not find container \"4c27b93912a77a36a2e8eaa17f89f9771cc8242b5551b951bb9c359dc52d968f\": container with ID starting with 4c27b93912a77a36a2e8eaa17f89f9771cc8242b5551b951bb9c359dc52d968f not found: ID does not exist" Oct 08 14:03:10 crc kubenswrapper[4735]: I1008 14:03:10.341262 4735 scope.go:117] "RemoveContainer" containerID="ffee806b9d2a69f13076b9a6a4811785413da0ce4dbbd5a4f352daf065b9cd9e" Oct 08 14:03:10 crc kubenswrapper[4735]: E1008 14:03:10.341549 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffee806b9d2a69f13076b9a6a4811785413da0ce4dbbd5a4f352daf065b9cd9e\": container with ID starting with ffee806b9d2a69f13076b9a6a4811785413da0ce4dbbd5a4f352daf065b9cd9e not found: ID does not exist" containerID="ffee806b9d2a69f13076b9a6a4811785413da0ce4dbbd5a4f352daf065b9cd9e" Oct 08 14:03:10 crc kubenswrapper[4735]: I1008 14:03:10.341585 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffee806b9d2a69f13076b9a6a4811785413da0ce4dbbd5a4f352daf065b9cd9e"} err="failed to get container status \"ffee806b9d2a69f13076b9a6a4811785413da0ce4dbbd5a4f352daf065b9cd9e\": rpc error: code = NotFound desc = could not find container \"ffee806b9d2a69f13076b9a6a4811785413da0ce4dbbd5a4f352daf065b9cd9e\": container with ID starting with ffee806b9d2a69f13076b9a6a4811785413da0ce4dbbd5a4f352daf065b9cd9e not found: ID does not exist" Oct 08 14:03:10 crc kubenswrapper[4735]: I1008 14:03:10.900206 4735 scope.go:117] "RemoveContainer" containerID="1b85e2fef173dcccb95dff7094d06daa381225d3e83968f8698fb1f4a0882dc8" Oct 08 14:03:10 crc kubenswrapper[4735]: E1008 14:03:10.900738 4735 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:03:10 crc kubenswrapper[4735]: I1008 14:03:10.904909 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8feb0d6-8290-40e4-8b57-60891a4d8649" path="/var/lib/kubelet/pods/e8feb0d6-8290-40e4-8b57-60891a4d8649/volumes" Oct 08 14:03:11 crc kubenswrapper[4735]: I1008 14:03:11.271349 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4krmc" event={"ID":"911cfef2-58be-47e0-bd73-c87f31296d38","Type":"ContainerStarted","Data":"da6083ff23efd7dad9bf63c8d423bf2a6bbbc60b95ca169b0be9c591068788c6"} Oct 08 14:03:11 crc kubenswrapper[4735]: I1008 14:03:11.294363 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4krmc" podStartSLOduration=2.519914306 podStartE2EDuration="12.294338017s" podCreationTimestamp="2025-10-08 14:02:59 +0000 UTC" firstStartedPulling="2025-10-08 14:03:01.181838036 +0000 UTC m=+4030.990142580" lastFinishedPulling="2025-10-08 14:03:10.956261747 +0000 UTC m=+4040.764566291" observedRunningTime="2025-10-08 14:03:11.291126334 +0000 UTC m=+4041.099430898" watchObservedRunningTime="2025-10-08 14:03:11.294338017 +0000 UTC m=+4041.102642571" Oct 08 14:03:19 crc kubenswrapper[4735]: I1008 14:03:19.400122 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4krmc" Oct 08 14:03:19 crc kubenswrapper[4735]: I1008 14:03:19.400750 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4krmc" Oct 08 14:03:19 crc kubenswrapper[4735]: I1008 
14:03:19.448564 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4krmc" Oct 08 14:03:20 crc kubenswrapper[4735]: I1008 14:03:20.628505 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4krmc" Oct 08 14:03:20 crc kubenswrapper[4735]: I1008 14:03:20.694225 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4krmc"] Oct 08 14:03:20 crc kubenswrapper[4735]: I1008 14:03:20.735997 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-md7br"] Oct 08 14:03:20 crc kubenswrapper[4735]: I1008 14:03:20.736267 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-md7br" podUID="0f52b3f8-152e-440b-bfe6-67c2f10a1506" containerName="registry-server" containerID="cri-o://d95556624a84aae8efce8dc8a476bf99a1084c91855e2d5d97d0ed525faa6fcf" gracePeriod=2 Oct 08 14:03:21 crc kubenswrapper[4735]: I1008 14:03:21.168941 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-md7br" Oct 08 14:03:21 crc kubenswrapper[4735]: I1008 14:03:21.326738 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f52b3f8-152e-440b-bfe6-67c2f10a1506-catalog-content\") pod \"0f52b3f8-152e-440b-bfe6-67c2f10a1506\" (UID: \"0f52b3f8-152e-440b-bfe6-67c2f10a1506\") " Oct 08 14:03:21 crc kubenswrapper[4735]: I1008 14:03:21.326787 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f52b3f8-152e-440b-bfe6-67c2f10a1506-utilities\") pod \"0f52b3f8-152e-440b-bfe6-67c2f10a1506\" (UID: \"0f52b3f8-152e-440b-bfe6-67c2f10a1506\") " Oct 08 14:03:21 crc kubenswrapper[4735]: I1008 14:03:21.326858 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ptdc\" (UniqueName: \"kubernetes.io/projected/0f52b3f8-152e-440b-bfe6-67c2f10a1506-kube-api-access-8ptdc\") pod \"0f52b3f8-152e-440b-bfe6-67c2f10a1506\" (UID: \"0f52b3f8-152e-440b-bfe6-67c2f10a1506\") " Oct 08 14:03:21 crc kubenswrapper[4735]: I1008 14:03:21.328170 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f52b3f8-152e-440b-bfe6-67c2f10a1506-utilities" (OuterVolumeSpecName: "utilities") pod "0f52b3f8-152e-440b-bfe6-67c2f10a1506" (UID: "0f52b3f8-152e-440b-bfe6-67c2f10a1506"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 14:03:21 crc kubenswrapper[4735]: I1008 14:03:21.333017 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f52b3f8-152e-440b-bfe6-67c2f10a1506-kube-api-access-8ptdc" (OuterVolumeSpecName: "kube-api-access-8ptdc") pod "0f52b3f8-152e-440b-bfe6-67c2f10a1506" (UID: "0f52b3f8-152e-440b-bfe6-67c2f10a1506"). InnerVolumeSpecName "kube-api-access-8ptdc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:03:21 crc kubenswrapper[4735]: I1008 14:03:21.357024 4735 generic.go:334] "Generic (PLEG): container finished" podID="0f52b3f8-152e-440b-bfe6-67c2f10a1506" containerID="d95556624a84aae8efce8dc8a476bf99a1084c91855e2d5d97d0ed525faa6fcf" exitCode=0 Oct 08 14:03:21 crc kubenswrapper[4735]: I1008 14:03:21.357082 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-md7br" Oct 08 14:03:21 crc kubenswrapper[4735]: I1008 14:03:21.357122 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-md7br" event={"ID":"0f52b3f8-152e-440b-bfe6-67c2f10a1506","Type":"ContainerDied","Data":"d95556624a84aae8efce8dc8a476bf99a1084c91855e2d5d97d0ed525faa6fcf"} Oct 08 14:03:21 crc kubenswrapper[4735]: I1008 14:03:21.357185 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-md7br" event={"ID":"0f52b3f8-152e-440b-bfe6-67c2f10a1506","Type":"ContainerDied","Data":"0a8b163b9e1d53e860adef1a2dce9ace9204513d065b2611d83147baeb6e7a45"} Oct 08 14:03:21 crc kubenswrapper[4735]: I1008 14:03:21.357209 4735 scope.go:117] "RemoveContainer" containerID="d95556624a84aae8efce8dc8a476bf99a1084c91855e2d5d97d0ed525faa6fcf" Oct 08 14:03:21 crc kubenswrapper[4735]: I1008 14:03:21.379848 4735 scope.go:117] "RemoveContainer" containerID="99c085b66f55042f247abd23932e9898647ac3b0a43a5ac3b4e9a1bde66e0c85" Oct 08 14:03:21 crc kubenswrapper[4735]: I1008 14:03:21.399317 4735 scope.go:117] "RemoveContainer" containerID="97fa8b57e24cc577de12aae9f3ef016e00336c7f5a40d9e6c3ca859ad4107a98" Oct 08 14:03:21 crc kubenswrapper[4735]: I1008 14:03:21.399452 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f52b3f8-152e-440b-bfe6-67c2f10a1506-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0f52b3f8-152e-440b-bfe6-67c2f10a1506" (UID: 
"0f52b3f8-152e-440b-bfe6-67c2f10a1506"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 14:03:21 crc kubenswrapper[4735]: I1008 14:03:21.428656 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f52b3f8-152e-440b-bfe6-67c2f10a1506-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 14:03:21 crc kubenswrapper[4735]: I1008 14:03:21.428691 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f52b3f8-152e-440b-bfe6-67c2f10a1506-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 14:03:21 crc kubenswrapper[4735]: I1008 14:03:21.428705 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ptdc\" (UniqueName: \"kubernetes.io/projected/0f52b3f8-152e-440b-bfe6-67c2f10a1506-kube-api-access-8ptdc\") on node \"crc\" DevicePath \"\"" Oct 08 14:03:21 crc kubenswrapper[4735]: I1008 14:03:21.430293 4735 scope.go:117] "RemoveContainer" containerID="d95556624a84aae8efce8dc8a476bf99a1084c91855e2d5d97d0ed525faa6fcf" Oct 08 14:03:21 crc kubenswrapper[4735]: E1008 14:03:21.430868 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d95556624a84aae8efce8dc8a476bf99a1084c91855e2d5d97d0ed525faa6fcf\": container with ID starting with d95556624a84aae8efce8dc8a476bf99a1084c91855e2d5d97d0ed525faa6fcf not found: ID does not exist" containerID="d95556624a84aae8efce8dc8a476bf99a1084c91855e2d5d97d0ed525faa6fcf" Oct 08 14:03:21 crc kubenswrapper[4735]: I1008 14:03:21.430900 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d95556624a84aae8efce8dc8a476bf99a1084c91855e2d5d97d0ed525faa6fcf"} err="failed to get container status \"d95556624a84aae8efce8dc8a476bf99a1084c91855e2d5d97d0ed525faa6fcf\": rpc error: code = NotFound desc = could not find container 
\"d95556624a84aae8efce8dc8a476bf99a1084c91855e2d5d97d0ed525faa6fcf\": container with ID starting with d95556624a84aae8efce8dc8a476bf99a1084c91855e2d5d97d0ed525faa6fcf not found: ID does not exist" Oct 08 14:03:21 crc kubenswrapper[4735]: I1008 14:03:21.430920 4735 scope.go:117] "RemoveContainer" containerID="99c085b66f55042f247abd23932e9898647ac3b0a43a5ac3b4e9a1bde66e0c85" Oct 08 14:03:21 crc kubenswrapper[4735]: E1008 14:03:21.431259 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99c085b66f55042f247abd23932e9898647ac3b0a43a5ac3b4e9a1bde66e0c85\": container with ID starting with 99c085b66f55042f247abd23932e9898647ac3b0a43a5ac3b4e9a1bde66e0c85 not found: ID does not exist" containerID="99c085b66f55042f247abd23932e9898647ac3b0a43a5ac3b4e9a1bde66e0c85" Oct 08 14:03:21 crc kubenswrapper[4735]: I1008 14:03:21.431290 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99c085b66f55042f247abd23932e9898647ac3b0a43a5ac3b4e9a1bde66e0c85"} err="failed to get container status \"99c085b66f55042f247abd23932e9898647ac3b0a43a5ac3b4e9a1bde66e0c85\": rpc error: code = NotFound desc = could not find container \"99c085b66f55042f247abd23932e9898647ac3b0a43a5ac3b4e9a1bde66e0c85\": container with ID starting with 99c085b66f55042f247abd23932e9898647ac3b0a43a5ac3b4e9a1bde66e0c85 not found: ID does not exist" Oct 08 14:03:21 crc kubenswrapper[4735]: I1008 14:03:21.431309 4735 scope.go:117] "RemoveContainer" containerID="97fa8b57e24cc577de12aae9f3ef016e00336c7f5a40d9e6c3ca859ad4107a98" Oct 08 14:03:21 crc kubenswrapper[4735]: E1008 14:03:21.431599 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97fa8b57e24cc577de12aae9f3ef016e00336c7f5a40d9e6c3ca859ad4107a98\": container with ID starting with 97fa8b57e24cc577de12aae9f3ef016e00336c7f5a40d9e6c3ca859ad4107a98 not found: ID does not exist" 
containerID="97fa8b57e24cc577de12aae9f3ef016e00336c7f5a40d9e6c3ca859ad4107a98" Oct 08 14:03:21 crc kubenswrapper[4735]: I1008 14:03:21.431629 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97fa8b57e24cc577de12aae9f3ef016e00336c7f5a40d9e6c3ca859ad4107a98"} err="failed to get container status \"97fa8b57e24cc577de12aae9f3ef016e00336c7f5a40d9e6c3ca859ad4107a98\": rpc error: code = NotFound desc = could not find container \"97fa8b57e24cc577de12aae9f3ef016e00336c7f5a40d9e6c3ca859ad4107a98\": container with ID starting with 97fa8b57e24cc577de12aae9f3ef016e00336c7f5a40d9e6c3ca859ad4107a98 not found: ID does not exist" Oct 08 14:03:21 crc kubenswrapper[4735]: I1008 14:03:21.689553 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-md7br"] Oct 08 14:03:21 crc kubenswrapper[4735]: I1008 14:03:21.694801 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-md7br"] Oct 08 14:03:21 crc kubenswrapper[4735]: E1008 14:03:21.760097 4735 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f52b3f8_152e_440b_bfe6_67c2f10a1506.slice\": RecentStats: unable to find data in memory cache]" Oct 08 14:03:22 crc kubenswrapper[4735]: I1008 14:03:22.895411 4735 scope.go:117] "RemoveContainer" containerID="1b85e2fef173dcccb95dff7094d06daa381225d3e83968f8698fb1f4a0882dc8" Oct 08 14:03:22 crc kubenswrapper[4735]: E1008 14:03:22.895701 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" 
podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:03:22 crc kubenswrapper[4735]: I1008 14:03:22.904376 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f52b3f8-152e-440b-bfe6-67c2f10a1506" path="/var/lib/kubelet/pods/0f52b3f8-152e-440b-bfe6-67c2f10a1506/volumes" Oct 08 14:03:35 crc kubenswrapper[4735]: I1008 14:03:35.895347 4735 scope.go:117] "RemoveContainer" containerID="1b85e2fef173dcccb95dff7094d06daa381225d3e83968f8698fb1f4a0882dc8" Oct 08 14:03:35 crc kubenswrapper[4735]: E1008 14:03:35.896422 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:03:48 crc kubenswrapper[4735]: I1008 14:03:48.894930 4735 scope.go:117] "RemoveContainer" containerID="1b85e2fef173dcccb95dff7094d06daa381225d3e83968f8698fb1f4a0882dc8" Oct 08 14:03:48 crc kubenswrapper[4735]: E1008 14:03:48.895651 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:04:01 crc kubenswrapper[4735]: I1008 14:04:01.895583 4735 scope.go:117] "RemoveContainer" containerID="1b85e2fef173dcccb95dff7094d06daa381225d3e83968f8698fb1f4a0882dc8" Oct 08 14:04:01 crc kubenswrapper[4735]: E1008 14:04:01.896518 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:04:14 crc kubenswrapper[4735]: I1008 14:04:14.895144 4735 scope.go:117] "RemoveContainer" containerID="1b85e2fef173dcccb95dff7094d06daa381225d3e83968f8698fb1f4a0882dc8" Oct 08 14:04:14 crc kubenswrapper[4735]: E1008 14:04:14.895840 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:04:28 crc kubenswrapper[4735]: I1008 14:04:28.894714 4735 scope.go:117] "RemoveContainer" containerID="1b85e2fef173dcccb95dff7094d06daa381225d3e83968f8698fb1f4a0882dc8" Oct 08 14:04:28 crc kubenswrapper[4735]: E1008 14:04:28.896648 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:04:39 crc kubenswrapper[4735]: I1008 14:04:39.895106 4735 scope.go:117] "RemoveContainer" containerID="1b85e2fef173dcccb95dff7094d06daa381225d3e83968f8698fb1f4a0882dc8" Oct 08 14:04:39 crc kubenswrapper[4735]: E1008 14:04:39.895859 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:04:50 crc kubenswrapper[4735]: I1008 14:04:50.898458 4735 scope.go:117] "RemoveContainer" containerID="1b85e2fef173dcccb95dff7094d06daa381225d3e83968f8698fb1f4a0882dc8" Oct 08 14:04:50 crc kubenswrapper[4735]: E1008 14:04:50.900001 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:05:03 crc kubenswrapper[4735]: I1008 14:05:03.895539 4735 scope.go:117] "RemoveContainer" containerID="1b85e2fef173dcccb95dff7094d06daa381225d3e83968f8698fb1f4a0882dc8" Oct 08 14:05:04 crc kubenswrapper[4735]: I1008 14:05:04.233856 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" event={"ID":"2dfd0474-f64a-4d3b-9990-df182d079b10","Type":"ContainerStarted","Data":"07da46df0e41345d762cf55f706128b878996990638e93a6b0837fa6a4d118c5"} Oct 08 14:07:00 crc kubenswrapper[4735]: I1008 14:07:00.281083 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6ls5w"] Oct 08 14:07:00 crc kubenswrapper[4735]: E1008 14:07:00.281964 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8feb0d6-8290-40e4-8b57-60891a4d8649" containerName="extract-content" Oct 08 14:07:00 crc kubenswrapper[4735]: I1008 14:07:00.281980 4735 
state_mem.go:107] "Deleted CPUSet assignment" podUID="e8feb0d6-8290-40e4-8b57-60891a4d8649" containerName="extract-content" Oct 08 14:07:00 crc kubenswrapper[4735]: E1008 14:07:00.281997 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f52b3f8-152e-440b-bfe6-67c2f10a1506" containerName="extract-utilities" Oct 08 14:07:00 crc kubenswrapper[4735]: I1008 14:07:00.282005 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f52b3f8-152e-440b-bfe6-67c2f10a1506" containerName="extract-utilities" Oct 08 14:07:00 crc kubenswrapper[4735]: E1008 14:07:00.282021 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8feb0d6-8290-40e4-8b57-60891a4d8649" containerName="extract-utilities" Oct 08 14:07:00 crc kubenswrapper[4735]: I1008 14:07:00.282029 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8feb0d6-8290-40e4-8b57-60891a4d8649" containerName="extract-utilities" Oct 08 14:07:00 crc kubenswrapper[4735]: E1008 14:07:00.282039 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8feb0d6-8290-40e4-8b57-60891a4d8649" containerName="registry-server" Oct 08 14:07:00 crc kubenswrapper[4735]: I1008 14:07:00.282048 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8feb0d6-8290-40e4-8b57-60891a4d8649" containerName="registry-server" Oct 08 14:07:00 crc kubenswrapper[4735]: E1008 14:07:00.282072 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f52b3f8-152e-440b-bfe6-67c2f10a1506" containerName="extract-content" Oct 08 14:07:00 crc kubenswrapper[4735]: I1008 14:07:00.282079 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f52b3f8-152e-440b-bfe6-67c2f10a1506" containerName="extract-content" Oct 08 14:07:00 crc kubenswrapper[4735]: E1008 14:07:00.282108 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f52b3f8-152e-440b-bfe6-67c2f10a1506" containerName="registry-server" Oct 08 14:07:00 crc kubenswrapper[4735]: I1008 14:07:00.282116 4735 
state_mem.go:107] "Deleted CPUSet assignment" podUID="0f52b3f8-152e-440b-bfe6-67c2f10a1506" containerName="registry-server" Oct 08 14:07:00 crc kubenswrapper[4735]: I1008 14:07:00.282271 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f52b3f8-152e-440b-bfe6-67c2f10a1506" containerName="registry-server" Oct 08 14:07:00 crc kubenswrapper[4735]: I1008 14:07:00.282293 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8feb0d6-8290-40e4-8b57-60891a4d8649" containerName="registry-server" Oct 08 14:07:00 crc kubenswrapper[4735]: I1008 14:07:00.283553 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6ls5w" Oct 08 14:07:00 crc kubenswrapper[4735]: I1008 14:07:00.292734 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6ls5w"] Oct 08 14:07:00 crc kubenswrapper[4735]: I1008 14:07:00.309743 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33889ff5-23f9-43d4-b1e7-cfb68c2276e4-utilities\") pod \"redhat-marketplace-6ls5w\" (UID: \"33889ff5-23f9-43d4-b1e7-cfb68c2276e4\") " pod="openshift-marketplace/redhat-marketplace-6ls5w" Oct 08 14:07:00 crc kubenswrapper[4735]: I1008 14:07:00.309798 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhsrz\" (UniqueName: \"kubernetes.io/projected/33889ff5-23f9-43d4-b1e7-cfb68c2276e4-kube-api-access-rhsrz\") pod \"redhat-marketplace-6ls5w\" (UID: \"33889ff5-23f9-43d4-b1e7-cfb68c2276e4\") " pod="openshift-marketplace/redhat-marketplace-6ls5w" Oct 08 14:07:00 crc kubenswrapper[4735]: I1008 14:07:00.309864 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33889ff5-23f9-43d4-b1e7-cfb68c2276e4-catalog-content\") 
pod \"redhat-marketplace-6ls5w\" (UID: \"33889ff5-23f9-43d4-b1e7-cfb68c2276e4\") " pod="openshift-marketplace/redhat-marketplace-6ls5w" Oct 08 14:07:00 crc kubenswrapper[4735]: I1008 14:07:00.411557 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33889ff5-23f9-43d4-b1e7-cfb68c2276e4-catalog-content\") pod \"redhat-marketplace-6ls5w\" (UID: \"33889ff5-23f9-43d4-b1e7-cfb68c2276e4\") " pod="openshift-marketplace/redhat-marketplace-6ls5w" Oct 08 14:07:00 crc kubenswrapper[4735]: I1008 14:07:00.411671 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33889ff5-23f9-43d4-b1e7-cfb68c2276e4-utilities\") pod \"redhat-marketplace-6ls5w\" (UID: \"33889ff5-23f9-43d4-b1e7-cfb68c2276e4\") " pod="openshift-marketplace/redhat-marketplace-6ls5w" Oct 08 14:07:00 crc kubenswrapper[4735]: I1008 14:07:00.411710 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhsrz\" (UniqueName: \"kubernetes.io/projected/33889ff5-23f9-43d4-b1e7-cfb68c2276e4-kube-api-access-rhsrz\") pod \"redhat-marketplace-6ls5w\" (UID: \"33889ff5-23f9-43d4-b1e7-cfb68c2276e4\") " pod="openshift-marketplace/redhat-marketplace-6ls5w" Oct 08 14:07:00 crc kubenswrapper[4735]: I1008 14:07:00.412162 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33889ff5-23f9-43d4-b1e7-cfb68c2276e4-utilities\") pod \"redhat-marketplace-6ls5w\" (UID: \"33889ff5-23f9-43d4-b1e7-cfb68c2276e4\") " pod="openshift-marketplace/redhat-marketplace-6ls5w" Oct 08 14:07:00 crc kubenswrapper[4735]: I1008 14:07:00.412160 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33889ff5-23f9-43d4-b1e7-cfb68c2276e4-catalog-content\") pod \"redhat-marketplace-6ls5w\" (UID: 
\"33889ff5-23f9-43d4-b1e7-cfb68c2276e4\") " pod="openshift-marketplace/redhat-marketplace-6ls5w" Oct 08 14:07:00 crc kubenswrapper[4735]: I1008 14:07:00.431923 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhsrz\" (UniqueName: \"kubernetes.io/projected/33889ff5-23f9-43d4-b1e7-cfb68c2276e4-kube-api-access-rhsrz\") pod \"redhat-marketplace-6ls5w\" (UID: \"33889ff5-23f9-43d4-b1e7-cfb68c2276e4\") " pod="openshift-marketplace/redhat-marketplace-6ls5w" Oct 08 14:07:00 crc kubenswrapper[4735]: I1008 14:07:00.610588 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6ls5w" Oct 08 14:07:01 crc kubenswrapper[4735]: I1008 14:07:01.028788 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6ls5w"] Oct 08 14:07:01 crc kubenswrapper[4735]: I1008 14:07:01.147188 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6ls5w" event={"ID":"33889ff5-23f9-43d4-b1e7-cfb68c2276e4","Type":"ContainerStarted","Data":"65a017ddd446c83c3fa3f6c17721cb647ac58cf0fd39a905827f1c532a8a3fc2"} Oct 08 14:07:02 crc kubenswrapper[4735]: I1008 14:07:02.156505 4735 generic.go:334] "Generic (PLEG): container finished" podID="33889ff5-23f9-43d4-b1e7-cfb68c2276e4" containerID="72b0120857ecc659feb081f4dcd12c172b301c0fc25687a1168b4a0f880e7c5b" exitCode=0 Oct 08 14:07:02 crc kubenswrapper[4735]: I1008 14:07:02.156553 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6ls5w" event={"ID":"33889ff5-23f9-43d4-b1e7-cfb68c2276e4","Type":"ContainerDied","Data":"72b0120857ecc659feb081f4dcd12c172b301c0fc25687a1168b4a0f880e7c5b"} Oct 08 14:07:04 crc kubenswrapper[4735]: I1008 14:07:04.176775 4735 generic.go:334] "Generic (PLEG): container finished" podID="33889ff5-23f9-43d4-b1e7-cfb68c2276e4" 
containerID="c90d0f3cce23ec066281c93100e58cbd7e6ac531352e9ed63ed3ee779d96d711" exitCode=0 Oct 08 14:07:04 crc kubenswrapper[4735]: I1008 14:07:04.176857 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6ls5w" event={"ID":"33889ff5-23f9-43d4-b1e7-cfb68c2276e4","Type":"ContainerDied","Data":"c90d0f3cce23ec066281c93100e58cbd7e6ac531352e9ed63ed3ee779d96d711"} Oct 08 14:07:05 crc kubenswrapper[4735]: I1008 14:07:05.188486 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6ls5w" event={"ID":"33889ff5-23f9-43d4-b1e7-cfb68c2276e4","Type":"ContainerStarted","Data":"e802c2b8f58e9b9ac2809a1ed201472fce74cea972375341b70303cd9b23aa61"} Oct 08 14:07:05 crc kubenswrapper[4735]: I1008 14:07:05.215447 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6ls5w" podStartSLOduration=2.750037848 podStartE2EDuration="5.215418393s" podCreationTimestamp="2025-10-08 14:07:00 +0000 UTC" firstStartedPulling="2025-10-08 14:07:02.158878492 +0000 UTC m=+4271.967183046" lastFinishedPulling="2025-10-08 14:07:04.624259047 +0000 UTC m=+4274.432563591" observedRunningTime="2025-10-08 14:07:05.206947505 +0000 UTC m=+4275.015252109" watchObservedRunningTime="2025-10-08 14:07:05.215418393 +0000 UTC m=+4275.023722937" Oct 08 14:07:10 crc kubenswrapper[4735]: I1008 14:07:10.611296 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6ls5w" Oct 08 14:07:10 crc kubenswrapper[4735]: I1008 14:07:10.611906 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6ls5w" Oct 08 14:07:10 crc kubenswrapper[4735]: I1008 14:07:10.657125 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6ls5w" Oct 08 14:07:11 crc kubenswrapper[4735]: I1008 14:07:11.835538 
4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6ls5w" Oct 08 14:07:11 crc kubenswrapper[4735]: I1008 14:07:11.900963 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6ls5w"] Oct 08 14:07:13 crc kubenswrapper[4735]: I1008 14:07:13.245566 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6ls5w" podUID="33889ff5-23f9-43d4-b1e7-cfb68c2276e4" containerName="registry-server" containerID="cri-o://e802c2b8f58e9b9ac2809a1ed201472fce74cea972375341b70303cd9b23aa61" gracePeriod=2 Oct 08 14:07:13 crc kubenswrapper[4735]: I1008 14:07:13.654176 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6ls5w" Oct 08 14:07:13 crc kubenswrapper[4735]: I1008 14:07:13.704267 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhsrz\" (UniqueName: \"kubernetes.io/projected/33889ff5-23f9-43d4-b1e7-cfb68c2276e4-kube-api-access-rhsrz\") pod \"33889ff5-23f9-43d4-b1e7-cfb68c2276e4\" (UID: \"33889ff5-23f9-43d4-b1e7-cfb68c2276e4\") " Oct 08 14:07:13 crc kubenswrapper[4735]: I1008 14:07:13.704353 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33889ff5-23f9-43d4-b1e7-cfb68c2276e4-utilities\") pod \"33889ff5-23f9-43d4-b1e7-cfb68c2276e4\" (UID: \"33889ff5-23f9-43d4-b1e7-cfb68c2276e4\") " Oct 08 14:07:13 crc kubenswrapper[4735]: I1008 14:07:13.704391 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33889ff5-23f9-43d4-b1e7-cfb68c2276e4-catalog-content\") pod \"33889ff5-23f9-43d4-b1e7-cfb68c2276e4\" (UID: \"33889ff5-23f9-43d4-b1e7-cfb68c2276e4\") " Oct 08 14:07:13 crc kubenswrapper[4735]: I1008 14:07:13.705544 
4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33889ff5-23f9-43d4-b1e7-cfb68c2276e4-utilities" (OuterVolumeSpecName: "utilities") pod "33889ff5-23f9-43d4-b1e7-cfb68c2276e4" (UID: "33889ff5-23f9-43d4-b1e7-cfb68c2276e4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 14:07:13 crc kubenswrapper[4735]: I1008 14:07:13.707141 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33889ff5-23f9-43d4-b1e7-cfb68c2276e4-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 14:07:13 crc kubenswrapper[4735]: I1008 14:07:13.713933 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33889ff5-23f9-43d4-b1e7-cfb68c2276e4-kube-api-access-rhsrz" (OuterVolumeSpecName: "kube-api-access-rhsrz") pod "33889ff5-23f9-43d4-b1e7-cfb68c2276e4" (UID: "33889ff5-23f9-43d4-b1e7-cfb68c2276e4"). InnerVolumeSpecName "kube-api-access-rhsrz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:07:13 crc kubenswrapper[4735]: I1008 14:07:13.724690 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33889ff5-23f9-43d4-b1e7-cfb68c2276e4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "33889ff5-23f9-43d4-b1e7-cfb68c2276e4" (UID: "33889ff5-23f9-43d4-b1e7-cfb68c2276e4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 14:07:13 crc kubenswrapper[4735]: I1008 14:07:13.808569 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhsrz\" (UniqueName: \"kubernetes.io/projected/33889ff5-23f9-43d4-b1e7-cfb68c2276e4-kube-api-access-rhsrz\") on node \"crc\" DevicePath \"\"" Oct 08 14:07:13 crc kubenswrapper[4735]: I1008 14:07:13.808613 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33889ff5-23f9-43d4-b1e7-cfb68c2276e4-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 14:07:14 crc kubenswrapper[4735]: I1008 14:07:14.256653 4735 generic.go:334] "Generic (PLEG): container finished" podID="33889ff5-23f9-43d4-b1e7-cfb68c2276e4" containerID="e802c2b8f58e9b9ac2809a1ed201472fce74cea972375341b70303cd9b23aa61" exitCode=0 Oct 08 14:07:14 crc kubenswrapper[4735]: I1008 14:07:14.256689 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6ls5w" event={"ID":"33889ff5-23f9-43d4-b1e7-cfb68c2276e4","Type":"ContainerDied","Data":"e802c2b8f58e9b9ac2809a1ed201472fce74cea972375341b70303cd9b23aa61"} Oct 08 14:07:14 crc kubenswrapper[4735]: I1008 14:07:14.256712 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6ls5w" event={"ID":"33889ff5-23f9-43d4-b1e7-cfb68c2276e4","Type":"ContainerDied","Data":"65a017ddd446c83c3fa3f6c17721cb647ac58cf0fd39a905827f1c532a8a3fc2"} Oct 08 14:07:14 crc kubenswrapper[4735]: I1008 14:07:14.256725 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6ls5w" Oct 08 14:07:14 crc kubenswrapper[4735]: I1008 14:07:14.256732 4735 scope.go:117] "RemoveContainer" containerID="e802c2b8f58e9b9ac2809a1ed201472fce74cea972375341b70303cd9b23aa61" Oct 08 14:07:14 crc kubenswrapper[4735]: I1008 14:07:14.281874 4735 scope.go:117] "RemoveContainer" containerID="c90d0f3cce23ec066281c93100e58cbd7e6ac531352e9ed63ed3ee779d96d711" Oct 08 14:07:14 crc kubenswrapper[4735]: I1008 14:07:14.291294 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6ls5w"] Oct 08 14:07:14 crc kubenswrapper[4735]: I1008 14:07:14.297524 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6ls5w"] Oct 08 14:07:14 crc kubenswrapper[4735]: I1008 14:07:14.310221 4735 scope.go:117] "RemoveContainer" containerID="72b0120857ecc659feb081f4dcd12c172b301c0fc25687a1168b4a0f880e7c5b" Oct 08 14:07:14 crc kubenswrapper[4735]: I1008 14:07:14.342606 4735 scope.go:117] "RemoveContainer" containerID="e802c2b8f58e9b9ac2809a1ed201472fce74cea972375341b70303cd9b23aa61" Oct 08 14:07:14 crc kubenswrapper[4735]: E1008 14:07:14.343074 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e802c2b8f58e9b9ac2809a1ed201472fce74cea972375341b70303cd9b23aa61\": container with ID starting with e802c2b8f58e9b9ac2809a1ed201472fce74cea972375341b70303cd9b23aa61 not found: ID does not exist" containerID="e802c2b8f58e9b9ac2809a1ed201472fce74cea972375341b70303cd9b23aa61" Oct 08 14:07:14 crc kubenswrapper[4735]: I1008 14:07:14.343138 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e802c2b8f58e9b9ac2809a1ed201472fce74cea972375341b70303cd9b23aa61"} err="failed to get container status \"e802c2b8f58e9b9ac2809a1ed201472fce74cea972375341b70303cd9b23aa61\": rpc error: code = NotFound desc = could not find container 
\"e802c2b8f58e9b9ac2809a1ed201472fce74cea972375341b70303cd9b23aa61\": container with ID starting with e802c2b8f58e9b9ac2809a1ed201472fce74cea972375341b70303cd9b23aa61 not found: ID does not exist" Oct 08 14:07:14 crc kubenswrapper[4735]: I1008 14:07:14.343170 4735 scope.go:117] "RemoveContainer" containerID="c90d0f3cce23ec066281c93100e58cbd7e6ac531352e9ed63ed3ee779d96d711" Oct 08 14:07:14 crc kubenswrapper[4735]: E1008 14:07:14.343697 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c90d0f3cce23ec066281c93100e58cbd7e6ac531352e9ed63ed3ee779d96d711\": container with ID starting with c90d0f3cce23ec066281c93100e58cbd7e6ac531352e9ed63ed3ee779d96d711 not found: ID does not exist" containerID="c90d0f3cce23ec066281c93100e58cbd7e6ac531352e9ed63ed3ee779d96d711" Oct 08 14:07:14 crc kubenswrapper[4735]: I1008 14:07:14.343729 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c90d0f3cce23ec066281c93100e58cbd7e6ac531352e9ed63ed3ee779d96d711"} err="failed to get container status \"c90d0f3cce23ec066281c93100e58cbd7e6ac531352e9ed63ed3ee779d96d711\": rpc error: code = NotFound desc = could not find container \"c90d0f3cce23ec066281c93100e58cbd7e6ac531352e9ed63ed3ee779d96d711\": container with ID starting with c90d0f3cce23ec066281c93100e58cbd7e6ac531352e9ed63ed3ee779d96d711 not found: ID does not exist" Oct 08 14:07:14 crc kubenswrapper[4735]: I1008 14:07:14.343770 4735 scope.go:117] "RemoveContainer" containerID="72b0120857ecc659feb081f4dcd12c172b301c0fc25687a1168b4a0f880e7c5b" Oct 08 14:07:14 crc kubenswrapper[4735]: E1008 14:07:14.344075 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72b0120857ecc659feb081f4dcd12c172b301c0fc25687a1168b4a0f880e7c5b\": container with ID starting with 72b0120857ecc659feb081f4dcd12c172b301c0fc25687a1168b4a0f880e7c5b not found: ID does not exist" 
containerID="72b0120857ecc659feb081f4dcd12c172b301c0fc25687a1168b4a0f880e7c5b" Oct 08 14:07:14 crc kubenswrapper[4735]: I1008 14:07:14.344102 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72b0120857ecc659feb081f4dcd12c172b301c0fc25687a1168b4a0f880e7c5b"} err="failed to get container status \"72b0120857ecc659feb081f4dcd12c172b301c0fc25687a1168b4a0f880e7c5b\": rpc error: code = NotFound desc = could not find container \"72b0120857ecc659feb081f4dcd12c172b301c0fc25687a1168b4a0f880e7c5b\": container with ID starting with 72b0120857ecc659feb081f4dcd12c172b301c0fc25687a1168b4a0f880e7c5b not found: ID does not exist" Oct 08 14:07:14 crc kubenswrapper[4735]: I1008 14:07:14.903412 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33889ff5-23f9-43d4-b1e7-cfb68c2276e4" path="/var/lib/kubelet/pods/33889ff5-23f9-43d4-b1e7-cfb68c2276e4/volumes" Oct 08 14:07:24 crc kubenswrapper[4735]: I1008 14:07:24.879183 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 14:07:24 crc kubenswrapper[4735]: I1008 14:07:24.879667 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 14:07:51 crc kubenswrapper[4735]: I1008 14:07:51.713798 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wc7hn"] Oct 08 14:07:51 crc kubenswrapper[4735]: E1008 14:07:51.714894 4735 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="33889ff5-23f9-43d4-b1e7-cfb68c2276e4" containerName="extract-content" Oct 08 14:07:51 crc kubenswrapper[4735]: I1008 14:07:51.714918 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="33889ff5-23f9-43d4-b1e7-cfb68c2276e4" containerName="extract-content" Oct 08 14:07:51 crc kubenswrapper[4735]: E1008 14:07:51.714948 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33889ff5-23f9-43d4-b1e7-cfb68c2276e4" containerName="extract-utilities" Oct 08 14:07:51 crc kubenswrapper[4735]: I1008 14:07:51.714962 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="33889ff5-23f9-43d4-b1e7-cfb68c2276e4" containerName="extract-utilities" Oct 08 14:07:51 crc kubenswrapper[4735]: E1008 14:07:51.714989 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33889ff5-23f9-43d4-b1e7-cfb68c2276e4" containerName="registry-server" Oct 08 14:07:51 crc kubenswrapper[4735]: I1008 14:07:51.715002 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="33889ff5-23f9-43d4-b1e7-cfb68c2276e4" containerName="registry-server" Oct 08 14:07:51 crc kubenswrapper[4735]: I1008 14:07:51.715297 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="33889ff5-23f9-43d4-b1e7-cfb68c2276e4" containerName="registry-server" Oct 08 14:07:51 crc kubenswrapper[4735]: I1008 14:07:51.716934 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wc7hn" Oct 08 14:07:51 crc kubenswrapper[4735]: I1008 14:07:51.754330 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wc7hn"] Oct 08 14:07:51 crc kubenswrapper[4735]: I1008 14:07:51.871751 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh55f\" (UniqueName: \"kubernetes.io/projected/497e6679-c95b-41a8-bd50-cb845a1add47-kube-api-access-gh55f\") pod \"certified-operators-wc7hn\" (UID: \"497e6679-c95b-41a8-bd50-cb845a1add47\") " pod="openshift-marketplace/certified-operators-wc7hn" Oct 08 14:07:51 crc kubenswrapper[4735]: I1008 14:07:51.871958 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/497e6679-c95b-41a8-bd50-cb845a1add47-catalog-content\") pod \"certified-operators-wc7hn\" (UID: \"497e6679-c95b-41a8-bd50-cb845a1add47\") " pod="openshift-marketplace/certified-operators-wc7hn" Oct 08 14:07:51 crc kubenswrapper[4735]: I1008 14:07:51.872008 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/497e6679-c95b-41a8-bd50-cb845a1add47-utilities\") pod \"certified-operators-wc7hn\" (UID: \"497e6679-c95b-41a8-bd50-cb845a1add47\") " pod="openshift-marketplace/certified-operators-wc7hn" Oct 08 14:07:51 crc kubenswrapper[4735]: I1008 14:07:51.973929 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gh55f\" (UniqueName: \"kubernetes.io/projected/497e6679-c95b-41a8-bd50-cb845a1add47-kube-api-access-gh55f\") pod \"certified-operators-wc7hn\" (UID: \"497e6679-c95b-41a8-bd50-cb845a1add47\") " pod="openshift-marketplace/certified-operators-wc7hn" Oct 08 14:07:51 crc kubenswrapper[4735]: I1008 14:07:51.974638 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/497e6679-c95b-41a8-bd50-cb845a1add47-catalog-content\") pod \"certified-operators-wc7hn\" (UID: \"497e6679-c95b-41a8-bd50-cb845a1add47\") " pod="openshift-marketplace/certified-operators-wc7hn" Oct 08 14:07:51 crc kubenswrapper[4735]: I1008 14:07:51.974682 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/497e6679-c95b-41a8-bd50-cb845a1add47-utilities\") pod \"certified-operators-wc7hn\" (UID: \"497e6679-c95b-41a8-bd50-cb845a1add47\") " pod="openshift-marketplace/certified-operators-wc7hn" Oct 08 14:07:51 crc kubenswrapper[4735]: I1008 14:07:51.975147 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/497e6679-c95b-41a8-bd50-cb845a1add47-utilities\") pod \"certified-operators-wc7hn\" (UID: \"497e6679-c95b-41a8-bd50-cb845a1add47\") " pod="openshift-marketplace/certified-operators-wc7hn" Oct 08 14:07:51 crc kubenswrapper[4735]: I1008 14:07:51.975181 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/497e6679-c95b-41a8-bd50-cb845a1add47-catalog-content\") pod \"certified-operators-wc7hn\" (UID: \"497e6679-c95b-41a8-bd50-cb845a1add47\") " pod="openshift-marketplace/certified-operators-wc7hn" Oct 08 14:07:52 crc kubenswrapper[4735]: I1008 14:07:51.995321 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh55f\" (UniqueName: \"kubernetes.io/projected/497e6679-c95b-41a8-bd50-cb845a1add47-kube-api-access-gh55f\") pod \"certified-operators-wc7hn\" (UID: \"497e6679-c95b-41a8-bd50-cb845a1add47\") " pod="openshift-marketplace/certified-operators-wc7hn" Oct 08 14:07:52 crc kubenswrapper[4735]: I1008 14:07:52.084872 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wc7hn" Oct 08 14:07:52 crc kubenswrapper[4735]: I1008 14:07:52.669210 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wc7hn"] Oct 08 14:07:53 crc kubenswrapper[4735]: I1008 14:07:53.575890 4735 generic.go:334] "Generic (PLEG): container finished" podID="497e6679-c95b-41a8-bd50-cb845a1add47" containerID="912b32e0591030d00809409b363a13462b5d5cbcb281715446030216ac645f3a" exitCode=0 Oct 08 14:07:53 crc kubenswrapper[4735]: I1008 14:07:53.575938 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wc7hn" event={"ID":"497e6679-c95b-41a8-bd50-cb845a1add47","Type":"ContainerDied","Data":"912b32e0591030d00809409b363a13462b5d5cbcb281715446030216ac645f3a"} Oct 08 14:07:53 crc kubenswrapper[4735]: I1008 14:07:53.577841 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wc7hn" event={"ID":"497e6679-c95b-41a8-bd50-cb845a1add47","Type":"ContainerStarted","Data":"a0979d42d141d14b42a1332dafd93111098ada01ebe45200647462de2e7d4c09"} Oct 08 14:07:54 crc kubenswrapper[4735]: I1008 14:07:54.878876 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 14:07:54 crc kubenswrapper[4735]: I1008 14:07:54.879196 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 14:07:58 crc kubenswrapper[4735]: I1008 14:07:58.619499 4735 generic.go:334] "Generic 
(PLEG): container finished" podID="497e6679-c95b-41a8-bd50-cb845a1add47" containerID="c08e260fc28b97a48c0ac887dd52dab2782ec638eecbadfdc0fde29a1379a27a" exitCode=0 Oct 08 14:07:58 crc kubenswrapper[4735]: I1008 14:07:58.619646 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wc7hn" event={"ID":"497e6679-c95b-41a8-bd50-cb845a1add47","Type":"ContainerDied","Data":"c08e260fc28b97a48c0ac887dd52dab2782ec638eecbadfdc0fde29a1379a27a"} Oct 08 14:07:58 crc kubenswrapper[4735]: I1008 14:07:58.621843 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 08 14:07:59 crc kubenswrapper[4735]: I1008 14:07:59.631369 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wc7hn" event={"ID":"497e6679-c95b-41a8-bd50-cb845a1add47","Type":"ContainerStarted","Data":"210ca9e631d65c81995143ca232f66480ed7460a18f432383a20fb23bff1b897"} Oct 08 14:07:59 crc kubenswrapper[4735]: I1008 14:07:59.657557 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wc7hn" podStartSLOduration=3.206003745 podStartE2EDuration="8.657540481s" podCreationTimestamp="2025-10-08 14:07:51 +0000 UTC" firstStartedPulling="2025-10-08 14:07:53.577570541 +0000 UTC m=+4323.385875095" lastFinishedPulling="2025-10-08 14:07:59.029107267 +0000 UTC m=+4328.837411831" observedRunningTime="2025-10-08 14:07:59.654703428 +0000 UTC m=+4329.463007992" watchObservedRunningTime="2025-10-08 14:07:59.657540481 +0000 UTC m=+4329.465845045" Oct 08 14:08:02 crc kubenswrapper[4735]: I1008 14:08:02.085402 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wc7hn" Oct 08 14:08:02 crc kubenswrapper[4735]: I1008 14:08:02.085779 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wc7hn" Oct 08 
14:08:02 crc kubenswrapper[4735]: I1008 14:08:02.130652 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wc7hn" Oct 08 14:08:12 crc kubenswrapper[4735]: I1008 14:08:12.155401 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wc7hn" Oct 08 14:08:12 crc kubenswrapper[4735]: I1008 14:08:12.224884 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wc7hn"] Oct 08 14:08:12 crc kubenswrapper[4735]: I1008 14:08:12.265767 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rrq6x"] Oct 08 14:08:12 crc kubenswrapper[4735]: I1008 14:08:12.266049 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rrq6x" podUID="8d70ee3a-74c8-44e6-99d1-b558d7315963" containerName="registry-server" containerID="cri-o://773be8b907c6ee1d600d7dea366a7a2be8c7ce737d63df6e2242dff0f969b205" gracePeriod=2 Oct 08 14:08:12 crc kubenswrapper[4735]: I1008 14:08:12.667955 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rrq6x" Oct 08 14:08:12 crc kubenswrapper[4735]: I1008 14:08:12.720805 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d70ee3a-74c8-44e6-99d1-b558d7315963-utilities\") pod \"8d70ee3a-74c8-44e6-99d1-b558d7315963\" (UID: \"8d70ee3a-74c8-44e6-99d1-b558d7315963\") " Oct 08 14:08:12 crc kubenswrapper[4735]: I1008 14:08:12.720986 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxqp4\" (UniqueName: \"kubernetes.io/projected/8d70ee3a-74c8-44e6-99d1-b558d7315963-kube-api-access-fxqp4\") pod \"8d70ee3a-74c8-44e6-99d1-b558d7315963\" (UID: \"8d70ee3a-74c8-44e6-99d1-b558d7315963\") " Oct 08 14:08:12 crc kubenswrapper[4735]: I1008 14:08:12.721033 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d70ee3a-74c8-44e6-99d1-b558d7315963-catalog-content\") pod \"8d70ee3a-74c8-44e6-99d1-b558d7315963\" (UID: \"8d70ee3a-74c8-44e6-99d1-b558d7315963\") " Oct 08 14:08:12 crc kubenswrapper[4735]: I1008 14:08:12.721603 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d70ee3a-74c8-44e6-99d1-b558d7315963-utilities" (OuterVolumeSpecName: "utilities") pod "8d70ee3a-74c8-44e6-99d1-b558d7315963" (UID: "8d70ee3a-74c8-44e6-99d1-b558d7315963"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 14:08:12 crc kubenswrapper[4735]: I1008 14:08:12.725949 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d70ee3a-74c8-44e6-99d1-b558d7315963-kube-api-access-fxqp4" (OuterVolumeSpecName: "kube-api-access-fxqp4") pod "8d70ee3a-74c8-44e6-99d1-b558d7315963" (UID: "8d70ee3a-74c8-44e6-99d1-b558d7315963"). InnerVolumeSpecName "kube-api-access-fxqp4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:08:12 crc kubenswrapper[4735]: I1008 14:08:12.743933 4735 generic.go:334] "Generic (PLEG): container finished" podID="8d70ee3a-74c8-44e6-99d1-b558d7315963" containerID="773be8b907c6ee1d600d7dea366a7a2be8c7ce737d63df6e2242dff0f969b205" exitCode=0 Oct 08 14:08:12 crc kubenswrapper[4735]: I1008 14:08:12.744027 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rrq6x" Oct 08 14:08:12 crc kubenswrapper[4735]: I1008 14:08:12.744029 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rrq6x" event={"ID":"8d70ee3a-74c8-44e6-99d1-b558d7315963","Type":"ContainerDied","Data":"773be8b907c6ee1d600d7dea366a7a2be8c7ce737d63df6e2242dff0f969b205"} Oct 08 14:08:12 crc kubenswrapper[4735]: I1008 14:08:12.744598 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rrq6x" event={"ID":"8d70ee3a-74c8-44e6-99d1-b558d7315963","Type":"ContainerDied","Data":"5505ddf88d9cac821f93d8cde38cfedcf3da0dd1d3f1d8555cb8cea54fe6d89f"} Oct 08 14:08:12 crc kubenswrapper[4735]: I1008 14:08:12.744623 4735 scope.go:117] "RemoveContainer" containerID="773be8b907c6ee1d600d7dea366a7a2be8c7ce737d63df6e2242dff0f969b205" Oct 08 14:08:12 crc kubenswrapper[4735]: I1008 14:08:12.769811 4735 scope.go:117] "RemoveContainer" containerID="5a53b43f869c92b12d0295f8c7b7f6be3936dd0ca5f99d7fc82efd2fabf84b95" Oct 08 14:08:12 crc kubenswrapper[4735]: I1008 14:08:12.787596 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d70ee3a-74c8-44e6-99d1-b558d7315963-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8d70ee3a-74c8-44e6-99d1-b558d7315963" (UID: "8d70ee3a-74c8-44e6-99d1-b558d7315963"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 14:08:12 crc kubenswrapper[4735]: I1008 14:08:12.796591 4735 scope.go:117] "RemoveContainer" containerID="2994a30a9434cfcd1d31f0f1f0fd73f4d9a4a8eb4521c21970416448ce1f9ea4" Oct 08 14:08:12 crc kubenswrapper[4735]: I1008 14:08:12.822785 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d70ee3a-74c8-44e6-99d1-b558d7315963-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 14:08:12 crc kubenswrapper[4735]: I1008 14:08:12.822811 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d70ee3a-74c8-44e6-99d1-b558d7315963-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 14:08:12 crc kubenswrapper[4735]: I1008 14:08:12.822821 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxqp4\" (UniqueName: \"kubernetes.io/projected/8d70ee3a-74c8-44e6-99d1-b558d7315963-kube-api-access-fxqp4\") on node \"crc\" DevicePath \"\"" Oct 08 14:08:12 crc kubenswrapper[4735]: I1008 14:08:12.826921 4735 scope.go:117] "RemoveContainer" containerID="773be8b907c6ee1d600d7dea366a7a2be8c7ce737d63df6e2242dff0f969b205" Oct 08 14:08:12 crc kubenswrapper[4735]: E1008 14:08:12.828994 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"773be8b907c6ee1d600d7dea366a7a2be8c7ce737d63df6e2242dff0f969b205\": container with ID starting with 773be8b907c6ee1d600d7dea366a7a2be8c7ce737d63df6e2242dff0f969b205 not found: ID does not exist" containerID="773be8b907c6ee1d600d7dea366a7a2be8c7ce737d63df6e2242dff0f969b205" Oct 08 14:08:12 crc kubenswrapper[4735]: I1008 14:08:12.829055 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"773be8b907c6ee1d600d7dea366a7a2be8c7ce737d63df6e2242dff0f969b205"} err="failed to get container status 
\"773be8b907c6ee1d600d7dea366a7a2be8c7ce737d63df6e2242dff0f969b205\": rpc error: code = NotFound desc = could not find container \"773be8b907c6ee1d600d7dea366a7a2be8c7ce737d63df6e2242dff0f969b205\": container with ID starting with 773be8b907c6ee1d600d7dea366a7a2be8c7ce737d63df6e2242dff0f969b205 not found: ID does not exist" Oct 08 14:08:12 crc kubenswrapper[4735]: I1008 14:08:12.829077 4735 scope.go:117] "RemoveContainer" containerID="5a53b43f869c92b12d0295f8c7b7f6be3936dd0ca5f99d7fc82efd2fabf84b95" Oct 08 14:08:12 crc kubenswrapper[4735]: E1008 14:08:12.829266 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a53b43f869c92b12d0295f8c7b7f6be3936dd0ca5f99d7fc82efd2fabf84b95\": container with ID starting with 5a53b43f869c92b12d0295f8c7b7f6be3936dd0ca5f99d7fc82efd2fabf84b95 not found: ID does not exist" containerID="5a53b43f869c92b12d0295f8c7b7f6be3936dd0ca5f99d7fc82efd2fabf84b95" Oct 08 14:08:12 crc kubenswrapper[4735]: I1008 14:08:12.829299 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a53b43f869c92b12d0295f8c7b7f6be3936dd0ca5f99d7fc82efd2fabf84b95"} err="failed to get container status \"5a53b43f869c92b12d0295f8c7b7f6be3936dd0ca5f99d7fc82efd2fabf84b95\": rpc error: code = NotFound desc = could not find container \"5a53b43f869c92b12d0295f8c7b7f6be3936dd0ca5f99d7fc82efd2fabf84b95\": container with ID starting with 5a53b43f869c92b12d0295f8c7b7f6be3936dd0ca5f99d7fc82efd2fabf84b95 not found: ID does not exist" Oct 08 14:08:12 crc kubenswrapper[4735]: I1008 14:08:12.829312 4735 scope.go:117] "RemoveContainer" containerID="2994a30a9434cfcd1d31f0f1f0fd73f4d9a4a8eb4521c21970416448ce1f9ea4" Oct 08 14:08:12 crc kubenswrapper[4735]: E1008 14:08:12.830083 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"2994a30a9434cfcd1d31f0f1f0fd73f4d9a4a8eb4521c21970416448ce1f9ea4\": container with ID starting with 2994a30a9434cfcd1d31f0f1f0fd73f4d9a4a8eb4521c21970416448ce1f9ea4 not found: ID does not exist" containerID="2994a30a9434cfcd1d31f0f1f0fd73f4d9a4a8eb4521c21970416448ce1f9ea4" Oct 08 14:08:12 crc kubenswrapper[4735]: I1008 14:08:12.830104 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2994a30a9434cfcd1d31f0f1f0fd73f4d9a4a8eb4521c21970416448ce1f9ea4"} err="failed to get container status \"2994a30a9434cfcd1d31f0f1f0fd73f4d9a4a8eb4521c21970416448ce1f9ea4\": rpc error: code = NotFound desc = could not find container \"2994a30a9434cfcd1d31f0f1f0fd73f4d9a4a8eb4521c21970416448ce1f9ea4\": container with ID starting with 2994a30a9434cfcd1d31f0f1f0fd73f4d9a4a8eb4521c21970416448ce1f9ea4 not found: ID does not exist" Oct 08 14:08:13 crc kubenswrapper[4735]: I1008 14:08:13.065949 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rrq6x"] Oct 08 14:08:13 crc kubenswrapper[4735]: I1008 14:08:13.072208 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rrq6x"] Oct 08 14:08:14 crc kubenswrapper[4735]: I1008 14:08:14.903373 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d70ee3a-74c8-44e6-99d1-b558d7315963" path="/var/lib/kubelet/pods/8d70ee3a-74c8-44e6-99d1-b558d7315963/volumes" Oct 08 14:08:24 crc kubenswrapper[4735]: I1008 14:08:24.878872 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 14:08:24 crc kubenswrapper[4735]: I1008 14:08:24.879446 4735 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 14:08:24 crc kubenswrapper[4735]: I1008 14:08:24.879533 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" Oct 08 14:08:24 crc kubenswrapper[4735]: I1008 14:08:24.880376 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"07da46df0e41345d762cf55f706128b878996990638e93a6b0837fa6a4d118c5"} pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 14:08:24 crc kubenswrapper[4735]: I1008 14:08:24.880470 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" containerID="cri-o://07da46df0e41345d762cf55f706128b878996990638e93a6b0837fa6a4d118c5" gracePeriod=600 Oct 08 14:08:25 crc kubenswrapper[4735]: I1008 14:08:25.852150 4735 generic.go:334] "Generic (PLEG): container finished" podID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerID="07da46df0e41345d762cf55f706128b878996990638e93a6b0837fa6a4d118c5" exitCode=0 Oct 08 14:08:25 crc kubenswrapper[4735]: I1008 14:08:25.852236 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" event={"ID":"2dfd0474-f64a-4d3b-9990-df182d079b10","Type":"ContainerDied","Data":"07da46df0e41345d762cf55f706128b878996990638e93a6b0837fa6a4d118c5"} Oct 08 14:08:25 crc kubenswrapper[4735]: I1008 14:08:25.852702 4735 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" event={"ID":"2dfd0474-f64a-4d3b-9990-df182d079b10","Type":"ContainerStarted","Data":"7ff2ebd06cea5cf44fc587b3ec8c967642921329412c26c223914219d4254eaf"} Oct 08 14:08:25 crc kubenswrapper[4735]: I1008 14:08:25.852747 4735 scope.go:117] "RemoveContainer" containerID="1b85e2fef173dcccb95dff7094d06daa381225d3e83968f8698fb1f4a0882dc8" Oct 08 14:10:54 crc kubenswrapper[4735]: I1008 14:10:54.878930 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 14:10:54 crc kubenswrapper[4735]: I1008 14:10:54.879738 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 14:11:02 crc kubenswrapper[4735]: I1008 14:11:02.830204 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-tjn9c"] Oct 08 14:11:02 crc kubenswrapper[4735]: I1008 14:11:02.835699 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-tjn9c"] Oct 08 14:11:02 crc kubenswrapper[4735]: I1008 14:11:02.908786 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c1a4018-c324-47f4-b8af-f23613c1257e" path="/var/lib/kubelet/pods/6c1a4018-c324-47f4-b8af-f23613c1257e/volumes" Oct 08 14:11:02 crc kubenswrapper[4735]: I1008 14:11:02.957733 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-g2mlx"] Oct 08 14:11:02 crc kubenswrapper[4735]: E1008 14:11:02.958111 4735 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="8d70ee3a-74c8-44e6-99d1-b558d7315963" containerName="extract-content" Oct 08 14:11:02 crc kubenswrapper[4735]: I1008 14:11:02.958147 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d70ee3a-74c8-44e6-99d1-b558d7315963" containerName="extract-content" Oct 08 14:11:02 crc kubenswrapper[4735]: E1008 14:11:02.958157 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d70ee3a-74c8-44e6-99d1-b558d7315963" containerName="registry-server" Oct 08 14:11:02 crc kubenswrapper[4735]: I1008 14:11:02.958163 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d70ee3a-74c8-44e6-99d1-b558d7315963" containerName="registry-server" Oct 08 14:11:02 crc kubenswrapper[4735]: E1008 14:11:02.958200 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d70ee3a-74c8-44e6-99d1-b558d7315963" containerName="extract-utilities" Oct 08 14:11:02 crc kubenswrapper[4735]: I1008 14:11:02.958206 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d70ee3a-74c8-44e6-99d1-b558d7315963" containerName="extract-utilities" Oct 08 14:11:02 crc kubenswrapper[4735]: I1008 14:11:02.958330 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d70ee3a-74c8-44e6-99d1-b558d7315963" containerName="registry-server" Oct 08 14:11:02 crc kubenswrapper[4735]: I1008 14:11:02.958844 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-g2mlx" Oct 08 14:11:02 crc kubenswrapper[4735]: I1008 14:11:02.963941 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Oct 08 14:11:02 crc kubenswrapper[4735]: I1008 14:11:02.968348 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Oct 08 14:11:02 crc kubenswrapper[4735]: I1008 14:11:02.969672 4735 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-dllms" Oct 08 14:11:02 crc kubenswrapper[4735]: I1008 14:11:02.969766 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Oct 08 14:11:02 crc kubenswrapper[4735]: I1008 14:11:02.971042 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-g2mlx"] Oct 08 14:11:03 crc kubenswrapper[4735]: I1008 14:11:03.063973 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/0f343f9c-1ffd-4944-8c81-d58f1e7f57cd-crc-storage\") pod \"crc-storage-crc-g2mlx\" (UID: \"0f343f9c-1ffd-4944-8c81-d58f1e7f57cd\") " pod="crc-storage/crc-storage-crc-g2mlx" Oct 08 14:11:03 crc kubenswrapper[4735]: I1008 14:11:03.064348 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/0f343f9c-1ffd-4944-8c81-d58f1e7f57cd-node-mnt\") pod \"crc-storage-crc-g2mlx\" (UID: \"0f343f9c-1ffd-4944-8c81-d58f1e7f57cd\") " pod="crc-storage/crc-storage-crc-g2mlx" Oct 08 14:11:03 crc kubenswrapper[4735]: I1008 14:11:03.064571 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9d4pz\" (UniqueName: \"kubernetes.io/projected/0f343f9c-1ffd-4944-8c81-d58f1e7f57cd-kube-api-access-9d4pz\") pod \"crc-storage-crc-g2mlx\" (UID: 
\"0f343f9c-1ffd-4944-8c81-d58f1e7f57cd\") " pod="crc-storage/crc-storage-crc-g2mlx" Oct 08 14:11:03 crc kubenswrapper[4735]: I1008 14:11:03.165545 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/0f343f9c-1ffd-4944-8c81-d58f1e7f57cd-node-mnt\") pod \"crc-storage-crc-g2mlx\" (UID: \"0f343f9c-1ffd-4944-8c81-d58f1e7f57cd\") " pod="crc-storage/crc-storage-crc-g2mlx" Oct 08 14:11:03 crc kubenswrapper[4735]: I1008 14:11:03.165614 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9d4pz\" (UniqueName: \"kubernetes.io/projected/0f343f9c-1ffd-4944-8c81-d58f1e7f57cd-kube-api-access-9d4pz\") pod \"crc-storage-crc-g2mlx\" (UID: \"0f343f9c-1ffd-4944-8c81-d58f1e7f57cd\") " pod="crc-storage/crc-storage-crc-g2mlx" Oct 08 14:11:03 crc kubenswrapper[4735]: I1008 14:11:03.165668 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/0f343f9c-1ffd-4944-8c81-d58f1e7f57cd-crc-storage\") pod \"crc-storage-crc-g2mlx\" (UID: \"0f343f9c-1ffd-4944-8c81-d58f1e7f57cd\") " pod="crc-storage/crc-storage-crc-g2mlx" Oct 08 14:11:03 crc kubenswrapper[4735]: I1008 14:11:03.165831 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/0f343f9c-1ffd-4944-8c81-d58f1e7f57cd-node-mnt\") pod \"crc-storage-crc-g2mlx\" (UID: \"0f343f9c-1ffd-4944-8c81-d58f1e7f57cd\") " pod="crc-storage/crc-storage-crc-g2mlx" Oct 08 14:11:03 crc kubenswrapper[4735]: I1008 14:11:03.166549 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/0f343f9c-1ffd-4944-8c81-d58f1e7f57cd-crc-storage\") pod \"crc-storage-crc-g2mlx\" (UID: \"0f343f9c-1ffd-4944-8c81-d58f1e7f57cd\") " pod="crc-storage/crc-storage-crc-g2mlx" Oct 08 14:11:03 crc kubenswrapper[4735]: I1008 14:11:03.184131 4735 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9d4pz\" (UniqueName: \"kubernetes.io/projected/0f343f9c-1ffd-4944-8c81-d58f1e7f57cd-kube-api-access-9d4pz\") pod \"crc-storage-crc-g2mlx\" (UID: \"0f343f9c-1ffd-4944-8c81-d58f1e7f57cd\") " pod="crc-storage/crc-storage-crc-g2mlx" Oct 08 14:11:03 crc kubenswrapper[4735]: I1008 14:11:03.282500 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-g2mlx" Oct 08 14:11:03 crc kubenswrapper[4735]: I1008 14:11:03.678078 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-g2mlx"] Oct 08 14:11:04 crc kubenswrapper[4735]: I1008 14:11:04.200976 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-g2mlx" event={"ID":"0f343f9c-1ffd-4944-8c81-d58f1e7f57cd","Type":"ContainerStarted","Data":"3d3775b2f60c072de08d7b80bcef08ba12499db68c83c52d12fa0a1b52e7e8fc"} Oct 08 14:11:05 crc kubenswrapper[4735]: I1008 14:11:05.211675 4735 generic.go:334] "Generic (PLEG): container finished" podID="0f343f9c-1ffd-4944-8c81-d58f1e7f57cd" containerID="c2c67fc7fbfe356c42708fd91c43a7a1206958e1b8a49de7303c7cf69c1732db" exitCode=0 Oct 08 14:11:05 crc kubenswrapper[4735]: I1008 14:11:05.211752 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-g2mlx" event={"ID":"0f343f9c-1ffd-4944-8c81-d58f1e7f57cd","Type":"ContainerDied","Data":"c2c67fc7fbfe356c42708fd91c43a7a1206958e1b8a49de7303c7cf69c1732db"} Oct 08 14:11:06 crc kubenswrapper[4735]: I1008 14:11:06.610169 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-g2mlx" Oct 08 14:11:06 crc kubenswrapper[4735]: I1008 14:11:06.712073 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/0f343f9c-1ffd-4944-8c81-d58f1e7f57cd-crc-storage\") pod \"0f343f9c-1ffd-4944-8c81-d58f1e7f57cd\" (UID: \"0f343f9c-1ffd-4944-8c81-d58f1e7f57cd\") " Oct 08 14:11:06 crc kubenswrapper[4735]: I1008 14:11:06.712176 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/0f343f9c-1ffd-4944-8c81-d58f1e7f57cd-node-mnt\") pod \"0f343f9c-1ffd-4944-8c81-d58f1e7f57cd\" (UID: \"0f343f9c-1ffd-4944-8c81-d58f1e7f57cd\") " Oct 08 14:11:06 crc kubenswrapper[4735]: I1008 14:11:06.712208 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9d4pz\" (UniqueName: \"kubernetes.io/projected/0f343f9c-1ffd-4944-8c81-d58f1e7f57cd-kube-api-access-9d4pz\") pod \"0f343f9c-1ffd-4944-8c81-d58f1e7f57cd\" (UID: \"0f343f9c-1ffd-4944-8c81-d58f1e7f57cd\") " Oct 08 14:11:06 crc kubenswrapper[4735]: I1008 14:11:06.712313 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0f343f9c-1ffd-4944-8c81-d58f1e7f57cd-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "0f343f9c-1ffd-4944-8c81-d58f1e7f57cd" (UID: "0f343f9c-1ffd-4944-8c81-d58f1e7f57cd"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 14:11:06 crc kubenswrapper[4735]: I1008 14:11:06.712433 4735 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/0f343f9c-1ffd-4944-8c81-d58f1e7f57cd-node-mnt\") on node \"crc\" DevicePath \"\"" Oct 08 14:11:06 crc kubenswrapper[4735]: I1008 14:11:06.717738 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f343f9c-1ffd-4944-8c81-d58f1e7f57cd-kube-api-access-9d4pz" (OuterVolumeSpecName: "kube-api-access-9d4pz") pod "0f343f9c-1ffd-4944-8c81-d58f1e7f57cd" (UID: "0f343f9c-1ffd-4944-8c81-d58f1e7f57cd"). InnerVolumeSpecName "kube-api-access-9d4pz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:11:06 crc kubenswrapper[4735]: I1008 14:11:06.748339 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f343f9c-1ffd-4944-8c81-d58f1e7f57cd-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "0f343f9c-1ffd-4944-8c81-d58f1e7f57cd" (UID: "0f343f9c-1ffd-4944-8c81-d58f1e7f57cd"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:11:06 crc kubenswrapper[4735]: I1008 14:11:06.813456 4735 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/0f343f9c-1ffd-4944-8c81-d58f1e7f57cd-crc-storage\") on node \"crc\" DevicePath \"\"" Oct 08 14:11:06 crc kubenswrapper[4735]: I1008 14:11:06.813535 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9d4pz\" (UniqueName: \"kubernetes.io/projected/0f343f9c-1ffd-4944-8c81-d58f1e7f57cd-kube-api-access-9d4pz\") on node \"crc\" DevicePath \"\"" Oct 08 14:11:07 crc kubenswrapper[4735]: I1008 14:11:07.229391 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-g2mlx" event={"ID":"0f343f9c-1ffd-4944-8c81-d58f1e7f57cd","Type":"ContainerDied","Data":"3d3775b2f60c072de08d7b80bcef08ba12499db68c83c52d12fa0a1b52e7e8fc"} Oct 08 14:11:07 crc kubenswrapper[4735]: I1008 14:11:07.229436 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d3775b2f60c072de08d7b80bcef08ba12499db68c83c52d12fa0a1b52e7e8fc" Oct 08 14:11:07 crc kubenswrapper[4735]: I1008 14:11:07.229499 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-g2mlx" Oct 08 14:11:07 crc kubenswrapper[4735]: I1008 14:11:07.541426 4735 scope.go:117] "RemoveContainer" containerID="bfbbfc5ac340ef793cb286610eff933f0fa63c6bcc8b318d43d5bd49c977ea59" Oct 08 14:11:08 crc kubenswrapper[4735]: I1008 14:11:08.914631 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-g2mlx"] Oct 08 14:11:08 crc kubenswrapper[4735]: I1008 14:11:08.921381 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-g2mlx"] Oct 08 14:11:09 crc kubenswrapper[4735]: I1008 14:11:09.020696 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-sxh8v"] Oct 08 14:11:09 crc kubenswrapper[4735]: E1008 14:11:09.021091 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f343f9c-1ffd-4944-8c81-d58f1e7f57cd" containerName="storage" Oct 08 14:11:09 crc kubenswrapper[4735]: I1008 14:11:09.021109 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f343f9c-1ffd-4944-8c81-d58f1e7f57cd" containerName="storage" Oct 08 14:11:09 crc kubenswrapper[4735]: I1008 14:11:09.021309 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f343f9c-1ffd-4944-8c81-d58f1e7f57cd" containerName="storage" Oct 08 14:11:09 crc kubenswrapper[4735]: I1008 14:11:09.021831 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-sxh8v" Oct 08 14:11:09 crc kubenswrapper[4735]: I1008 14:11:09.025065 4735 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-dllms" Oct 08 14:11:09 crc kubenswrapper[4735]: I1008 14:11:09.025243 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Oct 08 14:11:09 crc kubenswrapper[4735]: I1008 14:11:09.025369 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Oct 08 14:11:09 crc kubenswrapper[4735]: I1008 14:11:09.025610 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Oct 08 14:11:09 crc kubenswrapper[4735]: I1008 14:11:09.027272 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-sxh8v"] Oct 08 14:11:09 crc kubenswrapper[4735]: I1008 14:11:09.148053 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/1680a7e8-e9d2-4f10-868e-18ae64995b36-node-mnt\") pod \"crc-storage-crc-sxh8v\" (UID: \"1680a7e8-e9d2-4f10-868e-18ae64995b36\") " pod="crc-storage/crc-storage-crc-sxh8v" Oct 08 14:11:09 crc kubenswrapper[4735]: I1008 14:11:09.148156 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/1680a7e8-e9d2-4f10-868e-18ae64995b36-crc-storage\") pod \"crc-storage-crc-sxh8v\" (UID: \"1680a7e8-e9d2-4f10-868e-18ae64995b36\") " pod="crc-storage/crc-storage-crc-sxh8v" Oct 08 14:11:09 crc kubenswrapper[4735]: I1008 14:11:09.148202 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vd47t\" (UniqueName: \"kubernetes.io/projected/1680a7e8-e9d2-4f10-868e-18ae64995b36-kube-api-access-vd47t\") pod \"crc-storage-crc-sxh8v\" (UID: 
\"1680a7e8-e9d2-4f10-868e-18ae64995b36\") " pod="crc-storage/crc-storage-crc-sxh8v" Oct 08 14:11:09 crc kubenswrapper[4735]: I1008 14:11:09.249570 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/1680a7e8-e9d2-4f10-868e-18ae64995b36-node-mnt\") pod \"crc-storage-crc-sxh8v\" (UID: \"1680a7e8-e9d2-4f10-868e-18ae64995b36\") " pod="crc-storage/crc-storage-crc-sxh8v" Oct 08 14:11:09 crc kubenswrapper[4735]: I1008 14:11:09.250006 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/1680a7e8-e9d2-4f10-868e-18ae64995b36-crc-storage\") pod \"crc-storage-crc-sxh8v\" (UID: \"1680a7e8-e9d2-4f10-868e-18ae64995b36\") " pod="crc-storage/crc-storage-crc-sxh8v" Oct 08 14:11:09 crc kubenswrapper[4735]: I1008 14:11:09.250196 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vd47t\" (UniqueName: \"kubernetes.io/projected/1680a7e8-e9d2-4f10-868e-18ae64995b36-kube-api-access-vd47t\") pod \"crc-storage-crc-sxh8v\" (UID: \"1680a7e8-e9d2-4f10-868e-18ae64995b36\") " pod="crc-storage/crc-storage-crc-sxh8v" Oct 08 14:11:09 crc kubenswrapper[4735]: I1008 14:11:09.250047 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/1680a7e8-e9d2-4f10-868e-18ae64995b36-node-mnt\") pod \"crc-storage-crc-sxh8v\" (UID: \"1680a7e8-e9d2-4f10-868e-18ae64995b36\") " pod="crc-storage/crc-storage-crc-sxh8v" Oct 08 14:11:09 crc kubenswrapper[4735]: I1008 14:11:09.251272 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/1680a7e8-e9d2-4f10-868e-18ae64995b36-crc-storage\") pod \"crc-storage-crc-sxh8v\" (UID: \"1680a7e8-e9d2-4f10-868e-18ae64995b36\") " pod="crc-storage/crc-storage-crc-sxh8v" Oct 08 14:11:09 crc kubenswrapper[4735]: I1008 14:11:09.574343 4735 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vd47t\" (UniqueName: \"kubernetes.io/projected/1680a7e8-e9d2-4f10-868e-18ae64995b36-kube-api-access-vd47t\") pod \"crc-storage-crc-sxh8v\" (UID: \"1680a7e8-e9d2-4f10-868e-18ae64995b36\") " pod="crc-storage/crc-storage-crc-sxh8v" Oct 08 14:11:09 crc kubenswrapper[4735]: I1008 14:11:09.643271 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-sxh8v" Oct 08 14:11:09 crc kubenswrapper[4735]: I1008 14:11:09.885217 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-sxh8v"] Oct 08 14:11:10 crc kubenswrapper[4735]: I1008 14:11:10.254876 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-sxh8v" event={"ID":"1680a7e8-e9d2-4f10-868e-18ae64995b36","Type":"ContainerStarted","Data":"f753abd3dba4920a73222f0ec785ab29af442b5a192616abca8d011cad9f6cc7"} Oct 08 14:11:10 crc kubenswrapper[4735]: I1008 14:11:10.906621 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f343f9c-1ffd-4944-8c81-d58f1e7f57cd" path="/var/lib/kubelet/pods/0f343f9c-1ffd-4944-8c81-d58f1e7f57cd/volumes" Oct 08 14:11:11 crc kubenswrapper[4735]: I1008 14:11:11.262851 4735 generic.go:334] "Generic (PLEG): container finished" podID="1680a7e8-e9d2-4f10-868e-18ae64995b36" containerID="71699fceef5150b29e9393de050293c129b7ffcf9add0a48ae6697806bb8caa0" exitCode=0 Oct 08 14:11:11 crc kubenswrapper[4735]: I1008 14:11:11.262893 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-sxh8v" event={"ID":"1680a7e8-e9d2-4f10-868e-18ae64995b36","Type":"ContainerDied","Data":"71699fceef5150b29e9393de050293c129b7ffcf9add0a48ae6697806bb8caa0"} Oct 08 14:11:12 crc kubenswrapper[4735]: I1008 14:11:12.540441 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-sxh8v" Oct 08 14:11:12 crc kubenswrapper[4735]: I1008 14:11:12.697871 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/1680a7e8-e9d2-4f10-868e-18ae64995b36-node-mnt\") pod \"1680a7e8-e9d2-4f10-868e-18ae64995b36\" (UID: \"1680a7e8-e9d2-4f10-868e-18ae64995b36\") " Oct 08 14:11:12 crc kubenswrapper[4735]: I1008 14:11:12.697999 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/1680a7e8-e9d2-4f10-868e-18ae64995b36-crc-storage\") pod \"1680a7e8-e9d2-4f10-868e-18ae64995b36\" (UID: \"1680a7e8-e9d2-4f10-868e-18ae64995b36\") " Oct 08 14:11:12 crc kubenswrapper[4735]: I1008 14:11:12.698022 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1680a7e8-e9d2-4f10-868e-18ae64995b36-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "1680a7e8-e9d2-4f10-868e-18ae64995b36" (UID: "1680a7e8-e9d2-4f10-868e-18ae64995b36"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 14:11:12 crc kubenswrapper[4735]: I1008 14:11:12.698137 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vd47t\" (UniqueName: \"kubernetes.io/projected/1680a7e8-e9d2-4f10-868e-18ae64995b36-kube-api-access-vd47t\") pod \"1680a7e8-e9d2-4f10-868e-18ae64995b36\" (UID: \"1680a7e8-e9d2-4f10-868e-18ae64995b36\") " Oct 08 14:11:12 crc kubenswrapper[4735]: I1008 14:11:12.698469 4735 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/1680a7e8-e9d2-4f10-868e-18ae64995b36-node-mnt\") on node \"crc\" DevicePath \"\"" Oct 08 14:11:12 crc kubenswrapper[4735]: I1008 14:11:12.703732 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1680a7e8-e9d2-4f10-868e-18ae64995b36-kube-api-access-vd47t" (OuterVolumeSpecName: "kube-api-access-vd47t") pod "1680a7e8-e9d2-4f10-868e-18ae64995b36" (UID: "1680a7e8-e9d2-4f10-868e-18ae64995b36"). InnerVolumeSpecName "kube-api-access-vd47t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:11:12 crc kubenswrapper[4735]: I1008 14:11:12.724458 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1680a7e8-e9d2-4f10-868e-18ae64995b36-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "1680a7e8-e9d2-4f10-868e-18ae64995b36" (UID: "1680a7e8-e9d2-4f10-868e-18ae64995b36"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:11:12 crc kubenswrapper[4735]: I1008 14:11:12.799779 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vd47t\" (UniqueName: \"kubernetes.io/projected/1680a7e8-e9d2-4f10-868e-18ae64995b36-kube-api-access-vd47t\") on node \"crc\" DevicePath \"\"" Oct 08 14:11:12 crc kubenswrapper[4735]: I1008 14:11:12.799823 4735 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/1680a7e8-e9d2-4f10-868e-18ae64995b36-crc-storage\") on node \"crc\" DevicePath \"\"" Oct 08 14:11:13 crc kubenswrapper[4735]: I1008 14:11:13.285396 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-sxh8v" event={"ID":"1680a7e8-e9d2-4f10-868e-18ae64995b36","Type":"ContainerDied","Data":"f753abd3dba4920a73222f0ec785ab29af442b5a192616abca8d011cad9f6cc7"} Oct 08 14:11:13 crc kubenswrapper[4735]: I1008 14:11:13.285658 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f753abd3dba4920a73222f0ec785ab29af442b5a192616abca8d011cad9f6cc7" Oct 08 14:11:13 crc kubenswrapper[4735]: I1008 14:11:13.285677 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-sxh8v" Oct 08 14:11:24 crc kubenswrapper[4735]: I1008 14:11:24.878377 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 14:11:24 crc kubenswrapper[4735]: I1008 14:11:24.879013 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 14:11:54 crc kubenswrapper[4735]: I1008 14:11:54.878763 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 14:11:54 crc kubenswrapper[4735]: I1008 14:11:54.879293 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 14:11:54 crc kubenswrapper[4735]: I1008 14:11:54.879334 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" Oct 08 14:11:54 crc kubenswrapper[4735]: I1008 14:11:54.879894 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"7ff2ebd06cea5cf44fc587b3ec8c967642921329412c26c223914219d4254eaf"} pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 14:11:54 crc kubenswrapper[4735]: I1008 14:11:54.879971 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" containerID="cri-o://7ff2ebd06cea5cf44fc587b3ec8c967642921329412c26c223914219d4254eaf" gracePeriod=600 Oct 08 14:11:55 crc kubenswrapper[4735]: E1008 14:11:55.016344 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:11:55 crc kubenswrapper[4735]: I1008 14:11:55.632524 4735 generic.go:334] "Generic (PLEG): container finished" podID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerID="7ff2ebd06cea5cf44fc587b3ec8c967642921329412c26c223914219d4254eaf" exitCode=0 Oct 08 14:11:55 crc kubenswrapper[4735]: I1008 14:11:55.632534 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" event={"ID":"2dfd0474-f64a-4d3b-9990-df182d079b10","Type":"ContainerDied","Data":"7ff2ebd06cea5cf44fc587b3ec8c967642921329412c26c223914219d4254eaf"} Oct 08 14:11:55 crc kubenswrapper[4735]: I1008 14:11:55.632719 4735 scope.go:117] "RemoveContainer" containerID="07da46df0e41345d762cf55f706128b878996990638e93a6b0837fa6a4d118c5" Oct 08 14:11:55 crc kubenswrapper[4735]: I1008 14:11:55.633351 4735 
scope.go:117] "RemoveContainer" containerID="7ff2ebd06cea5cf44fc587b3ec8c967642921329412c26c223914219d4254eaf" Oct 08 14:11:55 crc kubenswrapper[4735]: E1008 14:11:55.633850 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:12:08 crc kubenswrapper[4735]: I1008 14:12:08.895568 4735 scope.go:117] "RemoveContainer" containerID="7ff2ebd06cea5cf44fc587b3ec8c967642921329412c26c223914219d4254eaf" Oct 08 14:12:08 crc kubenswrapper[4735]: E1008 14:12:08.896500 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:12:20 crc kubenswrapper[4735]: I1008 14:12:20.898825 4735 scope.go:117] "RemoveContainer" containerID="7ff2ebd06cea5cf44fc587b3ec8c967642921329412c26c223914219d4254eaf" Oct 08 14:12:20 crc kubenswrapper[4735]: E1008 14:12:20.899673 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:12:35 crc kubenswrapper[4735]: I1008 
14:12:35.894876 4735 scope.go:117] "RemoveContainer" containerID="7ff2ebd06cea5cf44fc587b3ec8c967642921329412c26c223914219d4254eaf" Oct 08 14:12:35 crc kubenswrapper[4735]: E1008 14:12:35.895875 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:12:48 crc kubenswrapper[4735]: I1008 14:12:48.894833 4735 scope.go:117] "RemoveContainer" containerID="7ff2ebd06cea5cf44fc587b3ec8c967642921329412c26c223914219d4254eaf" Oct 08 14:12:48 crc kubenswrapper[4735]: E1008 14:12:48.895820 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:13:03 crc kubenswrapper[4735]: I1008 14:13:03.894961 4735 scope.go:117] "RemoveContainer" containerID="7ff2ebd06cea5cf44fc587b3ec8c967642921329412c26c223914219d4254eaf" Oct 08 14:13:03 crc kubenswrapper[4735]: E1008 14:13:03.895743 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:13:17 crc 
kubenswrapper[4735]: I1008 14:13:17.894692 4735 scope.go:117] "RemoveContainer" containerID="7ff2ebd06cea5cf44fc587b3ec8c967642921329412c26c223914219d4254eaf" Oct 08 14:13:17 crc kubenswrapper[4735]: E1008 14:13:17.896339 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:13:28 crc kubenswrapper[4735]: I1008 14:13:28.895617 4735 scope.go:117] "RemoveContainer" containerID="7ff2ebd06cea5cf44fc587b3ec8c967642921329412c26c223914219d4254eaf" Oct 08 14:13:28 crc kubenswrapper[4735]: E1008 14:13:28.896296 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:13:40 crc kubenswrapper[4735]: I1008 14:13:40.899417 4735 scope.go:117] "RemoveContainer" containerID="7ff2ebd06cea5cf44fc587b3ec8c967642921329412c26c223914219d4254eaf" Oct 08 14:13:40 crc kubenswrapper[4735]: E1008 14:13:40.900196 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 
08 14:13:46 crc kubenswrapper[4735]: I1008 14:13:46.046834 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-d54fx"] Oct 08 14:13:46 crc kubenswrapper[4735]: E1008 14:13:46.048194 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1680a7e8-e9d2-4f10-868e-18ae64995b36" containerName="storage" Oct 08 14:13:46 crc kubenswrapper[4735]: I1008 14:13:46.048224 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="1680a7e8-e9d2-4f10-868e-18ae64995b36" containerName="storage" Oct 08 14:13:46 crc kubenswrapper[4735]: I1008 14:13:46.048923 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="1680a7e8-e9d2-4f10-868e-18ae64995b36" containerName="storage" Oct 08 14:13:46 crc kubenswrapper[4735]: I1008 14:13:46.050987 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d54fx" Oct 08 14:13:46 crc kubenswrapper[4735]: I1008 14:13:46.069899 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d54fx"] Oct 08 14:13:46 crc kubenswrapper[4735]: I1008 14:13:46.152736 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swbk6\" (UniqueName: \"kubernetes.io/projected/a170e5a3-8637-4765-aab8-3bed23b49229-kube-api-access-swbk6\") pod \"community-operators-d54fx\" (UID: \"a170e5a3-8637-4765-aab8-3bed23b49229\") " pod="openshift-marketplace/community-operators-d54fx" Oct 08 14:13:46 crc kubenswrapper[4735]: I1008 14:13:46.152781 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a170e5a3-8637-4765-aab8-3bed23b49229-utilities\") pod \"community-operators-d54fx\" (UID: \"a170e5a3-8637-4765-aab8-3bed23b49229\") " pod="openshift-marketplace/community-operators-d54fx" Oct 08 14:13:46 crc kubenswrapper[4735]: I1008 
14:13:46.152824 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a170e5a3-8637-4765-aab8-3bed23b49229-catalog-content\") pod \"community-operators-d54fx\" (UID: \"a170e5a3-8637-4765-aab8-3bed23b49229\") " pod="openshift-marketplace/community-operators-d54fx" Oct 08 14:13:46 crc kubenswrapper[4735]: I1008 14:13:46.253605 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swbk6\" (UniqueName: \"kubernetes.io/projected/a170e5a3-8637-4765-aab8-3bed23b49229-kube-api-access-swbk6\") pod \"community-operators-d54fx\" (UID: \"a170e5a3-8637-4765-aab8-3bed23b49229\") " pod="openshift-marketplace/community-operators-d54fx" Oct 08 14:13:46 crc kubenswrapper[4735]: I1008 14:13:46.253668 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a170e5a3-8637-4765-aab8-3bed23b49229-utilities\") pod \"community-operators-d54fx\" (UID: \"a170e5a3-8637-4765-aab8-3bed23b49229\") " pod="openshift-marketplace/community-operators-d54fx" Oct 08 14:13:46 crc kubenswrapper[4735]: I1008 14:13:46.253724 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a170e5a3-8637-4765-aab8-3bed23b49229-catalog-content\") pod \"community-operators-d54fx\" (UID: \"a170e5a3-8637-4765-aab8-3bed23b49229\") " pod="openshift-marketplace/community-operators-d54fx" Oct 08 14:13:46 crc kubenswrapper[4735]: I1008 14:13:46.254241 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a170e5a3-8637-4765-aab8-3bed23b49229-catalog-content\") pod \"community-operators-d54fx\" (UID: \"a170e5a3-8637-4765-aab8-3bed23b49229\") " pod="openshift-marketplace/community-operators-d54fx" Oct 08 14:13:46 crc kubenswrapper[4735]: I1008 
14:13:46.254349 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a170e5a3-8637-4765-aab8-3bed23b49229-utilities\") pod \"community-operators-d54fx\" (UID: \"a170e5a3-8637-4765-aab8-3bed23b49229\") " pod="openshift-marketplace/community-operators-d54fx" Oct 08 14:13:46 crc kubenswrapper[4735]: I1008 14:13:46.276621 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swbk6\" (UniqueName: \"kubernetes.io/projected/a170e5a3-8637-4765-aab8-3bed23b49229-kube-api-access-swbk6\") pod \"community-operators-d54fx\" (UID: \"a170e5a3-8637-4765-aab8-3bed23b49229\") " pod="openshift-marketplace/community-operators-d54fx" Oct 08 14:13:46 crc kubenswrapper[4735]: I1008 14:13:46.393487 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d54fx" Oct 08 14:13:46 crc kubenswrapper[4735]: I1008 14:13:46.912075 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d54fx"] Oct 08 14:13:47 crc kubenswrapper[4735]: I1008 14:13:47.582204 4735 generic.go:334] "Generic (PLEG): container finished" podID="a170e5a3-8637-4765-aab8-3bed23b49229" containerID="46abb81cfe28643579c67af3269472342b0af8956780c26360aecba3f9b3ce4a" exitCode=0 Oct 08 14:13:47 crc kubenswrapper[4735]: I1008 14:13:47.582265 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d54fx" event={"ID":"a170e5a3-8637-4765-aab8-3bed23b49229","Type":"ContainerDied","Data":"46abb81cfe28643579c67af3269472342b0af8956780c26360aecba3f9b3ce4a"} Oct 08 14:13:47 crc kubenswrapper[4735]: I1008 14:13:47.582290 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d54fx" event={"ID":"a170e5a3-8637-4765-aab8-3bed23b49229","Type":"ContainerStarted","Data":"fbd2bf75ab4335ba34905109ee9a264be18c037293872b4179748f1a5bbe602f"} 
Oct 08 14:13:47 crc kubenswrapper[4735]: I1008 14:13:47.584604 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 08 14:13:48 crc kubenswrapper[4735]: I1008 14:13:48.590133 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d54fx" event={"ID":"a170e5a3-8637-4765-aab8-3bed23b49229","Type":"ContainerStarted","Data":"1a1dd00d707f7c537853878a02affc1c4ffb2ecb081d040cc0ee84b0b969d195"} Oct 08 14:13:49 crc kubenswrapper[4735]: I1008 14:13:49.599265 4735 generic.go:334] "Generic (PLEG): container finished" podID="a170e5a3-8637-4765-aab8-3bed23b49229" containerID="1a1dd00d707f7c537853878a02affc1c4ffb2ecb081d040cc0ee84b0b969d195" exitCode=0 Oct 08 14:13:49 crc kubenswrapper[4735]: I1008 14:13:49.599310 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d54fx" event={"ID":"a170e5a3-8637-4765-aab8-3bed23b49229","Type":"ContainerDied","Data":"1a1dd00d707f7c537853878a02affc1c4ffb2ecb081d040cc0ee84b0b969d195"} Oct 08 14:13:50 crc kubenswrapper[4735]: I1008 14:13:50.608149 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d54fx" event={"ID":"a170e5a3-8637-4765-aab8-3bed23b49229","Type":"ContainerStarted","Data":"aa9e09a2605790972de939bd57d30ab06504a1f198ddefc370d134afe6a4c809"} Oct 08 14:13:50 crc kubenswrapper[4735]: I1008 14:13:50.629847 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-d54fx" podStartSLOduration=2.028970108 podStartE2EDuration="4.629830866s" podCreationTimestamp="2025-10-08 14:13:46 +0000 UTC" firstStartedPulling="2025-10-08 14:13:47.58436181 +0000 UTC m=+4677.392666354" lastFinishedPulling="2025-10-08 14:13:50.185222568 +0000 UTC m=+4679.993527112" observedRunningTime="2025-10-08 14:13:50.627046395 +0000 UTC m=+4680.435350929" watchObservedRunningTime="2025-10-08 14:13:50.629830866 +0000 
UTC m=+4680.438135400" Oct 08 14:13:51 crc kubenswrapper[4735]: I1008 14:13:51.894768 4735 scope.go:117] "RemoveContainer" containerID="7ff2ebd06cea5cf44fc587b3ec8c967642921329412c26c223914219d4254eaf" Oct 08 14:13:51 crc kubenswrapper[4735]: E1008 14:13:51.896117 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:13:56 crc kubenswrapper[4735]: I1008 14:13:56.394112 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-d54fx" Oct 08 14:13:56 crc kubenswrapper[4735]: I1008 14:13:56.394387 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-d54fx" Oct 08 14:13:56 crc kubenswrapper[4735]: I1008 14:13:56.447591 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-d54fx" Oct 08 14:13:56 crc kubenswrapper[4735]: I1008 14:13:56.703811 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-d54fx" Oct 08 14:13:56 crc kubenswrapper[4735]: I1008 14:13:56.755169 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d54fx"] Oct 08 14:13:58 crc kubenswrapper[4735]: I1008 14:13:58.662559 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-d54fx" podUID="a170e5a3-8637-4765-aab8-3bed23b49229" containerName="registry-server" containerID="cri-o://aa9e09a2605790972de939bd57d30ab06504a1f198ddefc370d134afe6a4c809" gracePeriod=2 
Oct 08 14:13:59 crc kubenswrapper[4735]: I1008 14:13:59.564333 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d54fx" Oct 08 14:13:59 crc kubenswrapper[4735]: I1008 14:13:59.664635 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a170e5a3-8637-4765-aab8-3bed23b49229-utilities\") pod \"a170e5a3-8637-4765-aab8-3bed23b49229\" (UID: \"a170e5a3-8637-4765-aab8-3bed23b49229\") " Oct 08 14:13:59 crc kubenswrapper[4735]: I1008 14:13:59.664788 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a170e5a3-8637-4765-aab8-3bed23b49229-catalog-content\") pod \"a170e5a3-8637-4765-aab8-3bed23b49229\" (UID: \"a170e5a3-8637-4765-aab8-3bed23b49229\") " Oct 08 14:13:59 crc kubenswrapper[4735]: I1008 14:13:59.664877 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swbk6\" (UniqueName: \"kubernetes.io/projected/a170e5a3-8637-4765-aab8-3bed23b49229-kube-api-access-swbk6\") pod \"a170e5a3-8637-4765-aab8-3bed23b49229\" (UID: \"a170e5a3-8637-4765-aab8-3bed23b49229\") " Oct 08 14:13:59 crc kubenswrapper[4735]: I1008 14:13:59.664880 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a170e5a3-8637-4765-aab8-3bed23b49229-utilities" (OuterVolumeSpecName: "utilities") pod "a170e5a3-8637-4765-aab8-3bed23b49229" (UID: "a170e5a3-8637-4765-aab8-3bed23b49229"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 14:13:59 crc kubenswrapper[4735]: I1008 14:13:59.665410 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a170e5a3-8637-4765-aab8-3bed23b49229-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 14:13:59 crc kubenswrapper[4735]: I1008 14:13:59.669915 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a170e5a3-8637-4765-aab8-3bed23b49229-kube-api-access-swbk6" (OuterVolumeSpecName: "kube-api-access-swbk6") pod "a170e5a3-8637-4765-aab8-3bed23b49229" (UID: "a170e5a3-8637-4765-aab8-3bed23b49229"). InnerVolumeSpecName "kube-api-access-swbk6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:13:59 crc kubenswrapper[4735]: I1008 14:13:59.674642 4735 generic.go:334] "Generic (PLEG): container finished" podID="a170e5a3-8637-4765-aab8-3bed23b49229" containerID="aa9e09a2605790972de939bd57d30ab06504a1f198ddefc370d134afe6a4c809" exitCode=0 Oct 08 14:13:59 crc kubenswrapper[4735]: I1008 14:13:59.674681 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d54fx" event={"ID":"a170e5a3-8637-4765-aab8-3bed23b49229","Type":"ContainerDied","Data":"aa9e09a2605790972de939bd57d30ab06504a1f198ddefc370d134afe6a4c809"} Oct 08 14:13:59 crc kubenswrapper[4735]: I1008 14:13:59.674725 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-d54fx" Oct 08 14:13:59 crc kubenswrapper[4735]: I1008 14:13:59.674746 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d54fx" event={"ID":"a170e5a3-8637-4765-aab8-3bed23b49229","Type":"ContainerDied","Data":"fbd2bf75ab4335ba34905109ee9a264be18c037293872b4179748f1a5bbe602f"} Oct 08 14:13:59 crc kubenswrapper[4735]: I1008 14:13:59.674766 4735 scope.go:117] "RemoveContainer" containerID="aa9e09a2605790972de939bd57d30ab06504a1f198ddefc370d134afe6a4c809" Oct 08 14:13:59 crc kubenswrapper[4735]: I1008 14:13:59.700933 4735 scope.go:117] "RemoveContainer" containerID="1a1dd00d707f7c537853878a02affc1c4ffb2ecb081d040cc0ee84b0b969d195" Oct 08 14:13:59 crc kubenswrapper[4735]: I1008 14:13:59.721946 4735 scope.go:117] "RemoveContainer" containerID="46abb81cfe28643579c67af3269472342b0af8956780c26360aecba3f9b3ce4a" Oct 08 14:13:59 crc kubenswrapper[4735]: I1008 14:13:59.735660 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a170e5a3-8637-4765-aab8-3bed23b49229-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a170e5a3-8637-4765-aab8-3bed23b49229" (UID: "a170e5a3-8637-4765-aab8-3bed23b49229"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 14:13:59 crc kubenswrapper[4735]: I1008 14:13:59.754097 4735 scope.go:117] "RemoveContainer" containerID="aa9e09a2605790972de939bd57d30ab06504a1f198ddefc370d134afe6a4c809" Oct 08 14:13:59 crc kubenswrapper[4735]: E1008 14:13:59.754991 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa9e09a2605790972de939bd57d30ab06504a1f198ddefc370d134afe6a4c809\": container with ID starting with aa9e09a2605790972de939bd57d30ab06504a1f198ddefc370d134afe6a4c809 not found: ID does not exist" containerID="aa9e09a2605790972de939bd57d30ab06504a1f198ddefc370d134afe6a4c809" Oct 08 14:13:59 crc kubenswrapper[4735]: I1008 14:13:59.755041 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa9e09a2605790972de939bd57d30ab06504a1f198ddefc370d134afe6a4c809"} err="failed to get container status \"aa9e09a2605790972de939bd57d30ab06504a1f198ddefc370d134afe6a4c809\": rpc error: code = NotFound desc = could not find container \"aa9e09a2605790972de939bd57d30ab06504a1f198ddefc370d134afe6a4c809\": container with ID starting with aa9e09a2605790972de939bd57d30ab06504a1f198ddefc370d134afe6a4c809 not found: ID does not exist" Oct 08 14:13:59 crc kubenswrapper[4735]: I1008 14:13:59.755077 4735 scope.go:117] "RemoveContainer" containerID="1a1dd00d707f7c537853878a02affc1c4ffb2ecb081d040cc0ee84b0b969d195" Oct 08 14:13:59 crc kubenswrapper[4735]: E1008 14:13:59.755569 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a1dd00d707f7c537853878a02affc1c4ffb2ecb081d040cc0ee84b0b969d195\": container with ID starting with 1a1dd00d707f7c537853878a02affc1c4ffb2ecb081d040cc0ee84b0b969d195 not found: ID does not exist" containerID="1a1dd00d707f7c537853878a02affc1c4ffb2ecb081d040cc0ee84b0b969d195" Oct 08 14:13:59 crc kubenswrapper[4735]: I1008 14:13:59.755700 
4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a1dd00d707f7c537853878a02affc1c4ffb2ecb081d040cc0ee84b0b969d195"} err="failed to get container status \"1a1dd00d707f7c537853878a02affc1c4ffb2ecb081d040cc0ee84b0b969d195\": rpc error: code = NotFound desc = could not find container \"1a1dd00d707f7c537853878a02affc1c4ffb2ecb081d040cc0ee84b0b969d195\": container with ID starting with 1a1dd00d707f7c537853878a02affc1c4ffb2ecb081d040cc0ee84b0b969d195 not found: ID does not exist" Oct 08 14:13:59 crc kubenswrapper[4735]: I1008 14:13:59.755808 4735 scope.go:117] "RemoveContainer" containerID="46abb81cfe28643579c67af3269472342b0af8956780c26360aecba3f9b3ce4a" Oct 08 14:13:59 crc kubenswrapper[4735]: E1008 14:13:59.756205 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46abb81cfe28643579c67af3269472342b0af8956780c26360aecba3f9b3ce4a\": container with ID starting with 46abb81cfe28643579c67af3269472342b0af8956780c26360aecba3f9b3ce4a not found: ID does not exist" containerID="46abb81cfe28643579c67af3269472342b0af8956780c26360aecba3f9b3ce4a" Oct 08 14:13:59 crc kubenswrapper[4735]: I1008 14:13:59.756247 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46abb81cfe28643579c67af3269472342b0af8956780c26360aecba3f9b3ce4a"} err="failed to get container status \"46abb81cfe28643579c67af3269472342b0af8956780c26360aecba3f9b3ce4a\": rpc error: code = NotFound desc = could not find container \"46abb81cfe28643579c67af3269472342b0af8956780c26360aecba3f9b3ce4a\": container with ID starting with 46abb81cfe28643579c67af3269472342b0af8956780c26360aecba3f9b3ce4a not found: ID does not exist" Oct 08 14:13:59 crc kubenswrapper[4735]: I1008 14:13:59.767258 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/a170e5a3-8637-4765-aab8-3bed23b49229-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 14:13:59 crc kubenswrapper[4735]: I1008 14:13:59.767290 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swbk6\" (UniqueName: \"kubernetes.io/projected/a170e5a3-8637-4765-aab8-3bed23b49229-kube-api-access-swbk6\") on node \"crc\" DevicePath \"\"" Oct 08 14:14:00 crc kubenswrapper[4735]: I1008 14:14:00.016709 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d54fx"] Oct 08 14:14:00 crc kubenswrapper[4735]: I1008 14:14:00.022984 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-d54fx"] Oct 08 14:14:00 crc kubenswrapper[4735]: I1008 14:14:00.911961 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a170e5a3-8637-4765-aab8-3bed23b49229" path="/var/lib/kubelet/pods/a170e5a3-8637-4765-aab8-3bed23b49229/volumes" Oct 08 14:14:05 crc kubenswrapper[4735]: I1008 14:14:05.894461 4735 scope.go:117] "RemoveContainer" containerID="7ff2ebd06cea5cf44fc587b3ec8c967642921329412c26c223914219d4254eaf" Oct 08 14:14:05 crc kubenswrapper[4735]: E1008 14:14:05.895517 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:14:16 crc kubenswrapper[4735]: I1008 14:14:16.970242 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8b7964457-kccps"] Oct 08 14:14:16 crc kubenswrapper[4735]: E1008 14:14:16.970910 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a170e5a3-8637-4765-aab8-3bed23b49229" 
containerName="registry-server" Oct 08 14:14:16 crc kubenswrapper[4735]: I1008 14:14:16.970922 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="a170e5a3-8637-4765-aab8-3bed23b49229" containerName="registry-server" Oct 08 14:14:16 crc kubenswrapper[4735]: E1008 14:14:16.970941 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a170e5a3-8637-4765-aab8-3bed23b49229" containerName="extract-content" Oct 08 14:14:16 crc kubenswrapper[4735]: I1008 14:14:16.970948 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="a170e5a3-8637-4765-aab8-3bed23b49229" containerName="extract-content" Oct 08 14:14:16 crc kubenswrapper[4735]: E1008 14:14:16.970970 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a170e5a3-8637-4765-aab8-3bed23b49229" containerName="extract-utilities" Oct 08 14:14:16 crc kubenswrapper[4735]: I1008 14:14:16.970976 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="a170e5a3-8637-4765-aab8-3bed23b49229" containerName="extract-utilities" Oct 08 14:14:16 crc kubenswrapper[4735]: I1008 14:14:16.971111 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="a170e5a3-8637-4765-aab8-3bed23b49229" containerName="registry-server" Oct 08 14:14:16 crc kubenswrapper[4735]: I1008 14:14:16.971797 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8b7964457-kccps" Oct 08 14:14:16 crc kubenswrapper[4735]: I1008 14:14:16.973458 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Oct 08 14:14:16 crc kubenswrapper[4735]: I1008 14:14:16.975393 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 08 14:14:16 crc kubenswrapper[4735]: I1008 14:14:16.975407 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-mdqv2" Oct 08 14:14:16 crc kubenswrapper[4735]: I1008 14:14:16.976920 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 08 14:14:16 crc kubenswrapper[4735]: I1008 14:14:16.976984 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Oct 08 14:14:16 crc kubenswrapper[4735]: I1008 14:14:16.985237 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b7964457-kccps"] Oct 08 14:14:17 crc kubenswrapper[4735]: I1008 14:14:17.105740 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b65dc460-560f-4390-82ee-d72f76482e30-config\") pod \"dnsmasq-dns-8b7964457-kccps\" (UID: \"b65dc460-560f-4390-82ee-d72f76482e30\") " pod="openstack/dnsmasq-dns-8b7964457-kccps" Oct 08 14:14:17 crc kubenswrapper[4735]: I1008 14:14:17.105970 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfcj6\" (UniqueName: \"kubernetes.io/projected/b65dc460-560f-4390-82ee-d72f76482e30-kube-api-access-hfcj6\") pod \"dnsmasq-dns-8b7964457-kccps\" (UID: \"b65dc460-560f-4390-82ee-d72f76482e30\") " pod="openstack/dnsmasq-dns-8b7964457-kccps" Oct 08 14:14:17 crc kubenswrapper[4735]: I1008 14:14:17.106012 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b65dc460-560f-4390-82ee-d72f76482e30-dns-svc\") pod \"dnsmasq-dns-8b7964457-kccps\" (UID: \"b65dc460-560f-4390-82ee-d72f76482e30\") " pod="openstack/dnsmasq-dns-8b7964457-kccps" Oct 08 14:14:17 crc kubenswrapper[4735]: I1008 14:14:17.208142 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfcj6\" (UniqueName: \"kubernetes.io/projected/b65dc460-560f-4390-82ee-d72f76482e30-kube-api-access-hfcj6\") pod \"dnsmasq-dns-8b7964457-kccps\" (UID: \"b65dc460-560f-4390-82ee-d72f76482e30\") " pod="openstack/dnsmasq-dns-8b7964457-kccps" Oct 08 14:14:17 crc kubenswrapper[4735]: I1008 14:14:17.208195 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b65dc460-560f-4390-82ee-d72f76482e30-dns-svc\") pod \"dnsmasq-dns-8b7964457-kccps\" (UID: \"b65dc460-560f-4390-82ee-d72f76482e30\") " pod="openstack/dnsmasq-dns-8b7964457-kccps" Oct 08 14:14:17 crc kubenswrapper[4735]: I1008 14:14:17.208265 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b65dc460-560f-4390-82ee-d72f76482e30-config\") pod \"dnsmasq-dns-8b7964457-kccps\" (UID: \"b65dc460-560f-4390-82ee-d72f76482e30\") " pod="openstack/dnsmasq-dns-8b7964457-kccps" Oct 08 14:14:17 crc kubenswrapper[4735]: I1008 14:14:17.209531 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b65dc460-560f-4390-82ee-d72f76482e30-config\") pod \"dnsmasq-dns-8b7964457-kccps\" (UID: \"b65dc460-560f-4390-82ee-d72f76482e30\") " pod="openstack/dnsmasq-dns-8b7964457-kccps" Oct 08 14:14:17 crc kubenswrapper[4735]: I1008 14:14:17.210520 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b65dc460-560f-4390-82ee-d72f76482e30-dns-svc\") pod 
\"dnsmasq-dns-8b7964457-kccps\" (UID: \"b65dc460-560f-4390-82ee-d72f76482e30\") " pod="openstack/dnsmasq-dns-8b7964457-kccps" Oct 08 14:14:17 crc kubenswrapper[4735]: I1008 14:14:17.234501 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfcj6\" (UniqueName: \"kubernetes.io/projected/b65dc460-560f-4390-82ee-d72f76482e30-kube-api-access-hfcj6\") pod \"dnsmasq-dns-8b7964457-kccps\" (UID: \"b65dc460-560f-4390-82ee-d72f76482e30\") " pod="openstack/dnsmasq-dns-8b7964457-kccps" Oct 08 14:14:17 crc kubenswrapper[4735]: I1008 14:14:17.269721 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67d9f7fb89-scc2x"] Oct 08 14:14:17 crc kubenswrapper[4735]: I1008 14:14:17.271178 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67d9f7fb89-scc2x" Oct 08 14:14:17 crc kubenswrapper[4735]: I1008 14:14:17.288262 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b7964457-kccps" Oct 08 14:14:17 crc kubenswrapper[4735]: I1008 14:14:17.289322 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67d9f7fb89-scc2x"] Oct 08 14:14:17 crc kubenswrapper[4735]: I1008 14:14:17.418704 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1d10fc66-a842-42a1-9f2b-7d28359712f0-dns-svc\") pod \"dnsmasq-dns-67d9f7fb89-scc2x\" (UID: \"1d10fc66-a842-42a1-9f2b-7d28359712f0\") " pod="openstack/dnsmasq-dns-67d9f7fb89-scc2x" Oct 08 14:14:17 crc kubenswrapper[4735]: I1008 14:14:17.419115 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d10fc66-a842-42a1-9f2b-7d28359712f0-config\") pod \"dnsmasq-dns-67d9f7fb89-scc2x\" (UID: \"1d10fc66-a842-42a1-9f2b-7d28359712f0\") " pod="openstack/dnsmasq-dns-67d9f7fb89-scc2x" 
Oct 08 14:14:17 crc kubenswrapper[4735]: I1008 14:14:17.419159 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctfpl\" (UniqueName: \"kubernetes.io/projected/1d10fc66-a842-42a1-9f2b-7d28359712f0-kube-api-access-ctfpl\") pod \"dnsmasq-dns-67d9f7fb89-scc2x\" (UID: \"1d10fc66-a842-42a1-9f2b-7d28359712f0\") " pod="openstack/dnsmasq-dns-67d9f7fb89-scc2x" Oct 08 14:14:17 crc kubenswrapper[4735]: I1008 14:14:17.522525 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1d10fc66-a842-42a1-9f2b-7d28359712f0-dns-svc\") pod \"dnsmasq-dns-67d9f7fb89-scc2x\" (UID: \"1d10fc66-a842-42a1-9f2b-7d28359712f0\") " pod="openstack/dnsmasq-dns-67d9f7fb89-scc2x" Oct 08 14:14:17 crc kubenswrapper[4735]: I1008 14:14:17.522624 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d10fc66-a842-42a1-9f2b-7d28359712f0-config\") pod \"dnsmasq-dns-67d9f7fb89-scc2x\" (UID: \"1d10fc66-a842-42a1-9f2b-7d28359712f0\") " pod="openstack/dnsmasq-dns-67d9f7fb89-scc2x" Oct 08 14:14:17 crc kubenswrapper[4735]: I1008 14:14:17.522701 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctfpl\" (UniqueName: \"kubernetes.io/projected/1d10fc66-a842-42a1-9f2b-7d28359712f0-kube-api-access-ctfpl\") pod \"dnsmasq-dns-67d9f7fb89-scc2x\" (UID: \"1d10fc66-a842-42a1-9f2b-7d28359712f0\") " pod="openstack/dnsmasq-dns-67d9f7fb89-scc2x" Oct 08 14:14:17 crc kubenswrapper[4735]: I1008 14:14:17.524430 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d10fc66-a842-42a1-9f2b-7d28359712f0-config\") pod \"dnsmasq-dns-67d9f7fb89-scc2x\" (UID: \"1d10fc66-a842-42a1-9f2b-7d28359712f0\") " pod="openstack/dnsmasq-dns-67d9f7fb89-scc2x" Oct 08 14:14:17 crc kubenswrapper[4735]: I1008 
14:14:17.524516 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1d10fc66-a842-42a1-9f2b-7d28359712f0-dns-svc\") pod \"dnsmasq-dns-67d9f7fb89-scc2x\" (UID: \"1d10fc66-a842-42a1-9f2b-7d28359712f0\") " pod="openstack/dnsmasq-dns-67d9f7fb89-scc2x" Oct 08 14:14:17 crc kubenswrapper[4735]: I1008 14:14:17.586135 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctfpl\" (UniqueName: \"kubernetes.io/projected/1d10fc66-a842-42a1-9f2b-7d28359712f0-kube-api-access-ctfpl\") pod \"dnsmasq-dns-67d9f7fb89-scc2x\" (UID: \"1d10fc66-a842-42a1-9f2b-7d28359712f0\") " pod="openstack/dnsmasq-dns-67d9f7fb89-scc2x" Oct 08 14:14:17 crc kubenswrapper[4735]: I1008 14:14:17.617776 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67d9f7fb89-scc2x" Oct 08 14:14:17 crc kubenswrapper[4735]: I1008 14:14:17.869016 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b7964457-kccps"] Oct 08 14:14:17 crc kubenswrapper[4735]: W1008 14:14:17.882652 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb65dc460_560f_4390_82ee_d72f76482e30.slice/crio-40db99e0b64315c4c1aaf297075e0dc2333480571f9d2ef5c072b6dc1f61a871 WatchSource:0}: Error finding container 40db99e0b64315c4c1aaf297075e0dc2333480571f9d2ef5c072b6dc1f61a871: Status 404 returned error can't find the container with id 40db99e0b64315c4c1aaf297075e0dc2333480571f9d2ef5c072b6dc1f61a871 Oct 08 14:14:17 crc kubenswrapper[4735]: I1008 14:14:17.900167 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67d9f7fb89-scc2x"] Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.138304 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.139776 4735 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.143345 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.143366 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.143407 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.143506 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.146792 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-x2hvh" Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.150823 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.233350 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-0f93cc8d-707c-4bae-944b-50c607295991\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0f93cc8d-707c-4bae-944b-50c607295991\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d3c96e0-cba0-4d26-90a6-3c73bca0994d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.233403 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5d3c96e0-cba0-4d26-90a6-3c73bca0994d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d3c96e0-cba0-4d26-90a6-3c73bca0994d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 
14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.233429 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5d3c96e0-cba0-4d26-90a6-3c73bca0994d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d3c96e0-cba0-4d26-90a6-3c73bca0994d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.233447 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5d3c96e0-cba0-4d26-90a6-3c73bca0994d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d3c96e0-cba0-4d26-90a6-3c73bca0994d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.233470 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5d3c96e0-cba0-4d26-90a6-3c73bca0994d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d3c96e0-cba0-4d26-90a6-3c73bca0994d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.233506 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5d3c96e0-cba0-4d26-90a6-3c73bca0994d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d3c96e0-cba0-4d26-90a6-3c73bca0994d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.233529 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5d3c96e0-cba0-4d26-90a6-3c73bca0994d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d3c96e0-cba0-4d26-90a6-3c73bca0994d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 14:14:18 crc 
kubenswrapper[4735]: I1008 14:14:18.233573 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5d3c96e0-cba0-4d26-90a6-3c73bca0994d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d3c96e0-cba0-4d26-90a6-3c73bca0994d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.233592 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52mh9\" (UniqueName: \"kubernetes.io/projected/5d3c96e0-cba0-4d26-90a6-3c73bca0994d-kube-api-access-52mh9\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d3c96e0-cba0-4d26-90a6-3c73bca0994d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.334797 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-0f93cc8d-707c-4bae-944b-50c607295991\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0f93cc8d-707c-4bae-944b-50c607295991\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d3c96e0-cba0-4d26-90a6-3c73bca0994d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.335173 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5d3c96e0-cba0-4d26-90a6-3c73bca0994d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d3c96e0-cba0-4d26-90a6-3c73bca0994d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.335202 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5d3c96e0-cba0-4d26-90a6-3c73bca0994d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d3c96e0-cba0-4d26-90a6-3c73bca0994d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 14:14:18 crc 
kubenswrapper[4735]: I1008 14:14:18.335227 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5d3c96e0-cba0-4d26-90a6-3c73bca0994d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d3c96e0-cba0-4d26-90a6-3c73bca0994d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.335258 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5d3c96e0-cba0-4d26-90a6-3c73bca0994d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d3c96e0-cba0-4d26-90a6-3c73bca0994d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.335282 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5d3c96e0-cba0-4d26-90a6-3c73bca0994d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d3c96e0-cba0-4d26-90a6-3c73bca0994d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.335306 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5d3c96e0-cba0-4d26-90a6-3c73bca0994d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d3c96e0-cba0-4d26-90a6-3c73bca0994d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.335329 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5d3c96e0-cba0-4d26-90a6-3c73bca0994d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d3c96e0-cba0-4d26-90a6-3c73bca0994d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.335347 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-52mh9\" (UniqueName: \"kubernetes.io/projected/5d3c96e0-cba0-4d26-90a6-3c73bca0994d-kube-api-access-52mh9\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d3c96e0-cba0-4d26-90a6-3c73bca0994d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.335794 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5d3c96e0-cba0-4d26-90a6-3c73bca0994d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d3c96e0-cba0-4d26-90a6-3c73bca0994d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.336032 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5d3c96e0-cba0-4d26-90a6-3c73bca0994d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d3c96e0-cba0-4d26-90a6-3c73bca0994d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.337029 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5d3c96e0-cba0-4d26-90a6-3c73bca0994d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d3c96e0-cba0-4d26-90a6-3c73bca0994d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.337396 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5d3c96e0-cba0-4d26-90a6-3c73bca0994d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d3c96e0-cba0-4d26-90a6-3c73bca0994d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.339026 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/5d3c96e0-cba0-4d26-90a6-3c73bca0994d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d3c96e0-cba0-4d26-90a6-3c73bca0994d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.339161 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5d3c96e0-cba0-4d26-90a6-3c73bca0994d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d3c96e0-cba0-4d26-90a6-3c73bca0994d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.339161 4735 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.339235 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-0f93cc8d-707c-4bae-944b-50c607295991\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0f93cc8d-707c-4bae-944b-50c607295991\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d3c96e0-cba0-4d26-90a6-3c73bca0994d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d521554e4737493212cb4c3d77bb11e31c05cf0de2d3b8b216d60249a9c8b916/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.346068 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5d3c96e0-cba0-4d26-90a6-3c73bca0994d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d3c96e0-cba0-4d26-90a6-3c73bca0994d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.357385 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52mh9\" (UniqueName: 
\"kubernetes.io/projected/5d3c96e0-cba0-4d26-90a6-3c73bca0994d-kube-api-access-52mh9\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d3c96e0-cba0-4d26-90a6-3c73bca0994d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.364790 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-0f93cc8d-707c-4bae-944b-50c607295991\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0f93cc8d-707c-4bae-944b-50c607295991\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d3c96e0-cba0-4d26-90a6-3c73bca0994d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.463412 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.464537 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.466728 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.467077 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.467377 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.468602 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.468811 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-s9kh2" Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.488279 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 
14:14:18.512793 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.539022 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/09b05bff-5dbc-4129-9e21-92b234846931-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"09b05bff-5dbc-4129-9e21-92b234846931\") " pod="openstack/rabbitmq-server-0" Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.539104 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/09b05bff-5dbc-4129-9e21-92b234846931-pod-info\") pod \"rabbitmq-server-0\" (UID: \"09b05bff-5dbc-4129-9e21-92b234846931\") " pod="openstack/rabbitmq-server-0" Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.539243 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/09b05bff-5dbc-4129-9e21-92b234846931-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"09b05bff-5dbc-4129-9e21-92b234846931\") " pod="openstack/rabbitmq-server-0" Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.539285 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/09b05bff-5dbc-4129-9e21-92b234846931-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"09b05bff-5dbc-4129-9e21-92b234846931\") " pod="openstack/rabbitmq-server-0" Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.539315 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/09b05bff-5dbc-4129-9e21-92b234846931-plugins-conf\") pod \"rabbitmq-server-0\" (UID: 
\"09b05bff-5dbc-4129-9e21-92b234846931\") " pod="openstack/rabbitmq-server-0" Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.539343 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngv28\" (UniqueName: \"kubernetes.io/projected/09b05bff-5dbc-4129-9e21-92b234846931-kube-api-access-ngv28\") pod \"rabbitmq-server-0\" (UID: \"09b05bff-5dbc-4129-9e21-92b234846931\") " pod="openstack/rabbitmq-server-0" Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.539363 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/09b05bff-5dbc-4129-9e21-92b234846931-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"09b05bff-5dbc-4129-9e21-92b234846931\") " pod="openstack/rabbitmq-server-0" Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.539428 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/09b05bff-5dbc-4129-9e21-92b234846931-server-conf\") pod \"rabbitmq-server-0\" (UID: \"09b05bff-5dbc-4129-9e21-92b234846931\") " pod="openstack/rabbitmq-server-0" Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.539547 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d8d37369-b842-4be2-b6f3-76368d3ee730\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d8d37369-b842-4be2-b6f3-76368d3ee730\") pod \"rabbitmq-server-0\" (UID: \"09b05bff-5dbc-4129-9e21-92b234846931\") " pod="openstack/rabbitmq-server-0" Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.640697 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/09b05bff-5dbc-4129-9e21-92b234846931-pod-info\") pod \"rabbitmq-server-0\" (UID: 
\"09b05bff-5dbc-4129-9e21-92b234846931\") " pod="openstack/rabbitmq-server-0" Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.640780 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/09b05bff-5dbc-4129-9e21-92b234846931-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"09b05bff-5dbc-4129-9e21-92b234846931\") " pod="openstack/rabbitmq-server-0" Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.640809 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/09b05bff-5dbc-4129-9e21-92b234846931-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"09b05bff-5dbc-4129-9e21-92b234846931\") " pod="openstack/rabbitmq-server-0" Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.640834 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/09b05bff-5dbc-4129-9e21-92b234846931-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"09b05bff-5dbc-4129-9e21-92b234846931\") " pod="openstack/rabbitmq-server-0" Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.640858 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngv28\" (UniqueName: \"kubernetes.io/projected/09b05bff-5dbc-4129-9e21-92b234846931-kube-api-access-ngv28\") pod \"rabbitmq-server-0\" (UID: \"09b05bff-5dbc-4129-9e21-92b234846931\") " pod="openstack/rabbitmq-server-0" Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.640885 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/09b05bff-5dbc-4129-9e21-92b234846931-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"09b05bff-5dbc-4129-9e21-92b234846931\") " pod="openstack/rabbitmq-server-0" Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 
14:14:18.641030 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/09b05bff-5dbc-4129-9e21-92b234846931-server-conf\") pod \"rabbitmq-server-0\" (UID: \"09b05bff-5dbc-4129-9e21-92b234846931\") " pod="openstack/rabbitmq-server-0" Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.641084 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d8d37369-b842-4be2-b6f3-76368d3ee730\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d8d37369-b842-4be2-b6f3-76368d3ee730\") pod \"rabbitmq-server-0\" (UID: \"09b05bff-5dbc-4129-9e21-92b234846931\") " pod="openstack/rabbitmq-server-0" Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.641115 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/09b05bff-5dbc-4129-9e21-92b234846931-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"09b05bff-5dbc-4129-9e21-92b234846931\") " pod="openstack/rabbitmq-server-0" Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.642283 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/09b05bff-5dbc-4129-9e21-92b234846931-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"09b05bff-5dbc-4129-9e21-92b234846931\") " pod="openstack/rabbitmq-server-0" Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.642732 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/09b05bff-5dbc-4129-9e21-92b234846931-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"09b05bff-5dbc-4129-9e21-92b234846931\") " pod="openstack/rabbitmq-server-0" Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.642762 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/09b05bff-5dbc-4129-9e21-92b234846931-server-conf\") pod \"rabbitmq-server-0\" (UID: \"09b05bff-5dbc-4129-9e21-92b234846931\") " pod="openstack/rabbitmq-server-0" Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.643181 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/09b05bff-5dbc-4129-9e21-92b234846931-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"09b05bff-5dbc-4129-9e21-92b234846931\") " pod="openstack/rabbitmq-server-0" Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.645131 4735 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.645162 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d8d37369-b842-4be2-b6f3-76368d3ee730\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d8d37369-b842-4be2-b6f3-76368d3ee730\") pod \"rabbitmq-server-0\" (UID: \"09b05bff-5dbc-4129-9e21-92b234846931\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ff4eb5c2b518539e744c469d8195362bbdfa451f171055ff1192f9fd3f4a9c48/globalmount\"" pod="openstack/rabbitmq-server-0" Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.645721 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/09b05bff-5dbc-4129-9e21-92b234846931-pod-info\") pod \"rabbitmq-server-0\" (UID: \"09b05bff-5dbc-4129-9e21-92b234846931\") " pod="openstack/rabbitmq-server-0" Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.646370 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/09b05bff-5dbc-4129-9e21-92b234846931-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: 
\"09b05bff-5dbc-4129-9e21-92b234846931\") " pod="openstack/rabbitmq-server-0" Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.647545 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/09b05bff-5dbc-4129-9e21-92b234846931-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"09b05bff-5dbc-4129-9e21-92b234846931\") " pod="openstack/rabbitmq-server-0" Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.667221 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngv28\" (UniqueName: \"kubernetes.io/projected/09b05bff-5dbc-4129-9e21-92b234846931-kube-api-access-ngv28\") pod \"rabbitmq-server-0\" (UID: \"09b05bff-5dbc-4129-9e21-92b234846931\") " pod="openstack/rabbitmq-server-0" Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.677263 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d8d37369-b842-4be2-b6f3-76368d3ee730\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d8d37369-b842-4be2-b6f3-76368d3ee730\") pod \"rabbitmq-server-0\" (UID: \"09b05bff-5dbc-4129-9e21-92b234846931\") " pod="openstack/rabbitmq-server-0" Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.791785 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.848245 4735 generic.go:334] "Generic (PLEG): container finished" podID="1d10fc66-a842-42a1-9f2b-7d28359712f0" containerID="09dd3e6f05c2c59267785489df7839769dc6aaa42f7e3d5a3f7f50e39c347889" exitCode=0 Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.848314 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67d9f7fb89-scc2x" event={"ID":"1d10fc66-a842-42a1-9f2b-7d28359712f0","Type":"ContainerDied","Data":"09dd3e6f05c2c59267785489df7839769dc6aaa42f7e3d5a3f7f50e39c347889"} Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.848345 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67d9f7fb89-scc2x" event={"ID":"1d10fc66-a842-42a1-9f2b-7d28359712f0","Type":"ContainerStarted","Data":"0151e661bbf8829c255c860d70087ba5f728ea80caa1477aa12965f183a1cab2"} Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.852862 4735 generic.go:334] "Generic (PLEG): container finished" podID="b65dc460-560f-4390-82ee-d72f76482e30" containerID="e9ea54c30f8a40f3008eeaf6596337cf1bd76a2d72ccc0d8bb111b349f5e4485" exitCode=0 Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.852914 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b7964457-kccps" event={"ID":"b65dc460-560f-4390-82ee-d72f76482e30","Type":"ContainerDied","Data":"e9ea54c30f8a40f3008eeaf6596337cf1bd76a2d72ccc0d8bb111b349f5e4485"} Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.852954 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b7964457-kccps" event={"ID":"b65dc460-560f-4390-82ee-d72f76482e30","Type":"ContainerStarted","Data":"40db99e0b64315c4c1aaf297075e0dc2333480571f9d2ef5c072b6dc1f61a871"} Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.894695 4735 scope.go:117] "RemoveContainer" 
containerID="7ff2ebd06cea5cf44fc587b3ec8c967642921329412c26c223914219d4254eaf" Oct 08 14:14:18 crc kubenswrapper[4735]: E1008 14:14:18.894911 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:14:18 crc kubenswrapper[4735]: I1008 14:14:18.971395 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 08 14:14:18 crc kubenswrapper[4735]: W1008 14:14:18.978033 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d3c96e0_cba0_4d26_90a6_3c73bca0994d.slice/crio-edad1b04ceb24cfef1307e04ab7bea6c82fde88520b342757e034c1dbfa2d472 WatchSource:0}: Error finding container edad1b04ceb24cfef1307e04ab7bea6c82fde88520b342757e034c1dbfa2d472: Status 404 returned error can't find the container with id edad1b04ceb24cfef1307e04ab7bea6c82fde88520b342757e034c1dbfa2d472 Oct 08 14:14:19 crc kubenswrapper[4735]: I1008 14:14:19.290580 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 08 14:14:19 crc kubenswrapper[4735]: W1008 14:14:19.307697 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09b05bff_5dbc_4129_9e21_92b234846931.slice/crio-5d52f61a6a83da937bd6196a00486e47f85e0e0d3535de49168688309a90c84a WatchSource:0}: Error finding container 5d52f61a6a83da937bd6196a00486e47f85e0e0d3535de49168688309a90c84a: Status 404 returned error can't find the container with id 5d52f61a6a83da937bd6196a00486e47f85e0e0d3535de49168688309a90c84a Oct 08 14:14:19 crc 
kubenswrapper[4735]: I1008 14:14:19.534383 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Oct 08 14:14:19 crc kubenswrapper[4735]: I1008 14:14:19.536847 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 08 14:14:19 crc kubenswrapper[4735]: I1008 14:14:19.542976 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Oct 08 14:14:19 crc kubenswrapper[4735]: I1008 14:14:19.543092 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-7sbzl" Oct 08 14:14:19 crc kubenswrapper[4735]: I1008 14:14:19.543462 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Oct 08 14:14:19 crc kubenswrapper[4735]: I1008 14:14:19.543495 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Oct 08 14:14:19 crc kubenswrapper[4735]: I1008 14:14:19.546609 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Oct 08 14:14:19 crc kubenswrapper[4735]: I1008 14:14:19.553433 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Oct 08 14:14:19 crc kubenswrapper[4735]: I1008 14:14:19.560762 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 08 14:14:19 crc kubenswrapper[4735]: I1008 14:14:19.657774 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cl9nx\" (UniqueName: \"kubernetes.io/projected/79ca7617-d83c-4d45-9118-f8d7b71bc61a-kube-api-access-cl9nx\") pod \"openstack-galera-0\" (UID: \"79ca7617-d83c-4d45-9118-f8d7b71bc61a\") " pod="openstack/openstack-galera-0" Oct 08 14:14:19 crc kubenswrapper[4735]: I1008 14:14:19.657833 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/79ca7617-d83c-4d45-9118-f8d7b71bc61a-config-data-generated\") pod \"openstack-galera-0\" (UID: \"79ca7617-d83c-4d45-9118-f8d7b71bc61a\") " pod="openstack/openstack-galera-0" Oct 08 14:14:19 crc kubenswrapper[4735]: I1008 14:14:19.658033 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/79ca7617-d83c-4d45-9118-f8d7b71bc61a-kolla-config\") pod \"openstack-galera-0\" (UID: \"79ca7617-d83c-4d45-9118-f8d7b71bc61a\") " pod="openstack/openstack-galera-0" Oct 08 14:14:19 crc kubenswrapper[4735]: I1008 14:14:19.658089 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/79ca7617-d83c-4d45-9118-f8d7b71bc61a-operator-scripts\") pod \"openstack-galera-0\" (UID: \"79ca7617-d83c-4d45-9118-f8d7b71bc61a\") " pod="openstack/openstack-galera-0" Oct 08 14:14:19 crc kubenswrapper[4735]: I1008 14:14:19.658144 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-e7034c45-6e30-46b9-af00-9661a0fc313a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e7034c45-6e30-46b9-af00-9661a0fc313a\") pod \"openstack-galera-0\" (UID: \"79ca7617-d83c-4d45-9118-f8d7b71bc61a\") " pod="openstack/openstack-galera-0" Oct 08 14:14:19 crc kubenswrapper[4735]: I1008 14:14:19.658380 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79ca7617-d83c-4d45-9118-f8d7b71bc61a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"79ca7617-d83c-4d45-9118-f8d7b71bc61a\") " pod="openstack/openstack-galera-0" Oct 08 14:14:19 crc kubenswrapper[4735]: I1008 14:14:19.658463 4735 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/79ca7617-d83c-4d45-9118-f8d7b71bc61a-secrets\") pod \"openstack-galera-0\" (UID: \"79ca7617-d83c-4d45-9118-f8d7b71bc61a\") " pod="openstack/openstack-galera-0" Oct 08 14:14:19 crc kubenswrapper[4735]: I1008 14:14:19.658540 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/79ca7617-d83c-4d45-9118-f8d7b71bc61a-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"79ca7617-d83c-4d45-9118-f8d7b71bc61a\") " pod="openstack/openstack-galera-0" Oct 08 14:14:19 crc kubenswrapper[4735]: I1008 14:14:19.658591 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/79ca7617-d83c-4d45-9118-f8d7b71bc61a-config-data-default\") pod \"openstack-galera-0\" (UID: \"79ca7617-d83c-4d45-9118-f8d7b71bc61a\") " pod="openstack/openstack-galera-0" Oct 08 14:14:19 crc kubenswrapper[4735]: I1008 14:14:19.760070 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/79ca7617-d83c-4d45-9118-f8d7b71bc61a-kolla-config\") pod \"openstack-galera-0\" (UID: \"79ca7617-d83c-4d45-9118-f8d7b71bc61a\") " pod="openstack/openstack-galera-0" Oct 08 14:14:19 crc kubenswrapper[4735]: I1008 14:14:19.760115 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/79ca7617-d83c-4d45-9118-f8d7b71bc61a-operator-scripts\") pod \"openstack-galera-0\" (UID: \"79ca7617-d83c-4d45-9118-f8d7b71bc61a\") " pod="openstack/openstack-galera-0" Oct 08 14:14:19 crc kubenswrapper[4735]: I1008 14:14:19.760137 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-e7034c45-6e30-46b9-af00-9661a0fc313a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e7034c45-6e30-46b9-af00-9661a0fc313a\") pod \"openstack-galera-0\" (UID: \"79ca7617-d83c-4d45-9118-f8d7b71bc61a\") " pod="openstack/openstack-galera-0" Oct 08 14:14:19 crc kubenswrapper[4735]: I1008 14:14:19.760198 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79ca7617-d83c-4d45-9118-f8d7b71bc61a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"79ca7617-d83c-4d45-9118-f8d7b71bc61a\") " pod="openstack/openstack-galera-0" Oct 08 14:14:19 crc kubenswrapper[4735]: I1008 14:14:19.760215 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/79ca7617-d83c-4d45-9118-f8d7b71bc61a-secrets\") pod \"openstack-galera-0\" (UID: \"79ca7617-d83c-4d45-9118-f8d7b71bc61a\") " pod="openstack/openstack-galera-0" Oct 08 14:14:19 crc kubenswrapper[4735]: I1008 14:14:19.760234 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/79ca7617-d83c-4d45-9118-f8d7b71bc61a-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"79ca7617-d83c-4d45-9118-f8d7b71bc61a\") " pod="openstack/openstack-galera-0" Oct 08 14:14:19 crc kubenswrapper[4735]: I1008 14:14:19.760252 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/79ca7617-d83c-4d45-9118-f8d7b71bc61a-config-data-default\") pod \"openstack-galera-0\" (UID: \"79ca7617-d83c-4d45-9118-f8d7b71bc61a\") " pod="openstack/openstack-galera-0" Oct 08 14:14:19 crc kubenswrapper[4735]: I1008 14:14:19.760271 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cl9nx\" (UniqueName: 
\"kubernetes.io/projected/79ca7617-d83c-4d45-9118-f8d7b71bc61a-kube-api-access-cl9nx\") pod \"openstack-galera-0\" (UID: \"79ca7617-d83c-4d45-9118-f8d7b71bc61a\") " pod="openstack/openstack-galera-0" Oct 08 14:14:19 crc kubenswrapper[4735]: I1008 14:14:19.760289 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/79ca7617-d83c-4d45-9118-f8d7b71bc61a-config-data-generated\") pod \"openstack-galera-0\" (UID: \"79ca7617-d83c-4d45-9118-f8d7b71bc61a\") " pod="openstack/openstack-galera-0" Oct 08 14:14:19 crc kubenswrapper[4735]: I1008 14:14:19.760792 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/79ca7617-d83c-4d45-9118-f8d7b71bc61a-config-data-generated\") pod \"openstack-galera-0\" (UID: \"79ca7617-d83c-4d45-9118-f8d7b71bc61a\") " pod="openstack/openstack-galera-0" Oct 08 14:14:19 crc kubenswrapper[4735]: I1008 14:14:19.760998 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/79ca7617-d83c-4d45-9118-f8d7b71bc61a-kolla-config\") pod \"openstack-galera-0\" (UID: \"79ca7617-d83c-4d45-9118-f8d7b71bc61a\") " pod="openstack/openstack-galera-0" Oct 08 14:14:19 crc kubenswrapper[4735]: I1008 14:14:19.761503 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/79ca7617-d83c-4d45-9118-f8d7b71bc61a-config-data-default\") pod \"openstack-galera-0\" (UID: \"79ca7617-d83c-4d45-9118-f8d7b71bc61a\") " pod="openstack/openstack-galera-0" Oct 08 14:14:19 crc kubenswrapper[4735]: I1008 14:14:19.761753 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/79ca7617-d83c-4d45-9118-f8d7b71bc61a-operator-scripts\") pod \"openstack-galera-0\" (UID: 
\"79ca7617-d83c-4d45-9118-f8d7b71bc61a\") " pod="openstack/openstack-galera-0" Oct 08 14:14:19 crc kubenswrapper[4735]: I1008 14:14:19.763781 4735 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 08 14:14:19 crc kubenswrapper[4735]: I1008 14:14:19.763816 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-e7034c45-6e30-46b9-af00-9661a0fc313a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e7034c45-6e30-46b9-af00-9661a0fc313a\") pod \"openstack-galera-0\" (UID: \"79ca7617-d83c-4d45-9118-f8d7b71bc61a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d363ee42373d6466044fd918673a9a9c11bfacc1c3f738b230d8909bf6456a7c/globalmount\"" pod="openstack/openstack-galera-0" Oct 08 14:14:19 crc kubenswrapper[4735]: I1008 14:14:19.764952 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/79ca7617-d83c-4d45-9118-f8d7b71bc61a-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"79ca7617-d83c-4d45-9118-f8d7b71bc61a\") " pod="openstack/openstack-galera-0" Oct 08 14:14:19 crc kubenswrapper[4735]: I1008 14:14:19.766378 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79ca7617-d83c-4d45-9118-f8d7b71bc61a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"79ca7617-d83c-4d45-9118-f8d7b71bc61a\") " pod="openstack/openstack-galera-0" Oct 08 14:14:19 crc kubenswrapper[4735]: I1008 14:14:19.770934 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/79ca7617-d83c-4d45-9118-f8d7b71bc61a-secrets\") pod \"openstack-galera-0\" (UID: \"79ca7617-d83c-4d45-9118-f8d7b71bc61a\") " pod="openstack/openstack-galera-0" Oct 08 14:14:19 crc kubenswrapper[4735]: I1008 14:14:19.858173 4735 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Oct 08 14:14:19 crc kubenswrapper[4735]: I1008 14:14:19.864018 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 08 14:14:19 crc kubenswrapper[4735]: I1008 14:14:19.871007 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Oct 08 14:14:19 crc kubenswrapper[4735]: I1008 14:14:19.871068 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-wcck6" Oct 08 14:14:19 crc kubenswrapper[4735]: I1008 14:14:19.871876 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 08 14:14:19 crc kubenswrapper[4735]: I1008 14:14:19.878712 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cl9nx\" (UniqueName: \"kubernetes.io/projected/79ca7617-d83c-4d45-9118-f8d7b71bc61a-kube-api-access-cl9nx\") pod \"openstack-galera-0\" (UID: \"79ca7617-d83c-4d45-9118-f8d7b71bc61a\") " pod="openstack/openstack-galera-0" Oct 08 14:14:19 crc kubenswrapper[4735]: I1008 14:14:19.878780 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"09b05bff-5dbc-4129-9e21-92b234846931","Type":"ContainerStarted","Data":"5d52f61a6a83da937bd6196a00486e47f85e0e0d3535de49168688309a90c84a"} Oct 08 14:14:19 crc kubenswrapper[4735]: I1008 14:14:19.883427 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67d9f7fb89-scc2x" event={"ID":"1d10fc66-a842-42a1-9f2b-7d28359712f0","Type":"ContainerStarted","Data":"ee92933db4b3f9609d1a5b67a532ff0380bb7a570e36c7df717772d002e05f66"} Oct 08 14:14:19 crc kubenswrapper[4735]: I1008 14:14:19.883994 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67d9f7fb89-scc2x" Oct 08 14:14:19 crc kubenswrapper[4735]: I1008 14:14:19.884930 4735 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5d3c96e0-cba0-4d26-90a6-3c73bca0994d","Type":"ContainerStarted","Data":"edad1b04ceb24cfef1307e04ab7bea6c82fde88520b342757e034c1dbfa2d472"} Oct 08 14:14:19 crc kubenswrapper[4735]: I1008 14:14:19.887057 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b7964457-kccps" event={"ID":"b65dc460-560f-4390-82ee-d72f76482e30","Type":"ContainerStarted","Data":"926f22ab8ea53c906ad54f50ebe42c27356e3b1496c58aa2ee587e2610b7c25e"} Oct 08 14:14:19 crc kubenswrapper[4735]: I1008 14:14:19.888320 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8b7964457-kccps" Oct 08 14:14:19 crc kubenswrapper[4735]: I1008 14:14:19.944761 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8b7964457-kccps" podStartSLOduration=3.9447371860000002 podStartE2EDuration="3.944737186s" podCreationTimestamp="2025-10-08 14:14:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:14:19.920345632 +0000 UTC m=+4709.728650176" watchObservedRunningTime="2025-10-08 14:14:19.944737186 +0000 UTC m=+4709.753041740" Oct 08 14:14:19 crc kubenswrapper[4735]: I1008 14:14:19.948340 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-e7034c45-6e30-46b9-af00-9661a0fc313a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e7034c45-6e30-46b9-af00-9661a0fc313a\") pod \"openstack-galera-0\" (UID: \"79ca7617-d83c-4d45-9118-f8d7b71bc61a\") " pod="openstack/openstack-galera-0" Oct 08 14:14:19 crc kubenswrapper[4735]: I1008 14:14:19.951167 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67d9f7fb89-scc2x" podStartSLOduration=2.95114739 podStartE2EDuration="2.95114739s" podCreationTimestamp="2025-10-08 14:14:17 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:14:19.938151898 +0000 UTC m=+4709.746456442" watchObservedRunningTime="2025-10-08 14:14:19.95114739 +0000 UTC m=+4709.759451944" Oct 08 14:14:19 crc kubenswrapper[4735]: I1008 14:14:19.972374 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e2c6229f-75a0-4a80-bcb4-74c95f06e34b-kolla-config\") pod \"memcached-0\" (UID: \"e2c6229f-75a0-4a80-bcb4-74c95f06e34b\") " pod="openstack/memcached-0" Oct 08 14:14:19 crc kubenswrapper[4735]: I1008 14:14:19.972500 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e2c6229f-75a0-4a80-bcb4-74c95f06e34b-config-data\") pod \"memcached-0\" (UID: \"e2c6229f-75a0-4a80-bcb4-74c95f06e34b\") " pod="openstack/memcached-0" Oct 08 14:14:19 crc kubenswrapper[4735]: I1008 14:14:19.972531 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2hng\" (UniqueName: \"kubernetes.io/projected/e2c6229f-75a0-4a80-bcb4-74c95f06e34b-kube-api-access-f2hng\") pod \"memcached-0\" (UID: \"e2c6229f-75a0-4a80-bcb4-74c95f06e34b\") " pod="openstack/memcached-0" Oct 08 14:14:20 crc kubenswrapper[4735]: I1008 14:14:20.074845 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2hng\" (UniqueName: \"kubernetes.io/projected/e2c6229f-75a0-4a80-bcb4-74c95f06e34b-kube-api-access-f2hng\") pod \"memcached-0\" (UID: \"e2c6229f-75a0-4a80-bcb4-74c95f06e34b\") " pod="openstack/memcached-0" Oct 08 14:14:20 crc kubenswrapper[4735]: I1008 14:14:20.075007 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/e2c6229f-75a0-4a80-bcb4-74c95f06e34b-kolla-config\") pod \"memcached-0\" (UID: \"e2c6229f-75a0-4a80-bcb4-74c95f06e34b\") " pod="openstack/memcached-0" Oct 08 14:14:20 crc kubenswrapper[4735]: I1008 14:14:20.075123 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e2c6229f-75a0-4a80-bcb4-74c95f06e34b-config-data\") pod \"memcached-0\" (UID: \"e2c6229f-75a0-4a80-bcb4-74c95f06e34b\") " pod="openstack/memcached-0" Oct 08 14:14:20 crc kubenswrapper[4735]: I1008 14:14:20.076021 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e2c6229f-75a0-4a80-bcb4-74c95f06e34b-kolla-config\") pod \"memcached-0\" (UID: \"e2c6229f-75a0-4a80-bcb4-74c95f06e34b\") " pod="openstack/memcached-0" Oct 08 14:14:20 crc kubenswrapper[4735]: I1008 14:14:20.077054 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e2c6229f-75a0-4a80-bcb4-74c95f06e34b-config-data\") pod \"memcached-0\" (UID: \"e2c6229f-75a0-4a80-bcb4-74c95f06e34b\") " pod="openstack/memcached-0" Oct 08 14:14:20 crc kubenswrapper[4735]: I1008 14:14:20.152177 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 08 14:14:20 crc kubenswrapper[4735]: I1008 14:14:20.178317 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2hng\" (UniqueName: \"kubernetes.io/projected/e2c6229f-75a0-4a80-bcb4-74c95f06e34b-kube-api-access-f2hng\") pod \"memcached-0\" (UID: \"e2c6229f-75a0-4a80-bcb4-74c95f06e34b\") " pod="openstack/memcached-0" Oct 08 14:14:20 crc kubenswrapper[4735]: I1008 14:14:20.229322 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Oct 08 14:14:20 crc kubenswrapper[4735]: I1008 14:14:20.633678 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 08 14:14:20 crc kubenswrapper[4735]: W1008 14:14:20.642948 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79ca7617_d83c_4d45_9118_f8d7b71bc61a.slice/crio-e10ca4ba2d0cd9b1cd7d9d8bf43f22f17f9226b81322e57323a911154fc3b689 WatchSource:0}: Error finding container e10ca4ba2d0cd9b1cd7d9d8bf43f22f17f9226b81322e57323a911154fc3b689: Status 404 returned error can't find the container with id e10ca4ba2d0cd9b1cd7d9d8bf43f22f17f9226b81322e57323a911154fc3b689 Oct 08 14:14:20 crc kubenswrapper[4735]: I1008 14:14:20.704353 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 08 14:14:20 crc kubenswrapper[4735]: W1008 14:14:20.714007 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2c6229f_75a0_4a80_bcb4_74c95f06e34b.slice/crio-7dba27b27abd4ef1254083d60fc3026dfc868d2af6002002a2c52be4c5d7b306 WatchSource:0}: Error finding container 7dba27b27abd4ef1254083d60fc3026dfc868d2af6002002a2c52be4c5d7b306: Status 404 returned error can't find the container with id 7dba27b27abd4ef1254083d60fc3026dfc868d2af6002002a2c52be4c5d7b306 Oct 08 14:14:20 crc kubenswrapper[4735]: I1008 14:14:20.907788 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5d3c96e0-cba0-4d26-90a6-3c73bca0994d","Type":"ContainerStarted","Data":"08c4d1e969c96c545e5c5a8800caf590d47a712a73d90edc6aba0a128b1e8661"} Oct 08 14:14:20 crc kubenswrapper[4735]: I1008 14:14:20.907860 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"09b05bff-5dbc-4129-9e21-92b234846931","Type":"ContainerStarted","Data":"83e258640454c1fe277b8c0862c7835e7075968fa5b63c2af4659d2669d512f5"} Oct 08 14:14:20 crc kubenswrapper[4735]: I1008 14:14:20.907884 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"79ca7617-d83c-4d45-9118-f8d7b71bc61a","Type":"ContainerStarted","Data":"e10ca4ba2d0cd9b1cd7d9d8bf43f22f17f9226b81322e57323a911154fc3b689"} Oct 08 14:14:20 crc kubenswrapper[4735]: I1008 14:14:20.907909 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"e2c6229f-75a0-4a80-bcb4-74c95f06e34b","Type":"ContainerStarted","Data":"7dba27b27abd4ef1254083d60fc3026dfc868d2af6002002a2c52be4c5d7b306"} Oct 08 14:14:20 crc kubenswrapper[4735]: I1008 14:14:20.960947 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 08 14:14:20 crc kubenswrapper[4735]: I1008 14:14:20.962456 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 08 14:14:20 crc kubenswrapper[4735]: I1008 14:14:20.969223 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Oct 08 14:14:20 crc kubenswrapper[4735]: I1008 14:14:20.969562 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Oct 08 14:14:20 crc kubenswrapper[4735]: I1008 14:14:20.969749 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Oct 08 14:14:20 crc kubenswrapper[4735]: I1008 14:14:20.970004 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-brn82" Oct 08 14:14:20 crc kubenswrapper[4735]: I1008 14:14:20.970938 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 08 14:14:21 crc kubenswrapper[4735]: I1008 14:14:21.092154 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlhv5\" (UniqueName: \"kubernetes.io/projected/03482ef3-1f39-445a-b172-9fa5b97d4b9e-kube-api-access-wlhv5\") pod \"openstack-cell1-galera-0\" (UID: \"03482ef3-1f39-445a-b172-9fa5b97d4b9e\") " pod="openstack/openstack-cell1-galera-0" Oct 08 14:14:21 crc kubenswrapper[4735]: I1008 14:14:21.092208 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/03482ef3-1f39-445a-b172-9fa5b97d4b9e-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"03482ef3-1f39-445a-b172-9fa5b97d4b9e\") " pod="openstack/openstack-cell1-galera-0" Oct 08 14:14:21 crc kubenswrapper[4735]: I1008 14:14:21.092234 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/03482ef3-1f39-445a-b172-9fa5b97d4b9e-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"03482ef3-1f39-445a-b172-9fa5b97d4b9e\") " pod="openstack/openstack-cell1-galera-0" Oct 08 14:14:21 crc kubenswrapper[4735]: I1008 14:14:21.092258 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03482ef3-1f39-445a-b172-9fa5b97d4b9e-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"03482ef3-1f39-445a-b172-9fa5b97d4b9e\") " pod="openstack/openstack-cell1-galera-0" Oct 08 14:14:21 crc kubenswrapper[4735]: I1008 14:14:21.092285 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/03482ef3-1f39-445a-b172-9fa5b97d4b9e-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"03482ef3-1f39-445a-b172-9fa5b97d4b9e\") " pod="openstack/openstack-cell1-galera-0" Oct 08 14:14:21 crc kubenswrapper[4735]: I1008 14:14:21.092336 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/03482ef3-1f39-445a-b172-9fa5b97d4b9e-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"03482ef3-1f39-445a-b172-9fa5b97d4b9e\") " pod="openstack/openstack-cell1-galera-0" Oct 08 14:14:21 crc kubenswrapper[4735]: I1008 14:14:21.092372 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-dfee42a7-0f3d-4c0f-a36a-183960a79d53\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dfee42a7-0f3d-4c0f-a36a-183960a79d53\") pod \"openstack-cell1-galera-0\" (UID: \"03482ef3-1f39-445a-b172-9fa5b97d4b9e\") " pod="openstack/openstack-cell1-galera-0" Oct 08 14:14:21 crc kubenswrapper[4735]: I1008 14:14:21.092393 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/03482ef3-1f39-445a-b172-9fa5b97d4b9e-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"03482ef3-1f39-445a-b172-9fa5b97d4b9e\") " pod="openstack/openstack-cell1-galera-0" Oct 08 14:14:21 crc kubenswrapper[4735]: I1008 14:14:21.092611 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/03482ef3-1f39-445a-b172-9fa5b97d4b9e-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"03482ef3-1f39-445a-b172-9fa5b97d4b9e\") " pod="openstack/openstack-cell1-galera-0" Oct 08 14:14:21 crc kubenswrapper[4735]: I1008 14:14:21.194654 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/03482ef3-1f39-445a-b172-9fa5b97d4b9e-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"03482ef3-1f39-445a-b172-9fa5b97d4b9e\") " pod="openstack/openstack-cell1-galera-0" Oct 08 14:14:21 crc kubenswrapper[4735]: I1008 14:14:21.194859 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/03482ef3-1f39-445a-b172-9fa5b97d4b9e-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"03482ef3-1f39-445a-b172-9fa5b97d4b9e\") " pod="openstack/openstack-cell1-galera-0" Oct 08 14:14:21 crc kubenswrapper[4735]: I1008 14:14:21.194918 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlhv5\" (UniqueName: \"kubernetes.io/projected/03482ef3-1f39-445a-b172-9fa5b97d4b9e-kube-api-access-wlhv5\") pod \"openstack-cell1-galera-0\" (UID: \"03482ef3-1f39-445a-b172-9fa5b97d4b9e\") " pod="openstack/openstack-cell1-galera-0" Oct 08 14:14:21 crc kubenswrapper[4735]: I1008 14:14:21.194977 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" 
(UniqueName: \"kubernetes.io/configmap/03482ef3-1f39-445a-b172-9fa5b97d4b9e-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"03482ef3-1f39-445a-b172-9fa5b97d4b9e\") " pod="openstack/openstack-cell1-galera-0" Oct 08 14:14:21 crc kubenswrapper[4735]: I1008 14:14:21.195032 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03482ef3-1f39-445a-b172-9fa5b97d4b9e-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"03482ef3-1f39-445a-b172-9fa5b97d4b9e\") " pod="openstack/openstack-cell1-galera-0" Oct 08 14:14:21 crc kubenswrapper[4735]: I1008 14:14:21.195094 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03482ef3-1f39-445a-b172-9fa5b97d4b9e-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"03482ef3-1f39-445a-b172-9fa5b97d4b9e\") " pod="openstack/openstack-cell1-galera-0" Oct 08 14:14:21 crc kubenswrapper[4735]: I1008 14:14:21.195155 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/03482ef3-1f39-445a-b172-9fa5b97d4b9e-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"03482ef3-1f39-445a-b172-9fa5b97d4b9e\") " pod="openstack/openstack-cell1-galera-0" Oct 08 14:14:21 crc kubenswrapper[4735]: I1008 14:14:21.195206 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/03482ef3-1f39-445a-b172-9fa5b97d4b9e-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"03482ef3-1f39-445a-b172-9fa5b97d4b9e\") " pod="openstack/openstack-cell1-galera-0" Oct 08 14:14:21 crc kubenswrapper[4735]: I1008 14:14:21.195307 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-dfee42a7-0f3d-4c0f-a36a-183960a79d53\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dfee42a7-0f3d-4c0f-a36a-183960a79d53\") pod \"openstack-cell1-galera-0\" (UID: \"03482ef3-1f39-445a-b172-9fa5b97d4b9e\") " pod="openstack/openstack-cell1-galera-0" Oct 08 14:14:21 crc kubenswrapper[4735]: I1008 14:14:21.196135 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/03482ef3-1f39-445a-b172-9fa5b97d4b9e-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"03482ef3-1f39-445a-b172-9fa5b97d4b9e\") " pod="openstack/openstack-cell1-galera-0" Oct 08 14:14:21 crc kubenswrapper[4735]: I1008 14:14:21.196417 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/03482ef3-1f39-445a-b172-9fa5b97d4b9e-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"03482ef3-1f39-445a-b172-9fa5b97d4b9e\") " pod="openstack/openstack-cell1-galera-0" Oct 08 14:14:21 crc kubenswrapper[4735]: I1008 14:14:21.196505 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/03482ef3-1f39-445a-b172-9fa5b97d4b9e-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"03482ef3-1f39-445a-b172-9fa5b97d4b9e\") " pod="openstack/openstack-cell1-galera-0" Oct 08 14:14:21 crc kubenswrapper[4735]: I1008 14:14:21.197547 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03482ef3-1f39-445a-b172-9fa5b97d4b9e-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"03482ef3-1f39-445a-b172-9fa5b97d4b9e\") " pod="openstack/openstack-cell1-galera-0" Oct 08 14:14:21 crc kubenswrapper[4735]: I1008 14:14:21.199523 4735 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 08 14:14:21 crc kubenswrapper[4735]: I1008 14:14:21.199571 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-dfee42a7-0f3d-4c0f-a36a-183960a79d53\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dfee42a7-0f3d-4c0f-a36a-183960a79d53\") pod \"openstack-cell1-galera-0\" (UID: \"03482ef3-1f39-445a-b172-9fa5b97d4b9e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/32fa635b729fd5ef1cf376960bc24ed234c267bf231edca1856c652eae326771/globalmount\"" pod="openstack/openstack-cell1-galera-0" Oct 08 14:14:21 crc kubenswrapper[4735]: I1008 14:14:21.200956 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/03482ef3-1f39-445a-b172-9fa5b97d4b9e-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"03482ef3-1f39-445a-b172-9fa5b97d4b9e\") " pod="openstack/openstack-cell1-galera-0" Oct 08 14:14:21 crc kubenswrapper[4735]: I1008 14:14:21.201045 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/03482ef3-1f39-445a-b172-9fa5b97d4b9e-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"03482ef3-1f39-445a-b172-9fa5b97d4b9e\") " pod="openstack/openstack-cell1-galera-0" Oct 08 14:14:21 crc kubenswrapper[4735]: I1008 14:14:21.204554 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03482ef3-1f39-445a-b172-9fa5b97d4b9e-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"03482ef3-1f39-445a-b172-9fa5b97d4b9e\") " pod="openstack/openstack-cell1-galera-0" Oct 08 14:14:21 crc kubenswrapper[4735]: I1008 14:14:21.212848 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlhv5\" (UniqueName: \"kubernetes.io/projected/03482ef3-1f39-445a-b172-9fa5b97d4b9e-kube-api-access-wlhv5\") pod \"openstack-cell1-galera-0\" 
(UID: \"03482ef3-1f39-445a-b172-9fa5b97d4b9e\") " pod="openstack/openstack-cell1-galera-0" Oct 08 14:14:21 crc kubenswrapper[4735]: I1008 14:14:21.240949 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-dfee42a7-0f3d-4c0f-a36a-183960a79d53\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dfee42a7-0f3d-4c0f-a36a-183960a79d53\") pod \"openstack-cell1-galera-0\" (UID: \"03482ef3-1f39-445a-b172-9fa5b97d4b9e\") " pod="openstack/openstack-cell1-galera-0" Oct 08 14:14:21 crc kubenswrapper[4735]: I1008 14:14:21.291565 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 08 14:14:21 crc kubenswrapper[4735]: W1008 14:14:21.790456 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod03482ef3_1f39_445a_b172_9fa5b97d4b9e.slice/crio-080e9dc1e3e8b8bd6eea4822137704b6a1c26c01514edcd8000033bbf82f51b8 WatchSource:0}: Error finding container 080e9dc1e3e8b8bd6eea4822137704b6a1c26c01514edcd8000033bbf82f51b8: Status 404 returned error can't find the container with id 080e9dc1e3e8b8bd6eea4822137704b6a1c26c01514edcd8000033bbf82f51b8 Oct 08 14:14:21 crc kubenswrapper[4735]: I1008 14:14:21.798657 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 08 14:14:21 crc kubenswrapper[4735]: I1008 14:14:21.915096 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"79ca7617-d83c-4d45-9118-f8d7b71bc61a","Type":"ContainerStarted","Data":"80e9b528c5cf9db9ec1c1913b961620e6b7cb7858c2a293d5c1e35924a909cda"} Oct 08 14:14:21 crc kubenswrapper[4735]: I1008 14:14:21.916425 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"e2c6229f-75a0-4a80-bcb4-74c95f06e34b","Type":"ContainerStarted","Data":"7c266fb1044b4a61e9fcbfa9f5ff20ff6d40c41d15eb43b0ddba858d9eee3141"} Oct 08 
14:14:21 crc kubenswrapper[4735]: I1008 14:14:21.917029 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Oct 08 14:14:21 crc kubenswrapper[4735]: I1008 14:14:21.919331 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"03482ef3-1f39-445a-b172-9fa5b97d4b9e","Type":"ContainerStarted","Data":"080e9dc1e3e8b8bd6eea4822137704b6a1c26c01514edcd8000033bbf82f51b8"} Oct 08 14:14:21 crc kubenswrapper[4735]: I1008 14:14:21.957201 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.957183857 podStartE2EDuration="2.957183857s" podCreationTimestamp="2025-10-08 14:14:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:14:21.955562555 +0000 UTC m=+4711.763867109" watchObservedRunningTime="2025-10-08 14:14:21.957183857 +0000 UTC m=+4711.765488401" Oct 08 14:14:22 crc kubenswrapper[4735]: I1008 14:14:22.930093 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"03482ef3-1f39-445a-b172-9fa5b97d4b9e","Type":"ContainerStarted","Data":"a240d82294eda82afc3990198e58f3f056f75a297cc702d8a5edf6f63871ed98"} Oct 08 14:14:24 crc kubenswrapper[4735]: I1008 14:14:24.950953 4735 generic.go:334] "Generic (PLEG): container finished" podID="79ca7617-d83c-4d45-9118-f8d7b71bc61a" containerID="80e9b528c5cf9db9ec1c1913b961620e6b7cb7858c2a293d5c1e35924a909cda" exitCode=0 Oct 08 14:14:24 crc kubenswrapper[4735]: I1008 14:14:24.951083 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"79ca7617-d83c-4d45-9118-f8d7b71bc61a","Type":"ContainerDied","Data":"80e9b528c5cf9db9ec1c1913b961620e6b7cb7858c2a293d5c1e35924a909cda"} Oct 08 14:14:25 crc kubenswrapper[4735]: I1008 14:14:25.231187 4735 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/memcached-0" Oct 08 14:14:25 crc kubenswrapper[4735]: I1008 14:14:25.964357 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"79ca7617-d83c-4d45-9118-f8d7b71bc61a","Type":"ContainerStarted","Data":"a44366d441c62840991367e30b8efe261d3b7f992fd4c31e63b9847caa2808f1"} Oct 08 14:14:25 crc kubenswrapper[4735]: I1008 14:14:25.997309 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=7.997287078 podStartE2EDuration="7.997287078s" podCreationTimestamp="2025-10-08 14:14:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:14:25.990371981 +0000 UTC m=+4715.798676545" watchObservedRunningTime="2025-10-08 14:14:25.997287078 +0000 UTC m=+4715.805591632" Oct 08 14:14:26 crc kubenswrapper[4735]: I1008 14:14:26.974731 4735 generic.go:334] "Generic (PLEG): container finished" podID="03482ef3-1f39-445a-b172-9fa5b97d4b9e" containerID="a240d82294eda82afc3990198e58f3f056f75a297cc702d8a5edf6f63871ed98" exitCode=0 Oct 08 14:14:26 crc kubenswrapper[4735]: I1008 14:14:26.974826 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"03482ef3-1f39-445a-b172-9fa5b97d4b9e","Type":"ContainerDied","Data":"a240d82294eda82afc3990198e58f3f056f75a297cc702d8a5edf6f63871ed98"} Oct 08 14:14:27 crc kubenswrapper[4735]: I1008 14:14:27.290701 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8b7964457-kccps" Oct 08 14:14:27 crc kubenswrapper[4735]: I1008 14:14:27.619556 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-67d9f7fb89-scc2x" Oct 08 14:14:27 crc kubenswrapper[4735]: I1008 14:14:27.670284 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-8b7964457-kccps"] Oct 08 14:14:27 crc kubenswrapper[4735]: I1008 14:14:27.993627 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"03482ef3-1f39-445a-b172-9fa5b97d4b9e","Type":"ContainerStarted","Data":"8afe19d9bc71146a6293928f42031cb9fb69f5f2a78e8f260da7d28aa1cc7eaa"} Oct 08 14:14:27 crc kubenswrapper[4735]: I1008 14:14:27.993751 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8b7964457-kccps" podUID="b65dc460-560f-4390-82ee-d72f76482e30" containerName="dnsmasq-dns" containerID="cri-o://926f22ab8ea53c906ad54f50ebe42c27356e3b1496c58aa2ee587e2610b7c25e" gracePeriod=10 Oct 08 14:14:28 crc kubenswrapper[4735]: I1008 14:14:28.019864 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=9.019839007 podStartE2EDuration="9.019839007s" podCreationTimestamp="2025-10-08 14:14:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:14:28.016168273 +0000 UTC m=+4717.824472817" watchObservedRunningTime="2025-10-08 14:14:28.019839007 +0000 UTC m=+4717.828143541" Oct 08 14:14:28 crc kubenswrapper[4735]: I1008 14:14:28.394928 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8b7964457-kccps" Oct 08 14:14:28 crc kubenswrapper[4735]: I1008 14:14:28.513200 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hfcj6\" (UniqueName: \"kubernetes.io/projected/b65dc460-560f-4390-82ee-d72f76482e30-kube-api-access-hfcj6\") pod \"b65dc460-560f-4390-82ee-d72f76482e30\" (UID: \"b65dc460-560f-4390-82ee-d72f76482e30\") " Oct 08 14:14:28 crc kubenswrapper[4735]: I1008 14:14:28.513258 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b65dc460-560f-4390-82ee-d72f76482e30-dns-svc\") pod \"b65dc460-560f-4390-82ee-d72f76482e30\" (UID: \"b65dc460-560f-4390-82ee-d72f76482e30\") " Oct 08 14:14:28 crc kubenswrapper[4735]: I1008 14:14:28.513331 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b65dc460-560f-4390-82ee-d72f76482e30-config\") pod \"b65dc460-560f-4390-82ee-d72f76482e30\" (UID: \"b65dc460-560f-4390-82ee-d72f76482e30\") " Oct 08 14:14:28 crc kubenswrapper[4735]: I1008 14:14:28.518581 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b65dc460-560f-4390-82ee-d72f76482e30-kube-api-access-hfcj6" (OuterVolumeSpecName: "kube-api-access-hfcj6") pod "b65dc460-560f-4390-82ee-d72f76482e30" (UID: "b65dc460-560f-4390-82ee-d72f76482e30"). InnerVolumeSpecName "kube-api-access-hfcj6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:14:28 crc kubenswrapper[4735]: I1008 14:14:28.546906 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b65dc460-560f-4390-82ee-d72f76482e30-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b65dc460-560f-4390-82ee-d72f76482e30" (UID: "b65dc460-560f-4390-82ee-d72f76482e30"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:14:28 crc kubenswrapper[4735]: I1008 14:14:28.552631 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b65dc460-560f-4390-82ee-d72f76482e30-config" (OuterVolumeSpecName: "config") pod "b65dc460-560f-4390-82ee-d72f76482e30" (UID: "b65dc460-560f-4390-82ee-d72f76482e30"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:14:28 crc kubenswrapper[4735]: I1008 14:14:28.615383 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hfcj6\" (UniqueName: \"kubernetes.io/projected/b65dc460-560f-4390-82ee-d72f76482e30-kube-api-access-hfcj6\") on node \"crc\" DevicePath \"\"" Oct 08 14:14:28 crc kubenswrapper[4735]: I1008 14:14:28.615420 4735 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b65dc460-560f-4390-82ee-d72f76482e30-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 14:14:28 crc kubenswrapper[4735]: I1008 14:14:28.615429 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b65dc460-560f-4390-82ee-d72f76482e30-config\") on node \"crc\" DevicePath \"\"" Oct 08 14:14:29 crc kubenswrapper[4735]: I1008 14:14:29.002869 4735 generic.go:334] "Generic (PLEG): container finished" podID="b65dc460-560f-4390-82ee-d72f76482e30" containerID="926f22ab8ea53c906ad54f50ebe42c27356e3b1496c58aa2ee587e2610b7c25e" exitCode=0 Oct 08 14:14:29 crc kubenswrapper[4735]: I1008 14:14:29.002925 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b7964457-kccps" event={"ID":"b65dc460-560f-4390-82ee-d72f76482e30","Type":"ContainerDied","Data":"926f22ab8ea53c906ad54f50ebe42c27356e3b1496c58aa2ee587e2610b7c25e"} Oct 08 14:14:29 crc kubenswrapper[4735]: I1008 14:14:29.002936 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8b7964457-kccps" Oct 08 14:14:29 crc kubenswrapper[4735]: I1008 14:14:29.002952 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b7964457-kccps" event={"ID":"b65dc460-560f-4390-82ee-d72f76482e30","Type":"ContainerDied","Data":"40db99e0b64315c4c1aaf297075e0dc2333480571f9d2ef5c072b6dc1f61a871"} Oct 08 14:14:29 crc kubenswrapper[4735]: I1008 14:14:29.002971 4735 scope.go:117] "RemoveContainer" containerID="926f22ab8ea53c906ad54f50ebe42c27356e3b1496c58aa2ee587e2610b7c25e" Oct 08 14:14:29 crc kubenswrapper[4735]: I1008 14:14:29.020442 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b7964457-kccps"] Oct 08 14:14:29 crc kubenswrapper[4735]: I1008 14:14:29.026208 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8b7964457-kccps"] Oct 08 14:14:29 crc kubenswrapper[4735]: I1008 14:14:29.028163 4735 scope.go:117] "RemoveContainer" containerID="e9ea54c30f8a40f3008eeaf6596337cf1bd76a2d72ccc0d8bb111b349f5e4485" Oct 08 14:14:29 crc kubenswrapper[4735]: I1008 14:14:29.052754 4735 scope.go:117] "RemoveContainer" containerID="926f22ab8ea53c906ad54f50ebe42c27356e3b1496c58aa2ee587e2610b7c25e" Oct 08 14:14:29 crc kubenswrapper[4735]: E1008 14:14:29.053278 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"926f22ab8ea53c906ad54f50ebe42c27356e3b1496c58aa2ee587e2610b7c25e\": container with ID starting with 926f22ab8ea53c906ad54f50ebe42c27356e3b1496c58aa2ee587e2610b7c25e not found: ID does not exist" containerID="926f22ab8ea53c906ad54f50ebe42c27356e3b1496c58aa2ee587e2610b7c25e" Oct 08 14:14:29 crc kubenswrapper[4735]: I1008 14:14:29.053324 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"926f22ab8ea53c906ad54f50ebe42c27356e3b1496c58aa2ee587e2610b7c25e"} err="failed to get container status 
\"926f22ab8ea53c906ad54f50ebe42c27356e3b1496c58aa2ee587e2610b7c25e\": rpc error: code = NotFound desc = could not find container \"926f22ab8ea53c906ad54f50ebe42c27356e3b1496c58aa2ee587e2610b7c25e\": container with ID starting with 926f22ab8ea53c906ad54f50ebe42c27356e3b1496c58aa2ee587e2610b7c25e not found: ID does not exist" Oct 08 14:14:29 crc kubenswrapper[4735]: I1008 14:14:29.053348 4735 scope.go:117] "RemoveContainer" containerID="e9ea54c30f8a40f3008eeaf6596337cf1bd76a2d72ccc0d8bb111b349f5e4485" Oct 08 14:14:29 crc kubenswrapper[4735]: E1008 14:14:29.053845 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9ea54c30f8a40f3008eeaf6596337cf1bd76a2d72ccc0d8bb111b349f5e4485\": container with ID starting with e9ea54c30f8a40f3008eeaf6596337cf1bd76a2d72ccc0d8bb111b349f5e4485 not found: ID does not exist" containerID="e9ea54c30f8a40f3008eeaf6596337cf1bd76a2d72ccc0d8bb111b349f5e4485" Oct 08 14:14:29 crc kubenswrapper[4735]: I1008 14:14:29.053875 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9ea54c30f8a40f3008eeaf6596337cf1bd76a2d72ccc0d8bb111b349f5e4485"} err="failed to get container status \"e9ea54c30f8a40f3008eeaf6596337cf1bd76a2d72ccc0d8bb111b349f5e4485\": rpc error: code = NotFound desc = could not find container \"e9ea54c30f8a40f3008eeaf6596337cf1bd76a2d72ccc0d8bb111b349f5e4485\": container with ID starting with e9ea54c30f8a40f3008eeaf6596337cf1bd76a2d72ccc0d8bb111b349f5e4485 not found: ID does not exist" Oct 08 14:14:30 crc kubenswrapper[4735]: I1008 14:14:30.152410 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Oct 08 14:14:30 crc kubenswrapper[4735]: I1008 14:14:30.152757 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Oct 08 14:14:30 crc kubenswrapper[4735]: I1008 14:14:30.202504 4735 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Oct 08 14:14:30 crc kubenswrapper[4735]: I1008 14:14:30.906061 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b65dc460-560f-4390-82ee-d72f76482e30" path="/var/lib/kubelet/pods/b65dc460-560f-4390-82ee-d72f76482e30/volumes" Oct 08 14:14:31 crc kubenswrapper[4735]: I1008 14:14:31.062918 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Oct 08 14:14:31 crc kubenswrapper[4735]: I1008 14:14:31.292433 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Oct 08 14:14:31 crc kubenswrapper[4735]: I1008 14:14:31.293039 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Oct 08 14:14:32 crc kubenswrapper[4735]: I1008 14:14:32.894520 4735 scope.go:117] "RemoveContainer" containerID="7ff2ebd06cea5cf44fc587b3ec8c967642921329412c26c223914219d4254eaf" Oct 08 14:14:32 crc kubenswrapper[4735]: E1008 14:14:32.894842 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:14:33 crc kubenswrapper[4735]: I1008 14:14:33.347857 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Oct 08 14:14:33 crc kubenswrapper[4735]: I1008 14:14:33.390914 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Oct 08 14:14:43 crc kubenswrapper[4735]: I1008 14:14:43.895104 4735 scope.go:117] "RemoveContainer" 
containerID="7ff2ebd06cea5cf44fc587b3ec8c967642921329412c26c223914219d4254eaf" Oct 08 14:14:43 crc kubenswrapper[4735]: E1008 14:14:43.895971 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:14:52 crc kubenswrapper[4735]: I1008 14:14:52.196608 4735 generic.go:334] "Generic (PLEG): container finished" podID="5d3c96e0-cba0-4d26-90a6-3c73bca0994d" containerID="08c4d1e969c96c545e5c5a8800caf590d47a712a73d90edc6aba0a128b1e8661" exitCode=0 Oct 08 14:14:52 crc kubenswrapper[4735]: I1008 14:14:52.196699 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5d3c96e0-cba0-4d26-90a6-3c73bca0994d","Type":"ContainerDied","Data":"08c4d1e969c96c545e5c5a8800caf590d47a712a73d90edc6aba0a128b1e8661"} Oct 08 14:14:53 crc kubenswrapper[4735]: I1008 14:14:53.207020 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5d3c96e0-cba0-4d26-90a6-3c73bca0994d","Type":"ContainerStarted","Data":"987ba9ff642040fc449ac99965a4df646f15631811b47cf6cf7901560ef7533d"} Oct 08 14:14:53 crc kubenswrapper[4735]: I1008 14:14:53.207756 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 08 14:14:53 crc kubenswrapper[4735]: I1008 14:14:53.208727 4735 generic.go:334] "Generic (PLEG): container finished" podID="09b05bff-5dbc-4129-9e21-92b234846931" containerID="83e258640454c1fe277b8c0862c7835e7075968fa5b63c2af4659d2669d512f5" exitCode=0 Oct 08 14:14:53 crc kubenswrapper[4735]: I1008 14:14:53.208776 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/rabbitmq-server-0" event={"ID":"09b05bff-5dbc-4129-9e21-92b234846931","Type":"ContainerDied","Data":"83e258640454c1fe277b8c0862c7835e7075968fa5b63c2af4659d2669d512f5"} Oct 08 14:14:53 crc kubenswrapper[4735]: I1008 14:14:53.231665 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.231640983 podStartE2EDuration="36.231640983s" podCreationTimestamp="2025-10-08 14:14:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:14:53.230599706 +0000 UTC m=+4743.038904280" watchObservedRunningTime="2025-10-08 14:14:53.231640983 +0000 UTC m=+4743.039945537" Oct 08 14:14:54 crc kubenswrapper[4735]: I1008 14:14:54.236821 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"09b05bff-5dbc-4129-9e21-92b234846931","Type":"ContainerStarted","Data":"79c1e7ff9c4b3eff78b4b75b1d8a91bcb7c92abd7841bdc51ce6c6d7ef1c1dfc"} Oct 08 14:14:54 crc kubenswrapper[4735]: I1008 14:14:54.237745 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 08 14:14:54 crc kubenswrapper[4735]: I1008 14:14:54.262709 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.262683509 podStartE2EDuration="37.262683509s" podCreationTimestamp="2025-10-08 14:14:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:14:54.254716395 +0000 UTC m=+4744.063020959" watchObservedRunningTime="2025-10-08 14:14:54.262683509 +0000 UTC m=+4744.070988073" Oct 08 14:14:55 crc kubenswrapper[4735]: I1008 14:14:55.895593 4735 scope.go:117] "RemoveContainer" containerID="7ff2ebd06cea5cf44fc587b3ec8c967642921329412c26c223914219d4254eaf" Oct 08 14:14:55 crc 
kubenswrapper[4735]: E1008 14:14:55.896077 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:15:00 crc kubenswrapper[4735]: I1008 14:15:00.144087 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332215-vqvm6"] Oct 08 14:15:00 crc kubenswrapper[4735]: E1008 14:15:00.144809 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b65dc460-560f-4390-82ee-d72f76482e30" containerName="init" Oct 08 14:15:00 crc kubenswrapper[4735]: I1008 14:15:00.144833 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="b65dc460-560f-4390-82ee-d72f76482e30" containerName="init" Oct 08 14:15:00 crc kubenswrapper[4735]: E1008 14:15:00.144869 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b65dc460-560f-4390-82ee-d72f76482e30" containerName="dnsmasq-dns" Oct 08 14:15:00 crc kubenswrapper[4735]: I1008 14:15:00.144879 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="b65dc460-560f-4390-82ee-d72f76482e30" containerName="dnsmasq-dns" Oct 08 14:15:00 crc kubenswrapper[4735]: I1008 14:15:00.145085 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="b65dc460-560f-4390-82ee-d72f76482e30" containerName="dnsmasq-dns" Oct 08 14:15:00 crc kubenswrapper[4735]: I1008 14:15:00.145817 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332215-vqvm6" Oct 08 14:15:00 crc kubenswrapper[4735]: I1008 14:15:00.149557 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 08 14:15:00 crc kubenswrapper[4735]: I1008 14:15:00.149752 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 08 14:15:00 crc kubenswrapper[4735]: I1008 14:15:00.153246 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332215-vqvm6"] Oct 08 14:15:00 crc kubenswrapper[4735]: I1008 14:15:00.287544 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/25e167fc-9d02-4424-9f15-e49edef96765-secret-volume\") pod \"collect-profiles-29332215-vqvm6\" (UID: \"25e167fc-9d02-4424-9f15-e49edef96765\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332215-vqvm6" Oct 08 14:15:00 crc kubenswrapper[4735]: I1008 14:15:00.287697 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/25e167fc-9d02-4424-9f15-e49edef96765-config-volume\") pod \"collect-profiles-29332215-vqvm6\" (UID: \"25e167fc-9d02-4424-9f15-e49edef96765\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332215-vqvm6" Oct 08 14:15:00 crc kubenswrapper[4735]: I1008 14:15:00.287763 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5j9j8\" (UniqueName: \"kubernetes.io/projected/25e167fc-9d02-4424-9f15-e49edef96765-kube-api-access-5j9j8\") pod \"collect-profiles-29332215-vqvm6\" (UID: \"25e167fc-9d02-4424-9f15-e49edef96765\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29332215-vqvm6" Oct 08 14:15:00 crc kubenswrapper[4735]: I1008 14:15:00.389787 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/25e167fc-9d02-4424-9f15-e49edef96765-secret-volume\") pod \"collect-profiles-29332215-vqvm6\" (UID: \"25e167fc-9d02-4424-9f15-e49edef96765\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332215-vqvm6" Oct 08 14:15:00 crc kubenswrapper[4735]: I1008 14:15:00.390936 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/25e167fc-9d02-4424-9f15-e49edef96765-config-volume\") pod \"collect-profiles-29332215-vqvm6\" (UID: \"25e167fc-9d02-4424-9f15-e49edef96765\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332215-vqvm6" Oct 08 14:15:00 crc kubenswrapper[4735]: I1008 14:15:00.390994 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5j9j8\" (UniqueName: \"kubernetes.io/projected/25e167fc-9d02-4424-9f15-e49edef96765-kube-api-access-5j9j8\") pod \"collect-profiles-29332215-vqvm6\" (UID: \"25e167fc-9d02-4424-9f15-e49edef96765\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332215-vqvm6" Oct 08 14:15:00 crc kubenswrapper[4735]: I1008 14:15:00.392139 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/25e167fc-9d02-4424-9f15-e49edef96765-config-volume\") pod \"collect-profiles-29332215-vqvm6\" (UID: \"25e167fc-9d02-4424-9f15-e49edef96765\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332215-vqvm6" Oct 08 14:15:00 crc kubenswrapper[4735]: I1008 14:15:00.399082 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/25e167fc-9d02-4424-9f15-e49edef96765-secret-volume\") pod \"collect-profiles-29332215-vqvm6\" (UID: \"25e167fc-9d02-4424-9f15-e49edef96765\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332215-vqvm6" Oct 08 14:15:00 crc kubenswrapper[4735]: I1008 14:15:00.426578 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5j9j8\" (UniqueName: \"kubernetes.io/projected/25e167fc-9d02-4424-9f15-e49edef96765-kube-api-access-5j9j8\") pod \"collect-profiles-29332215-vqvm6\" (UID: \"25e167fc-9d02-4424-9f15-e49edef96765\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332215-vqvm6" Oct 08 14:15:00 crc kubenswrapper[4735]: I1008 14:15:00.473161 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332215-vqvm6" Oct 08 14:15:00 crc kubenswrapper[4735]: I1008 14:15:00.918683 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332215-vqvm6"] Oct 08 14:15:00 crc kubenswrapper[4735]: W1008 14:15:00.927425 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25e167fc_9d02_4424_9f15_e49edef96765.slice/crio-9351b718f71b6487349d3f2353f0da58ec59d11b828e83617f26ab00f3a1bebb WatchSource:0}: Error finding container 9351b718f71b6487349d3f2353f0da58ec59d11b828e83617f26ab00f3a1bebb: Status 404 returned error can't find the container with id 9351b718f71b6487349d3f2353f0da58ec59d11b828e83617f26ab00f3a1bebb Oct 08 14:15:01 crc kubenswrapper[4735]: I1008 14:15:01.286692 4735 generic.go:334] "Generic (PLEG): container finished" podID="25e167fc-9d02-4424-9f15-e49edef96765" containerID="a71dbeadb6891a8bf950a1384a01d172a4347b9ee91bec823dc65b92c5b13445" exitCode=0 Oct 08 14:15:01 crc kubenswrapper[4735]: I1008 14:15:01.286739 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29332215-vqvm6" event={"ID":"25e167fc-9d02-4424-9f15-e49edef96765","Type":"ContainerDied","Data":"a71dbeadb6891a8bf950a1384a01d172a4347b9ee91bec823dc65b92c5b13445"} Oct 08 14:15:01 crc kubenswrapper[4735]: I1008 14:15:01.287764 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332215-vqvm6" event={"ID":"25e167fc-9d02-4424-9f15-e49edef96765","Type":"ContainerStarted","Data":"9351b718f71b6487349d3f2353f0da58ec59d11b828e83617f26ab00f3a1bebb"} Oct 08 14:15:02 crc kubenswrapper[4735]: I1008 14:15:02.544058 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332215-vqvm6" Oct 08 14:15:02 crc kubenswrapper[4735]: I1008 14:15:02.623745 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5j9j8\" (UniqueName: \"kubernetes.io/projected/25e167fc-9d02-4424-9f15-e49edef96765-kube-api-access-5j9j8\") pod \"25e167fc-9d02-4424-9f15-e49edef96765\" (UID: \"25e167fc-9d02-4424-9f15-e49edef96765\") " Oct 08 14:15:02 crc kubenswrapper[4735]: I1008 14:15:02.623901 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/25e167fc-9d02-4424-9f15-e49edef96765-config-volume\") pod \"25e167fc-9d02-4424-9f15-e49edef96765\" (UID: \"25e167fc-9d02-4424-9f15-e49edef96765\") " Oct 08 14:15:02 crc kubenswrapper[4735]: I1008 14:15:02.623935 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/25e167fc-9d02-4424-9f15-e49edef96765-secret-volume\") pod \"25e167fc-9d02-4424-9f15-e49edef96765\" (UID: \"25e167fc-9d02-4424-9f15-e49edef96765\") " Oct 08 14:15:02 crc kubenswrapper[4735]: I1008 14:15:02.624864 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/25e167fc-9d02-4424-9f15-e49edef96765-config-volume" (OuterVolumeSpecName: "config-volume") pod "25e167fc-9d02-4424-9f15-e49edef96765" (UID: "25e167fc-9d02-4424-9f15-e49edef96765"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:15:02 crc kubenswrapper[4735]: I1008 14:15:02.629289 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e167fc-9d02-4424-9f15-e49edef96765-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "25e167fc-9d02-4424-9f15-e49edef96765" (UID: "25e167fc-9d02-4424-9f15-e49edef96765"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:15:02 crc kubenswrapper[4735]: I1008 14:15:02.629805 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e167fc-9d02-4424-9f15-e49edef96765-kube-api-access-5j9j8" (OuterVolumeSpecName: "kube-api-access-5j9j8") pod "25e167fc-9d02-4424-9f15-e49edef96765" (UID: "25e167fc-9d02-4424-9f15-e49edef96765"). InnerVolumeSpecName "kube-api-access-5j9j8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:15:02 crc kubenswrapper[4735]: I1008 14:15:02.725175 4735 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/25e167fc-9d02-4424-9f15-e49edef96765-config-volume\") on node \"crc\" DevicePath \"\"" Oct 08 14:15:02 crc kubenswrapper[4735]: I1008 14:15:02.725215 4735 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/25e167fc-9d02-4424-9f15-e49edef96765-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 08 14:15:02 crc kubenswrapper[4735]: I1008 14:15:02.725226 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5j9j8\" (UniqueName: \"kubernetes.io/projected/25e167fc-9d02-4424-9f15-e49edef96765-kube-api-access-5j9j8\") on node \"crc\" DevicePath \"\"" Oct 08 14:15:03 crc kubenswrapper[4735]: I1008 14:15:03.305257 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332215-vqvm6" event={"ID":"25e167fc-9d02-4424-9f15-e49edef96765","Type":"ContainerDied","Data":"9351b718f71b6487349d3f2353f0da58ec59d11b828e83617f26ab00f3a1bebb"} Oct 08 14:15:03 crc kubenswrapper[4735]: I1008 14:15:03.305302 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9351b718f71b6487349d3f2353f0da58ec59d11b828e83617f26ab00f3a1bebb" Oct 08 14:15:03 crc kubenswrapper[4735]: I1008 14:15:03.305305 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332215-vqvm6" Oct 08 14:15:03 crc kubenswrapper[4735]: I1008 14:15:03.610666 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332170-gh5m7"] Oct 08 14:15:03 crc kubenswrapper[4735]: I1008 14:15:03.615818 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332170-gh5m7"] Oct 08 14:15:04 crc kubenswrapper[4735]: I1008 14:15:04.905962 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="339b07cc-0402-457f-bb34-54b687ecab27" path="/var/lib/kubelet/pods/339b07cc-0402-457f-bb34-54b687ecab27/volumes" Oct 08 14:15:07 crc kubenswrapper[4735]: I1008 14:15:07.646222 4735 scope.go:117] "RemoveContainer" containerID="d1017f850ef74255e647c8fe7b81efd3452772d20c91d0d1577ab0b8a45967b5" Oct 08 14:15:08 crc kubenswrapper[4735]: I1008 14:15:08.516115 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 08 14:15:08 crc kubenswrapper[4735]: I1008 14:15:08.793646 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 08 14:15:09 crc kubenswrapper[4735]: I1008 14:15:09.895276 4735 scope.go:117] "RemoveContainer" containerID="7ff2ebd06cea5cf44fc587b3ec8c967642921329412c26c223914219d4254eaf" Oct 08 14:15:09 crc kubenswrapper[4735]: E1008 14:15:09.895494 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:15:13 crc kubenswrapper[4735]: I1008 14:15:13.291406 
4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5fdc957c47-8cpm2"] Oct 08 14:15:13 crc kubenswrapper[4735]: E1008 14:15:13.292014 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25e167fc-9d02-4424-9f15-e49edef96765" containerName="collect-profiles" Oct 08 14:15:13 crc kubenswrapper[4735]: I1008 14:15:13.292026 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="25e167fc-9d02-4424-9f15-e49edef96765" containerName="collect-profiles" Oct 08 14:15:13 crc kubenswrapper[4735]: I1008 14:15:13.292168 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="25e167fc-9d02-4424-9f15-e49edef96765" containerName="collect-profiles" Oct 08 14:15:13 crc kubenswrapper[4735]: I1008 14:15:13.292934 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fdc957c47-8cpm2" Oct 08 14:15:13 crc kubenswrapper[4735]: I1008 14:15:13.306037 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fdc957c47-8cpm2"] Oct 08 14:15:13 crc kubenswrapper[4735]: I1008 14:15:13.386889 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d5da6e7-5ff5-4ef1-b678-412c74f96b10-config\") pod \"dnsmasq-dns-5fdc957c47-8cpm2\" (UID: \"9d5da6e7-5ff5-4ef1-b678-412c74f96b10\") " pod="openstack/dnsmasq-dns-5fdc957c47-8cpm2" Oct 08 14:15:13 crc kubenswrapper[4735]: I1008 14:15:13.387203 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jms6p\" (UniqueName: \"kubernetes.io/projected/9d5da6e7-5ff5-4ef1-b678-412c74f96b10-kube-api-access-jms6p\") pod \"dnsmasq-dns-5fdc957c47-8cpm2\" (UID: \"9d5da6e7-5ff5-4ef1-b678-412c74f96b10\") " pod="openstack/dnsmasq-dns-5fdc957c47-8cpm2" Oct 08 14:15:13 crc kubenswrapper[4735]: I1008 14:15:13.387320 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d5da6e7-5ff5-4ef1-b678-412c74f96b10-dns-svc\") pod \"dnsmasq-dns-5fdc957c47-8cpm2\" (UID: \"9d5da6e7-5ff5-4ef1-b678-412c74f96b10\") " pod="openstack/dnsmasq-dns-5fdc957c47-8cpm2" Oct 08 14:15:13 crc kubenswrapper[4735]: I1008 14:15:13.488570 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jms6p\" (UniqueName: \"kubernetes.io/projected/9d5da6e7-5ff5-4ef1-b678-412c74f96b10-kube-api-access-jms6p\") pod \"dnsmasq-dns-5fdc957c47-8cpm2\" (UID: \"9d5da6e7-5ff5-4ef1-b678-412c74f96b10\") " pod="openstack/dnsmasq-dns-5fdc957c47-8cpm2" Oct 08 14:15:13 crc kubenswrapper[4735]: I1008 14:15:13.488977 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d5da6e7-5ff5-4ef1-b678-412c74f96b10-dns-svc\") pod \"dnsmasq-dns-5fdc957c47-8cpm2\" (UID: \"9d5da6e7-5ff5-4ef1-b678-412c74f96b10\") " pod="openstack/dnsmasq-dns-5fdc957c47-8cpm2" Oct 08 14:15:13 crc kubenswrapper[4735]: I1008 14:15:13.489170 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d5da6e7-5ff5-4ef1-b678-412c74f96b10-config\") pod \"dnsmasq-dns-5fdc957c47-8cpm2\" (UID: \"9d5da6e7-5ff5-4ef1-b678-412c74f96b10\") " pod="openstack/dnsmasq-dns-5fdc957c47-8cpm2" Oct 08 14:15:13 crc kubenswrapper[4735]: I1008 14:15:13.489757 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d5da6e7-5ff5-4ef1-b678-412c74f96b10-dns-svc\") pod \"dnsmasq-dns-5fdc957c47-8cpm2\" (UID: \"9d5da6e7-5ff5-4ef1-b678-412c74f96b10\") " pod="openstack/dnsmasq-dns-5fdc957c47-8cpm2" Oct 08 14:15:13 crc kubenswrapper[4735]: I1008 14:15:13.489918 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9d5da6e7-5ff5-4ef1-b678-412c74f96b10-config\") pod \"dnsmasq-dns-5fdc957c47-8cpm2\" (UID: \"9d5da6e7-5ff5-4ef1-b678-412c74f96b10\") " pod="openstack/dnsmasq-dns-5fdc957c47-8cpm2" Oct 08 14:15:13 crc kubenswrapper[4735]: I1008 14:15:13.674714 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jms6p\" (UniqueName: \"kubernetes.io/projected/9d5da6e7-5ff5-4ef1-b678-412c74f96b10-kube-api-access-jms6p\") pod \"dnsmasq-dns-5fdc957c47-8cpm2\" (UID: \"9d5da6e7-5ff5-4ef1-b678-412c74f96b10\") " pod="openstack/dnsmasq-dns-5fdc957c47-8cpm2" Oct 08 14:15:13 crc kubenswrapper[4735]: I1008 14:15:13.911866 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fdc957c47-8cpm2" Oct 08 14:15:14 crc kubenswrapper[4735]: I1008 14:15:14.004634 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 08 14:15:14 crc kubenswrapper[4735]: I1008 14:15:14.473272 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fdc957c47-8cpm2"] Oct 08 14:15:14 crc kubenswrapper[4735]: I1008 14:15:14.800003 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 08 14:15:15 crc kubenswrapper[4735]: I1008 14:15:15.406179 4735 generic.go:334] "Generic (PLEG): container finished" podID="9d5da6e7-5ff5-4ef1-b678-412c74f96b10" containerID="6ef988404821512e241a0705c6673868869b487258aeb87d7ad9a17d26b57ed1" exitCode=0 Oct 08 14:15:15 crc kubenswrapper[4735]: I1008 14:15:15.406229 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fdc957c47-8cpm2" event={"ID":"9d5da6e7-5ff5-4ef1-b678-412c74f96b10","Type":"ContainerDied","Data":"6ef988404821512e241a0705c6673868869b487258aeb87d7ad9a17d26b57ed1"} Oct 08 14:15:15 crc kubenswrapper[4735]: I1008 14:15:15.406286 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fdc957c47-8cpm2" 
event={"ID":"9d5da6e7-5ff5-4ef1-b678-412c74f96b10","Type":"ContainerStarted","Data":"c2aa5bf3c159a980b56a2dc3fc7c2579e6f8ea7deaad93145b26fcc17cee9a7e"} Oct 08 14:15:16 crc kubenswrapper[4735]: I1008 14:15:16.030449 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="09b05bff-5dbc-4129-9e21-92b234846931" containerName="rabbitmq" containerID="cri-o://79c1e7ff9c4b3eff78b4b75b1d8a91bcb7c92abd7841bdc51ce6c6d7ef1c1dfc" gracePeriod=604798 Oct 08 14:15:16 crc kubenswrapper[4735]: I1008 14:15:16.420353 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fdc957c47-8cpm2" event={"ID":"9d5da6e7-5ff5-4ef1-b678-412c74f96b10","Type":"ContainerStarted","Data":"d295dab40729a3ed69ee193d3ade1447d290312a6c4447108efb9e890e3f39ec"} Oct 08 14:15:16 crc kubenswrapper[4735]: I1008 14:15:16.420809 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5fdc957c47-8cpm2" Oct 08 14:15:16 crc kubenswrapper[4735]: I1008 14:15:16.442254 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5fdc957c47-8cpm2" podStartSLOduration=3.442204816 podStartE2EDuration="3.442204816s" podCreationTimestamp="2025-10-08 14:15:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:15:16.439502787 +0000 UTC m=+4766.247807341" watchObservedRunningTime="2025-10-08 14:15:16.442204816 +0000 UTC m=+4766.250509380" Oct 08 14:15:16 crc kubenswrapper[4735]: I1008 14:15:16.590530 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="5d3c96e0-cba0-4d26-90a6-3c73bca0994d" containerName="rabbitmq" containerID="cri-o://987ba9ff642040fc449ac99965a4df646f15631811b47cf6cf7901560ef7533d" gracePeriod=604799 Oct 08 14:15:18 crc kubenswrapper[4735]: I1008 14:15:18.514882 4735 
prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="5d3c96e0-cba0-4d26-90a6-3c73bca0994d" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.236:5672: connect: connection refused" Oct 08 14:15:18 crc kubenswrapper[4735]: I1008 14:15:18.793191 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="09b05bff-5dbc-4129-9e21-92b234846931" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.237:5672: connect: connection refused" Oct 08 14:15:22 crc kubenswrapper[4735]: I1008 14:15:22.463396 4735 generic.go:334] "Generic (PLEG): container finished" podID="09b05bff-5dbc-4129-9e21-92b234846931" containerID="79c1e7ff9c4b3eff78b4b75b1d8a91bcb7c92abd7841bdc51ce6c6d7ef1c1dfc" exitCode=0 Oct 08 14:15:22 crc kubenswrapper[4735]: I1008 14:15:22.463709 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"09b05bff-5dbc-4129-9e21-92b234846931","Type":"ContainerDied","Data":"79c1e7ff9c4b3eff78b4b75b1d8a91bcb7c92abd7841bdc51ce6c6d7ef1c1dfc"} Oct 08 14:15:22 crc kubenswrapper[4735]: I1008 14:15:22.861762 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 08 14:15:22 crc kubenswrapper[4735]: I1008 14:15:22.935598 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/09b05bff-5dbc-4129-9e21-92b234846931-rabbitmq-erlang-cookie\") pod \"09b05bff-5dbc-4129-9e21-92b234846931\" (UID: \"09b05bff-5dbc-4129-9e21-92b234846931\") " Oct 08 14:15:22 crc kubenswrapper[4735]: I1008 14:15:22.935644 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/09b05bff-5dbc-4129-9e21-92b234846931-server-conf\") pod \"09b05bff-5dbc-4129-9e21-92b234846931\" (UID: \"09b05bff-5dbc-4129-9e21-92b234846931\") " Oct 08 14:15:22 crc kubenswrapper[4735]: I1008 14:15:22.935723 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/09b05bff-5dbc-4129-9e21-92b234846931-pod-info\") pod \"09b05bff-5dbc-4129-9e21-92b234846931\" (UID: \"09b05bff-5dbc-4129-9e21-92b234846931\") " Oct 08 14:15:22 crc kubenswrapper[4735]: I1008 14:15:22.935744 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/09b05bff-5dbc-4129-9e21-92b234846931-rabbitmq-confd\") pod \"09b05bff-5dbc-4129-9e21-92b234846931\" (UID: \"09b05bff-5dbc-4129-9e21-92b234846931\") " Oct 08 14:15:22 crc kubenswrapper[4735]: I1008 14:15:22.935791 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/09b05bff-5dbc-4129-9e21-92b234846931-erlang-cookie-secret\") pod \"09b05bff-5dbc-4129-9e21-92b234846931\" (UID: \"09b05bff-5dbc-4129-9e21-92b234846931\") " Oct 08 14:15:22 crc kubenswrapper[4735]: I1008 14:15:22.935823 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/09b05bff-5dbc-4129-9e21-92b234846931-plugins-conf\") pod \"09b05bff-5dbc-4129-9e21-92b234846931\" (UID: \"09b05bff-5dbc-4129-9e21-92b234846931\") " Oct 08 14:15:22 crc kubenswrapper[4735]: I1008 14:15:22.936026 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d8d37369-b842-4be2-b6f3-76368d3ee730\") pod \"09b05bff-5dbc-4129-9e21-92b234846931\" (UID: \"09b05bff-5dbc-4129-9e21-92b234846931\") " Oct 08 14:15:22 crc kubenswrapper[4735]: I1008 14:15:22.936115 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/09b05bff-5dbc-4129-9e21-92b234846931-rabbitmq-plugins\") pod \"09b05bff-5dbc-4129-9e21-92b234846931\" (UID: \"09b05bff-5dbc-4129-9e21-92b234846931\") " Oct 08 14:15:22 crc kubenswrapper[4735]: I1008 14:15:22.936141 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngv28\" (UniqueName: \"kubernetes.io/projected/09b05bff-5dbc-4129-9e21-92b234846931-kube-api-access-ngv28\") pod \"09b05bff-5dbc-4129-9e21-92b234846931\" (UID: \"09b05bff-5dbc-4129-9e21-92b234846931\") " Oct 08 14:15:22 crc kubenswrapper[4735]: I1008 14:15:22.936849 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09b05bff-5dbc-4129-9e21-92b234846931-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "09b05bff-5dbc-4129-9e21-92b234846931" (UID: "09b05bff-5dbc-4129-9e21-92b234846931"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:15:22 crc kubenswrapper[4735]: I1008 14:15:22.937571 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09b05bff-5dbc-4129-9e21-92b234846931-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "09b05bff-5dbc-4129-9e21-92b234846931" (UID: "09b05bff-5dbc-4129-9e21-92b234846931"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 14:15:22 crc kubenswrapper[4735]: I1008 14:15:22.937708 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09b05bff-5dbc-4129-9e21-92b234846931-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "09b05bff-5dbc-4129-9e21-92b234846931" (UID: "09b05bff-5dbc-4129-9e21-92b234846931"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 14:15:22 crc kubenswrapper[4735]: I1008 14:15:22.944084 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09b05bff-5dbc-4129-9e21-92b234846931-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "09b05bff-5dbc-4129-9e21-92b234846931" (UID: "09b05bff-5dbc-4129-9e21-92b234846931"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:15:22 crc kubenswrapper[4735]: I1008 14:15:22.952318 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/09b05bff-5dbc-4129-9e21-92b234846931-pod-info" (OuterVolumeSpecName: "pod-info") pod "09b05bff-5dbc-4129-9e21-92b234846931" (UID: "09b05bff-5dbc-4129-9e21-92b234846931"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 08 14:15:22 crc kubenswrapper[4735]: I1008 14:15:22.967050 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09b05bff-5dbc-4129-9e21-92b234846931-kube-api-access-ngv28" (OuterVolumeSpecName: "kube-api-access-ngv28") pod "09b05bff-5dbc-4129-9e21-92b234846931" (UID: "09b05bff-5dbc-4129-9e21-92b234846931"). InnerVolumeSpecName "kube-api-access-ngv28". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:15:22 crc kubenswrapper[4735]: I1008 14:15:22.975876 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d8d37369-b842-4be2-b6f3-76368d3ee730" (OuterVolumeSpecName: "persistence") pod "09b05bff-5dbc-4129-9e21-92b234846931" (UID: "09b05bff-5dbc-4129-9e21-92b234846931"). InnerVolumeSpecName "pvc-d8d37369-b842-4be2-b6f3-76368d3ee730". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 08 14:15:22 crc kubenswrapper[4735]: I1008 14:15:22.981267 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09b05bff-5dbc-4129-9e21-92b234846931-server-conf" (OuterVolumeSpecName: "server-conf") pod "09b05bff-5dbc-4129-9e21-92b234846931" (UID: "09b05bff-5dbc-4129-9e21-92b234846931"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.038627 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngv28\" (UniqueName: \"kubernetes.io/projected/09b05bff-5dbc-4129-9e21-92b234846931-kube-api-access-ngv28\") on node \"crc\" DevicePath \"\"" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.038660 4735 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/09b05bff-5dbc-4129-9e21-92b234846931-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.038673 4735 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/09b05bff-5dbc-4129-9e21-92b234846931-server-conf\") on node \"crc\" DevicePath \"\"" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.038686 4735 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/09b05bff-5dbc-4129-9e21-92b234846931-pod-info\") on node \"crc\" DevicePath \"\"" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.038698 4735 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/09b05bff-5dbc-4129-9e21-92b234846931-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.038709 4735 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/09b05bff-5dbc-4129-9e21-92b234846931-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.038740 4735 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-d8d37369-b842-4be2-b6f3-76368d3ee730\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d8d37369-b842-4be2-b6f3-76368d3ee730\") on node 
\"crc\" " Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.038753 4735 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/09b05bff-5dbc-4129-9e21-92b234846931-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.044877 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09b05bff-5dbc-4129-9e21-92b234846931-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "09b05bff-5dbc-4129-9e21-92b234846931" (UID: "09b05bff-5dbc-4129-9e21-92b234846931"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.070897 4735 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.071070 4735 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-d8d37369-b842-4be2-b6f3-76368d3ee730" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d8d37369-b842-4be2-b6f3-76368d3ee730") on node "crc" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.140428 4735 reconciler_common.go:293] "Volume detached for volume \"pvc-d8d37369-b842-4be2-b6f3-76368d3ee730\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d8d37369-b842-4be2-b6f3-76368d3ee730\") on node \"crc\" DevicePath \"\"" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.140808 4735 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/09b05bff-5dbc-4129-9e21-92b234846931-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.341017 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.443908 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52mh9\" (UniqueName: \"kubernetes.io/projected/5d3c96e0-cba0-4d26-90a6-3c73bca0994d-kube-api-access-52mh9\") pod \"5d3c96e0-cba0-4d26-90a6-3c73bca0994d\" (UID: \"5d3c96e0-cba0-4d26-90a6-3c73bca0994d\") " Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.443961 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5d3c96e0-cba0-4d26-90a6-3c73bca0994d-plugins-conf\") pod \"5d3c96e0-cba0-4d26-90a6-3c73bca0994d\" (UID: \"5d3c96e0-cba0-4d26-90a6-3c73bca0994d\") " Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.443994 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5d3c96e0-cba0-4d26-90a6-3c73bca0994d-rabbitmq-erlang-cookie\") pod \"5d3c96e0-cba0-4d26-90a6-3c73bca0994d\" (UID: \"5d3c96e0-cba0-4d26-90a6-3c73bca0994d\") " Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.444174 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0f93cc8d-707c-4bae-944b-50c607295991\") pod \"5d3c96e0-cba0-4d26-90a6-3c73bca0994d\" (UID: \"5d3c96e0-cba0-4d26-90a6-3c73bca0994d\") " Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.444201 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5d3c96e0-cba0-4d26-90a6-3c73bca0994d-rabbitmq-confd\") pod \"5d3c96e0-cba0-4d26-90a6-3c73bca0994d\" (UID: \"5d3c96e0-cba0-4d26-90a6-3c73bca0994d\") " Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.444252 4735 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5d3c96e0-cba0-4d26-90a6-3c73bca0994d-server-conf\") pod \"5d3c96e0-cba0-4d26-90a6-3c73bca0994d\" (UID: \"5d3c96e0-cba0-4d26-90a6-3c73bca0994d\") " Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.444301 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5d3c96e0-cba0-4d26-90a6-3c73bca0994d-erlang-cookie-secret\") pod \"5d3c96e0-cba0-4d26-90a6-3c73bca0994d\" (UID: \"5d3c96e0-cba0-4d26-90a6-3c73bca0994d\") " Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.444351 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5d3c96e0-cba0-4d26-90a6-3c73bca0994d-rabbitmq-plugins\") pod \"5d3c96e0-cba0-4d26-90a6-3c73bca0994d\" (UID: \"5d3c96e0-cba0-4d26-90a6-3c73bca0994d\") " Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.444393 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5d3c96e0-cba0-4d26-90a6-3c73bca0994d-pod-info\") pod \"5d3c96e0-cba0-4d26-90a6-3c73bca0994d\" (UID: \"5d3c96e0-cba0-4d26-90a6-3c73bca0994d\") " Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.444795 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d3c96e0-cba0-4d26-90a6-3c73bca0994d-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "5d3c96e0-cba0-4d26-90a6-3c73bca0994d" (UID: "5d3c96e0-cba0-4d26-90a6-3c73bca0994d"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.445071 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d3c96e0-cba0-4d26-90a6-3c73bca0994d-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "5d3c96e0-cba0-4d26-90a6-3c73bca0994d" (UID: "5d3c96e0-cba0-4d26-90a6-3c73bca0994d"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.445863 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d3c96e0-cba0-4d26-90a6-3c73bca0994d-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "5d3c96e0-cba0-4d26-90a6-3c73bca0994d" (UID: "5d3c96e0-cba0-4d26-90a6-3c73bca0994d"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.448210 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d3c96e0-cba0-4d26-90a6-3c73bca0994d-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "5d3c96e0-cba0-4d26-90a6-3c73bca0994d" (UID: "5d3c96e0-cba0-4d26-90a6-3c73bca0994d"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.448912 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/5d3c96e0-cba0-4d26-90a6-3c73bca0994d-pod-info" (OuterVolumeSpecName: "pod-info") pod "5d3c96e0-cba0-4d26-90a6-3c73bca0994d" (UID: "5d3c96e0-cba0-4d26-90a6-3c73bca0994d"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.450793 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d3c96e0-cba0-4d26-90a6-3c73bca0994d-kube-api-access-52mh9" (OuterVolumeSpecName: "kube-api-access-52mh9") pod "5d3c96e0-cba0-4d26-90a6-3c73bca0994d" (UID: "5d3c96e0-cba0-4d26-90a6-3c73bca0994d"). InnerVolumeSpecName "kube-api-access-52mh9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.455916 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0f93cc8d-707c-4bae-944b-50c607295991" (OuterVolumeSpecName: "persistence") pod "5d3c96e0-cba0-4d26-90a6-3c73bca0994d" (UID: "5d3c96e0-cba0-4d26-90a6-3c73bca0994d"). InnerVolumeSpecName "pvc-0f93cc8d-707c-4bae-944b-50c607295991". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.471173 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d3c96e0-cba0-4d26-90a6-3c73bca0994d-server-conf" (OuterVolumeSpecName: "server-conf") pod "5d3c96e0-cba0-4d26-90a6-3c73bca0994d" (UID: "5d3c96e0-cba0-4d26-90a6-3c73bca0994d"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.474390 4735 generic.go:334] "Generic (PLEG): container finished" podID="5d3c96e0-cba0-4d26-90a6-3c73bca0994d" containerID="987ba9ff642040fc449ac99965a4df646f15631811b47cf6cf7901560ef7533d" exitCode=0 Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.474509 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5d3c96e0-cba0-4d26-90a6-3c73bca0994d","Type":"ContainerDied","Data":"987ba9ff642040fc449ac99965a4df646f15631811b47cf6cf7901560ef7533d"} Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.474675 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5d3c96e0-cba0-4d26-90a6-3c73bca0994d","Type":"ContainerDied","Data":"edad1b04ceb24cfef1307e04ab7bea6c82fde88520b342757e034c1dbfa2d472"} Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.474701 4735 scope.go:117] "RemoveContainer" containerID="987ba9ff642040fc449ac99965a4df646f15631811b47cf6cf7901560ef7533d" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.474537 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.479242 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"09b05bff-5dbc-4129-9e21-92b234846931","Type":"ContainerDied","Data":"5d52f61a6a83da937bd6196a00486e47f85e0e0d3535de49168688309a90c84a"} Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.479305 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.545008 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d3c96e0-cba0-4d26-90a6-3c73bca0994d-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "5d3c96e0-cba0-4d26-90a6-3c73bca0994d" (UID: "5d3c96e0-cba0-4d26-90a6-3c73bca0994d"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.545263 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5d3c96e0-cba0-4d26-90a6-3c73bca0994d-rabbitmq-confd\") pod \"5d3c96e0-cba0-4d26-90a6-3c73bca0994d\" (UID: \"5d3c96e0-cba0-4d26-90a6-3c73bca0994d\") " Oct 08 14:15:23 crc kubenswrapper[4735]: W1008 14:15:23.545424 4735 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/5d3c96e0-cba0-4d26-90a6-3c73bca0994d/volumes/kubernetes.io~projected/rabbitmq-confd Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.545503 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d3c96e0-cba0-4d26-90a6-3c73bca0994d-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "5d3c96e0-cba0-4d26-90a6-3c73bca0994d" (UID: "5d3c96e0-cba0-4d26-90a6-3c73bca0994d"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.545855 4735 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5d3c96e0-cba0-4d26-90a6-3c73bca0994d-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.545896 4735 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-0f93cc8d-707c-4bae-944b-50c607295991\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0f93cc8d-707c-4bae-944b-50c607295991\") on node \"crc\" " Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.545911 4735 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5d3c96e0-cba0-4d26-90a6-3c73bca0994d-server-conf\") on node \"crc\" DevicePath \"\"" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.545920 4735 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5d3c96e0-cba0-4d26-90a6-3c73bca0994d-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.545929 4735 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5d3c96e0-cba0-4d26-90a6-3c73bca0994d-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.545939 4735 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5d3c96e0-cba0-4d26-90a6-3c73bca0994d-pod-info\") on node \"crc\" DevicePath \"\"" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.545948 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52mh9\" (UniqueName: \"kubernetes.io/projected/5d3c96e0-cba0-4d26-90a6-3c73bca0994d-kube-api-access-52mh9\") on node \"crc\" DevicePath \"\"" Oct 
08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.545955 4735 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5d3c96e0-cba0-4d26-90a6-3c73bca0994d-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.545964 4735 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5d3c96e0-cba0-4d26-90a6-3c73bca0994d-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.559219 4735 scope.go:117] "RemoveContainer" containerID="08c4d1e969c96c545e5c5a8800caf590d47a712a73d90edc6aba0a128b1e8661" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.561460 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.565230 4735 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.565411 4735 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-0f93cc8d-707c-4bae-944b-50c607295991" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0f93cc8d-707c-4bae-944b-50c607295991") on node "crc" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.572193 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.582028 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 08 14:15:23 crc kubenswrapper[4735]: E1008 14:15:23.582339 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09b05bff-5dbc-4129-9e21-92b234846931" containerName="setup-container" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.582352 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="09b05bff-5dbc-4129-9e21-92b234846931" containerName="setup-container" Oct 08 14:15:23 crc kubenswrapper[4735]: E1008 14:15:23.582374 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d3c96e0-cba0-4d26-90a6-3c73bca0994d" containerName="rabbitmq" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.582380 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d3c96e0-cba0-4d26-90a6-3c73bca0994d" containerName="rabbitmq" Oct 08 14:15:23 crc kubenswrapper[4735]: E1008 14:15:23.582390 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09b05bff-5dbc-4129-9e21-92b234846931" containerName="rabbitmq" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.582397 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="09b05bff-5dbc-4129-9e21-92b234846931" containerName="rabbitmq" Oct 08 14:15:23 crc kubenswrapper[4735]: E1008 14:15:23.582415 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d3c96e0-cba0-4d26-90a6-3c73bca0994d" containerName="setup-container" Oct 08 14:15:23 crc 
kubenswrapper[4735]: I1008 14:15:23.582421 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d3c96e0-cba0-4d26-90a6-3c73bca0994d" containerName="setup-container" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.582584 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="09b05bff-5dbc-4129-9e21-92b234846931" containerName="rabbitmq" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.582608 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d3c96e0-cba0-4d26-90a6-3c73bca0994d" containerName="rabbitmq" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.584171 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.588901 4735 scope.go:117] "RemoveContainer" containerID="987ba9ff642040fc449ac99965a4df646f15631811b47cf6cf7901560ef7533d" Oct 08 14:15:23 crc kubenswrapper[4735]: E1008 14:15:23.589716 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"987ba9ff642040fc449ac99965a4df646f15631811b47cf6cf7901560ef7533d\": container with ID starting with 987ba9ff642040fc449ac99965a4df646f15631811b47cf6cf7901560ef7533d not found: ID does not exist" containerID="987ba9ff642040fc449ac99965a4df646f15631811b47cf6cf7901560ef7533d" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.589744 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"987ba9ff642040fc449ac99965a4df646f15631811b47cf6cf7901560ef7533d"} err="failed to get container status \"987ba9ff642040fc449ac99965a4df646f15631811b47cf6cf7901560ef7533d\": rpc error: code = NotFound desc = could not find container \"987ba9ff642040fc449ac99965a4df646f15631811b47cf6cf7901560ef7533d\": container with ID starting with 987ba9ff642040fc449ac99965a4df646f15631811b47cf6cf7901560ef7533d not found: ID does not exist" Oct 08 14:15:23 crc 
kubenswrapper[4735]: I1008 14:15:23.589769 4735 scope.go:117] "RemoveContainer" containerID="08c4d1e969c96c545e5c5a8800caf590d47a712a73d90edc6aba0a128b1e8661" Oct 08 14:15:23 crc kubenswrapper[4735]: E1008 14:15:23.590241 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08c4d1e969c96c545e5c5a8800caf590d47a712a73d90edc6aba0a128b1e8661\": container with ID starting with 08c4d1e969c96c545e5c5a8800caf590d47a712a73d90edc6aba0a128b1e8661 not found: ID does not exist" containerID="08c4d1e969c96c545e5c5a8800caf590d47a712a73d90edc6aba0a128b1e8661" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.590285 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08c4d1e969c96c545e5c5a8800caf590d47a712a73d90edc6aba0a128b1e8661"} err="failed to get container status \"08c4d1e969c96c545e5c5a8800caf590d47a712a73d90edc6aba0a128b1e8661\": rpc error: code = NotFound desc = could not find container \"08c4d1e969c96c545e5c5a8800caf590d47a712a73d90edc6aba0a128b1e8661\": container with ID starting with 08c4d1e969c96c545e5c5a8800caf590d47a712a73d90edc6aba0a128b1e8661 not found: ID does not exist" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.590313 4735 scope.go:117] "RemoveContainer" containerID="79c1e7ff9c4b3eff78b4b75b1d8a91bcb7c92abd7841bdc51ce6c6d7ef1c1dfc" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.591299 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-s9kh2" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.591534 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.591743 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.592972 4735 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.593334 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.597247 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.620930 4735 scope.go:117] "RemoveContainer" containerID="83e258640454c1fe277b8c0862c7835e7075968fa5b63c2af4659d2669d512f5" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.647620 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/792b2295-d7e4-41e9-b0c4-c789d057d8f0-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"792b2295-d7e4-41e9-b0c4-c789d057d8f0\") " pod="openstack/rabbitmq-server-0" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.647688 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/792b2295-d7e4-41e9-b0c4-c789d057d8f0-pod-info\") pod \"rabbitmq-server-0\" (UID: \"792b2295-d7e4-41e9-b0c4-c789d057d8f0\") " pod="openstack/rabbitmq-server-0" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.647740 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xg5fn\" (UniqueName: \"kubernetes.io/projected/792b2295-d7e4-41e9-b0c4-c789d057d8f0-kube-api-access-xg5fn\") pod \"rabbitmq-server-0\" (UID: \"792b2295-d7e4-41e9-b0c4-c789d057d8f0\") " pod="openstack/rabbitmq-server-0" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.647758 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/792b2295-d7e4-41e9-b0c4-c789d057d8f0-server-conf\") pod \"rabbitmq-server-0\" (UID: \"792b2295-d7e4-41e9-b0c4-c789d057d8f0\") " pod="openstack/rabbitmq-server-0" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.647788 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/792b2295-d7e4-41e9-b0c4-c789d057d8f0-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"792b2295-d7e4-41e9-b0c4-c789d057d8f0\") " pod="openstack/rabbitmq-server-0" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.647811 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/792b2295-d7e4-41e9-b0c4-c789d057d8f0-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"792b2295-d7e4-41e9-b0c4-c789d057d8f0\") " pod="openstack/rabbitmq-server-0" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.647961 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d8d37369-b842-4be2-b6f3-76368d3ee730\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d8d37369-b842-4be2-b6f3-76368d3ee730\") pod \"rabbitmq-server-0\" (UID: \"792b2295-d7e4-41e9-b0c4-c789d057d8f0\") " pod="openstack/rabbitmq-server-0" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.648065 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/792b2295-d7e4-41e9-b0c4-c789d057d8f0-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"792b2295-d7e4-41e9-b0c4-c789d057d8f0\") " pod="openstack/rabbitmq-server-0" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.648108 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/792b2295-d7e4-41e9-b0c4-c789d057d8f0-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"792b2295-d7e4-41e9-b0c4-c789d057d8f0\") " pod="openstack/rabbitmq-server-0" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.648229 4735 reconciler_common.go:293] "Volume detached for volume \"pvc-0f93cc8d-707c-4bae-944b-50c607295991\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0f93cc8d-707c-4bae-944b-50c607295991\") on node \"crc\" DevicePath \"\"" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.750084 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d8d37369-b842-4be2-b6f3-76368d3ee730\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d8d37369-b842-4be2-b6f3-76368d3ee730\") pod \"rabbitmq-server-0\" (UID: \"792b2295-d7e4-41e9-b0c4-c789d057d8f0\") " pod="openstack/rabbitmq-server-0" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.750153 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/792b2295-d7e4-41e9-b0c4-c789d057d8f0-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"792b2295-d7e4-41e9-b0c4-c789d057d8f0\") " pod="openstack/rabbitmq-server-0" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.750174 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/792b2295-d7e4-41e9-b0c4-c789d057d8f0-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"792b2295-d7e4-41e9-b0c4-c789d057d8f0\") " pod="openstack/rabbitmq-server-0" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.750202 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/792b2295-d7e4-41e9-b0c4-c789d057d8f0-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: 
\"792b2295-d7e4-41e9-b0c4-c789d057d8f0\") " pod="openstack/rabbitmq-server-0" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.750246 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/792b2295-d7e4-41e9-b0c4-c789d057d8f0-pod-info\") pod \"rabbitmq-server-0\" (UID: \"792b2295-d7e4-41e9-b0c4-c789d057d8f0\") " pod="openstack/rabbitmq-server-0" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.750275 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xg5fn\" (UniqueName: \"kubernetes.io/projected/792b2295-d7e4-41e9-b0c4-c789d057d8f0-kube-api-access-xg5fn\") pod \"rabbitmq-server-0\" (UID: \"792b2295-d7e4-41e9-b0c4-c789d057d8f0\") " pod="openstack/rabbitmq-server-0" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.750295 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/792b2295-d7e4-41e9-b0c4-c789d057d8f0-server-conf\") pod \"rabbitmq-server-0\" (UID: \"792b2295-d7e4-41e9-b0c4-c789d057d8f0\") " pod="openstack/rabbitmq-server-0" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.750321 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/792b2295-d7e4-41e9-b0c4-c789d057d8f0-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"792b2295-d7e4-41e9-b0c4-c789d057d8f0\") " pod="openstack/rabbitmq-server-0" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.750346 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/792b2295-d7e4-41e9-b0c4-c789d057d8f0-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"792b2295-d7e4-41e9-b0c4-c789d057d8f0\") " pod="openstack/rabbitmq-server-0" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.750811 4735 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/792b2295-d7e4-41e9-b0c4-c789d057d8f0-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"792b2295-d7e4-41e9-b0c4-c789d057d8f0\") " pod="openstack/rabbitmq-server-0" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.751388 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/792b2295-d7e4-41e9-b0c4-c789d057d8f0-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"792b2295-d7e4-41e9-b0c4-c789d057d8f0\") " pod="openstack/rabbitmq-server-0" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.751744 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/792b2295-d7e4-41e9-b0c4-c789d057d8f0-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"792b2295-d7e4-41e9-b0c4-c789d057d8f0\") " pod="openstack/rabbitmq-server-0" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.752067 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/792b2295-d7e4-41e9-b0c4-c789d057d8f0-server-conf\") pod \"rabbitmq-server-0\" (UID: \"792b2295-d7e4-41e9-b0c4-c789d057d8f0\") " pod="openstack/rabbitmq-server-0" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.753268 4735 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.753386 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d8d37369-b842-4be2-b6f3-76368d3ee730\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d8d37369-b842-4be2-b6f3-76368d3ee730\") pod \"rabbitmq-server-0\" (UID: \"792b2295-d7e4-41e9-b0c4-c789d057d8f0\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ff4eb5c2b518539e744c469d8195362bbdfa451f171055ff1192f9fd3f4a9c48/globalmount\"" pod="openstack/rabbitmq-server-0" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.754692 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/792b2295-d7e4-41e9-b0c4-c789d057d8f0-pod-info\") pod \"rabbitmq-server-0\" (UID: \"792b2295-d7e4-41e9-b0c4-c789d057d8f0\") " pod="openstack/rabbitmq-server-0" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.755140 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/792b2295-d7e4-41e9-b0c4-c789d057d8f0-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"792b2295-d7e4-41e9-b0c4-c789d057d8f0\") " pod="openstack/rabbitmq-server-0" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.755819 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/792b2295-d7e4-41e9-b0c4-c789d057d8f0-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"792b2295-d7e4-41e9-b0c4-c789d057d8f0\") " pod="openstack/rabbitmq-server-0" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.770450 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xg5fn\" (UniqueName: \"kubernetes.io/projected/792b2295-d7e4-41e9-b0c4-c789d057d8f0-kube-api-access-xg5fn\") pod \"rabbitmq-server-0\" (UID: \"792b2295-d7e4-41e9-b0c4-c789d057d8f0\") " 
pod="openstack/rabbitmq-server-0" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.790009 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d8d37369-b842-4be2-b6f3-76368d3ee730\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d8d37369-b842-4be2-b6f3-76368d3ee730\") pod \"rabbitmq-server-0\" (UID: \"792b2295-d7e4-41e9-b0c4-c789d057d8f0\") " pod="openstack/rabbitmq-server-0" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.804310 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.809258 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.832308 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.835548 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.838331 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.838928 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-x2hvh" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.839214 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.839731 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.841053 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.846883 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.894684 4735 scope.go:117] "RemoveContainer" containerID="7ff2ebd06cea5cf44fc587b3ec8c967642921329412c26c223914219d4254eaf" Oct 08 14:15:23 crc kubenswrapper[4735]: E1008 14:15:23.895028 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.913671 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5fdc957c47-8cpm2" Oct 08 14:15:23 crc kubenswrapper[4735]: 
I1008 14:15:23.914104 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.954852 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-0f93cc8d-707c-4bae-944b-50c607295991\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0f93cc8d-707c-4bae-944b-50c607295991\") pod \"rabbitmq-cell1-server-0\" (UID: \"55e7bd11-2cfa-4bee-98b9-32db543ab744\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.955023 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/55e7bd11-2cfa-4bee-98b9-32db543ab744-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"55e7bd11-2cfa-4bee-98b9-32db543ab744\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.955044 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/55e7bd11-2cfa-4bee-98b9-32db543ab744-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"55e7bd11-2cfa-4bee-98b9-32db543ab744\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.955083 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/55e7bd11-2cfa-4bee-98b9-32db543ab744-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"55e7bd11-2cfa-4bee-98b9-32db543ab744\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.955099 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/55e7bd11-2cfa-4bee-98b9-32db543ab744-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"55e7bd11-2cfa-4bee-98b9-32db543ab744\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.955251 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrgv4\" (UniqueName: \"kubernetes.io/projected/55e7bd11-2cfa-4bee-98b9-32db543ab744-kube-api-access-lrgv4\") pod \"rabbitmq-cell1-server-0\" (UID: \"55e7bd11-2cfa-4bee-98b9-32db543ab744\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.955320 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/55e7bd11-2cfa-4bee-98b9-32db543ab744-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"55e7bd11-2cfa-4bee-98b9-32db543ab744\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.955372 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/55e7bd11-2cfa-4bee-98b9-32db543ab744-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"55e7bd11-2cfa-4bee-98b9-32db543ab744\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.955424 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/55e7bd11-2cfa-4bee-98b9-32db543ab744-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"55e7bd11-2cfa-4bee-98b9-32db543ab744\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 14:15:23 crc kubenswrapper[4735]: I1008 14:15:23.979992 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67d9f7fb89-scc2x"] Oct 08 14:15:23 
crc kubenswrapper[4735]: I1008 14:15:23.980254 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-67d9f7fb89-scc2x" podUID="1d10fc66-a842-42a1-9f2b-7d28359712f0" containerName="dnsmasq-dns" containerID="cri-o://ee92933db4b3f9609d1a5b67a532ff0380bb7a570e36c7df717772d002e05f66" gracePeriod=10 Oct 08 14:15:24 crc kubenswrapper[4735]: I1008 14:15:24.056903 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrgv4\" (UniqueName: \"kubernetes.io/projected/55e7bd11-2cfa-4bee-98b9-32db543ab744-kube-api-access-lrgv4\") pod \"rabbitmq-cell1-server-0\" (UID: \"55e7bd11-2cfa-4bee-98b9-32db543ab744\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 14:15:24 crc kubenswrapper[4735]: I1008 14:15:24.057715 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/55e7bd11-2cfa-4bee-98b9-32db543ab744-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"55e7bd11-2cfa-4bee-98b9-32db543ab744\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 14:15:24 crc kubenswrapper[4735]: I1008 14:15:24.057767 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/55e7bd11-2cfa-4bee-98b9-32db543ab744-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"55e7bd11-2cfa-4bee-98b9-32db543ab744\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 14:15:24 crc kubenswrapper[4735]: I1008 14:15:24.057805 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/55e7bd11-2cfa-4bee-98b9-32db543ab744-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"55e7bd11-2cfa-4bee-98b9-32db543ab744\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 14:15:24 crc kubenswrapper[4735]: I1008 14:15:24.057850 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-0f93cc8d-707c-4bae-944b-50c607295991\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0f93cc8d-707c-4bae-944b-50c607295991\") pod \"rabbitmq-cell1-server-0\" (UID: \"55e7bd11-2cfa-4bee-98b9-32db543ab744\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 14:15:24 crc kubenswrapper[4735]: I1008 14:15:24.057909 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/55e7bd11-2cfa-4bee-98b9-32db543ab744-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"55e7bd11-2cfa-4bee-98b9-32db543ab744\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 14:15:24 crc kubenswrapper[4735]: I1008 14:15:24.058091 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/55e7bd11-2cfa-4bee-98b9-32db543ab744-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"55e7bd11-2cfa-4bee-98b9-32db543ab744\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 14:15:24 crc kubenswrapper[4735]: I1008 14:15:24.058205 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/55e7bd11-2cfa-4bee-98b9-32db543ab744-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"55e7bd11-2cfa-4bee-98b9-32db543ab744\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 14:15:24 crc kubenswrapper[4735]: I1008 14:15:24.058286 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/55e7bd11-2cfa-4bee-98b9-32db543ab744-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"55e7bd11-2cfa-4bee-98b9-32db543ab744\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 14:15:24 crc kubenswrapper[4735]: I1008 14:15:24.058357 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/55e7bd11-2cfa-4bee-98b9-32db543ab744-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"55e7bd11-2cfa-4bee-98b9-32db543ab744\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 14:15:24 crc kubenswrapper[4735]: I1008 14:15:24.059562 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/55e7bd11-2cfa-4bee-98b9-32db543ab744-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"55e7bd11-2cfa-4bee-98b9-32db543ab744\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 14:15:24 crc kubenswrapper[4735]: I1008 14:15:24.059819 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/55e7bd11-2cfa-4bee-98b9-32db543ab744-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"55e7bd11-2cfa-4bee-98b9-32db543ab744\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 14:15:24 crc kubenswrapper[4735]: I1008 14:15:24.061149 4735 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 08 14:15:24 crc kubenswrapper[4735]: I1008 14:15:24.061171 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-0f93cc8d-707c-4bae-944b-50c607295991\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0f93cc8d-707c-4bae-944b-50c607295991\") pod \"rabbitmq-cell1-server-0\" (UID: \"55e7bd11-2cfa-4bee-98b9-32db543ab744\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d521554e4737493212cb4c3d77bb11e31c05cf0de2d3b8b216d60249a9c8b916/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Oct 08 14:15:24 crc kubenswrapper[4735]: I1008 14:15:24.061248 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/55e7bd11-2cfa-4bee-98b9-32db543ab744-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"55e7bd11-2cfa-4bee-98b9-32db543ab744\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 14:15:24 crc kubenswrapper[4735]: I1008 14:15:24.190470 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 08 14:15:24 crc kubenswrapper[4735]: I1008 14:15:24.273566 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/55e7bd11-2cfa-4bee-98b9-32db543ab744-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"55e7bd11-2cfa-4bee-98b9-32db543ab744\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 14:15:24 crc kubenswrapper[4735]: I1008 14:15:24.274044 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrgv4\" (UniqueName: \"kubernetes.io/projected/55e7bd11-2cfa-4bee-98b9-32db543ab744-kube-api-access-lrgv4\") pod \"rabbitmq-cell1-server-0\" (UID: \"55e7bd11-2cfa-4bee-98b9-32db543ab744\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 14:15:24 crc kubenswrapper[4735]: I1008 14:15:24.274698 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/55e7bd11-2cfa-4bee-98b9-32db543ab744-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"55e7bd11-2cfa-4bee-98b9-32db543ab744\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 14:15:24 crc kubenswrapper[4735]: I1008 14:15:24.274828 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/55e7bd11-2cfa-4bee-98b9-32db543ab744-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"55e7bd11-2cfa-4bee-98b9-32db543ab744\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 14:15:24 crc kubenswrapper[4735]: I1008 14:15:24.297010 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-0f93cc8d-707c-4bae-944b-50c607295991\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0f93cc8d-707c-4bae-944b-50c607295991\") pod \"rabbitmq-cell1-server-0\" (UID: \"55e7bd11-2cfa-4bee-98b9-32db543ab744\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 14:15:24 crc kubenswrapper[4735]: I1008 14:15:24.462062 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 08 14:15:24 crc kubenswrapper[4735]: I1008 14:15:24.488174 4735 generic.go:334] "Generic (PLEG): container finished" podID="1d10fc66-a842-42a1-9f2b-7d28359712f0" containerID="ee92933db4b3f9609d1a5b67a532ff0380bb7a570e36c7df717772d002e05f66" exitCode=0 Oct 08 14:15:24 crc kubenswrapper[4735]: I1008 14:15:24.488247 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67d9f7fb89-scc2x" event={"ID":"1d10fc66-a842-42a1-9f2b-7d28359712f0","Type":"ContainerDied","Data":"ee92933db4b3f9609d1a5b67a532ff0380bb7a570e36c7df717772d002e05f66"} Oct 08 14:15:24 crc kubenswrapper[4735]: I1008 14:15:24.490061 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"792b2295-d7e4-41e9-b0c4-c789d057d8f0","Type":"ContainerStarted","Data":"a81e07b0daf83f79a3a47bf90bb50e676d890a31b017dea0ff84fcb96022f3cc"} Oct 08 14:15:24 crc kubenswrapper[4735]: I1008 14:15:24.638912 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67d9f7fb89-scc2x" Oct 08 14:15:24 crc kubenswrapper[4735]: I1008 14:15:24.710284 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 08 14:15:24 crc kubenswrapper[4735]: I1008 14:15:24.768836 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctfpl\" (UniqueName: \"kubernetes.io/projected/1d10fc66-a842-42a1-9f2b-7d28359712f0-kube-api-access-ctfpl\") pod \"1d10fc66-a842-42a1-9f2b-7d28359712f0\" (UID: \"1d10fc66-a842-42a1-9f2b-7d28359712f0\") " Oct 08 14:15:24 crc kubenswrapper[4735]: I1008 14:15:24.768887 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d10fc66-a842-42a1-9f2b-7d28359712f0-config\") pod \"1d10fc66-a842-42a1-9f2b-7d28359712f0\" (UID: \"1d10fc66-a842-42a1-9f2b-7d28359712f0\") " Oct 08 14:15:24 crc kubenswrapper[4735]: I1008 14:15:24.768915 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1d10fc66-a842-42a1-9f2b-7d28359712f0-dns-svc\") pod \"1d10fc66-a842-42a1-9f2b-7d28359712f0\" (UID: \"1d10fc66-a842-42a1-9f2b-7d28359712f0\") " Oct 08 14:15:24 crc kubenswrapper[4735]: I1008 14:15:24.777295 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d10fc66-a842-42a1-9f2b-7d28359712f0-kube-api-access-ctfpl" (OuterVolumeSpecName: "kube-api-access-ctfpl") pod "1d10fc66-a842-42a1-9f2b-7d28359712f0" (UID: "1d10fc66-a842-42a1-9f2b-7d28359712f0"). InnerVolumeSpecName "kube-api-access-ctfpl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:15:24 crc kubenswrapper[4735]: W1008 14:15:24.779714 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55e7bd11_2cfa_4bee_98b9_32db543ab744.slice/crio-219096b8af9e16187bb08a7ef37a630acd84343002d2c79edcda670dfdcea4e7 WatchSource:0}: Error finding container 219096b8af9e16187bb08a7ef37a630acd84343002d2c79edcda670dfdcea4e7: Status 404 returned error can't find the container with id 219096b8af9e16187bb08a7ef37a630acd84343002d2c79edcda670dfdcea4e7 Oct 08 14:15:24 crc kubenswrapper[4735]: I1008 14:15:24.806240 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d10fc66-a842-42a1-9f2b-7d28359712f0-config" (OuterVolumeSpecName: "config") pod "1d10fc66-a842-42a1-9f2b-7d28359712f0" (UID: "1d10fc66-a842-42a1-9f2b-7d28359712f0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:15:24 crc kubenswrapper[4735]: I1008 14:15:24.825945 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d10fc66-a842-42a1-9f2b-7d28359712f0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1d10fc66-a842-42a1-9f2b-7d28359712f0" (UID: "1d10fc66-a842-42a1-9f2b-7d28359712f0"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:15:24 crc kubenswrapper[4735]: I1008 14:15:24.871245 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctfpl\" (UniqueName: \"kubernetes.io/projected/1d10fc66-a842-42a1-9f2b-7d28359712f0-kube-api-access-ctfpl\") on node \"crc\" DevicePath \"\"" Oct 08 14:15:24 crc kubenswrapper[4735]: I1008 14:15:24.871301 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d10fc66-a842-42a1-9f2b-7d28359712f0-config\") on node \"crc\" DevicePath \"\"" Oct 08 14:15:24 crc kubenswrapper[4735]: I1008 14:15:24.871320 4735 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1d10fc66-a842-42a1-9f2b-7d28359712f0-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 14:15:24 crc kubenswrapper[4735]: I1008 14:15:24.912911 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09b05bff-5dbc-4129-9e21-92b234846931" path="/var/lib/kubelet/pods/09b05bff-5dbc-4129-9e21-92b234846931/volumes" Oct 08 14:15:24 crc kubenswrapper[4735]: I1008 14:15:24.914175 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d3c96e0-cba0-4d26-90a6-3c73bca0994d" path="/var/lib/kubelet/pods/5d3c96e0-cba0-4d26-90a6-3c73bca0994d/volumes" Oct 08 14:15:25 crc kubenswrapper[4735]: I1008 14:15:25.505789 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"55e7bd11-2cfa-4bee-98b9-32db543ab744","Type":"ContainerStarted","Data":"219096b8af9e16187bb08a7ef37a630acd84343002d2c79edcda670dfdcea4e7"} Oct 08 14:15:25 crc kubenswrapper[4735]: I1008 14:15:25.508400 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67d9f7fb89-scc2x" event={"ID":"1d10fc66-a842-42a1-9f2b-7d28359712f0","Type":"ContainerDied","Data":"0151e661bbf8829c255c860d70087ba5f728ea80caa1477aa12965f183a1cab2"} Oct 08 14:15:25 crc 
kubenswrapper[4735]: I1008 14:15:25.508463 4735 scope.go:117] "RemoveContainer" containerID="ee92933db4b3f9609d1a5b67a532ff0380bb7a570e36c7df717772d002e05f66" Oct 08 14:15:25 crc kubenswrapper[4735]: I1008 14:15:25.508766 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67d9f7fb89-scc2x" Oct 08 14:15:25 crc kubenswrapper[4735]: I1008 14:15:25.538615 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67d9f7fb89-scc2x"] Oct 08 14:15:25 crc kubenswrapper[4735]: I1008 14:15:25.540983 4735 scope.go:117] "RemoveContainer" containerID="09dd3e6f05c2c59267785489df7839769dc6aaa42f7e3d5a3f7f50e39c347889" Oct 08 14:15:25 crc kubenswrapper[4735]: I1008 14:15:25.547130 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67d9f7fb89-scc2x"] Oct 08 14:15:26 crc kubenswrapper[4735]: I1008 14:15:26.517290 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"55e7bd11-2cfa-4bee-98b9-32db543ab744","Type":"ContainerStarted","Data":"558059e85c5a79854dfb01b858bc281a9ee32acf4301ef4a500ab628b2610c25"} Oct 08 14:15:26 crc kubenswrapper[4735]: I1008 14:15:26.521213 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"792b2295-d7e4-41e9-b0c4-c789d057d8f0","Type":"ContainerStarted","Data":"7f85d602660d6292bbffb1b8757b922a34d626c03fcce5efafcb0586e66ad5e5"} Oct 08 14:15:26 crc kubenswrapper[4735]: I1008 14:15:26.903900 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d10fc66-a842-42a1-9f2b-7d28359712f0" path="/var/lib/kubelet/pods/1d10fc66-a842-42a1-9f2b-7d28359712f0/volumes" Oct 08 14:15:35 crc kubenswrapper[4735]: I1008 14:15:35.894804 4735 scope.go:117] "RemoveContainer" containerID="7ff2ebd06cea5cf44fc587b3ec8c967642921329412c26c223914219d4254eaf" Oct 08 14:15:35 crc kubenswrapper[4735]: E1008 14:15:35.896209 4735 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:15:46 crc kubenswrapper[4735]: I1008 14:15:46.894924 4735 scope.go:117] "RemoveContainer" containerID="7ff2ebd06cea5cf44fc587b3ec8c967642921329412c26c223914219d4254eaf" Oct 08 14:15:46 crc kubenswrapper[4735]: E1008 14:15:46.896575 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:15:58 crc kubenswrapper[4735]: I1008 14:15:58.779577 4735 generic.go:334] "Generic (PLEG): container finished" podID="792b2295-d7e4-41e9-b0c4-c789d057d8f0" containerID="7f85d602660d6292bbffb1b8757b922a34d626c03fcce5efafcb0586e66ad5e5" exitCode=0 Oct 08 14:15:58 crc kubenswrapper[4735]: I1008 14:15:58.779679 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"792b2295-d7e4-41e9-b0c4-c789d057d8f0","Type":"ContainerDied","Data":"7f85d602660d6292bbffb1b8757b922a34d626c03fcce5efafcb0586e66ad5e5"} Oct 08 14:15:59 crc kubenswrapper[4735]: I1008 14:15:59.792871 4735 generic.go:334] "Generic (PLEG): container finished" podID="55e7bd11-2cfa-4bee-98b9-32db543ab744" containerID="558059e85c5a79854dfb01b858bc281a9ee32acf4301ef4a500ab628b2610c25" exitCode=0 Oct 08 14:15:59 crc kubenswrapper[4735]: I1008 14:15:59.792963 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/rabbitmq-cell1-server-0" event={"ID":"55e7bd11-2cfa-4bee-98b9-32db543ab744","Type":"ContainerDied","Data":"558059e85c5a79854dfb01b858bc281a9ee32acf4301ef4a500ab628b2610c25"} Oct 08 14:15:59 crc kubenswrapper[4735]: I1008 14:15:59.797334 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"792b2295-d7e4-41e9-b0c4-c789d057d8f0","Type":"ContainerStarted","Data":"e48d6c8f59d52bd8694bea81a23e36e4f5c92d1a291e2ba3c738d0712e803b31"} Oct 08 14:15:59 crc kubenswrapper[4735]: I1008 14:15:59.797636 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 08 14:15:59 crc kubenswrapper[4735]: I1008 14:15:59.850902 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.850870172 podStartE2EDuration="36.850870172s" podCreationTimestamp="2025-10-08 14:15:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:15:59.845550136 +0000 UTC m=+4809.653854680" watchObservedRunningTime="2025-10-08 14:15:59.850870172 +0000 UTC m=+4809.659174716" Oct 08 14:16:00 crc kubenswrapper[4735]: I1008 14:16:00.807632 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"55e7bd11-2cfa-4bee-98b9-32db543ab744","Type":"ContainerStarted","Data":"4fe762f0d67dbaf25e9fd6dd19e3f700ed5eb901dea53186a71c0ca9ce213755"} Oct 08 14:16:00 crc kubenswrapper[4735]: I1008 14:16:00.808115 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 08 14:16:00 crc kubenswrapper[4735]: I1008 14:16:00.833259 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.833237231 podStartE2EDuration="37.833237231s" podCreationTimestamp="2025-10-08 14:15:23 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:16:00.828698565 +0000 UTC m=+4810.637003179" watchObservedRunningTime="2025-10-08 14:16:00.833237231 +0000 UTC m=+4810.641541805" Oct 08 14:16:00 crc kubenswrapper[4735]: I1008 14:16:00.903438 4735 scope.go:117] "RemoveContainer" containerID="7ff2ebd06cea5cf44fc587b3ec8c967642921329412c26c223914219d4254eaf" Oct 08 14:16:00 crc kubenswrapper[4735]: E1008 14:16:00.904217 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:16:13 crc kubenswrapper[4735]: I1008 14:16:13.918662 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 08 14:16:14 crc kubenswrapper[4735]: I1008 14:16:14.466626 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 08 14:16:15 crc kubenswrapper[4735]: I1008 14:16:15.895183 4735 scope.go:117] "RemoveContainer" containerID="7ff2ebd06cea5cf44fc587b3ec8c967642921329412c26c223914219d4254eaf" Oct 08 14:16:15 crc kubenswrapper[4735]: E1008 14:16:15.895584 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:16:20 crc 
kubenswrapper[4735]: I1008 14:16:20.628269 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-1-default"] Oct 08 14:16:20 crc kubenswrapper[4735]: E1008 14:16:20.630283 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d10fc66-a842-42a1-9f2b-7d28359712f0" containerName="init" Oct 08 14:16:20 crc kubenswrapper[4735]: I1008 14:16:20.630407 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d10fc66-a842-42a1-9f2b-7d28359712f0" containerName="init" Oct 08 14:16:20 crc kubenswrapper[4735]: E1008 14:16:20.630536 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d10fc66-a842-42a1-9f2b-7d28359712f0" containerName="dnsmasq-dns" Oct 08 14:16:20 crc kubenswrapper[4735]: I1008 14:16:20.630626 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d10fc66-a842-42a1-9f2b-7d28359712f0" containerName="dnsmasq-dns" Oct 08 14:16:20 crc kubenswrapper[4735]: I1008 14:16:20.630908 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d10fc66-a842-42a1-9f2b-7d28359712f0" containerName="dnsmasq-dns" Oct 08 14:16:20 crc kubenswrapper[4735]: I1008 14:16:20.631631 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1-default" Oct 08 14:16:20 crc kubenswrapper[4735]: I1008 14:16:20.643099 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1-default"] Oct 08 14:16:20 crc kubenswrapper[4735]: I1008 14:16:20.682254 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-ss8tp" Oct 08 14:16:20 crc kubenswrapper[4735]: I1008 14:16:20.805628 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsk5d\" (UniqueName: \"kubernetes.io/projected/106c390b-13c5-4b0d-ab0a-3fc4a2dbab09-kube-api-access-vsk5d\") pod \"mariadb-client-1-default\" (UID: \"106c390b-13c5-4b0d-ab0a-3fc4a2dbab09\") " pod="openstack/mariadb-client-1-default" Oct 08 14:16:20 crc kubenswrapper[4735]: I1008 14:16:20.907326 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsk5d\" (UniqueName: \"kubernetes.io/projected/106c390b-13c5-4b0d-ab0a-3fc4a2dbab09-kube-api-access-vsk5d\") pod \"mariadb-client-1-default\" (UID: \"106c390b-13c5-4b0d-ab0a-3fc4a2dbab09\") " pod="openstack/mariadb-client-1-default" Oct 08 14:16:20 crc kubenswrapper[4735]: I1008 14:16:20.948917 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsk5d\" (UniqueName: \"kubernetes.io/projected/106c390b-13c5-4b0d-ab0a-3fc4a2dbab09-kube-api-access-vsk5d\") pod \"mariadb-client-1-default\" (UID: \"106c390b-13c5-4b0d-ab0a-3fc4a2dbab09\") " pod="openstack/mariadb-client-1-default" Oct 08 14:16:20 crc kubenswrapper[4735]: I1008 14:16:20.995664 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1-default" Oct 08 14:16:21 crc kubenswrapper[4735]: I1008 14:16:21.506909 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1-default"] Oct 08 14:16:21 crc kubenswrapper[4735]: I1008 14:16:21.963319 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1-default" event={"ID":"106c390b-13c5-4b0d-ab0a-3fc4a2dbab09","Type":"ContainerStarted","Data":"32c82067fa0ee7c43307e04f44f83a5582cbfb9853555aeb2c157eb9fbd81131"} Oct 08 14:16:22 crc kubenswrapper[4735]: I1008 14:16:22.973529 4735 generic.go:334] "Generic (PLEG): container finished" podID="106c390b-13c5-4b0d-ab0a-3fc4a2dbab09" containerID="fb22436455832383d7d920ae6b0b71631440c86d1c397d54ee84e8279abb1c4b" exitCode=0 Oct 08 14:16:22 crc kubenswrapper[4735]: I1008 14:16:22.973584 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1-default" event={"ID":"106c390b-13c5-4b0d-ab0a-3fc4a2dbab09","Type":"ContainerDied","Data":"fb22436455832383d7d920ae6b0b71631440c86d1c397d54ee84e8279abb1c4b"} Oct 08 14:16:24 crc kubenswrapper[4735]: I1008 14:16:24.410802 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1-default" Oct 08 14:16:24 crc kubenswrapper[4735]: I1008 14:16:24.450667 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-1-default_106c390b-13c5-4b0d-ab0a-3fc4a2dbab09/mariadb-client-1-default/0.log" Oct 08 14:16:24 crc kubenswrapper[4735]: I1008 14:16:24.487133 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-1-default"] Oct 08 14:16:24 crc kubenswrapper[4735]: I1008 14:16:24.494839 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-1-default"] Oct 08 14:16:24 crc kubenswrapper[4735]: I1008 14:16:24.569914 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vsk5d\" (UniqueName: \"kubernetes.io/projected/106c390b-13c5-4b0d-ab0a-3fc4a2dbab09-kube-api-access-vsk5d\") pod \"106c390b-13c5-4b0d-ab0a-3fc4a2dbab09\" (UID: \"106c390b-13c5-4b0d-ab0a-3fc4a2dbab09\") " Oct 08 14:16:24 crc kubenswrapper[4735]: I1008 14:16:24.575866 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/106c390b-13c5-4b0d-ab0a-3fc4a2dbab09-kube-api-access-vsk5d" (OuterVolumeSpecName: "kube-api-access-vsk5d") pod "106c390b-13c5-4b0d-ab0a-3fc4a2dbab09" (UID: "106c390b-13c5-4b0d-ab0a-3fc4a2dbab09"). InnerVolumeSpecName "kube-api-access-vsk5d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:16:24 crc kubenswrapper[4735]: I1008 14:16:24.671917 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vsk5d\" (UniqueName: \"kubernetes.io/projected/106c390b-13c5-4b0d-ab0a-3fc4a2dbab09-kube-api-access-vsk5d\") on node \"crc\" DevicePath \"\"" Oct 08 14:16:24 crc kubenswrapper[4735]: I1008 14:16:24.913087 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="106c390b-13c5-4b0d-ab0a-3fc4a2dbab09" path="/var/lib/kubelet/pods/106c390b-13c5-4b0d-ab0a-3fc4a2dbab09/volumes" Oct 08 14:16:24 crc kubenswrapper[4735]: I1008 14:16:24.945545 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-2-default"] Oct 08 14:16:24 crc kubenswrapper[4735]: E1008 14:16:24.946118 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="106c390b-13c5-4b0d-ab0a-3fc4a2dbab09" containerName="mariadb-client-1-default" Oct 08 14:16:24 crc kubenswrapper[4735]: I1008 14:16:24.946149 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="106c390b-13c5-4b0d-ab0a-3fc4a2dbab09" containerName="mariadb-client-1-default" Oct 08 14:16:24 crc kubenswrapper[4735]: I1008 14:16:24.946434 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="106c390b-13c5-4b0d-ab0a-3fc4a2dbab09" containerName="mariadb-client-1-default" Oct 08 14:16:24 crc kubenswrapper[4735]: I1008 14:16:24.947322 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2-default" Oct 08 14:16:24 crc kubenswrapper[4735]: I1008 14:16:24.971936 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2-default"] Oct 08 14:16:24 crc kubenswrapper[4735]: I1008 14:16:24.996485 4735 scope.go:117] "RemoveContainer" containerID="fb22436455832383d7d920ae6b0b71631440c86d1c397d54ee84e8279abb1c4b" Oct 08 14:16:24 crc kubenswrapper[4735]: I1008 14:16:24.996540 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default" Oct 08 14:16:25 crc kubenswrapper[4735]: I1008 14:16:25.078883 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lzvn\" (UniqueName: \"kubernetes.io/projected/636ba181-092c-4d44-9d01-0655fa45c65e-kube-api-access-6lzvn\") pod \"mariadb-client-2-default\" (UID: \"636ba181-092c-4d44-9d01-0655fa45c65e\") " pod="openstack/mariadb-client-2-default" Oct 08 14:16:25 crc kubenswrapper[4735]: I1008 14:16:25.181040 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lzvn\" (UniqueName: \"kubernetes.io/projected/636ba181-092c-4d44-9d01-0655fa45c65e-kube-api-access-6lzvn\") pod \"mariadb-client-2-default\" (UID: \"636ba181-092c-4d44-9d01-0655fa45c65e\") " pod="openstack/mariadb-client-2-default" Oct 08 14:16:25 crc kubenswrapper[4735]: I1008 14:16:25.196833 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lzvn\" (UniqueName: \"kubernetes.io/projected/636ba181-092c-4d44-9d01-0655fa45c65e-kube-api-access-6lzvn\") pod \"mariadb-client-2-default\" (UID: \"636ba181-092c-4d44-9d01-0655fa45c65e\") " pod="openstack/mariadb-client-2-default" Oct 08 14:16:25 crc kubenswrapper[4735]: I1008 14:16:25.277943 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2-default" Oct 08 14:16:25 crc kubenswrapper[4735]: I1008 14:16:25.887115 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2-default"] Oct 08 14:16:25 crc kubenswrapper[4735]: W1008 14:16:25.896993 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod636ba181_092c_4d44_9d01_0655fa45c65e.slice/crio-6c3abd8ff17f1917847e14891ba054302d7e475fe52d1b2ad7a659c157d8ecec WatchSource:0}: Error finding container 6c3abd8ff17f1917847e14891ba054302d7e475fe52d1b2ad7a659c157d8ecec: Status 404 returned error can't find the container with id 6c3abd8ff17f1917847e14891ba054302d7e475fe52d1b2ad7a659c157d8ecec Oct 08 14:16:26 crc kubenswrapper[4735]: I1008 14:16:26.005374 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"636ba181-092c-4d44-9d01-0655fa45c65e","Type":"ContainerStarted","Data":"6c3abd8ff17f1917847e14891ba054302d7e475fe52d1b2ad7a659c157d8ecec"} Oct 08 14:16:27 crc kubenswrapper[4735]: I1008 14:16:27.014876 4735 generic.go:334] "Generic (PLEG): container finished" podID="636ba181-092c-4d44-9d01-0655fa45c65e" containerID="eb13c670bd5cdc4d01ede19cf238b2c44f050d7a16dbabd7a6872e5f39433ac1" exitCode=0 Oct 08 14:16:27 crc kubenswrapper[4735]: I1008 14:16:27.014921 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"636ba181-092c-4d44-9d01-0655fa45c65e","Type":"ContainerDied","Data":"eb13c670bd5cdc4d01ede19cf238b2c44f050d7a16dbabd7a6872e5f39433ac1"} Oct 08 14:16:27 crc kubenswrapper[4735]: I1008 14:16:27.895588 4735 scope.go:117] "RemoveContainer" containerID="7ff2ebd06cea5cf44fc587b3ec8c967642921329412c26c223914219d4254eaf" Oct 08 14:16:27 crc kubenswrapper[4735]: E1008 14:16:27.896588 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:16:28 crc kubenswrapper[4735]: I1008 14:16:28.392237 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2-default" Oct 08 14:16:28 crc kubenswrapper[4735]: I1008 14:16:28.423647 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-2-default_636ba181-092c-4d44-9d01-0655fa45c65e/mariadb-client-2-default/0.log" Oct 08 14:16:28 crc kubenswrapper[4735]: I1008 14:16:28.450604 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-2-default"] Oct 08 14:16:28 crc kubenswrapper[4735]: I1008 14:16:28.454622 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-2-default"] Oct 08 14:16:28 crc kubenswrapper[4735]: I1008 14:16:28.533298 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lzvn\" (UniqueName: \"kubernetes.io/projected/636ba181-092c-4d44-9d01-0655fa45c65e-kube-api-access-6lzvn\") pod \"636ba181-092c-4d44-9d01-0655fa45c65e\" (UID: \"636ba181-092c-4d44-9d01-0655fa45c65e\") " Oct 08 14:16:28 crc kubenswrapper[4735]: I1008 14:16:28.542556 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/636ba181-092c-4d44-9d01-0655fa45c65e-kube-api-access-6lzvn" (OuterVolumeSpecName: "kube-api-access-6lzvn") pod "636ba181-092c-4d44-9d01-0655fa45c65e" (UID: "636ba181-092c-4d44-9d01-0655fa45c65e"). InnerVolumeSpecName "kube-api-access-6lzvn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:16:28 crc kubenswrapper[4735]: I1008 14:16:28.635836 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lzvn\" (UniqueName: \"kubernetes.io/projected/636ba181-092c-4d44-9d01-0655fa45c65e-kube-api-access-6lzvn\") on node \"crc\" DevicePath \"\"" Oct 08 14:16:28 crc kubenswrapper[4735]: I1008 14:16:28.917997 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="636ba181-092c-4d44-9d01-0655fa45c65e" path="/var/lib/kubelet/pods/636ba181-092c-4d44-9d01-0655fa45c65e/volumes" Oct 08 14:16:28 crc kubenswrapper[4735]: I1008 14:16:28.918965 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-1"] Oct 08 14:16:28 crc kubenswrapper[4735]: E1008 14:16:28.919357 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="636ba181-092c-4d44-9d01-0655fa45c65e" containerName="mariadb-client-2-default" Oct 08 14:16:28 crc kubenswrapper[4735]: I1008 14:16:28.919385 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="636ba181-092c-4d44-9d01-0655fa45c65e" containerName="mariadb-client-2-default" Oct 08 14:16:28 crc kubenswrapper[4735]: I1008 14:16:28.919690 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="636ba181-092c-4d44-9d01-0655fa45c65e" containerName="mariadb-client-2-default" Oct 08 14:16:28 crc kubenswrapper[4735]: I1008 14:16:28.920941 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1" Oct 08 14:16:28 crc kubenswrapper[4735]: I1008 14:16:28.928341 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1"] Oct 08 14:16:29 crc kubenswrapper[4735]: I1008 14:16:29.038344 4735 scope.go:117] "RemoveContainer" containerID="eb13c670bd5cdc4d01ede19cf238b2c44f050d7a16dbabd7a6872e5f39433ac1" Oct 08 14:16:29 crc kubenswrapper[4735]: I1008 14:16:29.038461 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2-default" Oct 08 14:16:29 crc kubenswrapper[4735]: I1008 14:16:29.043890 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvwfm\" (UniqueName: \"kubernetes.io/projected/5799dbd3-6778-4c26-8e89-9e27b2338363-kube-api-access-mvwfm\") pod \"mariadb-client-1\" (UID: \"5799dbd3-6778-4c26-8e89-9e27b2338363\") " pod="openstack/mariadb-client-1" Oct 08 14:16:29 crc kubenswrapper[4735]: I1008 14:16:29.146015 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvwfm\" (UniqueName: \"kubernetes.io/projected/5799dbd3-6778-4c26-8e89-9e27b2338363-kube-api-access-mvwfm\") pod \"mariadb-client-1\" (UID: \"5799dbd3-6778-4c26-8e89-9e27b2338363\") " pod="openstack/mariadb-client-1" Oct 08 14:16:29 crc kubenswrapper[4735]: I1008 14:16:29.169111 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvwfm\" (UniqueName: \"kubernetes.io/projected/5799dbd3-6778-4c26-8e89-9e27b2338363-kube-api-access-mvwfm\") pod \"mariadb-client-1\" (UID: \"5799dbd3-6778-4c26-8e89-9e27b2338363\") " pod="openstack/mariadb-client-1" Oct 08 14:16:29 crc kubenswrapper[4735]: I1008 14:16:29.255755 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1" Oct 08 14:16:29 crc kubenswrapper[4735]: I1008 14:16:29.565183 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1"] Oct 08 14:16:29 crc kubenswrapper[4735]: W1008 14:16:29.566366 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5799dbd3_6778_4c26_8e89_9e27b2338363.slice/crio-8ca6fe2102f3f43ea74fee7b2ffd4b573e1cb9526f7c243dc28ee1c70df6d1e6 WatchSource:0}: Error finding container 8ca6fe2102f3f43ea74fee7b2ffd4b573e1cb9526f7c243dc28ee1c70df6d1e6: Status 404 returned error can't find the container with id 8ca6fe2102f3f43ea74fee7b2ffd4b573e1cb9526f7c243dc28ee1c70df6d1e6 Oct 08 14:16:30 crc kubenswrapper[4735]: I1008 14:16:30.055597 4735 generic.go:334] "Generic (PLEG): container finished" podID="5799dbd3-6778-4c26-8e89-9e27b2338363" containerID="b8e3804282a936819b3b77811127b440386c29c03adb1a7a0869f338f775a374" exitCode=0 Oct 08 14:16:30 crc kubenswrapper[4735]: I1008 14:16:30.055699 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1" event={"ID":"5799dbd3-6778-4c26-8e89-9e27b2338363","Type":"ContainerDied","Data":"b8e3804282a936819b3b77811127b440386c29c03adb1a7a0869f338f775a374"} Oct 08 14:16:30 crc kubenswrapper[4735]: I1008 14:16:30.056633 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1" event={"ID":"5799dbd3-6778-4c26-8e89-9e27b2338363","Type":"ContainerStarted","Data":"8ca6fe2102f3f43ea74fee7b2ffd4b573e1cb9526f7c243dc28ee1c70df6d1e6"} Oct 08 14:16:31 crc kubenswrapper[4735]: I1008 14:16:31.452939 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1" Oct 08 14:16:31 crc kubenswrapper[4735]: I1008 14:16:31.472347 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-1_5799dbd3-6778-4c26-8e89-9e27b2338363/mariadb-client-1/0.log" Oct 08 14:16:31 crc kubenswrapper[4735]: I1008 14:16:31.504282 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-1"] Oct 08 14:16:31 crc kubenswrapper[4735]: I1008 14:16:31.511657 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-1"] Oct 08 14:16:31 crc kubenswrapper[4735]: I1008 14:16:31.581317 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvwfm\" (UniqueName: \"kubernetes.io/projected/5799dbd3-6778-4c26-8e89-9e27b2338363-kube-api-access-mvwfm\") pod \"5799dbd3-6778-4c26-8e89-9e27b2338363\" (UID: \"5799dbd3-6778-4c26-8e89-9e27b2338363\") " Oct 08 14:16:31 crc kubenswrapper[4735]: I1008 14:16:31.590293 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5799dbd3-6778-4c26-8e89-9e27b2338363-kube-api-access-mvwfm" (OuterVolumeSpecName: "kube-api-access-mvwfm") pod "5799dbd3-6778-4c26-8e89-9e27b2338363" (UID: "5799dbd3-6778-4c26-8e89-9e27b2338363"). InnerVolumeSpecName "kube-api-access-mvwfm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:16:31 crc kubenswrapper[4735]: I1008 14:16:31.683776 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvwfm\" (UniqueName: \"kubernetes.io/projected/5799dbd3-6778-4c26-8e89-9e27b2338363-kube-api-access-mvwfm\") on node \"crc\" DevicePath \"\"" Oct 08 14:16:31 crc kubenswrapper[4735]: I1008 14:16:31.993385 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-4-default"] Oct 08 14:16:31 crc kubenswrapper[4735]: E1008 14:16:31.993917 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5799dbd3-6778-4c26-8e89-9e27b2338363" containerName="mariadb-client-1" Oct 08 14:16:31 crc kubenswrapper[4735]: I1008 14:16:31.993944 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5799dbd3-6778-4c26-8e89-9e27b2338363" containerName="mariadb-client-1" Oct 08 14:16:31 crc kubenswrapper[4735]: I1008 14:16:31.994245 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="5799dbd3-6778-4c26-8e89-9e27b2338363" containerName="mariadb-client-1" Oct 08 14:16:31 crc kubenswrapper[4735]: I1008 14:16:31.995098 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-default" Oct 08 14:16:32 crc kubenswrapper[4735]: I1008 14:16:32.003267 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-4-default"] Oct 08 14:16:32 crc kubenswrapper[4735]: I1008 14:16:32.080127 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ca6fe2102f3f43ea74fee7b2ffd4b573e1cb9526f7c243dc28ee1c70df6d1e6" Oct 08 14:16:32 crc kubenswrapper[4735]: I1008 14:16:32.080193 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1" Oct 08 14:16:32 crc kubenswrapper[4735]: I1008 14:16:32.090024 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cqp8\" (UniqueName: \"kubernetes.io/projected/03d4389f-ba63-48b6-be72-6a93c172f641-kube-api-access-7cqp8\") pod \"mariadb-client-4-default\" (UID: \"03d4389f-ba63-48b6-be72-6a93c172f641\") " pod="openstack/mariadb-client-4-default" Oct 08 14:16:32 crc kubenswrapper[4735]: I1008 14:16:32.191694 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cqp8\" (UniqueName: \"kubernetes.io/projected/03d4389f-ba63-48b6-be72-6a93c172f641-kube-api-access-7cqp8\") pod \"mariadb-client-4-default\" (UID: \"03d4389f-ba63-48b6-be72-6a93c172f641\") " pod="openstack/mariadb-client-4-default" Oct 08 14:16:32 crc kubenswrapper[4735]: I1008 14:16:32.213820 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cqp8\" (UniqueName: \"kubernetes.io/projected/03d4389f-ba63-48b6-be72-6a93c172f641-kube-api-access-7cqp8\") pod \"mariadb-client-4-default\" (UID: \"03d4389f-ba63-48b6-be72-6a93c172f641\") " pod="openstack/mariadb-client-4-default" Oct 08 14:16:32 crc kubenswrapper[4735]: I1008 14:16:32.324119 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-4-default" Oct 08 14:16:32 crc kubenswrapper[4735]: I1008 14:16:32.844014 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-4-default"] Oct 08 14:16:32 crc kubenswrapper[4735]: W1008 14:16:32.851572 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod03d4389f_ba63_48b6_be72_6a93c172f641.slice/crio-d1eb595fbc5d14d7a17c3d8204deb2d0ea3fc42aa867a6d5be1b8b9032a592c3 WatchSource:0}: Error finding container d1eb595fbc5d14d7a17c3d8204deb2d0ea3fc42aa867a6d5be1b8b9032a592c3: Status 404 returned error can't find the container with id d1eb595fbc5d14d7a17c3d8204deb2d0ea3fc42aa867a6d5be1b8b9032a592c3 Oct 08 14:16:32 crc kubenswrapper[4735]: I1008 14:16:32.909586 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5799dbd3-6778-4c26-8e89-9e27b2338363" path="/var/lib/kubelet/pods/5799dbd3-6778-4c26-8e89-9e27b2338363/volumes" Oct 08 14:16:33 crc kubenswrapper[4735]: I1008 14:16:33.088352 4735 generic.go:334] "Generic (PLEG): container finished" podID="03d4389f-ba63-48b6-be72-6a93c172f641" containerID="9354eec4eb91a123a15e6f85760419e5c89f67d8577a225bd40c550d030a9bf8" exitCode=0 Oct 08 14:16:33 crc kubenswrapper[4735]: I1008 14:16:33.088443 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-default" event={"ID":"03d4389f-ba63-48b6-be72-6a93c172f641","Type":"ContainerDied","Data":"9354eec4eb91a123a15e6f85760419e5c89f67d8577a225bd40c550d030a9bf8"} Oct 08 14:16:33 crc kubenswrapper[4735]: I1008 14:16:33.088802 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-default" event={"ID":"03d4389f-ba63-48b6-be72-6a93c172f641","Type":"ContainerStarted","Data":"d1eb595fbc5d14d7a17c3d8204deb2d0ea3fc42aa867a6d5be1b8b9032a592c3"} Oct 08 14:16:34 crc kubenswrapper[4735]: I1008 14:16:34.461218 4735 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/mariadb-client-4-default" Oct 08 14:16:34 crc kubenswrapper[4735]: I1008 14:16:34.480065 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-4-default_03d4389f-ba63-48b6-be72-6a93c172f641/mariadb-client-4-default/0.log" Oct 08 14:16:34 crc kubenswrapper[4735]: I1008 14:16:34.514563 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-4-default"] Oct 08 14:16:34 crc kubenswrapper[4735]: I1008 14:16:34.518719 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-4-default"] Oct 08 14:16:34 crc kubenswrapper[4735]: I1008 14:16:34.531539 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7cqp8\" (UniqueName: \"kubernetes.io/projected/03d4389f-ba63-48b6-be72-6a93c172f641-kube-api-access-7cqp8\") pod \"03d4389f-ba63-48b6-be72-6a93c172f641\" (UID: \"03d4389f-ba63-48b6-be72-6a93c172f641\") " Oct 08 14:16:34 crc kubenswrapper[4735]: I1008 14:16:34.539245 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03d4389f-ba63-48b6-be72-6a93c172f641-kube-api-access-7cqp8" (OuterVolumeSpecName: "kube-api-access-7cqp8") pod "03d4389f-ba63-48b6-be72-6a93c172f641" (UID: "03d4389f-ba63-48b6-be72-6a93c172f641"). InnerVolumeSpecName "kube-api-access-7cqp8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:16:34 crc kubenswrapper[4735]: I1008 14:16:34.632777 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7cqp8\" (UniqueName: \"kubernetes.io/projected/03d4389f-ba63-48b6-be72-6a93c172f641-kube-api-access-7cqp8\") on node \"crc\" DevicePath \"\"" Oct 08 14:16:34 crc kubenswrapper[4735]: I1008 14:16:34.909957 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03d4389f-ba63-48b6-be72-6a93c172f641" path="/var/lib/kubelet/pods/03d4389f-ba63-48b6-be72-6a93c172f641/volumes" Oct 08 14:16:35 crc kubenswrapper[4735]: I1008 14:16:35.106104 4735 scope.go:117] "RemoveContainer" containerID="9354eec4eb91a123a15e6f85760419e5c89f67d8577a225bd40c550d030a9bf8" Oct 08 14:16:35 crc kubenswrapper[4735]: I1008 14:16:35.106445 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-default" Oct 08 14:16:38 crc kubenswrapper[4735]: I1008 14:16:38.805901 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-5-default"] Oct 08 14:16:38 crc kubenswrapper[4735]: E1008 14:16:38.807607 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03d4389f-ba63-48b6-be72-6a93c172f641" containerName="mariadb-client-4-default" Oct 08 14:16:38 crc kubenswrapper[4735]: I1008 14:16:38.807706 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="03d4389f-ba63-48b6-be72-6a93c172f641" containerName="mariadb-client-4-default" Oct 08 14:16:38 crc kubenswrapper[4735]: I1008 14:16:38.807938 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="03d4389f-ba63-48b6-be72-6a93c172f641" containerName="mariadb-client-4-default" Oct 08 14:16:38 crc kubenswrapper[4735]: I1008 14:16:38.810008 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Oct 08 14:16:38 crc kubenswrapper[4735]: I1008 14:16:38.812644 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-ss8tp" Oct 08 14:16:38 crc kubenswrapper[4735]: I1008 14:16:38.816234 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-5-default"] Oct 08 14:16:38 crc kubenswrapper[4735]: I1008 14:16:38.894244 4735 scope.go:117] "RemoveContainer" containerID="7ff2ebd06cea5cf44fc587b3ec8c967642921329412c26c223914219d4254eaf" Oct 08 14:16:38 crc kubenswrapper[4735]: E1008 14:16:38.894506 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:16:38 crc kubenswrapper[4735]: I1008 14:16:38.899895 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcttx\" (UniqueName: \"kubernetes.io/projected/5b03c5c5-a1b4-4f7b-95ca-a37d683a2344-kube-api-access-vcttx\") pod \"mariadb-client-5-default\" (UID: \"5b03c5c5-a1b4-4f7b-95ca-a37d683a2344\") " pod="openstack/mariadb-client-5-default" Oct 08 14:16:39 crc kubenswrapper[4735]: I1008 14:16:39.001776 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcttx\" (UniqueName: \"kubernetes.io/projected/5b03c5c5-a1b4-4f7b-95ca-a37d683a2344-kube-api-access-vcttx\") pod \"mariadb-client-5-default\" (UID: \"5b03c5c5-a1b4-4f7b-95ca-a37d683a2344\") " pod="openstack/mariadb-client-5-default" Oct 08 14:16:39 crc kubenswrapper[4735]: I1008 14:16:39.019928 4735 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-vcttx\" (UniqueName: \"kubernetes.io/projected/5b03c5c5-a1b4-4f7b-95ca-a37d683a2344-kube-api-access-vcttx\") pod \"mariadb-client-5-default\" (UID: \"5b03c5c5-a1b4-4f7b-95ca-a37d683a2344\") " pod="openstack/mariadb-client-5-default" Oct 08 14:16:39 crc kubenswrapper[4735]: I1008 14:16:39.132604 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-5-default" Oct 08 14:16:39 crc kubenswrapper[4735]: I1008 14:16:39.604299 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-5-default"] Oct 08 14:16:40 crc kubenswrapper[4735]: I1008 14:16:40.154757 4735 generic.go:334] "Generic (PLEG): container finished" podID="5b03c5c5-a1b4-4f7b-95ca-a37d683a2344" containerID="eb4baaed5dd84fb5d77403d2b189778946b2017028be44318bd1b02c33c5d2e0" exitCode=0 Oct 08 14:16:40 crc kubenswrapper[4735]: I1008 14:16:40.154952 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-5-default" event={"ID":"5b03c5c5-a1b4-4f7b-95ca-a37d683a2344","Type":"ContainerDied","Data":"eb4baaed5dd84fb5d77403d2b189778946b2017028be44318bd1b02c33c5d2e0"} Oct 08 14:16:40 crc kubenswrapper[4735]: I1008 14:16:40.155307 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-5-default" event={"ID":"5b03c5c5-a1b4-4f7b-95ca-a37d683a2344","Type":"ContainerStarted","Data":"ebf7ced2fd5ac95a4aa1d755e7d0086b2bebc42791bfb400ea2e320f67d189a0"} Oct 08 14:16:41 crc kubenswrapper[4735]: I1008 14:16:41.640609 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Oct 08 14:16:41 crc kubenswrapper[4735]: I1008 14:16:41.664038 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-5-default_5b03c5c5-a1b4-4f7b-95ca-a37d683a2344/mariadb-client-5-default/0.log" Oct 08 14:16:41 crc kubenswrapper[4735]: I1008 14:16:41.693339 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-5-default"] Oct 08 14:16:41 crc kubenswrapper[4735]: I1008 14:16:41.701213 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-5-default"] Oct 08 14:16:41 crc kubenswrapper[4735]: I1008 14:16:41.816544 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-6-default"] Oct 08 14:16:41 crc kubenswrapper[4735]: E1008 14:16:41.817079 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b03c5c5-a1b4-4f7b-95ca-a37d683a2344" containerName="mariadb-client-5-default" Oct 08 14:16:41 crc kubenswrapper[4735]: I1008 14:16:41.817107 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b03c5c5-a1b4-4f7b-95ca-a37d683a2344" containerName="mariadb-client-5-default" Oct 08 14:16:41 crc kubenswrapper[4735]: I1008 14:16:41.817439 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b03c5c5-a1b4-4f7b-95ca-a37d683a2344" containerName="mariadb-client-5-default" Oct 08 14:16:41 crc kubenswrapper[4735]: I1008 14:16:41.818363 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-6-default" Oct 08 14:16:41 crc kubenswrapper[4735]: I1008 14:16:41.825438 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-6-default"] Oct 08 14:16:41 crc kubenswrapper[4735]: I1008 14:16:41.841410 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcttx\" (UniqueName: \"kubernetes.io/projected/5b03c5c5-a1b4-4f7b-95ca-a37d683a2344-kube-api-access-vcttx\") pod \"5b03c5c5-a1b4-4f7b-95ca-a37d683a2344\" (UID: \"5b03c5c5-a1b4-4f7b-95ca-a37d683a2344\") " Oct 08 14:16:41 crc kubenswrapper[4735]: I1008 14:16:41.842259 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9stff\" (UniqueName: \"kubernetes.io/projected/5fc83810-3b34-48b1-a738-4e33bc7c3ecf-kube-api-access-9stff\") pod \"mariadb-client-6-default\" (UID: \"5fc83810-3b34-48b1-a738-4e33bc7c3ecf\") " pod="openstack/mariadb-client-6-default" Oct 08 14:16:41 crc kubenswrapper[4735]: I1008 14:16:41.849715 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b03c5c5-a1b4-4f7b-95ca-a37d683a2344-kube-api-access-vcttx" (OuterVolumeSpecName: "kube-api-access-vcttx") pod "5b03c5c5-a1b4-4f7b-95ca-a37d683a2344" (UID: "5b03c5c5-a1b4-4f7b-95ca-a37d683a2344"). InnerVolumeSpecName "kube-api-access-vcttx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:16:41 crc kubenswrapper[4735]: I1008 14:16:41.943464 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9stff\" (UniqueName: \"kubernetes.io/projected/5fc83810-3b34-48b1-a738-4e33bc7c3ecf-kube-api-access-9stff\") pod \"mariadb-client-6-default\" (UID: \"5fc83810-3b34-48b1-a738-4e33bc7c3ecf\") " pod="openstack/mariadb-client-6-default" Oct 08 14:16:41 crc kubenswrapper[4735]: I1008 14:16:41.943588 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcttx\" (UniqueName: \"kubernetes.io/projected/5b03c5c5-a1b4-4f7b-95ca-a37d683a2344-kube-api-access-vcttx\") on node \"crc\" DevicePath \"\"" Oct 08 14:16:41 crc kubenswrapper[4735]: I1008 14:16:41.967900 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9stff\" (UniqueName: \"kubernetes.io/projected/5fc83810-3b34-48b1-a738-4e33bc7c3ecf-kube-api-access-9stff\") pod \"mariadb-client-6-default\" (UID: \"5fc83810-3b34-48b1-a738-4e33bc7c3ecf\") " pod="openstack/mariadb-client-6-default" Oct 08 14:16:42 crc kubenswrapper[4735]: I1008 14:16:42.142243 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-default" Oct 08 14:16:42 crc kubenswrapper[4735]: I1008 14:16:42.176038 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ebf7ced2fd5ac95a4aa1d755e7d0086b2bebc42791bfb400ea2e320f67d189a0" Oct 08 14:16:42 crc kubenswrapper[4735]: I1008 14:16:42.176097 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Oct 08 14:16:42 crc kubenswrapper[4735]: I1008 14:16:42.714642 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-6-default"] Oct 08 14:16:42 crc kubenswrapper[4735]: W1008 14:16:42.719810 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5fc83810_3b34_48b1_a738_4e33bc7c3ecf.slice/crio-24d2e7560d5e8c856af957c7386c208a05389bf57109f0b041fe7411f69a037c WatchSource:0}: Error finding container 24d2e7560d5e8c856af957c7386c208a05389bf57109f0b041fe7411f69a037c: Status 404 returned error can't find the container with id 24d2e7560d5e8c856af957c7386c208a05389bf57109f0b041fe7411f69a037c Oct 08 14:16:42 crc kubenswrapper[4735]: I1008 14:16:42.910035 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b03c5c5-a1b4-4f7b-95ca-a37d683a2344" path="/var/lib/kubelet/pods/5b03c5c5-a1b4-4f7b-95ca-a37d683a2344/volumes" Oct 08 14:16:43 crc kubenswrapper[4735]: I1008 14:16:43.184344 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"5fc83810-3b34-48b1-a738-4e33bc7c3ecf","Type":"ContainerStarted","Data":"bc71e6776f2fd9437e4b5b175132e76da479f28c56fedfd1da7ebdb3e5c4754d"} Oct 08 14:16:43 crc kubenswrapper[4735]: I1008 14:16:43.184660 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"5fc83810-3b34-48b1-a738-4e33bc7c3ecf","Type":"ContainerStarted","Data":"24d2e7560d5e8c856af957c7386c208a05389bf57109f0b041fe7411f69a037c"} Oct 08 14:16:43 crc kubenswrapper[4735]: I1008 14:16:43.208127 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client-6-default" podStartSLOduration=2.208106341 podStartE2EDuration="2.208106341s" podCreationTimestamp="2025-10-08 14:16:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:16:43.198628229 +0000 UTC m=+4853.006932783" watchObservedRunningTime="2025-10-08 14:16:43.208106341 +0000 UTC m=+4853.016410905" Oct 08 14:16:44 crc kubenswrapper[4735]: I1008 14:16:44.195548 4735 generic.go:334] "Generic (PLEG): container finished" podID="5fc83810-3b34-48b1-a738-4e33bc7c3ecf" containerID="bc71e6776f2fd9437e4b5b175132e76da479f28c56fedfd1da7ebdb3e5c4754d" exitCode=0 Oct 08 14:16:44 crc kubenswrapper[4735]: I1008 14:16:44.195583 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"5fc83810-3b34-48b1-a738-4e33bc7c3ecf","Type":"ContainerDied","Data":"bc71e6776f2fd9437e4b5b175132e76da479f28c56fedfd1da7ebdb3e5c4754d"} Oct 08 14:16:45 crc kubenswrapper[4735]: I1008 14:16:45.542334 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-default" Oct 08 14:16:45 crc kubenswrapper[4735]: I1008 14:16:45.579402 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-6-default"] Oct 08 14:16:45 crc kubenswrapper[4735]: I1008 14:16:45.579501 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-6-default"] Oct 08 14:16:45 crc kubenswrapper[4735]: I1008 14:16:45.600903 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9stff\" (UniqueName: \"kubernetes.io/projected/5fc83810-3b34-48b1-a738-4e33bc7c3ecf-kube-api-access-9stff\") pod \"5fc83810-3b34-48b1-a738-4e33bc7c3ecf\" (UID: \"5fc83810-3b34-48b1-a738-4e33bc7c3ecf\") " Oct 08 14:16:45 crc kubenswrapper[4735]: I1008 14:16:45.606004 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fc83810-3b34-48b1-a738-4e33bc7c3ecf-kube-api-access-9stff" (OuterVolumeSpecName: "kube-api-access-9stff") pod "5fc83810-3b34-48b1-a738-4e33bc7c3ecf" (UID: 
"5fc83810-3b34-48b1-a738-4e33bc7c3ecf"). InnerVolumeSpecName "kube-api-access-9stff". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:16:45 crc kubenswrapper[4735]: I1008 14:16:45.703511 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9stff\" (UniqueName: \"kubernetes.io/projected/5fc83810-3b34-48b1-a738-4e33bc7c3ecf-kube-api-access-9stff\") on node \"crc\" DevicePath \"\"" Oct 08 14:16:45 crc kubenswrapper[4735]: I1008 14:16:45.714550 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-7-default"] Oct 08 14:16:45 crc kubenswrapper[4735]: E1008 14:16:45.715333 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fc83810-3b34-48b1-a738-4e33bc7c3ecf" containerName="mariadb-client-6-default" Oct 08 14:16:45 crc kubenswrapper[4735]: I1008 14:16:45.715509 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fc83810-3b34-48b1-a738-4e33bc7c3ecf" containerName="mariadb-client-6-default" Oct 08 14:16:45 crc kubenswrapper[4735]: I1008 14:16:45.715938 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fc83810-3b34-48b1-a738-4e33bc7c3ecf" containerName="mariadb-client-6-default" Oct 08 14:16:45 crc kubenswrapper[4735]: I1008 14:16:45.716854 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-7-default" Oct 08 14:16:45 crc kubenswrapper[4735]: I1008 14:16:45.724544 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-7-default"] Oct 08 14:16:45 crc kubenswrapper[4735]: I1008 14:16:45.804743 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mjr9\" (UniqueName: \"kubernetes.io/projected/b99d7bbd-73e8-4362-9845-dc1443af6fb8-kube-api-access-8mjr9\") pod \"mariadb-client-7-default\" (UID: \"b99d7bbd-73e8-4362-9845-dc1443af6fb8\") " pod="openstack/mariadb-client-7-default" Oct 08 14:16:45 crc kubenswrapper[4735]: I1008 14:16:45.906816 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mjr9\" (UniqueName: \"kubernetes.io/projected/b99d7bbd-73e8-4362-9845-dc1443af6fb8-kube-api-access-8mjr9\") pod \"mariadb-client-7-default\" (UID: \"b99d7bbd-73e8-4362-9845-dc1443af6fb8\") " pod="openstack/mariadb-client-7-default" Oct 08 14:16:45 crc kubenswrapper[4735]: I1008 14:16:45.925358 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mjr9\" (UniqueName: \"kubernetes.io/projected/b99d7bbd-73e8-4362-9845-dc1443af6fb8-kube-api-access-8mjr9\") pod \"mariadb-client-7-default\" (UID: \"b99d7bbd-73e8-4362-9845-dc1443af6fb8\") " pod="openstack/mariadb-client-7-default" Oct 08 14:16:46 crc kubenswrapper[4735]: I1008 14:16:46.042251 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-7-default" Oct 08 14:16:46 crc kubenswrapper[4735]: I1008 14:16:46.215284 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24d2e7560d5e8c856af957c7386c208a05389bf57109f0b041fe7411f69a037c" Oct 08 14:16:46 crc kubenswrapper[4735]: I1008 14:16:46.215640 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-6-default" Oct 08 14:16:46 crc kubenswrapper[4735]: W1008 14:16:46.582466 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb99d7bbd_73e8_4362_9845_dc1443af6fb8.slice/crio-094cd600951ee9440ebcf3bfa2e8021dc91cac2691bc63867f740c4ceb2e217f WatchSource:0}: Error finding container 094cd600951ee9440ebcf3bfa2e8021dc91cac2691bc63867f740c4ceb2e217f: Status 404 returned error can't find the container with id 094cd600951ee9440ebcf3bfa2e8021dc91cac2691bc63867f740c4ceb2e217f Oct 08 14:16:46 crc kubenswrapper[4735]: I1008 14:16:46.584392 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-7-default"] Oct 08 14:16:46 crc kubenswrapper[4735]: I1008 14:16:46.906828 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fc83810-3b34-48b1-a738-4e33bc7c3ecf" path="/var/lib/kubelet/pods/5fc83810-3b34-48b1-a738-4e33bc7c3ecf/volumes" Oct 08 14:16:47 crc kubenswrapper[4735]: I1008 14:16:47.228028 4735 generic.go:334] "Generic (PLEG): container finished" podID="b99d7bbd-73e8-4362-9845-dc1443af6fb8" containerID="8595b70ef5513a6b416c2fbd296c80e0cc757406bb8d9cedb81a102c8c6faa0c" exitCode=0 Oct 08 14:16:47 crc kubenswrapper[4735]: I1008 14:16:47.228065 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-7-default" event={"ID":"b99d7bbd-73e8-4362-9845-dc1443af6fb8","Type":"ContainerDied","Data":"8595b70ef5513a6b416c2fbd296c80e0cc757406bb8d9cedb81a102c8c6faa0c"} Oct 08 14:16:47 crc kubenswrapper[4735]: I1008 14:16:47.228090 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-7-default" event={"ID":"b99d7bbd-73e8-4362-9845-dc1443af6fb8","Type":"ContainerStarted","Data":"094cd600951ee9440ebcf3bfa2e8021dc91cac2691bc63867f740c4ceb2e217f"} Oct 08 14:16:50 crc kubenswrapper[4735]: I1008 14:16:48.608826 4735 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/mariadb-client-7-default" Oct 08 14:16:50 crc kubenswrapper[4735]: I1008 14:16:48.626112 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-7-default_b99d7bbd-73e8-4362-9845-dc1443af6fb8/mariadb-client-7-default/0.log" Oct 08 14:16:50 crc kubenswrapper[4735]: I1008 14:16:48.651055 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-7-default"] Oct 08 14:16:50 crc kubenswrapper[4735]: I1008 14:16:48.655319 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-7-default"] Oct 08 14:16:50 crc kubenswrapper[4735]: I1008 14:16:48.752568 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mjr9\" (UniqueName: \"kubernetes.io/projected/b99d7bbd-73e8-4362-9845-dc1443af6fb8-kube-api-access-8mjr9\") pod \"b99d7bbd-73e8-4362-9845-dc1443af6fb8\" (UID: \"b99d7bbd-73e8-4362-9845-dc1443af6fb8\") " Oct 08 14:16:50 crc kubenswrapper[4735]: I1008 14:16:48.757359 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b99d7bbd-73e8-4362-9845-dc1443af6fb8-kube-api-access-8mjr9" (OuterVolumeSpecName: "kube-api-access-8mjr9") pod "b99d7bbd-73e8-4362-9845-dc1443af6fb8" (UID: "b99d7bbd-73e8-4362-9845-dc1443af6fb8"). InnerVolumeSpecName "kube-api-access-8mjr9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:16:50 crc kubenswrapper[4735]: I1008 14:16:48.804430 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-2"] Oct 08 14:16:50 crc kubenswrapper[4735]: E1008 14:16:48.804795 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b99d7bbd-73e8-4362-9845-dc1443af6fb8" containerName="mariadb-client-7-default" Oct 08 14:16:50 crc kubenswrapper[4735]: I1008 14:16:48.804808 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="b99d7bbd-73e8-4362-9845-dc1443af6fb8" containerName="mariadb-client-7-default" Oct 08 14:16:50 crc kubenswrapper[4735]: I1008 14:16:48.804963 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="b99d7bbd-73e8-4362-9845-dc1443af6fb8" containerName="mariadb-client-7-default" Oct 08 14:16:50 crc kubenswrapper[4735]: I1008 14:16:48.805436 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Oct 08 14:16:50 crc kubenswrapper[4735]: I1008 14:16:48.810291 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2"] Oct 08 14:16:50 crc kubenswrapper[4735]: I1008 14:16:48.854699 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mjr9\" (UniqueName: \"kubernetes.io/projected/b99d7bbd-73e8-4362-9845-dc1443af6fb8-kube-api-access-8mjr9\") on node \"crc\" DevicePath \"\"" Oct 08 14:16:50 crc kubenswrapper[4735]: I1008 14:16:48.906287 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b99d7bbd-73e8-4362-9845-dc1443af6fb8" path="/var/lib/kubelet/pods/b99d7bbd-73e8-4362-9845-dc1443af6fb8/volumes" Oct 08 14:16:50 crc kubenswrapper[4735]: I1008 14:16:48.956777 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9r4z\" (UniqueName: \"kubernetes.io/projected/abe8fe3a-2b22-4975-b377-cb1e6977c3bc-kube-api-access-k9r4z\") pod 
\"mariadb-client-2\" (UID: \"abe8fe3a-2b22-4975-b377-cb1e6977c3bc\") " pod="openstack/mariadb-client-2" Oct 08 14:16:50 crc kubenswrapper[4735]: I1008 14:16:49.058109 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9r4z\" (UniqueName: \"kubernetes.io/projected/abe8fe3a-2b22-4975-b377-cb1e6977c3bc-kube-api-access-k9r4z\") pod \"mariadb-client-2\" (UID: \"abe8fe3a-2b22-4975-b377-cb1e6977c3bc\") " pod="openstack/mariadb-client-2" Oct 08 14:16:50 crc kubenswrapper[4735]: I1008 14:16:49.074784 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9r4z\" (UniqueName: \"kubernetes.io/projected/abe8fe3a-2b22-4975-b377-cb1e6977c3bc-kube-api-access-k9r4z\") pod \"mariadb-client-2\" (UID: \"abe8fe3a-2b22-4975-b377-cb1e6977c3bc\") " pod="openstack/mariadb-client-2" Oct 08 14:16:50 crc kubenswrapper[4735]: I1008 14:16:49.130872 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Oct 08 14:16:50 crc kubenswrapper[4735]: I1008 14:16:49.247450 4735 scope.go:117] "RemoveContainer" containerID="8595b70ef5513a6b416c2fbd296c80e0cc757406bb8d9cedb81a102c8c6faa0c" Oct 08 14:16:50 crc kubenswrapper[4735]: I1008 14:16:49.247559 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-7-default" Oct 08 14:16:50 crc kubenswrapper[4735]: I1008 14:16:50.818154 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2"] Oct 08 14:16:50 crc kubenswrapper[4735]: W1008 14:16:50.823414 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podabe8fe3a_2b22_4975_b377_cb1e6977c3bc.slice/crio-4408c2aa69063ff40fdd3f16825f83e8119c2b93c4c87de09625b2e9a3b6e04c WatchSource:0}: Error finding container 4408c2aa69063ff40fdd3f16825f83e8119c2b93c4c87de09625b2e9a3b6e04c: Status 404 returned error can't find the container with id 4408c2aa69063ff40fdd3f16825f83e8119c2b93c4c87de09625b2e9a3b6e04c Oct 08 14:16:51 crc kubenswrapper[4735]: I1008 14:16:51.265860 4735 generic.go:334] "Generic (PLEG): container finished" podID="abe8fe3a-2b22-4975-b377-cb1e6977c3bc" containerID="ad6965c51ed06d14f640da0fbb44a13166c69a247dc98c1ffdc28de4d6d076ef" exitCode=0 Oct 08 14:16:51 crc kubenswrapper[4735]: I1008 14:16:51.266170 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2" event={"ID":"abe8fe3a-2b22-4975-b377-cb1e6977c3bc","Type":"ContainerDied","Data":"ad6965c51ed06d14f640da0fbb44a13166c69a247dc98c1ffdc28de4d6d076ef"} Oct 08 14:16:51 crc kubenswrapper[4735]: I1008 14:16:51.266200 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2" event={"ID":"abe8fe3a-2b22-4975-b377-cb1e6977c3bc","Type":"ContainerStarted","Data":"4408c2aa69063ff40fdd3f16825f83e8119c2b93c4c87de09625b2e9a3b6e04c"} Oct 08 14:16:52 crc kubenswrapper[4735]: I1008 14:16:52.654966 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2" Oct 08 14:16:52 crc kubenswrapper[4735]: I1008 14:16:52.675533 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-2_abe8fe3a-2b22-4975-b377-cb1e6977c3bc/mariadb-client-2/0.log" Oct 08 14:16:52 crc kubenswrapper[4735]: I1008 14:16:52.700044 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-2"] Oct 08 14:16:52 crc kubenswrapper[4735]: I1008 14:16:52.704903 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-2"] Oct 08 14:16:52 crc kubenswrapper[4735]: I1008 14:16:52.821133 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9r4z\" (UniqueName: \"kubernetes.io/projected/abe8fe3a-2b22-4975-b377-cb1e6977c3bc-kube-api-access-k9r4z\") pod \"abe8fe3a-2b22-4975-b377-cb1e6977c3bc\" (UID: \"abe8fe3a-2b22-4975-b377-cb1e6977c3bc\") " Oct 08 14:16:52 crc kubenswrapper[4735]: I1008 14:16:52.826439 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abe8fe3a-2b22-4975-b377-cb1e6977c3bc-kube-api-access-k9r4z" (OuterVolumeSpecName: "kube-api-access-k9r4z") pod "abe8fe3a-2b22-4975-b377-cb1e6977c3bc" (UID: "abe8fe3a-2b22-4975-b377-cb1e6977c3bc"). InnerVolumeSpecName "kube-api-access-k9r4z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:16:52 crc kubenswrapper[4735]: I1008 14:16:52.895820 4735 scope.go:117] "RemoveContainer" containerID="7ff2ebd06cea5cf44fc587b3ec8c967642921329412c26c223914219d4254eaf" Oct 08 14:16:52 crc kubenswrapper[4735]: E1008 14:16:52.896351 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:16:52 crc kubenswrapper[4735]: I1008 14:16:52.912326 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abe8fe3a-2b22-4975-b377-cb1e6977c3bc" path="/var/lib/kubelet/pods/abe8fe3a-2b22-4975-b377-cb1e6977c3bc/volumes" Oct 08 14:16:52 crc kubenswrapper[4735]: I1008 14:16:52.924336 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9r4z\" (UniqueName: \"kubernetes.io/projected/abe8fe3a-2b22-4975-b377-cb1e6977c3bc-kube-api-access-k9r4z\") on node \"crc\" DevicePath \"\"" Oct 08 14:16:53 crc kubenswrapper[4735]: I1008 14:16:53.282009 4735 scope.go:117] "RemoveContainer" containerID="ad6965c51ed06d14f640da0fbb44a13166c69a247dc98c1ffdc28de4d6d076ef" Oct 08 14:16:53 crc kubenswrapper[4735]: I1008 14:16:53.282195 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2" Oct 08 14:17:03 crc kubenswrapper[4735]: I1008 14:17:03.895659 4735 scope.go:117] "RemoveContainer" containerID="7ff2ebd06cea5cf44fc587b3ec8c967642921329412c26c223914219d4254eaf" Oct 08 14:17:04 crc kubenswrapper[4735]: I1008 14:17:04.371777 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" event={"ID":"2dfd0474-f64a-4d3b-9990-df182d079b10","Type":"ContainerStarted","Data":"cec388ee120153babb4ba1df9c16c317b59b98c70ffddd569ed770c55e343565"} Oct 08 14:17:07 crc kubenswrapper[4735]: I1008 14:17:07.776117 4735 scope.go:117] "RemoveContainer" containerID="c2c67fc7fbfe356c42708fd91c43a7a1206958e1b8a49de7303c7cf69c1732db" Oct 08 14:18:16 crc kubenswrapper[4735]: I1008 14:18:16.845020 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cpnht"] Oct 08 14:18:16 crc kubenswrapper[4735]: E1008 14:18:16.846080 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abe8fe3a-2b22-4975-b377-cb1e6977c3bc" containerName="mariadb-client-2" Oct 08 14:18:16 crc kubenswrapper[4735]: I1008 14:18:16.846104 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="abe8fe3a-2b22-4975-b377-cb1e6977c3bc" containerName="mariadb-client-2" Oct 08 14:18:16 crc kubenswrapper[4735]: I1008 14:18:16.846417 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="abe8fe3a-2b22-4975-b377-cb1e6977c3bc" containerName="mariadb-client-2" Oct 08 14:18:16 crc kubenswrapper[4735]: I1008 14:18:16.848886 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cpnht" Oct 08 14:18:16 crc kubenswrapper[4735]: I1008 14:18:16.860710 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cpnht"] Oct 08 14:18:16 crc kubenswrapper[4735]: I1008 14:18:16.920785 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/956ce2cc-9b1b-4a68-a6e5-8136f1d4c388-utilities\") pod \"redhat-marketplace-cpnht\" (UID: \"956ce2cc-9b1b-4a68-a6e5-8136f1d4c388\") " pod="openshift-marketplace/redhat-marketplace-cpnht" Oct 08 14:18:16 crc kubenswrapper[4735]: I1008 14:18:16.920847 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52kl8\" (UniqueName: \"kubernetes.io/projected/956ce2cc-9b1b-4a68-a6e5-8136f1d4c388-kube-api-access-52kl8\") pod \"redhat-marketplace-cpnht\" (UID: \"956ce2cc-9b1b-4a68-a6e5-8136f1d4c388\") " pod="openshift-marketplace/redhat-marketplace-cpnht" Oct 08 14:18:16 crc kubenswrapper[4735]: I1008 14:18:16.921071 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/956ce2cc-9b1b-4a68-a6e5-8136f1d4c388-catalog-content\") pod \"redhat-marketplace-cpnht\" (UID: \"956ce2cc-9b1b-4a68-a6e5-8136f1d4c388\") " pod="openshift-marketplace/redhat-marketplace-cpnht" Oct 08 14:18:17 crc kubenswrapper[4735]: I1008 14:18:17.022722 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/956ce2cc-9b1b-4a68-a6e5-8136f1d4c388-utilities\") pod \"redhat-marketplace-cpnht\" (UID: \"956ce2cc-9b1b-4a68-a6e5-8136f1d4c388\") " pod="openshift-marketplace/redhat-marketplace-cpnht" Oct 08 14:18:17 crc kubenswrapper[4735]: I1008 14:18:17.022788 4735 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-52kl8\" (UniqueName: \"kubernetes.io/projected/956ce2cc-9b1b-4a68-a6e5-8136f1d4c388-kube-api-access-52kl8\") pod \"redhat-marketplace-cpnht\" (UID: \"956ce2cc-9b1b-4a68-a6e5-8136f1d4c388\") " pod="openshift-marketplace/redhat-marketplace-cpnht" Oct 08 14:18:17 crc kubenswrapper[4735]: I1008 14:18:17.022863 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/956ce2cc-9b1b-4a68-a6e5-8136f1d4c388-catalog-content\") pod \"redhat-marketplace-cpnht\" (UID: \"956ce2cc-9b1b-4a68-a6e5-8136f1d4c388\") " pod="openshift-marketplace/redhat-marketplace-cpnht" Oct 08 14:18:17 crc kubenswrapper[4735]: I1008 14:18:17.023157 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/956ce2cc-9b1b-4a68-a6e5-8136f1d4c388-utilities\") pod \"redhat-marketplace-cpnht\" (UID: \"956ce2cc-9b1b-4a68-a6e5-8136f1d4c388\") " pod="openshift-marketplace/redhat-marketplace-cpnht" Oct 08 14:18:17 crc kubenswrapper[4735]: I1008 14:18:17.023207 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/956ce2cc-9b1b-4a68-a6e5-8136f1d4c388-catalog-content\") pod \"redhat-marketplace-cpnht\" (UID: \"956ce2cc-9b1b-4a68-a6e5-8136f1d4c388\") " pod="openshift-marketplace/redhat-marketplace-cpnht" Oct 08 14:18:17 crc kubenswrapper[4735]: I1008 14:18:17.040841 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52kl8\" (UniqueName: \"kubernetes.io/projected/956ce2cc-9b1b-4a68-a6e5-8136f1d4c388-kube-api-access-52kl8\") pod \"redhat-marketplace-cpnht\" (UID: \"956ce2cc-9b1b-4a68-a6e5-8136f1d4c388\") " pod="openshift-marketplace/redhat-marketplace-cpnht" Oct 08 14:18:17 crc kubenswrapper[4735]: I1008 14:18:17.172860 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cpnht" Oct 08 14:18:17 crc kubenswrapper[4735]: I1008 14:18:17.589131 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cpnht"] Oct 08 14:18:18 crc kubenswrapper[4735]: I1008 14:18:18.032671 4735 generic.go:334] "Generic (PLEG): container finished" podID="956ce2cc-9b1b-4a68-a6e5-8136f1d4c388" containerID="954748b6622ea68f60a81581d32bc387a78b1f83787b10893997edc92d4dcfb1" exitCode=0 Oct 08 14:18:18 crc kubenswrapper[4735]: I1008 14:18:18.032718 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cpnht" event={"ID":"956ce2cc-9b1b-4a68-a6e5-8136f1d4c388","Type":"ContainerDied","Data":"954748b6622ea68f60a81581d32bc387a78b1f83787b10893997edc92d4dcfb1"} Oct 08 14:18:18 crc kubenswrapper[4735]: I1008 14:18:18.032978 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cpnht" event={"ID":"956ce2cc-9b1b-4a68-a6e5-8136f1d4c388","Type":"ContainerStarted","Data":"bce946e86c2317541505cd13c3b642499997776a53c09d148818f9e05f750a5c"} Oct 08 14:18:19 crc kubenswrapper[4735]: I1008 14:18:19.042766 4735 generic.go:334] "Generic (PLEG): container finished" podID="956ce2cc-9b1b-4a68-a6e5-8136f1d4c388" containerID="5c4b6adaf1f7c25c55faf21d7ca363b6e358868623033b885e22c9085af6e084" exitCode=0 Oct 08 14:18:19 crc kubenswrapper[4735]: I1008 14:18:19.042828 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cpnht" event={"ID":"956ce2cc-9b1b-4a68-a6e5-8136f1d4c388","Type":"ContainerDied","Data":"5c4b6adaf1f7c25c55faf21d7ca363b6e358868623033b885e22c9085af6e084"} Oct 08 14:18:20 crc kubenswrapper[4735]: I1008 14:18:20.060275 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cpnht" 
event={"ID":"956ce2cc-9b1b-4a68-a6e5-8136f1d4c388","Type":"ContainerStarted","Data":"7d004b1c17c251684f58500da1bcc19cb7a1f04bb3df64624b55d9289bcb6416"} Oct 08 14:18:20 crc kubenswrapper[4735]: I1008 14:18:20.086215 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cpnht" podStartSLOduration=2.623244975 podStartE2EDuration="4.086195524s" podCreationTimestamp="2025-10-08 14:18:16 +0000 UTC" firstStartedPulling="2025-10-08 14:18:18.034221421 +0000 UTC m=+4947.842525965" lastFinishedPulling="2025-10-08 14:18:19.49717197 +0000 UTC m=+4949.305476514" observedRunningTime="2025-10-08 14:18:20.08213556 +0000 UTC m=+4949.890440104" watchObservedRunningTime="2025-10-08 14:18:20.086195524 +0000 UTC m=+4949.894500068" Oct 08 14:18:27 crc kubenswrapper[4735]: I1008 14:18:27.174050 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-cpnht" Oct 08 14:18:27 crc kubenswrapper[4735]: I1008 14:18:27.174688 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cpnht" Oct 08 14:18:27 crc kubenswrapper[4735]: I1008 14:18:27.242881 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cpnht" Oct 08 14:18:28 crc kubenswrapper[4735]: I1008 14:18:28.171399 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-cpnht" Oct 08 14:18:28 crc kubenswrapper[4735]: I1008 14:18:28.214579 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cpnht"] Oct 08 14:18:30 crc kubenswrapper[4735]: I1008 14:18:30.141741 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-cpnht" podUID="956ce2cc-9b1b-4a68-a6e5-8136f1d4c388" containerName="registry-server" 
containerID="cri-o://7d004b1c17c251684f58500da1bcc19cb7a1f04bb3df64624b55d9289bcb6416" gracePeriod=2 Oct 08 14:18:30 crc kubenswrapper[4735]: I1008 14:18:30.534446 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cpnht" Oct 08 14:18:30 crc kubenswrapper[4735]: I1008 14:18:30.632131 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/956ce2cc-9b1b-4a68-a6e5-8136f1d4c388-catalog-content\") pod \"956ce2cc-9b1b-4a68-a6e5-8136f1d4c388\" (UID: \"956ce2cc-9b1b-4a68-a6e5-8136f1d4c388\") " Oct 08 14:18:30 crc kubenswrapper[4735]: I1008 14:18:30.632237 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52kl8\" (UniqueName: \"kubernetes.io/projected/956ce2cc-9b1b-4a68-a6e5-8136f1d4c388-kube-api-access-52kl8\") pod \"956ce2cc-9b1b-4a68-a6e5-8136f1d4c388\" (UID: \"956ce2cc-9b1b-4a68-a6e5-8136f1d4c388\") " Oct 08 14:18:30 crc kubenswrapper[4735]: I1008 14:18:30.632348 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/956ce2cc-9b1b-4a68-a6e5-8136f1d4c388-utilities\") pod \"956ce2cc-9b1b-4a68-a6e5-8136f1d4c388\" (UID: \"956ce2cc-9b1b-4a68-a6e5-8136f1d4c388\") " Oct 08 14:18:30 crc kubenswrapper[4735]: I1008 14:18:30.633305 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/956ce2cc-9b1b-4a68-a6e5-8136f1d4c388-utilities" (OuterVolumeSpecName: "utilities") pod "956ce2cc-9b1b-4a68-a6e5-8136f1d4c388" (UID: "956ce2cc-9b1b-4a68-a6e5-8136f1d4c388"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 14:18:30 crc kubenswrapper[4735]: I1008 14:18:30.638994 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/956ce2cc-9b1b-4a68-a6e5-8136f1d4c388-kube-api-access-52kl8" (OuterVolumeSpecName: "kube-api-access-52kl8") pod "956ce2cc-9b1b-4a68-a6e5-8136f1d4c388" (UID: "956ce2cc-9b1b-4a68-a6e5-8136f1d4c388"). InnerVolumeSpecName "kube-api-access-52kl8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:18:30 crc kubenswrapper[4735]: I1008 14:18:30.646261 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/956ce2cc-9b1b-4a68-a6e5-8136f1d4c388-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "956ce2cc-9b1b-4a68-a6e5-8136f1d4c388" (UID: "956ce2cc-9b1b-4a68-a6e5-8136f1d4c388"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 14:18:30 crc kubenswrapper[4735]: I1008 14:18:30.734083 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/956ce2cc-9b1b-4a68-a6e5-8136f1d4c388-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 14:18:30 crc kubenswrapper[4735]: I1008 14:18:30.734148 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/956ce2cc-9b1b-4a68-a6e5-8136f1d4c388-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 14:18:30 crc kubenswrapper[4735]: I1008 14:18:30.734165 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52kl8\" (UniqueName: \"kubernetes.io/projected/956ce2cc-9b1b-4a68-a6e5-8136f1d4c388-kube-api-access-52kl8\") on node \"crc\" DevicePath \"\"" Oct 08 14:18:31 crc kubenswrapper[4735]: I1008 14:18:31.176047 4735 generic.go:334] "Generic (PLEG): container finished" podID="956ce2cc-9b1b-4a68-a6e5-8136f1d4c388" 
containerID="7d004b1c17c251684f58500da1bcc19cb7a1f04bb3df64624b55d9289bcb6416" exitCode=0 Oct 08 14:18:31 crc kubenswrapper[4735]: I1008 14:18:31.176092 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cpnht" event={"ID":"956ce2cc-9b1b-4a68-a6e5-8136f1d4c388","Type":"ContainerDied","Data":"7d004b1c17c251684f58500da1bcc19cb7a1f04bb3df64624b55d9289bcb6416"} Oct 08 14:18:31 crc kubenswrapper[4735]: I1008 14:18:31.176119 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cpnht" event={"ID":"956ce2cc-9b1b-4a68-a6e5-8136f1d4c388","Type":"ContainerDied","Data":"bce946e86c2317541505cd13c3b642499997776a53c09d148818f9e05f750a5c"} Oct 08 14:18:31 crc kubenswrapper[4735]: I1008 14:18:31.176140 4735 scope.go:117] "RemoveContainer" containerID="7d004b1c17c251684f58500da1bcc19cb7a1f04bb3df64624b55d9289bcb6416" Oct 08 14:18:31 crc kubenswrapper[4735]: I1008 14:18:31.176255 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cpnht" Oct 08 14:18:31 crc kubenswrapper[4735]: I1008 14:18:31.202862 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cpnht"] Oct 08 14:18:31 crc kubenswrapper[4735]: I1008 14:18:31.204518 4735 scope.go:117] "RemoveContainer" containerID="5c4b6adaf1f7c25c55faf21d7ca363b6e358868623033b885e22c9085af6e084" Oct 08 14:18:31 crc kubenswrapper[4735]: I1008 14:18:31.210646 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-cpnht"] Oct 08 14:18:31 crc kubenswrapper[4735]: I1008 14:18:31.223067 4735 scope.go:117] "RemoveContainer" containerID="954748b6622ea68f60a81581d32bc387a78b1f83787b10893997edc92d4dcfb1" Oct 08 14:18:31 crc kubenswrapper[4735]: I1008 14:18:31.265669 4735 scope.go:117] "RemoveContainer" containerID="7d004b1c17c251684f58500da1bcc19cb7a1f04bb3df64624b55d9289bcb6416" Oct 08 14:18:31 crc kubenswrapper[4735]: E1008 14:18:31.266280 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d004b1c17c251684f58500da1bcc19cb7a1f04bb3df64624b55d9289bcb6416\": container with ID starting with 7d004b1c17c251684f58500da1bcc19cb7a1f04bb3df64624b55d9289bcb6416 not found: ID does not exist" containerID="7d004b1c17c251684f58500da1bcc19cb7a1f04bb3df64624b55d9289bcb6416" Oct 08 14:18:31 crc kubenswrapper[4735]: I1008 14:18:31.266329 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d004b1c17c251684f58500da1bcc19cb7a1f04bb3df64624b55d9289bcb6416"} err="failed to get container status \"7d004b1c17c251684f58500da1bcc19cb7a1f04bb3df64624b55d9289bcb6416\": rpc error: code = NotFound desc = could not find container \"7d004b1c17c251684f58500da1bcc19cb7a1f04bb3df64624b55d9289bcb6416\": container with ID starting with 7d004b1c17c251684f58500da1bcc19cb7a1f04bb3df64624b55d9289bcb6416 not found: 
ID does not exist" Oct 08 14:18:31 crc kubenswrapper[4735]: I1008 14:18:31.266357 4735 scope.go:117] "RemoveContainer" containerID="5c4b6adaf1f7c25c55faf21d7ca363b6e358868623033b885e22c9085af6e084" Oct 08 14:18:31 crc kubenswrapper[4735]: E1008 14:18:31.266812 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c4b6adaf1f7c25c55faf21d7ca363b6e358868623033b885e22c9085af6e084\": container with ID starting with 5c4b6adaf1f7c25c55faf21d7ca363b6e358868623033b885e22c9085af6e084 not found: ID does not exist" containerID="5c4b6adaf1f7c25c55faf21d7ca363b6e358868623033b885e22c9085af6e084" Oct 08 14:18:31 crc kubenswrapper[4735]: I1008 14:18:31.266843 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c4b6adaf1f7c25c55faf21d7ca363b6e358868623033b885e22c9085af6e084"} err="failed to get container status \"5c4b6adaf1f7c25c55faf21d7ca363b6e358868623033b885e22c9085af6e084\": rpc error: code = NotFound desc = could not find container \"5c4b6adaf1f7c25c55faf21d7ca363b6e358868623033b885e22c9085af6e084\": container with ID starting with 5c4b6adaf1f7c25c55faf21d7ca363b6e358868623033b885e22c9085af6e084 not found: ID does not exist" Oct 08 14:18:31 crc kubenswrapper[4735]: I1008 14:18:31.266864 4735 scope.go:117] "RemoveContainer" containerID="954748b6622ea68f60a81581d32bc387a78b1f83787b10893997edc92d4dcfb1" Oct 08 14:18:31 crc kubenswrapper[4735]: E1008 14:18:31.267114 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"954748b6622ea68f60a81581d32bc387a78b1f83787b10893997edc92d4dcfb1\": container with ID starting with 954748b6622ea68f60a81581d32bc387a78b1f83787b10893997edc92d4dcfb1 not found: ID does not exist" containerID="954748b6622ea68f60a81581d32bc387a78b1f83787b10893997edc92d4dcfb1" Oct 08 14:18:31 crc kubenswrapper[4735]: I1008 14:18:31.267134 4735 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"954748b6622ea68f60a81581d32bc387a78b1f83787b10893997edc92d4dcfb1"} err="failed to get container status \"954748b6622ea68f60a81581d32bc387a78b1f83787b10893997edc92d4dcfb1\": rpc error: code = NotFound desc = could not find container \"954748b6622ea68f60a81581d32bc387a78b1f83787b10893997edc92d4dcfb1\": container with ID starting with 954748b6622ea68f60a81581d32bc387a78b1f83787b10893997edc92d4dcfb1 not found: ID does not exist" Oct 08 14:18:32 crc kubenswrapper[4735]: I1008 14:18:32.914356 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="956ce2cc-9b1b-4a68-a6e5-8136f1d4c388" path="/var/lib/kubelet/pods/956ce2cc-9b1b-4a68-a6e5-8136f1d4c388/volumes" Oct 08 14:19:06 crc kubenswrapper[4735]: I1008 14:19:06.489732 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dd648"] Oct 08 14:19:06 crc kubenswrapper[4735]: E1008 14:19:06.490575 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="956ce2cc-9b1b-4a68-a6e5-8136f1d4c388" containerName="extract-utilities" Oct 08 14:19:06 crc kubenswrapper[4735]: I1008 14:19:06.490593 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="956ce2cc-9b1b-4a68-a6e5-8136f1d4c388" containerName="extract-utilities" Oct 08 14:19:06 crc kubenswrapper[4735]: E1008 14:19:06.490612 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="956ce2cc-9b1b-4a68-a6e5-8136f1d4c388" containerName="registry-server" Oct 08 14:19:06 crc kubenswrapper[4735]: I1008 14:19:06.490622 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="956ce2cc-9b1b-4a68-a6e5-8136f1d4c388" containerName="registry-server" Oct 08 14:19:06 crc kubenswrapper[4735]: E1008 14:19:06.490631 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="956ce2cc-9b1b-4a68-a6e5-8136f1d4c388" containerName="extract-content" Oct 08 14:19:06 crc kubenswrapper[4735]: I1008 14:19:06.490640 4735 
state_mem.go:107] "Deleted CPUSet assignment" podUID="956ce2cc-9b1b-4a68-a6e5-8136f1d4c388" containerName="extract-content" Oct 08 14:19:06 crc kubenswrapper[4735]: I1008 14:19:06.490787 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="956ce2cc-9b1b-4a68-a6e5-8136f1d4c388" containerName="registry-server" Oct 08 14:19:06 crc kubenswrapper[4735]: I1008 14:19:06.491876 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dd648" Oct 08 14:19:06 crc kubenswrapper[4735]: I1008 14:19:06.510016 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dd648"] Oct 08 14:19:06 crc kubenswrapper[4735]: I1008 14:19:06.627881 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gssv2\" (UniqueName: \"kubernetes.io/projected/bd6f758f-0c2e-422f-b1b8-1cc475a1036a-kube-api-access-gssv2\") pod \"certified-operators-dd648\" (UID: \"bd6f758f-0c2e-422f-b1b8-1cc475a1036a\") " pod="openshift-marketplace/certified-operators-dd648" Oct 08 14:19:06 crc kubenswrapper[4735]: I1008 14:19:06.628204 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd6f758f-0c2e-422f-b1b8-1cc475a1036a-utilities\") pod \"certified-operators-dd648\" (UID: \"bd6f758f-0c2e-422f-b1b8-1cc475a1036a\") " pod="openshift-marketplace/certified-operators-dd648" Oct 08 14:19:06 crc kubenswrapper[4735]: I1008 14:19:06.628347 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd6f758f-0c2e-422f-b1b8-1cc475a1036a-catalog-content\") pod \"certified-operators-dd648\" (UID: \"bd6f758f-0c2e-422f-b1b8-1cc475a1036a\") " pod="openshift-marketplace/certified-operators-dd648" Oct 08 14:19:06 crc kubenswrapper[4735]: I1008 
14:19:06.729337 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gssv2\" (UniqueName: \"kubernetes.io/projected/bd6f758f-0c2e-422f-b1b8-1cc475a1036a-kube-api-access-gssv2\") pod \"certified-operators-dd648\" (UID: \"bd6f758f-0c2e-422f-b1b8-1cc475a1036a\") " pod="openshift-marketplace/certified-operators-dd648" Oct 08 14:19:06 crc kubenswrapper[4735]: I1008 14:19:06.729625 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd6f758f-0c2e-422f-b1b8-1cc475a1036a-utilities\") pod \"certified-operators-dd648\" (UID: \"bd6f758f-0c2e-422f-b1b8-1cc475a1036a\") " pod="openshift-marketplace/certified-operators-dd648" Oct 08 14:19:06 crc kubenswrapper[4735]: I1008 14:19:06.729761 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd6f758f-0c2e-422f-b1b8-1cc475a1036a-catalog-content\") pod \"certified-operators-dd648\" (UID: \"bd6f758f-0c2e-422f-b1b8-1cc475a1036a\") " pod="openshift-marketplace/certified-operators-dd648" Oct 08 14:19:06 crc kubenswrapper[4735]: I1008 14:19:06.730275 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd6f758f-0c2e-422f-b1b8-1cc475a1036a-utilities\") pod \"certified-operators-dd648\" (UID: \"bd6f758f-0c2e-422f-b1b8-1cc475a1036a\") " pod="openshift-marketplace/certified-operators-dd648" Oct 08 14:19:06 crc kubenswrapper[4735]: I1008 14:19:06.730313 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd6f758f-0c2e-422f-b1b8-1cc475a1036a-catalog-content\") pod \"certified-operators-dd648\" (UID: \"bd6f758f-0c2e-422f-b1b8-1cc475a1036a\") " pod="openshift-marketplace/certified-operators-dd648" Oct 08 14:19:06 crc kubenswrapper[4735]: I1008 14:19:06.760889 4735 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gssv2\" (UniqueName: \"kubernetes.io/projected/bd6f758f-0c2e-422f-b1b8-1cc475a1036a-kube-api-access-gssv2\") pod \"certified-operators-dd648\" (UID: \"bd6f758f-0c2e-422f-b1b8-1cc475a1036a\") " pod="openshift-marketplace/certified-operators-dd648" Oct 08 14:19:06 crc kubenswrapper[4735]: I1008 14:19:06.817172 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dd648" Oct 08 14:19:07 crc kubenswrapper[4735]: I1008 14:19:07.301329 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dd648"] Oct 08 14:19:07 crc kubenswrapper[4735]: I1008 14:19:07.475031 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dd648" event={"ID":"bd6f758f-0c2e-422f-b1b8-1cc475a1036a","Type":"ContainerStarted","Data":"dec7bd7c26634d62283924fc8911902aacf8ecca9224472e2762a34a3bb43f41"} Oct 08 14:19:07 crc kubenswrapper[4735]: I1008 14:19:07.475077 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dd648" event={"ID":"bd6f758f-0c2e-422f-b1b8-1cc475a1036a","Type":"ContainerStarted","Data":"c9dff9d5293298cbd6b85b9038eda4196e6255e03003d7ec72bce84819a67b5e"} Oct 08 14:19:08 crc kubenswrapper[4735]: I1008 14:19:08.484856 4735 generic.go:334] "Generic (PLEG): container finished" podID="bd6f758f-0c2e-422f-b1b8-1cc475a1036a" containerID="dec7bd7c26634d62283924fc8911902aacf8ecca9224472e2762a34a3bb43f41" exitCode=0 Oct 08 14:19:08 crc kubenswrapper[4735]: I1008 14:19:08.484916 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dd648" event={"ID":"bd6f758f-0c2e-422f-b1b8-1cc475a1036a","Type":"ContainerDied","Data":"dec7bd7c26634d62283924fc8911902aacf8ecca9224472e2762a34a3bb43f41"} Oct 08 14:19:08 crc kubenswrapper[4735]: I1008 14:19:08.486585 4735 
provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 08 14:19:09 crc kubenswrapper[4735]: I1008 14:19:09.498309 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dd648" event={"ID":"bd6f758f-0c2e-422f-b1b8-1cc475a1036a","Type":"ContainerStarted","Data":"adb5a63d2007295a7c9e12c784f633a900072f687f3cfb5f916e2d7062bbfefc"} Oct 08 14:19:09 crc kubenswrapper[4735]: I1008 14:19:09.693428 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nshkv"] Oct 08 14:19:09 crc kubenswrapper[4735]: I1008 14:19:09.696402 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nshkv" Oct 08 14:19:09 crc kubenswrapper[4735]: I1008 14:19:09.710520 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nshkv"] Oct 08 14:19:09 crc kubenswrapper[4735]: I1008 14:19:09.884997 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcff45d5-ae76-407f-94e8-97a8cc5d5ca3-catalog-content\") pod \"redhat-operators-nshkv\" (UID: \"bcff45d5-ae76-407f-94e8-97a8cc5d5ca3\") " pod="openshift-marketplace/redhat-operators-nshkv" Oct 08 14:19:09 crc kubenswrapper[4735]: I1008 14:19:09.885112 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcff45d5-ae76-407f-94e8-97a8cc5d5ca3-utilities\") pod \"redhat-operators-nshkv\" (UID: \"bcff45d5-ae76-407f-94e8-97a8cc5d5ca3\") " pod="openshift-marketplace/redhat-operators-nshkv" Oct 08 14:19:09 crc kubenswrapper[4735]: I1008 14:19:09.885159 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mh552\" (UniqueName: 
\"kubernetes.io/projected/bcff45d5-ae76-407f-94e8-97a8cc5d5ca3-kube-api-access-mh552\") pod \"redhat-operators-nshkv\" (UID: \"bcff45d5-ae76-407f-94e8-97a8cc5d5ca3\") " pod="openshift-marketplace/redhat-operators-nshkv" Oct 08 14:19:09 crc kubenswrapper[4735]: I1008 14:19:09.987016 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcff45d5-ae76-407f-94e8-97a8cc5d5ca3-utilities\") pod \"redhat-operators-nshkv\" (UID: \"bcff45d5-ae76-407f-94e8-97a8cc5d5ca3\") " pod="openshift-marketplace/redhat-operators-nshkv" Oct 08 14:19:09 crc kubenswrapper[4735]: I1008 14:19:09.987100 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mh552\" (UniqueName: \"kubernetes.io/projected/bcff45d5-ae76-407f-94e8-97a8cc5d5ca3-kube-api-access-mh552\") pod \"redhat-operators-nshkv\" (UID: \"bcff45d5-ae76-407f-94e8-97a8cc5d5ca3\") " pod="openshift-marketplace/redhat-operators-nshkv" Oct 08 14:19:09 crc kubenswrapper[4735]: I1008 14:19:09.987152 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcff45d5-ae76-407f-94e8-97a8cc5d5ca3-catalog-content\") pod \"redhat-operators-nshkv\" (UID: \"bcff45d5-ae76-407f-94e8-97a8cc5d5ca3\") " pod="openshift-marketplace/redhat-operators-nshkv" Oct 08 14:19:09 crc kubenswrapper[4735]: I1008 14:19:09.987701 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcff45d5-ae76-407f-94e8-97a8cc5d5ca3-catalog-content\") pod \"redhat-operators-nshkv\" (UID: \"bcff45d5-ae76-407f-94e8-97a8cc5d5ca3\") " pod="openshift-marketplace/redhat-operators-nshkv" Oct 08 14:19:09 crc kubenswrapper[4735]: I1008 14:19:09.987819 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/bcff45d5-ae76-407f-94e8-97a8cc5d5ca3-utilities\") pod \"redhat-operators-nshkv\" (UID: \"bcff45d5-ae76-407f-94e8-97a8cc5d5ca3\") " pod="openshift-marketplace/redhat-operators-nshkv" Oct 08 14:19:10 crc kubenswrapper[4735]: I1008 14:19:10.007969 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mh552\" (UniqueName: \"kubernetes.io/projected/bcff45d5-ae76-407f-94e8-97a8cc5d5ca3-kube-api-access-mh552\") pod \"redhat-operators-nshkv\" (UID: \"bcff45d5-ae76-407f-94e8-97a8cc5d5ca3\") " pod="openshift-marketplace/redhat-operators-nshkv" Oct 08 14:19:10 crc kubenswrapper[4735]: I1008 14:19:10.024618 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nshkv" Oct 08 14:19:10 crc kubenswrapper[4735]: I1008 14:19:10.461218 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nshkv"] Oct 08 14:19:10 crc kubenswrapper[4735]: I1008 14:19:10.508380 4735 generic.go:334] "Generic (PLEG): container finished" podID="bd6f758f-0c2e-422f-b1b8-1cc475a1036a" containerID="adb5a63d2007295a7c9e12c784f633a900072f687f3cfb5f916e2d7062bbfefc" exitCode=0 Oct 08 14:19:10 crc kubenswrapper[4735]: I1008 14:19:10.508514 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dd648" event={"ID":"bd6f758f-0c2e-422f-b1b8-1cc475a1036a","Type":"ContainerDied","Data":"adb5a63d2007295a7c9e12c784f633a900072f687f3cfb5f916e2d7062bbfefc"} Oct 08 14:19:10 crc kubenswrapper[4735]: I1008 14:19:10.517251 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nshkv" event={"ID":"bcff45d5-ae76-407f-94e8-97a8cc5d5ca3","Type":"ContainerStarted","Data":"f5092a7e51debe40d7495d6d3c612a8da2463ac3f53ce51a2a75c0c6d4409483"} Oct 08 14:19:11 crc kubenswrapper[4735]: I1008 14:19:11.525410 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-dd648" event={"ID":"bd6f758f-0c2e-422f-b1b8-1cc475a1036a","Type":"ContainerStarted","Data":"67c2d570ac6ebec2b661978f95adce2cc07ee917e205f7a370f27c39616e4dc6"} Oct 08 14:19:11 crc kubenswrapper[4735]: I1008 14:19:11.527283 4735 generic.go:334] "Generic (PLEG): container finished" podID="bcff45d5-ae76-407f-94e8-97a8cc5d5ca3" containerID="93047d5fd78ec4ea587aeea205519fc32e4e7895a6da8dbf9eeebb8b5cede6e6" exitCode=0 Oct 08 14:19:11 crc kubenswrapper[4735]: I1008 14:19:11.527324 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nshkv" event={"ID":"bcff45d5-ae76-407f-94e8-97a8cc5d5ca3","Type":"ContainerDied","Data":"93047d5fd78ec4ea587aeea205519fc32e4e7895a6da8dbf9eeebb8b5cede6e6"} Oct 08 14:19:12 crc kubenswrapper[4735]: I1008 14:19:12.538160 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nshkv" event={"ID":"bcff45d5-ae76-407f-94e8-97a8cc5d5ca3","Type":"ContainerStarted","Data":"760bb692240872d2ff560cd1ba7d7c846451a444a795b522b5286870b3053a9d"} Oct 08 14:19:12 crc kubenswrapper[4735]: I1008 14:19:12.563132 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dd648" podStartSLOduration=3.826663403 podStartE2EDuration="6.563110326s" podCreationTimestamp="2025-10-08 14:19:06 +0000 UTC" firstStartedPulling="2025-10-08 14:19:08.486187466 +0000 UTC m=+4998.294492010" lastFinishedPulling="2025-10-08 14:19:11.222634389 +0000 UTC m=+5001.030938933" observedRunningTime="2025-10-08 14:19:12.555164983 +0000 UTC m=+5002.363469537" watchObservedRunningTime="2025-10-08 14:19:12.563110326 +0000 UTC m=+5002.371414870" Oct 08 14:19:13 crc kubenswrapper[4735]: I1008 14:19:13.565439 4735 generic.go:334] "Generic (PLEG): container finished" podID="bcff45d5-ae76-407f-94e8-97a8cc5d5ca3" containerID="760bb692240872d2ff560cd1ba7d7c846451a444a795b522b5286870b3053a9d" exitCode=0 Oct 08 
14:19:13 crc kubenswrapper[4735]: I1008 14:19:13.565628 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nshkv" event={"ID":"bcff45d5-ae76-407f-94e8-97a8cc5d5ca3","Type":"ContainerDied","Data":"760bb692240872d2ff560cd1ba7d7c846451a444a795b522b5286870b3053a9d"} Oct 08 14:19:14 crc kubenswrapper[4735]: I1008 14:19:14.575379 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nshkv" event={"ID":"bcff45d5-ae76-407f-94e8-97a8cc5d5ca3","Type":"ContainerStarted","Data":"11f30301cb1391bd8ccfd788017de0fa8b665b8cd71775fa6c81c907424c0304"} Oct 08 14:19:16 crc kubenswrapper[4735]: I1008 14:19:16.818049 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dd648" Oct 08 14:19:16 crc kubenswrapper[4735]: I1008 14:19:16.818419 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dd648" Oct 08 14:19:16 crc kubenswrapper[4735]: I1008 14:19:16.863848 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dd648" Oct 08 14:19:16 crc kubenswrapper[4735]: I1008 14:19:16.879408 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nshkv" podStartSLOduration=5.177720923 podStartE2EDuration="7.879390936s" podCreationTimestamp="2025-10-08 14:19:09 +0000 UTC" firstStartedPulling="2025-10-08 14:19:11.530244397 +0000 UTC m=+5001.338548941" lastFinishedPulling="2025-10-08 14:19:14.23191441 +0000 UTC m=+5004.040218954" observedRunningTime="2025-10-08 14:19:14.591494758 +0000 UTC m=+5004.399799302" watchObservedRunningTime="2025-10-08 14:19:16.879390936 +0000 UTC m=+5006.687695480" Oct 08 14:19:17 crc kubenswrapper[4735]: I1008 14:19:17.647559 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/certified-operators-dd648" Oct 08 14:19:18 crc kubenswrapper[4735]: I1008 14:19:18.070830 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dd648"] Oct 08 14:19:19 crc kubenswrapper[4735]: I1008 14:19:19.644550 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dd648" podUID="bd6f758f-0c2e-422f-b1b8-1cc475a1036a" containerName="registry-server" containerID="cri-o://67c2d570ac6ebec2b661978f95adce2cc07ee917e205f7a370f27c39616e4dc6" gracePeriod=2 Oct 08 14:19:20 crc kubenswrapper[4735]: I1008 14:19:20.026238 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nshkv" Oct 08 14:19:20 crc kubenswrapper[4735]: I1008 14:19:20.026284 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nshkv" Oct 08 14:19:20 crc kubenswrapper[4735]: I1008 14:19:20.070508 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nshkv" Oct 08 14:19:20 crc kubenswrapper[4735]: I1008 14:19:20.655576 4735 generic.go:334] "Generic (PLEG): container finished" podID="bd6f758f-0c2e-422f-b1b8-1cc475a1036a" containerID="67c2d570ac6ebec2b661978f95adce2cc07ee917e205f7a370f27c39616e4dc6" exitCode=0 Oct 08 14:19:20 crc kubenswrapper[4735]: I1008 14:19:20.655653 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dd648" event={"ID":"bd6f758f-0c2e-422f-b1b8-1cc475a1036a","Type":"ContainerDied","Data":"67c2d570ac6ebec2b661978f95adce2cc07ee917e205f7a370f27c39616e4dc6"} Oct 08 14:19:20 crc kubenswrapper[4735]: I1008 14:19:20.704540 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nshkv" Oct 08 14:19:21 crc kubenswrapper[4735]: I1008 14:19:21.099367 4735 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dd648" Oct 08 14:19:21 crc kubenswrapper[4735]: I1008 14:19:21.273836 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd6f758f-0c2e-422f-b1b8-1cc475a1036a-catalog-content\") pod \"bd6f758f-0c2e-422f-b1b8-1cc475a1036a\" (UID: \"bd6f758f-0c2e-422f-b1b8-1cc475a1036a\") " Oct 08 14:19:21 crc kubenswrapper[4735]: I1008 14:19:21.273912 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gssv2\" (UniqueName: \"kubernetes.io/projected/bd6f758f-0c2e-422f-b1b8-1cc475a1036a-kube-api-access-gssv2\") pod \"bd6f758f-0c2e-422f-b1b8-1cc475a1036a\" (UID: \"bd6f758f-0c2e-422f-b1b8-1cc475a1036a\") " Oct 08 14:19:21 crc kubenswrapper[4735]: I1008 14:19:21.273984 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd6f758f-0c2e-422f-b1b8-1cc475a1036a-utilities\") pod \"bd6f758f-0c2e-422f-b1b8-1cc475a1036a\" (UID: \"bd6f758f-0c2e-422f-b1b8-1cc475a1036a\") " Oct 08 14:19:21 crc kubenswrapper[4735]: I1008 14:19:21.275173 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd6f758f-0c2e-422f-b1b8-1cc475a1036a-utilities" (OuterVolumeSpecName: "utilities") pod "bd6f758f-0c2e-422f-b1b8-1cc475a1036a" (UID: "bd6f758f-0c2e-422f-b1b8-1cc475a1036a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 14:19:21 crc kubenswrapper[4735]: I1008 14:19:21.282206 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd6f758f-0c2e-422f-b1b8-1cc475a1036a-kube-api-access-gssv2" (OuterVolumeSpecName: "kube-api-access-gssv2") pod "bd6f758f-0c2e-422f-b1b8-1cc475a1036a" (UID: "bd6f758f-0c2e-422f-b1b8-1cc475a1036a"). 
InnerVolumeSpecName "kube-api-access-gssv2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:19:21 crc kubenswrapper[4735]: I1008 14:19:21.375973 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gssv2\" (UniqueName: \"kubernetes.io/projected/bd6f758f-0c2e-422f-b1b8-1cc475a1036a-kube-api-access-gssv2\") on node \"crc\" DevicePath \"\"" Oct 08 14:19:21 crc kubenswrapper[4735]: I1008 14:19:21.376013 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd6f758f-0c2e-422f-b1b8-1cc475a1036a-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 14:19:21 crc kubenswrapper[4735]: I1008 14:19:21.632395 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd6f758f-0c2e-422f-b1b8-1cc475a1036a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bd6f758f-0c2e-422f-b1b8-1cc475a1036a" (UID: "bd6f758f-0c2e-422f-b1b8-1cc475a1036a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 14:19:21 crc kubenswrapper[4735]: I1008 14:19:21.665976 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dd648" event={"ID":"bd6f758f-0c2e-422f-b1b8-1cc475a1036a","Type":"ContainerDied","Data":"c9dff9d5293298cbd6b85b9038eda4196e6255e03003d7ec72bce84819a67b5e"} Oct 08 14:19:21 crc kubenswrapper[4735]: I1008 14:19:21.666064 4735 scope.go:117] "RemoveContainer" containerID="67c2d570ac6ebec2b661978f95adce2cc07ee917e205f7a370f27c39616e4dc6" Oct 08 14:19:21 crc kubenswrapper[4735]: I1008 14:19:21.666115 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dd648" Oct 08 14:19:21 crc kubenswrapper[4735]: I1008 14:19:21.680729 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd6f758f-0c2e-422f-b1b8-1cc475a1036a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 14:19:21 crc kubenswrapper[4735]: I1008 14:19:21.689521 4735 scope.go:117] "RemoveContainer" containerID="adb5a63d2007295a7c9e12c784f633a900072f687f3cfb5f916e2d7062bbfefc" Oct 08 14:19:21 crc kubenswrapper[4735]: I1008 14:19:21.702930 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dd648"] Oct 08 14:19:21 crc kubenswrapper[4735]: I1008 14:19:21.710620 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dd648"] Oct 08 14:19:21 crc kubenswrapper[4735]: I1008 14:19:21.717097 4735 scope.go:117] "RemoveContainer" containerID="dec7bd7c26634d62283924fc8911902aacf8ecca9224472e2762a34a3bb43f41" Oct 08 14:19:22 crc kubenswrapper[4735]: I1008 14:19:22.073402 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nshkv"] Oct 08 14:19:22 crc kubenswrapper[4735]: I1008 14:19:22.674833 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nshkv" podUID="bcff45d5-ae76-407f-94e8-97a8cc5d5ca3" containerName="registry-server" containerID="cri-o://11f30301cb1391bd8ccfd788017de0fa8b665b8cd71775fa6c81c907424c0304" gracePeriod=2 Oct 08 14:19:22 crc kubenswrapper[4735]: I1008 14:19:22.907874 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd6f758f-0c2e-422f-b1b8-1cc475a1036a" path="/var/lib/kubelet/pods/bd6f758f-0c2e-422f-b1b8-1cc475a1036a/volumes" Oct 08 14:19:23 crc kubenswrapper[4735]: I1008 14:19:23.089468 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nshkv" Oct 08 14:19:23 crc kubenswrapper[4735]: I1008 14:19:23.201464 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mh552\" (UniqueName: \"kubernetes.io/projected/bcff45d5-ae76-407f-94e8-97a8cc5d5ca3-kube-api-access-mh552\") pod \"bcff45d5-ae76-407f-94e8-97a8cc5d5ca3\" (UID: \"bcff45d5-ae76-407f-94e8-97a8cc5d5ca3\") " Oct 08 14:19:23 crc kubenswrapper[4735]: I1008 14:19:23.201646 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcff45d5-ae76-407f-94e8-97a8cc5d5ca3-utilities\") pod \"bcff45d5-ae76-407f-94e8-97a8cc5d5ca3\" (UID: \"bcff45d5-ae76-407f-94e8-97a8cc5d5ca3\") " Oct 08 14:19:23 crc kubenswrapper[4735]: I1008 14:19:23.201842 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcff45d5-ae76-407f-94e8-97a8cc5d5ca3-catalog-content\") pod \"bcff45d5-ae76-407f-94e8-97a8cc5d5ca3\" (UID: \"bcff45d5-ae76-407f-94e8-97a8cc5d5ca3\") " Oct 08 14:19:23 crc kubenswrapper[4735]: I1008 14:19:23.202558 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcff45d5-ae76-407f-94e8-97a8cc5d5ca3-utilities" (OuterVolumeSpecName: "utilities") pod "bcff45d5-ae76-407f-94e8-97a8cc5d5ca3" (UID: "bcff45d5-ae76-407f-94e8-97a8cc5d5ca3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 14:19:23 crc kubenswrapper[4735]: I1008 14:19:23.209694 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcff45d5-ae76-407f-94e8-97a8cc5d5ca3-kube-api-access-mh552" (OuterVolumeSpecName: "kube-api-access-mh552") pod "bcff45d5-ae76-407f-94e8-97a8cc5d5ca3" (UID: "bcff45d5-ae76-407f-94e8-97a8cc5d5ca3"). InnerVolumeSpecName "kube-api-access-mh552". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:19:23 crc kubenswrapper[4735]: I1008 14:19:23.289113 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcff45d5-ae76-407f-94e8-97a8cc5d5ca3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bcff45d5-ae76-407f-94e8-97a8cc5d5ca3" (UID: "bcff45d5-ae76-407f-94e8-97a8cc5d5ca3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 14:19:23 crc kubenswrapper[4735]: I1008 14:19:23.303214 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mh552\" (UniqueName: \"kubernetes.io/projected/bcff45d5-ae76-407f-94e8-97a8cc5d5ca3-kube-api-access-mh552\") on node \"crc\" DevicePath \"\"" Oct 08 14:19:23 crc kubenswrapper[4735]: I1008 14:19:23.303257 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcff45d5-ae76-407f-94e8-97a8cc5d5ca3-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 14:19:23 crc kubenswrapper[4735]: I1008 14:19:23.303270 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcff45d5-ae76-407f-94e8-97a8cc5d5ca3-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 14:19:23 crc kubenswrapper[4735]: I1008 14:19:23.686555 4735 generic.go:334] "Generic (PLEG): container finished" podID="bcff45d5-ae76-407f-94e8-97a8cc5d5ca3" containerID="11f30301cb1391bd8ccfd788017de0fa8b665b8cd71775fa6c81c907424c0304" exitCode=0 Oct 08 14:19:23 crc kubenswrapper[4735]: I1008 14:19:23.686618 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nshkv" Oct 08 14:19:23 crc kubenswrapper[4735]: I1008 14:19:23.686623 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nshkv" event={"ID":"bcff45d5-ae76-407f-94e8-97a8cc5d5ca3","Type":"ContainerDied","Data":"11f30301cb1391bd8ccfd788017de0fa8b665b8cd71775fa6c81c907424c0304"} Oct 08 14:19:23 crc kubenswrapper[4735]: I1008 14:19:23.686668 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nshkv" event={"ID":"bcff45d5-ae76-407f-94e8-97a8cc5d5ca3","Type":"ContainerDied","Data":"f5092a7e51debe40d7495d6d3c612a8da2463ac3f53ce51a2a75c0c6d4409483"} Oct 08 14:19:23 crc kubenswrapper[4735]: I1008 14:19:23.686696 4735 scope.go:117] "RemoveContainer" containerID="11f30301cb1391bd8ccfd788017de0fa8b665b8cd71775fa6c81c907424c0304" Oct 08 14:19:23 crc kubenswrapper[4735]: I1008 14:19:23.709145 4735 scope.go:117] "RemoveContainer" containerID="760bb692240872d2ff560cd1ba7d7c846451a444a795b522b5286870b3053a9d" Oct 08 14:19:23 crc kubenswrapper[4735]: I1008 14:19:23.723732 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nshkv"] Oct 08 14:19:23 crc kubenswrapper[4735]: I1008 14:19:23.732023 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nshkv"] Oct 08 14:19:23 crc kubenswrapper[4735]: I1008 14:19:23.745669 4735 scope.go:117] "RemoveContainer" containerID="93047d5fd78ec4ea587aeea205519fc32e4e7895a6da8dbf9eeebb8b5cede6e6" Oct 08 14:19:23 crc kubenswrapper[4735]: I1008 14:19:23.766364 4735 scope.go:117] "RemoveContainer" containerID="11f30301cb1391bd8ccfd788017de0fa8b665b8cd71775fa6c81c907424c0304" Oct 08 14:19:23 crc kubenswrapper[4735]: E1008 14:19:23.766859 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"11f30301cb1391bd8ccfd788017de0fa8b665b8cd71775fa6c81c907424c0304\": container with ID starting with 11f30301cb1391bd8ccfd788017de0fa8b665b8cd71775fa6c81c907424c0304 not found: ID does not exist" containerID="11f30301cb1391bd8ccfd788017de0fa8b665b8cd71775fa6c81c907424c0304" Oct 08 14:19:23 crc kubenswrapper[4735]: I1008 14:19:23.766887 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11f30301cb1391bd8ccfd788017de0fa8b665b8cd71775fa6c81c907424c0304"} err="failed to get container status \"11f30301cb1391bd8ccfd788017de0fa8b665b8cd71775fa6c81c907424c0304\": rpc error: code = NotFound desc = could not find container \"11f30301cb1391bd8ccfd788017de0fa8b665b8cd71775fa6c81c907424c0304\": container with ID starting with 11f30301cb1391bd8ccfd788017de0fa8b665b8cd71775fa6c81c907424c0304 not found: ID does not exist" Oct 08 14:19:23 crc kubenswrapper[4735]: I1008 14:19:23.766906 4735 scope.go:117] "RemoveContainer" containerID="760bb692240872d2ff560cd1ba7d7c846451a444a795b522b5286870b3053a9d" Oct 08 14:19:23 crc kubenswrapper[4735]: E1008 14:19:23.767318 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"760bb692240872d2ff560cd1ba7d7c846451a444a795b522b5286870b3053a9d\": container with ID starting with 760bb692240872d2ff560cd1ba7d7c846451a444a795b522b5286870b3053a9d not found: ID does not exist" containerID="760bb692240872d2ff560cd1ba7d7c846451a444a795b522b5286870b3053a9d" Oct 08 14:19:23 crc kubenswrapper[4735]: I1008 14:19:23.767369 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"760bb692240872d2ff560cd1ba7d7c846451a444a795b522b5286870b3053a9d"} err="failed to get container status \"760bb692240872d2ff560cd1ba7d7c846451a444a795b522b5286870b3053a9d\": rpc error: code = NotFound desc = could not find container \"760bb692240872d2ff560cd1ba7d7c846451a444a795b522b5286870b3053a9d\": container with ID 
starting with 760bb692240872d2ff560cd1ba7d7c846451a444a795b522b5286870b3053a9d not found: ID does not exist" Oct 08 14:19:23 crc kubenswrapper[4735]: I1008 14:19:23.767400 4735 scope.go:117] "RemoveContainer" containerID="93047d5fd78ec4ea587aeea205519fc32e4e7895a6da8dbf9eeebb8b5cede6e6" Oct 08 14:19:23 crc kubenswrapper[4735]: E1008 14:19:23.768058 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93047d5fd78ec4ea587aeea205519fc32e4e7895a6da8dbf9eeebb8b5cede6e6\": container with ID starting with 93047d5fd78ec4ea587aeea205519fc32e4e7895a6da8dbf9eeebb8b5cede6e6 not found: ID does not exist" containerID="93047d5fd78ec4ea587aeea205519fc32e4e7895a6da8dbf9eeebb8b5cede6e6" Oct 08 14:19:23 crc kubenswrapper[4735]: I1008 14:19:23.768114 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93047d5fd78ec4ea587aeea205519fc32e4e7895a6da8dbf9eeebb8b5cede6e6"} err="failed to get container status \"93047d5fd78ec4ea587aeea205519fc32e4e7895a6da8dbf9eeebb8b5cede6e6\": rpc error: code = NotFound desc = could not find container \"93047d5fd78ec4ea587aeea205519fc32e4e7895a6da8dbf9eeebb8b5cede6e6\": container with ID starting with 93047d5fd78ec4ea587aeea205519fc32e4e7895a6da8dbf9eeebb8b5cede6e6 not found: ID does not exist" Oct 08 14:19:24 crc kubenswrapper[4735]: I1008 14:19:24.878506 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 14:19:24 crc kubenswrapper[4735]: I1008 14:19:24.878760 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 14:19:24 crc kubenswrapper[4735]: I1008 14:19:24.904967 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcff45d5-ae76-407f-94e8-97a8cc5d5ca3" path="/var/lib/kubelet/pods/bcff45d5-ae76-407f-94e8-97a8cc5d5ca3/volumes" Oct 08 14:19:54 crc kubenswrapper[4735]: I1008 14:19:54.878671 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 14:19:54 crc kubenswrapper[4735]: I1008 14:19:54.879195 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 14:20:24 crc kubenswrapper[4735]: I1008 14:20:24.878501 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 14:20:24 crc kubenswrapper[4735]: I1008 14:20:24.878913 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 14:20:24 crc kubenswrapper[4735]: I1008 14:20:24.878956 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" Oct 08 14:20:24 crc kubenswrapper[4735]: I1008 14:20:24.879528 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cec388ee120153babb4ba1df9c16c317b59b98c70ffddd569ed770c55e343565"} pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 14:20:24 crc kubenswrapper[4735]: I1008 14:20:24.879584 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" containerID="cri-o://cec388ee120153babb4ba1df9c16c317b59b98c70ffddd569ed770c55e343565" gracePeriod=600 Oct 08 14:20:25 crc kubenswrapper[4735]: I1008 14:20:25.198792 4735 generic.go:334] "Generic (PLEG): container finished" podID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerID="cec388ee120153babb4ba1df9c16c317b59b98c70ffddd569ed770c55e343565" exitCode=0 Oct 08 14:20:25 crc kubenswrapper[4735]: I1008 14:20:25.198868 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" event={"ID":"2dfd0474-f64a-4d3b-9990-df182d079b10","Type":"ContainerDied","Data":"cec388ee120153babb4ba1df9c16c317b59b98c70ffddd569ed770c55e343565"} Oct 08 14:20:25 crc kubenswrapper[4735]: I1008 14:20:25.199153 4735 scope.go:117] "RemoveContainer" containerID="7ff2ebd06cea5cf44fc587b3ec8c967642921329412c26c223914219d4254eaf" Oct 08 14:20:26 crc kubenswrapper[4735]: I1008 14:20:26.210727 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" 
event={"ID":"2dfd0474-f64a-4d3b-9990-df182d079b10","Type":"ContainerStarted","Data":"d6c737b897aa463dba6310d9840b3f5b8e05389b1e2bc8c1a8b7b77133f4dc28"} Oct 08 14:20:57 crc kubenswrapper[4735]: I1008 14:20:57.266365 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-copy-data"] Oct 08 14:20:57 crc kubenswrapper[4735]: E1008 14:20:57.268970 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd6f758f-0c2e-422f-b1b8-1cc475a1036a" containerName="extract-content" Oct 08 14:20:57 crc kubenswrapper[4735]: I1008 14:20:57.268996 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd6f758f-0c2e-422f-b1b8-1cc475a1036a" containerName="extract-content" Oct 08 14:20:57 crc kubenswrapper[4735]: E1008 14:20:57.269012 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcff45d5-ae76-407f-94e8-97a8cc5d5ca3" containerName="registry-server" Oct 08 14:20:57 crc kubenswrapper[4735]: I1008 14:20:57.269025 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcff45d5-ae76-407f-94e8-97a8cc5d5ca3" containerName="registry-server" Oct 08 14:20:57 crc kubenswrapper[4735]: E1008 14:20:57.269048 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcff45d5-ae76-407f-94e8-97a8cc5d5ca3" containerName="extract-content" Oct 08 14:20:57 crc kubenswrapper[4735]: I1008 14:20:57.269059 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcff45d5-ae76-407f-94e8-97a8cc5d5ca3" containerName="extract-content" Oct 08 14:20:57 crc kubenswrapper[4735]: E1008 14:20:57.269079 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcff45d5-ae76-407f-94e8-97a8cc5d5ca3" containerName="extract-utilities" Oct 08 14:20:57 crc kubenswrapper[4735]: I1008 14:20:57.269090 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcff45d5-ae76-407f-94e8-97a8cc5d5ca3" containerName="extract-utilities" Oct 08 14:20:57 crc kubenswrapper[4735]: E1008 14:20:57.269109 4735 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="bd6f758f-0c2e-422f-b1b8-1cc475a1036a" containerName="registry-server" Oct 08 14:20:57 crc kubenswrapper[4735]: I1008 14:20:57.269119 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd6f758f-0c2e-422f-b1b8-1cc475a1036a" containerName="registry-server" Oct 08 14:20:57 crc kubenswrapper[4735]: E1008 14:20:57.269148 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd6f758f-0c2e-422f-b1b8-1cc475a1036a" containerName="extract-utilities" Oct 08 14:20:57 crc kubenswrapper[4735]: I1008 14:20:57.269159 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd6f758f-0c2e-422f-b1b8-1cc475a1036a" containerName="extract-utilities" Oct 08 14:20:57 crc kubenswrapper[4735]: I1008 14:20:57.269393 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcff45d5-ae76-407f-94e8-97a8cc5d5ca3" containerName="registry-server" Oct 08 14:20:57 crc kubenswrapper[4735]: I1008 14:20:57.269437 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd6f758f-0c2e-422f-b1b8-1cc475a1036a" containerName="registry-server" Oct 08 14:20:57 crc kubenswrapper[4735]: I1008 14:20:57.270243 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Oct 08 14:20:57 crc kubenswrapper[4735]: I1008 14:20:57.278182 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-ss8tp" Oct 08 14:20:57 crc kubenswrapper[4735]: I1008 14:20:57.278390 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Oct 08 14:20:57 crc kubenswrapper[4735]: I1008 14:20:57.380085 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-85550486-2feb-441e-b941-8016761495c6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85550486-2feb-441e-b941-8016761495c6\") pod \"mariadb-copy-data\" (UID: \"3c84d6ea-2c33-4074-b75c-eb824ba23e0e\") " pod="openstack/mariadb-copy-data" Oct 08 14:20:57 crc kubenswrapper[4735]: I1008 14:20:57.380168 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65wv6\" (UniqueName: \"kubernetes.io/projected/3c84d6ea-2c33-4074-b75c-eb824ba23e0e-kube-api-access-65wv6\") pod \"mariadb-copy-data\" (UID: \"3c84d6ea-2c33-4074-b75c-eb824ba23e0e\") " pod="openstack/mariadb-copy-data" Oct 08 14:20:57 crc kubenswrapper[4735]: I1008 14:20:57.481267 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65wv6\" (UniqueName: \"kubernetes.io/projected/3c84d6ea-2c33-4074-b75c-eb824ba23e0e-kube-api-access-65wv6\") pod \"mariadb-copy-data\" (UID: \"3c84d6ea-2c33-4074-b75c-eb824ba23e0e\") " pod="openstack/mariadb-copy-data" Oct 08 14:20:57 crc kubenswrapper[4735]: I1008 14:20:57.481499 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-85550486-2feb-441e-b941-8016761495c6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85550486-2feb-441e-b941-8016761495c6\") pod \"mariadb-copy-data\" (UID: \"3c84d6ea-2c33-4074-b75c-eb824ba23e0e\") " pod="openstack/mariadb-copy-data" 
Oct 08 14:20:57 crc kubenswrapper[4735]: I1008 14:20:57.484206 4735 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 08 14:20:57 crc kubenswrapper[4735]: I1008 14:20:57.484255 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-85550486-2feb-441e-b941-8016761495c6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85550486-2feb-441e-b941-8016761495c6\") pod \"mariadb-copy-data\" (UID: \"3c84d6ea-2c33-4074-b75c-eb824ba23e0e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/46bce4e4681ca2ae2fc75cea775a4006af07f1cce0c890e8a7706593aaa5b56b/globalmount\"" pod="openstack/mariadb-copy-data" Oct 08 14:20:57 crc kubenswrapper[4735]: I1008 14:20:57.508800 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65wv6\" (UniqueName: \"kubernetes.io/projected/3c84d6ea-2c33-4074-b75c-eb824ba23e0e-kube-api-access-65wv6\") pod \"mariadb-copy-data\" (UID: \"3c84d6ea-2c33-4074-b75c-eb824ba23e0e\") " pod="openstack/mariadb-copy-data" Oct 08 14:20:57 crc kubenswrapper[4735]: I1008 14:20:57.525587 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-85550486-2feb-441e-b941-8016761495c6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85550486-2feb-441e-b941-8016761495c6\") pod \"mariadb-copy-data\" (UID: \"3c84d6ea-2c33-4074-b75c-eb824ba23e0e\") " pod="openstack/mariadb-copy-data" Oct 08 14:20:57 crc kubenswrapper[4735]: I1008 14:20:57.591736 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Oct 08 14:20:58 crc kubenswrapper[4735]: I1008 14:20:58.094760 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Oct 08 14:20:58 crc kubenswrapper[4735]: I1008 14:20:58.490343 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"3c84d6ea-2c33-4074-b75c-eb824ba23e0e","Type":"ContainerStarted","Data":"eb11f3a9c733702ab56bb5ba312fd6fa52258682046a56250603847f4a307355"} Oct 08 14:20:58 crc kubenswrapper[4735]: I1008 14:20:58.490393 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"3c84d6ea-2c33-4074-b75c-eb824ba23e0e","Type":"ContainerStarted","Data":"362d94cda8274dd9092e96e52925a5194cd1e92487cc9f028b69f8e992226018"} Oct 08 14:20:58 crc kubenswrapper[4735]: I1008 14:20:58.509183 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-copy-data" podStartSLOduration=2.5091582 podStartE2EDuration="2.5091582s" podCreationTimestamp="2025-10-08 14:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:20:58.504918262 +0000 UTC m=+5108.313222796" watchObservedRunningTime="2025-10-08 14:20:58.5091582 +0000 UTC m=+5108.317462764" Oct 08 14:21:00 crc kubenswrapper[4735]: I1008 14:21:00.436448 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Oct 08 14:21:00 crc kubenswrapper[4735]: I1008 14:21:00.439322 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Oct 08 14:21:00 crc kubenswrapper[4735]: I1008 14:21:00.444820 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Oct 08 14:21:00 crc kubenswrapper[4735]: I1008 14:21:00.531319 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6d2l\" (UniqueName: \"kubernetes.io/projected/6bdcbf0a-e0f2-493a-b1cf-4cbbf31e20fa-kube-api-access-h6d2l\") pod \"mariadb-client\" (UID: \"6bdcbf0a-e0f2-493a-b1cf-4cbbf31e20fa\") " pod="openstack/mariadb-client" Oct 08 14:21:00 crc kubenswrapper[4735]: I1008 14:21:00.633465 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6d2l\" (UniqueName: \"kubernetes.io/projected/6bdcbf0a-e0f2-493a-b1cf-4cbbf31e20fa-kube-api-access-h6d2l\") pod \"mariadb-client\" (UID: \"6bdcbf0a-e0f2-493a-b1cf-4cbbf31e20fa\") " pod="openstack/mariadb-client" Oct 08 14:21:00 crc kubenswrapper[4735]: I1008 14:21:00.653708 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6d2l\" (UniqueName: \"kubernetes.io/projected/6bdcbf0a-e0f2-493a-b1cf-4cbbf31e20fa-kube-api-access-h6d2l\") pod \"mariadb-client\" (UID: \"6bdcbf0a-e0f2-493a-b1cf-4cbbf31e20fa\") " pod="openstack/mariadb-client" Oct 08 14:21:00 crc kubenswrapper[4735]: I1008 14:21:00.764173 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Oct 08 14:21:01 crc kubenswrapper[4735]: I1008 14:21:01.210953 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Oct 08 14:21:01 crc kubenswrapper[4735]: I1008 14:21:01.510655 4735 generic.go:334] "Generic (PLEG): container finished" podID="6bdcbf0a-e0f2-493a-b1cf-4cbbf31e20fa" containerID="c2c9e03cf806c3171d1a46ab435a0a11b10fadadf815b5db33a061d6fa30f581" exitCode=0 Oct 08 14:21:01 crc kubenswrapper[4735]: I1008 14:21:01.510716 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"6bdcbf0a-e0f2-493a-b1cf-4cbbf31e20fa","Type":"ContainerDied","Data":"c2c9e03cf806c3171d1a46ab435a0a11b10fadadf815b5db33a061d6fa30f581"} Oct 08 14:21:01 crc kubenswrapper[4735]: I1008 14:21:01.510768 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"6bdcbf0a-e0f2-493a-b1cf-4cbbf31e20fa","Type":"ContainerStarted","Data":"8170b91cfb258d257363cf9d14a6ff84de3067a111be84c119a4587443266cf4"} Oct 08 14:21:02 crc kubenswrapper[4735]: I1008 14:21:02.941652 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Oct 08 14:21:02 crc kubenswrapper[4735]: I1008 14:21:02.968078 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_6bdcbf0a-e0f2-493a-b1cf-4cbbf31e20fa/mariadb-client/0.log" Oct 08 14:21:02 crc kubenswrapper[4735]: I1008 14:21:02.997559 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Oct 08 14:21:03 crc kubenswrapper[4735]: I1008 14:21:03.004077 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Oct 08 14:21:03 crc kubenswrapper[4735]: I1008 14:21:03.071961 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6d2l\" (UniqueName: \"kubernetes.io/projected/6bdcbf0a-e0f2-493a-b1cf-4cbbf31e20fa-kube-api-access-h6d2l\") pod \"6bdcbf0a-e0f2-493a-b1cf-4cbbf31e20fa\" (UID: \"6bdcbf0a-e0f2-493a-b1cf-4cbbf31e20fa\") " Oct 08 14:21:03 crc kubenswrapper[4735]: I1008 14:21:03.080990 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bdcbf0a-e0f2-493a-b1cf-4cbbf31e20fa-kube-api-access-h6d2l" (OuterVolumeSpecName: "kube-api-access-h6d2l") pod "6bdcbf0a-e0f2-493a-b1cf-4cbbf31e20fa" (UID: "6bdcbf0a-e0f2-493a-b1cf-4cbbf31e20fa"). InnerVolumeSpecName "kube-api-access-h6d2l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:21:03 crc kubenswrapper[4735]: I1008 14:21:03.133372 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Oct 08 14:21:03 crc kubenswrapper[4735]: E1008 14:21:03.133894 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bdcbf0a-e0f2-493a-b1cf-4cbbf31e20fa" containerName="mariadb-client" Oct 08 14:21:03 crc kubenswrapper[4735]: I1008 14:21:03.133919 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bdcbf0a-e0f2-493a-b1cf-4cbbf31e20fa" containerName="mariadb-client" Oct 08 14:21:03 crc kubenswrapper[4735]: I1008 14:21:03.134135 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bdcbf0a-e0f2-493a-b1cf-4cbbf31e20fa" containerName="mariadb-client" Oct 08 14:21:03 crc kubenswrapper[4735]: I1008 14:21:03.134931 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Oct 08 14:21:03 crc kubenswrapper[4735]: I1008 14:21:03.138983 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Oct 08 14:21:03 crc kubenswrapper[4735]: I1008 14:21:03.173269 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6d2l\" (UniqueName: \"kubernetes.io/projected/6bdcbf0a-e0f2-493a-b1cf-4cbbf31e20fa-kube-api-access-h6d2l\") on node \"crc\" DevicePath \"\"" Oct 08 14:21:03 crc kubenswrapper[4735]: I1008 14:21:03.274963 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgdck\" (UniqueName: \"kubernetes.io/projected/b3c7e0d0-127d-4250-a1da-28fdb8b7015d-kube-api-access-kgdck\") pod \"mariadb-client\" (UID: \"b3c7e0d0-127d-4250-a1da-28fdb8b7015d\") " pod="openstack/mariadb-client" Oct 08 14:21:03 crc kubenswrapper[4735]: I1008 14:21:03.376869 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgdck\" (UniqueName: 
\"kubernetes.io/projected/b3c7e0d0-127d-4250-a1da-28fdb8b7015d-kube-api-access-kgdck\") pod \"mariadb-client\" (UID: \"b3c7e0d0-127d-4250-a1da-28fdb8b7015d\") " pod="openstack/mariadb-client" Oct 08 14:21:03 crc kubenswrapper[4735]: I1008 14:21:03.394078 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgdck\" (UniqueName: \"kubernetes.io/projected/b3c7e0d0-127d-4250-a1da-28fdb8b7015d-kube-api-access-kgdck\") pod \"mariadb-client\" (UID: \"b3c7e0d0-127d-4250-a1da-28fdb8b7015d\") " pod="openstack/mariadb-client" Oct 08 14:21:03 crc kubenswrapper[4735]: I1008 14:21:03.450741 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Oct 08 14:21:03 crc kubenswrapper[4735]: I1008 14:21:03.538579 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8170b91cfb258d257363cf9d14a6ff84de3067a111be84c119a4587443266cf4" Oct 08 14:21:03 crc kubenswrapper[4735]: I1008 14:21:03.538648 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Oct 08 14:21:03 crc kubenswrapper[4735]: I1008 14:21:03.560588 4735 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/mariadb-client" oldPodUID="6bdcbf0a-e0f2-493a-b1cf-4cbbf31e20fa" podUID="b3c7e0d0-127d-4250-a1da-28fdb8b7015d" Oct 08 14:21:03 crc kubenswrapper[4735]: I1008 14:21:03.888236 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Oct 08 14:21:04 crc kubenswrapper[4735]: I1008 14:21:04.549787 4735 generic.go:334] "Generic (PLEG): container finished" podID="b3c7e0d0-127d-4250-a1da-28fdb8b7015d" containerID="26a132e417e043b5b76ae625600e75cad8c71093239df4d949bd0193c5a0185b" exitCode=0 Oct 08 14:21:04 crc kubenswrapper[4735]: I1008 14:21:04.549915 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"b3c7e0d0-127d-4250-a1da-28fdb8b7015d","Type":"ContainerDied","Data":"26a132e417e043b5b76ae625600e75cad8c71093239df4d949bd0193c5a0185b"} Oct 08 14:21:04 crc kubenswrapper[4735]: I1008 14:21:04.550140 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"b3c7e0d0-127d-4250-a1da-28fdb8b7015d","Type":"ContainerStarted","Data":"85fec3f9f1e0d7661bb72f799509eaac30aa981c1c24de46f20fc6d0389081c9"} Oct 08 14:21:04 crc kubenswrapper[4735]: I1008 14:21:04.908182 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bdcbf0a-e0f2-493a-b1cf-4cbbf31e20fa" path="/var/lib/kubelet/pods/6bdcbf0a-e0f2-493a-b1cf-4cbbf31e20fa/volumes" Oct 08 14:21:05 crc kubenswrapper[4735]: I1008 14:21:05.865915 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Oct 08 14:21:05 crc kubenswrapper[4735]: I1008 14:21:05.885388 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_b3c7e0d0-127d-4250-a1da-28fdb8b7015d/mariadb-client/0.log" Oct 08 14:21:05 crc kubenswrapper[4735]: I1008 14:21:05.915875 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Oct 08 14:21:05 crc kubenswrapper[4735]: I1008 14:21:05.921558 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Oct 08 14:21:06 crc kubenswrapper[4735]: I1008 14:21:06.014750 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgdck\" (UniqueName: \"kubernetes.io/projected/b3c7e0d0-127d-4250-a1da-28fdb8b7015d-kube-api-access-kgdck\") pod \"b3c7e0d0-127d-4250-a1da-28fdb8b7015d\" (UID: \"b3c7e0d0-127d-4250-a1da-28fdb8b7015d\") " Oct 08 14:21:06 crc kubenswrapper[4735]: I1008 14:21:06.021326 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3c7e0d0-127d-4250-a1da-28fdb8b7015d-kube-api-access-kgdck" (OuterVolumeSpecName: "kube-api-access-kgdck") pod "b3c7e0d0-127d-4250-a1da-28fdb8b7015d" (UID: "b3c7e0d0-127d-4250-a1da-28fdb8b7015d"). InnerVolumeSpecName "kube-api-access-kgdck". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:21:06 crc kubenswrapper[4735]: I1008 14:21:06.116344 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kgdck\" (UniqueName: \"kubernetes.io/projected/b3c7e0d0-127d-4250-a1da-28fdb8b7015d-kube-api-access-kgdck\") on node \"crc\" DevicePath \"\"" Oct 08 14:21:06 crc kubenswrapper[4735]: I1008 14:21:06.567451 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85fec3f9f1e0d7661bb72f799509eaac30aa981c1c24de46f20fc6d0389081c9" Oct 08 14:21:06 crc kubenswrapper[4735]: I1008 14:21:06.567631 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Oct 08 14:21:06 crc kubenswrapper[4735]: I1008 14:21:06.904119 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3c7e0d0-127d-4250-a1da-28fdb8b7015d" path="/var/lib/kubelet/pods/b3c7e0d0-127d-4250-a1da-28fdb8b7015d/volumes" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.378748 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 08 14:21:41 crc kubenswrapper[4735]: E1008 14:21:41.379764 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3c7e0d0-127d-4250-a1da-28fdb8b7015d" containerName="mariadb-client" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.379781 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3c7e0d0-127d-4250-a1da-28fdb8b7015d" containerName="mariadb-client" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.379962 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3c7e0d0-127d-4250-a1da-28fdb8b7015d" containerName="mariadb-client" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.380765 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.383130 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.383230 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.385337 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-5drf8" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.398516 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.406391 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-2"] Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.408367 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.412108 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-1"] Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.413352 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-1" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.421234 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.440392 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.525302 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/48cb53e0-bc62-4285-9aa3-c5e5958255bc-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"48cb53e0-bc62-4285-9aa3-c5e5958255bc\") " pod="openstack/ovsdbserver-nb-0" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.525360 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f7f8e7e7-4056-498c-b7c4-cedd842f264f-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"f7f8e7e7-4056-498c-b7c4-cedd842f264f\") " pod="openstack/ovsdbserver-nb-1" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.525405 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1792695-0298-4cc1-868e-4a3c2e7ca957-config\") pod \"ovsdbserver-nb-2\" (UID: \"d1792695-0298-4cc1-868e-4a3c2e7ca957\") " pod="openstack/ovsdbserver-nb-2" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.525435 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d1792695-0298-4cc1-868e-4a3c2e7ca957-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"d1792695-0298-4cc1-868e-4a3c2e7ca957\") " pod="openstack/ovsdbserver-nb-2" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.525503 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"pvc-b727d4bb-055d-4bf5-b4dd-c1275062dd72\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b727d4bb-055d-4bf5-b4dd-c1275062dd72\") pod \"ovsdbserver-nb-1\" (UID: \"f7f8e7e7-4056-498c-b7c4-cedd842f264f\") " pod="openstack/ovsdbserver-nb-1" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.525531 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f7f8e7e7-4056-498c-b7c4-cedd842f264f-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"f7f8e7e7-4056-498c-b7c4-cedd842f264f\") " pod="openstack/ovsdbserver-nb-1" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.525568 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48cb53e0-bc62-4285-9aa3-c5e5958255bc-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"48cb53e0-bc62-4285-9aa3-c5e5958255bc\") " pod="openstack/ovsdbserver-nb-0" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.525594 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d1792695-0298-4cc1-868e-4a3c2e7ca957-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"d1792695-0298-4cc1-868e-4a3c2e7ca957\") " pod="openstack/ovsdbserver-nb-2" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.525628 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfj2f\" (UniqueName: \"kubernetes.io/projected/f7f8e7e7-4056-498c-b7c4-cedd842f264f-kube-api-access-zfj2f\") pod \"ovsdbserver-nb-1\" (UID: \"f7f8e7e7-4056-498c-b7c4-cedd842f264f\") " pod="openstack/ovsdbserver-nb-1" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.525653 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-6dsld\" (UniqueName: \"kubernetes.io/projected/d1792695-0298-4cc1-868e-4a3c2e7ca957-kube-api-access-6dsld\") pod \"ovsdbserver-nb-2\" (UID: \"d1792695-0298-4cc1-868e-4a3c2e7ca957\") " pod="openstack/ovsdbserver-nb-2" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.525685 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1792695-0298-4cc1-868e-4a3c2e7ca957-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"d1792695-0298-4cc1-868e-4a3c2e7ca957\") " pod="openstack/ovsdbserver-nb-2" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.525710 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7f8e7e7-4056-498c-b7c4-cedd842f264f-config\") pod \"ovsdbserver-nb-1\" (UID: \"f7f8e7e7-4056-498c-b7c4-cedd842f264f\") " pod="openstack/ovsdbserver-nb-1" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.525742 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-1bf10f17-08ba-4b91-a9ac-fd0188633b3b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1bf10f17-08ba-4b91-a9ac-fd0188633b3b\") pod \"ovsdbserver-nb-2\" (UID: \"d1792695-0298-4cc1-868e-4a3c2e7ca957\") " pod="openstack/ovsdbserver-nb-2" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.525889 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plb2q\" (UniqueName: \"kubernetes.io/projected/48cb53e0-bc62-4285-9aa3-c5e5958255bc-kube-api-access-plb2q\") pod \"ovsdbserver-nb-0\" (UID: \"48cb53e0-bc62-4285-9aa3-c5e5958255bc\") " pod="openstack/ovsdbserver-nb-0" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.525959 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/48cb53e0-bc62-4285-9aa3-c5e5958255bc-config\") pod \"ovsdbserver-nb-0\" (UID: \"48cb53e0-bc62-4285-9aa3-c5e5958255bc\") " pod="openstack/ovsdbserver-nb-0" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.526183 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/48cb53e0-bc62-4285-9aa3-c5e5958255bc-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"48cb53e0-bc62-4285-9aa3-c5e5958255bc\") " pod="openstack/ovsdbserver-nb-0" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.526236 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-3f606aa0-8a96-46b9-968c-221d0a5ef1a2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f606aa0-8a96-46b9-968c-221d0a5ef1a2\") pod \"ovsdbserver-nb-0\" (UID: \"48cb53e0-bc62-4285-9aa3-c5e5958255bc\") " pod="openstack/ovsdbserver-nb-0" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.526269 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7f8e7e7-4056-498c-b7c4-cedd842f264f-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"f7f8e7e7-4056-498c-b7c4-cedd842f264f\") " pod="openstack/ovsdbserver-nb-1" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.586524 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.590557 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.592593 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.592980 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.593106 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-9k8sj" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.600089 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.604576 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-2"] Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.605937 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.620159 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-1"] Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.622825 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-1" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.623134 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.627537 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfj2f\" (UniqueName: \"kubernetes.io/projected/f7f8e7e7-4056-498c-b7c4-cedd842f264f-kube-api-access-zfj2f\") pod \"ovsdbserver-nb-1\" (UID: \"f7f8e7e7-4056-498c-b7c4-cedd842f264f\") " pod="openstack/ovsdbserver-nb-1" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.627735 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dsld\" (UniqueName: \"kubernetes.io/projected/d1792695-0298-4cc1-868e-4a3c2e7ca957-kube-api-access-6dsld\") pod \"ovsdbserver-nb-2\" (UID: \"d1792695-0298-4cc1-868e-4a3c2e7ca957\") " pod="openstack/ovsdbserver-nb-2" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.628018 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7f8e7e7-4056-498c-b7c4-cedd842f264f-config\") pod \"ovsdbserver-nb-1\" (UID: \"f7f8e7e7-4056-498c-b7c4-cedd842f264f\") " pod="openstack/ovsdbserver-nb-1" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.628090 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1792695-0298-4cc1-868e-4a3c2e7ca957-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"d1792695-0298-4cc1-868e-4a3c2e7ca957\") " pod="openstack/ovsdbserver-nb-2" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.628159 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-1bf10f17-08ba-4b91-a9ac-fd0188633b3b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1bf10f17-08ba-4b91-a9ac-fd0188633b3b\") pod 
\"ovsdbserver-nb-2\" (UID: \"d1792695-0298-4cc1-868e-4a3c2e7ca957\") " pod="openstack/ovsdbserver-nb-2" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.628265 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plb2q\" (UniqueName: \"kubernetes.io/projected/48cb53e0-bc62-4285-9aa3-c5e5958255bc-kube-api-access-plb2q\") pod \"ovsdbserver-nb-0\" (UID: \"48cb53e0-bc62-4285-9aa3-c5e5958255bc\") " pod="openstack/ovsdbserver-nb-0" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.628344 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48cb53e0-bc62-4285-9aa3-c5e5958255bc-config\") pod \"ovsdbserver-nb-0\" (UID: \"48cb53e0-bc62-4285-9aa3-c5e5958255bc\") " pod="openstack/ovsdbserver-nb-0" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.628412 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/48cb53e0-bc62-4285-9aa3-c5e5958255bc-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"48cb53e0-bc62-4285-9aa3-c5e5958255bc\") " pod="openstack/ovsdbserver-nb-0" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.628503 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-3f606aa0-8a96-46b9-968c-221d0a5ef1a2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f606aa0-8a96-46b9-968c-221d0a5ef1a2\") pod \"ovsdbserver-nb-0\" (UID: \"48cb53e0-bc62-4285-9aa3-c5e5958255bc\") " pod="openstack/ovsdbserver-nb-0" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.628581 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7f8e7e7-4056-498c-b7c4-cedd842f264f-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"f7f8e7e7-4056-498c-b7c4-cedd842f264f\") " pod="openstack/ovsdbserver-nb-1" Oct 08 14:21:41 crc 
kubenswrapper[4735]: I1008 14:21:41.628684 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/48cb53e0-bc62-4285-9aa3-c5e5958255bc-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"48cb53e0-bc62-4285-9aa3-c5e5958255bc\") " pod="openstack/ovsdbserver-nb-0" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.628756 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f7f8e7e7-4056-498c-b7c4-cedd842f264f-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"f7f8e7e7-4056-498c-b7c4-cedd842f264f\") " pod="openstack/ovsdbserver-nb-1" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.628960 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1792695-0298-4cc1-868e-4a3c2e7ca957-config\") pod \"ovsdbserver-nb-2\" (UID: \"d1792695-0298-4cc1-868e-4a3c2e7ca957\") " pod="openstack/ovsdbserver-nb-2" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.629078 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d1792695-0298-4cc1-868e-4a3c2e7ca957-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"d1792695-0298-4cc1-868e-4a3c2e7ca957\") " pod="openstack/ovsdbserver-nb-2" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.629166 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f7f8e7e7-4056-498c-b7c4-cedd842f264f-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"f7f8e7e7-4056-498c-b7c4-cedd842f264f\") " pod="openstack/ovsdbserver-nb-1" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.629239 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b727d4bb-055d-4bf5-b4dd-c1275062dd72\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b727d4bb-055d-4bf5-b4dd-c1275062dd72\") pod \"ovsdbserver-nb-1\" (UID: \"f7f8e7e7-4056-498c-b7c4-cedd842f264f\") " pod="openstack/ovsdbserver-nb-1" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.629337 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48cb53e0-bc62-4285-9aa3-c5e5958255bc-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"48cb53e0-bc62-4285-9aa3-c5e5958255bc\") " pod="openstack/ovsdbserver-nb-0" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.629414 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d1792695-0298-4cc1-868e-4a3c2e7ca957-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"d1792695-0298-4cc1-868e-4a3c2e7ca957\") " pod="openstack/ovsdbserver-nb-2" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.629719 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d1792695-0298-4cc1-868e-4a3c2e7ca957-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"d1792695-0298-4cc1-868e-4a3c2e7ca957\") " pod="openstack/ovsdbserver-nb-2" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.630412 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1792695-0298-4cc1-868e-4a3c2e7ca957-config\") pod \"ovsdbserver-nb-2\" (UID: \"d1792695-0298-4cc1-868e-4a3c2e7ca957\") " pod="openstack/ovsdbserver-nb-2" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.630616 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d1792695-0298-4cc1-868e-4a3c2e7ca957-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"d1792695-0298-4cc1-868e-4a3c2e7ca957\") " pod="openstack/ovsdbserver-nb-2" Oct 08 14:21:41 crc 
kubenswrapper[4735]: I1008 14:21:41.631242 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f7f8e7e7-4056-498c-b7c4-cedd842f264f-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"f7f8e7e7-4056-498c-b7c4-cedd842f264f\") " pod="openstack/ovsdbserver-nb-1" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.630434 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48cb53e0-bc62-4285-9aa3-c5e5958255bc-config\") pod \"ovsdbserver-nb-0\" (UID: \"48cb53e0-bc62-4285-9aa3-c5e5958255bc\") " pod="openstack/ovsdbserver-nb-0" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.629207 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7f8e7e7-4056-498c-b7c4-cedd842f264f-config\") pod \"ovsdbserver-nb-1\" (UID: \"f7f8e7e7-4056-498c-b7c4-cedd842f264f\") " pod="openstack/ovsdbserver-nb-1" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.629254 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.631724 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/48cb53e0-bc62-4285-9aa3-c5e5958255bc-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"48cb53e0-bc62-4285-9aa3-c5e5958255bc\") " pod="openstack/ovsdbserver-nb-0" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.632704 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/48cb53e0-bc62-4285-9aa3-c5e5958255bc-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"48cb53e0-bc62-4285-9aa3-c5e5958255bc\") " pod="openstack/ovsdbserver-nb-0" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.633180 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/f7f8e7e7-4056-498c-b7c4-cedd842f264f-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"f7f8e7e7-4056-498c-b7c4-cedd842f264f\") " pod="openstack/ovsdbserver-nb-1" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.635810 4735 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.635846 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-1bf10f17-08ba-4b91-a9ac-fd0188633b3b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1bf10f17-08ba-4b91-a9ac-fd0188633b3b\") pod \"ovsdbserver-nb-2\" (UID: \"d1792695-0298-4cc1-868e-4a3c2e7ca957\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/0146cd0da87f330b8afb85d107c5042f7b022830654dca81112e54df572c4b9e/globalmount\"" pod="openstack/ovsdbserver-nb-2" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.636045 4735 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.636122 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-3f606aa0-8a96-46b9-968c-221d0a5ef1a2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f606aa0-8a96-46b9-968c-221d0a5ef1a2\") pod \"ovsdbserver-nb-0\" (UID: \"48cb53e0-bc62-4285-9aa3-c5e5958255bc\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/0127dc9640e0be8e088bc49b30c5f6f65139e3d56d1c4321404914c38036ec67/globalmount\"" pod="openstack/ovsdbserver-nb-0" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.636254 4735 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.636340 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b727d4bb-055d-4bf5-b4dd-c1275062dd72\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b727d4bb-055d-4bf5-b4dd-c1275062dd72\") pod \"ovsdbserver-nb-1\" (UID: \"f7f8e7e7-4056-498c-b7c4-cedd842f264f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9d8fd264113175234ad8600abb71cb3af7cffe3a8c2ae4b80842183a3c9d867a/globalmount\"" pod="openstack/ovsdbserver-nb-1" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.637113 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7f8e7e7-4056-498c-b7c4-cedd842f264f-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"f7f8e7e7-4056-498c-b7c4-cedd842f264f\") " pod="openstack/ovsdbserver-nb-1" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.637557 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48cb53e0-bc62-4285-9aa3-c5e5958255bc-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"48cb53e0-bc62-4285-9aa3-c5e5958255bc\") " pod="openstack/ovsdbserver-nb-0" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.638613 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1792695-0298-4cc1-868e-4a3c2e7ca957-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"d1792695-0298-4cc1-868e-4a3c2e7ca957\") " pod="openstack/ovsdbserver-nb-2" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.659406 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plb2q\" (UniqueName: \"kubernetes.io/projected/48cb53e0-bc62-4285-9aa3-c5e5958255bc-kube-api-access-plb2q\") pod \"ovsdbserver-nb-0\" (UID: 
\"48cb53e0-bc62-4285-9aa3-c5e5958255bc\") " pod="openstack/ovsdbserver-nb-0" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.659814 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dsld\" (UniqueName: \"kubernetes.io/projected/d1792695-0298-4cc1-868e-4a3c2e7ca957-kube-api-access-6dsld\") pod \"ovsdbserver-nb-2\" (UID: \"d1792695-0298-4cc1-868e-4a3c2e7ca957\") " pod="openstack/ovsdbserver-nb-2" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.663497 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfj2f\" (UniqueName: \"kubernetes.io/projected/f7f8e7e7-4056-498c-b7c4-cedd842f264f-kube-api-access-zfj2f\") pod \"ovsdbserver-nb-1\" (UID: \"f7f8e7e7-4056-498c-b7c4-cedd842f264f\") " pod="openstack/ovsdbserver-nb-1" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.671226 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-3f606aa0-8a96-46b9-968c-221d0a5ef1a2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f606aa0-8a96-46b9-968c-221d0a5ef1a2\") pod \"ovsdbserver-nb-0\" (UID: \"48cb53e0-bc62-4285-9aa3-c5e5958255bc\") " pod="openstack/ovsdbserver-nb-0" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.674915 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-1bf10f17-08ba-4b91-a9ac-fd0188633b3b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1bf10f17-08ba-4b91-a9ac-fd0188633b3b\") pod \"ovsdbserver-nb-2\" (UID: \"d1792695-0298-4cc1-868e-4a3c2e7ca957\") " pod="openstack/ovsdbserver-nb-2" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.678615 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b727d4bb-055d-4bf5-b4dd-c1275062dd72\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b727d4bb-055d-4bf5-b4dd-c1275062dd72\") pod \"ovsdbserver-nb-1\" (UID: \"f7f8e7e7-4056-498c-b7c4-cedd842f264f\") " 
pod="openstack/ovsdbserver-nb-1" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.700185 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.730560 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.731064 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9b3c1ba5-f809-4ccb-b29c-5bb76169819f-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"9b3c1ba5-f809-4ccb-b29c-5bb76169819f\") " pod="openstack/ovsdbserver-sb-0" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.731114 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b3c1ba5-f809-4ccb-b29c-5bb76169819f-config\") pod \"ovsdbserver-sb-0\" (UID: \"9b3c1ba5-f809-4ccb-b29c-5bb76169819f\") " pod="openstack/ovsdbserver-sb-0" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.731134 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ebbd73a-21ad-4baa-8fec-68e4ae05591b-config\") pod \"ovsdbserver-sb-1\" (UID: \"3ebbd73a-21ad-4baa-8fec-68e4ae05591b\") " pod="openstack/ovsdbserver-sb-1" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.731152 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d2f2c46e-a1db-40c4-9dcc-6c0550c80cfe-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"d2f2c46e-a1db-40c4-9dcc-6c0550c80cfe\") " pod="openstack/ovsdbserver-sb-2" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.731171 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b3c1ba5-f809-4ccb-b29c-5bb76169819f-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"9b3c1ba5-f809-4ccb-b29c-5bb76169819f\") " pod="openstack/ovsdbserver-sb-0" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.731286 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42jkg\" (UniqueName: \"kubernetes.io/projected/3ebbd73a-21ad-4baa-8fec-68e4ae05591b-kube-api-access-42jkg\") pod \"ovsdbserver-sb-1\" (UID: \"3ebbd73a-21ad-4baa-8fec-68e4ae05591b\") " pod="openstack/ovsdbserver-sb-1" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.731391 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-23ce3c87-f0ff-4fe0-8d9b-eff9951ce4ff\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-23ce3c87-f0ff-4fe0-8d9b-eff9951ce4ff\") pod \"ovsdbserver-sb-1\" (UID: \"3ebbd73a-21ad-4baa-8fec-68e4ae05591b\") " pod="openstack/ovsdbserver-sb-1" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.731595 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-0e407434-85b0-4f2f-a361-ece49ebcd0cf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0e407434-85b0-4f2f-a361-ece49ebcd0cf\") pod \"ovsdbserver-sb-0\" (UID: \"9b3c1ba5-f809-4ccb-b29c-5bb76169819f\") " pod="openstack/ovsdbserver-sb-0" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.731630 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2f2c46e-a1db-40c4-9dcc-6c0550c80cfe-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"d2f2c46e-a1db-40c4-9dcc-6c0550c80cfe\") " pod="openstack/ovsdbserver-sb-2" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.731672 
4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9b3c1ba5-f809-4ccb-b29c-5bb76169819f-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"9b3c1ba5-f809-4ccb-b29c-5bb76169819f\") " pod="openstack/ovsdbserver-sb-0" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.731871 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d2f2c46e-a1db-40c4-9dcc-6c0550c80cfe-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"d2f2c46e-a1db-40c4-9dcc-6c0550c80cfe\") " pod="openstack/ovsdbserver-sb-2" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.731908 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3ebbd73a-21ad-4baa-8fec-68e4ae05591b-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"3ebbd73a-21ad-4baa-8fec-68e4ae05591b\") " pod="openstack/ovsdbserver-sb-1" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.731995 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5950b536-dede-4557-886d-faab12fe722f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5950b536-dede-4557-886d-faab12fe722f\") pod \"ovsdbserver-sb-2\" (UID: \"d2f2c46e-a1db-40c4-9dcc-6c0550c80cfe\") " pod="openstack/ovsdbserver-sb-2" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.732035 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2f2c46e-a1db-40c4-9dcc-6c0550c80cfe-config\") pod \"ovsdbserver-sb-2\" (UID: \"d2f2c46e-a1db-40c4-9dcc-6c0550c80cfe\") " pod="openstack/ovsdbserver-sb-2" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.732061 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-6j8zz\" (UniqueName: \"kubernetes.io/projected/d2f2c46e-a1db-40c4-9dcc-6c0550c80cfe-kube-api-access-6j8zz\") pod \"ovsdbserver-sb-2\" (UID: \"d2f2c46e-a1db-40c4-9dcc-6c0550c80cfe\") " pod="openstack/ovsdbserver-sb-2" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.732109 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3ebbd73a-21ad-4baa-8fec-68e4ae05591b-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"3ebbd73a-21ad-4baa-8fec-68e4ae05591b\") " pod="openstack/ovsdbserver-sb-1" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.732138 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rc76g\" (UniqueName: \"kubernetes.io/projected/9b3c1ba5-f809-4ccb-b29c-5bb76169819f-kube-api-access-rc76g\") pod \"ovsdbserver-sb-0\" (UID: \"9b3c1ba5-f809-4ccb-b29c-5bb76169819f\") " pod="openstack/ovsdbserver-sb-0" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.732183 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ebbd73a-21ad-4baa-8fec-68e4ae05591b-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"3ebbd73a-21ad-4baa-8fec-68e4ae05591b\") " pod="openstack/ovsdbserver-sb-1" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.743777 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-1" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.833555 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9b3c1ba5-f809-4ccb-b29c-5bb76169819f-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"9b3c1ba5-f809-4ccb-b29c-5bb76169819f\") " pod="openstack/ovsdbserver-sb-0" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.833612 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b3c1ba5-f809-4ccb-b29c-5bb76169819f-config\") pod \"ovsdbserver-sb-0\" (UID: \"9b3c1ba5-f809-4ccb-b29c-5bb76169819f\") " pod="openstack/ovsdbserver-sb-0" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.833631 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ebbd73a-21ad-4baa-8fec-68e4ae05591b-config\") pod \"ovsdbserver-sb-1\" (UID: \"3ebbd73a-21ad-4baa-8fec-68e4ae05591b\") " pod="openstack/ovsdbserver-sb-1" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.833649 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d2f2c46e-a1db-40c4-9dcc-6c0550c80cfe-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"d2f2c46e-a1db-40c4-9dcc-6c0550c80cfe\") " pod="openstack/ovsdbserver-sb-2" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.833668 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b3c1ba5-f809-4ccb-b29c-5bb76169819f-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"9b3c1ba5-f809-4ccb-b29c-5bb76169819f\") " pod="openstack/ovsdbserver-sb-0" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.833694 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-42jkg\" (UniqueName: \"kubernetes.io/projected/3ebbd73a-21ad-4baa-8fec-68e4ae05591b-kube-api-access-42jkg\") pod \"ovsdbserver-sb-1\" (UID: \"3ebbd73a-21ad-4baa-8fec-68e4ae05591b\") " pod="openstack/ovsdbserver-sb-1" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.833713 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-23ce3c87-f0ff-4fe0-8d9b-eff9951ce4ff\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-23ce3c87-f0ff-4fe0-8d9b-eff9951ce4ff\") pod \"ovsdbserver-sb-1\" (UID: \"3ebbd73a-21ad-4baa-8fec-68e4ae05591b\") " pod="openstack/ovsdbserver-sb-1" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.833753 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-0e407434-85b0-4f2f-a361-ece49ebcd0cf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0e407434-85b0-4f2f-a361-ece49ebcd0cf\") pod \"ovsdbserver-sb-0\" (UID: \"9b3c1ba5-f809-4ccb-b29c-5bb76169819f\") " pod="openstack/ovsdbserver-sb-0" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.833774 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2f2c46e-a1db-40c4-9dcc-6c0550c80cfe-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"d2f2c46e-a1db-40c4-9dcc-6c0550c80cfe\") " pod="openstack/ovsdbserver-sb-2" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.833797 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9b3c1ba5-f809-4ccb-b29c-5bb76169819f-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"9b3c1ba5-f809-4ccb-b29c-5bb76169819f\") " pod="openstack/ovsdbserver-sb-0" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.833833 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/d2f2c46e-a1db-40c4-9dcc-6c0550c80cfe-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"d2f2c46e-a1db-40c4-9dcc-6c0550c80cfe\") " pod="openstack/ovsdbserver-sb-2" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.833856 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3ebbd73a-21ad-4baa-8fec-68e4ae05591b-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"3ebbd73a-21ad-4baa-8fec-68e4ae05591b\") " pod="openstack/ovsdbserver-sb-1" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.833887 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-5950b536-dede-4557-886d-faab12fe722f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5950b536-dede-4557-886d-faab12fe722f\") pod \"ovsdbserver-sb-2\" (UID: \"d2f2c46e-a1db-40c4-9dcc-6c0550c80cfe\") " pod="openstack/ovsdbserver-sb-2" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.833907 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2f2c46e-a1db-40c4-9dcc-6c0550c80cfe-config\") pod \"ovsdbserver-sb-2\" (UID: \"d2f2c46e-a1db-40c4-9dcc-6c0550c80cfe\") " pod="openstack/ovsdbserver-sb-2" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.833926 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6j8zz\" (UniqueName: \"kubernetes.io/projected/d2f2c46e-a1db-40c4-9dcc-6c0550c80cfe-kube-api-access-6j8zz\") pod \"ovsdbserver-sb-2\" (UID: \"d2f2c46e-a1db-40c4-9dcc-6c0550c80cfe\") " pod="openstack/ovsdbserver-sb-2" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.834651 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d2f2c46e-a1db-40c4-9dcc-6c0550c80cfe-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"d2f2c46e-a1db-40c4-9dcc-6c0550c80cfe\") " 
pod="openstack/ovsdbserver-sb-2" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.835330 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b3c1ba5-f809-4ccb-b29c-5bb76169819f-config\") pod \"ovsdbserver-sb-0\" (UID: \"9b3c1ba5-f809-4ccb-b29c-5bb76169819f\") " pod="openstack/ovsdbserver-sb-0" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.835522 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ebbd73a-21ad-4baa-8fec-68e4ae05591b-config\") pod \"ovsdbserver-sb-1\" (UID: \"3ebbd73a-21ad-4baa-8fec-68e4ae05591b\") " pod="openstack/ovsdbserver-sb-1" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.835590 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9b3c1ba5-f809-4ccb-b29c-5bb76169819f-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"9b3c1ba5-f809-4ccb-b29c-5bb76169819f\") " pod="openstack/ovsdbserver-sb-0" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.836042 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3ebbd73a-21ad-4baa-8fec-68e4ae05591b-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"3ebbd73a-21ad-4baa-8fec-68e4ae05591b\") " pod="openstack/ovsdbserver-sb-1" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.836135 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d2f2c46e-a1db-40c4-9dcc-6c0550c80cfe-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"d2f2c46e-a1db-40c4-9dcc-6c0550c80cfe\") " pod="openstack/ovsdbserver-sb-2" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.836389 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3ebbd73a-21ad-4baa-8fec-68e4ae05591b-ovsdb-rundir\") 
pod \"ovsdbserver-sb-1\" (UID: \"3ebbd73a-21ad-4baa-8fec-68e4ae05591b\") " pod="openstack/ovsdbserver-sb-1" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.833965 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3ebbd73a-21ad-4baa-8fec-68e4ae05591b-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"3ebbd73a-21ad-4baa-8fec-68e4ae05591b\") " pod="openstack/ovsdbserver-sb-1" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.838620 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rc76g\" (UniqueName: \"kubernetes.io/projected/9b3c1ba5-f809-4ccb-b29c-5bb76169819f-kube-api-access-rc76g\") pod \"ovsdbserver-sb-0\" (UID: \"9b3c1ba5-f809-4ccb-b29c-5bb76169819f\") " pod="openstack/ovsdbserver-sb-0" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.838686 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ebbd73a-21ad-4baa-8fec-68e4ae05591b-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"3ebbd73a-21ad-4baa-8fec-68e4ae05591b\") " pod="openstack/ovsdbserver-sb-1" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.839322 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2f2c46e-a1db-40c4-9dcc-6c0550c80cfe-config\") pod \"ovsdbserver-sb-2\" (UID: \"d2f2c46e-a1db-40c4-9dcc-6c0550c80cfe\") " pod="openstack/ovsdbserver-sb-2" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.840081 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2f2c46e-a1db-40c4-9dcc-6c0550c80cfe-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"d2f2c46e-a1db-40c4-9dcc-6c0550c80cfe\") " pod="openstack/ovsdbserver-sb-2" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.840098 4735 
csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.840149 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-0e407434-85b0-4f2f-a361-ece49ebcd0cf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0e407434-85b0-4f2f-a361-ece49ebcd0cf\") pod \"ovsdbserver-sb-0\" (UID: \"9b3c1ba5-f809-4ccb-b29c-5bb76169819f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/bc43e2019d49f5f40858315cf09f5a8c213ac3660dcbb17c907f556677e7312f/globalmount\"" pod="openstack/ovsdbserver-sb-0" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.840371 4735 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.840405 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5950b536-dede-4557-886d-faab12fe722f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5950b536-dede-4557-886d-faab12fe722f\") pod \"ovsdbserver-sb-2\" (UID: \"d2f2c46e-a1db-40c4-9dcc-6c0550c80cfe\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/efd000e5e83a9c0c0074c379a91800d8c7fd1f4f5e62b91047ce04a03aa8ae78/globalmount\"" pod="openstack/ovsdbserver-sb-2" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.840604 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9b3c1ba5-f809-4ccb-b29c-5bb76169819f-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"9b3c1ba5-f809-4ccb-b29c-5bb76169819f\") " pod="openstack/ovsdbserver-sb-0" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.842714 4735 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. 
Skipping MountDevice... Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.842750 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-23ce3c87-f0ff-4fe0-8d9b-eff9951ce4ff\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-23ce3c87-f0ff-4fe0-8d9b-eff9951ce4ff\") pod \"ovsdbserver-sb-1\" (UID: \"3ebbd73a-21ad-4baa-8fec-68e4ae05591b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/edbf9016ed869ed16b9973abbb619cd16d0635d3ce87095375d2f773a7653f22/globalmount\"" pod="openstack/ovsdbserver-sb-1" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.847269 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b3c1ba5-f809-4ccb-b29c-5bb76169819f-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"9b3c1ba5-f809-4ccb-b29c-5bb76169819f\") " pod="openstack/ovsdbserver-sb-0" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.850675 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ebbd73a-21ad-4baa-8fec-68e4ae05591b-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"3ebbd73a-21ad-4baa-8fec-68e4ae05591b\") " pod="openstack/ovsdbserver-sb-1" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.853945 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6j8zz\" (UniqueName: \"kubernetes.io/projected/d2f2c46e-a1db-40c4-9dcc-6c0550c80cfe-kube-api-access-6j8zz\") pod \"ovsdbserver-sb-2\" (UID: \"d2f2c46e-a1db-40c4-9dcc-6c0550c80cfe\") " pod="openstack/ovsdbserver-sb-2" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.860617 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42jkg\" (UniqueName: \"kubernetes.io/projected/3ebbd73a-21ad-4baa-8fec-68e4ae05591b-kube-api-access-42jkg\") pod \"ovsdbserver-sb-1\" (UID: 
\"3ebbd73a-21ad-4baa-8fec-68e4ae05591b\") " pod="openstack/ovsdbserver-sb-1" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.864279 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rc76g\" (UniqueName: \"kubernetes.io/projected/9b3c1ba5-f809-4ccb-b29c-5bb76169819f-kube-api-access-rc76g\") pod \"ovsdbserver-sb-0\" (UID: \"9b3c1ba5-f809-4ccb-b29c-5bb76169819f\") " pod="openstack/ovsdbserver-sb-0" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.885957 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5950b536-dede-4557-886d-faab12fe722f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5950b536-dede-4557-886d-faab12fe722f\") pod \"ovsdbserver-sb-2\" (UID: \"d2f2c46e-a1db-40c4-9dcc-6c0550c80cfe\") " pod="openstack/ovsdbserver-sb-2" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.894902 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-0e407434-85b0-4f2f-a361-ece49ebcd0cf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0e407434-85b0-4f2f-a361-ece49ebcd0cf\") pod \"ovsdbserver-sb-0\" (UID: \"9b3c1ba5-f809-4ccb-b29c-5bb76169819f\") " pod="openstack/ovsdbserver-sb-0" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.898614 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-23ce3c87-f0ff-4fe0-8d9b-eff9951ce4ff\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-23ce3c87-f0ff-4fe0-8d9b-eff9951ce4ff\") pod \"ovsdbserver-sb-1\" (UID: \"3ebbd73a-21ad-4baa-8fec-68e4ae05591b\") " pod="openstack/ovsdbserver-sb-1" Oct 08 14:21:41 crc kubenswrapper[4735]: I1008 14:21:41.922875 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 08 14:21:42 crc kubenswrapper[4735]: I1008 14:21:42.018344 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-2" Oct 08 14:21:42 crc kubenswrapper[4735]: I1008 14:21:42.119022 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1" Oct 08 14:21:42 crc kubenswrapper[4735]: I1008 14:21:42.254199 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 08 14:21:42 crc kubenswrapper[4735]: I1008 14:21:42.338421 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Oct 08 14:21:42 crc kubenswrapper[4735]: W1008 14:21:42.355441 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1792695_0298_4cc1_868e_4a3c2e7ca957.slice/crio-7ea86fb59ae35de7a7a2e3ba80ad77ef88500c9a7a16902676ab411e1512453d WatchSource:0}: Error finding container 7ea86fb59ae35de7a7a2e3ba80ad77ef88500c9a7a16902676ab411e1512453d: Status 404 returned error can't find the container with id 7ea86fb59ae35de7a7a2e3ba80ad77ef88500c9a7a16902676ab411e1512453d Oct 08 14:21:42 crc kubenswrapper[4735]: I1008 14:21:42.436374 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 08 14:21:42 crc kubenswrapper[4735]: I1008 14:21:42.541442 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Oct 08 14:21:42 crc kubenswrapper[4735]: I1008 14:21:42.692742 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Oct 08 14:21:42 crc kubenswrapper[4735]: W1008 14:21:42.708275 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ebbd73a_21ad_4baa_8fec_68e4ae05591b.slice/crio-f96ba24172e3ccdcfc3cd59271c8c3be438991f11d59eb91b121eac4ec03c4d0 WatchSource:0}: Error finding container f96ba24172e3ccdcfc3cd59271c8c3be438991f11d59eb91b121eac4ec03c4d0: Status 404 returned error can't find the container 
with id f96ba24172e3ccdcfc3cd59271c8c3be438991f11d59eb91b121eac4ec03c4d0 Oct 08 14:21:42 crc kubenswrapper[4735]: I1008 14:21:42.915860 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"d2f2c46e-a1db-40c4-9dcc-6c0550c80cfe","Type":"ContainerStarted","Data":"0ec76a12497870d66a18313c60a860f91f269e87e9a0f11d681f567b7a84ef9c"} Oct 08 14:21:42 crc kubenswrapper[4735]: I1008 14:21:42.915901 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"3ebbd73a-21ad-4baa-8fec-68e4ae05591b","Type":"ContainerStarted","Data":"f96ba24172e3ccdcfc3cd59271c8c3be438991f11d59eb91b121eac4ec03c4d0"} Oct 08 14:21:42 crc kubenswrapper[4735]: I1008 14:21:42.915916 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"9b3c1ba5-f809-4ccb-b29c-5bb76169819f","Type":"ContainerStarted","Data":"4e3b0521bad839922b79518ef053c3e92dd06fc61a7c79108d89f75ed64426b1"} Oct 08 14:21:42 crc kubenswrapper[4735]: I1008 14:21:42.915927 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"9b3c1ba5-f809-4ccb-b29c-5bb76169819f","Type":"ContainerStarted","Data":"5bbac7a881c1cd92d51af6d50d7f55589d9670e0d331ced81b2c813f31a931c8"} Oct 08 14:21:42 crc kubenswrapper[4735]: I1008 14:21:42.915937 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"48cb53e0-bc62-4285-9aa3-c5e5958255bc","Type":"ContainerStarted","Data":"67635f7b1b49e355d945bac3596f936cd4b6e08970801ea0271faf7849eb3b14"} Oct 08 14:21:42 crc kubenswrapper[4735]: I1008 14:21:42.915949 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"48cb53e0-bc62-4285-9aa3-c5e5958255bc","Type":"ContainerStarted","Data":"98281f39b087f282ce0a209a76e85bde241cc93af91ef4df887ecd3ce053107f"} Oct 08 14:21:42 crc kubenswrapper[4735]: I1008 14:21:42.915962 4735 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"48cb53e0-bc62-4285-9aa3-c5e5958255bc","Type":"ContainerStarted","Data":"883fe68ae1ae2a692ee7cfbc74e1347793eabb7caf99a2251bfbd2ce73a3a548"} Oct 08 14:21:42 crc kubenswrapper[4735]: I1008 14:21:42.918205 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"d1792695-0298-4cc1-868e-4a3c2e7ca957","Type":"ContainerStarted","Data":"4b67928b0aca7c44a2735e0532f3e4dacebc21b1f79f7241aa881cc98845c2cf"} Oct 08 14:21:42 crc kubenswrapper[4735]: I1008 14:21:42.918239 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"d1792695-0298-4cc1-868e-4a3c2e7ca957","Type":"ContainerStarted","Data":"0247aef16ae7105e41b24e0290a4490011db7b037ae0f98a016f49cba0a9e6c8"} Oct 08 14:21:42 crc kubenswrapper[4735]: I1008 14:21:42.918252 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"d1792695-0298-4cc1-868e-4a3c2e7ca957","Type":"ContainerStarted","Data":"7ea86fb59ae35de7a7a2e3ba80ad77ef88500c9a7a16902676ab411e1512453d"} Oct 08 14:21:42 crc kubenswrapper[4735]: I1008 14:21:42.949333 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=2.949312489 podStartE2EDuration="2.949312489s" podCreationTimestamp="2025-10-08 14:21:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:21:42.93960243 +0000 UTC m=+5152.747906984" watchObservedRunningTime="2025-10-08 14:21:42.949312489 +0000 UTC m=+5152.757617033" Oct 08 14:21:42 crc kubenswrapper[4735]: I1008 14:21:42.983788 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-2" podStartSLOduration=2.983771651 podStartE2EDuration="2.983771651s" podCreationTimestamp="2025-10-08 14:21:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:21:42.974143105 +0000 UTC m=+5152.782447649" watchObservedRunningTime="2025-10-08 14:21:42.983771651 +0000 UTC m=+5152.792076195" Oct 08 14:21:43 crc kubenswrapper[4735]: W1008 14:21:43.089049 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7f8e7e7_4056_498c_b7c4_cedd842f264f.slice/crio-1fd26f8caf24ef6072a123352b3602de8d92008fcf1698f6323f1c0174eadf02 WatchSource:0}: Error finding container 1fd26f8caf24ef6072a123352b3602de8d92008fcf1698f6323f1c0174eadf02: Status 404 returned error can't find the container with id 1fd26f8caf24ef6072a123352b3602de8d92008fcf1698f6323f1c0174eadf02 Oct 08 14:21:43 crc kubenswrapper[4735]: I1008 14:21:43.103970 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Oct 08 14:21:43 crc kubenswrapper[4735]: I1008 14:21:43.931101 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"3ebbd73a-21ad-4baa-8fec-68e4ae05591b","Type":"ContainerStarted","Data":"8f5871cdd314d5e92647ce997e2a07f93e5bff13845cb0455d6075fe99f80f68"} Oct 08 14:21:43 crc kubenswrapper[4735]: I1008 14:21:43.931423 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"3ebbd73a-21ad-4baa-8fec-68e4ae05591b","Type":"ContainerStarted","Data":"a839d7b0020950b8a8439256f1e92e880571e255492736e4525cd88bd286cb47"} Oct 08 14:21:43 crc kubenswrapper[4735]: I1008 14:21:43.934540 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"9b3c1ba5-f809-4ccb-b29c-5bb76169819f","Type":"ContainerStarted","Data":"b97c25fb9f2af1b5a43ef03458a30f68210d104ff8bc4d2aa797da739a7c5b61"} Oct 08 14:21:43 crc kubenswrapper[4735]: I1008 14:21:43.936595 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" 
event={"ID":"f7f8e7e7-4056-498c-b7c4-cedd842f264f","Type":"ContainerStarted","Data":"bf4573e5b6266377c30afa02c02f15ded114a195a3005e71dba75609dfc6a979"} Oct 08 14:21:43 crc kubenswrapper[4735]: I1008 14:21:43.936636 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"f7f8e7e7-4056-498c-b7c4-cedd842f264f","Type":"ContainerStarted","Data":"21c5ae5309f198c299d3867bb246ac9b3632430f9a6e0096ad4a85fb1a21f7a8"} Oct 08 14:21:43 crc kubenswrapper[4735]: I1008 14:21:43.936648 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"f7f8e7e7-4056-498c-b7c4-cedd842f264f","Type":"ContainerStarted","Data":"1fd26f8caf24ef6072a123352b3602de8d92008fcf1698f6323f1c0174eadf02"} Oct 08 14:21:43 crc kubenswrapper[4735]: I1008 14:21:43.939422 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"d2f2c46e-a1db-40c4-9dcc-6c0550c80cfe","Type":"ContainerStarted","Data":"98edad068724a26024a259a46b7e089c0fa3bc761c4846eab0bfffd33ae91d69"} Oct 08 14:21:43 crc kubenswrapper[4735]: I1008 14:21:43.939487 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"d2f2c46e-a1db-40c4-9dcc-6c0550c80cfe","Type":"ContainerStarted","Data":"b86d36f8f8f5b6f5b692087489e4b15bae6b984d192307685398df0c97977e36"} Oct 08 14:21:43 crc kubenswrapper[4735]: I1008 14:21:43.957556 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-1" podStartSLOduration=3.957539067 podStartE2EDuration="3.957539067s" podCreationTimestamp="2025-10-08 14:21:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:21:43.956606384 +0000 UTC m=+5153.764910948" watchObservedRunningTime="2025-10-08 14:21:43.957539067 +0000 UTC m=+5153.765843611" Oct 08 14:21:43 crc kubenswrapper[4735]: I1008 14:21:43.983143 4735 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=3.983123233 podStartE2EDuration="3.983123233s" podCreationTimestamp="2025-10-08 14:21:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:21:43.97559376 +0000 UTC m=+5153.783898314" watchObservedRunningTime="2025-10-08 14:21:43.983123233 +0000 UTC m=+5153.791427777" Oct 08 14:21:44 crc kubenswrapper[4735]: I1008 14:21:44.006142 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-2" podStartSLOduration=4.006123802 podStartE2EDuration="4.006123802s" podCreationTimestamp="2025-10-08 14:21:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:21:43.998727532 +0000 UTC m=+5153.807032076" watchObservedRunningTime="2025-10-08 14:21:44.006123802 +0000 UTC m=+5153.814428356" Oct 08 14:21:44 crc kubenswrapper[4735]: I1008 14:21:44.017021 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-1" podStartSLOduration=4.01699688 podStartE2EDuration="4.01699688s" podCreationTimestamp="2025-10-08 14:21:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:21:44.011448248 +0000 UTC m=+5153.819752792" watchObservedRunningTime="2025-10-08 14:21:44.01699688 +0000 UTC m=+5153.825301424" Oct 08 14:21:44 crc kubenswrapper[4735]: I1008 14:21:44.700744 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Oct 08 14:21:44 crc kubenswrapper[4735]: I1008 14:21:44.731633 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-2" Oct 08 14:21:44 crc kubenswrapper[4735]: I1008 
14:21:44.744843 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-1" Oct 08 14:21:44 crc kubenswrapper[4735]: I1008 14:21:44.923967 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Oct 08 14:21:45 crc kubenswrapper[4735]: I1008 14:21:45.020001 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-2" Oct 08 14:21:45 crc kubenswrapper[4735]: I1008 14:21:45.081355 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-2" Oct 08 14:21:45 crc kubenswrapper[4735]: I1008 14:21:45.120320 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-1" Oct 08 14:21:45 crc kubenswrapper[4735]: I1008 14:21:45.158396 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-1" Oct 08 14:21:45 crc kubenswrapper[4735]: I1008 14:21:45.955004 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-1" Oct 08 14:21:45 crc kubenswrapper[4735]: I1008 14:21:45.955171 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-2" Oct 08 14:21:46 crc kubenswrapper[4735]: I1008 14:21:46.700826 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Oct 08 14:21:46 crc kubenswrapper[4735]: I1008 14:21:46.762233 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-2" Oct 08 14:21:46 crc kubenswrapper[4735]: I1008 14:21:46.764504 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-1" Oct 08 14:21:46 crc kubenswrapper[4735]: I1008 14:21:46.923545 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Oct 08 14:21:47 crc 
kubenswrapper[4735]: I1008 14:21:47.057732 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-2" Oct 08 14:21:47 crc kubenswrapper[4735]: I1008 14:21:47.178284 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-1" Oct 08 14:21:47 crc kubenswrapper[4735]: I1008 14:21:47.316508 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-577ff9b9d9-trwjs"] Oct 08 14:21:47 crc kubenswrapper[4735]: I1008 14:21:47.318051 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-577ff9b9d9-trwjs" Oct 08 14:21:47 crc kubenswrapper[4735]: I1008 14:21:47.328896 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-577ff9b9d9-trwjs"] Oct 08 14:21:47 crc kubenswrapper[4735]: I1008 14:21:47.333755 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 08 14:21:47 crc kubenswrapper[4735]: I1008 14:21:47.478159 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9trds\" (UniqueName: \"kubernetes.io/projected/a2ac324b-a261-46b3-9764-0b95b4c6e932-kube-api-access-9trds\") pod \"dnsmasq-dns-577ff9b9d9-trwjs\" (UID: \"a2ac324b-a261-46b3-9764-0b95b4c6e932\") " pod="openstack/dnsmasq-dns-577ff9b9d9-trwjs" Oct 08 14:21:47 crc kubenswrapper[4735]: I1008 14:21:47.478450 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a2ac324b-a261-46b3-9764-0b95b4c6e932-ovsdbserver-sb\") pod \"dnsmasq-dns-577ff9b9d9-trwjs\" (UID: \"a2ac324b-a261-46b3-9764-0b95b4c6e932\") " pod="openstack/dnsmasq-dns-577ff9b9d9-trwjs" Oct 08 14:21:47 crc kubenswrapper[4735]: I1008 14:21:47.478739 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/a2ac324b-a261-46b3-9764-0b95b4c6e932-config\") pod \"dnsmasq-dns-577ff9b9d9-trwjs\" (UID: \"a2ac324b-a261-46b3-9764-0b95b4c6e932\") " pod="openstack/dnsmasq-dns-577ff9b9d9-trwjs" Oct 08 14:21:47 crc kubenswrapper[4735]: I1008 14:21:47.478856 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a2ac324b-a261-46b3-9764-0b95b4c6e932-dns-svc\") pod \"dnsmasq-dns-577ff9b9d9-trwjs\" (UID: \"a2ac324b-a261-46b3-9764-0b95b4c6e932\") " pod="openstack/dnsmasq-dns-577ff9b9d9-trwjs" Oct 08 14:21:47 crc kubenswrapper[4735]: I1008 14:21:47.580708 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2ac324b-a261-46b3-9764-0b95b4c6e932-config\") pod \"dnsmasq-dns-577ff9b9d9-trwjs\" (UID: \"a2ac324b-a261-46b3-9764-0b95b4c6e932\") " pod="openstack/dnsmasq-dns-577ff9b9d9-trwjs" Oct 08 14:21:47 crc kubenswrapper[4735]: I1008 14:21:47.580762 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a2ac324b-a261-46b3-9764-0b95b4c6e932-dns-svc\") pod \"dnsmasq-dns-577ff9b9d9-trwjs\" (UID: \"a2ac324b-a261-46b3-9764-0b95b4c6e932\") " pod="openstack/dnsmasq-dns-577ff9b9d9-trwjs" Oct 08 14:21:47 crc kubenswrapper[4735]: I1008 14:21:47.580825 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9trds\" (UniqueName: \"kubernetes.io/projected/a2ac324b-a261-46b3-9764-0b95b4c6e932-kube-api-access-9trds\") pod \"dnsmasq-dns-577ff9b9d9-trwjs\" (UID: \"a2ac324b-a261-46b3-9764-0b95b4c6e932\") " pod="openstack/dnsmasq-dns-577ff9b9d9-trwjs" Oct 08 14:21:47 crc kubenswrapper[4735]: I1008 14:21:47.580845 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/a2ac324b-a261-46b3-9764-0b95b4c6e932-ovsdbserver-sb\") pod \"dnsmasq-dns-577ff9b9d9-trwjs\" (UID: \"a2ac324b-a261-46b3-9764-0b95b4c6e932\") " pod="openstack/dnsmasq-dns-577ff9b9d9-trwjs" Oct 08 14:21:47 crc kubenswrapper[4735]: I1008 14:21:47.581824 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a2ac324b-a261-46b3-9764-0b95b4c6e932-ovsdbserver-sb\") pod \"dnsmasq-dns-577ff9b9d9-trwjs\" (UID: \"a2ac324b-a261-46b3-9764-0b95b4c6e932\") " pod="openstack/dnsmasq-dns-577ff9b9d9-trwjs" Oct 08 14:21:47 crc kubenswrapper[4735]: I1008 14:21:47.581943 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a2ac324b-a261-46b3-9764-0b95b4c6e932-dns-svc\") pod \"dnsmasq-dns-577ff9b9d9-trwjs\" (UID: \"a2ac324b-a261-46b3-9764-0b95b4c6e932\") " pod="openstack/dnsmasq-dns-577ff9b9d9-trwjs" Oct 08 14:21:47 crc kubenswrapper[4735]: I1008 14:21:47.581978 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2ac324b-a261-46b3-9764-0b95b4c6e932-config\") pod \"dnsmasq-dns-577ff9b9d9-trwjs\" (UID: \"a2ac324b-a261-46b3-9764-0b95b4c6e932\") " pod="openstack/dnsmasq-dns-577ff9b9d9-trwjs" Oct 08 14:21:47 crc kubenswrapper[4735]: I1008 14:21:47.600133 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9trds\" (UniqueName: \"kubernetes.io/projected/a2ac324b-a261-46b3-9764-0b95b4c6e932-kube-api-access-9trds\") pod \"dnsmasq-dns-577ff9b9d9-trwjs\" (UID: \"a2ac324b-a261-46b3-9764-0b95b4c6e932\") " pod="openstack/dnsmasq-dns-577ff9b9d9-trwjs" Oct 08 14:21:47 crc kubenswrapper[4735]: I1008 14:21:47.639814 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-577ff9b9d9-trwjs" Oct 08 14:21:47 crc kubenswrapper[4735]: I1008 14:21:47.741880 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Oct 08 14:21:47 crc kubenswrapper[4735]: I1008 14:21:47.784958 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Oct 08 14:21:47 crc kubenswrapper[4735]: I1008 14:21:47.792340 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-1" Oct 08 14:21:47 crc kubenswrapper[4735]: I1008 14:21:47.831074 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-2" Oct 08 14:21:47 crc kubenswrapper[4735]: I1008 14:21:47.906791 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-2" Oct 08 14:21:47 crc kubenswrapper[4735]: I1008 14:21:47.998733 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Oct 08 14:21:48 crc kubenswrapper[4735]: I1008 14:21:48.056645 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-577ff9b9d9-trwjs"] Oct 08 14:21:48 crc kubenswrapper[4735]: I1008 14:21:48.097916 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-fd7df4955-5hbtd"] Oct 08 14:21:48 crc kubenswrapper[4735]: I1008 14:21:48.099364 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fd7df4955-5hbtd" Oct 08 14:21:48 crc kubenswrapper[4735]: I1008 14:21:48.105458 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 08 14:21:48 crc kubenswrapper[4735]: I1008 14:21:48.111118 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fd7df4955-5hbtd"] Oct 08 14:21:48 crc kubenswrapper[4735]: I1008 14:21:48.113089 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Oct 08 14:21:48 crc kubenswrapper[4735]: I1008 14:21:48.126113 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-577ff9b9d9-trwjs"] Oct 08 14:21:48 crc kubenswrapper[4735]: W1008 14:21:48.164572 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2ac324b_a261_46b3_9764_0b95b4c6e932.slice/crio-30294b72582f562708c5d9b8374bc5ea9d8a08d9ef4f5436bb4aa0bc93323af1 WatchSource:0}: Error finding container 30294b72582f562708c5d9b8374bc5ea9d8a08d9ef4f5436bb4aa0bc93323af1: Status 404 returned error can't find the container with id 30294b72582f562708c5d9b8374bc5ea9d8a08d9ef4f5436bb4aa0bc93323af1 Oct 08 14:21:48 crc kubenswrapper[4735]: I1008 14:21:48.294549 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jvf8\" (UniqueName: \"kubernetes.io/projected/8da7ac61-6a48-4698-95b4-66fc7a6411b4-kube-api-access-2jvf8\") pod \"dnsmasq-dns-fd7df4955-5hbtd\" (UID: \"8da7ac61-6a48-4698-95b4-66fc7a6411b4\") " pod="openstack/dnsmasq-dns-fd7df4955-5hbtd" Oct 08 14:21:48 crc kubenswrapper[4735]: I1008 14:21:48.294882 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8da7ac61-6a48-4698-95b4-66fc7a6411b4-dns-svc\") pod \"dnsmasq-dns-fd7df4955-5hbtd\" (UID: 
\"8da7ac61-6a48-4698-95b4-66fc7a6411b4\") " pod="openstack/dnsmasq-dns-fd7df4955-5hbtd" Oct 08 14:21:48 crc kubenswrapper[4735]: I1008 14:21:48.294924 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8da7ac61-6a48-4698-95b4-66fc7a6411b4-config\") pod \"dnsmasq-dns-fd7df4955-5hbtd\" (UID: \"8da7ac61-6a48-4698-95b4-66fc7a6411b4\") " pod="openstack/dnsmasq-dns-fd7df4955-5hbtd" Oct 08 14:21:48 crc kubenswrapper[4735]: I1008 14:21:48.294988 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8da7ac61-6a48-4698-95b4-66fc7a6411b4-ovsdbserver-nb\") pod \"dnsmasq-dns-fd7df4955-5hbtd\" (UID: \"8da7ac61-6a48-4698-95b4-66fc7a6411b4\") " pod="openstack/dnsmasq-dns-fd7df4955-5hbtd" Oct 08 14:21:48 crc kubenswrapper[4735]: I1008 14:21:48.295011 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8da7ac61-6a48-4698-95b4-66fc7a6411b4-ovsdbserver-sb\") pod \"dnsmasq-dns-fd7df4955-5hbtd\" (UID: \"8da7ac61-6a48-4698-95b4-66fc7a6411b4\") " pod="openstack/dnsmasq-dns-fd7df4955-5hbtd" Oct 08 14:21:48 crc kubenswrapper[4735]: I1008 14:21:48.396455 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8da7ac61-6a48-4698-95b4-66fc7a6411b4-dns-svc\") pod \"dnsmasq-dns-fd7df4955-5hbtd\" (UID: \"8da7ac61-6a48-4698-95b4-66fc7a6411b4\") " pod="openstack/dnsmasq-dns-fd7df4955-5hbtd" Oct 08 14:21:48 crc kubenswrapper[4735]: I1008 14:21:48.396525 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8da7ac61-6a48-4698-95b4-66fc7a6411b4-config\") pod \"dnsmasq-dns-fd7df4955-5hbtd\" (UID: \"8da7ac61-6a48-4698-95b4-66fc7a6411b4\") " 
pod="openstack/dnsmasq-dns-fd7df4955-5hbtd" Oct 08 14:21:48 crc kubenswrapper[4735]: I1008 14:21:48.396617 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8da7ac61-6a48-4698-95b4-66fc7a6411b4-ovsdbserver-nb\") pod \"dnsmasq-dns-fd7df4955-5hbtd\" (UID: \"8da7ac61-6a48-4698-95b4-66fc7a6411b4\") " pod="openstack/dnsmasq-dns-fd7df4955-5hbtd" Oct 08 14:21:48 crc kubenswrapper[4735]: I1008 14:21:48.396642 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8da7ac61-6a48-4698-95b4-66fc7a6411b4-ovsdbserver-sb\") pod \"dnsmasq-dns-fd7df4955-5hbtd\" (UID: \"8da7ac61-6a48-4698-95b4-66fc7a6411b4\") " pod="openstack/dnsmasq-dns-fd7df4955-5hbtd" Oct 08 14:21:48 crc kubenswrapper[4735]: I1008 14:21:48.396731 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jvf8\" (UniqueName: \"kubernetes.io/projected/8da7ac61-6a48-4698-95b4-66fc7a6411b4-kube-api-access-2jvf8\") pod \"dnsmasq-dns-fd7df4955-5hbtd\" (UID: \"8da7ac61-6a48-4698-95b4-66fc7a6411b4\") " pod="openstack/dnsmasq-dns-fd7df4955-5hbtd" Oct 08 14:21:48 crc kubenswrapper[4735]: I1008 14:21:48.398297 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8da7ac61-6a48-4698-95b4-66fc7a6411b4-ovsdbserver-sb\") pod \"dnsmasq-dns-fd7df4955-5hbtd\" (UID: \"8da7ac61-6a48-4698-95b4-66fc7a6411b4\") " pod="openstack/dnsmasq-dns-fd7df4955-5hbtd" Oct 08 14:21:48 crc kubenswrapper[4735]: I1008 14:21:48.398310 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8da7ac61-6a48-4698-95b4-66fc7a6411b4-config\") pod \"dnsmasq-dns-fd7df4955-5hbtd\" (UID: \"8da7ac61-6a48-4698-95b4-66fc7a6411b4\") " pod="openstack/dnsmasq-dns-fd7df4955-5hbtd" Oct 08 14:21:48 crc 
kubenswrapper[4735]: I1008 14:21:48.398921 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8da7ac61-6a48-4698-95b4-66fc7a6411b4-ovsdbserver-nb\") pod \"dnsmasq-dns-fd7df4955-5hbtd\" (UID: \"8da7ac61-6a48-4698-95b4-66fc7a6411b4\") " pod="openstack/dnsmasq-dns-fd7df4955-5hbtd" Oct 08 14:21:48 crc kubenswrapper[4735]: I1008 14:21:48.399417 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8da7ac61-6a48-4698-95b4-66fc7a6411b4-dns-svc\") pod \"dnsmasq-dns-fd7df4955-5hbtd\" (UID: \"8da7ac61-6a48-4698-95b4-66fc7a6411b4\") " pod="openstack/dnsmasq-dns-fd7df4955-5hbtd" Oct 08 14:21:48 crc kubenswrapper[4735]: I1008 14:21:48.412458 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jvf8\" (UniqueName: \"kubernetes.io/projected/8da7ac61-6a48-4698-95b4-66fc7a6411b4-kube-api-access-2jvf8\") pod \"dnsmasq-dns-fd7df4955-5hbtd\" (UID: \"8da7ac61-6a48-4698-95b4-66fc7a6411b4\") " pod="openstack/dnsmasq-dns-fd7df4955-5hbtd" Oct 08 14:21:48 crc kubenswrapper[4735]: I1008 14:21:48.430464 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fd7df4955-5hbtd" Oct 08 14:21:48 crc kubenswrapper[4735]: I1008 14:21:48.840087 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fd7df4955-5hbtd"] Oct 08 14:21:48 crc kubenswrapper[4735]: W1008 14:21:48.846438 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8da7ac61_6a48_4698_95b4_66fc7a6411b4.slice/crio-3d8d11e6b2254c2f65bafdd7c67b7ab2ac466ce652e070b8d2263b0c14a4771f WatchSource:0}: Error finding container 3d8d11e6b2254c2f65bafdd7c67b7ab2ac466ce652e070b8d2263b0c14a4771f: Status 404 returned error can't find the container with id 3d8d11e6b2254c2f65bafdd7c67b7ab2ac466ce652e070b8d2263b0c14a4771f Oct 08 14:21:48 crc kubenswrapper[4735]: I1008 14:21:48.977078 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fd7df4955-5hbtd" event={"ID":"8da7ac61-6a48-4698-95b4-66fc7a6411b4","Type":"ContainerStarted","Data":"3d8d11e6b2254c2f65bafdd7c67b7ab2ac466ce652e070b8d2263b0c14a4771f"} Oct 08 14:21:48 crc kubenswrapper[4735]: I1008 14:21:48.978687 4735 generic.go:334] "Generic (PLEG): container finished" podID="a2ac324b-a261-46b3-9764-0b95b4c6e932" containerID="f0e72119bac4c07e2c6abb1c37ce9d887b7eb1c85f89c6abc6f70c3e051b1a6b" exitCode=0 Oct 08 14:21:48 crc kubenswrapper[4735]: I1008 14:21:48.978729 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-577ff9b9d9-trwjs" event={"ID":"a2ac324b-a261-46b3-9764-0b95b4c6e932","Type":"ContainerDied","Data":"f0e72119bac4c07e2c6abb1c37ce9d887b7eb1c85f89c6abc6f70c3e051b1a6b"} Oct 08 14:21:48 crc kubenswrapper[4735]: I1008 14:21:48.978770 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-577ff9b9d9-trwjs" event={"ID":"a2ac324b-a261-46b3-9764-0b95b4c6e932","Type":"ContainerStarted","Data":"30294b72582f562708c5d9b8374bc5ea9d8a08d9ef4f5436bb4aa0bc93323af1"} Oct 08 14:21:49 crc 
kubenswrapper[4735]: I1008 14:21:49.278771 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-577ff9b9d9-trwjs" Oct 08 14:21:49 crc kubenswrapper[4735]: I1008 14:21:49.418577 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a2ac324b-a261-46b3-9764-0b95b4c6e932-ovsdbserver-sb\") pod \"a2ac324b-a261-46b3-9764-0b95b4c6e932\" (UID: \"a2ac324b-a261-46b3-9764-0b95b4c6e932\") " Oct 08 14:21:49 crc kubenswrapper[4735]: I1008 14:21:49.418637 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9trds\" (UniqueName: \"kubernetes.io/projected/a2ac324b-a261-46b3-9764-0b95b4c6e932-kube-api-access-9trds\") pod \"a2ac324b-a261-46b3-9764-0b95b4c6e932\" (UID: \"a2ac324b-a261-46b3-9764-0b95b4c6e932\") " Oct 08 14:21:49 crc kubenswrapper[4735]: I1008 14:21:49.418743 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2ac324b-a261-46b3-9764-0b95b4c6e932-config\") pod \"a2ac324b-a261-46b3-9764-0b95b4c6e932\" (UID: \"a2ac324b-a261-46b3-9764-0b95b4c6e932\") " Oct 08 14:21:49 crc kubenswrapper[4735]: I1008 14:21:49.418832 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a2ac324b-a261-46b3-9764-0b95b4c6e932-dns-svc\") pod \"a2ac324b-a261-46b3-9764-0b95b4c6e932\" (UID: \"a2ac324b-a261-46b3-9764-0b95b4c6e932\") " Oct 08 14:21:49 crc kubenswrapper[4735]: I1008 14:21:49.422738 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2ac324b-a261-46b3-9764-0b95b4c6e932-kube-api-access-9trds" (OuterVolumeSpecName: "kube-api-access-9trds") pod "a2ac324b-a261-46b3-9764-0b95b4c6e932" (UID: "a2ac324b-a261-46b3-9764-0b95b4c6e932"). InnerVolumeSpecName "kube-api-access-9trds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:21:49 crc kubenswrapper[4735]: I1008 14:21:49.436653 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2ac324b-a261-46b3-9764-0b95b4c6e932-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a2ac324b-a261-46b3-9764-0b95b4c6e932" (UID: "a2ac324b-a261-46b3-9764-0b95b4c6e932"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:21:49 crc kubenswrapper[4735]: I1008 14:21:49.438428 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2ac324b-a261-46b3-9764-0b95b4c6e932-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a2ac324b-a261-46b3-9764-0b95b4c6e932" (UID: "a2ac324b-a261-46b3-9764-0b95b4c6e932"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:21:49 crc kubenswrapper[4735]: I1008 14:21:49.440871 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2ac324b-a261-46b3-9764-0b95b4c6e932-config" (OuterVolumeSpecName: "config") pod "a2ac324b-a261-46b3-9764-0b95b4c6e932" (UID: "a2ac324b-a261-46b3-9764-0b95b4c6e932"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:21:49 crc kubenswrapper[4735]: I1008 14:21:49.520570 4735 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a2ac324b-a261-46b3-9764-0b95b4c6e932-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 14:21:49 crc kubenswrapper[4735]: I1008 14:21:49.520614 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a2ac324b-a261-46b3-9764-0b95b4c6e932-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 14:21:49 crc kubenswrapper[4735]: I1008 14:21:49.520634 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9trds\" (UniqueName: \"kubernetes.io/projected/a2ac324b-a261-46b3-9764-0b95b4c6e932-kube-api-access-9trds\") on node \"crc\" DevicePath \"\"" Oct 08 14:21:49 crc kubenswrapper[4735]: I1008 14:21:49.520651 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2ac324b-a261-46b3-9764-0b95b4c6e932-config\") on node \"crc\" DevicePath \"\"" Oct 08 14:21:49 crc kubenswrapper[4735]: I1008 14:21:49.987764 4735 generic.go:334] "Generic (PLEG): container finished" podID="8da7ac61-6a48-4698-95b4-66fc7a6411b4" containerID="990f4ba5ec3cd0d52f874f584429717e482226978afcf32c05f857bdd0b7be2e" exitCode=0 Oct 08 14:21:49 crc kubenswrapper[4735]: I1008 14:21:49.987856 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fd7df4955-5hbtd" event={"ID":"8da7ac61-6a48-4698-95b4-66fc7a6411b4","Type":"ContainerDied","Data":"990f4ba5ec3cd0d52f874f584429717e482226978afcf32c05f857bdd0b7be2e"} Oct 08 14:21:49 crc kubenswrapper[4735]: I1008 14:21:49.989880 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-577ff9b9d9-trwjs" event={"ID":"a2ac324b-a261-46b3-9764-0b95b4c6e932","Type":"ContainerDied","Data":"30294b72582f562708c5d9b8374bc5ea9d8a08d9ef4f5436bb4aa0bc93323af1"} Oct 08 
14:21:49 crc kubenswrapper[4735]: I1008 14:21:49.989937 4735 scope.go:117] "RemoveContainer" containerID="f0e72119bac4c07e2c6abb1c37ce9d887b7eb1c85f89c6abc6f70c3e051b1a6b" Oct 08 14:21:49 crc kubenswrapper[4735]: I1008 14:21:49.990005 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-577ff9b9d9-trwjs" Oct 08 14:21:50 crc kubenswrapper[4735]: I1008 14:21:50.200606 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-577ff9b9d9-trwjs"] Oct 08 14:21:50 crc kubenswrapper[4735]: I1008 14:21:50.208948 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-577ff9b9d9-trwjs"] Oct 08 14:21:50 crc kubenswrapper[4735]: I1008 14:21:50.920540 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2ac324b-a261-46b3-9764-0b95b4c6e932" path="/var/lib/kubelet/pods/a2ac324b-a261-46b3-9764-0b95b4c6e932/volumes" Oct 08 14:21:51 crc kubenswrapper[4735]: I1008 14:21:51.002280 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fd7df4955-5hbtd" event={"ID":"8da7ac61-6a48-4698-95b4-66fc7a6411b4","Type":"ContainerStarted","Data":"9bb175713c4d9d6b792e2733a9fecfe2c2b546792b6b14bd9709fbd579ff9706"} Oct 08 14:21:51 crc kubenswrapper[4735]: I1008 14:21:51.003306 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-fd7df4955-5hbtd" Oct 08 14:21:51 crc kubenswrapper[4735]: I1008 14:21:51.028633 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-fd7df4955-5hbtd" podStartSLOduration=3.02861401 podStartE2EDuration="3.02861401s" podCreationTimestamp="2025-10-08 14:21:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:21:51.026968898 +0000 UTC m=+5160.835273462" watchObservedRunningTime="2025-10-08 14:21:51.02861401 +0000 UTC m=+5160.836918554" Oct 
08 14:21:51 crc kubenswrapper[4735]: I1008 14:21:51.824800 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-1" Oct 08 14:21:54 crc kubenswrapper[4735]: I1008 14:21:54.796799 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-copy-data"] Oct 08 14:21:54 crc kubenswrapper[4735]: E1008 14:21:54.797555 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2ac324b-a261-46b3-9764-0b95b4c6e932" containerName="init" Oct 08 14:21:54 crc kubenswrapper[4735]: I1008 14:21:54.797573 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2ac324b-a261-46b3-9764-0b95b4c6e932" containerName="init" Oct 08 14:21:54 crc kubenswrapper[4735]: I1008 14:21:54.797808 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2ac324b-a261-46b3-9764-0b95b4c6e932" containerName="init" Oct 08 14:21:54 crc kubenswrapper[4735]: I1008 14:21:54.798528 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data" Oct 08 14:21:54 crc kubenswrapper[4735]: I1008 14:21:54.800760 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovn-data-cert" Oct 08 14:21:54 crc kubenswrapper[4735]: I1008 14:21:54.808447 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Oct 08 14:21:54 crc kubenswrapper[4735]: I1008 14:21:54.972972 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-064f15aa-b8fb-4c46-8c33-821afe35d6b3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-064f15aa-b8fb-4c46-8c33-821afe35d6b3\") pod \"ovn-copy-data\" (UID: \"370761ea-f80e-4565-ac02-3c652949e54d\") " pod="openstack/ovn-copy-data" Oct 08 14:21:54 crc kubenswrapper[4735]: I1008 14:21:54.973082 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-data-cert\" (UniqueName: 
\"kubernetes.io/secret/370761ea-f80e-4565-ac02-3c652949e54d-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"370761ea-f80e-4565-ac02-3c652949e54d\") " pod="openstack/ovn-copy-data" Oct 08 14:21:54 crc kubenswrapper[4735]: I1008 14:21:54.973168 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l65z5\" (UniqueName: \"kubernetes.io/projected/370761ea-f80e-4565-ac02-3c652949e54d-kube-api-access-l65z5\") pod \"ovn-copy-data\" (UID: \"370761ea-f80e-4565-ac02-3c652949e54d\") " pod="openstack/ovn-copy-data" Oct 08 14:21:55 crc kubenswrapper[4735]: I1008 14:21:55.075625 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l65z5\" (UniqueName: \"kubernetes.io/projected/370761ea-f80e-4565-ac02-3c652949e54d-kube-api-access-l65z5\") pod \"ovn-copy-data\" (UID: \"370761ea-f80e-4565-ac02-3c652949e54d\") " pod="openstack/ovn-copy-data" Oct 08 14:21:55 crc kubenswrapper[4735]: I1008 14:21:55.075945 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-064f15aa-b8fb-4c46-8c33-821afe35d6b3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-064f15aa-b8fb-4c46-8c33-821afe35d6b3\") pod \"ovn-copy-data\" (UID: \"370761ea-f80e-4565-ac02-3c652949e54d\") " pod="openstack/ovn-copy-data" Oct 08 14:21:55 crc kubenswrapper[4735]: I1008 14:21:55.076064 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/370761ea-f80e-4565-ac02-3c652949e54d-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"370761ea-f80e-4565-ac02-3c652949e54d\") " pod="openstack/ovn-copy-data" Oct 08 14:21:55 crc kubenswrapper[4735]: I1008 14:21:55.079957 4735 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 08 14:21:55 crc kubenswrapper[4735]: I1008 14:21:55.080010 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-064f15aa-b8fb-4c46-8c33-821afe35d6b3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-064f15aa-b8fb-4c46-8c33-821afe35d6b3\") pod \"ovn-copy-data\" (UID: \"370761ea-f80e-4565-ac02-3c652949e54d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/300e0d9aa997191a2a1a8594f61a0aac674c65901c640e8184d225bff994563f/globalmount\"" pod="openstack/ovn-copy-data" Oct 08 14:21:55 crc kubenswrapper[4735]: I1008 14:21:55.083508 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/370761ea-f80e-4565-ac02-3c652949e54d-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"370761ea-f80e-4565-ac02-3c652949e54d\") " pod="openstack/ovn-copy-data" Oct 08 14:21:55 crc kubenswrapper[4735]: I1008 14:21:55.098549 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l65z5\" (UniqueName: \"kubernetes.io/projected/370761ea-f80e-4565-ac02-3c652949e54d-kube-api-access-l65z5\") pod \"ovn-copy-data\" (UID: \"370761ea-f80e-4565-ac02-3c652949e54d\") " pod="openstack/ovn-copy-data" Oct 08 14:21:55 crc kubenswrapper[4735]: I1008 14:21:55.126068 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-064f15aa-b8fb-4c46-8c33-821afe35d6b3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-064f15aa-b8fb-4c46-8c33-821afe35d6b3\") pod \"ovn-copy-data\" (UID: \"370761ea-f80e-4565-ac02-3c652949e54d\") " pod="openstack/ovn-copy-data" Oct 08 14:21:55 crc kubenswrapper[4735]: I1008 14:21:55.423078 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Oct 08 14:21:56 crc kubenswrapper[4735]: I1008 14:21:56.009436 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Oct 08 14:21:56 crc kubenswrapper[4735]: W1008 14:21:56.014951 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod370761ea_f80e_4565_ac02_3c652949e54d.slice/crio-cf3227e7e7d3fade7002c154bc1794ef79ca9b74bd45c5df445d19caa349b988 WatchSource:0}: Error finding container cf3227e7e7d3fade7002c154bc1794ef79ca9b74bd45c5df445d19caa349b988: Status 404 returned error can't find the container with id cf3227e7e7d3fade7002c154bc1794ef79ca9b74bd45c5df445d19caa349b988 Oct 08 14:21:56 crc kubenswrapper[4735]: I1008 14:21:56.085338 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"370761ea-f80e-4565-ac02-3c652949e54d","Type":"ContainerStarted","Data":"cf3227e7e7d3fade7002c154bc1794ef79ca9b74bd45c5df445d19caa349b988"} Oct 08 14:21:57 crc kubenswrapper[4735]: I1008 14:21:57.094235 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"370761ea-f80e-4565-ac02-3c652949e54d","Type":"ContainerStarted","Data":"b06ea84b6c4e2ec0cfc9842e882db7fd85e496fa7de0791fd9b8510cf206a565"} Oct 08 14:21:58 crc kubenswrapper[4735]: I1008 14:21:58.431643 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-fd7df4955-5hbtd" Oct 08 14:21:58 crc kubenswrapper[4735]: I1008 14:21:58.454860 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-copy-data" podStartSLOduration=4.588693757 podStartE2EDuration="5.454844448s" podCreationTimestamp="2025-10-08 14:21:53 +0000 UTC" firstStartedPulling="2025-10-08 14:21:56.017173954 +0000 UTC m=+5165.825478498" lastFinishedPulling="2025-10-08 14:21:56.883324645 +0000 UTC m=+5166.691629189" 
observedRunningTime="2025-10-08 14:21:57.108972563 +0000 UTC m=+5166.917277147" watchObservedRunningTime="2025-10-08 14:21:58.454844448 +0000 UTC m=+5168.263148982" Oct 08 14:21:58 crc kubenswrapper[4735]: I1008 14:21:58.487782 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fdc957c47-8cpm2"] Oct 08 14:21:58 crc kubenswrapper[4735]: I1008 14:21:58.488399 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5fdc957c47-8cpm2" podUID="9d5da6e7-5ff5-4ef1-b678-412c74f96b10" containerName="dnsmasq-dns" containerID="cri-o://d295dab40729a3ed69ee193d3ade1447d290312a6c4447108efb9e890e3f39ec" gracePeriod=10 Oct 08 14:21:58 crc kubenswrapper[4735]: I1008 14:21:58.911384 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fdc957c47-8cpm2" Oct 08 14:21:59 crc kubenswrapper[4735]: I1008 14:21:59.045813 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d5da6e7-5ff5-4ef1-b678-412c74f96b10-config\") pod \"9d5da6e7-5ff5-4ef1-b678-412c74f96b10\" (UID: \"9d5da6e7-5ff5-4ef1-b678-412c74f96b10\") " Oct 08 14:21:59 crc kubenswrapper[4735]: I1008 14:21:59.045884 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d5da6e7-5ff5-4ef1-b678-412c74f96b10-dns-svc\") pod \"9d5da6e7-5ff5-4ef1-b678-412c74f96b10\" (UID: \"9d5da6e7-5ff5-4ef1-b678-412c74f96b10\") " Oct 08 14:21:59 crc kubenswrapper[4735]: I1008 14:21:59.045907 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jms6p\" (UniqueName: \"kubernetes.io/projected/9d5da6e7-5ff5-4ef1-b678-412c74f96b10-kube-api-access-jms6p\") pod \"9d5da6e7-5ff5-4ef1-b678-412c74f96b10\" (UID: \"9d5da6e7-5ff5-4ef1-b678-412c74f96b10\") " Oct 08 14:21:59 crc kubenswrapper[4735]: I1008 14:21:59.054394 4735 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d5da6e7-5ff5-4ef1-b678-412c74f96b10-kube-api-access-jms6p" (OuterVolumeSpecName: "kube-api-access-jms6p") pod "9d5da6e7-5ff5-4ef1-b678-412c74f96b10" (UID: "9d5da6e7-5ff5-4ef1-b678-412c74f96b10"). InnerVolumeSpecName "kube-api-access-jms6p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:21:59 crc kubenswrapper[4735]: I1008 14:21:59.095699 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d5da6e7-5ff5-4ef1-b678-412c74f96b10-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9d5da6e7-5ff5-4ef1-b678-412c74f96b10" (UID: "9d5da6e7-5ff5-4ef1-b678-412c74f96b10"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:21:59 crc kubenswrapper[4735]: I1008 14:21:59.105276 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d5da6e7-5ff5-4ef1-b678-412c74f96b10-config" (OuterVolumeSpecName: "config") pod "9d5da6e7-5ff5-4ef1-b678-412c74f96b10" (UID: "9d5da6e7-5ff5-4ef1-b678-412c74f96b10"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:21:59 crc kubenswrapper[4735]: I1008 14:21:59.116096 4735 generic.go:334] "Generic (PLEG): container finished" podID="9d5da6e7-5ff5-4ef1-b678-412c74f96b10" containerID="d295dab40729a3ed69ee193d3ade1447d290312a6c4447108efb9e890e3f39ec" exitCode=0 Oct 08 14:21:59 crc kubenswrapper[4735]: I1008 14:21:59.116149 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fdc957c47-8cpm2" event={"ID":"9d5da6e7-5ff5-4ef1-b678-412c74f96b10","Type":"ContainerDied","Data":"d295dab40729a3ed69ee193d3ade1447d290312a6c4447108efb9e890e3f39ec"} Oct 08 14:21:59 crc kubenswrapper[4735]: I1008 14:21:59.116183 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fdc957c47-8cpm2" event={"ID":"9d5da6e7-5ff5-4ef1-b678-412c74f96b10","Type":"ContainerDied","Data":"c2aa5bf3c159a980b56a2dc3fc7c2579e6f8ea7deaad93145b26fcc17cee9a7e"} Oct 08 14:21:59 crc kubenswrapper[4735]: I1008 14:21:59.116203 4735 scope.go:117] "RemoveContainer" containerID="d295dab40729a3ed69ee193d3ade1447d290312a6c4447108efb9e890e3f39ec" Oct 08 14:21:59 crc kubenswrapper[4735]: I1008 14:21:59.116216 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fdc957c47-8cpm2" Oct 08 14:21:59 crc kubenswrapper[4735]: I1008 14:21:59.149452 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d5da6e7-5ff5-4ef1-b678-412c74f96b10-config\") on node \"crc\" DevicePath \"\"" Oct 08 14:21:59 crc kubenswrapper[4735]: I1008 14:21:59.149515 4735 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d5da6e7-5ff5-4ef1-b678-412c74f96b10-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 14:21:59 crc kubenswrapper[4735]: I1008 14:21:59.149533 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jms6p\" (UniqueName: \"kubernetes.io/projected/9d5da6e7-5ff5-4ef1-b678-412c74f96b10-kube-api-access-jms6p\") on node \"crc\" DevicePath \"\"" Oct 08 14:21:59 crc kubenswrapper[4735]: I1008 14:21:59.174662 4735 scope.go:117] "RemoveContainer" containerID="6ef988404821512e241a0705c6673868869b487258aeb87d7ad9a17d26b57ed1" Oct 08 14:21:59 crc kubenswrapper[4735]: I1008 14:21:59.190029 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fdc957c47-8cpm2"] Oct 08 14:21:59 crc kubenswrapper[4735]: I1008 14:21:59.200740 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5fdc957c47-8cpm2"] Oct 08 14:21:59 crc kubenswrapper[4735]: I1008 14:21:59.211968 4735 scope.go:117] "RemoveContainer" containerID="d295dab40729a3ed69ee193d3ade1447d290312a6c4447108efb9e890e3f39ec" Oct 08 14:21:59 crc kubenswrapper[4735]: E1008 14:21:59.212545 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d295dab40729a3ed69ee193d3ade1447d290312a6c4447108efb9e890e3f39ec\": container with ID starting with d295dab40729a3ed69ee193d3ade1447d290312a6c4447108efb9e890e3f39ec not found: ID does not exist" 
containerID="d295dab40729a3ed69ee193d3ade1447d290312a6c4447108efb9e890e3f39ec" Oct 08 14:21:59 crc kubenswrapper[4735]: I1008 14:21:59.212606 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d295dab40729a3ed69ee193d3ade1447d290312a6c4447108efb9e890e3f39ec"} err="failed to get container status \"d295dab40729a3ed69ee193d3ade1447d290312a6c4447108efb9e890e3f39ec\": rpc error: code = NotFound desc = could not find container \"d295dab40729a3ed69ee193d3ade1447d290312a6c4447108efb9e890e3f39ec\": container with ID starting with d295dab40729a3ed69ee193d3ade1447d290312a6c4447108efb9e890e3f39ec not found: ID does not exist" Oct 08 14:21:59 crc kubenswrapper[4735]: I1008 14:21:59.212638 4735 scope.go:117] "RemoveContainer" containerID="6ef988404821512e241a0705c6673868869b487258aeb87d7ad9a17d26b57ed1" Oct 08 14:21:59 crc kubenswrapper[4735]: E1008 14:21:59.212929 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ef988404821512e241a0705c6673868869b487258aeb87d7ad9a17d26b57ed1\": container with ID starting with 6ef988404821512e241a0705c6673868869b487258aeb87d7ad9a17d26b57ed1 not found: ID does not exist" containerID="6ef988404821512e241a0705c6673868869b487258aeb87d7ad9a17d26b57ed1" Oct 08 14:21:59 crc kubenswrapper[4735]: I1008 14:21:59.212971 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ef988404821512e241a0705c6673868869b487258aeb87d7ad9a17d26b57ed1"} err="failed to get container status \"6ef988404821512e241a0705c6673868869b487258aeb87d7ad9a17d26b57ed1\": rpc error: code = NotFound desc = could not find container \"6ef988404821512e241a0705c6673868869b487258aeb87d7ad9a17d26b57ed1\": container with ID starting with 6ef988404821512e241a0705c6673868869b487258aeb87d7ad9a17d26b57ed1 not found: ID does not exist" Oct 08 14:22:00 crc kubenswrapper[4735]: I1008 14:22:00.912227 4735 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d5da6e7-5ff5-4ef1-b678-412c74f96b10" path="/var/lib/kubelet/pods/9d5da6e7-5ff5-4ef1-b678-412c74f96b10/volumes" Oct 08 14:22:02 crc kubenswrapper[4735]: I1008 14:22:02.195925 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Oct 08 14:22:02 crc kubenswrapper[4735]: E1008 14:22:02.199751 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d5da6e7-5ff5-4ef1-b678-412c74f96b10" containerName="dnsmasq-dns" Oct 08 14:22:02 crc kubenswrapper[4735]: I1008 14:22:02.199777 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d5da6e7-5ff5-4ef1-b678-412c74f96b10" containerName="dnsmasq-dns" Oct 08 14:22:02 crc kubenswrapper[4735]: E1008 14:22:02.199796 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d5da6e7-5ff5-4ef1-b678-412c74f96b10" containerName="init" Oct 08 14:22:02 crc kubenswrapper[4735]: I1008 14:22:02.199804 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d5da6e7-5ff5-4ef1-b678-412c74f96b10" containerName="init" Oct 08 14:22:02 crc kubenswrapper[4735]: I1008 14:22:02.200026 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d5da6e7-5ff5-4ef1-b678-412c74f96b10" containerName="dnsmasq-dns" Oct 08 14:22:02 crc kubenswrapper[4735]: I1008 14:22:02.203748 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Oct 08 14:22:02 crc kubenswrapper[4735]: I1008 14:22:02.215180 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-h7v7p" Oct 08 14:22:02 crc kubenswrapper[4735]: I1008 14:22:02.215438 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Oct 08 14:22:02 crc kubenswrapper[4735]: I1008 14:22:02.215676 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Oct 08 14:22:02 crc kubenswrapper[4735]: I1008 14:22:02.216332 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 08 14:22:02 crc kubenswrapper[4735]: I1008 14:22:02.305387 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2a26a15-814e-4c27-9d44-6d96a2eab8aa-config\") pod \"ovn-northd-0\" (UID: \"a2a26a15-814e-4c27-9d44-6d96a2eab8aa\") " pod="openstack/ovn-northd-0" Oct 08 14:22:02 crc kubenswrapper[4735]: I1008 14:22:02.305510 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a2a26a15-814e-4c27-9d44-6d96a2eab8aa-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"a2a26a15-814e-4c27-9d44-6d96a2eab8aa\") " pod="openstack/ovn-northd-0" Oct 08 14:22:02 crc kubenswrapper[4735]: I1008 14:22:02.305546 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2a26a15-814e-4c27-9d44-6d96a2eab8aa-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"a2a26a15-814e-4c27-9d44-6d96a2eab8aa\") " pod="openstack/ovn-northd-0" Oct 08 14:22:02 crc kubenswrapper[4735]: I1008 14:22:02.305581 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-tcbdz\" (UniqueName: \"kubernetes.io/projected/a2a26a15-814e-4c27-9d44-6d96a2eab8aa-kube-api-access-tcbdz\") pod \"ovn-northd-0\" (UID: \"a2a26a15-814e-4c27-9d44-6d96a2eab8aa\") " pod="openstack/ovn-northd-0" Oct 08 14:22:02 crc kubenswrapper[4735]: I1008 14:22:02.305600 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a2a26a15-814e-4c27-9d44-6d96a2eab8aa-scripts\") pod \"ovn-northd-0\" (UID: \"a2a26a15-814e-4c27-9d44-6d96a2eab8aa\") " pod="openstack/ovn-northd-0" Oct 08 14:22:02 crc kubenswrapper[4735]: I1008 14:22:02.407273 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcbdz\" (UniqueName: \"kubernetes.io/projected/a2a26a15-814e-4c27-9d44-6d96a2eab8aa-kube-api-access-tcbdz\") pod \"ovn-northd-0\" (UID: \"a2a26a15-814e-4c27-9d44-6d96a2eab8aa\") " pod="openstack/ovn-northd-0" Oct 08 14:22:02 crc kubenswrapper[4735]: I1008 14:22:02.407327 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a2a26a15-814e-4c27-9d44-6d96a2eab8aa-scripts\") pod \"ovn-northd-0\" (UID: \"a2a26a15-814e-4c27-9d44-6d96a2eab8aa\") " pod="openstack/ovn-northd-0" Oct 08 14:22:02 crc kubenswrapper[4735]: I1008 14:22:02.407380 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2a26a15-814e-4c27-9d44-6d96a2eab8aa-config\") pod \"ovn-northd-0\" (UID: \"a2a26a15-814e-4c27-9d44-6d96a2eab8aa\") " pod="openstack/ovn-northd-0" Oct 08 14:22:02 crc kubenswrapper[4735]: I1008 14:22:02.407468 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a2a26a15-814e-4c27-9d44-6d96a2eab8aa-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"a2a26a15-814e-4c27-9d44-6d96a2eab8aa\") " pod="openstack/ovn-northd-0" Oct 08 
14:22:02 crc kubenswrapper[4735]: I1008 14:22:02.407535 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2a26a15-814e-4c27-9d44-6d96a2eab8aa-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"a2a26a15-814e-4c27-9d44-6d96a2eab8aa\") " pod="openstack/ovn-northd-0" Oct 08 14:22:02 crc kubenswrapper[4735]: I1008 14:22:02.408065 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a2a26a15-814e-4c27-9d44-6d96a2eab8aa-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"a2a26a15-814e-4c27-9d44-6d96a2eab8aa\") " pod="openstack/ovn-northd-0" Oct 08 14:22:02 crc kubenswrapper[4735]: I1008 14:22:02.408245 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a2a26a15-814e-4c27-9d44-6d96a2eab8aa-scripts\") pod \"ovn-northd-0\" (UID: \"a2a26a15-814e-4c27-9d44-6d96a2eab8aa\") " pod="openstack/ovn-northd-0" Oct 08 14:22:02 crc kubenswrapper[4735]: I1008 14:22:02.408285 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2a26a15-814e-4c27-9d44-6d96a2eab8aa-config\") pod \"ovn-northd-0\" (UID: \"a2a26a15-814e-4c27-9d44-6d96a2eab8aa\") " pod="openstack/ovn-northd-0" Oct 08 14:22:02 crc kubenswrapper[4735]: I1008 14:22:02.413610 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2a26a15-814e-4c27-9d44-6d96a2eab8aa-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"a2a26a15-814e-4c27-9d44-6d96a2eab8aa\") " pod="openstack/ovn-northd-0" Oct 08 14:22:02 crc kubenswrapper[4735]: I1008 14:22:02.427265 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcbdz\" (UniqueName: \"kubernetes.io/projected/a2a26a15-814e-4c27-9d44-6d96a2eab8aa-kube-api-access-tcbdz\") pod 
\"ovn-northd-0\" (UID: \"a2a26a15-814e-4c27-9d44-6d96a2eab8aa\") " pod="openstack/ovn-northd-0" Oct 08 14:22:02 crc kubenswrapper[4735]: I1008 14:22:02.536991 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 08 14:22:02 crc kubenswrapper[4735]: I1008 14:22:02.967444 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 08 14:22:02 crc kubenswrapper[4735]: W1008 14:22:02.976160 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2a26a15_814e_4c27_9d44_6d96a2eab8aa.slice/crio-b76ba75695474b2c59a121538f38f29260a747f6f575147ccc402bcf0434056f WatchSource:0}: Error finding container b76ba75695474b2c59a121538f38f29260a747f6f575147ccc402bcf0434056f: Status 404 returned error can't find the container with id b76ba75695474b2c59a121538f38f29260a747f6f575147ccc402bcf0434056f Oct 08 14:22:03 crc kubenswrapper[4735]: I1008 14:22:03.148777 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"a2a26a15-814e-4c27-9d44-6d96a2eab8aa","Type":"ContainerStarted","Data":"9f965ef9f03a77f6a9c07af1850b745f726305fcd0aaad135fb085aa14c185b9"} Oct 08 14:22:03 crc kubenswrapper[4735]: I1008 14:22:03.148851 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"a2a26a15-814e-4c27-9d44-6d96a2eab8aa","Type":"ContainerStarted","Data":"b76ba75695474b2c59a121538f38f29260a747f6f575147ccc402bcf0434056f"} Oct 08 14:22:04 crc kubenswrapper[4735]: I1008 14:22:04.160690 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"a2a26a15-814e-4c27-9d44-6d96a2eab8aa","Type":"ContainerStarted","Data":"2f6ce4f6968a4a558db1901b99a208218629b7259ad9384c4b6539e7cfc1e7cb"} Oct 08 14:22:04 crc kubenswrapper[4735]: I1008 14:22:04.162389 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/ovn-northd-0" Oct 08 14:22:04 crc kubenswrapper[4735]: I1008 14:22:04.189236 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.189222151 podStartE2EDuration="2.189222151s" podCreationTimestamp="2025-10-08 14:22:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:22:04.181701638 +0000 UTC m=+5173.990006202" watchObservedRunningTime="2025-10-08 14:22:04.189222151 +0000 UTC m=+5173.997526695" Oct 08 14:22:07 crc kubenswrapper[4735]: I1008 14:22:07.088164 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-5j8pg"] Oct 08 14:22:07 crc kubenswrapper[4735]: I1008 14:22:07.089737 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-5j8pg" Oct 08 14:22:07 crc kubenswrapper[4735]: I1008 14:22:07.096151 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-5j8pg"] Oct 08 14:22:07 crc kubenswrapper[4735]: I1008 14:22:07.198933 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hftbb\" (UniqueName: \"kubernetes.io/projected/ca290902-5f0a-4309-8a8e-ce058661f43b-kube-api-access-hftbb\") pod \"keystone-db-create-5j8pg\" (UID: \"ca290902-5f0a-4309-8a8e-ce058661f43b\") " pod="openstack/keystone-db-create-5j8pg" Oct 08 14:22:07 crc kubenswrapper[4735]: I1008 14:22:07.300225 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hftbb\" (UniqueName: \"kubernetes.io/projected/ca290902-5f0a-4309-8a8e-ce058661f43b-kube-api-access-hftbb\") pod \"keystone-db-create-5j8pg\" (UID: \"ca290902-5f0a-4309-8a8e-ce058661f43b\") " pod="openstack/keystone-db-create-5j8pg" Oct 08 14:22:07 crc kubenswrapper[4735]: I1008 14:22:07.325449 4735 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-hftbb\" (UniqueName: \"kubernetes.io/projected/ca290902-5f0a-4309-8a8e-ce058661f43b-kube-api-access-hftbb\") pod \"keystone-db-create-5j8pg\" (UID: \"ca290902-5f0a-4309-8a8e-ce058661f43b\") " pod="openstack/keystone-db-create-5j8pg" Oct 08 14:22:07 crc kubenswrapper[4735]: I1008 14:22:07.412178 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-5j8pg" Oct 08 14:22:07 crc kubenswrapper[4735]: I1008 14:22:07.865562 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-5j8pg"] Oct 08 14:22:07 crc kubenswrapper[4735]: W1008 14:22:07.872691 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca290902_5f0a_4309_8a8e_ce058661f43b.slice/crio-abc8602847c44bb4be3f46e2aa6d7f4f619917207c1e809a74a47433d4f82266 WatchSource:0}: Error finding container abc8602847c44bb4be3f46e2aa6d7f4f619917207c1e809a74a47433d4f82266: Status 404 returned error can't find the container with id abc8602847c44bb4be3f46e2aa6d7f4f619917207c1e809a74a47433d4f82266 Oct 08 14:22:08 crc kubenswrapper[4735]: I1008 14:22:08.195464 4735 generic.go:334] "Generic (PLEG): container finished" podID="ca290902-5f0a-4309-8a8e-ce058661f43b" containerID="c732b0515033cc45cb7cfe96ba63551437730707fcc37386391e7a4d53f6cc04" exitCode=0 Oct 08 14:22:08 crc kubenswrapper[4735]: I1008 14:22:08.195534 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-5j8pg" event={"ID":"ca290902-5f0a-4309-8a8e-ce058661f43b","Type":"ContainerDied","Data":"c732b0515033cc45cb7cfe96ba63551437730707fcc37386391e7a4d53f6cc04"} Oct 08 14:22:08 crc kubenswrapper[4735]: I1008 14:22:08.195560 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-5j8pg" 
event={"ID":"ca290902-5f0a-4309-8a8e-ce058661f43b","Type":"ContainerStarted","Data":"abc8602847c44bb4be3f46e2aa6d7f4f619917207c1e809a74a47433d4f82266"} Oct 08 14:22:09 crc kubenswrapper[4735]: I1008 14:22:09.543904 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-5j8pg" Oct 08 14:22:09 crc kubenswrapper[4735]: I1008 14:22:09.645589 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hftbb\" (UniqueName: \"kubernetes.io/projected/ca290902-5f0a-4309-8a8e-ce058661f43b-kube-api-access-hftbb\") pod \"ca290902-5f0a-4309-8a8e-ce058661f43b\" (UID: \"ca290902-5f0a-4309-8a8e-ce058661f43b\") " Oct 08 14:22:09 crc kubenswrapper[4735]: I1008 14:22:09.651778 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca290902-5f0a-4309-8a8e-ce058661f43b-kube-api-access-hftbb" (OuterVolumeSpecName: "kube-api-access-hftbb") pod "ca290902-5f0a-4309-8a8e-ce058661f43b" (UID: "ca290902-5f0a-4309-8a8e-ce058661f43b"). InnerVolumeSpecName "kube-api-access-hftbb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:22:09 crc kubenswrapper[4735]: I1008 14:22:09.747467 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hftbb\" (UniqueName: \"kubernetes.io/projected/ca290902-5f0a-4309-8a8e-ce058661f43b-kube-api-access-hftbb\") on node \"crc\" DevicePath \"\"" Oct 08 14:22:10 crc kubenswrapper[4735]: I1008 14:22:10.212552 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-5j8pg" event={"ID":"ca290902-5f0a-4309-8a8e-ce058661f43b","Type":"ContainerDied","Data":"abc8602847c44bb4be3f46e2aa6d7f4f619917207c1e809a74a47433d4f82266"} Oct 08 14:22:10 crc kubenswrapper[4735]: I1008 14:22:10.212591 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="abc8602847c44bb4be3f46e2aa6d7f4f619917207c1e809a74a47433d4f82266" Oct 08 14:22:10 crc kubenswrapper[4735]: I1008 14:22:10.212589 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-5j8pg" Oct 08 14:22:17 crc kubenswrapper[4735]: I1008 14:22:17.179340 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-02b6-account-create-l9ft7"] Oct 08 14:22:17 crc kubenswrapper[4735]: E1008 14:22:17.181736 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca290902-5f0a-4309-8a8e-ce058661f43b" containerName="mariadb-database-create" Oct 08 14:22:17 crc kubenswrapper[4735]: I1008 14:22:17.181757 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca290902-5f0a-4309-8a8e-ce058661f43b" containerName="mariadb-database-create" Oct 08 14:22:17 crc kubenswrapper[4735]: I1008 14:22:17.181958 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca290902-5f0a-4309-8a8e-ce058661f43b" containerName="mariadb-database-create" Oct 08 14:22:17 crc kubenswrapper[4735]: I1008 14:22:17.182562 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-02b6-account-create-l9ft7" Oct 08 14:22:17 crc kubenswrapper[4735]: I1008 14:22:17.184636 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Oct 08 14:22:17 crc kubenswrapper[4735]: I1008 14:22:17.187908 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-02b6-account-create-l9ft7"] Oct 08 14:22:17 crc kubenswrapper[4735]: I1008 14:22:17.303488 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbf4c\" (UniqueName: \"kubernetes.io/projected/87efdaa7-f0fd-4b41-9bd8-0a95aa9dce0a-kube-api-access-cbf4c\") pod \"keystone-02b6-account-create-l9ft7\" (UID: \"87efdaa7-f0fd-4b41-9bd8-0a95aa9dce0a\") " pod="openstack/keystone-02b6-account-create-l9ft7" Oct 08 14:22:17 crc kubenswrapper[4735]: I1008 14:22:17.405397 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbf4c\" (UniqueName: \"kubernetes.io/projected/87efdaa7-f0fd-4b41-9bd8-0a95aa9dce0a-kube-api-access-cbf4c\") pod \"keystone-02b6-account-create-l9ft7\" (UID: \"87efdaa7-f0fd-4b41-9bd8-0a95aa9dce0a\") " pod="openstack/keystone-02b6-account-create-l9ft7" Oct 08 14:22:17 crc kubenswrapper[4735]: I1008 14:22:17.424853 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbf4c\" (UniqueName: \"kubernetes.io/projected/87efdaa7-f0fd-4b41-9bd8-0a95aa9dce0a-kube-api-access-cbf4c\") pod \"keystone-02b6-account-create-l9ft7\" (UID: \"87efdaa7-f0fd-4b41-9bd8-0a95aa9dce0a\") " pod="openstack/keystone-02b6-account-create-l9ft7" Oct 08 14:22:17 crc kubenswrapper[4735]: I1008 14:22:17.500613 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-02b6-account-create-l9ft7" Oct 08 14:22:17 crc kubenswrapper[4735]: I1008 14:22:17.617702 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Oct 08 14:22:17 crc kubenswrapper[4735]: I1008 14:22:17.819540 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-02b6-account-create-l9ft7"] Oct 08 14:22:17 crc kubenswrapper[4735]: W1008 14:22:17.821215 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87efdaa7_f0fd_4b41_9bd8_0a95aa9dce0a.slice/crio-e33b25c960e9f9364a71f41799fd20a11551acb546535eb783a88e0669e452ff WatchSource:0}: Error finding container e33b25c960e9f9364a71f41799fd20a11551acb546535eb783a88e0669e452ff: Status 404 returned error can't find the container with id e33b25c960e9f9364a71f41799fd20a11551acb546535eb783a88e0669e452ff Oct 08 14:22:18 crc kubenswrapper[4735]: I1008 14:22:18.278162 4735 generic.go:334] "Generic (PLEG): container finished" podID="87efdaa7-f0fd-4b41-9bd8-0a95aa9dce0a" containerID="2f7f2090dcdea4d809fea56bb2f4782fce17d783c9044840f899cc1a4468d737" exitCode=0 Oct 08 14:22:18 crc kubenswrapper[4735]: I1008 14:22:18.278210 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-02b6-account-create-l9ft7" event={"ID":"87efdaa7-f0fd-4b41-9bd8-0a95aa9dce0a","Type":"ContainerDied","Data":"2f7f2090dcdea4d809fea56bb2f4782fce17d783c9044840f899cc1a4468d737"} Oct 08 14:22:18 crc kubenswrapper[4735]: I1008 14:22:18.278234 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-02b6-account-create-l9ft7" event={"ID":"87efdaa7-f0fd-4b41-9bd8-0a95aa9dce0a","Type":"ContainerStarted","Data":"e33b25c960e9f9364a71f41799fd20a11551acb546535eb783a88e0669e452ff"} Oct 08 14:22:19 crc kubenswrapper[4735]: I1008 14:22:19.648193 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-02b6-account-create-l9ft7" Oct 08 14:22:19 crc kubenswrapper[4735]: I1008 14:22:19.757506 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbf4c\" (UniqueName: \"kubernetes.io/projected/87efdaa7-f0fd-4b41-9bd8-0a95aa9dce0a-kube-api-access-cbf4c\") pod \"87efdaa7-f0fd-4b41-9bd8-0a95aa9dce0a\" (UID: \"87efdaa7-f0fd-4b41-9bd8-0a95aa9dce0a\") " Oct 08 14:22:19 crc kubenswrapper[4735]: I1008 14:22:19.769288 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87efdaa7-f0fd-4b41-9bd8-0a95aa9dce0a-kube-api-access-cbf4c" (OuterVolumeSpecName: "kube-api-access-cbf4c") pod "87efdaa7-f0fd-4b41-9bd8-0a95aa9dce0a" (UID: "87efdaa7-f0fd-4b41-9bd8-0a95aa9dce0a"). InnerVolumeSpecName "kube-api-access-cbf4c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:22:19 crc kubenswrapper[4735]: I1008 14:22:19.858817 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbf4c\" (UniqueName: \"kubernetes.io/projected/87efdaa7-f0fd-4b41-9bd8-0a95aa9dce0a-kube-api-access-cbf4c\") on node \"crc\" DevicePath \"\"" Oct 08 14:22:20 crc kubenswrapper[4735]: I1008 14:22:20.295267 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-02b6-account-create-l9ft7" event={"ID":"87efdaa7-f0fd-4b41-9bd8-0a95aa9dce0a","Type":"ContainerDied","Data":"e33b25c960e9f9364a71f41799fd20a11551acb546535eb783a88e0669e452ff"} Oct 08 14:22:20 crc kubenswrapper[4735]: I1008 14:22:20.295624 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e33b25c960e9f9364a71f41799fd20a11551acb546535eb783a88e0669e452ff" Oct 08 14:22:20 crc kubenswrapper[4735]: I1008 14:22:20.295313 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-02b6-account-create-l9ft7" Oct 08 14:22:22 crc kubenswrapper[4735]: I1008 14:22:22.635773 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-2x86g"] Oct 08 14:22:22 crc kubenswrapper[4735]: E1008 14:22:22.637350 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87efdaa7-f0fd-4b41-9bd8-0a95aa9dce0a" containerName="mariadb-account-create" Oct 08 14:22:22 crc kubenswrapper[4735]: I1008 14:22:22.637444 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="87efdaa7-f0fd-4b41-9bd8-0a95aa9dce0a" containerName="mariadb-account-create" Oct 08 14:22:22 crc kubenswrapper[4735]: I1008 14:22:22.637723 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="87efdaa7-f0fd-4b41-9bd8-0a95aa9dce0a" containerName="mariadb-account-create" Oct 08 14:22:22 crc kubenswrapper[4735]: I1008 14:22:22.638473 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-2x86g" Oct 08 14:22:22 crc kubenswrapper[4735]: I1008 14:22:22.644800 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 08 14:22:22 crc kubenswrapper[4735]: I1008 14:22:22.645011 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 08 14:22:22 crc kubenswrapper[4735]: I1008 14:22:22.645171 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-92s88" Oct 08 14:22:22 crc kubenswrapper[4735]: I1008 14:22:22.646312 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-2x86g"] Oct 08 14:22:22 crc kubenswrapper[4735]: I1008 14:22:22.647066 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 08 14:22:22 crc kubenswrapper[4735]: I1008 14:22:22.808893 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e5b0450-418e-4c96-8cda-c8086a3d6981-config-data\") pod \"keystone-db-sync-2x86g\" (UID: \"4e5b0450-418e-4c96-8cda-c8086a3d6981\") " pod="openstack/keystone-db-sync-2x86g" Oct 08 14:22:22 crc kubenswrapper[4735]: I1008 14:22:22.808998 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e5b0450-418e-4c96-8cda-c8086a3d6981-combined-ca-bundle\") pod \"keystone-db-sync-2x86g\" (UID: \"4e5b0450-418e-4c96-8cda-c8086a3d6981\") " pod="openstack/keystone-db-sync-2x86g" Oct 08 14:22:22 crc kubenswrapper[4735]: I1008 14:22:22.809094 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcgz6\" (UniqueName: \"kubernetes.io/projected/4e5b0450-418e-4c96-8cda-c8086a3d6981-kube-api-access-zcgz6\") pod \"keystone-db-sync-2x86g\" (UID: \"4e5b0450-418e-4c96-8cda-c8086a3d6981\") " pod="openstack/keystone-db-sync-2x86g" Oct 08 14:22:22 crc kubenswrapper[4735]: I1008 14:22:22.910548 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e5b0450-418e-4c96-8cda-c8086a3d6981-config-data\") pod \"keystone-db-sync-2x86g\" (UID: \"4e5b0450-418e-4c96-8cda-c8086a3d6981\") " pod="openstack/keystone-db-sync-2x86g" Oct 08 14:22:22 crc kubenswrapper[4735]: I1008 14:22:22.910628 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e5b0450-418e-4c96-8cda-c8086a3d6981-combined-ca-bundle\") pod \"keystone-db-sync-2x86g\" (UID: \"4e5b0450-418e-4c96-8cda-c8086a3d6981\") " pod="openstack/keystone-db-sync-2x86g" Oct 08 14:22:22 crc kubenswrapper[4735]: I1008 14:22:22.910694 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcgz6\" (UniqueName: 
\"kubernetes.io/projected/4e5b0450-418e-4c96-8cda-c8086a3d6981-kube-api-access-zcgz6\") pod \"keystone-db-sync-2x86g\" (UID: \"4e5b0450-418e-4c96-8cda-c8086a3d6981\") " pod="openstack/keystone-db-sync-2x86g" Oct 08 14:22:22 crc kubenswrapper[4735]: I1008 14:22:22.918121 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e5b0450-418e-4c96-8cda-c8086a3d6981-combined-ca-bundle\") pod \"keystone-db-sync-2x86g\" (UID: \"4e5b0450-418e-4c96-8cda-c8086a3d6981\") " pod="openstack/keystone-db-sync-2x86g" Oct 08 14:22:22 crc kubenswrapper[4735]: I1008 14:22:22.918362 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e5b0450-418e-4c96-8cda-c8086a3d6981-config-data\") pod \"keystone-db-sync-2x86g\" (UID: \"4e5b0450-418e-4c96-8cda-c8086a3d6981\") " pod="openstack/keystone-db-sync-2x86g" Oct 08 14:22:22 crc kubenswrapper[4735]: I1008 14:22:22.930428 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcgz6\" (UniqueName: \"kubernetes.io/projected/4e5b0450-418e-4c96-8cda-c8086a3d6981-kube-api-access-zcgz6\") pod \"keystone-db-sync-2x86g\" (UID: \"4e5b0450-418e-4c96-8cda-c8086a3d6981\") " pod="openstack/keystone-db-sync-2x86g" Oct 08 14:22:22 crc kubenswrapper[4735]: I1008 14:22:22.954703 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-2x86g" Oct 08 14:22:23 crc kubenswrapper[4735]: I1008 14:22:23.395656 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-2x86g"] Oct 08 14:22:24 crc kubenswrapper[4735]: I1008 14:22:24.329940 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-2x86g" event={"ID":"4e5b0450-418e-4c96-8cda-c8086a3d6981","Type":"ContainerStarted","Data":"6089c013b279a7aaeb6b15cb8b1757b367fb3cad8b0c694b7e57bcdad4ae6a58"} Oct 08 14:22:24 crc kubenswrapper[4735]: I1008 14:22:24.330310 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-2x86g" event={"ID":"4e5b0450-418e-4c96-8cda-c8086a3d6981","Type":"ContainerStarted","Data":"22ac15dc5400036ac7358a3549ff0d54572e26a54bf805d889863c16fdc9ac1e"} Oct 08 14:22:24 crc kubenswrapper[4735]: I1008 14:22:24.347910 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-2x86g" podStartSLOduration=2.347892682 podStartE2EDuration="2.347892682s" podCreationTimestamp="2025-10-08 14:22:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:22:24.346142827 +0000 UTC m=+5194.154447371" watchObservedRunningTime="2025-10-08 14:22:24.347892682 +0000 UTC m=+5194.156197226" Oct 08 14:22:25 crc kubenswrapper[4735]: I1008 14:22:25.344184 4735 generic.go:334] "Generic (PLEG): container finished" podID="4e5b0450-418e-4c96-8cda-c8086a3d6981" containerID="6089c013b279a7aaeb6b15cb8b1757b367fb3cad8b0c694b7e57bcdad4ae6a58" exitCode=0 Oct 08 14:22:25 crc kubenswrapper[4735]: I1008 14:22:25.344291 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-2x86g" event={"ID":"4e5b0450-418e-4c96-8cda-c8086a3d6981","Type":"ContainerDied","Data":"6089c013b279a7aaeb6b15cb8b1757b367fb3cad8b0c694b7e57bcdad4ae6a58"} Oct 08 14:22:26 crc 
kubenswrapper[4735]: I1008 14:22:26.699650 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-2x86g" Oct 08 14:22:26 crc kubenswrapper[4735]: I1008 14:22:26.778446 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e5b0450-418e-4c96-8cda-c8086a3d6981-combined-ca-bundle\") pod \"4e5b0450-418e-4c96-8cda-c8086a3d6981\" (UID: \"4e5b0450-418e-4c96-8cda-c8086a3d6981\") " Oct 08 14:22:26 crc kubenswrapper[4735]: I1008 14:22:26.778506 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e5b0450-418e-4c96-8cda-c8086a3d6981-config-data\") pod \"4e5b0450-418e-4c96-8cda-c8086a3d6981\" (UID: \"4e5b0450-418e-4c96-8cda-c8086a3d6981\") " Oct 08 14:22:26 crc kubenswrapper[4735]: I1008 14:22:26.778600 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zcgz6\" (UniqueName: \"kubernetes.io/projected/4e5b0450-418e-4c96-8cda-c8086a3d6981-kube-api-access-zcgz6\") pod \"4e5b0450-418e-4c96-8cda-c8086a3d6981\" (UID: \"4e5b0450-418e-4c96-8cda-c8086a3d6981\") " Oct 08 14:22:26 crc kubenswrapper[4735]: I1008 14:22:26.788648 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e5b0450-418e-4c96-8cda-c8086a3d6981-kube-api-access-zcgz6" (OuterVolumeSpecName: "kube-api-access-zcgz6") pod "4e5b0450-418e-4c96-8cda-c8086a3d6981" (UID: "4e5b0450-418e-4c96-8cda-c8086a3d6981"). InnerVolumeSpecName "kube-api-access-zcgz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:22:26 crc kubenswrapper[4735]: I1008 14:22:26.801722 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e5b0450-418e-4c96-8cda-c8086a3d6981-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4e5b0450-418e-4c96-8cda-c8086a3d6981" (UID: "4e5b0450-418e-4c96-8cda-c8086a3d6981"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:22:26 crc kubenswrapper[4735]: I1008 14:22:26.816370 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e5b0450-418e-4c96-8cda-c8086a3d6981-config-data" (OuterVolumeSpecName: "config-data") pod "4e5b0450-418e-4c96-8cda-c8086a3d6981" (UID: "4e5b0450-418e-4c96-8cda-c8086a3d6981"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:22:26 crc kubenswrapper[4735]: I1008 14:22:26.879398 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zcgz6\" (UniqueName: \"kubernetes.io/projected/4e5b0450-418e-4c96-8cda-c8086a3d6981-kube-api-access-zcgz6\") on node \"crc\" DevicePath \"\"" Oct 08 14:22:26 crc kubenswrapper[4735]: I1008 14:22:26.879428 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e5b0450-418e-4c96-8cda-c8086a3d6981-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 14:22:26 crc kubenswrapper[4735]: I1008 14:22:26.879438 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e5b0450-418e-4c96-8cda-c8086a3d6981-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 14:22:27 crc kubenswrapper[4735]: I1008 14:22:27.372988 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-2x86g" 
event={"ID":"4e5b0450-418e-4c96-8cda-c8086a3d6981","Type":"ContainerDied","Data":"22ac15dc5400036ac7358a3549ff0d54572e26a54bf805d889863c16fdc9ac1e"} Oct 08 14:22:27 crc kubenswrapper[4735]: I1008 14:22:27.373358 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22ac15dc5400036ac7358a3549ff0d54572e26a54bf805d889863c16fdc9ac1e" Oct 08 14:22:27 crc kubenswrapper[4735]: I1008 14:22:27.373112 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-2x86g" Oct 08 14:22:27 crc kubenswrapper[4735]: I1008 14:22:27.601045 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b44f7b875-5n8db"] Oct 08 14:22:27 crc kubenswrapper[4735]: E1008 14:22:27.601702 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e5b0450-418e-4c96-8cda-c8086a3d6981" containerName="keystone-db-sync" Oct 08 14:22:27 crc kubenswrapper[4735]: I1008 14:22:27.602304 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e5b0450-418e-4c96-8cda-c8086a3d6981" containerName="keystone-db-sync" Oct 08 14:22:27 crc kubenswrapper[4735]: I1008 14:22:27.602646 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e5b0450-418e-4c96-8cda-c8086a3d6981" containerName="keystone-db-sync" Oct 08 14:22:27 crc kubenswrapper[4735]: I1008 14:22:27.603867 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b44f7b875-5n8db" Oct 08 14:22:27 crc kubenswrapper[4735]: I1008 14:22:27.620837 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b44f7b875-5n8db"] Oct 08 14:22:27 crc kubenswrapper[4735]: I1008 14:22:27.653066 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-7j229"] Oct 08 14:22:27 crc kubenswrapper[4735]: I1008 14:22:27.654031 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-7j229" Oct 08 14:22:27 crc kubenswrapper[4735]: I1008 14:22:27.656215 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 08 14:22:27 crc kubenswrapper[4735]: I1008 14:22:27.656414 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 08 14:22:27 crc kubenswrapper[4735]: I1008 14:22:27.656567 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 08 14:22:27 crc kubenswrapper[4735]: I1008 14:22:27.657559 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-92s88" Oct 08 14:22:27 crc kubenswrapper[4735]: I1008 14:22:27.689611 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-7j229"] Oct 08 14:22:27 crc kubenswrapper[4735]: I1008 14:22:27.693431 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8b2e7a09-4a86-4f4b-b0d6-9611284faaca-ovsdbserver-nb\") pod \"dnsmasq-dns-5b44f7b875-5n8db\" (UID: \"8b2e7a09-4a86-4f4b-b0d6-9611284faaca\") " pod="openstack/dnsmasq-dns-5b44f7b875-5n8db" Oct 08 14:22:27 crc kubenswrapper[4735]: I1008 14:22:27.693631 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b2e7a09-4a86-4f4b-b0d6-9611284faaca-config\") pod \"dnsmasq-dns-5b44f7b875-5n8db\" (UID: \"8b2e7a09-4a86-4f4b-b0d6-9611284faaca\") " pod="openstack/dnsmasq-dns-5b44f7b875-5n8db" Oct 08 14:22:27 crc kubenswrapper[4735]: I1008 14:22:27.693721 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8b2e7a09-4a86-4f4b-b0d6-9611284faaca-dns-svc\") pod \"dnsmasq-dns-5b44f7b875-5n8db\" (UID: 
\"8b2e7a09-4a86-4f4b-b0d6-9611284faaca\") " pod="openstack/dnsmasq-dns-5b44f7b875-5n8db" Oct 08 14:22:27 crc kubenswrapper[4735]: I1008 14:22:27.693749 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8b2e7a09-4a86-4f4b-b0d6-9611284faaca-ovsdbserver-sb\") pod \"dnsmasq-dns-5b44f7b875-5n8db\" (UID: \"8b2e7a09-4a86-4f4b-b0d6-9611284faaca\") " pod="openstack/dnsmasq-dns-5b44f7b875-5n8db" Oct 08 14:22:27 crc kubenswrapper[4735]: I1008 14:22:27.693773 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqqk7\" (UniqueName: \"kubernetes.io/projected/8b2e7a09-4a86-4f4b-b0d6-9611284faaca-kube-api-access-jqqk7\") pod \"dnsmasq-dns-5b44f7b875-5n8db\" (UID: \"8b2e7a09-4a86-4f4b-b0d6-9611284faaca\") " pod="openstack/dnsmasq-dns-5b44f7b875-5n8db" Oct 08 14:22:27 crc kubenswrapper[4735]: I1008 14:22:27.795021 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5eb66afd-88a8-4436-b307-7be4b08f8e97-credential-keys\") pod \"keystone-bootstrap-7j229\" (UID: \"5eb66afd-88a8-4436-b307-7be4b08f8e97\") " pod="openstack/keystone-bootstrap-7j229" Oct 08 14:22:27 crc kubenswrapper[4735]: I1008 14:22:27.795158 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8b2e7a09-4a86-4f4b-b0d6-9611284faaca-dns-svc\") pod \"dnsmasq-dns-5b44f7b875-5n8db\" (UID: \"8b2e7a09-4a86-4f4b-b0d6-9611284faaca\") " pod="openstack/dnsmasq-dns-5b44f7b875-5n8db" Oct 08 14:22:27 crc kubenswrapper[4735]: I1008 14:22:27.795184 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8b2e7a09-4a86-4f4b-b0d6-9611284faaca-ovsdbserver-sb\") pod \"dnsmasq-dns-5b44f7b875-5n8db\" (UID: 
\"8b2e7a09-4a86-4f4b-b0d6-9611284faaca\") " pod="openstack/dnsmasq-dns-5b44f7b875-5n8db" Oct 08 14:22:27 crc kubenswrapper[4735]: I1008 14:22:27.795207 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqqk7\" (UniqueName: \"kubernetes.io/projected/8b2e7a09-4a86-4f4b-b0d6-9611284faaca-kube-api-access-jqqk7\") pod \"dnsmasq-dns-5b44f7b875-5n8db\" (UID: \"8b2e7a09-4a86-4f4b-b0d6-9611284faaca\") " pod="openstack/dnsmasq-dns-5b44f7b875-5n8db" Oct 08 14:22:27 crc kubenswrapper[4735]: I1008 14:22:27.795230 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shcj5\" (UniqueName: \"kubernetes.io/projected/5eb66afd-88a8-4436-b307-7be4b08f8e97-kube-api-access-shcj5\") pod \"keystone-bootstrap-7j229\" (UID: \"5eb66afd-88a8-4436-b307-7be4b08f8e97\") " pod="openstack/keystone-bootstrap-7j229" Oct 08 14:22:27 crc kubenswrapper[4735]: I1008 14:22:27.795259 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8b2e7a09-4a86-4f4b-b0d6-9611284faaca-ovsdbserver-nb\") pod \"dnsmasq-dns-5b44f7b875-5n8db\" (UID: \"8b2e7a09-4a86-4f4b-b0d6-9611284faaca\") " pod="openstack/dnsmasq-dns-5b44f7b875-5n8db" Oct 08 14:22:27 crc kubenswrapper[4735]: I1008 14:22:27.795276 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5eb66afd-88a8-4436-b307-7be4b08f8e97-scripts\") pod \"keystone-bootstrap-7j229\" (UID: \"5eb66afd-88a8-4436-b307-7be4b08f8e97\") " pod="openstack/keystone-bootstrap-7j229" Oct 08 14:22:27 crc kubenswrapper[4735]: I1008 14:22:27.795300 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5eb66afd-88a8-4436-b307-7be4b08f8e97-config-data\") pod \"keystone-bootstrap-7j229\" (UID: 
\"5eb66afd-88a8-4436-b307-7be4b08f8e97\") " pod="openstack/keystone-bootstrap-7j229" Oct 08 14:22:27 crc kubenswrapper[4735]: I1008 14:22:27.795316 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5eb66afd-88a8-4436-b307-7be4b08f8e97-fernet-keys\") pod \"keystone-bootstrap-7j229\" (UID: \"5eb66afd-88a8-4436-b307-7be4b08f8e97\") " pod="openstack/keystone-bootstrap-7j229" Oct 08 14:22:27 crc kubenswrapper[4735]: I1008 14:22:27.795350 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5eb66afd-88a8-4436-b307-7be4b08f8e97-combined-ca-bundle\") pod \"keystone-bootstrap-7j229\" (UID: \"5eb66afd-88a8-4436-b307-7be4b08f8e97\") " pod="openstack/keystone-bootstrap-7j229" Oct 08 14:22:27 crc kubenswrapper[4735]: I1008 14:22:27.795369 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b2e7a09-4a86-4f4b-b0d6-9611284faaca-config\") pod \"dnsmasq-dns-5b44f7b875-5n8db\" (UID: \"8b2e7a09-4a86-4f4b-b0d6-9611284faaca\") " pod="openstack/dnsmasq-dns-5b44f7b875-5n8db" Oct 08 14:22:27 crc kubenswrapper[4735]: I1008 14:22:27.796365 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b2e7a09-4a86-4f4b-b0d6-9611284faaca-config\") pod \"dnsmasq-dns-5b44f7b875-5n8db\" (UID: \"8b2e7a09-4a86-4f4b-b0d6-9611284faaca\") " pod="openstack/dnsmasq-dns-5b44f7b875-5n8db" Oct 08 14:22:27 crc kubenswrapper[4735]: I1008 14:22:27.796912 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8b2e7a09-4a86-4f4b-b0d6-9611284faaca-dns-svc\") pod \"dnsmasq-dns-5b44f7b875-5n8db\" (UID: \"8b2e7a09-4a86-4f4b-b0d6-9611284faaca\") " pod="openstack/dnsmasq-dns-5b44f7b875-5n8db" Oct 08 
14:22:27 crc kubenswrapper[4735]: I1008 14:22:27.797415 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8b2e7a09-4a86-4f4b-b0d6-9611284faaca-ovsdbserver-sb\") pod \"dnsmasq-dns-5b44f7b875-5n8db\" (UID: \"8b2e7a09-4a86-4f4b-b0d6-9611284faaca\") " pod="openstack/dnsmasq-dns-5b44f7b875-5n8db" Oct 08 14:22:27 crc kubenswrapper[4735]: I1008 14:22:27.798258 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8b2e7a09-4a86-4f4b-b0d6-9611284faaca-ovsdbserver-nb\") pod \"dnsmasq-dns-5b44f7b875-5n8db\" (UID: \"8b2e7a09-4a86-4f4b-b0d6-9611284faaca\") " pod="openstack/dnsmasq-dns-5b44f7b875-5n8db" Oct 08 14:22:27 crc kubenswrapper[4735]: I1008 14:22:27.818184 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqqk7\" (UniqueName: \"kubernetes.io/projected/8b2e7a09-4a86-4f4b-b0d6-9611284faaca-kube-api-access-jqqk7\") pod \"dnsmasq-dns-5b44f7b875-5n8db\" (UID: \"8b2e7a09-4a86-4f4b-b0d6-9611284faaca\") " pod="openstack/dnsmasq-dns-5b44f7b875-5n8db" Oct 08 14:22:27 crc kubenswrapper[4735]: I1008 14:22:27.897122 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shcj5\" (UniqueName: \"kubernetes.io/projected/5eb66afd-88a8-4436-b307-7be4b08f8e97-kube-api-access-shcj5\") pod \"keystone-bootstrap-7j229\" (UID: \"5eb66afd-88a8-4436-b307-7be4b08f8e97\") " pod="openstack/keystone-bootstrap-7j229" Oct 08 14:22:27 crc kubenswrapper[4735]: I1008 14:22:27.897221 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5eb66afd-88a8-4436-b307-7be4b08f8e97-scripts\") pod \"keystone-bootstrap-7j229\" (UID: \"5eb66afd-88a8-4436-b307-7be4b08f8e97\") " pod="openstack/keystone-bootstrap-7j229" Oct 08 14:22:27 crc kubenswrapper[4735]: I1008 14:22:27.897274 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5eb66afd-88a8-4436-b307-7be4b08f8e97-config-data\") pod \"keystone-bootstrap-7j229\" (UID: \"5eb66afd-88a8-4436-b307-7be4b08f8e97\") " pod="openstack/keystone-bootstrap-7j229" Oct 08 14:22:27 crc kubenswrapper[4735]: I1008 14:22:27.897302 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5eb66afd-88a8-4436-b307-7be4b08f8e97-fernet-keys\") pod \"keystone-bootstrap-7j229\" (UID: \"5eb66afd-88a8-4436-b307-7be4b08f8e97\") " pod="openstack/keystone-bootstrap-7j229" Oct 08 14:22:27 crc kubenswrapper[4735]: I1008 14:22:27.897365 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5eb66afd-88a8-4436-b307-7be4b08f8e97-combined-ca-bundle\") pod \"keystone-bootstrap-7j229\" (UID: \"5eb66afd-88a8-4436-b307-7be4b08f8e97\") " pod="openstack/keystone-bootstrap-7j229" Oct 08 14:22:27 crc kubenswrapper[4735]: I1008 14:22:27.897411 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5eb66afd-88a8-4436-b307-7be4b08f8e97-credential-keys\") pod \"keystone-bootstrap-7j229\" (UID: \"5eb66afd-88a8-4436-b307-7be4b08f8e97\") " pod="openstack/keystone-bootstrap-7j229" Oct 08 14:22:27 crc kubenswrapper[4735]: I1008 14:22:27.902372 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5eb66afd-88a8-4436-b307-7be4b08f8e97-fernet-keys\") pod \"keystone-bootstrap-7j229\" (UID: \"5eb66afd-88a8-4436-b307-7be4b08f8e97\") " pod="openstack/keystone-bootstrap-7j229" Oct 08 14:22:27 crc kubenswrapper[4735]: I1008 14:22:27.904827 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/5eb66afd-88a8-4436-b307-7be4b08f8e97-scripts\") pod \"keystone-bootstrap-7j229\" (UID: \"5eb66afd-88a8-4436-b307-7be4b08f8e97\") " pod="openstack/keystone-bootstrap-7j229" Oct 08 14:22:27 crc kubenswrapper[4735]: I1008 14:22:27.905514 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5eb66afd-88a8-4436-b307-7be4b08f8e97-combined-ca-bundle\") pod \"keystone-bootstrap-7j229\" (UID: \"5eb66afd-88a8-4436-b307-7be4b08f8e97\") " pod="openstack/keystone-bootstrap-7j229" Oct 08 14:22:27 crc kubenswrapper[4735]: I1008 14:22:27.906130 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5eb66afd-88a8-4436-b307-7be4b08f8e97-credential-keys\") pod \"keystone-bootstrap-7j229\" (UID: \"5eb66afd-88a8-4436-b307-7be4b08f8e97\") " pod="openstack/keystone-bootstrap-7j229" Oct 08 14:22:27 crc kubenswrapper[4735]: I1008 14:22:27.908053 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5eb66afd-88a8-4436-b307-7be4b08f8e97-config-data\") pod \"keystone-bootstrap-7j229\" (UID: \"5eb66afd-88a8-4436-b307-7be4b08f8e97\") " pod="openstack/keystone-bootstrap-7j229" Oct 08 14:22:27 crc kubenswrapper[4735]: I1008 14:22:27.916993 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shcj5\" (UniqueName: \"kubernetes.io/projected/5eb66afd-88a8-4436-b307-7be4b08f8e97-kube-api-access-shcj5\") pod \"keystone-bootstrap-7j229\" (UID: \"5eb66afd-88a8-4436-b307-7be4b08f8e97\") " pod="openstack/keystone-bootstrap-7j229" Oct 08 14:22:27 crc kubenswrapper[4735]: I1008 14:22:27.933933 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b44f7b875-5n8db" Oct 08 14:22:27 crc kubenswrapper[4735]: I1008 14:22:27.971379 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-7j229" Oct 08 14:22:28 crc kubenswrapper[4735]: I1008 14:22:28.463824 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b44f7b875-5n8db"] Oct 08 14:22:28 crc kubenswrapper[4735]: W1008 14:22:28.465275 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b2e7a09_4a86_4f4b_b0d6_9611284faaca.slice/crio-c395bbe3db1ddbae7fb9c90ba7788e55d0c2d0aac278708fa36153c12efb4067 WatchSource:0}: Error finding container c395bbe3db1ddbae7fb9c90ba7788e55d0c2d0aac278708fa36153c12efb4067: Status 404 returned error can't find the container with id c395bbe3db1ddbae7fb9c90ba7788e55d0c2d0aac278708fa36153c12efb4067 Oct 08 14:22:28 crc kubenswrapper[4735]: W1008 14:22:28.465416 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5eb66afd_88a8_4436_b307_7be4b08f8e97.slice/crio-80f61008e9c380e998f2821a1c099da035639bbfc32fdaf79582603c65018d03 WatchSource:0}: Error finding container 80f61008e9c380e998f2821a1c099da035639bbfc32fdaf79582603c65018d03: Status 404 returned error can't find the container with id 80f61008e9c380e998f2821a1c099da035639bbfc32fdaf79582603c65018d03 Oct 08 14:22:28 crc kubenswrapper[4735]: I1008 14:22:28.469933 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-7j229"] Oct 08 14:22:29 crc kubenswrapper[4735]: I1008 14:22:29.389646 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7j229" event={"ID":"5eb66afd-88a8-4436-b307-7be4b08f8e97","Type":"ContainerStarted","Data":"e4611f988abc1ce086cc4ac477391d8ab29b6b87fb214bae70fb6d991cdaf321"} Oct 08 14:22:29 crc kubenswrapper[4735]: I1008 14:22:29.390023 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7j229" 
event={"ID":"5eb66afd-88a8-4436-b307-7be4b08f8e97","Type":"ContainerStarted","Data":"80f61008e9c380e998f2821a1c099da035639bbfc32fdaf79582603c65018d03"} Oct 08 14:22:29 crc kubenswrapper[4735]: I1008 14:22:29.393085 4735 generic.go:334] "Generic (PLEG): container finished" podID="8b2e7a09-4a86-4f4b-b0d6-9611284faaca" containerID="c013178bb7d6c448b65afc7594c9fe14dfee395542fe7307d13cff538a917934" exitCode=0 Oct 08 14:22:29 crc kubenswrapper[4735]: I1008 14:22:29.393114 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b44f7b875-5n8db" event={"ID":"8b2e7a09-4a86-4f4b-b0d6-9611284faaca","Type":"ContainerDied","Data":"c013178bb7d6c448b65afc7594c9fe14dfee395542fe7307d13cff538a917934"} Oct 08 14:22:29 crc kubenswrapper[4735]: I1008 14:22:29.393131 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b44f7b875-5n8db" event={"ID":"8b2e7a09-4a86-4f4b-b0d6-9611284faaca","Type":"ContainerStarted","Data":"c395bbe3db1ddbae7fb9c90ba7788e55d0c2d0aac278708fa36153c12efb4067"} Oct 08 14:22:29 crc kubenswrapper[4735]: I1008 14:22:29.416315 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-7j229" podStartSLOduration=2.416301201 podStartE2EDuration="2.416301201s" podCreationTimestamp="2025-10-08 14:22:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:22:29.412688829 +0000 UTC m=+5199.220993383" watchObservedRunningTime="2025-10-08 14:22:29.416301201 +0000 UTC m=+5199.224605735" Oct 08 14:22:30 crc kubenswrapper[4735]: I1008 14:22:30.403271 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b44f7b875-5n8db" event={"ID":"8b2e7a09-4a86-4f4b-b0d6-9611284faaca","Type":"ContainerStarted","Data":"62f718a1ae614b628fd716a84d297856ce4cdae04cd5c92fba6be937fc144b52"} Oct 08 14:22:30 crc kubenswrapper[4735]: I1008 14:22:30.403566 4735 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b44f7b875-5n8db" Oct 08 14:22:30 crc kubenswrapper[4735]: I1008 14:22:30.419058 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b44f7b875-5n8db" podStartSLOduration=3.419041369 podStartE2EDuration="3.419041369s" podCreationTimestamp="2025-10-08 14:22:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:22:30.417818078 +0000 UTC m=+5200.226122622" watchObservedRunningTime="2025-10-08 14:22:30.419041369 +0000 UTC m=+5200.227345913" Oct 08 14:22:32 crc kubenswrapper[4735]: I1008 14:22:32.422656 4735 generic.go:334] "Generic (PLEG): container finished" podID="5eb66afd-88a8-4436-b307-7be4b08f8e97" containerID="e4611f988abc1ce086cc4ac477391d8ab29b6b87fb214bae70fb6d991cdaf321" exitCode=0 Oct 08 14:22:32 crc kubenswrapper[4735]: I1008 14:22:32.422767 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7j229" event={"ID":"5eb66afd-88a8-4436-b307-7be4b08f8e97","Type":"ContainerDied","Data":"e4611f988abc1ce086cc4ac477391d8ab29b6b87fb214bae70fb6d991cdaf321"} Oct 08 14:22:33 crc kubenswrapper[4735]: I1008 14:22:33.743755 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-7j229" Oct 08 14:22:33 crc kubenswrapper[4735]: I1008 14:22:33.901044 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5eb66afd-88a8-4436-b307-7be4b08f8e97-scripts\") pod \"5eb66afd-88a8-4436-b307-7be4b08f8e97\" (UID: \"5eb66afd-88a8-4436-b307-7be4b08f8e97\") " Oct 08 14:22:33 crc kubenswrapper[4735]: I1008 14:22:33.901123 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5eb66afd-88a8-4436-b307-7be4b08f8e97-credential-keys\") pod \"5eb66afd-88a8-4436-b307-7be4b08f8e97\" (UID: \"5eb66afd-88a8-4436-b307-7be4b08f8e97\") " Oct 08 14:22:33 crc kubenswrapper[4735]: I1008 14:22:33.901148 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5eb66afd-88a8-4436-b307-7be4b08f8e97-fernet-keys\") pod \"5eb66afd-88a8-4436-b307-7be4b08f8e97\" (UID: \"5eb66afd-88a8-4436-b307-7be4b08f8e97\") " Oct 08 14:22:33 crc kubenswrapper[4735]: I1008 14:22:33.901202 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shcj5\" (UniqueName: \"kubernetes.io/projected/5eb66afd-88a8-4436-b307-7be4b08f8e97-kube-api-access-shcj5\") pod \"5eb66afd-88a8-4436-b307-7be4b08f8e97\" (UID: \"5eb66afd-88a8-4436-b307-7be4b08f8e97\") " Oct 08 14:22:33 crc kubenswrapper[4735]: I1008 14:22:33.901295 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5eb66afd-88a8-4436-b307-7be4b08f8e97-config-data\") pod \"5eb66afd-88a8-4436-b307-7be4b08f8e97\" (UID: \"5eb66afd-88a8-4436-b307-7be4b08f8e97\") " Oct 08 14:22:33 crc kubenswrapper[4735]: I1008 14:22:33.901332 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5eb66afd-88a8-4436-b307-7be4b08f8e97-combined-ca-bundle\") pod \"5eb66afd-88a8-4436-b307-7be4b08f8e97\" (UID: \"5eb66afd-88a8-4436-b307-7be4b08f8e97\") " Oct 08 14:22:33 crc kubenswrapper[4735]: I1008 14:22:33.907203 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5eb66afd-88a8-4436-b307-7be4b08f8e97-scripts" (OuterVolumeSpecName: "scripts") pod "5eb66afd-88a8-4436-b307-7be4b08f8e97" (UID: "5eb66afd-88a8-4436-b307-7be4b08f8e97"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:22:33 crc kubenswrapper[4735]: I1008 14:22:33.907394 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5eb66afd-88a8-4436-b307-7be4b08f8e97-kube-api-access-shcj5" (OuterVolumeSpecName: "kube-api-access-shcj5") pod "5eb66afd-88a8-4436-b307-7be4b08f8e97" (UID: "5eb66afd-88a8-4436-b307-7be4b08f8e97"). InnerVolumeSpecName "kube-api-access-shcj5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:22:33 crc kubenswrapper[4735]: I1008 14:22:33.908428 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5eb66afd-88a8-4436-b307-7be4b08f8e97-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "5eb66afd-88a8-4436-b307-7be4b08f8e97" (UID: "5eb66afd-88a8-4436-b307-7be4b08f8e97"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:22:33 crc kubenswrapper[4735]: I1008 14:22:33.911628 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5eb66afd-88a8-4436-b307-7be4b08f8e97-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "5eb66afd-88a8-4436-b307-7be4b08f8e97" (UID: "5eb66afd-88a8-4436-b307-7be4b08f8e97"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:22:33 crc kubenswrapper[4735]: I1008 14:22:33.928158 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5eb66afd-88a8-4436-b307-7be4b08f8e97-config-data" (OuterVolumeSpecName: "config-data") pod "5eb66afd-88a8-4436-b307-7be4b08f8e97" (UID: "5eb66afd-88a8-4436-b307-7be4b08f8e97"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:22:33 crc kubenswrapper[4735]: I1008 14:22:33.928717 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5eb66afd-88a8-4436-b307-7be4b08f8e97-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5eb66afd-88a8-4436-b307-7be4b08f8e97" (UID: "5eb66afd-88a8-4436-b307-7be4b08f8e97"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:22:34 crc kubenswrapper[4735]: I1008 14:22:34.003521 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5eb66afd-88a8-4436-b307-7be4b08f8e97-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 14:22:34 crc kubenswrapper[4735]: I1008 14:22:34.003883 4735 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5eb66afd-88a8-4436-b307-7be4b08f8e97-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 08 14:22:34 crc kubenswrapper[4735]: I1008 14:22:34.003958 4735 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5eb66afd-88a8-4436-b307-7be4b08f8e97-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 08 14:22:34 crc kubenswrapper[4735]: I1008 14:22:34.004049 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shcj5\" (UniqueName: \"kubernetes.io/projected/5eb66afd-88a8-4436-b307-7be4b08f8e97-kube-api-access-shcj5\") on node \"crc\" DevicePath \"\"" Oct 08 
14:22:34 crc kubenswrapper[4735]: I1008 14:22:34.004126 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5eb66afd-88a8-4436-b307-7be4b08f8e97-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 14:22:34 crc kubenswrapper[4735]: I1008 14:22:34.004195 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5eb66afd-88a8-4436-b307-7be4b08f8e97-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 14:22:34 crc kubenswrapper[4735]: I1008 14:22:34.439965 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7j229" event={"ID":"5eb66afd-88a8-4436-b307-7be4b08f8e97","Type":"ContainerDied","Data":"80f61008e9c380e998f2821a1c099da035639bbfc32fdaf79582603c65018d03"} Oct 08 14:22:34 crc kubenswrapper[4735]: I1008 14:22:34.440299 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="80f61008e9c380e998f2821a1c099da035639bbfc32fdaf79582603c65018d03" Oct 08 14:22:34 crc kubenswrapper[4735]: I1008 14:22:34.440212 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-7j229" Oct 08 14:22:34 crc kubenswrapper[4735]: I1008 14:22:34.513685 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-7j229"] Oct 08 14:22:34 crc kubenswrapper[4735]: I1008 14:22:34.523932 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-7j229"] Oct 08 14:22:34 crc kubenswrapper[4735]: I1008 14:22:34.604122 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-wl5lz"] Oct 08 14:22:34 crc kubenswrapper[4735]: E1008 14:22:34.605491 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5eb66afd-88a8-4436-b307-7be4b08f8e97" containerName="keystone-bootstrap" Oct 08 14:22:34 crc kubenswrapper[4735]: I1008 14:22:34.605514 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eb66afd-88a8-4436-b307-7be4b08f8e97" containerName="keystone-bootstrap" Oct 08 14:22:34 crc kubenswrapper[4735]: I1008 14:22:34.605710 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="5eb66afd-88a8-4436-b307-7be4b08f8e97" containerName="keystone-bootstrap" Oct 08 14:22:34 crc kubenswrapper[4735]: I1008 14:22:34.606274 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-wl5lz" Oct 08 14:22:34 crc kubenswrapper[4735]: I1008 14:22:34.609179 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-92s88" Oct 08 14:22:34 crc kubenswrapper[4735]: I1008 14:22:34.609328 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 08 14:22:34 crc kubenswrapper[4735]: I1008 14:22:34.609595 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 08 14:22:34 crc kubenswrapper[4735]: I1008 14:22:34.609768 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 08 14:22:34 crc kubenswrapper[4735]: I1008 14:22:34.620238 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-wl5lz"] Oct 08 14:22:34 crc kubenswrapper[4735]: I1008 14:22:34.715301 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0714d73-4ad0-4b09-a5e6-9890133d3915-config-data\") pod \"keystone-bootstrap-wl5lz\" (UID: \"e0714d73-4ad0-4b09-a5e6-9890133d3915\") " pod="openstack/keystone-bootstrap-wl5lz" Oct 08 14:22:34 crc kubenswrapper[4735]: I1008 14:22:34.715882 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7spb\" (UniqueName: \"kubernetes.io/projected/e0714d73-4ad0-4b09-a5e6-9890133d3915-kube-api-access-z7spb\") pod \"keystone-bootstrap-wl5lz\" (UID: \"e0714d73-4ad0-4b09-a5e6-9890133d3915\") " pod="openstack/keystone-bootstrap-wl5lz" Oct 08 14:22:34 crc kubenswrapper[4735]: I1008 14:22:34.716130 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0714d73-4ad0-4b09-a5e6-9890133d3915-scripts\") pod \"keystone-bootstrap-wl5lz\" (UID: 
\"e0714d73-4ad0-4b09-a5e6-9890133d3915\") " pod="openstack/keystone-bootstrap-wl5lz" Oct 08 14:22:34 crc kubenswrapper[4735]: I1008 14:22:34.716279 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0714d73-4ad0-4b09-a5e6-9890133d3915-combined-ca-bundle\") pod \"keystone-bootstrap-wl5lz\" (UID: \"e0714d73-4ad0-4b09-a5e6-9890133d3915\") " pod="openstack/keystone-bootstrap-wl5lz" Oct 08 14:22:34 crc kubenswrapper[4735]: I1008 14:22:34.716422 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e0714d73-4ad0-4b09-a5e6-9890133d3915-credential-keys\") pod \"keystone-bootstrap-wl5lz\" (UID: \"e0714d73-4ad0-4b09-a5e6-9890133d3915\") " pod="openstack/keystone-bootstrap-wl5lz" Oct 08 14:22:34 crc kubenswrapper[4735]: I1008 14:22:34.716644 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e0714d73-4ad0-4b09-a5e6-9890133d3915-fernet-keys\") pod \"keystone-bootstrap-wl5lz\" (UID: \"e0714d73-4ad0-4b09-a5e6-9890133d3915\") " pod="openstack/keystone-bootstrap-wl5lz" Oct 08 14:22:34 crc kubenswrapper[4735]: I1008 14:22:34.818052 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0714d73-4ad0-4b09-a5e6-9890133d3915-config-data\") pod \"keystone-bootstrap-wl5lz\" (UID: \"e0714d73-4ad0-4b09-a5e6-9890133d3915\") " pod="openstack/keystone-bootstrap-wl5lz" Oct 08 14:22:34 crc kubenswrapper[4735]: I1008 14:22:34.818717 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7spb\" (UniqueName: \"kubernetes.io/projected/e0714d73-4ad0-4b09-a5e6-9890133d3915-kube-api-access-z7spb\") pod \"keystone-bootstrap-wl5lz\" (UID: 
\"e0714d73-4ad0-4b09-a5e6-9890133d3915\") " pod="openstack/keystone-bootstrap-wl5lz" Oct 08 14:22:34 crc kubenswrapper[4735]: I1008 14:22:34.818790 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0714d73-4ad0-4b09-a5e6-9890133d3915-scripts\") pod \"keystone-bootstrap-wl5lz\" (UID: \"e0714d73-4ad0-4b09-a5e6-9890133d3915\") " pod="openstack/keystone-bootstrap-wl5lz" Oct 08 14:22:34 crc kubenswrapper[4735]: I1008 14:22:34.818815 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0714d73-4ad0-4b09-a5e6-9890133d3915-combined-ca-bundle\") pod \"keystone-bootstrap-wl5lz\" (UID: \"e0714d73-4ad0-4b09-a5e6-9890133d3915\") " pod="openstack/keystone-bootstrap-wl5lz" Oct 08 14:22:34 crc kubenswrapper[4735]: I1008 14:22:34.818846 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e0714d73-4ad0-4b09-a5e6-9890133d3915-credential-keys\") pod \"keystone-bootstrap-wl5lz\" (UID: \"e0714d73-4ad0-4b09-a5e6-9890133d3915\") " pod="openstack/keystone-bootstrap-wl5lz" Oct 08 14:22:34 crc kubenswrapper[4735]: I1008 14:22:34.818945 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e0714d73-4ad0-4b09-a5e6-9890133d3915-fernet-keys\") pod \"keystone-bootstrap-wl5lz\" (UID: \"e0714d73-4ad0-4b09-a5e6-9890133d3915\") " pod="openstack/keystone-bootstrap-wl5lz" Oct 08 14:22:34 crc kubenswrapper[4735]: I1008 14:22:34.824030 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e0714d73-4ad0-4b09-a5e6-9890133d3915-fernet-keys\") pod \"keystone-bootstrap-wl5lz\" (UID: \"e0714d73-4ad0-4b09-a5e6-9890133d3915\") " pod="openstack/keystone-bootstrap-wl5lz" Oct 08 14:22:34 crc kubenswrapper[4735]: I1008 
14:22:34.824357 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0714d73-4ad0-4b09-a5e6-9890133d3915-combined-ca-bundle\") pod \"keystone-bootstrap-wl5lz\" (UID: \"e0714d73-4ad0-4b09-a5e6-9890133d3915\") " pod="openstack/keystone-bootstrap-wl5lz" Oct 08 14:22:34 crc kubenswrapper[4735]: I1008 14:22:34.824889 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e0714d73-4ad0-4b09-a5e6-9890133d3915-credential-keys\") pod \"keystone-bootstrap-wl5lz\" (UID: \"e0714d73-4ad0-4b09-a5e6-9890133d3915\") " pod="openstack/keystone-bootstrap-wl5lz" Oct 08 14:22:34 crc kubenswrapper[4735]: I1008 14:22:34.825007 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0714d73-4ad0-4b09-a5e6-9890133d3915-scripts\") pod \"keystone-bootstrap-wl5lz\" (UID: \"e0714d73-4ad0-4b09-a5e6-9890133d3915\") " pod="openstack/keystone-bootstrap-wl5lz" Oct 08 14:22:34 crc kubenswrapper[4735]: I1008 14:22:34.825401 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0714d73-4ad0-4b09-a5e6-9890133d3915-config-data\") pod \"keystone-bootstrap-wl5lz\" (UID: \"e0714d73-4ad0-4b09-a5e6-9890133d3915\") " pod="openstack/keystone-bootstrap-wl5lz" Oct 08 14:22:34 crc kubenswrapper[4735]: I1008 14:22:34.836618 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7spb\" (UniqueName: \"kubernetes.io/projected/e0714d73-4ad0-4b09-a5e6-9890133d3915-kube-api-access-z7spb\") pod \"keystone-bootstrap-wl5lz\" (UID: \"e0714d73-4ad0-4b09-a5e6-9890133d3915\") " pod="openstack/keystone-bootstrap-wl5lz" Oct 08 14:22:34 crc kubenswrapper[4735]: I1008 14:22:34.925208 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5eb66afd-88a8-4436-b307-7be4b08f8e97" 
path="/var/lib/kubelet/pods/5eb66afd-88a8-4436-b307-7be4b08f8e97/volumes" Oct 08 14:22:34 crc kubenswrapper[4735]: I1008 14:22:34.925886 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-wl5lz" Oct 08 14:22:35 crc kubenswrapper[4735]: I1008 14:22:35.334178 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-wl5lz"] Oct 08 14:22:35 crc kubenswrapper[4735]: I1008 14:22:35.449122 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wl5lz" event={"ID":"e0714d73-4ad0-4b09-a5e6-9890133d3915","Type":"ContainerStarted","Data":"fce4e0fe7aae77c56490a447a9dcb01af9c7500b1ff8617853cb06abdb89f7db"} Oct 08 14:22:36 crc kubenswrapper[4735]: I1008 14:22:36.462280 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wl5lz" event={"ID":"e0714d73-4ad0-4b09-a5e6-9890133d3915","Type":"ContainerStarted","Data":"a854b918e0f96f331795a3a94ed44712126896647447a0e726c430bb8469b6be"} Oct 08 14:22:36 crc kubenswrapper[4735]: I1008 14:22:36.485269 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-wl5lz" podStartSLOduration=2.48525117 podStartE2EDuration="2.48525117s" podCreationTimestamp="2025-10-08 14:22:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:22:36.476709661 +0000 UTC m=+5206.285014215" watchObservedRunningTime="2025-10-08 14:22:36.48525117 +0000 UTC m=+5206.293555714" Oct 08 14:22:37 crc kubenswrapper[4735]: I1008 14:22:37.935695 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b44f7b875-5n8db" Oct 08 14:22:38 crc kubenswrapper[4735]: I1008 14:22:38.002023 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fd7df4955-5hbtd"] Oct 08 14:22:38 crc kubenswrapper[4735]: I1008 
14:22:38.002263 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-fd7df4955-5hbtd" podUID="8da7ac61-6a48-4698-95b4-66fc7a6411b4" containerName="dnsmasq-dns" containerID="cri-o://9bb175713c4d9d6b792e2733a9fecfe2c2b546792b6b14bd9709fbd579ff9706" gracePeriod=10 Oct 08 14:22:38 crc kubenswrapper[4735]: I1008 14:22:38.477737 4735 generic.go:334] "Generic (PLEG): container finished" podID="8da7ac61-6a48-4698-95b4-66fc7a6411b4" containerID="9bb175713c4d9d6b792e2733a9fecfe2c2b546792b6b14bd9709fbd579ff9706" exitCode=0 Oct 08 14:22:38 crc kubenswrapper[4735]: I1008 14:22:38.477807 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fd7df4955-5hbtd" event={"ID":"8da7ac61-6a48-4698-95b4-66fc7a6411b4","Type":"ContainerDied","Data":"9bb175713c4d9d6b792e2733a9fecfe2c2b546792b6b14bd9709fbd579ff9706"} Oct 08 14:22:38 crc kubenswrapper[4735]: I1008 14:22:38.478065 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fd7df4955-5hbtd" event={"ID":"8da7ac61-6a48-4698-95b4-66fc7a6411b4","Type":"ContainerDied","Data":"3d8d11e6b2254c2f65bafdd7c67b7ab2ac466ce652e070b8d2263b0c14a4771f"} Oct 08 14:22:38 crc kubenswrapper[4735]: I1008 14:22:38.478087 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d8d11e6b2254c2f65bafdd7c67b7ab2ac466ce652e070b8d2263b0c14a4771f" Oct 08 14:22:38 crc kubenswrapper[4735]: I1008 14:22:38.480850 4735 generic.go:334] "Generic (PLEG): container finished" podID="e0714d73-4ad0-4b09-a5e6-9890133d3915" containerID="a854b918e0f96f331795a3a94ed44712126896647447a0e726c430bb8469b6be" exitCode=0 Oct 08 14:22:38 crc kubenswrapper[4735]: I1008 14:22:38.480883 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wl5lz" event={"ID":"e0714d73-4ad0-4b09-a5e6-9890133d3915","Type":"ContainerDied","Data":"a854b918e0f96f331795a3a94ed44712126896647447a0e726c430bb8469b6be"} Oct 08 14:22:38 crc 
kubenswrapper[4735]: I1008 14:22:38.502575 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fd7df4955-5hbtd" Oct 08 14:22:38 crc kubenswrapper[4735]: I1008 14:22:38.653148 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8da7ac61-6a48-4698-95b4-66fc7a6411b4-dns-svc\") pod \"8da7ac61-6a48-4698-95b4-66fc7a6411b4\" (UID: \"8da7ac61-6a48-4698-95b4-66fc7a6411b4\") " Oct 08 14:22:38 crc kubenswrapper[4735]: I1008 14:22:38.653212 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jvf8\" (UniqueName: \"kubernetes.io/projected/8da7ac61-6a48-4698-95b4-66fc7a6411b4-kube-api-access-2jvf8\") pod \"8da7ac61-6a48-4698-95b4-66fc7a6411b4\" (UID: \"8da7ac61-6a48-4698-95b4-66fc7a6411b4\") " Oct 08 14:22:38 crc kubenswrapper[4735]: I1008 14:22:38.653255 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8da7ac61-6a48-4698-95b4-66fc7a6411b4-ovsdbserver-sb\") pod \"8da7ac61-6a48-4698-95b4-66fc7a6411b4\" (UID: \"8da7ac61-6a48-4698-95b4-66fc7a6411b4\") " Oct 08 14:22:38 crc kubenswrapper[4735]: I1008 14:22:38.653276 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8da7ac61-6a48-4698-95b4-66fc7a6411b4-config\") pod \"8da7ac61-6a48-4698-95b4-66fc7a6411b4\" (UID: \"8da7ac61-6a48-4698-95b4-66fc7a6411b4\") " Oct 08 14:22:38 crc kubenswrapper[4735]: I1008 14:22:38.653390 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8da7ac61-6a48-4698-95b4-66fc7a6411b4-ovsdbserver-nb\") pod \"8da7ac61-6a48-4698-95b4-66fc7a6411b4\" (UID: \"8da7ac61-6a48-4698-95b4-66fc7a6411b4\") " Oct 08 14:22:38 crc kubenswrapper[4735]: I1008 14:22:38.665773 4735 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8da7ac61-6a48-4698-95b4-66fc7a6411b4-kube-api-access-2jvf8" (OuterVolumeSpecName: "kube-api-access-2jvf8") pod "8da7ac61-6a48-4698-95b4-66fc7a6411b4" (UID: "8da7ac61-6a48-4698-95b4-66fc7a6411b4"). InnerVolumeSpecName "kube-api-access-2jvf8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:22:38 crc kubenswrapper[4735]: I1008 14:22:38.693657 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8da7ac61-6a48-4698-95b4-66fc7a6411b4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8da7ac61-6a48-4698-95b4-66fc7a6411b4" (UID: "8da7ac61-6a48-4698-95b4-66fc7a6411b4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:22:38 crc kubenswrapper[4735]: I1008 14:22:38.694422 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8da7ac61-6a48-4698-95b4-66fc7a6411b4-config" (OuterVolumeSpecName: "config") pod "8da7ac61-6a48-4698-95b4-66fc7a6411b4" (UID: "8da7ac61-6a48-4698-95b4-66fc7a6411b4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:22:38 crc kubenswrapper[4735]: I1008 14:22:38.694837 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8da7ac61-6a48-4698-95b4-66fc7a6411b4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8da7ac61-6a48-4698-95b4-66fc7a6411b4" (UID: "8da7ac61-6a48-4698-95b4-66fc7a6411b4"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:22:38 crc kubenswrapper[4735]: I1008 14:22:38.697075 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8da7ac61-6a48-4698-95b4-66fc7a6411b4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8da7ac61-6a48-4698-95b4-66fc7a6411b4" (UID: "8da7ac61-6a48-4698-95b4-66fc7a6411b4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:22:38 crc kubenswrapper[4735]: I1008 14:22:38.757774 4735 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8da7ac61-6a48-4698-95b4-66fc7a6411b4-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 14:22:38 crc kubenswrapper[4735]: I1008 14:22:38.757815 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jvf8\" (UniqueName: \"kubernetes.io/projected/8da7ac61-6a48-4698-95b4-66fc7a6411b4-kube-api-access-2jvf8\") on node \"crc\" DevicePath \"\"" Oct 08 14:22:38 crc kubenswrapper[4735]: I1008 14:22:38.757830 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8da7ac61-6a48-4698-95b4-66fc7a6411b4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 14:22:38 crc kubenswrapper[4735]: I1008 14:22:38.757843 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8da7ac61-6a48-4698-95b4-66fc7a6411b4-config\") on node \"crc\" DevicePath \"\"" Oct 08 14:22:38 crc kubenswrapper[4735]: I1008 14:22:38.757856 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8da7ac61-6a48-4698-95b4-66fc7a6411b4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 14:22:39 crc kubenswrapper[4735]: I1008 14:22:39.488039 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fd7df4955-5hbtd" Oct 08 14:22:39 crc kubenswrapper[4735]: I1008 14:22:39.526881 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fd7df4955-5hbtd"] Oct 08 14:22:39 crc kubenswrapper[4735]: I1008 14:22:39.539129 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-fd7df4955-5hbtd"] Oct 08 14:22:39 crc kubenswrapper[4735]: I1008 14:22:39.791858 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-wl5lz" Oct 08 14:22:39 crc kubenswrapper[4735]: I1008 14:22:39.875243 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0714d73-4ad0-4b09-a5e6-9890133d3915-config-data\") pod \"e0714d73-4ad0-4b09-a5e6-9890133d3915\" (UID: \"e0714d73-4ad0-4b09-a5e6-9890133d3915\") " Oct 08 14:22:39 crc kubenswrapper[4735]: I1008 14:22:39.875330 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e0714d73-4ad0-4b09-a5e6-9890133d3915-fernet-keys\") pod \"e0714d73-4ad0-4b09-a5e6-9890133d3915\" (UID: \"e0714d73-4ad0-4b09-a5e6-9890133d3915\") " Oct 08 14:22:39 crc kubenswrapper[4735]: I1008 14:22:39.875364 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7spb\" (UniqueName: \"kubernetes.io/projected/e0714d73-4ad0-4b09-a5e6-9890133d3915-kube-api-access-z7spb\") pod \"e0714d73-4ad0-4b09-a5e6-9890133d3915\" (UID: \"e0714d73-4ad0-4b09-a5e6-9890133d3915\") " Oct 08 14:22:39 crc kubenswrapper[4735]: I1008 14:22:39.875416 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0714d73-4ad0-4b09-a5e6-9890133d3915-combined-ca-bundle\") pod \"e0714d73-4ad0-4b09-a5e6-9890133d3915\" (UID: \"e0714d73-4ad0-4b09-a5e6-9890133d3915\") " 
Oct 08 14:22:39 crc kubenswrapper[4735]: I1008 14:22:39.875466 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e0714d73-4ad0-4b09-a5e6-9890133d3915-credential-keys\") pod \"e0714d73-4ad0-4b09-a5e6-9890133d3915\" (UID: \"e0714d73-4ad0-4b09-a5e6-9890133d3915\") " Oct 08 14:22:39 crc kubenswrapper[4735]: I1008 14:22:39.875628 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0714d73-4ad0-4b09-a5e6-9890133d3915-scripts\") pod \"e0714d73-4ad0-4b09-a5e6-9890133d3915\" (UID: \"e0714d73-4ad0-4b09-a5e6-9890133d3915\") " Oct 08 14:22:39 crc kubenswrapper[4735]: I1008 14:22:39.879931 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0714d73-4ad0-4b09-a5e6-9890133d3915-scripts" (OuterVolumeSpecName: "scripts") pod "e0714d73-4ad0-4b09-a5e6-9890133d3915" (UID: "e0714d73-4ad0-4b09-a5e6-9890133d3915"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:22:39 crc kubenswrapper[4735]: I1008 14:22:39.880236 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0714d73-4ad0-4b09-a5e6-9890133d3915-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "e0714d73-4ad0-4b09-a5e6-9890133d3915" (UID: "e0714d73-4ad0-4b09-a5e6-9890133d3915"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:22:39 crc kubenswrapper[4735]: I1008 14:22:39.880550 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0714d73-4ad0-4b09-a5e6-9890133d3915-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "e0714d73-4ad0-4b09-a5e6-9890133d3915" (UID: "e0714d73-4ad0-4b09-a5e6-9890133d3915"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:22:39 crc kubenswrapper[4735]: I1008 14:22:39.881973 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0714d73-4ad0-4b09-a5e6-9890133d3915-kube-api-access-z7spb" (OuterVolumeSpecName: "kube-api-access-z7spb") pod "e0714d73-4ad0-4b09-a5e6-9890133d3915" (UID: "e0714d73-4ad0-4b09-a5e6-9890133d3915"). InnerVolumeSpecName "kube-api-access-z7spb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:22:39 crc kubenswrapper[4735]: I1008 14:22:39.896773 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0714d73-4ad0-4b09-a5e6-9890133d3915-config-data" (OuterVolumeSpecName: "config-data") pod "e0714d73-4ad0-4b09-a5e6-9890133d3915" (UID: "e0714d73-4ad0-4b09-a5e6-9890133d3915"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:22:39 crc kubenswrapper[4735]: I1008 14:22:39.901022 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0714d73-4ad0-4b09-a5e6-9890133d3915-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e0714d73-4ad0-4b09-a5e6-9890133d3915" (UID: "e0714d73-4ad0-4b09-a5e6-9890133d3915"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:22:39 crc kubenswrapper[4735]: I1008 14:22:39.978798 4735 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e0714d73-4ad0-4b09-a5e6-9890133d3915-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 08 14:22:39 crc kubenswrapper[4735]: I1008 14:22:39.978843 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0714d73-4ad0-4b09-a5e6-9890133d3915-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 14:22:39 crc kubenswrapper[4735]: I1008 14:22:39.978877 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0714d73-4ad0-4b09-a5e6-9890133d3915-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 14:22:39 crc kubenswrapper[4735]: I1008 14:22:39.978889 4735 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e0714d73-4ad0-4b09-a5e6-9890133d3915-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 08 14:22:39 crc kubenswrapper[4735]: I1008 14:22:39.978905 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7spb\" (UniqueName: \"kubernetes.io/projected/e0714d73-4ad0-4b09-a5e6-9890133d3915-kube-api-access-z7spb\") on node \"crc\" DevicePath \"\"" Oct 08 14:22:39 crc kubenswrapper[4735]: I1008 14:22:39.978925 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0714d73-4ad0-4b09-a5e6-9890133d3915-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 14:22:40 crc kubenswrapper[4735]: I1008 14:22:40.497111 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wl5lz" event={"ID":"e0714d73-4ad0-4b09-a5e6-9890133d3915","Type":"ContainerDied","Data":"fce4e0fe7aae77c56490a447a9dcb01af9c7500b1ff8617853cb06abdb89f7db"} Oct 08 14:22:40 crc kubenswrapper[4735]: I1008 
14:22:40.497387 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fce4e0fe7aae77c56490a447a9dcb01af9c7500b1ff8617853cb06abdb89f7db" Oct 08 14:22:40 crc kubenswrapper[4735]: I1008 14:22:40.497266 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-wl5lz" Oct 08 14:22:40 crc kubenswrapper[4735]: I1008 14:22:40.675610 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-9c68cdbbf-77rxj"] Oct 08 14:22:40 crc kubenswrapper[4735]: E1008 14:22:40.676002 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8da7ac61-6a48-4698-95b4-66fc7a6411b4" containerName="init" Oct 08 14:22:40 crc kubenswrapper[4735]: I1008 14:22:40.676024 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="8da7ac61-6a48-4698-95b4-66fc7a6411b4" containerName="init" Oct 08 14:22:40 crc kubenswrapper[4735]: E1008 14:22:40.676043 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0714d73-4ad0-4b09-a5e6-9890133d3915" containerName="keystone-bootstrap" Oct 08 14:22:40 crc kubenswrapper[4735]: I1008 14:22:40.676054 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0714d73-4ad0-4b09-a5e6-9890133d3915" containerName="keystone-bootstrap" Oct 08 14:22:40 crc kubenswrapper[4735]: E1008 14:22:40.676083 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8da7ac61-6a48-4698-95b4-66fc7a6411b4" containerName="dnsmasq-dns" Oct 08 14:22:40 crc kubenswrapper[4735]: I1008 14:22:40.676092 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="8da7ac61-6a48-4698-95b4-66fc7a6411b4" containerName="dnsmasq-dns" Oct 08 14:22:40 crc kubenswrapper[4735]: I1008 14:22:40.676305 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="8da7ac61-6a48-4698-95b4-66fc7a6411b4" containerName="dnsmasq-dns" Oct 08 14:22:40 crc kubenswrapper[4735]: I1008 14:22:40.676322 4735 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="e0714d73-4ad0-4b09-a5e6-9890133d3915" containerName="keystone-bootstrap" Oct 08 14:22:40 crc kubenswrapper[4735]: I1008 14:22:40.677078 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-9c68cdbbf-77rxj" Oct 08 14:22:40 crc kubenswrapper[4735]: I1008 14:22:40.680260 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 08 14:22:40 crc kubenswrapper[4735]: I1008 14:22:40.680265 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 08 14:22:40 crc kubenswrapper[4735]: I1008 14:22:40.680348 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 08 14:22:40 crc kubenswrapper[4735]: I1008 14:22:40.680436 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-92s88" Oct 08 14:22:40 crc kubenswrapper[4735]: I1008 14:22:40.690846 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdbpf\" (UniqueName: \"kubernetes.io/projected/8e627d7b-3491-4f1e-8257-df71e34472c2-kube-api-access-pdbpf\") pod \"keystone-9c68cdbbf-77rxj\" (UID: \"8e627d7b-3491-4f1e-8257-df71e34472c2\") " pod="openstack/keystone-9c68cdbbf-77rxj" Oct 08 14:22:40 crc kubenswrapper[4735]: I1008 14:22:40.690909 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8e627d7b-3491-4f1e-8257-df71e34472c2-credential-keys\") pod \"keystone-9c68cdbbf-77rxj\" (UID: \"8e627d7b-3491-4f1e-8257-df71e34472c2\") " pod="openstack/keystone-9c68cdbbf-77rxj" Oct 08 14:22:40 crc kubenswrapper[4735]: I1008 14:22:40.691017 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8e627d7b-3491-4f1e-8257-df71e34472c2-config-data\") pod \"keystone-9c68cdbbf-77rxj\" (UID: \"8e627d7b-3491-4f1e-8257-df71e34472c2\") " pod="openstack/keystone-9c68cdbbf-77rxj" Oct 08 14:22:40 crc kubenswrapper[4735]: I1008 14:22:40.691090 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e627d7b-3491-4f1e-8257-df71e34472c2-combined-ca-bundle\") pod \"keystone-9c68cdbbf-77rxj\" (UID: \"8e627d7b-3491-4f1e-8257-df71e34472c2\") " pod="openstack/keystone-9c68cdbbf-77rxj" Oct 08 14:22:40 crc kubenswrapper[4735]: I1008 14:22:40.691117 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8e627d7b-3491-4f1e-8257-df71e34472c2-fernet-keys\") pod \"keystone-9c68cdbbf-77rxj\" (UID: \"8e627d7b-3491-4f1e-8257-df71e34472c2\") " pod="openstack/keystone-9c68cdbbf-77rxj" Oct 08 14:22:40 crc kubenswrapper[4735]: I1008 14:22:40.691153 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e627d7b-3491-4f1e-8257-df71e34472c2-scripts\") pod \"keystone-9c68cdbbf-77rxj\" (UID: \"8e627d7b-3491-4f1e-8257-df71e34472c2\") " pod="openstack/keystone-9c68cdbbf-77rxj" Oct 08 14:22:40 crc kubenswrapper[4735]: I1008 14:22:40.698223 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-9c68cdbbf-77rxj"] Oct 08 14:22:40 crc kubenswrapper[4735]: I1008 14:22:40.792154 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e627d7b-3491-4f1e-8257-df71e34472c2-combined-ca-bundle\") pod \"keystone-9c68cdbbf-77rxj\" (UID: \"8e627d7b-3491-4f1e-8257-df71e34472c2\") " pod="openstack/keystone-9c68cdbbf-77rxj" Oct 08 14:22:40 crc kubenswrapper[4735]: I1008 14:22:40.792200 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8e627d7b-3491-4f1e-8257-df71e34472c2-fernet-keys\") pod \"keystone-9c68cdbbf-77rxj\" (UID: \"8e627d7b-3491-4f1e-8257-df71e34472c2\") " pod="openstack/keystone-9c68cdbbf-77rxj" Oct 08 14:22:40 crc kubenswrapper[4735]: I1008 14:22:40.792253 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e627d7b-3491-4f1e-8257-df71e34472c2-scripts\") pod \"keystone-9c68cdbbf-77rxj\" (UID: \"8e627d7b-3491-4f1e-8257-df71e34472c2\") " pod="openstack/keystone-9c68cdbbf-77rxj" Oct 08 14:22:40 crc kubenswrapper[4735]: I1008 14:22:40.792315 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdbpf\" (UniqueName: \"kubernetes.io/projected/8e627d7b-3491-4f1e-8257-df71e34472c2-kube-api-access-pdbpf\") pod \"keystone-9c68cdbbf-77rxj\" (UID: \"8e627d7b-3491-4f1e-8257-df71e34472c2\") " pod="openstack/keystone-9c68cdbbf-77rxj" Oct 08 14:22:40 crc kubenswrapper[4735]: I1008 14:22:40.792347 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8e627d7b-3491-4f1e-8257-df71e34472c2-credential-keys\") pod \"keystone-9c68cdbbf-77rxj\" (UID: \"8e627d7b-3491-4f1e-8257-df71e34472c2\") " pod="openstack/keystone-9c68cdbbf-77rxj" Oct 08 14:22:40 crc kubenswrapper[4735]: I1008 14:22:40.792406 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e627d7b-3491-4f1e-8257-df71e34472c2-config-data\") pod \"keystone-9c68cdbbf-77rxj\" (UID: \"8e627d7b-3491-4f1e-8257-df71e34472c2\") " pod="openstack/keystone-9c68cdbbf-77rxj" Oct 08 14:22:40 crc kubenswrapper[4735]: I1008 14:22:40.797050 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/8e627d7b-3491-4f1e-8257-df71e34472c2-fernet-keys\") pod \"keystone-9c68cdbbf-77rxj\" (UID: \"8e627d7b-3491-4f1e-8257-df71e34472c2\") " pod="openstack/keystone-9c68cdbbf-77rxj" Oct 08 14:22:40 crc kubenswrapper[4735]: I1008 14:22:40.797089 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e627d7b-3491-4f1e-8257-df71e34472c2-config-data\") pod \"keystone-9c68cdbbf-77rxj\" (UID: \"8e627d7b-3491-4f1e-8257-df71e34472c2\") " pod="openstack/keystone-9c68cdbbf-77rxj" Oct 08 14:22:40 crc kubenswrapper[4735]: I1008 14:22:40.797225 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8e627d7b-3491-4f1e-8257-df71e34472c2-credential-keys\") pod \"keystone-9c68cdbbf-77rxj\" (UID: \"8e627d7b-3491-4f1e-8257-df71e34472c2\") " pod="openstack/keystone-9c68cdbbf-77rxj" Oct 08 14:22:40 crc kubenswrapper[4735]: I1008 14:22:40.798271 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e627d7b-3491-4f1e-8257-df71e34472c2-scripts\") pod \"keystone-9c68cdbbf-77rxj\" (UID: \"8e627d7b-3491-4f1e-8257-df71e34472c2\") " pod="openstack/keystone-9c68cdbbf-77rxj" Oct 08 14:22:40 crc kubenswrapper[4735]: I1008 14:22:40.801409 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e627d7b-3491-4f1e-8257-df71e34472c2-combined-ca-bundle\") pod \"keystone-9c68cdbbf-77rxj\" (UID: \"8e627d7b-3491-4f1e-8257-df71e34472c2\") " pod="openstack/keystone-9c68cdbbf-77rxj" Oct 08 14:22:40 crc kubenswrapper[4735]: I1008 14:22:40.808791 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdbpf\" (UniqueName: \"kubernetes.io/projected/8e627d7b-3491-4f1e-8257-df71e34472c2-kube-api-access-pdbpf\") pod \"keystone-9c68cdbbf-77rxj\" (UID: 
\"8e627d7b-3491-4f1e-8257-df71e34472c2\") " pod="openstack/keystone-9c68cdbbf-77rxj" Oct 08 14:22:40 crc kubenswrapper[4735]: I1008 14:22:40.909013 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8da7ac61-6a48-4698-95b4-66fc7a6411b4" path="/var/lib/kubelet/pods/8da7ac61-6a48-4698-95b4-66fc7a6411b4/volumes" Oct 08 14:22:40 crc kubenswrapper[4735]: I1008 14:22:40.998178 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-9c68cdbbf-77rxj" Oct 08 14:22:41 crc kubenswrapper[4735]: I1008 14:22:41.409567 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-9c68cdbbf-77rxj"] Oct 08 14:22:41 crc kubenswrapper[4735]: I1008 14:22:41.506117 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-9c68cdbbf-77rxj" event={"ID":"8e627d7b-3491-4f1e-8257-df71e34472c2","Type":"ContainerStarted","Data":"7c89182aa8892b9bd4f5d8b41957421b11f88d1bead2e98d16a7160f4a49fa5b"} Oct 08 14:22:42 crc kubenswrapper[4735]: I1008 14:22:42.514838 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-9c68cdbbf-77rxj" event={"ID":"8e627d7b-3491-4f1e-8257-df71e34472c2","Type":"ContainerStarted","Data":"cccf221fb86acaf9c67ed3e7ba09c58d6dde4d4b2eb749ba80ee987e700e52e6"} Oct 08 14:22:42 crc kubenswrapper[4735]: I1008 14:22:42.515182 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-9c68cdbbf-77rxj" Oct 08 14:22:42 crc kubenswrapper[4735]: I1008 14:22:42.531513 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-9c68cdbbf-77rxj" podStartSLOduration=2.531492659 podStartE2EDuration="2.531492659s" podCreationTimestamp="2025-10-08 14:22:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:22:42.528036161 +0000 UTC m=+5212.336340705" watchObservedRunningTime="2025-10-08 
14:22:42.531492659 +0000 UTC m=+5212.339797203" Oct 08 14:22:43 crc kubenswrapper[4735]: I1008 14:22:43.431467 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-fd7df4955-5hbtd" podUID="8da7ac61-6a48-4698-95b4-66fc7a6411b4" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.10:5353: i/o timeout" Oct 08 14:22:54 crc kubenswrapper[4735]: I1008 14:22:54.878119 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 14:22:54 crc kubenswrapper[4735]: I1008 14:22:54.878752 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 14:23:08 crc kubenswrapper[4735]: I1008 14:23:08.051110 4735 scope.go:117] "RemoveContainer" containerID="eb4baaed5dd84fb5d77403d2b189778946b2017028be44318bd1b02c33c5d2e0" Oct 08 14:23:08 crc kubenswrapper[4735]: I1008 14:23:08.070956 4735 scope.go:117] "RemoveContainer" containerID="b8e3804282a936819b3b77811127b440386c29c03adb1a7a0869f338f775a374" Oct 08 14:23:08 crc kubenswrapper[4735]: I1008 14:23:08.109639 4735 scope.go:117] "RemoveContainer" containerID="bc71e6776f2fd9437e4b5b175132e76da479f28c56fedfd1da7ebdb3e5c4754d" Oct 08 14:23:12 crc kubenswrapper[4735]: I1008 14:23:12.503783 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-9c68cdbbf-77rxj" Oct 08 14:23:17 crc kubenswrapper[4735]: I1008 14:23:17.244094 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 08 14:23:17 crc 
kubenswrapper[4735]: I1008 14:23:17.245742 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 08 14:23:17 crc kubenswrapper[4735]: I1008 14:23:17.247749 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Oct 08 14:23:17 crc kubenswrapper[4735]: I1008 14:23:17.247952 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Oct 08 14:23:17 crc kubenswrapper[4735]: I1008 14:23:17.259230 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-v4rm7" Oct 08 14:23:17 crc kubenswrapper[4735]: I1008 14:23:17.263559 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 08 14:23:17 crc kubenswrapper[4735]: I1008 14:23:17.377614 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c5d16f46-7748-41b5-9dfc-dd0e983a562c-openstack-config\") pod \"openstackclient\" (UID: \"c5d16f46-7748-41b5-9dfc-dd0e983a562c\") " pod="openstack/openstackclient" Oct 08 14:23:17 crc kubenswrapper[4735]: I1008 14:23:17.377946 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c5d16f46-7748-41b5-9dfc-dd0e983a562c-openstack-config-secret\") pod \"openstackclient\" (UID: \"c5d16f46-7748-41b5-9dfc-dd0e983a562c\") " pod="openstack/openstackclient" Oct 08 14:23:17 crc kubenswrapper[4735]: I1008 14:23:17.377977 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfg5m\" (UniqueName: \"kubernetes.io/projected/c5d16f46-7748-41b5-9dfc-dd0e983a562c-kube-api-access-kfg5m\") pod \"openstackclient\" (UID: \"c5d16f46-7748-41b5-9dfc-dd0e983a562c\") " 
pod="openstack/openstackclient" Oct 08 14:23:17 crc kubenswrapper[4735]: I1008 14:23:17.479422 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c5d16f46-7748-41b5-9dfc-dd0e983a562c-openstack-config\") pod \"openstackclient\" (UID: \"c5d16f46-7748-41b5-9dfc-dd0e983a562c\") " pod="openstack/openstackclient" Oct 08 14:23:17 crc kubenswrapper[4735]: I1008 14:23:17.479567 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c5d16f46-7748-41b5-9dfc-dd0e983a562c-openstack-config-secret\") pod \"openstackclient\" (UID: \"c5d16f46-7748-41b5-9dfc-dd0e983a562c\") " pod="openstack/openstackclient" Oct 08 14:23:17 crc kubenswrapper[4735]: I1008 14:23:17.479596 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfg5m\" (UniqueName: \"kubernetes.io/projected/c5d16f46-7748-41b5-9dfc-dd0e983a562c-kube-api-access-kfg5m\") pod \"openstackclient\" (UID: \"c5d16f46-7748-41b5-9dfc-dd0e983a562c\") " pod="openstack/openstackclient" Oct 08 14:23:17 crc kubenswrapper[4735]: I1008 14:23:17.480453 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c5d16f46-7748-41b5-9dfc-dd0e983a562c-openstack-config\") pod \"openstackclient\" (UID: \"c5d16f46-7748-41b5-9dfc-dd0e983a562c\") " pod="openstack/openstackclient" Oct 08 14:23:17 crc kubenswrapper[4735]: I1008 14:23:17.486276 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c5d16f46-7748-41b5-9dfc-dd0e983a562c-openstack-config-secret\") pod \"openstackclient\" (UID: \"c5d16f46-7748-41b5-9dfc-dd0e983a562c\") " pod="openstack/openstackclient" Oct 08 14:23:17 crc kubenswrapper[4735]: I1008 14:23:17.498999 4735 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-kfg5m\" (UniqueName: \"kubernetes.io/projected/c5d16f46-7748-41b5-9dfc-dd0e983a562c-kube-api-access-kfg5m\") pod \"openstackclient\" (UID: \"c5d16f46-7748-41b5-9dfc-dd0e983a562c\") " pod="openstack/openstackclient" Oct 08 14:23:17 crc kubenswrapper[4735]: I1008 14:23:17.567375 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 08 14:23:17 crc kubenswrapper[4735]: I1008 14:23:17.803538 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 08 14:23:17 crc kubenswrapper[4735]: I1008 14:23:17.827585 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"c5d16f46-7748-41b5-9dfc-dd0e983a562c","Type":"ContainerStarted","Data":"a43ea7bd0348ee97a05d3f1c9bf373f7d3d3cea65e9e781476b1b2caf21599b2"} Oct 08 14:23:18 crc kubenswrapper[4735]: I1008 14:23:18.839649 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"c5d16f46-7748-41b5-9dfc-dd0e983a562c","Type":"ContainerStarted","Data":"cddff410c705487c416391d53f0cee7883e70b0b097eaee2a3f6c07b3205d901"} Oct 08 14:23:18 crc kubenswrapper[4735]: I1008 14:23:18.860503 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.8604660819999999 podStartE2EDuration="1.860466082s" podCreationTimestamp="2025-10-08 14:23:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:23:18.852348094 +0000 UTC m=+5248.660652638" watchObservedRunningTime="2025-10-08 14:23:18.860466082 +0000 UTC m=+5248.668770626" Oct 08 14:23:24 crc kubenswrapper[4735]: I1008 14:23:24.878148 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 14:23:24 crc kubenswrapper[4735]: I1008 14:23:24.878755 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 14:23:54 crc kubenswrapper[4735]: I1008 14:23:54.879126 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 14:23:54 crc kubenswrapper[4735]: I1008 14:23:54.879587 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 14:23:54 crc kubenswrapper[4735]: I1008 14:23:54.879631 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" Oct 08 14:23:54 crc kubenswrapper[4735]: I1008 14:23:54.880247 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d6c737b897aa463dba6310d9840b3f5b8e05389b1e2bc8c1a8b7b77133f4dc28"} pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 14:23:54 crc kubenswrapper[4735]: I1008 14:23:54.880298 4735 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" containerID="cri-o://d6c737b897aa463dba6310d9840b3f5b8e05389b1e2bc8c1a8b7b77133f4dc28" gracePeriod=600 Oct 08 14:23:55 crc kubenswrapper[4735]: E1008 14:23:55.014839 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:23:55 crc kubenswrapper[4735]: I1008 14:23:55.145232 4735 generic.go:334] "Generic (PLEG): container finished" podID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerID="d6c737b897aa463dba6310d9840b3f5b8e05389b1e2bc8c1a8b7b77133f4dc28" exitCode=0 Oct 08 14:23:55 crc kubenswrapper[4735]: I1008 14:23:55.145294 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" event={"ID":"2dfd0474-f64a-4d3b-9990-df182d079b10","Type":"ContainerDied","Data":"d6c737b897aa463dba6310d9840b3f5b8e05389b1e2bc8c1a8b7b77133f4dc28"} Oct 08 14:23:55 crc kubenswrapper[4735]: I1008 14:23:55.145349 4735 scope.go:117] "RemoveContainer" containerID="cec388ee120153babb4ba1df9c16c317b59b98c70ffddd569ed770c55e343565" Oct 08 14:23:55 crc kubenswrapper[4735]: I1008 14:23:55.146343 4735 scope.go:117] "RemoveContainer" containerID="d6c737b897aa463dba6310d9840b3f5b8e05389b1e2bc8c1a8b7b77133f4dc28" Oct 08 14:23:55 crc kubenswrapper[4735]: E1008 14:23:55.146937 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:23:57 crc kubenswrapper[4735]: I1008 14:23:57.697039 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ktp9h"] Oct 08 14:23:57 crc kubenswrapper[4735]: I1008 14:23:57.699601 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ktp9h" Oct 08 14:23:57 crc kubenswrapper[4735]: I1008 14:23:57.707338 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ktp9h"] Oct 08 14:23:57 crc kubenswrapper[4735]: I1008 14:23:57.755700 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/100aec50-291d-45de-ab9a-b1d4b7dc44aa-catalog-content\") pod \"community-operators-ktp9h\" (UID: \"100aec50-291d-45de-ab9a-b1d4b7dc44aa\") " pod="openshift-marketplace/community-operators-ktp9h" Oct 08 14:23:57 crc kubenswrapper[4735]: I1008 14:23:57.755751 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/100aec50-291d-45de-ab9a-b1d4b7dc44aa-utilities\") pod \"community-operators-ktp9h\" (UID: \"100aec50-291d-45de-ab9a-b1d4b7dc44aa\") " pod="openshift-marketplace/community-operators-ktp9h" Oct 08 14:23:57 crc kubenswrapper[4735]: I1008 14:23:57.755960 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98jtm\" (UniqueName: \"kubernetes.io/projected/100aec50-291d-45de-ab9a-b1d4b7dc44aa-kube-api-access-98jtm\") pod \"community-operators-ktp9h\" (UID: \"100aec50-291d-45de-ab9a-b1d4b7dc44aa\") " 
pod="openshift-marketplace/community-operators-ktp9h" Oct 08 14:23:57 crc kubenswrapper[4735]: I1008 14:23:57.857346 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/100aec50-291d-45de-ab9a-b1d4b7dc44aa-catalog-content\") pod \"community-operators-ktp9h\" (UID: \"100aec50-291d-45de-ab9a-b1d4b7dc44aa\") " pod="openshift-marketplace/community-operators-ktp9h" Oct 08 14:23:57 crc kubenswrapper[4735]: I1008 14:23:57.857787 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/100aec50-291d-45de-ab9a-b1d4b7dc44aa-utilities\") pod \"community-operators-ktp9h\" (UID: \"100aec50-291d-45de-ab9a-b1d4b7dc44aa\") " pod="openshift-marketplace/community-operators-ktp9h" Oct 08 14:23:57 crc kubenswrapper[4735]: I1008 14:23:57.857918 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/100aec50-291d-45de-ab9a-b1d4b7dc44aa-catalog-content\") pod \"community-operators-ktp9h\" (UID: \"100aec50-291d-45de-ab9a-b1d4b7dc44aa\") " pod="openshift-marketplace/community-operators-ktp9h" Oct 08 14:23:57 crc kubenswrapper[4735]: I1008 14:23:57.857942 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98jtm\" (UniqueName: \"kubernetes.io/projected/100aec50-291d-45de-ab9a-b1d4b7dc44aa-kube-api-access-98jtm\") pod \"community-operators-ktp9h\" (UID: \"100aec50-291d-45de-ab9a-b1d4b7dc44aa\") " pod="openshift-marketplace/community-operators-ktp9h" Oct 08 14:23:57 crc kubenswrapper[4735]: I1008 14:23:57.858302 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/100aec50-291d-45de-ab9a-b1d4b7dc44aa-utilities\") pod \"community-operators-ktp9h\" (UID: \"100aec50-291d-45de-ab9a-b1d4b7dc44aa\") " 
pod="openshift-marketplace/community-operators-ktp9h" Oct 08 14:23:57 crc kubenswrapper[4735]: I1008 14:23:57.876415 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98jtm\" (UniqueName: \"kubernetes.io/projected/100aec50-291d-45de-ab9a-b1d4b7dc44aa-kube-api-access-98jtm\") pod \"community-operators-ktp9h\" (UID: \"100aec50-291d-45de-ab9a-b1d4b7dc44aa\") " pod="openshift-marketplace/community-operators-ktp9h" Oct 08 14:23:58 crc kubenswrapper[4735]: I1008 14:23:58.029605 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ktp9h" Oct 08 14:23:58 crc kubenswrapper[4735]: I1008 14:23:58.538068 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ktp9h"] Oct 08 14:23:59 crc kubenswrapper[4735]: I1008 14:23:59.183450 4735 generic.go:334] "Generic (PLEG): container finished" podID="100aec50-291d-45de-ab9a-b1d4b7dc44aa" containerID="840e00c58eb957b43c612b9a7262613b96f46bb1b1aa80aff7406a69c06ade13" exitCode=0 Oct 08 14:23:59 crc kubenswrapper[4735]: I1008 14:23:59.183561 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ktp9h" event={"ID":"100aec50-291d-45de-ab9a-b1d4b7dc44aa","Type":"ContainerDied","Data":"840e00c58eb957b43c612b9a7262613b96f46bb1b1aa80aff7406a69c06ade13"} Oct 08 14:23:59 crc kubenswrapper[4735]: I1008 14:23:59.183910 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ktp9h" event={"ID":"100aec50-291d-45de-ab9a-b1d4b7dc44aa","Type":"ContainerStarted","Data":"4d0c3508bd3f8c010aa7d068b418bfdb3bb93e2bf79bfe27c3fa8ac9fdabd286"} Oct 08 14:23:59 crc kubenswrapper[4735]: E1008 14:23:59.747502 4735 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.194:50016->38.102.83.194:36769: read tcp 38.102.83.194:50016->38.102.83.194:36769: read: connection reset by peer Oct 08 
14:24:01 crc kubenswrapper[4735]: I1008 14:24:01.201380 4735 generic.go:334] "Generic (PLEG): container finished" podID="100aec50-291d-45de-ab9a-b1d4b7dc44aa" containerID="1a6f8ac4da72e22e5328a9feee461ad9f19d25f88021e55f97f1455f1449347b" exitCode=0 Oct 08 14:24:01 crc kubenswrapper[4735]: I1008 14:24:01.201436 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ktp9h" event={"ID":"100aec50-291d-45de-ab9a-b1d4b7dc44aa","Type":"ContainerDied","Data":"1a6f8ac4da72e22e5328a9feee461ad9f19d25f88021e55f97f1455f1449347b"} Oct 08 14:24:02 crc kubenswrapper[4735]: I1008 14:24:02.211242 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ktp9h" event={"ID":"100aec50-291d-45de-ab9a-b1d4b7dc44aa","Type":"ContainerStarted","Data":"13a39b2c00b160e199cc3a34c8e0d1ae1ba308e055dd55cd82befa5164d300da"} Oct 08 14:24:02 crc kubenswrapper[4735]: I1008 14:24:02.233118 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ktp9h" podStartSLOduration=2.542131528 podStartE2EDuration="5.233099437s" podCreationTimestamp="2025-10-08 14:23:57 +0000 UTC" firstStartedPulling="2025-10-08 14:23:59.185518866 +0000 UTC m=+5288.993823420" lastFinishedPulling="2025-10-08 14:24:01.876486785 +0000 UTC m=+5291.684791329" observedRunningTime="2025-10-08 14:24:02.22814417 +0000 UTC m=+5292.036448724" watchObservedRunningTime="2025-10-08 14:24:02.233099437 +0000 UTC m=+5292.041403991" Oct 08 14:24:06 crc kubenswrapper[4735]: I1008 14:24:06.895306 4735 scope.go:117] "RemoveContainer" containerID="d6c737b897aa463dba6310d9840b3f5b8e05389b1e2bc8c1a8b7b77133f4dc28" Oct 08 14:24:06 crc kubenswrapper[4735]: E1008 14:24:06.896130 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:24:08 crc kubenswrapper[4735]: I1008 14:24:08.030342 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ktp9h" Oct 08 14:24:08 crc kubenswrapper[4735]: I1008 14:24:08.030414 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ktp9h" Oct 08 14:24:08 crc kubenswrapper[4735]: I1008 14:24:08.088552 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ktp9h" Oct 08 14:24:08 crc kubenswrapper[4735]: I1008 14:24:08.307443 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ktp9h" Oct 08 14:24:08 crc kubenswrapper[4735]: I1008 14:24:08.888167 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ktp9h"] Oct 08 14:24:10 crc kubenswrapper[4735]: I1008 14:24:10.274848 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ktp9h" podUID="100aec50-291d-45de-ab9a-b1d4b7dc44aa" containerName="registry-server" containerID="cri-o://13a39b2c00b160e199cc3a34c8e0d1ae1ba308e055dd55cd82befa5164d300da" gracePeriod=2 Oct 08 14:24:10 crc kubenswrapper[4735]: I1008 14:24:10.713326 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ktp9h" Oct 08 14:24:10 crc kubenswrapper[4735]: I1008 14:24:10.892294 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/100aec50-291d-45de-ab9a-b1d4b7dc44aa-utilities\") pod \"100aec50-291d-45de-ab9a-b1d4b7dc44aa\" (UID: \"100aec50-291d-45de-ab9a-b1d4b7dc44aa\") " Oct 08 14:24:10 crc kubenswrapper[4735]: I1008 14:24:10.892354 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98jtm\" (UniqueName: \"kubernetes.io/projected/100aec50-291d-45de-ab9a-b1d4b7dc44aa-kube-api-access-98jtm\") pod \"100aec50-291d-45de-ab9a-b1d4b7dc44aa\" (UID: \"100aec50-291d-45de-ab9a-b1d4b7dc44aa\") " Oct 08 14:24:10 crc kubenswrapper[4735]: I1008 14:24:10.892526 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/100aec50-291d-45de-ab9a-b1d4b7dc44aa-catalog-content\") pod \"100aec50-291d-45de-ab9a-b1d4b7dc44aa\" (UID: \"100aec50-291d-45de-ab9a-b1d4b7dc44aa\") " Oct 08 14:24:10 crc kubenswrapper[4735]: I1008 14:24:10.893070 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/100aec50-291d-45de-ab9a-b1d4b7dc44aa-utilities" (OuterVolumeSpecName: "utilities") pod "100aec50-291d-45de-ab9a-b1d4b7dc44aa" (UID: "100aec50-291d-45de-ab9a-b1d4b7dc44aa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 14:24:10 crc kubenswrapper[4735]: I1008 14:24:10.904704 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/100aec50-291d-45de-ab9a-b1d4b7dc44aa-kube-api-access-98jtm" (OuterVolumeSpecName: "kube-api-access-98jtm") pod "100aec50-291d-45de-ab9a-b1d4b7dc44aa" (UID: "100aec50-291d-45de-ab9a-b1d4b7dc44aa"). InnerVolumeSpecName "kube-api-access-98jtm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:24:10 crc kubenswrapper[4735]: I1008 14:24:10.952003 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/100aec50-291d-45de-ab9a-b1d4b7dc44aa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "100aec50-291d-45de-ab9a-b1d4b7dc44aa" (UID: "100aec50-291d-45de-ab9a-b1d4b7dc44aa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 14:24:10 crc kubenswrapper[4735]: I1008 14:24:10.994291 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/100aec50-291d-45de-ab9a-b1d4b7dc44aa-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 14:24:10 crc kubenswrapper[4735]: I1008 14:24:10.994327 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/100aec50-291d-45de-ab9a-b1d4b7dc44aa-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 14:24:10 crc kubenswrapper[4735]: I1008 14:24:10.994337 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98jtm\" (UniqueName: \"kubernetes.io/projected/100aec50-291d-45de-ab9a-b1d4b7dc44aa-kube-api-access-98jtm\") on node \"crc\" DevicePath \"\"" Oct 08 14:24:11 crc kubenswrapper[4735]: I1008 14:24:11.285068 4735 generic.go:334] "Generic (PLEG): container finished" podID="100aec50-291d-45de-ab9a-b1d4b7dc44aa" containerID="13a39b2c00b160e199cc3a34c8e0d1ae1ba308e055dd55cd82befa5164d300da" exitCode=0 Oct 08 14:24:11 crc kubenswrapper[4735]: I1008 14:24:11.285110 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ktp9h" event={"ID":"100aec50-291d-45de-ab9a-b1d4b7dc44aa","Type":"ContainerDied","Data":"13a39b2c00b160e199cc3a34c8e0d1ae1ba308e055dd55cd82befa5164d300da"} Oct 08 14:24:11 crc kubenswrapper[4735]: I1008 14:24:11.285140 4735 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-ktp9h" event={"ID":"100aec50-291d-45de-ab9a-b1d4b7dc44aa","Type":"ContainerDied","Data":"4d0c3508bd3f8c010aa7d068b418bfdb3bb93e2bf79bfe27c3fa8ac9fdabd286"} Oct 08 14:24:11 crc kubenswrapper[4735]: I1008 14:24:11.285159 4735 scope.go:117] "RemoveContainer" containerID="13a39b2c00b160e199cc3a34c8e0d1ae1ba308e055dd55cd82befa5164d300da" Oct 08 14:24:11 crc kubenswrapper[4735]: I1008 14:24:11.285185 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ktp9h" Oct 08 14:24:11 crc kubenswrapper[4735]: I1008 14:24:11.328815 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ktp9h"] Oct 08 14:24:11 crc kubenswrapper[4735]: I1008 14:24:11.330664 4735 scope.go:117] "RemoveContainer" containerID="1a6f8ac4da72e22e5328a9feee461ad9f19d25f88021e55f97f1455f1449347b" Oct 08 14:24:11 crc kubenswrapper[4735]: I1008 14:24:11.333703 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ktp9h"] Oct 08 14:24:11 crc kubenswrapper[4735]: I1008 14:24:11.354830 4735 scope.go:117] "RemoveContainer" containerID="840e00c58eb957b43c612b9a7262613b96f46bb1b1aa80aff7406a69c06ade13" Oct 08 14:24:11 crc kubenswrapper[4735]: I1008 14:24:11.381389 4735 scope.go:117] "RemoveContainer" containerID="13a39b2c00b160e199cc3a34c8e0d1ae1ba308e055dd55cd82befa5164d300da" Oct 08 14:24:11 crc kubenswrapper[4735]: E1008 14:24:11.381902 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13a39b2c00b160e199cc3a34c8e0d1ae1ba308e055dd55cd82befa5164d300da\": container with ID starting with 13a39b2c00b160e199cc3a34c8e0d1ae1ba308e055dd55cd82befa5164d300da not found: ID does not exist" containerID="13a39b2c00b160e199cc3a34c8e0d1ae1ba308e055dd55cd82befa5164d300da" Oct 08 14:24:11 crc kubenswrapper[4735]: I1008 
14:24:11.381959 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13a39b2c00b160e199cc3a34c8e0d1ae1ba308e055dd55cd82befa5164d300da"} err="failed to get container status \"13a39b2c00b160e199cc3a34c8e0d1ae1ba308e055dd55cd82befa5164d300da\": rpc error: code = NotFound desc = could not find container \"13a39b2c00b160e199cc3a34c8e0d1ae1ba308e055dd55cd82befa5164d300da\": container with ID starting with 13a39b2c00b160e199cc3a34c8e0d1ae1ba308e055dd55cd82befa5164d300da not found: ID does not exist" Oct 08 14:24:11 crc kubenswrapper[4735]: I1008 14:24:11.382003 4735 scope.go:117] "RemoveContainer" containerID="1a6f8ac4da72e22e5328a9feee461ad9f19d25f88021e55f97f1455f1449347b" Oct 08 14:24:11 crc kubenswrapper[4735]: E1008 14:24:11.382457 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a6f8ac4da72e22e5328a9feee461ad9f19d25f88021e55f97f1455f1449347b\": container with ID starting with 1a6f8ac4da72e22e5328a9feee461ad9f19d25f88021e55f97f1455f1449347b not found: ID does not exist" containerID="1a6f8ac4da72e22e5328a9feee461ad9f19d25f88021e55f97f1455f1449347b" Oct 08 14:24:11 crc kubenswrapper[4735]: I1008 14:24:11.382527 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a6f8ac4da72e22e5328a9feee461ad9f19d25f88021e55f97f1455f1449347b"} err="failed to get container status \"1a6f8ac4da72e22e5328a9feee461ad9f19d25f88021e55f97f1455f1449347b\": rpc error: code = NotFound desc = could not find container \"1a6f8ac4da72e22e5328a9feee461ad9f19d25f88021e55f97f1455f1449347b\": container with ID starting with 1a6f8ac4da72e22e5328a9feee461ad9f19d25f88021e55f97f1455f1449347b not found: ID does not exist" Oct 08 14:24:11 crc kubenswrapper[4735]: I1008 14:24:11.382549 4735 scope.go:117] "RemoveContainer" containerID="840e00c58eb957b43c612b9a7262613b96f46bb1b1aa80aff7406a69c06ade13" Oct 08 14:24:11 crc 
kubenswrapper[4735]: E1008 14:24:11.382927 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"840e00c58eb957b43c612b9a7262613b96f46bb1b1aa80aff7406a69c06ade13\": container with ID starting with 840e00c58eb957b43c612b9a7262613b96f46bb1b1aa80aff7406a69c06ade13 not found: ID does not exist" containerID="840e00c58eb957b43c612b9a7262613b96f46bb1b1aa80aff7406a69c06ade13" Oct 08 14:24:11 crc kubenswrapper[4735]: I1008 14:24:11.382955 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"840e00c58eb957b43c612b9a7262613b96f46bb1b1aa80aff7406a69c06ade13"} err="failed to get container status \"840e00c58eb957b43c612b9a7262613b96f46bb1b1aa80aff7406a69c06ade13\": rpc error: code = NotFound desc = could not find container \"840e00c58eb957b43c612b9a7262613b96f46bb1b1aa80aff7406a69c06ade13\": container with ID starting with 840e00c58eb957b43c612b9a7262613b96f46bb1b1aa80aff7406a69c06ade13 not found: ID does not exist" Oct 08 14:24:12 crc kubenswrapper[4735]: I1008 14:24:12.905940 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="100aec50-291d-45de-ab9a-b1d4b7dc44aa" path="/var/lib/kubelet/pods/100aec50-291d-45de-ab9a-b1d4b7dc44aa/volumes" Oct 08 14:24:17 crc kubenswrapper[4735]: I1008 14:24:17.894719 4735 scope.go:117] "RemoveContainer" containerID="d6c737b897aa463dba6310d9840b3f5b8e05389b1e2bc8c1a8b7b77133f4dc28" Oct 08 14:24:17 crc kubenswrapper[4735]: E1008 14:24:17.895624 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:24:30 crc 
kubenswrapper[4735]: I1008 14:24:30.900237 4735 scope.go:117] "RemoveContainer" containerID="d6c737b897aa463dba6310d9840b3f5b8e05389b1e2bc8c1a8b7b77133f4dc28" Oct 08 14:24:30 crc kubenswrapper[4735]: E1008 14:24:30.901155 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:24:43 crc kubenswrapper[4735]: I1008 14:24:43.895192 4735 scope.go:117] "RemoveContainer" containerID="d6c737b897aa463dba6310d9840b3f5b8e05389b1e2bc8c1a8b7b77133f4dc28" Oct 08 14:24:43 crc kubenswrapper[4735]: E1008 14:24:43.896110 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:24:55 crc kubenswrapper[4735]: I1008 14:24:55.894575 4735 scope.go:117] "RemoveContainer" containerID="d6c737b897aa463dba6310d9840b3f5b8e05389b1e2bc8c1a8b7b77133f4dc28" Oct 08 14:24:55 crc kubenswrapper[4735]: E1008 14:24:55.895345 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 
08 14:25:00 crc kubenswrapper[4735]: I1008 14:25:00.994744 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-lgl8l"] Oct 08 14:25:00 crc kubenswrapper[4735]: E1008 14:25:00.995787 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="100aec50-291d-45de-ab9a-b1d4b7dc44aa" containerName="extract-utilities" Oct 08 14:25:00 crc kubenswrapper[4735]: I1008 14:25:00.995802 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="100aec50-291d-45de-ab9a-b1d4b7dc44aa" containerName="extract-utilities" Oct 08 14:25:00 crc kubenswrapper[4735]: E1008 14:25:00.995828 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="100aec50-291d-45de-ab9a-b1d4b7dc44aa" containerName="extract-content" Oct 08 14:25:00 crc kubenswrapper[4735]: I1008 14:25:00.995836 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="100aec50-291d-45de-ab9a-b1d4b7dc44aa" containerName="extract-content" Oct 08 14:25:00 crc kubenswrapper[4735]: E1008 14:25:00.995864 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="100aec50-291d-45de-ab9a-b1d4b7dc44aa" containerName="registry-server" Oct 08 14:25:00 crc kubenswrapper[4735]: I1008 14:25:00.995882 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="100aec50-291d-45de-ab9a-b1d4b7dc44aa" containerName="registry-server" Oct 08 14:25:00 crc kubenswrapper[4735]: I1008 14:25:00.996081 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="100aec50-291d-45de-ab9a-b1d4b7dc44aa" containerName="registry-server" Oct 08 14:25:00 crc kubenswrapper[4735]: I1008 14:25:00.996848 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-lgl8l" Oct 08 14:25:01 crc kubenswrapper[4735]: I1008 14:25:01.004205 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-lgl8l"] Oct 08 14:25:01 crc kubenswrapper[4735]: I1008 14:25:01.052958 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9rct\" (UniqueName: \"kubernetes.io/projected/c70df4ed-68de-4ffa-b7b5-ed86b503a3f5-kube-api-access-t9rct\") pod \"barbican-db-create-lgl8l\" (UID: \"c70df4ed-68de-4ffa-b7b5-ed86b503a3f5\") " pod="openstack/barbican-db-create-lgl8l" Oct 08 14:25:01 crc kubenswrapper[4735]: I1008 14:25:01.154037 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9rct\" (UniqueName: \"kubernetes.io/projected/c70df4ed-68de-4ffa-b7b5-ed86b503a3f5-kube-api-access-t9rct\") pod \"barbican-db-create-lgl8l\" (UID: \"c70df4ed-68de-4ffa-b7b5-ed86b503a3f5\") " pod="openstack/barbican-db-create-lgl8l" Oct 08 14:25:01 crc kubenswrapper[4735]: I1008 14:25:01.172975 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9rct\" (UniqueName: \"kubernetes.io/projected/c70df4ed-68de-4ffa-b7b5-ed86b503a3f5-kube-api-access-t9rct\") pod \"barbican-db-create-lgl8l\" (UID: \"c70df4ed-68de-4ffa-b7b5-ed86b503a3f5\") " pod="openstack/barbican-db-create-lgl8l" Oct 08 14:25:01 crc kubenswrapper[4735]: I1008 14:25:01.322178 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-lgl8l" Oct 08 14:25:01 crc kubenswrapper[4735]: I1008 14:25:01.761243 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-lgl8l"] Oct 08 14:25:02 crc kubenswrapper[4735]: I1008 14:25:02.732706 4735 generic.go:334] "Generic (PLEG): container finished" podID="c70df4ed-68de-4ffa-b7b5-ed86b503a3f5" containerID="bd8afd9e8bc03b690278ea08d7d53f443645ebe8c540be4725b19fe324630780" exitCode=0 Oct 08 14:25:02 crc kubenswrapper[4735]: I1008 14:25:02.732758 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-lgl8l" event={"ID":"c70df4ed-68de-4ffa-b7b5-ed86b503a3f5","Type":"ContainerDied","Data":"bd8afd9e8bc03b690278ea08d7d53f443645ebe8c540be4725b19fe324630780"} Oct 08 14:25:02 crc kubenswrapper[4735]: I1008 14:25:02.732795 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-lgl8l" event={"ID":"c70df4ed-68de-4ffa-b7b5-ed86b503a3f5","Type":"ContainerStarted","Data":"48a3fd6771572a9ce3d26f17fdf619bef96170ff5550f503b96d1b5d8d8d226b"} Oct 08 14:25:04 crc kubenswrapper[4735]: I1008 14:25:04.069210 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-lgl8l" Oct 08 14:25:04 crc kubenswrapper[4735]: I1008 14:25:04.204687 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9rct\" (UniqueName: \"kubernetes.io/projected/c70df4ed-68de-4ffa-b7b5-ed86b503a3f5-kube-api-access-t9rct\") pod \"c70df4ed-68de-4ffa-b7b5-ed86b503a3f5\" (UID: \"c70df4ed-68de-4ffa-b7b5-ed86b503a3f5\") " Oct 08 14:25:04 crc kubenswrapper[4735]: I1008 14:25:04.215764 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c70df4ed-68de-4ffa-b7b5-ed86b503a3f5-kube-api-access-t9rct" (OuterVolumeSpecName: "kube-api-access-t9rct") pod "c70df4ed-68de-4ffa-b7b5-ed86b503a3f5" (UID: "c70df4ed-68de-4ffa-b7b5-ed86b503a3f5"). InnerVolumeSpecName "kube-api-access-t9rct". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:25:04 crc kubenswrapper[4735]: I1008 14:25:04.306605 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9rct\" (UniqueName: \"kubernetes.io/projected/c70df4ed-68de-4ffa-b7b5-ed86b503a3f5-kube-api-access-t9rct\") on node \"crc\" DevicePath \"\"" Oct 08 14:25:04 crc kubenswrapper[4735]: I1008 14:25:04.749961 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-lgl8l" event={"ID":"c70df4ed-68de-4ffa-b7b5-ed86b503a3f5","Type":"ContainerDied","Data":"48a3fd6771572a9ce3d26f17fdf619bef96170ff5550f503b96d1b5d8d8d226b"} Oct 08 14:25:04 crc kubenswrapper[4735]: I1008 14:25:04.750301 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48a3fd6771572a9ce3d26f17fdf619bef96170ff5550f503b96d1b5d8d8d226b" Oct 08 14:25:04 crc kubenswrapper[4735]: I1008 14:25:04.750024 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-lgl8l" Oct 08 14:25:10 crc kubenswrapper[4735]: I1008 14:25:10.902948 4735 scope.go:117] "RemoveContainer" containerID="d6c737b897aa463dba6310d9840b3f5b8e05389b1e2bc8c1a8b7b77133f4dc28" Oct 08 14:25:10 crc kubenswrapper[4735]: E1008 14:25:10.905603 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:25:11 crc kubenswrapper[4735]: I1008 14:25:11.009574 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-a7f8-account-create-ccjmt"] Oct 08 14:25:11 crc kubenswrapper[4735]: E1008 14:25:11.010285 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c70df4ed-68de-4ffa-b7b5-ed86b503a3f5" containerName="mariadb-database-create" Oct 08 14:25:11 crc kubenswrapper[4735]: I1008 14:25:11.012009 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="c70df4ed-68de-4ffa-b7b5-ed86b503a3f5" containerName="mariadb-database-create" Oct 08 14:25:11 crc kubenswrapper[4735]: I1008 14:25:11.012413 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="c70df4ed-68de-4ffa-b7b5-ed86b503a3f5" containerName="mariadb-database-create" Oct 08 14:25:11 crc kubenswrapper[4735]: I1008 14:25:11.013293 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-a7f8-account-create-ccjmt" Oct 08 14:25:11 crc kubenswrapper[4735]: I1008 14:25:11.018458 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-a7f8-account-create-ccjmt"] Oct 08 14:25:11 crc kubenswrapper[4735]: I1008 14:25:11.023231 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Oct 08 14:25:11 crc kubenswrapper[4735]: I1008 14:25:11.129622 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cwgn\" (UniqueName: \"kubernetes.io/projected/5d20f8ae-b94e-46da-b87a-d0f1dd60446a-kube-api-access-8cwgn\") pod \"barbican-a7f8-account-create-ccjmt\" (UID: \"5d20f8ae-b94e-46da-b87a-d0f1dd60446a\") " pod="openstack/barbican-a7f8-account-create-ccjmt" Oct 08 14:25:11 crc kubenswrapper[4735]: I1008 14:25:11.231129 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cwgn\" (UniqueName: \"kubernetes.io/projected/5d20f8ae-b94e-46da-b87a-d0f1dd60446a-kube-api-access-8cwgn\") pod \"barbican-a7f8-account-create-ccjmt\" (UID: \"5d20f8ae-b94e-46da-b87a-d0f1dd60446a\") " pod="openstack/barbican-a7f8-account-create-ccjmt" Oct 08 14:25:11 crc kubenswrapper[4735]: I1008 14:25:11.252768 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cwgn\" (UniqueName: \"kubernetes.io/projected/5d20f8ae-b94e-46da-b87a-d0f1dd60446a-kube-api-access-8cwgn\") pod \"barbican-a7f8-account-create-ccjmt\" (UID: \"5d20f8ae-b94e-46da-b87a-d0f1dd60446a\") " pod="openstack/barbican-a7f8-account-create-ccjmt" Oct 08 14:25:11 crc kubenswrapper[4735]: I1008 14:25:11.337392 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-a7f8-account-create-ccjmt" Oct 08 14:25:11 crc kubenswrapper[4735]: I1008 14:25:11.780674 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-a7f8-account-create-ccjmt"] Oct 08 14:25:11 crc kubenswrapper[4735]: I1008 14:25:11.810510 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-a7f8-account-create-ccjmt" event={"ID":"5d20f8ae-b94e-46da-b87a-d0f1dd60446a","Type":"ContainerStarted","Data":"91df1c9f7b06b59b6b88ca3641db9bb360a6e2c62e597de71e8d3239c83aa41f"} Oct 08 14:25:12 crc kubenswrapper[4735]: I1008 14:25:12.823171 4735 generic.go:334] "Generic (PLEG): container finished" podID="5d20f8ae-b94e-46da-b87a-d0f1dd60446a" containerID="9fb8f0a9e183de25224e93ab0a66cf487fd7c63f33f507a14f089a22f9bd4e18" exitCode=0 Oct 08 14:25:12 crc kubenswrapper[4735]: I1008 14:25:12.823239 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-a7f8-account-create-ccjmt" event={"ID":"5d20f8ae-b94e-46da-b87a-d0f1dd60446a","Type":"ContainerDied","Data":"9fb8f0a9e183de25224e93ab0a66cf487fd7c63f33f507a14f089a22f9bd4e18"} Oct 08 14:25:14 crc kubenswrapper[4735]: I1008 14:25:14.134567 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-a7f8-account-create-ccjmt" Oct 08 14:25:14 crc kubenswrapper[4735]: I1008 14:25:14.288158 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cwgn\" (UniqueName: \"kubernetes.io/projected/5d20f8ae-b94e-46da-b87a-d0f1dd60446a-kube-api-access-8cwgn\") pod \"5d20f8ae-b94e-46da-b87a-d0f1dd60446a\" (UID: \"5d20f8ae-b94e-46da-b87a-d0f1dd60446a\") " Oct 08 14:25:14 crc kubenswrapper[4735]: I1008 14:25:14.295879 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d20f8ae-b94e-46da-b87a-d0f1dd60446a-kube-api-access-8cwgn" (OuterVolumeSpecName: "kube-api-access-8cwgn") pod "5d20f8ae-b94e-46da-b87a-d0f1dd60446a" (UID: "5d20f8ae-b94e-46da-b87a-d0f1dd60446a"). InnerVolumeSpecName "kube-api-access-8cwgn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:25:14 crc kubenswrapper[4735]: I1008 14:25:14.390711 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cwgn\" (UniqueName: \"kubernetes.io/projected/5d20f8ae-b94e-46da-b87a-d0f1dd60446a-kube-api-access-8cwgn\") on node \"crc\" DevicePath \"\"" Oct 08 14:25:14 crc kubenswrapper[4735]: I1008 14:25:14.843938 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-a7f8-account-create-ccjmt" event={"ID":"5d20f8ae-b94e-46da-b87a-d0f1dd60446a","Type":"ContainerDied","Data":"91df1c9f7b06b59b6b88ca3641db9bb360a6e2c62e597de71e8d3239c83aa41f"} Oct 08 14:25:14 crc kubenswrapper[4735]: I1008 14:25:14.843996 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91df1c9f7b06b59b6b88ca3641db9bb360a6e2c62e597de71e8d3239c83aa41f" Oct 08 14:25:14 crc kubenswrapper[4735]: I1008 14:25:14.844401 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-a7f8-account-create-ccjmt" Oct 08 14:25:16 crc kubenswrapper[4735]: I1008 14:25:16.199012 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-gdwxj"] Oct 08 14:25:16 crc kubenswrapper[4735]: E1008 14:25:16.199361 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d20f8ae-b94e-46da-b87a-d0f1dd60446a" containerName="mariadb-account-create" Oct 08 14:25:16 crc kubenswrapper[4735]: I1008 14:25:16.199373 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d20f8ae-b94e-46da-b87a-d0f1dd60446a" containerName="mariadb-account-create" Oct 08 14:25:16 crc kubenswrapper[4735]: I1008 14:25:16.199536 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d20f8ae-b94e-46da-b87a-d0f1dd60446a" containerName="mariadb-account-create" Oct 08 14:25:16 crc kubenswrapper[4735]: I1008 14:25:16.200145 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-gdwxj" Oct 08 14:25:16 crc kubenswrapper[4735]: I1008 14:25:16.205632 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 08 14:25:16 crc kubenswrapper[4735]: I1008 14:25:16.206197 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-scj46" Oct 08 14:25:16 crc kubenswrapper[4735]: I1008 14:25:16.220064 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-gdwxj"] Oct 08 14:25:16 crc kubenswrapper[4735]: I1008 14:25:16.319589 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2p8s\" (UniqueName: \"kubernetes.io/projected/fb6a010f-d416-40b7-9abf-f6f5d75e57bc-kube-api-access-f2p8s\") pod \"barbican-db-sync-gdwxj\" (UID: \"fb6a010f-d416-40b7-9abf-f6f5d75e57bc\") " pod="openstack/barbican-db-sync-gdwxj" Oct 08 14:25:16 crc kubenswrapper[4735]: I1008 
14:25:16.319665 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fb6a010f-d416-40b7-9abf-f6f5d75e57bc-db-sync-config-data\") pod \"barbican-db-sync-gdwxj\" (UID: \"fb6a010f-d416-40b7-9abf-f6f5d75e57bc\") " pod="openstack/barbican-db-sync-gdwxj" Oct 08 14:25:16 crc kubenswrapper[4735]: I1008 14:25:16.319702 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb6a010f-d416-40b7-9abf-f6f5d75e57bc-combined-ca-bundle\") pod \"barbican-db-sync-gdwxj\" (UID: \"fb6a010f-d416-40b7-9abf-f6f5d75e57bc\") " pod="openstack/barbican-db-sync-gdwxj" Oct 08 14:25:16 crc kubenswrapper[4735]: I1008 14:25:16.421142 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2p8s\" (UniqueName: \"kubernetes.io/projected/fb6a010f-d416-40b7-9abf-f6f5d75e57bc-kube-api-access-f2p8s\") pod \"barbican-db-sync-gdwxj\" (UID: \"fb6a010f-d416-40b7-9abf-f6f5d75e57bc\") " pod="openstack/barbican-db-sync-gdwxj" Oct 08 14:25:16 crc kubenswrapper[4735]: I1008 14:25:16.421213 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fb6a010f-d416-40b7-9abf-f6f5d75e57bc-db-sync-config-data\") pod \"barbican-db-sync-gdwxj\" (UID: \"fb6a010f-d416-40b7-9abf-f6f5d75e57bc\") " pod="openstack/barbican-db-sync-gdwxj" Oct 08 14:25:16 crc kubenswrapper[4735]: I1008 14:25:16.421233 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb6a010f-d416-40b7-9abf-f6f5d75e57bc-combined-ca-bundle\") pod \"barbican-db-sync-gdwxj\" (UID: \"fb6a010f-d416-40b7-9abf-f6f5d75e57bc\") " pod="openstack/barbican-db-sync-gdwxj" Oct 08 14:25:16 crc kubenswrapper[4735]: I1008 14:25:16.426228 4735 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fb6a010f-d416-40b7-9abf-f6f5d75e57bc-db-sync-config-data\") pod \"barbican-db-sync-gdwxj\" (UID: \"fb6a010f-d416-40b7-9abf-f6f5d75e57bc\") " pod="openstack/barbican-db-sync-gdwxj" Oct 08 14:25:16 crc kubenswrapper[4735]: I1008 14:25:16.432176 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb6a010f-d416-40b7-9abf-f6f5d75e57bc-combined-ca-bundle\") pod \"barbican-db-sync-gdwxj\" (UID: \"fb6a010f-d416-40b7-9abf-f6f5d75e57bc\") " pod="openstack/barbican-db-sync-gdwxj" Oct 08 14:25:16 crc kubenswrapper[4735]: I1008 14:25:16.449112 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2p8s\" (UniqueName: \"kubernetes.io/projected/fb6a010f-d416-40b7-9abf-f6f5d75e57bc-kube-api-access-f2p8s\") pod \"barbican-db-sync-gdwxj\" (UID: \"fb6a010f-d416-40b7-9abf-f6f5d75e57bc\") " pod="openstack/barbican-db-sync-gdwxj" Oct 08 14:25:16 crc kubenswrapper[4735]: I1008 14:25:16.517787 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-gdwxj" Oct 08 14:25:17 crc kubenswrapper[4735]: I1008 14:25:17.009135 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-gdwxj"] Oct 08 14:25:17 crc kubenswrapper[4735]: I1008 14:25:17.870823 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-gdwxj" event={"ID":"fb6a010f-d416-40b7-9abf-f6f5d75e57bc","Type":"ContainerStarted","Data":"061a05e4ab595661e755871e5fb20d9078cc948dcba83a6ed3cf961ddd51ff3a"} Oct 08 14:25:17 crc kubenswrapper[4735]: I1008 14:25:17.871140 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-gdwxj" event={"ID":"fb6a010f-d416-40b7-9abf-f6f5d75e57bc","Type":"ContainerStarted","Data":"bf5d68d5689f21e31d4e137bb1ceb705c0aefd45a49c364108159c0b3f77fd3f"} Oct 08 14:25:17 crc kubenswrapper[4735]: I1008 14:25:17.893306 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-gdwxj" podStartSLOduration=1.893284427 podStartE2EDuration="1.893284427s" podCreationTimestamp="2025-10-08 14:25:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:25:17.886294829 +0000 UTC m=+5367.694599383" watchObservedRunningTime="2025-10-08 14:25:17.893284427 +0000 UTC m=+5367.701588971" Oct 08 14:25:20 crc kubenswrapper[4735]: I1008 14:25:20.893409 4735 generic.go:334] "Generic (PLEG): container finished" podID="fb6a010f-d416-40b7-9abf-f6f5d75e57bc" containerID="061a05e4ab595661e755871e5fb20d9078cc948dcba83a6ed3cf961ddd51ff3a" exitCode=0 Oct 08 14:25:20 crc kubenswrapper[4735]: I1008 14:25:20.893460 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-gdwxj" event={"ID":"fb6a010f-d416-40b7-9abf-f6f5d75e57bc","Type":"ContainerDied","Data":"061a05e4ab595661e755871e5fb20d9078cc948dcba83a6ed3cf961ddd51ff3a"} Oct 08 14:25:22 crc 
kubenswrapper[4735]: I1008 14:25:22.239723 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-gdwxj" Oct 08 14:25:22 crc kubenswrapper[4735]: I1008 14:25:22.426761 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2p8s\" (UniqueName: \"kubernetes.io/projected/fb6a010f-d416-40b7-9abf-f6f5d75e57bc-kube-api-access-f2p8s\") pod \"fb6a010f-d416-40b7-9abf-f6f5d75e57bc\" (UID: \"fb6a010f-d416-40b7-9abf-f6f5d75e57bc\") " Oct 08 14:25:22 crc kubenswrapper[4735]: I1008 14:25:22.427026 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fb6a010f-d416-40b7-9abf-f6f5d75e57bc-db-sync-config-data\") pod \"fb6a010f-d416-40b7-9abf-f6f5d75e57bc\" (UID: \"fb6a010f-d416-40b7-9abf-f6f5d75e57bc\") " Oct 08 14:25:22 crc kubenswrapper[4735]: I1008 14:25:22.427081 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb6a010f-d416-40b7-9abf-f6f5d75e57bc-combined-ca-bundle\") pod \"fb6a010f-d416-40b7-9abf-f6f5d75e57bc\" (UID: \"fb6a010f-d416-40b7-9abf-f6f5d75e57bc\") " Oct 08 14:25:22 crc kubenswrapper[4735]: I1008 14:25:22.433709 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb6a010f-d416-40b7-9abf-f6f5d75e57bc-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "fb6a010f-d416-40b7-9abf-f6f5d75e57bc" (UID: "fb6a010f-d416-40b7-9abf-f6f5d75e57bc"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:25:22 crc kubenswrapper[4735]: I1008 14:25:22.434194 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb6a010f-d416-40b7-9abf-f6f5d75e57bc-kube-api-access-f2p8s" (OuterVolumeSpecName: "kube-api-access-f2p8s") pod "fb6a010f-d416-40b7-9abf-f6f5d75e57bc" (UID: "fb6a010f-d416-40b7-9abf-f6f5d75e57bc"). InnerVolumeSpecName "kube-api-access-f2p8s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:25:22 crc kubenswrapper[4735]: I1008 14:25:22.453606 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb6a010f-d416-40b7-9abf-f6f5d75e57bc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fb6a010f-d416-40b7-9abf-f6f5d75e57bc" (UID: "fb6a010f-d416-40b7-9abf-f6f5d75e57bc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:25:22 crc kubenswrapper[4735]: I1008 14:25:22.529219 4735 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fb6a010f-d416-40b7-9abf-f6f5d75e57bc-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 14:25:22 crc kubenswrapper[4735]: I1008 14:25:22.529273 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb6a010f-d416-40b7-9abf-f6f5d75e57bc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 14:25:22 crc kubenswrapper[4735]: I1008 14:25:22.529288 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2p8s\" (UniqueName: \"kubernetes.io/projected/fb6a010f-d416-40b7-9abf-f6f5d75e57bc-kube-api-access-f2p8s\") on node \"crc\" DevicePath \"\"" Oct 08 14:25:22 crc kubenswrapper[4735]: I1008 14:25:22.932622 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-gdwxj" 
event={"ID":"fb6a010f-d416-40b7-9abf-f6f5d75e57bc","Type":"ContainerDied","Data":"bf5d68d5689f21e31d4e137bb1ceb705c0aefd45a49c364108159c0b3f77fd3f"} Oct 08 14:25:22 crc kubenswrapper[4735]: I1008 14:25:22.932693 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf5d68d5689f21e31d4e137bb1ceb705c0aefd45a49c364108159c0b3f77fd3f" Oct 08 14:25:22 crc kubenswrapper[4735]: I1008 14:25:22.932721 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-gdwxj" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.154385 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-7d64898b78-7f66b"] Oct 08 14:25:23 crc kubenswrapper[4735]: E1008 14:25:23.155262 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb6a010f-d416-40b7-9abf-f6f5d75e57bc" containerName="barbican-db-sync" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.155285 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb6a010f-d416-40b7-9abf-f6f5d75e57bc" containerName="barbican-db-sync" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.155504 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb6a010f-d416-40b7-9abf-f6f5d75e57bc" containerName="barbican-db-sync" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.156718 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-7d64898b78-7f66b" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.161778 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.162024 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-scj46" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.162073 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-7cdc76b747-5kslq"] Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.162206 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.163541 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7cdc76b747-5kslq" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.165175 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.191723 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7d64898b78-7f66b"] Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.232951 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7cdc76b747-5kslq"] Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.239597 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f95665c55-m8pg7"] Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.241283 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f95665c55-m8pg7" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.263843 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f95665c55-m8pg7"] Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.344144 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/365fd8a7-37d2-4f32-868a-6bc1d735df20-config\") pod \"dnsmasq-dns-5f95665c55-m8pg7\" (UID: \"365fd8a7-37d2-4f32-868a-6bc1d735df20\") " pod="openstack/dnsmasq-dns-5f95665c55-m8pg7" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.344196 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/059d447b-ce74-4cb3-9ed9-ddb4f50d983c-config-data-custom\") pod \"barbican-worker-7cdc76b747-5kslq\" (UID: \"059d447b-ce74-4cb3-9ed9-ddb4f50d983c\") " pod="openstack/barbican-worker-7cdc76b747-5kslq" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.344244 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gv5s\" (UniqueName: \"kubernetes.io/projected/059d447b-ce74-4cb3-9ed9-ddb4f50d983c-kube-api-access-5gv5s\") pod \"barbican-worker-7cdc76b747-5kslq\" (UID: \"059d447b-ce74-4cb3-9ed9-ddb4f50d983c\") " pod="openstack/barbican-worker-7cdc76b747-5kslq" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.344301 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/059d447b-ce74-4cb3-9ed9-ddb4f50d983c-logs\") pod \"barbican-worker-7cdc76b747-5kslq\" (UID: \"059d447b-ce74-4cb3-9ed9-ddb4f50d983c\") " pod="openstack/barbican-worker-7cdc76b747-5kslq" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.344370 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/365fd8a7-37d2-4f32-868a-6bc1d735df20-dns-svc\") pod \"dnsmasq-dns-5f95665c55-m8pg7\" (UID: \"365fd8a7-37d2-4f32-868a-6bc1d735df20\") " pod="openstack/dnsmasq-dns-5f95665c55-m8pg7" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.344398 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6bcda23-f2e7-4534-a215-9f535010166b-config-data\") pod \"barbican-keystone-listener-7d64898b78-7f66b\" (UID: \"a6bcda23-f2e7-4534-a215-9f535010166b\") " pod="openstack/barbican-keystone-listener-7d64898b78-7f66b" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.344463 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6bcda23-f2e7-4534-a215-9f535010166b-logs\") pod \"barbican-keystone-listener-7d64898b78-7f66b\" (UID: \"a6bcda23-f2e7-4534-a215-9f535010166b\") " pod="openstack/barbican-keystone-listener-7d64898b78-7f66b" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.344506 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a6bcda23-f2e7-4534-a215-9f535010166b-config-data-custom\") pod \"barbican-keystone-listener-7d64898b78-7f66b\" (UID: \"a6bcda23-f2e7-4534-a215-9f535010166b\") " pod="openstack/barbican-keystone-listener-7d64898b78-7f66b" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.344531 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/365fd8a7-37d2-4f32-868a-6bc1d735df20-ovsdbserver-nb\") pod \"dnsmasq-dns-5f95665c55-m8pg7\" (UID: \"365fd8a7-37d2-4f32-868a-6bc1d735df20\") " pod="openstack/dnsmasq-dns-5f95665c55-m8pg7" 
Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.344585 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6bcda23-f2e7-4534-a215-9f535010166b-combined-ca-bundle\") pod \"barbican-keystone-listener-7d64898b78-7f66b\" (UID: \"a6bcda23-f2e7-4534-a215-9f535010166b\") " pod="openstack/barbican-keystone-listener-7d64898b78-7f66b" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.344610 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/059d447b-ce74-4cb3-9ed9-ddb4f50d983c-combined-ca-bundle\") pod \"barbican-worker-7cdc76b747-5kslq\" (UID: \"059d447b-ce74-4cb3-9ed9-ddb4f50d983c\") " pod="openstack/barbican-worker-7cdc76b747-5kslq" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.344635 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/059d447b-ce74-4cb3-9ed9-ddb4f50d983c-config-data\") pod \"barbican-worker-7cdc76b747-5kslq\" (UID: \"059d447b-ce74-4cb3-9ed9-ddb4f50d983c\") " pod="openstack/barbican-worker-7cdc76b747-5kslq" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.344662 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfkzw\" (UniqueName: \"kubernetes.io/projected/365fd8a7-37d2-4f32-868a-6bc1d735df20-kube-api-access-nfkzw\") pod \"dnsmasq-dns-5f95665c55-m8pg7\" (UID: \"365fd8a7-37d2-4f32-868a-6bc1d735df20\") " pod="openstack/dnsmasq-dns-5f95665c55-m8pg7" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.344701 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/365fd8a7-37d2-4f32-868a-6bc1d735df20-ovsdbserver-sb\") pod 
\"dnsmasq-dns-5f95665c55-m8pg7\" (UID: \"365fd8a7-37d2-4f32-868a-6bc1d735df20\") " pod="openstack/dnsmasq-dns-5f95665c55-m8pg7" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.344730 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsrgn\" (UniqueName: \"kubernetes.io/projected/a6bcda23-f2e7-4534-a215-9f535010166b-kube-api-access-wsrgn\") pod \"barbican-keystone-listener-7d64898b78-7f66b\" (UID: \"a6bcda23-f2e7-4534-a215-9f535010166b\") " pod="openstack/barbican-keystone-listener-7d64898b78-7f66b" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.351406 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-79b49b68c6-7bkwh"] Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.353501 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-79b49b68c6-7bkwh" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.355697 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.377424 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-79b49b68c6-7bkwh"] Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.447816 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/059d447b-ce74-4cb3-9ed9-ddb4f50d983c-logs\") pod \"barbican-worker-7cdc76b747-5kslq\" (UID: \"059d447b-ce74-4cb3-9ed9-ddb4f50d983c\") " pod="openstack/barbican-worker-7cdc76b747-5kslq" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.447865 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/365fd8a7-37d2-4f32-868a-6bc1d735df20-dns-svc\") pod \"dnsmasq-dns-5f95665c55-m8pg7\" (UID: \"365fd8a7-37d2-4f32-868a-6bc1d735df20\") " 
pod="openstack/dnsmasq-dns-5f95665c55-m8pg7" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.447887 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6bcda23-f2e7-4534-a215-9f535010166b-config-data\") pod \"barbican-keystone-listener-7d64898b78-7f66b\" (UID: \"a6bcda23-f2e7-4534-a215-9f535010166b\") " pod="openstack/barbican-keystone-listener-7d64898b78-7f66b" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.447941 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6bcda23-f2e7-4534-a215-9f535010166b-logs\") pod \"barbican-keystone-listener-7d64898b78-7f66b\" (UID: \"a6bcda23-f2e7-4534-a215-9f535010166b\") " pod="openstack/barbican-keystone-listener-7d64898b78-7f66b" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.447968 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a6bcda23-f2e7-4534-a215-9f535010166b-config-data-custom\") pod \"barbican-keystone-listener-7d64898b78-7f66b\" (UID: \"a6bcda23-f2e7-4534-a215-9f535010166b\") " pod="openstack/barbican-keystone-listener-7d64898b78-7f66b" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.447990 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/365fd8a7-37d2-4f32-868a-6bc1d735df20-ovsdbserver-nb\") pod \"dnsmasq-dns-5f95665c55-m8pg7\" (UID: \"365fd8a7-37d2-4f32-868a-6bc1d735df20\") " pod="openstack/dnsmasq-dns-5f95665c55-m8pg7" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.448019 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6bcda23-f2e7-4534-a215-9f535010166b-combined-ca-bundle\") pod \"barbican-keystone-listener-7d64898b78-7f66b\" (UID: 
\"a6bcda23-f2e7-4534-a215-9f535010166b\") " pod="openstack/barbican-keystone-listener-7d64898b78-7f66b" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.448043 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/059d447b-ce74-4cb3-9ed9-ddb4f50d983c-config-data\") pod \"barbican-worker-7cdc76b747-5kslq\" (UID: \"059d447b-ce74-4cb3-9ed9-ddb4f50d983c\") " pod="openstack/barbican-worker-7cdc76b747-5kslq" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.448065 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/059d447b-ce74-4cb3-9ed9-ddb4f50d983c-combined-ca-bundle\") pod \"barbican-worker-7cdc76b747-5kslq\" (UID: \"059d447b-ce74-4cb3-9ed9-ddb4f50d983c\") " pod="openstack/barbican-worker-7cdc76b747-5kslq" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.448090 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfkzw\" (UniqueName: \"kubernetes.io/projected/365fd8a7-37d2-4f32-868a-6bc1d735df20-kube-api-access-nfkzw\") pod \"dnsmasq-dns-5f95665c55-m8pg7\" (UID: \"365fd8a7-37d2-4f32-868a-6bc1d735df20\") " pod="openstack/dnsmasq-dns-5f95665c55-m8pg7" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.448123 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/365fd8a7-37d2-4f32-868a-6bc1d735df20-ovsdbserver-sb\") pod \"dnsmasq-dns-5f95665c55-m8pg7\" (UID: \"365fd8a7-37d2-4f32-868a-6bc1d735df20\") " pod="openstack/dnsmasq-dns-5f95665c55-m8pg7" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.448152 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsrgn\" (UniqueName: \"kubernetes.io/projected/a6bcda23-f2e7-4534-a215-9f535010166b-kube-api-access-wsrgn\") pod 
\"barbican-keystone-listener-7d64898b78-7f66b\" (UID: \"a6bcda23-f2e7-4534-a215-9f535010166b\") " pod="openstack/barbican-keystone-listener-7d64898b78-7f66b" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.448222 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/365fd8a7-37d2-4f32-868a-6bc1d735df20-config\") pod \"dnsmasq-dns-5f95665c55-m8pg7\" (UID: \"365fd8a7-37d2-4f32-868a-6bc1d735df20\") " pod="openstack/dnsmasq-dns-5f95665c55-m8pg7" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.448244 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/059d447b-ce74-4cb3-9ed9-ddb4f50d983c-config-data-custom\") pod \"barbican-worker-7cdc76b747-5kslq\" (UID: \"059d447b-ce74-4cb3-9ed9-ddb4f50d983c\") " pod="openstack/barbican-worker-7cdc76b747-5kslq" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.448266 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gv5s\" (UniqueName: \"kubernetes.io/projected/059d447b-ce74-4cb3-9ed9-ddb4f50d983c-kube-api-access-5gv5s\") pod \"barbican-worker-7cdc76b747-5kslq\" (UID: \"059d447b-ce74-4cb3-9ed9-ddb4f50d983c\") " pod="openstack/barbican-worker-7cdc76b747-5kslq" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.448345 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/059d447b-ce74-4cb3-9ed9-ddb4f50d983c-logs\") pod \"barbican-worker-7cdc76b747-5kslq\" (UID: \"059d447b-ce74-4cb3-9ed9-ddb4f50d983c\") " pod="openstack/barbican-worker-7cdc76b747-5kslq" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.448345 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6bcda23-f2e7-4534-a215-9f535010166b-logs\") pod \"barbican-keystone-listener-7d64898b78-7f66b\" (UID: 
\"a6bcda23-f2e7-4534-a215-9f535010166b\") " pod="openstack/barbican-keystone-listener-7d64898b78-7f66b" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.450131 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/365fd8a7-37d2-4f32-868a-6bc1d735df20-ovsdbserver-sb\") pod \"dnsmasq-dns-5f95665c55-m8pg7\" (UID: \"365fd8a7-37d2-4f32-868a-6bc1d735df20\") " pod="openstack/dnsmasq-dns-5f95665c55-m8pg7" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.449378 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/365fd8a7-37d2-4f32-868a-6bc1d735df20-ovsdbserver-nb\") pod \"dnsmasq-dns-5f95665c55-m8pg7\" (UID: \"365fd8a7-37d2-4f32-868a-6bc1d735df20\") " pod="openstack/dnsmasq-dns-5f95665c55-m8pg7" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.448815 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/365fd8a7-37d2-4f32-868a-6bc1d735df20-dns-svc\") pod \"dnsmasq-dns-5f95665c55-m8pg7\" (UID: \"365fd8a7-37d2-4f32-868a-6bc1d735df20\") " pod="openstack/dnsmasq-dns-5f95665c55-m8pg7" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.450926 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/365fd8a7-37d2-4f32-868a-6bc1d735df20-config\") pod \"dnsmasq-dns-5f95665c55-m8pg7\" (UID: \"365fd8a7-37d2-4f32-868a-6bc1d735df20\") " pod="openstack/dnsmasq-dns-5f95665c55-m8pg7" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.453346 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6bcda23-f2e7-4534-a215-9f535010166b-combined-ca-bundle\") pod \"barbican-keystone-listener-7d64898b78-7f66b\" (UID: \"a6bcda23-f2e7-4534-a215-9f535010166b\") " 
pod="openstack/barbican-keystone-listener-7d64898b78-7f66b" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.453398 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/059d447b-ce74-4cb3-9ed9-ddb4f50d983c-combined-ca-bundle\") pod \"barbican-worker-7cdc76b747-5kslq\" (UID: \"059d447b-ce74-4cb3-9ed9-ddb4f50d983c\") " pod="openstack/barbican-worker-7cdc76b747-5kslq" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.454037 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/059d447b-ce74-4cb3-9ed9-ddb4f50d983c-config-data-custom\") pod \"barbican-worker-7cdc76b747-5kslq\" (UID: \"059d447b-ce74-4cb3-9ed9-ddb4f50d983c\") " pod="openstack/barbican-worker-7cdc76b747-5kslq" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.454547 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/059d447b-ce74-4cb3-9ed9-ddb4f50d983c-config-data\") pod \"barbican-worker-7cdc76b747-5kslq\" (UID: \"059d447b-ce74-4cb3-9ed9-ddb4f50d983c\") " pod="openstack/barbican-worker-7cdc76b747-5kslq" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.469460 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a6bcda23-f2e7-4534-a215-9f535010166b-config-data-custom\") pod \"barbican-keystone-listener-7d64898b78-7f66b\" (UID: \"a6bcda23-f2e7-4534-a215-9f535010166b\") " pod="openstack/barbican-keystone-listener-7d64898b78-7f66b" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.470975 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsrgn\" (UniqueName: \"kubernetes.io/projected/a6bcda23-f2e7-4534-a215-9f535010166b-kube-api-access-wsrgn\") pod \"barbican-keystone-listener-7d64898b78-7f66b\" (UID: 
\"a6bcda23-f2e7-4534-a215-9f535010166b\") " pod="openstack/barbican-keystone-listener-7d64898b78-7f66b" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.471170 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfkzw\" (UniqueName: \"kubernetes.io/projected/365fd8a7-37d2-4f32-868a-6bc1d735df20-kube-api-access-nfkzw\") pod \"dnsmasq-dns-5f95665c55-m8pg7\" (UID: \"365fd8a7-37d2-4f32-868a-6bc1d735df20\") " pod="openstack/dnsmasq-dns-5f95665c55-m8pg7" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.471573 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6bcda23-f2e7-4534-a215-9f535010166b-config-data\") pod \"barbican-keystone-listener-7d64898b78-7f66b\" (UID: \"a6bcda23-f2e7-4534-a215-9f535010166b\") " pod="openstack/barbican-keystone-listener-7d64898b78-7f66b" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.476173 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gv5s\" (UniqueName: \"kubernetes.io/projected/059d447b-ce74-4cb3-9ed9-ddb4f50d983c-kube-api-access-5gv5s\") pod \"barbican-worker-7cdc76b747-5kslq\" (UID: \"059d447b-ce74-4cb3-9ed9-ddb4f50d983c\") " pod="openstack/barbican-worker-7cdc76b747-5kslq" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.481635 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7d64898b78-7f66b" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.491364 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7cdc76b747-5kslq" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.553162 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8e685548-ea9a-40b5-b68a-812e88a9170d-config-data-custom\") pod \"barbican-api-79b49b68c6-7bkwh\" (UID: \"8e685548-ea9a-40b5-b68a-812e88a9170d\") " pod="openstack/barbican-api-79b49b68c6-7bkwh" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.553325 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e685548-ea9a-40b5-b68a-812e88a9170d-config-data\") pod \"barbican-api-79b49b68c6-7bkwh\" (UID: \"8e685548-ea9a-40b5-b68a-812e88a9170d\") " pod="openstack/barbican-api-79b49b68c6-7bkwh" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.553359 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76p49\" (UniqueName: \"kubernetes.io/projected/8e685548-ea9a-40b5-b68a-812e88a9170d-kube-api-access-76p49\") pod \"barbican-api-79b49b68c6-7bkwh\" (UID: \"8e685548-ea9a-40b5-b68a-812e88a9170d\") " pod="openstack/barbican-api-79b49b68c6-7bkwh" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.553388 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e685548-ea9a-40b5-b68a-812e88a9170d-combined-ca-bundle\") pod \"barbican-api-79b49b68c6-7bkwh\" (UID: \"8e685548-ea9a-40b5-b68a-812e88a9170d\") " pod="openstack/barbican-api-79b49b68c6-7bkwh" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.553404 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e685548-ea9a-40b5-b68a-812e88a9170d-logs\") pod 
\"barbican-api-79b49b68c6-7bkwh\" (UID: \"8e685548-ea9a-40b5-b68a-812e88a9170d\") " pod="openstack/barbican-api-79b49b68c6-7bkwh" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.567217 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f95665c55-m8pg7" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.654934 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e685548-ea9a-40b5-b68a-812e88a9170d-config-data\") pod \"barbican-api-79b49b68c6-7bkwh\" (UID: \"8e685548-ea9a-40b5-b68a-812e88a9170d\") " pod="openstack/barbican-api-79b49b68c6-7bkwh" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.655292 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76p49\" (UniqueName: \"kubernetes.io/projected/8e685548-ea9a-40b5-b68a-812e88a9170d-kube-api-access-76p49\") pod \"barbican-api-79b49b68c6-7bkwh\" (UID: \"8e685548-ea9a-40b5-b68a-812e88a9170d\") " pod="openstack/barbican-api-79b49b68c6-7bkwh" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.655321 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e685548-ea9a-40b5-b68a-812e88a9170d-combined-ca-bundle\") pod \"barbican-api-79b49b68c6-7bkwh\" (UID: \"8e685548-ea9a-40b5-b68a-812e88a9170d\") " pod="openstack/barbican-api-79b49b68c6-7bkwh" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.655339 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e685548-ea9a-40b5-b68a-812e88a9170d-logs\") pod \"barbican-api-79b49b68c6-7bkwh\" (UID: \"8e685548-ea9a-40b5-b68a-812e88a9170d\") " pod="openstack/barbican-api-79b49b68c6-7bkwh" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.655486 4735 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8e685548-ea9a-40b5-b68a-812e88a9170d-config-data-custom\") pod \"barbican-api-79b49b68c6-7bkwh\" (UID: \"8e685548-ea9a-40b5-b68a-812e88a9170d\") " pod="openstack/barbican-api-79b49b68c6-7bkwh" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.655999 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e685548-ea9a-40b5-b68a-812e88a9170d-logs\") pod \"barbican-api-79b49b68c6-7bkwh\" (UID: \"8e685548-ea9a-40b5-b68a-812e88a9170d\") " pod="openstack/barbican-api-79b49b68c6-7bkwh" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.660404 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8e685548-ea9a-40b5-b68a-812e88a9170d-config-data-custom\") pod \"barbican-api-79b49b68c6-7bkwh\" (UID: \"8e685548-ea9a-40b5-b68a-812e88a9170d\") " pod="openstack/barbican-api-79b49b68c6-7bkwh" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.660798 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e685548-ea9a-40b5-b68a-812e88a9170d-combined-ca-bundle\") pod \"barbican-api-79b49b68c6-7bkwh\" (UID: \"8e685548-ea9a-40b5-b68a-812e88a9170d\") " pod="openstack/barbican-api-79b49b68c6-7bkwh" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.663549 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e685548-ea9a-40b5-b68a-812e88a9170d-config-data\") pod \"barbican-api-79b49b68c6-7bkwh\" (UID: \"8e685548-ea9a-40b5-b68a-812e88a9170d\") " pod="openstack/barbican-api-79b49b68c6-7bkwh" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.687035 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76p49\" (UniqueName: 
\"kubernetes.io/projected/8e685548-ea9a-40b5-b68a-812e88a9170d-kube-api-access-76p49\") pod \"barbican-api-79b49b68c6-7bkwh\" (UID: \"8e685548-ea9a-40b5-b68a-812e88a9170d\") " pod="openstack/barbican-api-79b49b68c6-7bkwh" Oct 08 14:25:23 crc kubenswrapper[4735]: I1008 14:25:23.982951 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-79b49b68c6-7bkwh" Oct 08 14:25:24 crc kubenswrapper[4735]: I1008 14:25:24.006170 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f95665c55-m8pg7"] Oct 08 14:25:24 crc kubenswrapper[4735]: I1008 14:25:24.022675 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7cdc76b747-5kslq"] Oct 08 14:25:24 crc kubenswrapper[4735]: I1008 14:25:24.113380 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7d64898b78-7f66b"] Oct 08 14:25:24 crc kubenswrapper[4735]: I1008 14:25:24.270548 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-79b49b68c6-7bkwh"] Oct 08 14:25:24 crc kubenswrapper[4735]: W1008 14:25:24.278159 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e685548_ea9a_40b5_b68a_812e88a9170d.slice/crio-8eafc589e96cae3558841ac40dccb74e1a5cfbadbb65fac3e646949b93abaa2b WatchSource:0}: Error finding container 8eafc589e96cae3558841ac40dccb74e1a5cfbadbb65fac3e646949b93abaa2b: Status 404 returned error can't find the container with id 8eafc589e96cae3558841ac40dccb74e1a5cfbadbb65fac3e646949b93abaa2b Oct 08 14:25:24 crc kubenswrapper[4735]: I1008 14:25:24.895508 4735 scope.go:117] "RemoveContainer" containerID="d6c737b897aa463dba6310d9840b3f5b8e05389b1e2bc8c1a8b7b77133f4dc28" Oct 08 14:25:24 crc kubenswrapper[4735]: E1008 14:25:24.896073 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:25:24 crc kubenswrapper[4735]: I1008 14:25:24.950811 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7d64898b78-7f66b" event={"ID":"a6bcda23-f2e7-4534-a215-9f535010166b","Type":"ContainerStarted","Data":"08ffc4a7186deb5e0bbceb5f5a87937efee1934203b1626829465e34fcabfae6"} Oct 08 14:25:24 crc kubenswrapper[4735]: I1008 14:25:24.950872 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7d64898b78-7f66b" event={"ID":"a6bcda23-f2e7-4534-a215-9f535010166b","Type":"ContainerStarted","Data":"bad454f6fcfb1673ad89fb507848bc775edcaea2f61a375e6471b8157b48e7f3"} Oct 08 14:25:24 crc kubenswrapper[4735]: I1008 14:25:24.953035 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7cdc76b747-5kslq" event={"ID":"059d447b-ce74-4cb3-9ed9-ddb4f50d983c","Type":"ContainerStarted","Data":"e342edcc65080d59f58e3cbdd65d49c7d0dd3b985c27eedcd4f7a013f1f5be81"} Oct 08 14:25:24 crc kubenswrapper[4735]: I1008 14:25:24.953071 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7cdc76b747-5kslq" event={"ID":"059d447b-ce74-4cb3-9ed9-ddb4f50d983c","Type":"ContainerStarted","Data":"cc1b9580e25fb566759df7ae2bf4464cabc5085021c1c9c4921a86658410cbe2"} Oct 08 14:25:24 crc kubenswrapper[4735]: I1008 14:25:24.954316 4735 generic.go:334] "Generic (PLEG): container finished" podID="365fd8a7-37d2-4f32-868a-6bc1d735df20" containerID="d99397effedf60768662e813422049a0d93bcb6996e42da4d3fcf97895098c41" exitCode=0 Oct 08 14:25:24 crc kubenswrapper[4735]: I1008 14:25:24.954376 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-5f95665c55-m8pg7" event={"ID":"365fd8a7-37d2-4f32-868a-6bc1d735df20","Type":"ContainerDied","Data":"d99397effedf60768662e813422049a0d93bcb6996e42da4d3fcf97895098c41"} Oct 08 14:25:24 crc kubenswrapper[4735]: I1008 14:25:24.954397 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f95665c55-m8pg7" event={"ID":"365fd8a7-37d2-4f32-868a-6bc1d735df20","Type":"ContainerStarted","Data":"36f1f5c7c7486bfd71cfb40229148a84e27fa8de2f9a69581ae11b295893d180"} Oct 08 14:25:24 crc kubenswrapper[4735]: I1008 14:25:24.956365 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-79b49b68c6-7bkwh" event={"ID":"8e685548-ea9a-40b5-b68a-812e88a9170d","Type":"ContainerStarted","Data":"be28c9d1fb98c03d7bab0e939f325363121cf181f0a994349d2e55bd8f1f14b0"} Oct 08 14:25:24 crc kubenswrapper[4735]: I1008 14:25:24.956390 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-79b49b68c6-7bkwh" event={"ID":"8e685548-ea9a-40b5-b68a-812e88a9170d","Type":"ContainerStarted","Data":"8eafc589e96cae3558841ac40dccb74e1a5cfbadbb65fac3e646949b93abaa2b"} Oct 08 14:25:25 crc kubenswrapper[4735]: I1008 14:25:25.966245 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7d64898b78-7f66b" event={"ID":"a6bcda23-f2e7-4534-a215-9f535010166b","Type":"ContainerStarted","Data":"d962d1f99998cd2c9a86b481706d47ea32d8e52edaeb2dde101dcf98d8980b51"} Oct 08 14:25:25 crc kubenswrapper[4735]: I1008 14:25:25.968714 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7cdc76b747-5kslq" event={"ID":"059d447b-ce74-4cb3-9ed9-ddb4f50d983c","Type":"ContainerStarted","Data":"afc26e2080e00977f67ee5542b4f98b283033f6c388f7c72a7f8d24133b73c85"} Oct 08 14:25:25 crc kubenswrapper[4735]: I1008 14:25:25.971163 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f95665c55-m8pg7" 
event={"ID":"365fd8a7-37d2-4f32-868a-6bc1d735df20","Type":"ContainerStarted","Data":"ec258df7b22d029ec95f9ae5b8d199fb1326194714a86f5264934e6419498722"} Oct 08 14:25:25 crc kubenswrapper[4735]: I1008 14:25:25.974046 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-79b49b68c6-7bkwh" event={"ID":"8e685548-ea9a-40b5-b68a-812e88a9170d","Type":"ContainerStarted","Data":"8f3e762043b220739ee512ac8121930b84381e171333b20c813ba05a131a5271"} Oct 08 14:25:25 crc kubenswrapper[4735]: I1008 14:25:25.974204 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-79b49b68c6-7bkwh" Oct 08 14:25:25 crc kubenswrapper[4735]: I1008 14:25:25.974253 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-79b49b68c6-7bkwh" Oct 08 14:25:25 crc kubenswrapper[4735]: I1008 14:25:25.986127 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-7d64898b78-7f66b" podStartSLOduration=2.986110815 podStartE2EDuration="2.986110815s" podCreationTimestamp="2025-10-08 14:25:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:25:25.981552469 +0000 UTC m=+5375.789857013" watchObservedRunningTime="2025-10-08 14:25:25.986110815 +0000 UTC m=+5375.794415359" Oct 08 14:25:25 crc kubenswrapper[4735]: I1008 14:25:25.999407 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f95665c55-m8pg7" podStartSLOduration=2.999390435 podStartE2EDuration="2.999390435s" podCreationTimestamp="2025-10-08 14:25:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:25:25.99878066 +0000 UTC m=+5375.807085204" watchObservedRunningTime="2025-10-08 14:25:25.999390435 +0000 UTC m=+5375.807694979" Oct 08 14:25:26 crc 
kubenswrapper[4735]: I1008 14:25:26.042126 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-7cdc76b747-5kslq" podStartSLOduration=3.042108039 podStartE2EDuration="3.042108039s" podCreationTimestamp="2025-10-08 14:25:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:25:26.023932684 +0000 UTC m=+5375.832237228" watchObservedRunningTime="2025-10-08 14:25:26.042108039 +0000 UTC m=+5375.850412583" Oct 08 14:25:26 crc kubenswrapper[4735]: I1008 14:25:26.045382 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-79b49b68c6-7bkwh" podStartSLOduration=3.045370343 podStartE2EDuration="3.045370343s" podCreationTimestamp="2025-10-08 14:25:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:25:26.0394 +0000 UTC m=+5375.847704574" watchObservedRunningTime="2025-10-08 14:25:26.045370343 +0000 UTC m=+5375.853674877" Oct 08 14:25:26 crc kubenswrapper[4735]: I1008 14:25:26.982407 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f95665c55-m8pg7" Oct 08 14:25:33 crc kubenswrapper[4735]: I1008 14:25:33.568684 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5f95665c55-m8pg7" Oct 08 14:25:33 crc kubenswrapper[4735]: I1008 14:25:33.620779 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b44f7b875-5n8db"] Oct 08 14:25:33 crc kubenswrapper[4735]: I1008 14:25:33.621250 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b44f7b875-5n8db" podUID="8b2e7a09-4a86-4f4b-b0d6-9611284faaca" containerName="dnsmasq-dns" containerID="cri-o://62f718a1ae614b628fd716a84d297856ce4cdae04cd5c92fba6be937fc144b52" 
gracePeriod=10 Oct 08 14:25:34 crc kubenswrapper[4735]: I1008 14:25:34.052552 4735 generic.go:334] "Generic (PLEG): container finished" podID="8b2e7a09-4a86-4f4b-b0d6-9611284faaca" containerID="62f718a1ae614b628fd716a84d297856ce4cdae04cd5c92fba6be937fc144b52" exitCode=0 Oct 08 14:25:34 crc kubenswrapper[4735]: I1008 14:25:34.052604 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b44f7b875-5n8db" event={"ID":"8b2e7a09-4a86-4f4b-b0d6-9611284faaca","Type":"ContainerDied","Data":"62f718a1ae614b628fd716a84d297856ce4cdae04cd5c92fba6be937fc144b52"} Oct 08 14:25:34 crc kubenswrapper[4735]: I1008 14:25:34.052633 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b44f7b875-5n8db" event={"ID":"8b2e7a09-4a86-4f4b-b0d6-9611284faaca","Type":"ContainerDied","Data":"c395bbe3db1ddbae7fb9c90ba7788e55d0c2d0aac278708fa36153c12efb4067"} Oct 08 14:25:34 crc kubenswrapper[4735]: I1008 14:25:34.052648 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c395bbe3db1ddbae7fb9c90ba7788e55d0c2d0aac278708fa36153c12efb4067" Oct 08 14:25:34 crc kubenswrapper[4735]: I1008 14:25:34.139875 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b44f7b875-5n8db" Oct 08 14:25:34 crc kubenswrapper[4735]: I1008 14:25:34.257163 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8b2e7a09-4a86-4f4b-b0d6-9611284faaca-ovsdbserver-nb\") pod \"8b2e7a09-4a86-4f4b-b0d6-9611284faaca\" (UID: \"8b2e7a09-4a86-4f4b-b0d6-9611284faaca\") " Oct 08 14:25:34 crc kubenswrapper[4735]: I1008 14:25:34.257295 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8b2e7a09-4a86-4f4b-b0d6-9611284faaca-ovsdbserver-sb\") pod \"8b2e7a09-4a86-4f4b-b0d6-9611284faaca\" (UID: \"8b2e7a09-4a86-4f4b-b0d6-9611284faaca\") " Oct 08 14:25:34 crc kubenswrapper[4735]: I1008 14:25:34.257434 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqqk7\" (UniqueName: \"kubernetes.io/projected/8b2e7a09-4a86-4f4b-b0d6-9611284faaca-kube-api-access-jqqk7\") pod \"8b2e7a09-4a86-4f4b-b0d6-9611284faaca\" (UID: \"8b2e7a09-4a86-4f4b-b0d6-9611284faaca\") " Oct 08 14:25:34 crc kubenswrapper[4735]: I1008 14:25:34.257552 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8b2e7a09-4a86-4f4b-b0d6-9611284faaca-dns-svc\") pod \"8b2e7a09-4a86-4f4b-b0d6-9611284faaca\" (UID: \"8b2e7a09-4a86-4f4b-b0d6-9611284faaca\") " Oct 08 14:25:34 crc kubenswrapper[4735]: I1008 14:25:34.257641 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b2e7a09-4a86-4f4b-b0d6-9611284faaca-config\") pod \"8b2e7a09-4a86-4f4b-b0d6-9611284faaca\" (UID: \"8b2e7a09-4a86-4f4b-b0d6-9611284faaca\") " Oct 08 14:25:34 crc kubenswrapper[4735]: I1008 14:25:34.285974 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/8b2e7a09-4a86-4f4b-b0d6-9611284faaca-kube-api-access-jqqk7" (OuterVolumeSpecName: "kube-api-access-jqqk7") pod "8b2e7a09-4a86-4f4b-b0d6-9611284faaca" (UID: "8b2e7a09-4a86-4f4b-b0d6-9611284faaca"). InnerVolumeSpecName "kube-api-access-jqqk7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:25:34 crc kubenswrapper[4735]: I1008 14:25:34.360754 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqqk7\" (UniqueName: \"kubernetes.io/projected/8b2e7a09-4a86-4f4b-b0d6-9611284faaca-kube-api-access-jqqk7\") on node \"crc\" DevicePath \"\"" Oct 08 14:25:34 crc kubenswrapper[4735]: I1008 14:25:34.411344 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b2e7a09-4a86-4f4b-b0d6-9611284faaca-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8b2e7a09-4a86-4f4b-b0d6-9611284faaca" (UID: "8b2e7a09-4a86-4f4b-b0d6-9611284faaca"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:25:34 crc kubenswrapper[4735]: I1008 14:25:34.412359 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b2e7a09-4a86-4f4b-b0d6-9611284faaca-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8b2e7a09-4a86-4f4b-b0d6-9611284faaca" (UID: "8b2e7a09-4a86-4f4b-b0d6-9611284faaca"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:25:34 crc kubenswrapper[4735]: I1008 14:25:34.421815 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b2e7a09-4a86-4f4b-b0d6-9611284faaca-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8b2e7a09-4a86-4f4b-b0d6-9611284faaca" (UID: "8b2e7a09-4a86-4f4b-b0d6-9611284faaca"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:25:34 crc kubenswrapper[4735]: I1008 14:25:34.436460 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b2e7a09-4a86-4f4b-b0d6-9611284faaca-config" (OuterVolumeSpecName: "config") pod "8b2e7a09-4a86-4f4b-b0d6-9611284faaca" (UID: "8b2e7a09-4a86-4f4b-b0d6-9611284faaca"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:25:34 crc kubenswrapper[4735]: I1008 14:25:34.462274 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b2e7a09-4a86-4f4b-b0d6-9611284faaca-config\") on node \"crc\" DevicePath \"\"" Oct 08 14:25:34 crc kubenswrapper[4735]: I1008 14:25:34.462317 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8b2e7a09-4a86-4f4b-b0d6-9611284faaca-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 14:25:34 crc kubenswrapper[4735]: I1008 14:25:34.462331 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8b2e7a09-4a86-4f4b-b0d6-9611284faaca-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 14:25:34 crc kubenswrapper[4735]: I1008 14:25:34.462346 4735 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8b2e7a09-4a86-4f4b-b0d6-9611284faaca-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 14:25:35 crc kubenswrapper[4735]: I1008 14:25:35.059876 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b44f7b875-5n8db" Oct 08 14:25:35 crc kubenswrapper[4735]: I1008 14:25:35.085917 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b44f7b875-5n8db"] Oct 08 14:25:35 crc kubenswrapper[4735]: I1008 14:25:35.094237 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b44f7b875-5n8db"] Oct 08 14:25:35 crc kubenswrapper[4735]: I1008 14:25:35.603990 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-79b49b68c6-7bkwh" Oct 08 14:25:35 crc kubenswrapper[4735]: I1008 14:25:35.817407 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-79b49b68c6-7bkwh" Oct 08 14:25:35 crc kubenswrapper[4735]: I1008 14:25:35.895081 4735 scope.go:117] "RemoveContainer" containerID="d6c737b897aa463dba6310d9840b3f5b8e05389b1e2bc8c1a8b7b77133f4dc28" Oct 08 14:25:35 crc kubenswrapper[4735]: E1008 14:25:35.895410 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:25:36 crc kubenswrapper[4735]: I1008 14:25:36.904982 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b2e7a09-4a86-4f4b-b0d6-9611284faaca" path="/var/lib/kubelet/pods/8b2e7a09-4a86-4f4b-b0d6-9611284faaca/volumes" Oct 08 14:25:47 crc kubenswrapper[4735]: I1008 14:25:47.895210 4735 scope.go:117] "RemoveContainer" containerID="d6c737b897aa463dba6310d9840b3f5b8e05389b1e2bc8c1a8b7b77133f4dc28" Oct 08 14:25:47 crc kubenswrapper[4735]: E1008 14:25:47.896547 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:25:49 crc kubenswrapper[4735]: I1008 14:25:49.569227 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-7h8rr"] Oct 08 14:25:49 crc kubenswrapper[4735]: E1008 14:25:49.569822 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b2e7a09-4a86-4f4b-b0d6-9611284faaca" containerName="init" Oct 08 14:25:49 crc kubenswrapper[4735]: I1008 14:25:49.569835 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b2e7a09-4a86-4f4b-b0d6-9611284faaca" containerName="init" Oct 08 14:25:49 crc kubenswrapper[4735]: E1008 14:25:49.569859 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b2e7a09-4a86-4f4b-b0d6-9611284faaca" containerName="dnsmasq-dns" Oct 08 14:25:49 crc kubenswrapper[4735]: I1008 14:25:49.569864 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b2e7a09-4a86-4f4b-b0d6-9611284faaca" containerName="dnsmasq-dns" Oct 08 14:25:49 crc kubenswrapper[4735]: I1008 14:25:49.570016 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b2e7a09-4a86-4f4b-b0d6-9611284faaca" containerName="dnsmasq-dns" Oct 08 14:25:49 crc kubenswrapper[4735]: I1008 14:25:49.570580 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-7h8rr" Oct 08 14:25:49 crc kubenswrapper[4735]: I1008 14:25:49.577223 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-7h8rr"] Oct 08 14:25:49 crc kubenswrapper[4735]: I1008 14:25:49.638410 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c65m7\" (UniqueName: \"kubernetes.io/projected/e95023bc-dd24-4f91-83d1-2e871d8c7ef3-kube-api-access-c65m7\") pod \"neutron-db-create-7h8rr\" (UID: \"e95023bc-dd24-4f91-83d1-2e871d8c7ef3\") " pod="openstack/neutron-db-create-7h8rr" Oct 08 14:25:49 crc kubenswrapper[4735]: I1008 14:25:49.740312 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c65m7\" (UniqueName: \"kubernetes.io/projected/e95023bc-dd24-4f91-83d1-2e871d8c7ef3-kube-api-access-c65m7\") pod \"neutron-db-create-7h8rr\" (UID: \"e95023bc-dd24-4f91-83d1-2e871d8c7ef3\") " pod="openstack/neutron-db-create-7h8rr" Oct 08 14:25:49 crc kubenswrapper[4735]: I1008 14:25:49.773121 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c65m7\" (UniqueName: \"kubernetes.io/projected/e95023bc-dd24-4f91-83d1-2e871d8c7ef3-kube-api-access-c65m7\") pod \"neutron-db-create-7h8rr\" (UID: \"e95023bc-dd24-4f91-83d1-2e871d8c7ef3\") " pod="openstack/neutron-db-create-7h8rr" Oct 08 14:25:49 crc kubenswrapper[4735]: I1008 14:25:49.900296 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-7h8rr" Oct 08 14:25:50 crc kubenswrapper[4735]: I1008 14:25:50.338851 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-7h8rr"] Oct 08 14:25:51 crc kubenswrapper[4735]: I1008 14:25:51.200902 4735 generic.go:334] "Generic (PLEG): container finished" podID="e95023bc-dd24-4f91-83d1-2e871d8c7ef3" containerID="75526589dd25e9e9ec4baed21ad6d1a3e1107cc5edb87e014d09c97ee157b319" exitCode=0 Oct 08 14:25:51 crc kubenswrapper[4735]: I1008 14:25:51.200978 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-7h8rr" event={"ID":"e95023bc-dd24-4f91-83d1-2e871d8c7ef3","Type":"ContainerDied","Data":"75526589dd25e9e9ec4baed21ad6d1a3e1107cc5edb87e014d09c97ee157b319"} Oct 08 14:25:51 crc kubenswrapper[4735]: I1008 14:25:51.201277 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-7h8rr" event={"ID":"e95023bc-dd24-4f91-83d1-2e871d8c7ef3","Type":"ContainerStarted","Data":"fc6bf4668ed963691273e1ba9da53b814115fe36436c4015d9be38e684de2502"} Oct 08 14:25:52 crc kubenswrapper[4735]: I1008 14:25:52.600148 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-7h8rr" Oct 08 14:25:52 crc kubenswrapper[4735]: I1008 14:25:52.694699 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c65m7\" (UniqueName: \"kubernetes.io/projected/e95023bc-dd24-4f91-83d1-2e871d8c7ef3-kube-api-access-c65m7\") pod \"e95023bc-dd24-4f91-83d1-2e871d8c7ef3\" (UID: \"e95023bc-dd24-4f91-83d1-2e871d8c7ef3\") " Oct 08 14:25:52 crc kubenswrapper[4735]: I1008 14:25:52.700800 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e95023bc-dd24-4f91-83d1-2e871d8c7ef3-kube-api-access-c65m7" (OuterVolumeSpecName: "kube-api-access-c65m7") pod "e95023bc-dd24-4f91-83d1-2e871d8c7ef3" (UID: "e95023bc-dd24-4f91-83d1-2e871d8c7ef3"). InnerVolumeSpecName "kube-api-access-c65m7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:25:52 crc kubenswrapper[4735]: I1008 14:25:52.797087 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c65m7\" (UniqueName: \"kubernetes.io/projected/e95023bc-dd24-4f91-83d1-2e871d8c7ef3-kube-api-access-c65m7\") on node \"crc\" DevicePath \"\"" Oct 08 14:25:53 crc kubenswrapper[4735]: I1008 14:25:53.218700 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-7h8rr" event={"ID":"e95023bc-dd24-4f91-83d1-2e871d8c7ef3","Type":"ContainerDied","Data":"fc6bf4668ed963691273e1ba9da53b814115fe36436c4015d9be38e684de2502"} Oct 08 14:25:53 crc kubenswrapper[4735]: I1008 14:25:53.218741 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc6bf4668ed963691273e1ba9da53b814115fe36436c4015d9be38e684de2502" Oct 08 14:25:53 crc kubenswrapper[4735]: I1008 14:25:53.218778 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-7h8rr" Oct 08 14:25:59 crc kubenswrapper[4735]: I1008 14:25:59.656061 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-41fb-account-create-d42cb"] Oct 08 14:25:59 crc kubenswrapper[4735]: E1008 14:25:59.657342 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e95023bc-dd24-4f91-83d1-2e871d8c7ef3" containerName="mariadb-database-create" Oct 08 14:25:59 crc kubenswrapper[4735]: I1008 14:25:59.657441 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="e95023bc-dd24-4f91-83d1-2e871d8c7ef3" containerName="mariadb-database-create" Oct 08 14:25:59 crc kubenswrapper[4735]: I1008 14:25:59.657698 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="e95023bc-dd24-4f91-83d1-2e871d8c7ef3" containerName="mariadb-database-create" Oct 08 14:25:59 crc kubenswrapper[4735]: I1008 14:25:59.658393 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-41fb-account-create-d42cb" Oct 08 14:25:59 crc kubenswrapper[4735]: I1008 14:25:59.660897 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 08 14:25:59 crc kubenswrapper[4735]: I1008 14:25:59.663900 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-41fb-account-create-d42cb"] Oct 08 14:25:59 crc kubenswrapper[4735]: I1008 14:25:59.715341 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfk7q\" (UniqueName: \"kubernetes.io/projected/f4338f1c-35c7-401d-b01c-891688326e5e-kube-api-access-xfk7q\") pod \"neutron-41fb-account-create-d42cb\" (UID: \"f4338f1c-35c7-401d-b01c-891688326e5e\") " pod="openstack/neutron-41fb-account-create-d42cb" Oct 08 14:25:59 crc kubenswrapper[4735]: I1008 14:25:59.817009 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfk7q\" (UniqueName: 
\"kubernetes.io/projected/f4338f1c-35c7-401d-b01c-891688326e5e-kube-api-access-xfk7q\") pod \"neutron-41fb-account-create-d42cb\" (UID: \"f4338f1c-35c7-401d-b01c-891688326e5e\") " pod="openstack/neutron-41fb-account-create-d42cb" Oct 08 14:25:59 crc kubenswrapper[4735]: I1008 14:25:59.838328 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfk7q\" (UniqueName: \"kubernetes.io/projected/f4338f1c-35c7-401d-b01c-891688326e5e-kube-api-access-xfk7q\") pod \"neutron-41fb-account-create-d42cb\" (UID: \"f4338f1c-35c7-401d-b01c-891688326e5e\") " pod="openstack/neutron-41fb-account-create-d42cb" Oct 08 14:25:59 crc kubenswrapper[4735]: I1008 14:25:59.980503 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-41fb-account-create-d42cb" Oct 08 14:26:00 crc kubenswrapper[4735]: I1008 14:26:00.380276 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-41fb-account-create-d42cb"] Oct 08 14:26:01 crc kubenswrapper[4735]: I1008 14:26:01.285158 4735 generic.go:334] "Generic (PLEG): container finished" podID="f4338f1c-35c7-401d-b01c-891688326e5e" containerID="fe621d50861b9567c7bec747ef591bca2bee4db8ac640a3d80b99d26346164b4" exitCode=0 Oct 08 14:26:01 crc kubenswrapper[4735]: I1008 14:26:01.285326 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-41fb-account-create-d42cb" event={"ID":"f4338f1c-35c7-401d-b01c-891688326e5e","Type":"ContainerDied","Data":"fe621d50861b9567c7bec747ef591bca2bee4db8ac640a3d80b99d26346164b4"} Oct 08 14:26:01 crc kubenswrapper[4735]: I1008 14:26:01.285469 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-41fb-account-create-d42cb" event={"ID":"f4338f1c-35c7-401d-b01c-891688326e5e","Type":"ContainerStarted","Data":"37fb14f169408da6b3bc5d35ceaa553b167be543d3dd66af440b080039d66797"} Oct 08 14:26:02 crc kubenswrapper[4735]: I1008 14:26:02.604756 4735 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/neutron-41fb-account-create-d42cb" Oct 08 14:26:02 crc kubenswrapper[4735]: I1008 14:26:02.663056 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfk7q\" (UniqueName: \"kubernetes.io/projected/f4338f1c-35c7-401d-b01c-891688326e5e-kube-api-access-xfk7q\") pod \"f4338f1c-35c7-401d-b01c-891688326e5e\" (UID: \"f4338f1c-35c7-401d-b01c-891688326e5e\") " Oct 08 14:26:02 crc kubenswrapper[4735]: I1008 14:26:02.667936 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4338f1c-35c7-401d-b01c-891688326e5e-kube-api-access-xfk7q" (OuterVolumeSpecName: "kube-api-access-xfk7q") pod "f4338f1c-35c7-401d-b01c-891688326e5e" (UID: "f4338f1c-35c7-401d-b01c-891688326e5e"). InnerVolumeSpecName "kube-api-access-xfk7q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:26:02 crc kubenswrapper[4735]: I1008 14:26:02.764896 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfk7q\" (UniqueName: \"kubernetes.io/projected/f4338f1c-35c7-401d-b01c-891688326e5e-kube-api-access-xfk7q\") on node \"crc\" DevicePath \"\"" Oct 08 14:26:02 crc kubenswrapper[4735]: I1008 14:26:02.895840 4735 scope.go:117] "RemoveContainer" containerID="d6c737b897aa463dba6310d9840b3f5b8e05389b1e2bc8c1a8b7b77133f4dc28" Oct 08 14:26:02 crc kubenswrapper[4735]: E1008 14:26:02.896057 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:26:03 crc kubenswrapper[4735]: I1008 14:26:03.307575 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-41fb-account-create-d42cb" event={"ID":"f4338f1c-35c7-401d-b01c-891688326e5e","Type":"ContainerDied","Data":"37fb14f169408da6b3bc5d35ceaa553b167be543d3dd66af440b080039d66797"} Oct 08 14:26:03 crc kubenswrapper[4735]: I1008 14:26:03.307618 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37fb14f169408da6b3bc5d35ceaa553b167be543d3dd66af440b080039d66797" Oct 08 14:26:03 crc kubenswrapper[4735]: I1008 14:26:03.307725 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-41fb-account-create-d42cb" Oct 08 14:26:04 crc kubenswrapper[4735]: I1008 14:26:04.817667 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-vtn5d"] Oct 08 14:26:04 crc kubenswrapper[4735]: E1008 14:26:04.818247 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4338f1c-35c7-401d-b01c-891688326e5e" containerName="mariadb-account-create" Oct 08 14:26:04 crc kubenswrapper[4735]: I1008 14:26:04.818259 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4338f1c-35c7-401d-b01c-891688326e5e" containerName="mariadb-account-create" Oct 08 14:26:04 crc kubenswrapper[4735]: I1008 14:26:04.818443 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4338f1c-35c7-401d-b01c-891688326e5e" containerName="mariadb-account-create" Oct 08 14:26:04 crc kubenswrapper[4735]: I1008 14:26:04.818976 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-vtn5d" Oct 08 14:26:04 crc kubenswrapper[4735]: I1008 14:26:04.825006 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-vtn5d"] Oct 08 14:26:04 crc kubenswrapper[4735]: I1008 14:26:04.825421 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 08 14:26:04 crc kubenswrapper[4735]: I1008 14:26:04.825634 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 08 14:26:04 crc kubenswrapper[4735]: I1008 14:26:04.825978 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-vmhlg" Oct 08 14:26:04 crc kubenswrapper[4735]: I1008 14:26:04.901652 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/417cb1c4-d696-47ca-bab5-8ade27c31581-combined-ca-bundle\") pod \"neutron-db-sync-vtn5d\" (UID: \"417cb1c4-d696-47ca-bab5-8ade27c31581\") " pod="openstack/neutron-db-sync-vtn5d" Oct 08 14:26:04 crc kubenswrapper[4735]: I1008 14:26:04.901901 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnqt8\" (UniqueName: \"kubernetes.io/projected/417cb1c4-d696-47ca-bab5-8ade27c31581-kube-api-access-gnqt8\") pod \"neutron-db-sync-vtn5d\" (UID: \"417cb1c4-d696-47ca-bab5-8ade27c31581\") " pod="openstack/neutron-db-sync-vtn5d" Oct 08 14:26:04 crc kubenswrapper[4735]: I1008 14:26:04.901997 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/417cb1c4-d696-47ca-bab5-8ade27c31581-config\") pod \"neutron-db-sync-vtn5d\" (UID: \"417cb1c4-d696-47ca-bab5-8ade27c31581\") " pod="openstack/neutron-db-sync-vtn5d" Oct 08 14:26:05 crc kubenswrapper[4735]: I1008 14:26:05.003317 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/417cb1c4-d696-47ca-bab5-8ade27c31581-combined-ca-bundle\") pod \"neutron-db-sync-vtn5d\" (UID: \"417cb1c4-d696-47ca-bab5-8ade27c31581\") " pod="openstack/neutron-db-sync-vtn5d" Oct 08 14:26:05 crc kubenswrapper[4735]: I1008 14:26:05.003367 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnqt8\" (UniqueName: \"kubernetes.io/projected/417cb1c4-d696-47ca-bab5-8ade27c31581-kube-api-access-gnqt8\") pod \"neutron-db-sync-vtn5d\" (UID: \"417cb1c4-d696-47ca-bab5-8ade27c31581\") " pod="openstack/neutron-db-sync-vtn5d" Oct 08 14:26:05 crc kubenswrapper[4735]: I1008 14:26:05.003416 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/417cb1c4-d696-47ca-bab5-8ade27c31581-config\") pod \"neutron-db-sync-vtn5d\" (UID: \"417cb1c4-d696-47ca-bab5-8ade27c31581\") " pod="openstack/neutron-db-sync-vtn5d" Oct 08 14:26:05 crc kubenswrapper[4735]: I1008 14:26:05.009113 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/417cb1c4-d696-47ca-bab5-8ade27c31581-combined-ca-bundle\") pod \"neutron-db-sync-vtn5d\" (UID: \"417cb1c4-d696-47ca-bab5-8ade27c31581\") " pod="openstack/neutron-db-sync-vtn5d" Oct 08 14:26:05 crc kubenswrapper[4735]: I1008 14:26:05.009269 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/417cb1c4-d696-47ca-bab5-8ade27c31581-config\") pod \"neutron-db-sync-vtn5d\" (UID: \"417cb1c4-d696-47ca-bab5-8ade27c31581\") " pod="openstack/neutron-db-sync-vtn5d" Oct 08 14:26:05 crc kubenswrapper[4735]: I1008 14:26:05.021991 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnqt8\" (UniqueName: 
\"kubernetes.io/projected/417cb1c4-d696-47ca-bab5-8ade27c31581-kube-api-access-gnqt8\") pod \"neutron-db-sync-vtn5d\" (UID: \"417cb1c4-d696-47ca-bab5-8ade27c31581\") " pod="openstack/neutron-db-sync-vtn5d" Oct 08 14:26:05 crc kubenswrapper[4735]: I1008 14:26:05.138887 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-vtn5d" Oct 08 14:26:05 crc kubenswrapper[4735]: I1008 14:26:05.500308 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-vtn5d"] Oct 08 14:26:06 crc kubenswrapper[4735]: I1008 14:26:06.353737 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-vtn5d" event={"ID":"417cb1c4-d696-47ca-bab5-8ade27c31581","Type":"ContainerStarted","Data":"91d2d5ec822b4469af663d2189584e590eba0141f23bbed98b64989e890edf48"} Oct 08 14:26:06 crc kubenswrapper[4735]: I1008 14:26:06.354064 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-vtn5d" event={"ID":"417cb1c4-d696-47ca-bab5-8ade27c31581","Type":"ContainerStarted","Data":"c00c53ae23f8fc3990d023a8ea04429da4f9c15f42b73304bb240408321a7a81"} Oct 08 14:26:06 crc kubenswrapper[4735]: I1008 14:26:06.370366 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-vtn5d" podStartSLOduration=2.370340524 podStartE2EDuration="2.370340524s" podCreationTimestamp="2025-10-08 14:26:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:26:06.369685948 +0000 UTC m=+5416.177990502" watchObservedRunningTime="2025-10-08 14:26:06.370340524 +0000 UTC m=+5416.178645088" Oct 08 14:26:10 crc kubenswrapper[4735]: I1008 14:26:10.397862 4735 generic.go:334] "Generic (PLEG): container finished" podID="417cb1c4-d696-47ca-bab5-8ade27c31581" containerID="91d2d5ec822b4469af663d2189584e590eba0141f23bbed98b64989e890edf48" exitCode=0 Oct 08 14:26:10 crc 
kubenswrapper[4735]: I1008 14:26:10.397942 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-vtn5d" event={"ID":"417cb1c4-d696-47ca-bab5-8ade27c31581","Type":"ContainerDied","Data":"91d2d5ec822b4469af663d2189584e590eba0141f23bbed98b64989e890edf48"} Oct 08 14:26:11 crc kubenswrapper[4735]: I1008 14:26:11.695825 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-vtn5d" Oct 08 14:26:11 crc kubenswrapper[4735]: I1008 14:26:11.808154 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/417cb1c4-d696-47ca-bab5-8ade27c31581-config\") pod \"417cb1c4-d696-47ca-bab5-8ade27c31581\" (UID: \"417cb1c4-d696-47ca-bab5-8ade27c31581\") " Oct 08 14:26:11 crc kubenswrapper[4735]: I1008 14:26:11.808264 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnqt8\" (UniqueName: \"kubernetes.io/projected/417cb1c4-d696-47ca-bab5-8ade27c31581-kube-api-access-gnqt8\") pod \"417cb1c4-d696-47ca-bab5-8ade27c31581\" (UID: \"417cb1c4-d696-47ca-bab5-8ade27c31581\") " Oct 08 14:26:11 crc kubenswrapper[4735]: I1008 14:26:11.808392 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/417cb1c4-d696-47ca-bab5-8ade27c31581-combined-ca-bundle\") pod \"417cb1c4-d696-47ca-bab5-8ade27c31581\" (UID: \"417cb1c4-d696-47ca-bab5-8ade27c31581\") " Oct 08 14:26:11 crc kubenswrapper[4735]: I1008 14:26:11.814737 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/417cb1c4-d696-47ca-bab5-8ade27c31581-kube-api-access-gnqt8" (OuterVolumeSpecName: "kube-api-access-gnqt8") pod "417cb1c4-d696-47ca-bab5-8ade27c31581" (UID: "417cb1c4-d696-47ca-bab5-8ade27c31581"). InnerVolumeSpecName "kube-api-access-gnqt8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:26:11 crc kubenswrapper[4735]: I1008 14:26:11.834037 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/417cb1c4-d696-47ca-bab5-8ade27c31581-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "417cb1c4-d696-47ca-bab5-8ade27c31581" (UID: "417cb1c4-d696-47ca-bab5-8ade27c31581"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:26:11 crc kubenswrapper[4735]: I1008 14:26:11.836945 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/417cb1c4-d696-47ca-bab5-8ade27c31581-config" (OuterVolumeSpecName: "config") pod "417cb1c4-d696-47ca-bab5-8ade27c31581" (UID: "417cb1c4-d696-47ca-bab5-8ade27c31581"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:26:11 crc kubenswrapper[4735]: I1008 14:26:11.910661 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/417cb1c4-d696-47ca-bab5-8ade27c31581-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 14:26:11 crc kubenswrapper[4735]: I1008 14:26:11.910696 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/417cb1c4-d696-47ca-bab5-8ade27c31581-config\") on node \"crc\" DevicePath \"\"" Oct 08 14:26:11 crc kubenswrapper[4735]: I1008 14:26:11.910706 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gnqt8\" (UniqueName: \"kubernetes.io/projected/417cb1c4-d696-47ca-bab5-8ade27c31581-kube-api-access-gnqt8\") on node \"crc\" DevicePath \"\"" Oct 08 14:26:12 crc kubenswrapper[4735]: I1008 14:26:12.423670 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-vtn5d" 
event={"ID":"417cb1c4-d696-47ca-bab5-8ade27c31581","Type":"ContainerDied","Data":"c00c53ae23f8fc3990d023a8ea04429da4f9c15f42b73304bb240408321a7a81"} Oct 08 14:26:12 crc kubenswrapper[4735]: I1008 14:26:12.423715 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c00c53ae23f8fc3990d023a8ea04429da4f9c15f42b73304bb240408321a7a81" Oct 08 14:26:12 crc kubenswrapper[4735]: I1008 14:26:12.424134 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-vtn5d" Oct 08 14:26:12 crc kubenswrapper[4735]: I1008 14:26:12.594751 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-766cb6844c-nfq7s"] Oct 08 14:26:12 crc kubenswrapper[4735]: E1008 14:26:12.595190 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="417cb1c4-d696-47ca-bab5-8ade27c31581" containerName="neutron-db-sync" Oct 08 14:26:12 crc kubenswrapper[4735]: I1008 14:26:12.595212 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="417cb1c4-d696-47ca-bab5-8ade27c31581" containerName="neutron-db-sync" Oct 08 14:26:12 crc kubenswrapper[4735]: I1008 14:26:12.595413 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="417cb1c4-d696-47ca-bab5-8ade27c31581" containerName="neutron-db-sync" Oct 08 14:26:12 crc kubenswrapper[4735]: I1008 14:26:12.596470 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-766cb6844c-nfq7s" Oct 08 14:26:12 crc kubenswrapper[4735]: I1008 14:26:12.621674 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfswt\" (UniqueName: \"kubernetes.io/projected/95b88ee4-9136-4d7c-84f1-fe6d87a70cdb-kube-api-access-hfswt\") pod \"dnsmasq-dns-766cb6844c-nfq7s\" (UID: \"95b88ee4-9136-4d7c-84f1-fe6d87a70cdb\") " pod="openstack/dnsmasq-dns-766cb6844c-nfq7s" Oct 08 14:26:12 crc kubenswrapper[4735]: I1008 14:26:12.621740 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95b88ee4-9136-4d7c-84f1-fe6d87a70cdb-config\") pod \"dnsmasq-dns-766cb6844c-nfq7s\" (UID: \"95b88ee4-9136-4d7c-84f1-fe6d87a70cdb\") " pod="openstack/dnsmasq-dns-766cb6844c-nfq7s" Oct 08 14:26:12 crc kubenswrapper[4735]: I1008 14:26:12.621769 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/95b88ee4-9136-4d7c-84f1-fe6d87a70cdb-ovsdbserver-sb\") pod \"dnsmasq-dns-766cb6844c-nfq7s\" (UID: \"95b88ee4-9136-4d7c-84f1-fe6d87a70cdb\") " pod="openstack/dnsmasq-dns-766cb6844c-nfq7s" Oct 08 14:26:12 crc kubenswrapper[4735]: I1008 14:26:12.621796 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/95b88ee4-9136-4d7c-84f1-fe6d87a70cdb-dns-svc\") pod \"dnsmasq-dns-766cb6844c-nfq7s\" (UID: \"95b88ee4-9136-4d7c-84f1-fe6d87a70cdb\") " pod="openstack/dnsmasq-dns-766cb6844c-nfq7s" Oct 08 14:26:12 crc kubenswrapper[4735]: I1008 14:26:12.621890 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/95b88ee4-9136-4d7c-84f1-fe6d87a70cdb-ovsdbserver-nb\") pod \"dnsmasq-dns-766cb6844c-nfq7s\" 
(UID: \"95b88ee4-9136-4d7c-84f1-fe6d87a70cdb\") " pod="openstack/dnsmasq-dns-766cb6844c-nfq7s" Oct 08 14:26:12 crc kubenswrapper[4735]: I1008 14:26:12.627568 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-766cb6844c-nfq7s"] Oct 08 14:26:12 crc kubenswrapper[4735]: I1008 14:26:12.725348 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/95b88ee4-9136-4d7c-84f1-fe6d87a70cdb-ovsdbserver-nb\") pod \"dnsmasq-dns-766cb6844c-nfq7s\" (UID: \"95b88ee4-9136-4d7c-84f1-fe6d87a70cdb\") " pod="openstack/dnsmasq-dns-766cb6844c-nfq7s" Oct 08 14:26:12 crc kubenswrapper[4735]: I1008 14:26:12.725422 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfswt\" (UniqueName: \"kubernetes.io/projected/95b88ee4-9136-4d7c-84f1-fe6d87a70cdb-kube-api-access-hfswt\") pod \"dnsmasq-dns-766cb6844c-nfq7s\" (UID: \"95b88ee4-9136-4d7c-84f1-fe6d87a70cdb\") " pod="openstack/dnsmasq-dns-766cb6844c-nfq7s" Oct 08 14:26:12 crc kubenswrapper[4735]: I1008 14:26:12.725470 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95b88ee4-9136-4d7c-84f1-fe6d87a70cdb-config\") pod \"dnsmasq-dns-766cb6844c-nfq7s\" (UID: \"95b88ee4-9136-4d7c-84f1-fe6d87a70cdb\") " pod="openstack/dnsmasq-dns-766cb6844c-nfq7s" Oct 08 14:26:12 crc kubenswrapper[4735]: I1008 14:26:12.725509 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/95b88ee4-9136-4d7c-84f1-fe6d87a70cdb-ovsdbserver-sb\") pod \"dnsmasq-dns-766cb6844c-nfq7s\" (UID: \"95b88ee4-9136-4d7c-84f1-fe6d87a70cdb\") " pod="openstack/dnsmasq-dns-766cb6844c-nfq7s" Oct 08 14:26:12 crc kubenswrapper[4735]: I1008 14:26:12.725534 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/95b88ee4-9136-4d7c-84f1-fe6d87a70cdb-dns-svc\") pod \"dnsmasq-dns-766cb6844c-nfq7s\" (UID: \"95b88ee4-9136-4d7c-84f1-fe6d87a70cdb\") " pod="openstack/dnsmasq-dns-766cb6844c-nfq7s" Oct 08 14:26:12 crc kubenswrapper[4735]: I1008 14:26:12.726179 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/95b88ee4-9136-4d7c-84f1-fe6d87a70cdb-ovsdbserver-nb\") pod \"dnsmasq-dns-766cb6844c-nfq7s\" (UID: \"95b88ee4-9136-4d7c-84f1-fe6d87a70cdb\") " pod="openstack/dnsmasq-dns-766cb6844c-nfq7s" Oct 08 14:26:12 crc kubenswrapper[4735]: I1008 14:26:12.726529 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/95b88ee4-9136-4d7c-84f1-fe6d87a70cdb-dns-svc\") pod \"dnsmasq-dns-766cb6844c-nfq7s\" (UID: \"95b88ee4-9136-4d7c-84f1-fe6d87a70cdb\") " pod="openstack/dnsmasq-dns-766cb6844c-nfq7s" Oct 08 14:26:12 crc kubenswrapper[4735]: I1008 14:26:12.726717 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95b88ee4-9136-4d7c-84f1-fe6d87a70cdb-config\") pod \"dnsmasq-dns-766cb6844c-nfq7s\" (UID: \"95b88ee4-9136-4d7c-84f1-fe6d87a70cdb\") " pod="openstack/dnsmasq-dns-766cb6844c-nfq7s" Oct 08 14:26:12 crc kubenswrapper[4735]: I1008 14:26:12.727073 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/95b88ee4-9136-4d7c-84f1-fe6d87a70cdb-ovsdbserver-sb\") pod \"dnsmasq-dns-766cb6844c-nfq7s\" (UID: \"95b88ee4-9136-4d7c-84f1-fe6d87a70cdb\") " pod="openstack/dnsmasq-dns-766cb6844c-nfq7s" Oct 08 14:26:12 crc kubenswrapper[4735]: I1008 14:26:12.750614 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfswt\" (UniqueName: \"kubernetes.io/projected/95b88ee4-9136-4d7c-84f1-fe6d87a70cdb-kube-api-access-hfswt\") pod \"dnsmasq-dns-766cb6844c-nfq7s\" 
(UID: \"95b88ee4-9136-4d7c-84f1-fe6d87a70cdb\") " pod="openstack/dnsmasq-dns-766cb6844c-nfq7s" Oct 08 14:26:12 crc kubenswrapper[4735]: I1008 14:26:12.779707 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-58875664cc-zmhz6"] Oct 08 14:26:12 crc kubenswrapper[4735]: I1008 14:26:12.790718 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-58875664cc-zmhz6" Oct 08 14:26:12 crc kubenswrapper[4735]: I1008 14:26:12.800106 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 08 14:26:12 crc kubenswrapper[4735]: I1008 14:26:12.800220 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 08 14:26:12 crc kubenswrapper[4735]: I1008 14:26:12.800297 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-vmhlg" Oct 08 14:26:12 crc kubenswrapper[4735]: I1008 14:26:12.807105 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-58875664cc-zmhz6"] Oct 08 14:26:12 crc kubenswrapper[4735]: I1008 14:26:12.839461 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b2e6f3a0-af7d-41a8-b515-df3df9059eb9-httpd-config\") pod \"neutron-58875664cc-zmhz6\" (UID: \"b2e6f3a0-af7d-41a8-b515-df3df9059eb9\") " pod="openstack/neutron-58875664cc-zmhz6" Oct 08 14:26:12 crc kubenswrapper[4735]: I1008 14:26:12.839569 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mq7v4\" (UniqueName: \"kubernetes.io/projected/b2e6f3a0-af7d-41a8-b515-df3df9059eb9-kube-api-access-mq7v4\") pod \"neutron-58875664cc-zmhz6\" (UID: \"b2e6f3a0-af7d-41a8-b515-df3df9059eb9\") " pod="openstack/neutron-58875664cc-zmhz6" Oct 08 14:26:12 crc kubenswrapper[4735]: I1008 14:26:12.839588 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2e6f3a0-af7d-41a8-b515-df3df9059eb9-combined-ca-bundle\") pod \"neutron-58875664cc-zmhz6\" (UID: \"b2e6f3a0-af7d-41a8-b515-df3df9059eb9\") " pod="openstack/neutron-58875664cc-zmhz6" Oct 08 14:26:12 crc kubenswrapper[4735]: I1008 14:26:12.839613 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b2e6f3a0-af7d-41a8-b515-df3df9059eb9-config\") pod \"neutron-58875664cc-zmhz6\" (UID: \"b2e6f3a0-af7d-41a8-b515-df3df9059eb9\") " pod="openstack/neutron-58875664cc-zmhz6" Oct 08 14:26:12 crc kubenswrapper[4735]: I1008 14:26:12.934652 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-766cb6844c-nfq7s" Oct 08 14:26:12 crc kubenswrapper[4735]: I1008 14:26:12.941660 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2e6f3a0-af7d-41a8-b515-df3df9059eb9-combined-ca-bundle\") pod \"neutron-58875664cc-zmhz6\" (UID: \"b2e6f3a0-af7d-41a8-b515-df3df9059eb9\") " pod="openstack/neutron-58875664cc-zmhz6" Oct 08 14:26:12 crc kubenswrapper[4735]: I1008 14:26:12.941717 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mq7v4\" (UniqueName: \"kubernetes.io/projected/b2e6f3a0-af7d-41a8-b515-df3df9059eb9-kube-api-access-mq7v4\") pod \"neutron-58875664cc-zmhz6\" (UID: \"b2e6f3a0-af7d-41a8-b515-df3df9059eb9\") " pod="openstack/neutron-58875664cc-zmhz6" Oct 08 14:26:12 crc kubenswrapper[4735]: I1008 14:26:12.941764 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b2e6f3a0-af7d-41a8-b515-df3df9059eb9-config\") pod \"neutron-58875664cc-zmhz6\" (UID: \"b2e6f3a0-af7d-41a8-b515-df3df9059eb9\") " 
pod="openstack/neutron-58875664cc-zmhz6" Oct 08 14:26:12 crc kubenswrapper[4735]: I1008 14:26:12.941866 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b2e6f3a0-af7d-41a8-b515-df3df9059eb9-httpd-config\") pod \"neutron-58875664cc-zmhz6\" (UID: \"b2e6f3a0-af7d-41a8-b515-df3df9059eb9\") " pod="openstack/neutron-58875664cc-zmhz6" Oct 08 14:26:12 crc kubenswrapper[4735]: I1008 14:26:12.955359 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b2e6f3a0-af7d-41a8-b515-df3df9059eb9-httpd-config\") pod \"neutron-58875664cc-zmhz6\" (UID: \"b2e6f3a0-af7d-41a8-b515-df3df9059eb9\") " pod="openstack/neutron-58875664cc-zmhz6" Oct 08 14:26:12 crc kubenswrapper[4735]: I1008 14:26:12.955584 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2e6f3a0-af7d-41a8-b515-df3df9059eb9-combined-ca-bundle\") pod \"neutron-58875664cc-zmhz6\" (UID: \"b2e6f3a0-af7d-41a8-b515-df3df9059eb9\") " pod="openstack/neutron-58875664cc-zmhz6" Oct 08 14:26:12 crc kubenswrapper[4735]: I1008 14:26:12.955928 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b2e6f3a0-af7d-41a8-b515-df3df9059eb9-config\") pod \"neutron-58875664cc-zmhz6\" (UID: \"b2e6f3a0-af7d-41a8-b515-df3df9059eb9\") " pod="openstack/neutron-58875664cc-zmhz6" Oct 08 14:26:12 crc kubenswrapper[4735]: I1008 14:26:12.958387 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mq7v4\" (UniqueName: \"kubernetes.io/projected/b2e6f3a0-af7d-41a8-b515-df3df9059eb9-kube-api-access-mq7v4\") pod \"neutron-58875664cc-zmhz6\" (UID: \"b2e6f3a0-af7d-41a8-b515-df3df9059eb9\") " pod="openstack/neutron-58875664cc-zmhz6" Oct 08 14:26:13 crc kubenswrapper[4735]: I1008 14:26:13.159456 4735 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/neutron-58875664cc-zmhz6" Oct 08 14:26:13 crc kubenswrapper[4735]: I1008 14:26:13.393231 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-766cb6844c-nfq7s"] Oct 08 14:26:13 crc kubenswrapper[4735]: I1008 14:26:13.447631 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-766cb6844c-nfq7s" event={"ID":"95b88ee4-9136-4d7c-84f1-fe6d87a70cdb","Type":"ContainerStarted","Data":"f2b548f20afb8e5100e81ff5970c0976796340f6517bee02f962be9b34591ccb"} Oct 08 14:26:13 crc kubenswrapper[4735]: I1008 14:26:13.498926 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-58875664cc-zmhz6"] Oct 08 14:26:13 crc kubenswrapper[4735]: W1008 14:26:13.504272 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb2e6f3a0_af7d_41a8_b515_df3df9059eb9.slice/crio-2a519259cdf820ea2b71736ba898a9d7e21b8b00714cd2d6788fd73a68b9ab3e WatchSource:0}: Error finding container 2a519259cdf820ea2b71736ba898a9d7e21b8b00714cd2d6788fd73a68b9ab3e: Status 404 returned error can't find the container with id 2a519259cdf820ea2b71736ba898a9d7e21b8b00714cd2d6788fd73a68b9ab3e Oct 08 14:26:14 crc kubenswrapper[4735]: I1008 14:26:14.458772 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-58875664cc-zmhz6" event={"ID":"b2e6f3a0-af7d-41a8-b515-df3df9059eb9","Type":"ContainerStarted","Data":"82e3c2318a70cbf8ea424be3b25273a60b9ad352caba505ee55cd0f6243e4398"} Oct 08 14:26:14 crc kubenswrapper[4735]: I1008 14:26:14.459434 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-58875664cc-zmhz6" event={"ID":"b2e6f3a0-af7d-41a8-b515-df3df9059eb9","Type":"ContainerStarted","Data":"1525b729ccd5b38c5973f0523e1bae2e456a0ecaa4a1b8f59ea64f317ae3173e"} Oct 08 14:26:14 crc kubenswrapper[4735]: I1008 14:26:14.459459 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-58875664cc-zmhz6" event={"ID":"b2e6f3a0-af7d-41a8-b515-df3df9059eb9","Type":"ContainerStarted","Data":"2a519259cdf820ea2b71736ba898a9d7e21b8b00714cd2d6788fd73a68b9ab3e"} Oct 08 14:26:14 crc kubenswrapper[4735]: I1008 14:26:14.460599 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-58875664cc-zmhz6" Oct 08 14:26:14 crc kubenswrapper[4735]: I1008 14:26:14.463795 4735 generic.go:334] "Generic (PLEG): container finished" podID="95b88ee4-9136-4d7c-84f1-fe6d87a70cdb" containerID="0ddf10748c510c120c6a423ae3c34f4176dbd927dc9fdc6daf10472c39528936" exitCode=0 Oct 08 14:26:14 crc kubenswrapper[4735]: I1008 14:26:14.463835 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-766cb6844c-nfq7s" event={"ID":"95b88ee4-9136-4d7c-84f1-fe6d87a70cdb","Type":"ContainerDied","Data":"0ddf10748c510c120c6a423ae3c34f4176dbd927dc9fdc6daf10472c39528936"} Oct 08 14:26:14 crc kubenswrapper[4735]: I1008 14:26:14.486106 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-58875664cc-zmhz6" podStartSLOduration=2.486088159 podStartE2EDuration="2.486088159s" podCreationTimestamp="2025-10-08 14:26:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:26:14.482510627 +0000 UTC m=+5424.290815171" watchObservedRunningTime="2025-10-08 14:26:14.486088159 +0000 UTC m=+5424.294392703" Oct 08 14:26:15 crc kubenswrapper[4735]: I1008 14:26:15.474150 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-766cb6844c-nfq7s" event={"ID":"95b88ee4-9136-4d7c-84f1-fe6d87a70cdb","Type":"ContainerStarted","Data":"c635fb16ccb4f5ac20435e79e91b155899fa9ba5142812c4e69c98d5d833a2d8"} Oct 08 14:26:15 crc kubenswrapper[4735]: I1008 14:26:15.474618 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-766cb6844c-nfq7s" Oct 08 
14:26:15 crc kubenswrapper[4735]: I1008 14:26:15.495655 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-766cb6844c-nfq7s" podStartSLOduration=3.495636142 podStartE2EDuration="3.495636142s" podCreationTimestamp="2025-10-08 14:26:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:26:15.494184034 +0000 UTC m=+5425.302488598" watchObservedRunningTime="2025-10-08 14:26:15.495636142 +0000 UTC m=+5425.303940686" Oct 08 14:26:15 crc kubenswrapper[4735]: I1008 14:26:15.894983 4735 scope.go:117] "RemoveContainer" containerID="d6c737b897aa463dba6310d9840b3f5b8e05389b1e2bc8c1a8b7b77133f4dc28" Oct 08 14:26:15 crc kubenswrapper[4735]: E1008 14:26:15.895248 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:26:22 crc kubenswrapper[4735]: I1008 14:26:22.936415 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-766cb6844c-nfq7s" Oct 08 14:26:22 crc kubenswrapper[4735]: I1008 14:26:22.996011 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f95665c55-m8pg7"] Oct 08 14:26:22 crc kubenswrapper[4735]: I1008 14:26:22.996640 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5f95665c55-m8pg7" podUID="365fd8a7-37d2-4f32-868a-6bc1d735df20" containerName="dnsmasq-dns" containerID="cri-o://ec258df7b22d029ec95f9ae5b8d199fb1326194714a86f5264934e6419498722" gracePeriod=10 Oct 08 14:26:23 crc kubenswrapper[4735]: I1008 
14:26:23.534922 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f95665c55-m8pg7" Oct 08 14:26:23 crc kubenswrapper[4735]: I1008 14:26:23.550590 4735 generic.go:334] "Generic (PLEG): container finished" podID="365fd8a7-37d2-4f32-868a-6bc1d735df20" containerID="ec258df7b22d029ec95f9ae5b8d199fb1326194714a86f5264934e6419498722" exitCode=0 Oct 08 14:26:23 crc kubenswrapper[4735]: I1008 14:26:23.550631 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f95665c55-m8pg7" event={"ID":"365fd8a7-37d2-4f32-868a-6bc1d735df20","Type":"ContainerDied","Data":"ec258df7b22d029ec95f9ae5b8d199fb1326194714a86f5264934e6419498722"} Oct 08 14:26:23 crc kubenswrapper[4735]: I1008 14:26:23.550655 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f95665c55-m8pg7" event={"ID":"365fd8a7-37d2-4f32-868a-6bc1d735df20","Type":"ContainerDied","Data":"36f1f5c7c7486bfd71cfb40229148a84e27fa8de2f9a69581ae11b295893d180"} Oct 08 14:26:23 crc kubenswrapper[4735]: I1008 14:26:23.550671 4735 scope.go:117] "RemoveContainer" containerID="ec258df7b22d029ec95f9ae5b8d199fb1326194714a86f5264934e6419498722" Oct 08 14:26:23 crc kubenswrapper[4735]: I1008 14:26:23.550688 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f95665c55-m8pg7" Oct 08 14:26:23 crc kubenswrapper[4735]: I1008 14:26:23.583376 4735 scope.go:117] "RemoveContainer" containerID="d99397effedf60768662e813422049a0d93bcb6996e42da4d3fcf97895098c41" Oct 08 14:26:23 crc kubenswrapper[4735]: I1008 14:26:23.612327 4735 scope.go:117] "RemoveContainer" containerID="ec258df7b22d029ec95f9ae5b8d199fb1326194714a86f5264934e6419498722" Oct 08 14:26:23 crc kubenswrapper[4735]: E1008 14:26:23.612906 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec258df7b22d029ec95f9ae5b8d199fb1326194714a86f5264934e6419498722\": container with ID starting with ec258df7b22d029ec95f9ae5b8d199fb1326194714a86f5264934e6419498722 not found: ID does not exist" containerID="ec258df7b22d029ec95f9ae5b8d199fb1326194714a86f5264934e6419498722" Oct 08 14:26:23 crc kubenswrapper[4735]: I1008 14:26:23.612936 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec258df7b22d029ec95f9ae5b8d199fb1326194714a86f5264934e6419498722"} err="failed to get container status \"ec258df7b22d029ec95f9ae5b8d199fb1326194714a86f5264934e6419498722\": rpc error: code = NotFound desc = could not find container \"ec258df7b22d029ec95f9ae5b8d199fb1326194714a86f5264934e6419498722\": container with ID starting with ec258df7b22d029ec95f9ae5b8d199fb1326194714a86f5264934e6419498722 not found: ID does not exist" Oct 08 14:26:23 crc kubenswrapper[4735]: I1008 14:26:23.612958 4735 scope.go:117] "RemoveContainer" containerID="d99397effedf60768662e813422049a0d93bcb6996e42da4d3fcf97895098c41" Oct 08 14:26:23 crc kubenswrapper[4735]: E1008 14:26:23.613868 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d99397effedf60768662e813422049a0d93bcb6996e42da4d3fcf97895098c41\": container with ID starting with 
d99397effedf60768662e813422049a0d93bcb6996e42da4d3fcf97895098c41 not found: ID does not exist" containerID="d99397effedf60768662e813422049a0d93bcb6996e42da4d3fcf97895098c41" Oct 08 14:26:23 crc kubenswrapper[4735]: I1008 14:26:23.613923 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d99397effedf60768662e813422049a0d93bcb6996e42da4d3fcf97895098c41"} err="failed to get container status \"d99397effedf60768662e813422049a0d93bcb6996e42da4d3fcf97895098c41\": rpc error: code = NotFound desc = could not find container \"d99397effedf60768662e813422049a0d93bcb6996e42da4d3fcf97895098c41\": container with ID starting with d99397effedf60768662e813422049a0d93bcb6996e42da4d3fcf97895098c41 not found: ID does not exist" Oct 08 14:26:23 crc kubenswrapper[4735]: I1008 14:26:23.657587 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/365fd8a7-37d2-4f32-868a-6bc1d735df20-config\") pod \"365fd8a7-37d2-4f32-868a-6bc1d735df20\" (UID: \"365fd8a7-37d2-4f32-868a-6bc1d735df20\") " Oct 08 14:26:23 crc kubenswrapper[4735]: I1008 14:26:23.657654 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfkzw\" (UniqueName: \"kubernetes.io/projected/365fd8a7-37d2-4f32-868a-6bc1d735df20-kube-api-access-nfkzw\") pod \"365fd8a7-37d2-4f32-868a-6bc1d735df20\" (UID: \"365fd8a7-37d2-4f32-868a-6bc1d735df20\") " Oct 08 14:26:23 crc kubenswrapper[4735]: I1008 14:26:23.657713 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/365fd8a7-37d2-4f32-868a-6bc1d735df20-dns-svc\") pod \"365fd8a7-37d2-4f32-868a-6bc1d735df20\" (UID: \"365fd8a7-37d2-4f32-868a-6bc1d735df20\") " Oct 08 14:26:23 crc kubenswrapper[4735]: I1008 14:26:23.657730 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/365fd8a7-37d2-4f32-868a-6bc1d735df20-ovsdbserver-nb\") pod \"365fd8a7-37d2-4f32-868a-6bc1d735df20\" (UID: \"365fd8a7-37d2-4f32-868a-6bc1d735df20\") " Oct 08 14:26:23 crc kubenswrapper[4735]: I1008 14:26:23.657858 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/365fd8a7-37d2-4f32-868a-6bc1d735df20-ovsdbserver-sb\") pod \"365fd8a7-37d2-4f32-868a-6bc1d735df20\" (UID: \"365fd8a7-37d2-4f32-868a-6bc1d735df20\") " Oct 08 14:26:23 crc kubenswrapper[4735]: I1008 14:26:23.663244 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/365fd8a7-37d2-4f32-868a-6bc1d735df20-kube-api-access-nfkzw" (OuterVolumeSpecName: "kube-api-access-nfkzw") pod "365fd8a7-37d2-4f32-868a-6bc1d735df20" (UID: "365fd8a7-37d2-4f32-868a-6bc1d735df20"). InnerVolumeSpecName "kube-api-access-nfkzw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:26:23 crc kubenswrapper[4735]: I1008 14:26:23.701110 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/365fd8a7-37d2-4f32-868a-6bc1d735df20-config" (OuterVolumeSpecName: "config") pod "365fd8a7-37d2-4f32-868a-6bc1d735df20" (UID: "365fd8a7-37d2-4f32-868a-6bc1d735df20"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:26:23 crc kubenswrapper[4735]: I1008 14:26:23.703908 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/365fd8a7-37d2-4f32-868a-6bc1d735df20-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "365fd8a7-37d2-4f32-868a-6bc1d735df20" (UID: "365fd8a7-37d2-4f32-868a-6bc1d735df20"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:26:23 crc kubenswrapper[4735]: I1008 14:26:23.706948 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/365fd8a7-37d2-4f32-868a-6bc1d735df20-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "365fd8a7-37d2-4f32-868a-6bc1d735df20" (UID: "365fd8a7-37d2-4f32-868a-6bc1d735df20"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:26:23 crc kubenswrapper[4735]: I1008 14:26:23.710728 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/365fd8a7-37d2-4f32-868a-6bc1d735df20-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "365fd8a7-37d2-4f32-868a-6bc1d735df20" (UID: "365fd8a7-37d2-4f32-868a-6bc1d735df20"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:26:23 crc kubenswrapper[4735]: I1008 14:26:23.760498 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/365fd8a7-37d2-4f32-868a-6bc1d735df20-config\") on node \"crc\" DevicePath \"\"" Oct 08 14:26:23 crc kubenswrapper[4735]: I1008 14:26:23.760555 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfkzw\" (UniqueName: \"kubernetes.io/projected/365fd8a7-37d2-4f32-868a-6bc1d735df20-kube-api-access-nfkzw\") on node \"crc\" DevicePath \"\"" Oct 08 14:26:23 crc kubenswrapper[4735]: I1008 14:26:23.760570 4735 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/365fd8a7-37d2-4f32-868a-6bc1d735df20-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 14:26:23 crc kubenswrapper[4735]: I1008 14:26:23.760583 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/365fd8a7-37d2-4f32-868a-6bc1d735df20-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 
08 14:26:23 crc kubenswrapper[4735]: I1008 14:26:23.760594 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/365fd8a7-37d2-4f32-868a-6bc1d735df20-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 14:26:23 crc kubenswrapper[4735]: I1008 14:26:23.886273 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f95665c55-m8pg7"] Oct 08 14:26:23 crc kubenswrapper[4735]: I1008 14:26:23.892933 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f95665c55-m8pg7"] Oct 08 14:26:24 crc kubenswrapper[4735]: I1008 14:26:24.906213 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="365fd8a7-37d2-4f32-868a-6bc1d735df20" path="/var/lib/kubelet/pods/365fd8a7-37d2-4f32-868a-6bc1d735df20/volumes" Oct 08 14:26:27 crc kubenswrapper[4735]: I1008 14:26:27.895413 4735 scope.go:117] "RemoveContainer" containerID="d6c737b897aa463dba6310d9840b3f5b8e05389b1e2bc8c1a8b7b77133f4dc28" Oct 08 14:26:27 crc kubenswrapper[4735]: E1008 14:26:27.896290 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:26:40 crc kubenswrapper[4735]: I1008 14:26:40.899550 4735 scope.go:117] "RemoveContainer" containerID="d6c737b897aa463dba6310d9840b3f5b8e05389b1e2bc8c1a8b7b77133f4dc28" Oct 08 14:26:40 crc kubenswrapper[4735]: E1008 14:26:40.900305 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:26:43 crc kubenswrapper[4735]: I1008 14:26:43.169180 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-58875664cc-zmhz6" Oct 08 14:26:50 crc kubenswrapper[4735]: I1008 14:26:50.704753 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-xxsxt"] Oct 08 14:26:50 crc kubenswrapper[4735]: E1008 14:26:50.705602 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="365fd8a7-37d2-4f32-868a-6bc1d735df20" containerName="dnsmasq-dns" Oct 08 14:26:50 crc kubenswrapper[4735]: I1008 14:26:50.705617 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="365fd8a7-37d2-4f32-868a-6bc1d735df20" containerName="dnsmasq-dns" Oct 08 14:26:50 crc kubenswrapper[4735]: E1008 14:26:50.705636 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="365fd8a7-37d2-4f32-868a-6bc1d735df20" containerName="init" Oct 08 14:26:50 crc kubenswrapper[4735]: I1008 14:26:50.705641 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="365fd8a7-37d2-4f32-868a-6bc1d735df20" containerName="init" Oct 08 14:26:50 crc kubenswrapper[4735]: I1008 14:26:50.705803 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="365fd8a7-37d2-4f32-868a-6bc1d735df20" containerName="dnsmasq-dns" Oct 08 14:26:50 crc kubenswrapper[4735]: I1008 14:26:50.706356 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-xxsxt" Oct 08 14:26:50 crc kubenswrapper[4735]: I1008 14:26:50.717916 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtrg6\" (UniqueName: \"kubernetes.io/projected/63b01c62-6479-4841-b2ce-34272ef86e2f-kube-api-access-jtrg6\") pod \"glance-db-create-xxsxt\" (UID: \"63b01c62-6479-4841-b2ce-34272ef86e2f\") " pod="openstack/glance-db-create-xxsxt" Oct 08 14:26:50 crc kubenswrapper[4735]: I1008 14:26:50.730300 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-xxsxt"] Oct 08 14:26:50 crc kubenswrapper[4735]: I1008 14:26:50.819152 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtrg6\" (UniqueName: \"kubernetes.io/projected/63b01c62-6479-4841-b2ce-34272ef86e2f-kube-api-access-jtrg6\") pod \"glance-db-create-xxsxt\" (UID: \"63b01c62-6479-4841-b2ce-34272ef86e2f\") " pod="openstack/glance-db-create-xxsxt" Oct 08 14:26:50 crc kubenswrapper[4735]: I1008 14:26:50.841186 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtrg6\" (UniqueName: \"kubernetes.io/projected/63b01c62-6479-4841-b2ce-34272ef86e2f-kube-api-access-jtrg6\") pod \"glance-db-create-xxsxt\" (UID: \"63b01c62-6479-4841-b2ce-34272ef86e2f\") " pod="openstack/glance-db-create-xxsxt" Oct 08 14:26:51 crc kubenswrapper[4735]: I1008 14:26:51.032681 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-xxsxt" Oct 08 14:26:51 crc kubenswrapper[4735]: I1008 14:26:51.457954 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-xxsxt"] Oct 08 14:26:51 crc kubenswrapper[4735]: I1008 14:26:51.801418 4735 generic.go:334] "Generic (PLEG): container finished" podID="63b01c62-6479-4841-b2ce-34272ef86e2f" containerID="1247abbe2fd70efaba12ea959fe092fe23cf79aea99b71a0f7bec6382b9e28b2" exitCode=0 Oct 08 14:26:51 crc kubenswrapper[4735]: I1008 14:26:51.801470 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-xxsxt" event={"ID":"63b01c62-6479-4841-b2ce-34272ef86e2f","Type":"ContainerDied","Data":"1247abbe2fd70efaba12ea959fe092fe23cf79aea99b71a0f7bec6382b9e28b2"} Oct 08 14:26:51 crc kubenswrapper[4735]: I1008 14:26:51.801518 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-xxsxt" event={"ID":"63b01c62-6479-4841-b2ce-34272ef86e2f","Type":"ContainerStarted","Data":"349a1cc846be4674382329ec6cfe8dd55db9eef5028fed242ae06ab1acee7a4e"} Oct 08 14:26:53 crc kubenswrapper[4735]: I1008 14:26:53.106672 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-xxsxt" Oct 08 14:26:53 crc kubenswrapper[4735]: I1008 14:26:53.267325 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtrg6\" (UniqueName: \"kubernetes.io/projected/63b01c62-6479-4841-b2ce-34272ef86e2f-kube-api-access-jtrg6\") pod \"63b01c62-6479-4841-b2ce-34272ef86e2f\" (UID: \"63b01c62-6479-4841-b2ce-34272ef86e2f\") " Oct 08 14:26:53 crc kubenswrapper[4735]: I1008 14:26:53.272717 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63b01c62-6479-4841-b2ce-34272ef86e2f-kube-api-access-jtrg6" (OuterVolumeSpecName: "kube-api-access-jtrg6") pod "63b01c62-6479-4841-b2ce-34272ef86e2f" (UID: "63b01c62-6479-4841-b2ce-34272ef86e2f"). InnerVolumeSpecName "kube-api-access-jtrg6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:26:53 crc kubenswrapper[4735]: I1008 14:26:53.369003 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtrg6\" (UniqueName: \"kubernetes.io/projected/63b01c62-6479-4841-b2ce-34272ef86e2f-kube-api-access-jtrg6\") on node \"crc\" DevicePath \"\"" Oct 08 14:26:53 crc kubenswrapper[4735]: I1008 14:26:53.819993 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-xxsxt" event={"ID":"63b01c62-6479-4841-b2ce-34272ef86e2f","Type":"ContainerDied","Data":"349a1cc846be4674382329ec6cfe8dd55db9eef5028fed242ae06ab1acee7a4e"} Oct 08 14:26:53 crc kubenswrapper[4735]: I1008 14:26:53.820072 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="349a1cc846be4674382329ec6cfe8dd55db9eef5028fed242ae06ab1acee7a4e" Oct 08 14:26:53 crc kubenswrapper[4735]: I1008 14:26:53.820166 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-xxsxt" Oct 08 14:26:53 crc kubenswrapper[4735]: I1008 14:26:53.894851 4735 scope.go:117] "RemoveContainer" containerID="d6c737b897aa463dba6310d9840b3f5b8e05389b1e2bc8c1a8b7b77133f4dc28" Oct 08 14:26:53 crc kubenswrapper[4735]: E1008 14:26:53.895154 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:27:00 crc kubenswrapper[4735]: I1008 14:27:00.750184 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-f91b-account-create-qqfhb"] Oct 08 14:27:00 crc kubenswrapper[4735]: E1008 14:27:00.752698 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63b01c62-6479-4841-b2ce-34272ef86e2f" containerName="mariadb-database-create" Oct 08 14:27:00 crc kubenswrapper[4735]: I1008 14:27:00.752735 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="63b01c62-6479-4841-b2ce-34272ef86e2f" containerName="mariadb-database-create" Oct 08 14:27:00 crc kubenswrapper[4735]: I1008 14:27:00.752965 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="63b01c62-6479-4841-b2ce-34272ef86e2f" containerName="mariadb-database-create" Oct 08 14:27:00 crc kubenswrapper[4735]: I1008 14:27:00.753688 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-f91b-account-create-qqfhb" Oct 08 14:27:00 crc kubenswrapper[4735]: I1008 14:27:00.756953 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Oct 08 14:27:00 crc kubenswrapper[4735]: I1008 14:27:00.761446 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-f91b-account-create-qqfhb"] Oct 08 14:27:00 crc kubenswrapper[4735]: I1008 14:27:00.890044 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcb9t\" (UniqueName: \"kubernetes.io/projected/18797497-e1df-4609-989b-b25a50cef42d-kube-api-access-tcb9t\") pod \"glance-f91b-account-create-qqfhb\" (UID: \"18797497-e1df-4609-989b-b25a50cef42d\") " pod="openstack/glance-f91b-account-create-qqfhb" Oct 08 14:27:00 crc kubenswrapper[4735]: I1008 14:27:00.991140 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcb9t\" (UniqueName: \"kubernetes.io/projected/18797497-e1df-4609-989b-b25a50cef42d-kube-api-access-tcb9t\") pod \"glance-f91b-account-create-qqfhb\" (UID: \"18797497-e1df-4609-989b-b25a50cef42d\") " pod="openstack/glance-f91b-account-create-qqfhb" Oct 08 14:27:01 crc kubenswrapper[4735]: I1008 14:27:01.012238 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcb9t\" (UniqueName: \"kubernetes.io/projected/18797497-e1df-4609-989b-b25a50cef42d-kube-api-access-tcb9t\") pod \"glance-f91b-account-create-qqfhb\" (UID: \"18797497-e1df-4609-989b-b25a50cef42d\") " pod="openstack/glance-f91b-account-create-qqfhb" Oct 08 14:27:01 crc kubenswrapper[4735]: I1008 14:27:01.075792 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-f91b-account-create-qqfhb" Oct 08 14:27:01 crc kubenswrapper[4735]: I1008 14:27:01.540308 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-f91b-account-create-qqfhb"] Oct 08 14:27:01 crc kubenswrapper[4735]: I1008 14:27:01.884550 4735 generic.go:334] "Generic (PLEG): container finished" podID="18797497-e1df-4609-989b-b25a50cef42d" containerID="d57af3ab4184733cf538abae0a95240cdf5a30452f8bff100d7dc1586bf11b2d" exitCode=0 Oct 08 14:27:01 crc kubenswrapper[4735]: I1008 14:27:01.886712 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f91b-account-create-qqfhb" event={"ID":"18797497-e1df-4609-989b-b25a50cef42d","Type":"ContainerDied","Data":"d57af3ab4184733cf538abae0a95240cdf5a30452f8bff100d7dc1586bf11b2d"} Oct 08 14:27:01 crc kubenswrapper[4735]: I1008 14:27:01.886866 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f91b-account-create-qqfhb" event={"ID":"18797497-e1df-4609-989b-b25a50cef42d","Type":"ContainerStarted","Data":"f2cc80cdd83a2c89f2fc030cc3f49c9af92d0d7fd43290752b66bbb75bf41c95"} Oct 08 14:27:03 crc kubenswrapper[4735]: I1008 14:27:03.245557 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-f91b-account-create-qqfhb" Oct 08 14:27:03 crc kubenswrapper[4735]: I1008 14:27:03.431276 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tcb9t\" (UniqueName: \"kubernetes.io/projected/18797497-e1df-4609-989b-b25a50cef42d-kube-api-access-tcb9t\") pod \"18797497-e1df-4609-989b-b25a50cef42d\" (UID: \"18797497-e1df-4609-989b-b25a50cef42d\") " Oct 08 14:27:03 crc kubenswrapper[4735]: I1008 14:27:03.436718 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18797497-e1df-4609-989b-b25a50cef42d-kube-api-access-tcb9t" (OuterVolumeSpecName: "kube-api-access-tcb9t") pod "18797497-e1df-4609-989b-b25a50cef42d" (UID: "18797497-e1df-4609-989b-b25a50cef42d"). InnerVolumeSpecName "kube-api-access-tcb9t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:27:03 crc kubenswrapper[4735]: I1008 14:27:03.533586 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tcb9t\" (UniqueName: \"kubernetes.io/projected/18797497-e1df-4609-989b-b25a50cef42d-kube-api-access-tcb9t\") on node \"crc\" DevicePath \"\"" Oct 08 14:27:03 crc kubenswrapper[4735]: I1008 14:27:03.902749 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f91b-account-create-qqfhb" event={"ID":"18797497-e1df-4609-989b-b25a50cef42d","Type":"ContainerDied","Data":"f2cc80cdd83a2c89f2fc030cc3f49c9af92d0d7fd43290752b66bbb75bf41c95"} Oct 08 14:27:03 crc kubenswrapper[4735]: I1008 14:27:03.902808 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2cc80cdd83a2c89f2fc030cc3f49c9af92d0d7fd43290752b66bbb75bf41c95" Oct 08 14:27:03 crc kubenswrapper[4735]: I1008 14:27:03.902816 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-f91b-account-create-qqfhb" Oct 08 14:27:04 crc kubenswrapper[4735]: I1008 14:27:04.895003 4735 scope.go:117] "RemoveContainer" containerID="d6c737b897aa463dba6310d9840b3f5b8e05389b1e2bc8c1a8b7b77133f4dc28" Oct 08 14:27:04 crc kubenswrapper[4735]: E1008 14:27:04.895321 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:27:05 crc kubenswrapper[4735]: I1008 14:27:05.851082 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-4kftk"] Oct 08 14:27:05 crc kubenswrapper[4735]: E1008 14:27:05.851767 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18797497-e1df-4609-989b-b25a50cef42d" containerName="mariadb-account-create" Oct 08 14:27:05 crc kubenswrapper[4735]: I1008 14:27:05.851783 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="18797497-e1df-4609-989b-b25a50cef42d" containerName="mariadb-account-create" Oct 08 14:27:05 crc kubenswrapper[4735]: I1008 14:27:05.852015 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="18797497-e1df-4609-989b-b25a50cef42d" containerName="mariadb-account-create" Oct 08 14:27:05 crc kubenswrapper[4735]: I1008 14:27:05.852664 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-4kftk" Oct 08 14:27:05 crc kubenswrapper[4735]: I1008 14:27:05.858556 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-796xw" Oct 08 14:27:05 crc kubenswrapper[4735]: I1008 14:27:05.858828 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Oct 08 14:27:05 crc kubenswrapper[4735]: I1008 14:27:05.882282 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-4kftk"] Oct 08 14:27:05 crc kubenswrapper[4735]: I1008 14:27:05.977170 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34a77a37-d98c-41ee-aa9e-e1b814507e2b-config-data\") pod \"glance-db-sync-4kftk\" (UID: \"34a77a37-d98c-41ee-aa9e-e1b814507e2b\") " pod="openstack/glance-db-sync-4kftk" Oct 08 14:27:05 crc kubenswrapper[4735]: I1008 14:27:05.978081 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/34a77a37-d98c-41ee-aa9e-e1b814507e2b-db-sync-config-data\") pod \"glance-db-sync-4kftk\" (UID: \"34a77a37-d98c-41ee-aa9e-e1b814507e2b\") " pod="openstack/glance-db-sync-4kftk" Oct 08 14:27:05 crc kubenswrapper[4735]: I1008 14:27:05.978163 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34a77a37-d98c-41ee-aa9e-e1b814507e2b-combined-ca-bundle\") pod \"glance-db-sync-4kftk\" (UID: \"34a77a37-d98c-41ee-aa9e-e1b814507e2b\") " pod="openstack/glance-db-sync-4kftk" Oct 08 14:27:05 crc kubenswrapper[4735]: I1008 14:27:05.978197 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmwl8\" (UniqueName: 
\"kubernetes.io/projected/34a77a37-d98c-41ee-aa9e-e1b814507e2b-kube-api-access-tmwl8\") pod \"glance-db-sync-4kftk\" (UID: \"34a77a37-d98c-41ee-aa9e-e1b814507e2b\") " pod="openstack/glance-db-sync-4kftk" Oct 08 14:27:06 crc kubenswrapper[4735]: I1008 14:27:06.079714 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/34a77a37-d98c-41ee-aa9e-e1b814507e2b-db-sync-config-data\") pod \"glance-db-sync-4kftk\" (UID: \"34a77a37-d98c-41ee-aa9e-e1b814507e2b\") " pod="openstack/glance-db-sync-4kftk" Oct 08 14:27:06 crc kubenswrapper[4735]: I1008 14:27:06.079764 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34a77a37-d98c-41ee-aa9e-e1b814507e2b-combined-ca-bundle\") pod \"glance-db-sync-4kftk\" (UID: \"34a77a37-d98c-41ee-aa9e-e1b814507e2b\") " pod="openstack/glance-db-sync-4kftk" Oct 08 14:27:06 crc kubenswrapper[4735]: I1008 14:27:06.079806 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmwl8\" (UniqueName: \"kubernetes.io/projected/34a77a37-d98c-41ee-aa9e-e1b814507e2b-kube-api-access-tmwl8\") pod \"glance-db-sync-4kftk\" (UID: \"34a77a37-d98c-41ee-aa9e-e1b814507e2b\") " pod="openstack/glance-db-sync-4kftk" Oct 08 14:27:06 crc kubenswrapper[4735]: I1008 14:27:06.079909 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34a77a37-d98c-41ee-aa9e-e1b814507e2b-config-data\") pod \"glance-db-sync-4kftk\" (UID: \"34a77a37-d98c-41ee-aa9e-e1b814507e2b\") " pod="openstack/glance-db-sync-4kftk" Oct 08 14:27:06 crc kubenswrapper[4735]: I1008 14:27:06.084160 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/34a77a37-d98c-41ee-aa9e-e1b814507e2b-db-sync-config-data\") pod \"glance-db-sync-4kftk\" 
(UID: \"34a77a37-d98c-41ee-aa9e-e1b814507e2b\") " pod="openstack/glance-db-sync-4kftk" Oct 08 14:27:06 crc kubenswrapper[4735]: I1008 14:27:06.084596 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34a77a37-d98c-41ee-aa9e-e1b814507e2b-combined-ca-bundle\") pod \"glance-db-sync-4kftk\" (UID: \"34a77a37-d98c-41ee-aa9e-e1b814507e2b\") " pod="openstack/glance-db-sync-4kftk" Oct 08 14:27:06 crc kubenswrapper[4735]: I1008 14:27:06.084607 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34a77a37-d98c-41ee-aa9e-e1b814507e2b-config-data\") pod \"glance-db-sync-4kftk\" (UID: \"34a77a37-d98c-41ee-aa9e-e1b814507e2b\") " pod="openstack/glance-db-sync-4kftk" Oct 08 14:27:06 crc kubenswrapper[4735]: I1008 14:27:06.102621 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmwl8\" (UniqueName: \"kubernetes.io/projected/34a77a37-d98c-41ee-aa9e-e1b814507e2b-kube-api-access-tmwl8\") pod \"glance-db-sync-4kftk\" (UID: \"34a77a37-d98c-41ee-aa9e-e1b814507e2b\") " pod="openstack/glance-db-sync-4kftk" Oct 08 14:27:06 crc kubenswrapper[4735]: I1008 14:27:06.177457 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-4kftk" Oct 08 14:27:06 crc kubenswrapper[4735]: I1008 14:27:06.684382 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-4kftk"] Oct 08 14:27:06 crc kubenswrapper[4735]: I1008 14:27:06.930535 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-4kftk" event={"ID":"34a77a37-d98c-41ee-aa9e-e1b814507e2b","Type":"ContainerStarted","Data":"43d0e3c4de3379b797c0130486b4a9a3e8a98dbc5d8473959cc23e36047c9c38"} Oct 08 14:27:07 crc kubenswrapper[4735]: I1008 14:27:07.940717 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-4kftk" event={"ID":"34a77a37-d98c-41ee-aa9e-e1b814507e2b","Type":"ContainerStarted","Data":"41894952995a3b3c0a0eca17cf2bc077920971ed47b354f2041ec9b9da12a012"} Oct 08 14:27:07 crc kubenswrapper[4735]: I1008 14:27:07.961359 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-4kftk" podStartSLOduration=2.961339496 podStartE2EDuration="2.961339496s" podCreationTimestamp="2025-10-08 14:27:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:27:07.957228141 +0000 UTC m=+5477.765532705" watchObservedRunningTime="2025-10-08 14:27:07.961339496 +0000 UTC m=+5477.769644040" Oct 08 14:27:08 crc kubenswrapper[4735]: I1008 14:27:08.263730 4735 scope.go:117] "RemoveContainer" containerID="26a132e417e043b5b76ae625600e75cad8c71093239df4d949bd0193c5a0185b" Oct 08 14:27:08 crc kubenswrapper[4735]: I1008 14:27:08.287958 4735 scope.go:117] "RemoveContainer" containerID="c2c9e03cf806c3171d1a46ab435a0a11b10fadadf815b5db33a061d6fa30f581" Oct 08 14:27:10 crc kubenswrapper[4735]: E1008 14:27:10.508278 4735 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34a77a37_d98c_41ee_aa9e_e1b814507e2b.slice/crio-conmon-41894952995a3b3c0a0eca17cf2bc077920971ed47b354f2041ec9b9da12a012.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34a77a37_d98c_41ee_aa9e_e1b814507e2b.slice/crio-41894952995a3b3c0a0eca17cf2bc077920971ed47b354f2041ec9b9da12a012.scope\": RecentStats: unable to find data in memory cache]" Oct 08 14:27:10 crc kubenswrapper[4735]: I1008 14:27:10.964229 4735 generic.go:334] "Generic (PLEG): container finished" podID="34a77a37-d98c-41ee-aa9e-e1b814507e2b" containerID="41894952995a3b3c0a0eca17cf2bc077920971ed47b354f2041ec9b9da12a012" exitCode=0 Oct 08 14:27:10 crc kubenswrapper[4735]: I1008 14:27:10.964282 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-4kftk" event={"ID":"34a77a37-d98c-41ee-aa9e-e1b814507e2b","Type":"ContainerDied","Data":"41894952995a3b3c0a0eca17cf2bc077920971ed47b354f2041ec9b9da12a012"} Oct 08 14:27:12 crc kubenswrapper[4735]: I1008 14:27:12.378541 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-4kftk" Oct 08 14:27:12 crc kubenswrapper[4735]: I1008 14:27:12.489077 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34a77a37-d98c-41ee-aa9e-e1b814507e2b-config-data\") pod \"34a77a37-d98c-41ee-aa9e-e1b814507e2b\" (UID: \"34a77a37-d98c-41ee-aa9e-e1b814507e2b\") " Oct 08 14:27:12 crc kubenswrapper[4735]: I1008 14:27:12.489258 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmwl8\" (UniqueName: \"kubernetes.io/projected/34a77a37-d98c-41ee-aa9e-e1b814507e2b-kube-api-access-tmwl8\") pod \"34a77a37-d98c-41ee-aa9e-e1b814507e2b\" (UID: \"34a77a37-d98c-41ee-aa9e-e1b814507e2b\") " Oct 08 14:27:12 crc kubenswrapper[4735]: I1008 14:27:12.489316 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/34a77a37-d98c-41ee-aa9e-e1b814507e2b-db-sync-config-data\") pod \"34a77a37-d98c-41ee-aa9e-e1b814507e2b\" (UID: \"34a77a37-d98c-41ee-aa9e-e1b814507e2b\") " Oct 08 14:27:12 crc kubenswrapper[4735]: I1008 14:27:12.489382 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34a77a37-d98c-41ee-aa9e-e1b814507e2b-combined-ca-bundle\") pod \"34a77a37-d98c-41ee-aa9e-e1b814507e2b\" (UID: \"34a77a37-d98c-41ee-aa9e-e1b814507e2b\") " Oct 08 14:27:12 crc kubenswrapper[4735]: I1008 14:27:12.499655 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34a77a37-d98c-41ee-aa9e-e1b814507e2b-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "34a77a37-d98c-41ee-aa9e-e1b814507e2b" (UID: "34a77a37-d98c-41ee-aa9e-e1b814507e2b"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:27:12 crc kubenswrapper[4735]: I1008 14:27:12.499841 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34a77a37-d98c-41ee-aa9e-e1b814507e2b-kube-api-access-tmwl8" (OuterVolumeSpecName: "kube-api-access-tmwl8") pod "34a77a37-d98c-41ee-aa9e-e1b814507e2b" (UID: "34a77a37-d98c-41ee-aa9e-e1b814507e2b"). InnerVolumeSpecName "kube-api-access-tmwl8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:27:12 crc kubenswrapper[4735]: I1008 14:27:12.512106 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34a77a37-d98c-41ee-aa9e-e1b814507e2b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "34a77a37-d98c-41ee-aa9e-e1b814507e2b" (UID: "34a77a37-d98c-41ee-aa9e-e1b814507e2b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:27:12 crc kubenswrapper[4735]: I1008 14:27:12.535212 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34a77a37-d98c-41ee-aa9e-e1b814507e2b-config-data" (OuterVolumeSpecName: "config-data") pod "34a77a37-d98c-41ee-aa9e-e1b814507e2b" (UID: "34a77a37-d98c-41ee-aa9e-e1b814507e2b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:27:12 crc kubenswrapper[4735]: I1008 14:27:12.591141 4735 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/34a77a37-d98c-41ee-aa9e-e1b814507e2b-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 14:27:12 crc kubenswrapper[4735]: I1008 14:27:12.591175 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34a77a37-d98c-41ee-aa9e-e1b814507e2b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 14:27:12 crc kubenswrapper[4735]: I1008 14:27:12.591185 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34a77a37-d98c-41ee-aa9e-e1b814507e2b-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 14:27:12 crc kubenswrapper[4735]: I1008 14:27:12.591194 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmwl8\" (UniqueName: \"kubernetes.io/projected/34a77a37-d98c-41ee-aa9e-e1b814507e2b-kube-api-access-tmwl8\") on node \"crc\" DevicePath \"\"" Oct 08 14:27:12 crc kubenswrapper[4735]: I1008 14:27:12.981119 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-4kftk" event={"ID":"34a77a37-d98c-41ee-aa9e-e1b814507e2b","Type":"ContainerDied","Data":"43d0e3c4de3379b797c0130486b4a9a3e8a98dbc5d8473959cc23e36047c9c38"} Oct 08 14:27:12 crc kubenswrapper[4735]: I1008 14:27:12.981162 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43d0e3c4de3379b797c0130486b4a9a3e8a98dbc5d8473959cc23e36047c9c38" Oct 08 14:27:12 crc kubenswrapper[4735]: I1008 14:27:12.981198 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-4kftk" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.260168 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 14:27:13 crc kubenswrapper[4735]: E1008 14:27:13.260955 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34a77a37-d98c-41ee-aa9e-e1b814507e2b" containerName="glance-db-sync" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.260972 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="34a77a37-d98c-41ee-aa9e-e1b814507e2b" containerName="glance-db-sync" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.261147 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="34a77a37-d98c-41ee-aa9e-e1b814507e2b" containerName="glance-db-sync" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.262198 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.266518 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.266737 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.266831 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-796xw" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.266931 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.289920 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.384835 4735 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-6d54fbc7c9-srvcc"] Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.386456 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d54fbc7c9-srvcc" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.409909 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e848fcc-9e0c-4b31-a230-161fd87a2c91-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5e848fcc-9e0c-4b31-a230-161fd87a2c91\") " pod="openstack/glance-default-external-api-0" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.409955 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5e848fcc-9e0c-4b31-a230-161fd87a2c91-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5e848fcc-9e0c-4b31-a230-161fd87a2c91\") " pod="openstack/glance-default-external-api-0" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.410002 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e848fcc-9e0c-4b31-a230-161fd87a2c91-scripts\") pod \"glance-default-external-api-0\" (UID: \"5e848fcc-9e0c-4b31-a230-161fd87a2c91\") " pod="openstack/glance-default-external-api-0" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.410031 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e848fcc-9e0c-4b31-a230-161fd87a2c91-logs\") pod \"glance-default-external-api-0\" (UID: \"5e848fcc-9e0c-4b31-a230-161fd87a2c91\") " pod="openstack/glance-default-external-api-0" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.410052 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ceph\" (UniqueName: \"kubernetes.io/projected/5e848fcc-9e0c-4b31-a230-161fd87a2c91-ceph\") pod \"glance-default-external-api-0\" (UID: \"5e848fcc-9e0c-4b31-a230-161fd87a2c91\") " pod="openstack/glance-default-external-api-0" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.410070 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e848fcc-9e0c-4b31-a230-161fd87a2c91-config-data\") pod \"glance-default-external-api-0\" (UID: \"5e848fcc-9e0c-4b31-a230-161fd87a2c91\") " pod="openstack/glance-default-external-api-0" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.410093 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbrp8\" (UniqueName: \"kubernetes.io/projected/5e848fcc-9e0c-4b31-a230-161fd87a2c91-kube-api-access-pbrp8\") pod \"glance-default-external-api-0\" (UID: \"5e848fcc-9e0c-4b31-a230-161fd87a2c91\") " pod="openstack/glance-default-external-api-0" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.411383 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d54fbc7c9-srvcc"] Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.468029 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.469473 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.472867 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.493356 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.520011 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f648625a-5d58-41b8-b4e9-1942b0941318-config\") pod \"dnsmasq-dns-6d54fbc7c9-srvcc\" (UID: \"f648625a-5d58-41b8-b4e9-1942b0941318\") " pod="openstack/dnsmasq-dns-6d54fbc7c9-srvcc" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.521693 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5e848fcc-9e0c-4b31-a230-161fd87a2c91-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5e848fcc-9e0c-4b31-a230-161fd87a2c91\") " pod="openstack/glance-default-external-api-0" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.521867 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5xn6\" (UniqueName: \"kubernetes.io/projected/f648625a-5d58-41b8-b4e9-1942b0941318-kube-api-access-c5xn6\") pod \"dnsmasq-dns-6d54fbc7c9-srvcc\" (UID: \"f648625a-5d58-41b8-b4e9-1942b0941318\") " pod="openstack/dnsmasq-dns-6d54fbc7c9-srvcc" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.522002 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e848fcc-9e0c-4b31-a230-161fd87a2c91-scripts\") pod \"glance-default-external-api-0\" (UID: \"5e848fcc-9e0c-4b31-a230-161fd87a2c91\") " pod="openstack/glance-default-external-api-0" Oct 08 
14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.522111 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e848fcc-9e0c-4b31-a230-161fd87a2c91-logs\") pod \"glance-default-external-api-0\" (UID: \"5e848fcc-9e0c-4b31-a230-161fd87a2c91\") " pod="openstack/glance-default-external-api-0" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.522188 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5e848fcc-9e0c-4b31-a230-161fd87a2c91-ceph\") pod \"glance-default-external-api-0\" (UID: \"5e848fcc-9e0c-4b31-a230-161fd87a2c91\") " pod="openstack/glance-default-external-api-0" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.522274 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e848fcc-9e0c-4b31-a230-161fd87a2c91-config-data\") pod \"glance-default-external-api-0\" (UID: \"5e848fcc-9e0c-4b31-a230-161fd87a2c91\") " pod="openstack/glance-default-external-api-0" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.522362 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbrp8\" (UniqueName: \"kubernetes.io/projected/5e848fcc-9e0c-4b31-a230-161fd87a2c91-kube-api-access-pbrp8\") pod \"glance-default-external-api-0\" (UID: \"5e848fcc-9e0c-4b31-a230-161fd87a2c91\") " pod="openstack/glance-default-external-api-0" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.522450 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f648625a-5d58-41b8-b4e9-1942b0941318-ovsdbserver-sb\") pod \"dnsmasq-dns-6d54fbc7c9-srvcc\" (UID: \"f648625a-5d58-41b8-b4e9-1942b0941318\") " pod="openstack/dnsmasq-dns-6d54fbc7c9-srvcc" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.522588 4735 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f648625a-5d58-41b8-b4e9-1942b0941318-dns-svc\") pod \"dnsmasq-dns-6d54fbc7c9-srvcc\" (UID: \"f648625a-5d58-41b8-b4e9-1942b0941318\") " pod="openstack/dnsmasq-dns-6d54fbc7c9-srvcc" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.522690 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f648625a-5d58-41b8-b4e9-1942b0941318-ovsdbserver-nb\") pod \"dnsmasq-dns-6d54fbc7c9-srvcc\" (UID: \"f648625a-5d58-41b8-b4e9-1942b0941318\") " pod="openstack/dnsmasq-dns-6d54fbc7c9-srvcc" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.522868 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e848fcc-9e0c-4b31-a230-161fd87a2c91-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5e848fcc-9e0c-4b31-a230-161fd87a2c91\") " pod="openstack/glance-default-external-api-0" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.523261 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e848fcc-9e0c-4b31-a230-161fd87a2c91-logs\") pod \"glance-default-external-api-0\" (UID: \"5e848fcc-9e0c-4b31-a230-161fd87a2c91\") " pod="openstack/glance-default-external-api-0" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.523493 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5e848fcc-9e0c-4b31-a230-161fd87a2c91-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5e848fcc-9e0c-4b31-a230-161fd87a2c91\") " pod="openstack/glance-default-external-api-0" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.534597 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e848fcc-9e0c-4b31-a230-161fd87a2c91-scripts\") pod \"glance-default-external-api-0\" (UID: \"5e848fcc-9e0c-4b31-a230-161fd87a2c91\") " pod="openstack/glance-default-external-api-0" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.535828 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5e848fcc-9e0c-4b31-a230-161fd87a2c91-ceph\") pod \"glance-default-external-api-0\" (UID: \"5e848fcc-9e0c-4b31-a230-161fd87a2c91\") " pod="openstack/glance-default-external-api-0" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.536507 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e848fcc-9e0c-4b31-a230-161fd87a2c91-config-data\") pod \"glance-default-external-api-0\" (UID: \"5e848fcc-9e0c-4b31-a230-161fd87a2c91\") " pod="openstack/glance-default-external-api-0" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.540425 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e848fcc-9e0c-4b31-a230-161fd87a2c91-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5e848fcc-9e0c-4b31-a230-161fd87a2c91\") " pod="openstack/glance-default-external-api-0" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.558363 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbrp8\" (UniqueName: \"kubernetes.io/projected/5e848fcc-9e0c-4b31-a230-161fd87a2c91-kube-api-access-pbrp8\") pod \"glance-default-external-api-0\" (UID: \"5e848fcc-9e0c-4b31-a230-161fd87a2c91\") " pod="openstack/glance-default-external-api-0" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.591316 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.625863 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e0378c5a-921f-4820-ae76-6982e824d3be-ceph\") pod \"glance-default-internal-api-0\" (UID: \"e0378c5a-921f-4820-ae76-6982e824d3be\") " pod="openstack/glance-default-internal-api-0" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.625926 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0378c5a-921f-4820-ae76-6982e824d3be-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e0378c5a-921f-4820-ae76-6982e824d3be\") " pod="openstack/glance-default-internal-api-0" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.625949 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e0378c5a-921f-4820-ae76-6982e824d3be-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e0378c5a-921f-4820-ae76-6982e824d3be\") " pod="openstack/glance-default-internal-api-0" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.626076 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f648625a-5d58-41b8-b4e9-1942b0941318-ovsdbserver-sb\") pod \"dnsmasq-dns-6d54fbc7c9-srvcc\" (UID: \"f648625a-5d58-41b8-b4e9-1942b0941318\") " pod="openstack/dnsmasq-dns-6d54fbc7c9-srvcc" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.626164 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4znpg\" (UniqueName: \"kubernetes.io/projected/e0378c5a-921f-4820-ae76-6982e824d3be-kube-api-access-4znpg\") pod \"glance-default-internal-api-0\" (UID: 
\"e0378c5a-921f-4820-ae76-6982e824d3be\") " pod="openstack/glance-default-internal-api-0" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.626196 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f648625a-5d58-41b8-b4e9-1942b0941318-dns-svc\") pod \"dnsmasq-dns-6d54fbc7c9-srvcc\" (UID: \"f648625a-5d58-41b8-b4e9-1942b0941318\") " pod="openstack/dnsmasq-dns-6d54fbc7c9-srvcc" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.626236 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0378c5a-921f-4820-ae76-6982e824d3be-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e0378c5a-921f-4820-ae76-6982e824d3be\") " pod="openstack/glance-default-internal-api-0" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.626265 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f648625a-5d58-41b8-b4e9-1942b0941318-ovsdbserver-nb\") pod \"dnsmasq-dns-6d54fbc7c9-srvcc\" (UID: \"f648625a-5d58-41b8-b4e9-1942b0941318\") " pod="openstack/dnsmasq-dns-6d54fbc7c9-srvcc" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.626373 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0378c5a-921f-4820-ae76-6982e824d3be-logs\") pod \"glance-default-internal-api-0\" (UID: \"e0378c5a-921f-4820-ae76-6982e824d3be\") " pod="openstack/glance-default-internal-api-0" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.626420 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f648625a-5d58-41b8-b4e9-1942b0941318-config\") pod \"dnsmasq-dns-6d54fbc7c9-srvcc\" (UID: \"f648625a-5d58-41b8-b4e9-1942b0941318\") " 
pod="openstack/dnsmasq-dns-6d54fbc7c9-srvcc" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.626500 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5xn6\" (UniqueName: \"kubernetes.io/projected/f648625a-5d58-41b8-b4e9-1942b0941318-kube-api-access-c5xn6\") pod \"dnsmasq-dns-6d54fbc7c9-srvcc\" (UID: \"f648625a-5d58-41b8-b4e9-1942b0941318\") " pod="openstack/dnsmasq-dns-6d54fbc7c9-srvcc" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.626557 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0378c5a-921f-4820-ae76-6982e824d3be-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e0378c5a-921f-4820-ae76-6982e824d3be\") " pod="openstack/glance-default-internal-api-0" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.627294 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f648625a-5d58-41b8-b4e9-1942b0941318-ovsdbserver-sb\") pod \"dnsmasq-dns-6d54fbc7c9-srvcc\" (UID: \"f648625a-5d58-41b8-b4e9-1942b0941318\") " pod="openstack/dnsmasq-dns-6d54fbc7c9-srvcc" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.627656 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f648625a-5d58-41b8-b4e9-1942b0941318-ovsdbserver-nb\") pod \"dnsmasq-dns-6d54fbc7c9-srvcc\" (UID: \"f648625a-5d58-41b8-b4e9-1942b0941318\") " pod="openstack/dnsmasq-dns-6d54fbc7c9-srvcc" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.627686 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f648625a-5d58-41b8-b4e9-1942b0941318-dns-svc\") pod \"dnsmasq-dns-6d54fbc7c9-srvcc\" (UID: \"f648625a-5d58-41b8-b4e9-1942b0941318\") " pod="openstack/dnsmasq-dns-6d54fbc7c9-srvcc" Oct 08 14:27:13 crc 
kubenswrapper[4735]: I1008 14:27:13.627704 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f648625a-5d58-41b8-b4e9-1942b0941318-config\") pod \"dnsmasq-dns-6d54fbc7c9-srvcc\" (UID: \"f648625a-5d58-41b8-b4e9-1942b0941318\") " pod="openstack/dnsmasq-dns-6d54fbc7c9-srvcc" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.648582 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5xn6\" (UniqueName: \"kubernetes.io/projected/f648625a-5d58-41b8-b4e9-1942b0941318-kube-api-access-c5xn6\") pod \"dnsmasq-dns-6d54fbc7c9-srvcc\" (UID: \"f648625a-5d58-41b8-b4e9-1942b0941318\") " pod="openstack/dnsmasq-dns-6d54fbc7c9-srvcc" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.712886 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d54fbc7c9-srvcc" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.728341 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0378c5a-921f-4820-ae76-6982e824d3be-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e0378c5a-921f-4820-ae76-6982e824d3be\") " pod="openstack/glance-default-internal-api-0" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.728627 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e0378c5a-921f-4820-ae76-6982e824d3be-ceph\") pod \"glance-default-internal-api-0\" (UID: \"e0378c5a-921f-4820-ae76-6982e824d3be\") " pod="openstack/glance-default-internal-api-0" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.728651 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0378c5a-921f-4820-ae76-6982e824d3be-scripts\") pod \"glance-default-internal-api-0\" (UID: 
\"e0378c5a-921f-4820-ae76-6982e824d3be\") " pod="openstack/glance-default-internal-api-0" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.728669 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e0378c5a-921f-4820-ae76-6982e824d3be-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e0378c5a-921f-4820-ae76-6982e824d3be\") " pod="openstack/glance-default-internal-api-0" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.728705 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4znpg\" (UniqueName: \"kubernetes.io/projected/e0378c5a-921f-4820-ae76-6982e824d3be-kube-api-access-4znpg\") pod \"glance-default-internal-api-0\" (UID: \"e0378c5a-921f-4820-ae76-6982e824d3be\") " pod="openstack/glance-default-internal-api-0" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.728734 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0378c5a-921f-4820-ae76-6982e824d3be-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e0378c5a-921f-4820-ae76-6982e824d3be\") " pod="openstack/glance-default-internal-api-0" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.728780 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0378c5a-921f-4820-ae76-6982e824d3be-logs\") pod \"glance-default-internal-api-0\" (UID: \"e0378c5a-921f-4820-ae76-6982e824d3be\") " pod="openstack/glance-default-internal-api-0" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.729177 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0378c5a-921f-4820-ae76-6982e824d3be-logs\") pod \"glance-default-internal-api-0\" (UID: \"e0378c5a-921f-4820-ae76-6982e824d3be\") " 
pod="openstack/glance-default-internal-api-0" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.729345 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e0378c5a-921f-4820-ae76-6982e824d3be-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e0378c5a-921f-4820-ae76-6982e824d3be\") " pod="openstack/glance-default-internal-api-0" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.733086 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e0378c5a-921f-4820-ae76-6982e824d3be-ceph\") pod \"glance-default-internal-api-0\" (UID: \"e0378c5a-921f-4820-ae76-6982e824d3be\") " pod="openstack/glance-default-internal-api-0" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.765658 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0378c5a-921f-4820-ae76-6982e824d3be-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e0378c5a-921f-4820-ae76-6982e824d3be\") " pod="openstack/glance-default-internal-api-0" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.766036 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0378c5a-921f-4820-ae76-6982e824d3be-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e0378c5a-921f-4820-ae76-6982e824d3be\") " pod="openstack/glance-default-internal-api-0" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.777252 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4znpg\" (UniqueName: \"kubernetes.io/projected/e0378c5a-921f-4820-ae76-6982e824d3be-kube-api-access-4znpg\") pod \"glance-default-internal-api-0\" (UID: \"e0378c5a-921f-4820-ae76-6982e824d3be\") " pod="openstack/glance-default-internal-api-0" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.795973 
4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0378c5a-921f-4820-ae76-6982e824d3be-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e0378c5a-921f-4820-ae76-6982e824d3be\") " pod="openstack/glance-default-internal-api-0" Oct 08 14:27:13 crc kubenswrapper[4735]: I1008 14:27:13.822903 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 08 14:27:14 crc kubenswrapper[4735]: I1008 14:27:14.250991 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 14:27:14 crc kubenswrapper[4735]: I1008 14:27:14.306959 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d54fbc7c9-srvcc"] Oct 08 14:27:14 crc kubenswrapper[4735]: W1008 14:27:14.309292 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf648625a_5d58_41b8_b4e9_1942b0941318.slice/crio-a04b394fe57284c2429a104119cd0d6cb86833ea760cf629346b35510d352957 WatchSource:0}: Error finding container a04b394fe57284c2429a104119cd0d6cb86833ea760cf629346b35510d352957: Status 404 returned error can't find the container with id a04b394fe57284c2429a104119cd0d6cb86833ea760cf629346b35510d352957 Oct 08 14:27:14 crc kubenswrapper[4735]: I1008 14:27:14.455601 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 14:27:14 crc kubenswrapper[4735]: W1008 14:27:14.471093 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0378c5a_921f_4820_ae76_6982e824d3be.slice/crio-24d818018acdfc36615ca60c5e58961189e683399517d395a2ebd681631fe7ac WatchSource:0}: Error finding container 24d818018acdfc36615ca60c5e58961189e683399517d395a2ebd681631fe7ac: Status 404 returned error can't find the container 
with id 24d818018acdfc36615ca60c5e58961189e683399517d395a2ebd681631fe7ac Oct 08 14:27:14 crc kubenswrapper[4735]: I1008 14:27:14.850706 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 14:27:15 crc kubenswrapper[4735]: I1008 14:27:15.025915 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e0378c5a-921f-4820-ae76-6982e824d3be","Type":"ContainerStarted","Data":"24d818018acdfc36615ca60c5e58961189e683399517d395a2ebd681631fe7ac"} Oct 08 14:27:15 crc kubenswrapper[4735]: I1008 14:27:15.027868 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5e848fcc-9e0c-4b31-a230-161fd87a2c91","Type":"ContainerStarted","Data":"4d1c62bee42e7ff1210429e8c2f61d1bb35f036dbd77a17f97b8f1898df32281"} Oct 08 14:27:15 crc kubenswrapper[4735]: I1008 14:27:15.027903 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5e848fcc-9e0c-4b31-a230-161fd87a2c91","Type":"ContainerStarted","Data":"303f73a7c42e75505833f802e31faa2bd467f25a864cb877c2e982c20f567e95"} Oct 08 14:27:15 crc kubenswrapper[4735]: I1008 14:27:15.029337 4735 generic.go:334] "Generic (PLEG): container finished" podID="f648625a-5d58-41b8-b4e9-1942b0941318" containerID="0bb47edf43339ed5daab5366ef76be4b4d30d7087bfae3f141bf31529422a3a6" exitCode=0 Oct 08 14:27:15 crc kubenswrapper[4735]: I1008 14:27:15.029367 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d54fbc7c9-srvcc" event={"ID":"f648625a-5d58-41b8-b4e9-1942b0941318","Type":"ContainerDied","Data":"0bb47edf43339ed5daab5366ef76be4b4d30d7087bfae3f141bf31529422a3a6"} Oct 08 14:27:15 crc kubenswrapper[4735]: I1008 14:27:15.029385 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d54fbc7c9-srvcc" 
event={"ID":"f648625a-5d58-41b8-b4e9-1942b0941318","Type":"ContainerStarted","Data":"a04b394fe57284c2429a104119cd0d6cb86833ea760cf629346b35510d352957"} Oct 08 14:27:15 crc kubenswrapper[4735]: I1008 14:27:15.894963 4735 scope.go:117] "RemoveContainer" containerID="d6c737b897aa463dba6310d9840b3f5b8e05389b1e2bc8c1a8b7b77133f4dc28" Oct 08 14:27:15 crc kubenswrapper[4735]: E1008 14:27:15.895707 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:27:16 crc kubenswrapper[4735]: I1008 14:27:16.041847 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d54fbc7c9-srvcc" event={"ID":"f648625a-5d58-41b8-b4e9-1942b0941318","Type":"ContainerStarted","Data":"b288b949edbc594ce98e2cd2ae8be6a23733f8908193b0408cda1e20065e3cc8"} Oct 08 14:27:16 crc kubenswrapper[4735]: I1008 14:27:16.042886 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d54fbc7c9-srvcc" Oct 08 14:27:16 crc kubenswrapper[4735]: I1008 14:27:16.044823 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e0378c5a-921f-4820-ae76-6982e824d3be","Type":"ContainerStarted","Data":"e98c3143a25d180373210f03feed186e494ffdee631c9cbd51cac3a4e9f56bbc"} Oct 08 14:27:16 crc kubenswrapper[4735]: I1008 14:27:16.044869 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e0378c5a-921f-4820-ae76-6982e824d3be","Type":"ContainerStarted","Data":"4a2705557d709a239a345d36e6cfbb2855fa44f750d788e564d32da2330ebdef"} Oct 08 14:27:16 crc kubenswrapper[4735]: I1008 
14:27:16.047147 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5e848fcc-9e0c-4b31-a230-161fd87a2c91","Type":"ContainerStarted","Data":"1726fc434d3e28f43d74a5b80cfe518fabdddd9c5f6f363f5331bf0f388c7c29"} Oct 08 14:27:16 crc kubenswrapper[4735]: I1008 14:27:16.047266 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="5e848fcc-9e0c-4b31-a230-161fd87a2c91" containerName="glance-httpd" containerID="cri-o://1726fc434d3e28f43d74a5b80cfe518fabdddd9c5f6f363f5331bf0f388c7c29" gracePeriod=30 Oct 08 14:27:16 crc kubenswrapper[4735]: I1008 14:27:16.047238 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="5e848fcc-9e0c-4b31-a230-161fd87a2c91" containerName="glance-log" containerID="cri-o://4d1c62bee42e7ff1210429e8c2f61d1bb35f036dbd77a17f97b8f1898df32281" gracePeriod=30 Oct 08 14:27:16 crc kubenswrapper[4735]: I1008 14:27:16.065297 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d54fbc7c9-srvcc" podStartSLOduration=3.065275138 podStartE2EDuration="3.065275138s" podCreationTimestamp="2025-10-08 14:27:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:27:16.058153546 +0000 UTC m=+5485.866458100" watchObservedRunningTime="2025-10-08 14:27:16.065275138 +0000 UTC m=+5485.873579682" Oct 08 14:27:16 crc kubenswrapper[4735]: I1008 14:27:16.092551 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.092530166 podStartE2EDuration="3.092530166s" podCreationTimestamp="2025-10-08 14:27:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 
14:27:16.079962424 +0000 UTC m=+5485.888266968" watchObservedRunningTime="2025-10-08 14:27:16.092530166 +0000 UTC m=+5485.900834710" Oct 08 14:27:16 crc kubenswrapper[4735]: I1008 14:27:16.110539 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.110518407 podStartE2EDuration="3.110518407s" podCreationTimestamp="2025-10-08 14:27:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:27:16.104836251 +0000 UTC m=+5485.913140795" watchObservedRunningTime="2025-10-08 14:27:16.110518407 +0000 UTC m=+5485.918822951" Oct 08 14:27:16 crc kubenswrapper[4735]: I1008 14:27:16.667067 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 14:27:16 crc kubenswrapper[4735]: I1008 14:27:16.788514 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e848fcc-9e0c-4b31-a230-161fd87a2c91-scripts\") pod \"5e848fcc-9e0c-4b31-a230-161fd87a2c91\" (UID: \"5e848fcc-9e0c-4b31-a230-161fd87a2c91\") " Oct 08 14:27:16 crc kubenswrapper[4735]: I1008 14:27:16.788575 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e848fcc-9e0c-4b31-a230-161fd87a2c91-combined-ca-bundle\") pod \"5e848fcc-9e0c-4b31-a230-161fd87a2c91\" (UID: \"5e848fcc-9e0c-4b31-a230-161fd87a2c91\") " Oct 08 14:27:16 crc kubenswrapper[4735]: I1008 14:27:16.788647 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5e848fcc-9e0c-4b31-a230-161fd87a2c91-ceph\") pod \"5e848fcc-9e0c-4b31-a230-161fd87a2c91\" (UID: \"5e848fcc-9e0c-4b31-a230-161fd87a2c91\") " Oct 08 14:27:16 crc kubenswrapper[4735]: I1008 14:27:16.788693 4735 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e848fcc-9e0c-4b31-a230-161fd87a2c91-logs\") pod \"5e848fcc-9e0c-4b31-a230-161fd87a2c91\" (UID: \"5e848fcc-9e0c-4b31-a230-161fd87a2c91\") "
Oct 08 14:27:16 crc kubenswrapper[4735]: I1008 14:27:16.788782 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5e848fcc-9e0c-4b31-a230-161fd87a2c91-httpd-run\") pod \"5e848fcc-9e0c-4b31-a230-161fd87a2c91\" (UID: \"5e848fcc-9e0c-4b31-a230-161fd87a2c91\") "
Oct 08 14:27:16 crc kubenswrapper[4735]: I1008 14:27:16.788812 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e848fcc-9e0c-4b31-a230-161fd87a2c91-config-data\") pod \"5e848fcc-9e0c-4b31-a230-161fd87a2c91\" (UID: \"5e848fcc-9e0c-4b31-a230-161fd87a2c91\") "
Oct 08 14:27:16 crc kubenswrapper[4735]: I1008 14:27:16.788928 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbrp8\" (UniqueName: \"kubernetes.io/projected/5e848fcc-9e0c-4b31-a230-161fd87a2c91-kube-api-access-pbrp8\") pod \"5e848fcc-9e0c-4b31-a230-161fd87a2c91\" (UID: \"5e848fcc-9e0c-4b31-a230-161fd87a2c91\") "
Oct 08 14:27:16 crc kubenswrapper[4735]: I1008 14:27:16.790774 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e848fcc-9e0c-4b31-a230-161fd87a2c91-logs" (OuterVolumeSpecName: "logs") pod "5e848fcc-9e0c-4b31-a230-161fd87a2c91" (UID: "5e848fcc-9e0c-4b31-a230-161fd87a2c91"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 08 14:27:16 crc kubenswrapper[4735]: I1008 14:27:16.790933 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e848fcc-9e0c-4b31-a230-161fd87a2c91-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "5e848fcc-9e0c-4b31-a230-161fd87a2c91" (UID: "5e848fcc-9e0c-4b31-a230-161fd87a2c91"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 08 14:27:16 crc kubenswrapper[4735]: I1008 14:27:16.794778 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e848fcc-9e0c-4b31-a230-161fd87a2c91-scripts" (OuterVolumeSpecName: "scripts") pod "5e848fcc-9e0c-4b31-a230-161fd87a2c91" (UID: "5e848fcc-9e0c-4b31-a230-161fd87a2c91"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 14:27:16 crc kubenswrapper[4735]: I1008 14:27:16.794825 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e848fcc-9e0c-4b31-a230-161fd87a2c91-kube-api-access-pbrp8" (OuterVolumeSpecName: "kube-api-access-pbrp8") pod "5e848fcc-9e0c-4b31-a230-161fd87a2c91" (UID: "5e848fcc-9e0c-4b31-a230-161fd87a2c91"). InnerVolumeSpecName "kube-api-access-pbrp8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 14:27:16 crc kubenswrapper[4735]: I1008 14:27:16.795325 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e848fcc-9e0c-4b31-a230-161fd87a2c91-ceph" (OuterVolumeSpecName: "ceph") pod "5e848fcc-9e0c-4b31-a230-161fd87a2c91" (UID: "5e848fcc-9e0c-4b31-a230-161fd87a2c91"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 14:27:16 crc kubenswrapper[4735]: I1008 14:27:16.824820 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e848fcc-9e0c-4b31-a230-161fd87a2c91-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5e848fcc-9e0c-4b31-a230-161fd87a2c91" (UID: "5e848fcc-9e0c-4b31-a230-161fd87a2c91"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 14:27:16 crc kubenswrapper[4735]: I1008 14:27:16.839590 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e848fcc-9e0c-4b31-a230-161fd87a2c91-config-data" (OuterVolumeSpecName: "config-data") pod "5e848fcc-9e0c-4b31-a230-161fd87a2c91" (UID: "5e848fcc-9e0c-4b31-a230-161fd87a2c91"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 14:27:16 crc kubenswrapper[4735]: I1008 14:27:16.891214 4735 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5e848fcc-9e0c-4b31-a230-161fd87a2c91-httpd-run\") on node \"crc\" DevicePath \"\""
Oct 08 14:27:16 crc kubenswrapper[4735]: I1008 14:27:16.891247 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e848fcc-9e0c-4b31-a230-161fd87a2c91-config-data\") on node \"crc\" DevicePath \"\""
Oct 08 14:27:16 crc kubenswrapper[4735]: I1008 14:27:16.891257 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbrp8\" (UniqueName: \"kubernetes.io/projected/5e848fcc-9e0c-4b31-a230-161fd87a2c91-kube-api-access-pbrp8\") on node \"crc\" DevicePath \"\""
Oct 08 14:27:16 crc kubenswrapper[4735]: I1008 14:27:16.891271 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e848fcc-9e0c-4b31-a230-161fd87a2c91-scripts\") on node \"crc\" DevicePath \"\""
Oct 08 14:27:16 crc kubenswrapper[4735]: I1008 14:27:16.891280 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e848fcc-9e0c-4b31-a230-161fd87a2c91-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 08 14:27:16 crc kubenswrapper[4735]: I1008 14:27:16.891291 4735 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5e848fcc-9e0c-4b31-a230-161fd87a2c91-ceph\") on node \"crc\" DevicePath \"\""
Oct 08 14:27:16 crc kubenswrapper[4735]: I1008 14:27:16.891299 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e848fcc-9e0c-4b31-a230-161fd87a2c91-logs\") on node \"crc\" DevicePath \"\""
Oct 08 14:27:17 crc kubenswrapper[4735]: I1008 14:27:17.060774 4735 generic.go:334] "Generic (PLEG): container finished" podID="5e848fcc-9e0c-4b31-a230-161fd87a2c91" containerID="1726fc434d3e28f43d74a5b80cfe518fabdddd9c5f6f363f5331bf0f388c7c29" exitCode=0
Oct 08 14:27:17 crc kubenswrapper[4735]: I1008 14:27:17.060807 4735 generic.go:334] "Generic (PLEG): container finished" podID="5e848fcc-9e0c-4b31-a230-161fd87a2c91" containerID="4d1c62bee42e7ff1210429e8c2f61d1bb35f036dbd77a17f97b8f1898df32281" exitCode=143
Oct 08 14:27:17 crc kubenswrapper[4735]: I1008 14:27:17.060848 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Oct 08 14:27:17 crc kubenswrapper[4735]: I1008 14:27:17.060870 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5e848fcc-9e0c-4b31-a230-161fd87a2c91","Type":"ContainerDied","Data":"1726fc434d3e28f43d74a5b80cfe518fabdddd9c5f6f363f5331bf0f388c7c29"}
Oct 08 14:27:17 crc kubenswrapper[4735]: I1008 14:27:17.060915 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5e848fcc-9e0c-4b31-a230-161fd87a2c91","Type":"ContainerDied","Data":"4d1c62bee42e7ff1210429e8c2f61d1bb35f036dbd77a17f97b8f1898df32281"}
Oct 08 14:27:17 crc kubenswrapper[4735]: I1008 14:27:17.060930 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5e848fcc-9e0c-4b31-a230-161fd87a2c91","Type":"ContainerDied","Data":"303f73a7c42e75505833f802e31faa2bd467f25a864cb877c2e982c20f567e95"}
Oct 08 14:27:17 crc kubenswrapper[4735]: I1008 14:27:17.060949 4735 scope.go:117] "RemoveContainer" containerID="1726fc434d3e28f43d74a5b80cfe518fabdddd9c5f6f363f5331bf0f388c7c29"
Oct 08 14:27:17 crc kubenswrapper[4735]: I1008 14:27:17.083980 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 08 14:27:17 crc kubenswrapper[4735]: I1008 14:27:17.088620 4735 scope.go:117] "RemoveContainer" containerID="4d1c62bee42e7ff1210429e8c2f61d1bb35f036dbd77a17f97b8f1898df32281"
Oct 08 14:27:17 crc kubenswrapper[4735]: I1008 14:27:17.094164 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 08 14:27:17 crc kubenswrapper[4735]: I1008 14:27:17.109461 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 08 14:27:17 crc kubenswrapper[4735]: E1008 14:27:17.112932 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e848fcc-9e0c-4b31-a230-161fd87a2c91" containerName="glance-httpd"
Oct 08 14:27:17 crc kubenswrapper[4735]: I1008 14:27:17.112961 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e848fcc-9e0c-4b31-a230-161fd87a2c91" containerName="glance-httpd"
Oct 08 14:27:17 crc kubenswrapper[4735]: E1008 14:27:17.112981 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e848fcc-9e0c-4b31-a230-161fd87a2c91" containerName="glance-log"
Oct 08 14:27:17 crc kubenswrapper[4735]: I1008 14:27:17.112991 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e848fcc-9e0c-4b31-a230-161fd87a2c91" containerName="glance-log"
Oct 08 14:27:17 crc kubenswrapper[4735]: I1008 14:27:17.113218 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e848fcc-9e0c-4b31-a230-161fd87a2c91" containerName="glance-log"
Oct 08 14:27:17 crc kubenswrapper[4735]: I1008 14:27:17.113255 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e848fcc-9e0c-4b31-a230-161fd87a2c91" containerName="glance-httpd"
Oct 08 14:27:17 crc kubenswrapper[4735]: I1008 14:27:17.114386 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Oct 08 14:27:17 crc kubenswrapper[4735]: I1008 14:27:17.118693 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Oct 08 14:27:17 crc kubenswrapper[4735]: I1008 14:27:17.125660 4735 scope.go:117] "RemoveContainer" containerID="1726fc434d3e28f43d74a5b80cfe518fabdddd9c5f6f363f5331bf0f388c7c29"
Oct 08 14:27:17 crc kubenswrapper[4735]: E1008 14:27:17.127886 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1726fc434d3e28f43d74a5b80cfe518fabdddd9c5f6f363f5331bf0f388c7c29\": container with ID starting with 1726fc434d3e28f43d74a5b80cfe518fabdddd9c5f6f363f5331bf0f388c7c29 not found: ID does not exist" containerID="1726fc434d3e28f43d74a5b80cfe518fabdddd9c5f6f363f5331bf0f388c7c29"
Oct 08 14:27:17 crc kubenswrapper[4735]: I1008 14:27:17.127934 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1726fc434d3e28f43d74a5b80cfe518fabdddd9c5f6f363f5331bf0f388c7c29"} err="failed to get container status \"1726fc434d3e28f43d74a5b80cfe518fabdddd9c5f6f363f5331bf0f388c7c29\": rpc error: code = NotFound desc = could not find container \"1726fc434d3e28f43d74a5b80cfe518fabdddd9c5f6f363f5331bf0f388c7c29\": container with ID starting with 1726fc434d3e28f43d74a5b80cfe518fabdddd9c5f6f363f5331bf0f388c7c29 not found: ID does not exist"
Oct 08 14:27:17 crc kubenswrapper[4735]: I1008 14:27:17.127962 4735 scope.go:117] "RemoveContainer" containerID="4d1c62bee42e7ff1210429e8c2f61d1bb35f036dbd77a17f97b8f1898df32281"
Oct 08 14:27:17 crc kubenswrapper[4735]: E1008 14:27:17.128724 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d1c62bee42e7ff1210429e8c2f61d1bb35f036dbd77a17f97b8f1898df32281\": container with ID starting with 4d1c62bee42e7ff1210429e8c2f61d1bb35f036dbd77a17f97b8f1898df32281 not found: ID does not exist" containerID="4d1c62bee42e7ff1210429e8c2f61d1bb35f036dbd77a17f97b8f1898df32281"
Oct 08 14:27:17 crc kubenswrapper[4735]: I1008 14:27:17.128773 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d1c62bee42e7ff1210429e8c2f61d1bb35f036dbd77a17f97b8f1898df32281"} err="failed to get container status \"4d1c62bee42e7ff1210429e8c2f61d1bb35f036dbd77a17f97b8f1898df32281\": rpc error: code = NotFound desc = could not find container \"4d1c62bee42e7ff1210429e8c2f61d1bb35f036dbd77a17f97b8f1898df32281\": container with ID starting with 4d1c62bee42e7ff1210429e8c2f61d1bb35f036dbd77a17f97b8f1898df32281 not found: ID does not exist"
Oct 08 14:27:17 crc kubenswrapper[4735]: I1008 14:27:17.128812 4735 scope.go:117] "RemoveContainer" containerID="1726fc434d3e28f43d74a5b80cfe518fabdddd9c5f6f363f5331bf0f388c7c29"
Oct 08 14:27:17 crc kubenswrapper[4735]: I1008 14:27:17.128896 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 08 14:27:17 crc kubenswrapper[4735]: I1008 14:27:17.129290 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1726fc434d3e28f43d74a5b80cfe518fabdddd9c5f6f363f5331bf0f388c7c29"} err="failed to get container status \"1726fc434d3e28f43d74a5b80cfe518fabdddd9c5f6f363f5331bf0f388c7c29\": rpc error: code = NotFound desc = could not find container \"1726fc434d3e28f43d74a5b80cfe518fabdddd9c5f6f363f5331bf0f388c7c29\": container with ID starting with 1726fc434d3e28f43d74a5b80cfe518fabdddd9c5f6f363f5331bf0f388c7c29 not found: ID does not exist"
Oct 08 14:27:17 crc kubenswrapper[4735]: I1008 14:27:17.129324 4735 scope.go:117] "RemoveContainer" containerID="4d1c62bee42e7ff1210429e8c2f61d1bb35f036dbd77a17f97b8f1898df32281"
Oct 08 14:27:17 crc kubenswrapper[4735]: I1008 14:27:17.131926 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d1c62bee42e7ff1210429e8c2f61d1bb35f036dbd77a17f97b8f1898df32281"} err="failed to get container status \"4d1c62bee42e7ff1210429e8c2f61d1bb35f036dbd77a17f97b8f1898df32281\": rpc error: code = NotFound desc = could not find container \"4d1c62bee42e7ff1210429e8c2f61d1bb35f036dbd77a17f97b8f1898df32281\": container with ID starting with 4d1c62bee42e7ff1210429e8c2f61d1bb35f036dbd77a17f97b8f1898df32281 not found: ID does not exist"
Oct 08 14:27:17 crc kubenswrapper[4735]: I1008 14:27:17.167824 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 08 14:27:17 crc kubenswrapper[4735]: I1008 14:27:17.299296 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7f2c54f5-b89d-4293-86a5-f4bd5e3eff41-ceph\") pod \"glance-default-external-api-0\" (UID: \"7f2c54f5-b89d-4293-86a5-f4bd5e3eff41\") " pod="openstack/glance-default-external-api-0"
Oct 08 14:27:17 crc kubenswrapper[4735]: I1008 14:27:17.299390 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kvpx\" (UniqueName: \"kubernetes.io/projected/7f2c54f5-b89d-4293-86a5-f4bd5e3eff41-kube-api-access-6kvpx\") pod \"glance-default-external-api-0\" (UID: \"7f2c54f5-b89d-4293-86a5-f4bd5e3eff41\") " pod="openstack/glance-default-external-api-0"
Oct 08 14:27:17 crc kubenswrapper[4735]: I1008 14:27:17.299502 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f2c54f5-b89d-4293-86a5-f4bd5e3eff41-config-data\") pod \"glance-default-external-api-0\" (UID: \"7f2c54f5-b89d-4293-86a5-f4bd5e3eff41\") " pod="openstack/glance-default-external-api-0"
Oct 08 14:27:17 crc kubenswrapper[4735]: I1008 14:27:17.299571 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7f2c54f5-b89d-4293-86a5-f4bd5e3eff41-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7f2c54f5-b89d-4293-86a5-f4bd5e3eff41\") " pod="openstack/glance-default-external-api-0"
Oct 08 14:27:17 crc kubenswrapper[4735]: I1008 14:27:17.299605 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f2c54f5-b89d-4293-86a5-f4bd5e3eff41-scripts\") pod \"glance-default-external-api-0\" (UID: \"7f2c54f5-b89d-4293-86a5-f4bd5e3eff41\") " pod="openstack/glance-default-external-api-0"
Oct 08 14:27:17 crc kubenswrapper[4735]: I1008 14:27:17.299663 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f2c54f5-b89d-4293-86a5-f4bd5e3eff41-logs\") pod \"glance-default-external-api-0\" (UID: \"7f2c54f5-b89d-4293-86a5-f4bd5e3eff41\") " pod="openstack/glance-default-external-api-0"
Oct 08 14:27:17 crc kubenswrapper[4735]: I1008 14:27:17.299697 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f2c54f5-b89d-4293-86a5-f4bd5e3eff41-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7f2c54f5-b89d-4293-86a5-f4bd5e3eff41\") " pod="openstack/glance-default-external-api-0"
Oct 08 14:27:17 crc kubenswrapper[4735]: I1008 14:27:17.401088 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kvpx\" (UniqueName: \"kubernetes.io/projected/7f2c54f5-b89d-4293-86a5-f4bd5e3eff41-kube-api-access-6kvpx\") pod \"glance-default-external-api-0\" (UID: \"7f2c54f5-b89d-4293-86a5-f4bd5e3eff41\") " pod="openstack/glance-default-external-api-0"
Oct 08 14:27:17 crc kubenswrapper[4735]: I1008 14:27:17.401181 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f2c54f5-b89d-4293-86a5-f4bd5e3eff41-config-data\") pod \"glance-default-external-api-0\" (UID: \"7f2c54f5-b89d-4293-86a5-f4bd5e3eff41\") " pod="openstack/glance-default-external-api-0"
Oct 08 14:27:17 crc kubenswrapper[4735]: I1008 14:27:17.401234 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7f2c54f5-b89d-4293-86a5-f4bd5e3eff41-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7f2c54f5-b89d-4293-86a5-f4bd5e3eff41\") " pod="openstack/glance-default-external-api-0"
Oct 08 14:27:17 crc kubenswrapper[4735]: I1008 14:27:17.401274 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f2c54f5-b89d-4293-86a5-f4bd5e3eff41-scripts\") pod \"glance-default-external-api-0\" (UID: \"7f2c54f5-b89d-4293-86a5-f4bd5e3eff41\") " pod="openstack/glance-default-external-api-0"
Oct 08 14:27:17 crc kubenswrapper[4735]: I1008 14:27:17.401335 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f2c54f5-b89d-4293-86a5-f4bd5e3eff41-logs\") pod \"glance-default-external-api-0\" (UID: \"7f2c54f5-b89d-4293-86a5-f4bd5e3eff41\") " pod="openstack/glance-default-external-api-0"
Oct 08 14:27:17 crc kubenswrapper[4735]: I1008 14:27:17.401376 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f2c54f5-b89d-4293-86a5-f4bd5e3eff41-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7f2c54f5-b89d-4293-86a5-f4bd5e3eff41\") " pod="openstack/glance-default-external-api-0"
Oct 08 14:27:17 crc kubenswrapper[4735]: I1008 14:27:17.401540 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7f2c54f5-b89d-4293-86a5-f4bd5e3eff41-ceph\") pod \"glance-default-external-api-0\" (UID: \"7f2c54f5-b89d-4293-86a5-f4bd5e3eff41\") " pod="openstack/glance-default-external-api-0"
Oct 08 14:27:17 crc kubenswrapper[4735]: I1008 14:27:17.402657 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7f2c54f5-b89d-4293-86a5-f4bd5e3eff41-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7f2c54f5-b89d-4293-86a5-f4bd5e3eff41\") " pod="openstack/glance-default-external-api-0"
Oct 08 14:27:17 crc kubenswrapper[4735]: I1008 14:27:17.403901 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f2c54f5-b89d-4293-86a5-f4bd5e3eff41-logs\") pod \"glance-default-external-api-0\" (UID: \"7f2c54f5-b89d-4293-86a5-f4bd5e3eff41\") " pod="openstack/glance-default-external-api-0"
Oct 08 14:27:17 crc kubenswrapper[4735]: I1008 14:27:17.405944 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7f2c54f5-b89d-4293-86a5-f4bd5e3eff41-ceph\") pod \"glance-default-external-api-0\" (UID: \"7f2c54f5-b89d-4293-86a5-f4bd5e3eff41\") " pod="openstack/glance-default-external-api-0"
Oct 08 14:27:17 crc kubenswrapper[4735]: I1008 14:27:17.407404 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f2c54f5-b89d-4293-86a5-f4bd5e3eff41-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7f2c54f5-b89d-4293-86a5-f4bd5e3eff41\") " pod="openstack/glance-default-external-api-0"
Oct 08 14:27:17 crc kubenswrapper[4735]: I1008 14:27:17.407793 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f2c54f5-b89d-4293-86a5-f4bd5e3eff41-config-data\") pod \"glance-default-external-api-0\" (UID: \"7f2c54f5-b89d-4293-86a5-f4bd5e3eff41\") " pod="openstack/glance-default-external-api-0"
Oct 08 14:27:17 crc kubenswrapper[4735]: I1008 14:27:17.408947 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f2c54f5-b89d-4293-86a5-f4bd5e3eff41-scripts\") pod \"glance-default-external-api-0\" (UID: \"7f2c54f5-b89d-4293-86a5-f4bd5e3eff41\") " pod="openstack/glance-default-external-api-0"
Oct 08 14:27:17 crc kubenswrapper[4735]: I1008 14:27:17.424788 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kvpx\" (UniqueName: \"kubernetes.io/projected/7f2c54f5-b89d-4293-86a5-f4bd5e3eff41-kube-api-access-6kvpx\") pod \"glance-default-external-api-0\" (UID: \"7f2c54f5-b89d-4293-86a5-f4bd5e3eff41\") " pod="openstack/glance-default-external-api-0"
Oct 08 14:27:17 crc kubenswrapper[4735]: I1008 14:27:17.442664 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Oct 08 14:27:17 crc kubenswrapper[4735]: I1008 14:27:17.987355 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 08 14:27:18 crc kubenswrapper[4735]: I1008 14:27:18.086003 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e0378c5a-921f-4820-ae76-6982e824d3be" containerName="glance-log" containerID="cri-o://e98c3143a25d180373210f03feed186e494ffdee631c9cbd51cac3a4e9f56bbc" gracePeriod=30
Oct 08 14:27:18 crc kubenswrapper[4735]: I1008 14:27:18.086181 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7f2c54f5-b89d-4293-86a5-f4bd5e3eff41","Type":"ContainerStarted","Data":"048eac7772ed4f7112ef49e2c9571e0d5bd1029cab4c85b81379e8c70ec0d44c"}
Oct 08 14:27:18 crc kubenswrapper[4735]: I1008 14:27:18.086697 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e0378c5a-921f-4820-ae76-6982e824d3be" containerName="glance-httpd" containerID="cri-o://4a2705557d709a239a345d36e6cfbb2855fa44f750d788e564d32da2330ebdef" gracePeriod=30
Oct 08 14:27:18 crc kubenswrapper[4735]: I1008 14:27:18.730522 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Oct 08 14:27:18 crc kubenswrapper[4735]: I1008 14:27:18.907027 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e848fcc-9e0c-4b31-a230-161fd87a2c91" path="/var/lib/kubelet/pods/5e848fcc-9e0c-4b31-a230-161fd87a2c91/volumes"
Oct 08 14:27:18 crc kubenswrapper[4735]: I1008 14:27:18.930176 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e0378c5a-921f-4820-ae76-6982e824d3be-ceph\") pod \"e0378c5a-921f-4820-ae76-6982e824d3be\" (UID: \"e0378c5a-921f-4820-ae76-6982e824d3be\") "
Oct 08 14:27:18 crc kubenswrapper[4735]: I1008 14:27:18.930250 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0378c5a-921f-4820-ae76-6982e824d3be-config-data\") pod \"e0378c5a-921f-4820-ae76-6982e824d3be\" (UID: \"e0378c5a-921f-4820-ae76-6982e824d3be\") "
Oct 08 14:27:18 crc kubenswrapper[4735]: I1008 14:27:18.930269 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0378c5a-921f-4820-ae76-6982e824d3be-scripts\") pod \"e0378c5a-921f-4820-ae76-6982e824d3be\" (UID: \"e0378c5a-921f-4820-ae76-6982e824d3be\") "
Oct 08 14:27:18 crc kubenswrapper[4735]: I1008 14:27:18.930292 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4znpg\" (UniqueName: \"kubernetes.io/projected/e0378c5a-921f-4820-ae76-6982e824d3be-kube-api-access-4znpg\") pod \"e0378c5a-921f-4820-ae76-6982e824d3be\" (UID: \"e0378c5a-921f-4820-ae76-6982e824d3be\") "
Oct 08 14:27:18 crc kubenswrapper[4735]: I1008 14:27:18.930312 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e0378c5a-921f-4820-ae76-6982e824d3be-httpd-run\") pod \"e0378c5a-921f-4820-ae76-6982e824d3be\" (UID: \"e0378c5a-921f-4820-ae76-6982e824d3be\") "
Oct 08 14:27:18 crc kubenswrapper[4735]: I1008 14:27:18.930331 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0378c5a-921f-4820-ae76-6982e824d3be-logs\") pod \"e0378c5a-921f-4820-ae76-6982e824d3be\" (UID: \"e0378c5a-921f-4820-ae76-6982e824d3be\") "
Oct 08 14:27:18 crc kubenswrapper[4735]: I1008 14:27:18.930456 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0378c5a-921f-4820-ae76-6982e824d3be-combined-ca-bundle\") pod \"e0378c5a-921f-4820-ae76-6982e824d3be\" (UID: \"e0378c5a-921f-4820-ae76-6982e824d3be\") "
Oct 08 14:27:18 crc kubenswrapper[4735]: I1008 14:27:18.931697 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0378c5a-921f-4820-ae76-6982e824d3be-logs" (OuterVolumeSpecName: "logs") pod "e0378c5a-921f-4820-ae76-6982e824d3be" (UID: "e0378c5a-921f-4820-ae76-6982e824d3be"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 08 14:27:18 crc kubenswrapper[4735]: I1008 14:27:18.931835 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0378c5a-921f-4820-ae76-6982e824d3be-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e0378c5a-921f-4820-ae76-6982e824d3be" (UID: "e0378c5a-921f-4820-ae76-6982e824d3be"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 08 14:27:18 crc kubenswrapper[4735]: I1008 14:27:18.934696 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0378c5a-921f-4820-ae76-6982e824d3be-scripts" (OuterVolumeSpecName: "scripts") pod "e0378c5a-921f-4820-ae76-6982e824d3be" (UID: "e0378c5a-921f-4820-ae76-6982e824d3be"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 14:27:18 crc kubenswrapper[4735]: I1008 14:27:18.934760 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0378c5a-921f-4820-ae76-6982e824d3be-ceph" (OuterVolumeSpecName: "ceph") pod "e0378c5a-921f-4820-ae76-6982e824d3be" (UID: "e0378c5a-921f-4820-ae76-6982e824d3be"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 14:27:18 crc kubenswrapper[4735]: I1008 14:27:18.936785 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0378c5a-921f-4820-ae76-6982e824d3be-kube-api-access-4znpg" (OuterVolumeSpecName: "kube-api-access-4znpg") pod "e0378c5a-921f-4820-ae76-6982e824d3be" (UID: "e0378c5a-921f-4820-ae76-6982e824d3be"). InnerVolumeSpecName "kube-api-access-4znpg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 14:27:18 crc kubenswrapper[4735]: I1008 14:27:18.978913 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0378c5a-921f-4820-ae76-6982e824d3be-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e0378c5a-921f-4820-ae76-6982e824d3be" (UID: "e0378c5a-921f-4820-ae76-6982e824d3be"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 14:27:18 crc kubenswrapper[4735]: I1008 14:27:18.997018 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0378c5a-921f-4820-ae76-6982e824d3be-config-data" (OuterVolumeSpecName: "config-data") pod "e0378c5a-921f-4820-ae76-6982e824d3be" (UID: "e0378c5a-921f-4820-ae76-6982e824d3be"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 14:27:19 crc kubenswrapper[4735]: I1008 14:27:19.032258 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0378c5a-921f-4820-ae76-6982e824d3be-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 08 14:27:19 crc kubenswrapper[4735]: I1008 14:27:19.032295 4735 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e0378c5a-921f-4820-ae76-6982e824d3be-ceph\") on node \"crc\" DevicePath \"\""
Oct 08 14:27:19 crc kubenswrapper[4735]: I1008 14:27:19.032306 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0378c5a-921f-4820-ae76-6982e824d3be-config-data\") on node \"crc\" DevicePath \"\""
Oct 08 14:27:19 crc kubenswrapper[4735]: I1008 14:27:19.032316 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0378c5a-921f-4820-ae76-6982e824d3be-scripts\") on node \"crc\" DevicePath \"\""
Oct 08 14:27:19 crc kubenswrapper[4735]: I1008 14:27:19.032328 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4znpg\" (UniqueName: \"kubernetes.io/projected/e0378c5a-921f-4820-ae76-6982e824d3be-kube-api-access-4znpg\") on node \"crc\" DevicePath \"\""
Oct 08 14:27:19 crc kubenswrapper[4735]: I1008 14:27:19.032342 4735 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e0378c5a-921f-4820-ae76-6982e824d3be-httpd-run\") on node \"crc\" DevicePath \"\""
Oct 08 14:27:19 crc kubenswrapper[4735]: I1008 14:27:19.032351 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0378c5a-921f-4820-ae76-6982e824d3be-logs\") on node \"crc\" DevicePath \"\""
Oct 08 14:27:19 crc kubenswrapper[4735]: I1008 14:27:19.095526 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7f2c54f5-b89d-4293-86a5-f4bd5e3eff41","Type":"ContainerStarted","Data":"1ba2ce7a3b0d5486acc32c051e9dc2b90b774da758c77049913033aec1b40467"}
Oct 08 14:27:19 crc kubenswrapper[4735]: I1008 14:27:19.099671 4735 generic.go:334] "Generic (PLEG): container finished" podID="e0378c5a-921f-4820-ae76-6982e824d3be" containerID="4a2705557d709a239a345d36e6cfbb2855fa44f750d788e564d32da2330ebdef" exitCode=0
Oct 08 14:27:19 crc kubenswrapper[4735]: I1008 14:27:19.099713 4735 generic.go:334] "Generic (PLEG): container finished" podID="e0378c5a-921f-4820-ae76-6982e824d3be" containerID="e98c3143a25d180373210f03feed186e494ffdee631c9cbd51cac3a4e9f56bbc" exitCode=143
Oct 08 14:27:19 crc kubenswrapper[4735]: I1008 14:27:19.099737 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e0378c5a-921f-4820-ae76-6982e824d3be","Type":"ContainerDied","Data":"4a2705557d709a239a345d36e6cfbb2855fa44f750d788e564d32da2330ebdef"}
Oct 08 14:27:19 crc kubenswrapper[4735]: I1008 14:27:19.099754 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Oct 08 14:27:19 crc kubenswrapper[4735]: I1008 14:27:19.099779 4735 scope.go:117] "RemoveContainer" containerID="4a2705557d709a239a345d36e6cfbb2855fa44f750d788e564d32da2330ebdef"
Oct 08 14:27:19 crc kubenswrapper[4735]: I1008 14:27:19.099767 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e0378c5a-921f-4820-ae76-6982e824d3be","Type":"ContainerDied","Data":"e98c3143a25d180373210f03feed186e494ffdee631c9cbd51cac3a4e9f56bbc"}
Oct 08 14:27:19 crc kubenswrapper[4735]: I1008 14:27:19.099910 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e0378c5a-921f-4820-ae76-6982e824d3be","Type":"ContainerDied","Data":"24d818018acdfc36615ca60c5e58961189e683399517d395a2ebd681631fe7ac"}
Oct 08 14:27:19 crc kubenswrapper[4735]: I1008 14:27:19.138034 4735 scope.go:117] "RemoveContainer" containerID="e98c3143a25d180373210f03feed186e494ffdee631c9cbd51cac3a4e9f56bbc"
Oct 08 14:27:19 crc kubenswrapper[4735]: I1008 14:27:19.143215 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 08 14:27:19 crc kubenswrapper[4735]: I1008 14:27:19.157255 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 08 14:27:19 crc kubenswrapper[4735]: I1008 14:27:19.170545 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 08 14:27:19 crc kubenswrapper[4735]: E1008 14:27:19.171221 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0378c5a-921f-4820-ae76-6982e824d3be" containerName="glance-log"
Oct 08 14:27:19 crc kubenswrapper[4735]: I1008 14:27:19.171284 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0378c5a-921f-4820-ae76-6982e824d3be" containerName="glance-log"
Oct 08 14:27:19 crc kubenswrapper[4735]: E1008 14:27:19.171326 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0378c5a-921f-4820-ae76-6982e824d3be" containerName="glance-httpd"
Oct 08 14:27:19 crc kubenswrapper[4735]: I1008 14:27:19.171335 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0378c5a-921f-4820-ae76-6982e824d3be" containerName="glance-httpd"
Oct 08 14:27:19 crc kubenswrapper[4735]: I1008 14:27:19.171599 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0378c5a-921f-4820-ae76-6982e824d3be" containerName="glance-httpd"
Oct 08 14:27:19 crc kubenswrapper[4735]: I1008 14:27:19.171625 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0378c5a-921f-4820-ae76-6982e824d3be" containerName="glance-log"
Oct 08 14:27:19 crc kubenswrapper[4735]: I1008 14:27:19.172799 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Oct 08 14:27:19 crc kubenswrapper[4735]: I1008 14:27:19.177456 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 08 14:27:19 crc kubenswrapper[4735]: I1008 14:27:19.179260 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Oct 08 14:27:19 crc kubenswrapper[4735]: I1008 14:27:19.197568 4735 scope.go:117] "RemoveContainer" containerID="4a2705557d709a239a345d36e6cfbb2855fa44f750d788e564d32da2330ebdef"
Oct 08 14:27:19 crc kubenswrapper[4735]: E1008 14:27:19.198880 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a2705557d709a239a345d36e6cfbb2855fa44f750d788e564d32da2330ebdef\": container with ID starting with 4a2705557d709a239a345d36e6cfbb2855fa44f750d788e564d32da2330ebdef not found: ID does not exist" containerID="4a2705557d709a239a345d36e6cfbb2855fa44f750d788e564d32da2330ebdef"
Oct 08 14:27:19 crc kubenswrapper[4735]: I1008 14:27:19.198948 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a2705557d709a239a345d36e6cfbb2855fa44f750d788e564d32da2330ebdef"} err="failed to get container status \"4a2705557d709a239a345d36e6cfbb2855fa44f750d788e564d32da2330ebdef\": rpc error: code = NotFound desc = could not find container \"4a2705557d709a239a345d36e6cfbb2855fa44f750d788e564d32da2330ebdef\": container with ID starting with 4a2705557d709a239a345d36e6cfbb2855fa44f750d788e564d32da2330ebdef not found: ID does not exist"
Oct 08 14:27:19 crc kubenswrapper[4735]: I1008 14:27:19.198977 4735 scope.go:117] "RemoveContainer" containerID="e98c3143a25d180373210f03feed186e494ffdee631c9cbd51cac3a4e9f56bbc"
Oct 08 14:27:19 crc kubenswrapper[4735]: E1008 14:27:19.199346 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e98c3143a25d180373210f03feed186e494ffdee631c9cbd51cac3a4e9f56bbc\": container with ID starting with e98c3143a25d180373210f03feed186e494ffdee631c9cbd51cac3a4e9f56bbc not found: ID does not exist" containerID="e98c3143a25d180373210f03feed186e494ffdee631c9cbd51cac3a4e9f56bbc"
Oct 08 14:27:19 crc kubenswrapper[4735]: I1008 14:27:19.199375 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e98c3143a25d180373210f03feed186e494ffdee631c9cbd51cac3a4e9f56bbc"} err="failed to get container status \"e98c3143a25d180373210f03feed186e494ffdee631c9cbd51cac3a4e9f56bbc\": rpc error: code = NotFound desc = could not find container \"e98c3143a25d180373210f03feed186e494ffdee631c9cbd51cac3a4e9f56bbc\": container with ID starting with e98c3143a25d180373210f03feed186e494ffdee631c9cbd51cac3a4e9f56bbc not found: ID does not exist"
Oct 08 14:27:19 crc kubenswrapper[4735]: I1008 14:27:19.199397 4735 scope.go:117] "RemoveContainer" containerID="4a2705557d709a239a345d36e6cfbb2855fa44f750d788e564d32da2330ebdef"
Oct 08 14:27:19 crc kubenswrapper[4735]: I1008 14:27:19.202385 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a2705557d709a239a345d36e6cfbb2855fa44f750d788e564d32da2330ebdef"} err="failed to get container status \"4a2705557d709a239a345d36e6cfbb2855fa44f750d788e564d32da2330ebdef\": rpc error: code = NotFound desc = could not find container \"4a2705557d709a239a345d36e6cfbb2855fa44f750d788e564d32da2330ebdef\": container with ID starting with 4a2705557d709a239a345d36e6cfbb2855fa44f750d788e564d32da2330ebdef not found: ID does not exist"
Oct 08 14:27:19 crc kubenswrapper[4735]: I1008 14:27:19.202450 4735 scope.go:117] "RemoveContainer" containerID="e98c3143a25d180373210f03feed186e494ffdee631c9cbd51cac3a4e9f56bbc"
Oct 08 14:27:19 crc kubenswrapper[4735]: I1008 14:27:19.202880 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e98c3143a25d180373210f03feed186e494ffdee631c9cbd51cac3a4e9f56bbc"} err="failed to get container status \"e98c3143a25d180373210f03feed186e494ffdee631c9cbd51cac3a4e9f56bbc\": rpc error: code = NotFound desc = could not find container \"e98c3143a25d180373210f03feed186e494ffdee631c9cbd51cac3a4e9f56bbc\": container with ID starting with e98c3143a25d180373210f03feed186e494ffdee631c9cbd51cac3a4e9f56bbc not found: ID does not exist"
Oct 08 14:27:19 crc kubenswrapper[4735]: I1008 14:27:19.235715 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45b47def-e0a2-4cd5-9ac7-fa036a784742-config-data\") pod \"glance-default-internal-api-0\" (UID: \"45b47def-e0a2-4cd5-9ac7-fa036a784742\") " pod="openstack/glance-default-internal-api-0"
Oct 08 14:27:19 crc kubenswrapper[4735]: I1008 14:27:19.236034 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45b47def-e0a2-4cd5-9ac7-fa036a784742-scripts\") pod \"glance-default-internal-api-0\" 
(UID: \"45b47def-e0a2-4cd5-9ac7-fa036a784742\") " pod="openstack/glance-default-internal-api-0" Oct 08 14:27:19 crc kubenswrapper[4735]: I1008 14:27:19.236146 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/45b47def-e0a2-4cd5-9ac7-fa036a784742-ceph\") pod \"glance-default-internal-api-0\" (UID: \"45b47def-e0a2-4cd5-9ac7-fa036a784742\") " pod="openstack/glance-default-internal-api-0" Oct 08 14:27:19 crc kubenswrapper[4735]: I1008 14:27:19.236292 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/45b47def-e0a2-4cd5-9ac7-fa036a784742-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"45b47def-e0a2-4cd5-9ac7-fa036a784742\") " pod="openstack/glance-default-internal-api-0" Oct 08 14:27:19 crc kubenswrapper[4735]: I1008 14:27:19.236438 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45b47def-e0a2-4cd5-9ac7-fa036a784742-logs\") pod \"glance-default-internal-api-0\" (UID: \"45b47def-e0a2-4cd5-9ac7-fa036a784742\") " pod="openstack/glance-default-internal-api-0" Oct 08 14:27:19 crc kubenswrapper[4735]: I1008 14:27:19.337869 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/45b47def-e0a2-4cd5-9ac7-fa036a784742-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"45b47def-e0a2-4cd5-9ac7-fa036a784742\") " pod="openstack/glance-default-internal-api-0" Oct 08 14:27:19 crc kubenswrapper[4735]: I1008 14:27:19.338273 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45b47def-e0a2-4cd5-9ac7-fa036a784742-logs\") pod \"glance-default-internal-api-0\" (UID: \"45b47def-e0a2-4cd5-9ac7-fa036a784742\") " 
pod="openstack/glance-default-internal-api-0" Oct 08 14:27:19 crc kubenswrapper[4735]: I1008 14:27:19.338438 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjsgf\" (UniqueName: \"kubernetes.io/projected/45b47def-e0a2-4cd5-9ac7-fa036a784742-kube-api-access-mjsgf\") pod \"glance-default-internal-api-0\" (UID: \"45b47def-e0a2-4cd5-9ac7-fa036a784742\") " pod="openstack/glance-default-internal-api-0" Oct 08 14:27:19 crc kubenswrapper[4735]: I1008 14:27:19.338617 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45b47def-e0a2-4cd5-9ac7-fa036a784742-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"45b47def-e0a2-4cd5-9ac7-fa036a784742\") " pod="openstack/glance-default-internal-api-0" Oct 08 14:27:19 crc kubenswrapper[4735]: I1008 14:27:19.338754 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45b47def-e0a2-4cd5-9ac7-fa036a784742-config-data\") pod \"glance-default-internal-api-0\" (UID: \"45b47def-e0a2-4cd5-9ac7-fa036a784742\") " pod="openstack/glance-default-internal-api-0" Oct 08 14:27:19 crc kubenswrapper[4735]: I1008 14:27:19.338874 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45b47def-e0a2-4cd5-9ac7-fa036a784742-scripts\") pod \"glance-default-internal-api-0\" (UID: \"45b47def-e0a2-4cd5-9ac7-fa036a784742\") " pod="openstack/glance-default-internal-api-0" Oct 08 14:27:19 crc kubenswrapper[4735]: I1008 14:27:19.339001 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/45b47def-e0a2-4cd5-9ac7-fa036a784742-ceph\") pod \"glance-default-internal-api-0\" (UID: \"45b47def-e0a2-4cd5-9ac7-fa036a784742\") " 
pod="openstack/glance-default-internal-api-0" Oct 08 14:27:19 crc kubenswrapper[4735]: I1008 14:27:19.338446 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/45b47def-e0a2-4cd5-9ac7-fa036a784742-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"45b47def-e0a2-4cd5-9ac7-fa036a784742\") " pod="openstack/glance-default-internal-api-0" Oct 08 14:27:19 crc kubenswrapper[4735]: I1008 14:27:19.338777 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45b47def-e0a2-4cd5-9ac7-fa036a784742-logs\") pod \"glance-default-internal-api-0\" (UID: \"45b47def-e0a2-4cd5-9ac7-fa036a784742\") " pod="openstack/glance-default-internal-api-0" Oct 08 14:27:19 crc kubenswrapper[4735]: I1008 14:27:19.343520 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/45b47def-e0a2-4cd5-9ac7-fa036a784742-ceph\") pod \"glance-default-internal-api-0\" (UID: \"45b47def-e0a2-4cd5-9ac7-fa036a784742\") " pod="openstack/glance-default-internal-api-0" Oct 08 14:27:19 crc kubenswrapper[4735]: I1008 14:27:19.344694 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45b47def-e0a2-4cd5-9ac7-fa036a784742-config-data\") pod \"glance-default-internal-api-0\" (UID: \"45b47def-e0a2-4cd5-9ac7-fa036a784742\") " pod="openstack/glance-default-internal-api-0" Oct 08 14:27:19 crc kubenswrapper[4735]: I1008 14:27:19.345059 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45b47def-e0a2-4cd5-9ac7-fa036a784742-scripts\") pod \"glance-default-internal-api-0\" (UID: \"45b47def-e0a2-4cd5-9ac7-fa036a784742\") " pod="openstack/glance-default-internal-api-0" Oct 08 14:27:19 crc kubenswrapper[4735]: I1008 14:27:19.441623 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-mjsgf\" (UniqueName: \"kubernetes.io/projected/45b47def-e0a2-4cd5-9ac7-fa036a784742-kube-api-access-mjsgf\") pod \"glance-default-internal-api-0\" (UID: \"45b47def-e0a2-4cd5-9ac7-fa036a784742\") " pod="openstack/glance-default-internal-api-0" Oct 08 14:27:19 crc kubenswrapper[4735]: I1008 14:27:19.441932 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45b47def-e0a2-4cd5-9ac7-fa036a784742-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"45b47def-e0a2-4cd5-9ac7-fa036a784742\") " pod="openstack/glance-default-internal-api-0" Oct 08 14:27:19 crc kubenswrapper[4735]: I1008 14:27:19.448164 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45b47def-e0a2-4cd5-9ac7-fa036a784742-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"45b47def-e0a2-4cd5-9ac7-fa036a784742\") " pod="openstack/glance-default-internal-api-0" Oct 08 14:27:19 crc kubenswrapper[4735]: I1008 14:27:19.465280 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjsgf\" (UniqueName: \"kubernetes.io/projected/45b47def-e0a2-4cd5-9ac7-fa036a784742-kube-api-access-mjsgf\") pod \"glance-default-internal-api-0\" (UID: \"45b47def-e0a2-4cd5-9ac7-fa036a784742\") " pod="openstack/glance-default-internal-api-0" Oct 08 14:27:19 crc kubenswrapper[4735]: I1008 14:27:19.503297 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 08 14:27:19 crc kubenswrapper[4735]: I1008 14:27:19.997285 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 14:27:20 crc kubenswrapper[4735]: I1008 14:27:20.111887 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"45b47def-e0a2-4cd5-9ac7-fa036a784742","Type":"ContainerStarted","Data":"68602bd2d3ae69acca9771553e2489bcbaae3a33d7a5b5abaa74c86c88c6979c"} Oct 08 14:27:20 crc kubenswrapper[4735]: I1008 14:27:20.115835 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7f2c54f5-b89d-4293-86a5-f4bd5e3eff41","Type":"ContainerStarted","Data":"196ee201bcd7a12b0b11edb03f309cec38878c0e0166164b57faeefdd66bbfaa"} Oct 08 14:27:20 crc kubenswrapper[4735]: I1008 14:27:20.140377 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.140358751 podStartE2EDuration="3.140358751s" podCreationTimestamp="2025-10-08 14:27:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:27:20.131533985 +0000 UTC m=+5489.939838519" watchObservedRunningTime="2025-10-08 14:27:20.140358751 +0000 UTC m=+5489.948663285" Oct 08 14:27:20 crc kubenswrapper[4735]: I1008 14:27:20.909405 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0378c5a-921f-4820-ae76-6982e824d3be" path="/var/lib/kubelet/pods/e0378c5a-921f-4820-ae76-6982e824d3be/volumes" Oct 08 14:27:21 crc kubenswrapper[4735]: I1008 14:27:21.154628 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"45b47def-e0a2-4cd5-9ac7-fa036a784742","Type":"ContainerStarted","Data":"30cdc921ca53cb001ea6e691a19df660f3a1910550fc8b9c3a020ec44842e0a1"} Oct 08 14:27:21 crc kubenswrapper[4735]: I1008 14:27:21.154691 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"45b47def-e0a2-4cd5-9ac7-fa036a784742","Type":"ContainerStarted","Data":"753016faca547e220e2e186780b247b2730ccd2124f4fb68a5f03aaa08372743"} Oct 08 14:27:21 crc kubenswrapper[4735]: I1008 14:27:21.177177 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=2.17715728 podStartE2EDuration="2.17715728s" podCreationTimestamp="2025-10-08 14:27:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:27:21.173164238 +0000 UTC m=+5490.981468782" watchObservedRunningTime="2025-10-08 14:27:21.17715728 +0000 UTC m=+5490.985461824" Oct 08 14:27:23 crc kubenswrapper[4735]: I1008 14:27:23.715274 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6d54fbc7c9-srvcc" Oct 08 14:27:23 crc kubenswrapper[4735]: I1008 14:27:23.789759 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-766cb6844c-nfq7s"] Oct 08 14:27:23 crc kubenswrapper[4735]: I1008 14:27:23.790980 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-766cb6844c-nfq7s" podUID="95b88ee4-9136-4d7c-84f1-fe6d87a70cdb" containerName="dnsmasq-dns" containerID="cri-o://c635fb16ccb4f5ac20435e79e91b155899fa9ba5142812c4e69c98d5d833a2d8" gracePeriod=10 Oct 08 14:27:24 crc kubenswrapper[4735]: I1008 14:27:24.181697 4735 generic.go:334] "Generic (PLEG): container finished" podID="95b88ee4-9136-4d7c-84f1-fe6d87a70cdb" containerID="c635fb16ccb4f5ac20435e79e91b155899fa9ba5142812c4e69c98d5d833a2d8" exitCode=0 
Oct 08 14:27:24 crc kubenswrapper[4735]: I1008 14:27:24.181965 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-766cb6844c-nfq7s" event={"ID":"95b88ee4-9136-4d7c-84f1-fe6d87a70cdb","Type":"ContainerDied","Data":"c635fb16ccb4f5ac20435e79e91b155899fa9ba5142812c4e69c98d5d833a2d8"} Oct 08 14:27:24 crc kubenswrapper[4735]: I1008 14:27:24.390444 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-766cb6844c-nfq7s" Oct 08 14:27:24 crc kubenswrapper[4735]: I1008 14:27:24.529970 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/95b88ee4-9136-4d7c-84f1-fe6d87a70cdb-ovsdbserver-sb\") pod \"95b88ee4-9136-4d7c-84f1-fe6d87a70cdb\" (UID: \"95b88ee4-9136-4d7c-84f1-fe6d87a70cdb\") " Oct 08 14:27:24 crc kubenswrapper[4735]: I1008 14:27:24.530313 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95b88ee4-9136-4d7c-84f1-fe6d87a70cdb-config\") pod \"95b88ee4-9136-4d7c-84f1-fe6d87a70cdb\" (UID: \"95b88ee4-9136-4d7c-84f1-fe6d87a70cdb\") " Oct 08 14:27:24 crc kubenswrapper[4735]: I1008 14:27:24.530449 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hfswt\" (UniqueName: \"kubernetes.io/projected/95b88ee4-9136-4d7c-84f1-fe6d87a70cdb-kube-api-access-hfswt\") pod \"95b88ee4-9136-4d7c-84f1-fe6d87a70cdb\" (UID: \"95b88ee4-9136-4d7c-84f1-fe6d87a70cdb\") " Oct 08 14:27:24 crc kubenswrapper[4735]: I1008 14:27:24.530555 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/95b88ee4-9136-4d7c-84f1-fe6d87a70cdb-ovsdbserver-nb\") pod \"95b88ee4-9136-4d7c-84f1-fe6d87a70cdb\" (UID: \"95b88ee4-9136-4d7c-84f1-fe6d87a70cdb\") " Oct 08 14:27:24 crc kubenswrapper[4735]: I1008 14:27:24.530641 4735 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/95b88ee4-9136-4d7c-84f1-fe6d87a70cdb-dns-svc\") pod \"95b88ee4-9136-4d7c-84f1-fe6d87a70cdb\" (UID: \"95b88ee4-9136-4d7c-84f1-fe6d87a70cdb\") " Oct 08 14:27:24 crc kubenswrapper[4735]: I1008 14:27:24.535053 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95b88ee4-9136-4d7c-84f1-fe6d87a70cdb-kube-api-access-hfswt" (OuterVolumeSpecName: "kube-api-access-hfswt") pod "95b88ee4-9136-4d7c-84f1-fe6d87a70cdb" (UID: "95b88ee4-9136-4d7c-84f1-fe6d87a70cdb"). InnerVolumeSpecName "kube-api-access-hfswt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:27:24 crc kubenswrapper[4735]: I1008 14:27:24.568649 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95b88ee4-9136-4d7c-84f1-fe6d87a70cdb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "95b88ee4-9136-4d7c-84f1-fe6d87a70cdb" (UID: "95b88ee4-9136-4d7c-84f1-fe6d87a70cdb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:27:24 crc kubenswrapper[4735]: I1008 14:27:24.571952 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95b88ee4-9136-4d7c-84f1-fe6d87a70cdb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "95b88ee4-9136-4d7c-84f1-fe6d87a70cdb" (UID: "95b88ee4-9136-4d7c-84f1-fe6d87a70cdb"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:27:24 crc kubenswrapper[4735]: I1008 14:27:24.572161 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95b88ee4-9136-4d7c-84f1-fe6d87a70cdb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "95b88ee4-9136-4d7c-84f1-fe6d87a70cdb" (UID: "95b88ee4-9136-4d7c-84f1-fe6d87a70cdb"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:27:24 crc kubenswrapper[4735]: I1008 14:27:24.573049 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95b88ee4-9136-4d7c-84f1-fe6d87a70cdb-config" (OuterVolumeSpecName: "config") pod "95b88ee4-9136-4d7c-84f1-fe6d87a70cdb" (UID: "95b88ee4-9136-4d7c-84f1-fe6d87a70cdb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:27:24 crc kubenswrapper[4735]: I1008 14:27:24.631913 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/95b88ee4-9136-4d7c-84f1-fe6d87a70cdb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 14:27:24 crc kubenswrapper[4735]: I1008 14:27:24.631953 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95b88ee4-9136-4d7c-84f1-fe6d87a70cdb-config\") on node \"crc\" DevicePath \"\"" Oct 08 14:27:24 crc kubenswrapper[4735]: I1008 14:27:24.631966 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hfswt\" (UniqueName: \"kubernetes.io/projected/95b88ee4-9136-4d7c-84f1-fe6d87a70cdb-kube-api-access-hfswt\") on node \"crc\" DevicePath \"\"" Oct 08 14:27:24 crc kubenswrapper[4735]: I1008 14:27:24.631977 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/95b88ee4-9136-4d7c-84f1-fe6d87a70cdb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 14:27:24 crc kubenswrapper[4735]: I1008 14:27:24.631988 4735 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/95b88ee4-9136-4d7c-84f1-fe6d87a70cdb-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 14:27:25 crc kubenswrapper[4735]: I1008 14:27:25.194008 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-766cb6844c-nfq7s" 
event={"ID":"95b88ee4-9136-4d7c-84f1-fe6d87a70cdb","Type":"ContainerDied","Data":"f2b548f20afb8e5100e81ff5970c0976796340f6517bee02f962be9b34591ccb"} Oct 08 14:27:25 crc kubenswrapper[4735]: I1008 14:27:25.194091 4735 scope.go:117] "RemoveContainer" containerID="c635fb16ccb4f5ac20435e79e91b155899fa9ba5142812c4e69c98d5d833a2d8" Oct 08 14:27:25 crc kubenswrapper[4735]: I1008 14:27:25.194095 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-766cb6844c-nfq7s" Oct 08 14:27:25 crc kubenswrapper[4735]: I1008 14:27:25.221638 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-766cb6844c-nfq7s"] Oct 08 14:27:25 crc kubenswrapper[4735]: I1008 14:27:25.226154 4735 scope.go:117] "RemoveContainer" containerID="0ddf10748c510c120c6a423ae3c34f4176dbd927dc9fdc6daf10472c39528936" Oct 08 14:27:25 crc kubenswrapper[4735]: I1008 14:27:25.228848 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-766cb6844c-nfq7s"] Oct 08 14:27:26 crc kubenswrapper[4735]: I1008 14:27:26.906031 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95b88ee4-9136-4d7c-84f1-fe6d87a70cdb" path="/var/lib/kubelet/pods/95b88ee4-9136-4d7c-84f1-fe6d87a70cdb/volumes" Oct 08 14:27:27 crc kubenswrapper[4735]: I1008 14:27:27.443387 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 08 14:27:27 crc kubenswrapper[4735]: I1008 14:27:27.443446 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 08 14:27:27 crc kubenswrapper[4735]: I1008 14:27:27.473272 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 08 14:27:27 crc kubenswrapper[4735]: I1008 14:27:27.482885 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/glance-default-external-api-0" Oct 08 14:27:27 crc kubenswrapper[4735]: I1008 14:27:27.894469 4735 scope.go:117] "RemoveContainer" containerID="d6c737b897aa463dba6310d9840b3f5b8e05389b1e2bc8c1a8b7b77133f4dc28" Oct 08 14:27:27 crc kubenswrapper[4735]: E1008 14:27:27.894731 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:27:28 crc kubenswrapper[4735]: I1008 14:27:28.223792 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 08 14:27:28 crc kubenswrapper[4735]: I1008 14:27:28.223842 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 08 14:27:29 crc kubenswrapper[4735]: I1008 14:27:29.504468 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 08 14:27:29 crc kubenswrapper[4735]: I1008 14:27:29.504819 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 08 14:27:29 crc kubenswrapper[4735]: I1008 14:27:29.533230 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 08 14:27:29 crc kubenswrapper[4735]: I1008 14:27:29.543015 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 08 14:27:30 crc kubenswrapper[4735]: I1008 14:27:30.237513 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 08 
14:27:30 crc kubenswrapper[4735]: I1008 14:27:30.247846 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 08 14:27:30 crc kubenswrapper[4735]: I1008 14:27:30.247917 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 08 14:27:30 crc kubenswrapper[4735]: I1008 14:27:30.248098 4735 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 08 14:27:30 crc kubenswrapper[4735]: I1008 14:27:30.267151 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 08 14:27:32 crc kubenswrapper[4735]: I1008 14:27:32.223034 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 08 14:27:32 crc kubenswrapper[4735]: I1008 14:27:32.225756 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 08 14:27:38 crc kubenswrapper[4735]: I1008 14:27:38.520157 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-d4zkz"] Oct 08 14:27:38 crc kubenswrapper[4735]: E1008 14:27:38.521155 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95b88ee4-9136-4d7c-84f1-fe6d87a70cdb" containerName="dnsmasq-dns" Oct 08 14:27:38 crc kubenswrapper[4735]: I1008 14:27:38.521170 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="95b88ee4-9136-4d7c-84f1-fe6d87a70cdb" containerName="dnsmasq-dns" Oct 08 14:27:38 crc kubenswrapper[4735]: E1008 14:27:38.521190 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95b88ee4-9136-4d7c-84f1-fe6d87a70cdb" containerName="init" Oct 08 14:27:38 crc kubenswrapper[4735]: I1008 14:27:38.521197 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="95b88ee4-9136-4d7c-84f1-fe6d87a70cdb" containerName="init" Oct 08 14:27:38 crc 
kubenswrapper[4735]: I1008 14:27:38.521398 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="95b88ee4-9136-4d7c-84f1-fe6d87a70cdb" containerName="dnsmasq-dns" Oct 08 14:27:38 crc kubenswrapper[4735]: I1008 14:27:38.522113 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-d4zkz" Oct 08 14:27:38 crc kubenswrapper[4735]: I1008 14:27:38.537339 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-d4zkz"] Oct 08 14:27:38 crc kubenswrapper[4735]: I1008 14:27:38.678617 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8trg\" (UniqueName: \"kubernetes.io/projected/1b1a6ad9-ce4b-43f9-ae6e-f58bbde050bc-kube-api-access-j8trg\") pod \"placement-db-create-d4zkz\" (UID: \"1b1a6ad9-ce4b-43f9-ae6e-f58bbde050bc\") " pod="openstack/placement-db-create-d4zkz" Oct 08 14:27:38 crc kubenswrapper[4735]: I1008 14:27:38.780390 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8trg\" (UniqueName: \"kubernetes.io/projected/1b1a6ad9-ce4b-43f9-ae6e-f58bbde050bc-kube-api-access-j8trg\") pod \"placement-db-create-d4zkz\" (UID: \"1b1a6ad9-ce4b-43f9-ae6e-f58bbde050bc\") " pod="openstack/placement-db-create-d4zkz" Oct 08 14:27:38 crc kubenswrapper[4735]: I1008 14:27:38.803037 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8trg\" (UniqueName: \"kubernetes.io/projected/1b1a6ad9-ce4b-43f9-ae6e-f58bbde050bc-kube-api-access-j8trg\") pod \"placement-db-create-d4zkz\" (UID: \"1b1a6ad9-ce4b-43f9-ae6e-f58bbde050bc\") " pod="openstack/placement-db-create-d4zkz" Oct 08 14:27:38 crc kubenswrapper[4735]: I1008 14:27:38.846748 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-d4zkz" Oct 08 14:27:39 crc kubenswrapper[4735]: I1008 14:27:39.309732 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-d4zkz"] Oct 08 14:27:39 crc kubenswrapper[4735]: W1008 14:27:39.313445 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b1a6ad9_ce4b_43f9_ae6e_f58bbde050bc.slice/crio-a62dd3ab2e735b99164ca8e98d1990c869c31f1d096bb720a5048f325bb475e2 WatchSource:0}: Error finding container a62dd3ab2e735b99164ca8e98d1990c869c31f1d096bb720a5048f325bb475e2: Status 404 returned error can't find the container with id a62dd3ab2e735b99164ca8e98d1990c869c31f1d096bb720a5048f325bb475e2 Oct 08 14:27:39 crc kubenswrapper[4735]: I1008 14:27:39.336405 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-d4zkz" event={"ID":"1b1a6ad9-ce4b-43f9-ae6e-f58bbde050bc","Type":"ContainerStarted","Data":"a62dd3ab2e735b99164ca8e98d1990c869c31f1d096bb720a5048f325bb475e2"} Oct 08 14:27:40 crc kubenswrapper[4735]: I1008 14:27:40.348327 4735 generic.go:334] "Generic (PLEG): container finished" podID="1b1a6ad9-ce4b-43f9-ae6e-f58bbde050bc" containerID="5e2bd3dd470fb0dbb4ca0ae7c84b4661bfab78a6a47eccb8dfbef58bad201925" exitCode=0 Oct 08 14:27:40 crc kubenswrapper[4735]: I1008 14:27:40.348456 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-d4zkz" event={"ID":"1b1a6ad9-ce4b-43f9-ae6e-f58bbde050bc","Type":"ContainerDied","Data":"5e2bd3dd470fb0dbb4ca0ae7c84b4661bfab78a6a47eccb8dfbef58bad201925"} Oct 08 14:27:41 crc kubenswrapper[4735]: I1008 14:27:41.675260 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-d4zkz" Oct 08 14:27:41 crc kubenswrapper[4735]: I1008 14:27:41.834224 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8trg\" (UniqueName: \"kubernetes.io/projected/1b1a6ad9-ce4b-43f9-ae6e-f58bbde050bc-kube-api-access-j8trg\") pod \"1b1a6ad9-ce4b-43f9-ae6e-f58bbde050bc\" (UID: \"1b1a6ad9-ce4b-43f9-ae6e-f58bbde050bc\") " Oct 08 14:27:41 crc kubenswrapper[4735]: I1008 14:27:41.841365 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b1a6ad9-ce4b-43f9-ae6e-f58bbde050bc-kube-api-access-j8trg" (OuterVolumeSpecName: "kube-api-access-j8trg") pod "1b1a6ad9-ce4b-43f9-ae6e-f58bbde050bc" (UID: "1b1a6ad9-ce4b-43f9-ae6e-f58bbde050bc"). InnerVolumeSpecName "kube-api-access-j8trg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:27:41 crc kubenswrapper[4735]: I1008 14:27:41.935750 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8trg\" (UniqueName: \"kubernetes.io/projected/1b1a6ad9-ce4b-43f9-ae6e-f58bbde050bc-kube-api-access-j8trg\") on node \"crc\" DevicePath \"\"" Oct 08 14:27:42 crc kubenswrapper[4735]: I1008 14:27:42.367904 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-d4zkz" event={"ID":"1b1a6ad9-ce4b-43f9-ae6e-f58bbde050bc","Type":"ContainerDied","Data":"a62dd3ab2e735b99164ca8e98d1990c869c31f1d096bb720a5048f325bb475e2"} Oct 08 14:27:42 crc kubenswrapper[4735]: I1008 14:27:42.368149 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a62dd3ab2e735b99164ca8e98d1990c869c31f1d096bb720a5048f325bb475e2" Oct 08 14:27:42 crc kubenswrapper[4735]: I1008 14:27:42.367965 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-d4zkz" Oct 08 14:27:42 crc kubenswrapper[4735]: I1008 14:27:42.894573 4735 scope.go:117] "RemoveContainer" containerID="d6c737b897aa463dba6310d9840b3f5b8e05389b1e2bc8c1a8b7b77133f4dc28" Oct 08 14:27:42 crc kubenswrapper[4735]: E1008 14:27:42.894929 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:27:48 crc kubenswrapper[4735]: I1008 14:27:48.646514 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-cbfd-account-create-4mjzl"] Oct 08 14:27:48 crc kubenswrapper[4735]: E1008 14:27:48.647436 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b1a6ad9-ce4b-43f9-ae6e-f58bbde050bc" containerName="mariadb-database-create" Oct 08 14:27:48 crc kubenswrapper[4735]: I1008 14:27:48.647452 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b1a6ad9-ce4b-43f9-ae6e-f58bbde050bc" containerName="mariadb-database-create" Oct 08 14:27:48 crc kubenswrapper[4735]: I1008 14:27:48.647649 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b1a6ad9-ce4b-43f9-ae6e-f58bbde050bc" containerName="mariadb-database-create" Oct 08 14:27:48 crc kubenswrapper[4735]: I1008 14:27:48.648181 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-cbfd-account-create-4mjzl" Oct 08 14:27:48 crc kubenswrapper[4735]: I1008 14:27:48.655868 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Oct 08 14:27:48 crc kubenswrapper[4735]: I1008 14:27:48.667789 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-cbfd-account-create-4mjzl"] Oct 08 14:27:48 crc kubenswrapper[4735]: I1008 14:27:48.751445 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9j2q\" (UniqueName: \"kubernetes.io/projected/5f1748c0-046e-4af1-b62d-1115b49da20c-kube-api-access-c9j2q\") pod \"placement-cbfd-account-create-4mjzl\" (UID: \"5f1748c0-046e-4af1-b62d-1115b49da20c\") " pod="openstack/placement-cbfd-account-create-4mjzl" Oct 08 14:27:48 crc kubenswrapper[4735]: I1008 14:27:48.853144 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9j2q\" (UniqueName: \"kubernetes.io/projected/5f1748c0-046e-4af1-b62d-1115b49da20c-kube-api-access-c9j2q\") pod \"placement-cbfd-account-create-4mjzl\" (UID: \"5f1748c0-046e-4af1-b62d-1115b49da20c\") " pod="openstack/placement-cbfd-account-create-4mjzl" Oct 08 14:27:48 crc kubenswrapper[4735]: I1008 14:27:48.873416 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9j2q\" (UniqueName: \"kubernetes.io/projected/5f1748c0-046e-4af1-b62d-1115b49da20c-kube-api-access-c9j2q\") pod \"placement-cbfd-account-create-4mjzl\" (UID: \"5f1748c0-046e-4af1-b62d-1115b49da20c\") " pod="openstack/placement-cbfd-account-create-4mjzl" Oct 08 14:27:48 crc kubenswrapper[4735]: I1008 14:27:48.969795 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-cbfd-account-create-4mjzl" Oct 08 14:27:49 crc kubenswrapper[4735]: I1008 14:27:49.466437 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-cbfd-account-create-4mjzl"] Oct 08 14:27:50 crc kubenswrapper[4735]: I1008 14:27:50.444129 4735 generic.go:334] "Generic (PLEG): container finished" podID="5f1748c0-046e-4af1-b62d-1115b49da20c" containerID="110bc7f6f7bfaf2cbc5fb42562e7fbc8d62992be329fd5e9129a36add1cb5d97" exitCode=0 Oct 08 14:27:50 crc kubenswrapper[4735]: I1008 14:27:50.444231 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-cbfd-account-create-4mjzl" event={"ID":"5f1748c0-046e-4af1-b62d-1115b49da20c","Type":"ContainerDied","Data":"110bc7f6f7bfaf2cbc5fb42562e7fbc8d62992be329fd5e9129a36add1cb5d97"} Oct 08 14:27:50 crc kubenswrapper[4735]: I1008 14:27:50.444469 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-cbfd-account-create-4mjzl" event={"ID":"5f1748c0-046e-4af1-b62d-1115b49da20c","Type":"ContainerStarted","Data":"611701e25e2917de1a3f364d57181dd2d845bd7d70972f0c803fff367204ce21"} Oct 08 14:27:51 crc kubenswrapper[4735]: I1008 14:27:51.729185 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-cbfd-account-create-4mjzl" Oct 08 14:27:51 crc kubenswrapper[4735]: I1008 14:27:51.901282 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9j2q\" (UniqueName: \"kubernetes.io/projected/5f1748c0-046e-4af1-b62d-1115b49da20c-kube-api-access-c9j2q\") pod \"5f1748c0-046e-4af1-b62d-1115b49da20c\" (UID: \"5f1748c0-046e-4af1-b62d-1115b49da20c\") " Oct 08 14:27:51 crc kubenswrapper[4735]: I1008 14:27:51.906511 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f1748c0-046e-4af1-b62d-1115b49da20c-kube-api-access-c9j2q" (OuterVolumeSpecName: "kube-api-access-c9j2q") pod "5f1748c0-046e-4af1-b62d-1115b49da20c" (UID: "5f1748c0-046e-4af1-b62d-1115b49da20c"). InnerVolumeSpecName "kube-api-access-c9j2q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:27:52 crc kubenswrapper[4735]: I1008 14:27:52.004505 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9j2q\" (UniqueName: \"kubernetes.io/projected/5f1748c0-046e-4af1-b62d-1115b49da20c-kube-api-access-c9j2q\") on node \"crc\" DevicePath \"\"" Oct 08 14:27:52 crc kubenswrapper[4735]: I1008 14:27:52.466611 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-cbfd-account-create-4mjzl" event={"ID":"5f1748c0-046e-4af1-b62d-1115b49da20c","Type":"ContainerDied","Data":"611701e25e2917de1a3f364d57181dd2d845bd7d70972f0c803fff367204ce21"} Oct 08 14:27:52 crc kubenswrapper[4735]: I1008 14:27:52.466662 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-cbfd-account-create-4mjzl" Oct 08 14:27:52 crc kubenswrapper[4735]: I1008 14:27:52.466683 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="611701e25e2917de1a3f364d57181dd2d845bd7d70972f0c803fff367204ce21" Oct 08 14:27:53 crc kubenswrapper[4735]: I1008 14:27:53.966740 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-657648467c-2fvxc"] Oct 08 14:27:53 crc kubenswrapper[4735]: E1008 14:27:53.968289 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f1748c0-046e-4af1-b62d-1115b49da20c" containerName="mariadb-account-create" Oct 08 14:27:53 crc kubenswrapper[4735]: I1008 14:27:53.968401 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f1748c0-046e-4af1-b62d-1115b49da20c" containerName="mariadb-account-create" Oct 08 14:27:53 crc kubenswrapper[4735]: I1008 14:27:53.968758 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f1748c0-046e-4af1-b62d-1115b49da20c" containerName="mariadb-account-create" Oct 08 14:27:53 crc kubenswrapper[4735]: I1008 14:27:53.970179 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-657648467c-2fvxc" Oct 08 14:27:53 crc kubenswrapper[4735]: I1008 14:27:53.981280 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-657648467c-2fvxc"] Oct 08 14:27:54 crc kubenswrapper[4735]: I1008 14:27:54.009289 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-9zftk"] Oct 08 14:27:54 crc kubenswrapper[4735]: I1008 14:27:54.011441 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-9zftk" Oct 08 14:27:54 crc kubenswrapper[4735]: I1008 14:27:54.016879 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 08 14:27:54 crc kubenswrapper[4735]: I1008 14:27:54.016948 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 08 14:27:54 crc kubenswrapper[4735]: I1008 14:27:54.017024 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-8dfp5" Oct 08 14:27:54 crc kubenswrapper[4735]: I1008 14:27:54.024252 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-9zftk"] Oct 08 14:27:54 crc kubenswrapper[4735]: I1008 14:27:54.142487 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38-config-data\") pod \"placement-db-sync-9zftk\" (UID: \"c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38\") " pod="openstack/placement-db-sync-9zftk" Oct 08 14:27:54 crc kubenswrapper[4735]: I1008 14:27:54.142531 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2eb52a80-9aca-4074-befb-3062309dfa71-dns-svc\") pod \"dnsmasq-dns-657648467c-2fvxc\" (UID: \"2eb52a80-9aca-4074-befb-3062309dfa71\") " pod="openstack/dnsmasq-dns-657648467c-2fvxc" Oct 08 14:27:54 crc kubenswrapper[4735]: I1008 14:27:54.142675 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k94s2\" (UniqueName: \"kubernetes.io/projected/2eb52a80-9aca-4074-befb-3062309dfa71-kube-api-access-k94s2\") pod \"dnsmasq-dns-657648467c-2fvxc\" (UID: \"2eb52a80-9aca-4074-befb-3062309dfa71\") " pod="openstack/dnsmasq-dns-657648467c-2fvxc" Oct 08 14:27:54 crc kubenswrapper[4735]: I1008 14:27:54.142710 
4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2eb52a80-9aca-4074-befb-3062309dfa71-config\") pod \"dnsmasq-dns-657648467c-2fvxc\" (UID: \"2eb52a80-9aca-4074-befb-3062309dfa71\") " pod="openstack/dnsmasq-dns-657648467c-2fvxc" Oct 08 14:27:54 crc kubenswrapper[4735]: I1008 14:27:54.142738 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38-scripts\") pod \"placement-db-sync-9zftk\" (UID: \"c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38\") " pod="openstack/placement-db-sync-9zftk" Oct 08 14:27:54 crc kubenswrapper[4735]: I1008 14:27:54.142754 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38-logs\") pod \"placement-db-sync-9zftk\" (UID: \"c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38\") " pod="openstack/placement-db-sync-9zftk" Oct 08 14:27:54 crc kubenswrapper[4735]: I1008 14:27:54.142773 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2eb52a80-9aca-4074-befb-3062309dfa71-ovsdbserver-nb\") pod \"dnsmasq-dns-657648467c-2fvxc\" (UID: \"2eb52a80-9aca-4074-befb-3062309dfa71\") " pod="openstack/dnsmasq-dns-657648467c-2fvxc" Oct 08 14:27:54 crc kubenswrapper[4735]: I1008 14:27:54.142790 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38-combined-ca-bundle\") pod \"placement-db-sync-9zftk\" (UID: \"c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38\") " pod="openstack/placement-db-sync-9zftk" Oct 08 14:27:54 crc kubenswrapper[4735]: I1008 14:27:54.142820 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqpjh\" (UniqueName: \"kubernetes.io/projected/c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38-kube-api-access-gqpjh\") pod \"placement-db-sync-9zftk\" (UID: \"c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38\") " pod="openstack/placement-db-sync-9zftk" Oct 08 14:27:54 crc kubenswrapper[4735]: I1008 14:27:54.142839 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2eb52a80-9aca-4074-befb-3062309dfa71-ovsdbserver-sb\") pod \"dnsmasq-dns-657648467c-2fvxc\" (UID: \"2eb52a80-9aca-4074-befb-3062309dfa71\") " pod="openstack/dnsmasq-dns-657648467c-2fvxc" Oct 08 14:27:54 crc kubenswrapper[4735]: I1008 14:27:54.244502 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k94s2\" (UniqueName: \"kubernetes.io/projected/2eb52a80-9aca-4074-befb-3062309dfa71-kube-api-access-k94s2\") pod \"dnsmasq-dns-657648467c-2fvxc\" (UID: \"2eb52a80-9aca-4074-befb-3062309dfa71\") " pod="openstack/dnsmasq-dns-657648467c-2fvxc" Oct 08 14:27:54 crc kubenswrapper[4735]: I1008 14:27:54.244585 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2eb52a80-9aca-4074-befb-3062309dfa71-config\") pod \"dnsmasq-dns-657648467c-2fvxc\" (UID: \"2eb52a80-9aca-4074-befb-3062309dfa71\") " pod="openstack/dnsmasq-dns-657648467c-2fvxc" Oct 08 14:27:54 crc kubenswrapper[4735]: I1008 14:27:54.244616 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38-scripts\") pod \"placement-db-sync-9zftk\" (UID: \"c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38\") " pod="openstack/placement-db-sync-9zftk" Oct 08 14:27:54 crc kubenswrapper[4735]: I1008 14:27:54.244637 4735 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38-logs\") pod \"placement-db-sync-9zftk\" (UID: \"c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38\") " pod="openstack/placement-db-sync-9zftk" Oct 08 14:27:54 crc kubenswrapper[4735]: I1008 14:27:54.244652 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2eb52a80-9aca-4074-befb-3062309dfa71-ovsdbserver-nb\") pod \"dnsmasq-dns-657648467c-2fvxc\" (UID: \"2eb52a80-9aca-4074-befb-3062309dfa71\") " pod="openstack/dnsmasq-dns-657648467c-2fvxc" Oct 08 14:27:54 crc kubenswrapper[4735]: I1008 14:27:54.244669 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38-combined-ca-bundle\") pod \"placement-db-sync-9zftk\" (UID: \"c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38\") " pod="openstack/placement-db-sync-9zftk" Oct 08 14:27:54 crc kubenswrapper[4735]: I1008 14:27:54.244692 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqpjh\" (UniqueName: \"kubernetes.io/projected/c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38-kube-api-access-gqpjh\") pod \"placement-db-sync-9zftk\" (UID: \"c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38\") " pod="openstack/placement-db-sync-9zftk" Oct 08 14:27:54 crc kubenswrapper[4735]: I1008 14:27:54.244716 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2eb52a80-9aca-4074-befb-3062309dfa71-ovsdbserver-sb\") pod \"dnsmasq-dns-657648467c-2fvxc\" (UID: \"2eb52a80-9aca-4074-befb-3062309dfa71\") " pod="openstack/dnsmasq-dns-657648467c-2fvxc" Oct 08 14:27:54 crc kubenswrapper[4735]: I1008 14:27:54.244781 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38-config-data\") pod \"placement-db-sync-9zftk\" (UID: \"c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38\") " pod="openstack/placement-db-sync-9zftk" Oct 08 14:27:54 crc kubenswrapper[4735]: I1008 14:27:54.244796 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2eb52a80-9aca-4074-befb-3062309dfa71-dns-svc\") pod \"dnsmasq-dns-657648467c-2fvxc\" (UID: \"2eb52a80-9aca-4074-befb-3062309dfa71\") " pod="openstack/dnsmasq-dns-657648467c-2fvxc" Oct 08 14:27:54 crc kubenswrapper[4735]: I1008 14:27:54.245462 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38-logs\") pod \"placement-db-sync-9zftk\" (UID: \"c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38\") " pod="openstack/placement-db-sync-9zftk" Oct 08 14:27:54 crc kubenswrapper[4735]: I1008 14:27:54.246030 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2eb52a80-9aca-4074-befb-3062309dfa71-dns-svc\") pod \"dnsmasq-dns-657648467c-2fvxc\" (UID: \"2eb52a80-9aca-4074-befb-3062309dfa71\") " pod="openstack/dnsmasq-dns-657648467c-2fvxc" Oct 08 14:27:54 crc kubenswrapper[4735]: I1008 14:27:54.246161 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2eb52a80-9aca-4074-befb-3062309dfa71-config\") pod \"dnsmasq-dns-657648467c-2fvxc\" (UID: \"2eb52a80-9aca-4074-befb-3062309dfa71\") " pod="openstack/dnsmasq-dns-657648467c-2fvxc" Oct 08 14:27:54 crc kubenswrapper[4735]: I1008 14:27:54.246179 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2eb52a80-9aca-4074-befb-3062309dfa71-ovsdbserver-nb\") pod \"dnsmasq-dns-657648467c-2fvxc\" (UID: \"2eb52a80-9aca-4074-befb-3062309dfa71\") " 
pod="openstack/dnsmasq-dns-657648467c-2fvxc" Oct 08 14:27:54 crc kubenswrapper[4735]: I1008 14:27:54.246581 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2eb52a80-9aca-4074-befb-3062309dfa71-ovsdbserver-sb\") pod \"dnsmasq-dns-657648467c-2fvxc\" (UID: \"2eb52a80-9aca-4074-befb-3062309dfa71\") " pod="openstack/dnsmasq-dns-657648467c-2fvxc" Oct 08 14:27:54 crc kubenswrapper[4735]: I1008 14:27:54.251204 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38-config-data\") pod \"placement-db-sync-9zftk\" (UID: \"c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38\") " pod="openstack/placement-db-sync-9zftk" Oct 08 14:27:54 crc kubenswrapper[4735]: I1008 14:27:54.251617 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38-scripts\") pod \"placement-db-sync-9zftk\" (UID: \"c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38\") " pod="openstack/placement-db-sync-9zftk" Oct 08 14:27:54 crc kubenswrapper[4735]: I1008 14:27:54.253955 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38-combined-ca-bundle\") pod \"placement-db-sync-9zftk\" (UID: \"c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38\") " pod="openstack/placement-db-sync-9zftk" Oct 08 14:27:54 crc kubenswrapper[4735]: I1008 14:27:54.266531 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k94s2\" (UniqueName: \"kubernetes.io/projected/2eb52a80-9aca-4074-befb-3062309dfa71-kube-api-access-k94s2\") pod \"dnsmasq-dns-657648467c-2fvxc\" (UID: \"2eb52a80-9aca-4074-befb-3062309dfa71\") " pod="openstack/dnsmasq-dns-657648467c-2fvxc" Oct 08 14:27:54 crc kubenswrapper[4735]: I1008 14:27:54.268192 4735 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqpjh\" (UniqueName: \"kubernetes.io/projected/c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38-kube-api-access-gqpjh\") pod \"placement-db-sync-9zftk\" (UID: \"c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38\") " pod="openstack/placement-db-sync-9zftk" Oct 08 14:27:54 crc kubenswrapper[4735]: I1008 14:27:54.290158 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-657648467c-2fvxc" Oct 08 14:27:54 crc kubenswrapper[4735]: I1008 14:27:54.337354 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-9zftk" Oct 08 14:27:54 crc kubenswrapper[4735]: I1008 14:27:54.730257 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-657648467c-2fvxc"] Oct 08 14:27:54 crc kubenswrapper[4735]: W1008 14:27:54.733059 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2eb52a80_9aca_4074_befb_3062309dfa71.slice/crio-c029c69022d944233cdc4412e7a784cd87fbe7ded42cebeec72c0a78e806a8ee WatchSource:0}: Error finding container c029c69022d944233cdc4412e7a784cd87fbe7ded42cebeec72c0a78e806a8ee: Status 404 returned error can't find the container with id c029c69022d944233cdc4412e7a784cd87fbe7ded42cebeec72c0a78e806a8ee Oct 08 14:27:54 crc kubenswrapper[4735]: I1008 14:27:54.857771 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-9zftk"] Oct 08 14:27:55 crc kubenswrapper[4735]: I1008 14:27:55.494608 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-9zftk" event={"ID":"c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38","Type":"ContainerStarted","Data":"3c59c060a29f8d4240073cea3a259b5e477630b30ba6cdea9f0579b47ff48773"} Oct 08 14:27:55 crc kubenswrapper[4735]: I1008 14:27:55.494985 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-db-sync-9zftk" event={"ID":"c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38","Type":"ContainerStarted","Data":"8d7292edfd40bdf5f126b6664c59e76f08b824cb0a8395e4d914398e24cb07f8"} Oct 08 14:27:55 crc kubenswrapper[4735]: I1008 14:27:55.497798 4735 generic.go:334] "Generic (PLEG): container finished" podID="2eb52a80-9aca-4074-befb-3062309dfa71" containerID="df09f784f303318c9f4e0ff50599baede28201338b776ef7cb8a76bf936d578b" exitCode=0 Oct 08 14:27:55 crc kubenswrapper[4735]: I1008 14:27:55.498025 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-657648467c-2fvxc" event={"ID":"2eb52a80-9aca-4074-befb-3062309dfa71","Type":"ContainerDied","Data":"df09f784f303318c9f4e0ff50599baede28201338b776ef7cb8a76bf936d578b"} Oct 08 14:27:55 crc kubenswrapper[4735]: I1008 14:27:55.498062 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-657648467c-2fvxc" event={"ID":"2eb52a80-9aca-4074-befb-3062309dfa71","Type":"ContainerStarted","Data":"c029c69022d944233cdc4412e7a784cd87fbe7ded42cebeec72c0a78e806a8ee"} Oct 08 14:27:55 crc kubenswrapper[4735]: I1008 14:27:55.525143 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-9zftk" podStartSLOduration=2.525122457 podStartE2EDuration="2.525122457s" podCreationTimestamp="2025-10-08 14:27:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:27:55.515889151 +0000 UTC m=+5525.324193705" watchObservedRunningTime="2025-10-08 14:27:55.525122457 +0000 UTC m=+5525.333427001" Oct 08 14:27:56 crc kubenswrapper[4735]: I1008 14:27:56.506607 4735 generic.go:334] "Generic (PLEG): container finished" podID="c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38" containerID="3c59c060a29f8d4240073cea3a259b5e477630b30ba6cdea9f0579b47ff48773" exitCode=0 Oct 08 14:27:56 crc kubenswrapper[4735]: I1008 14:27:56.506715 4735 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/placement-db-sync-9zftk" event={"ID":"c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38","Type":"ContainerDied","Data":"3c59c060a29f8d4240073cea3a259b5e477630b30ba6cdea9f0579b47ff48773"} Oct 08 14:27:56 crc kubenswrapper[4735]: I1008 14:27:56.509215 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-657648467c-2fvxc" event={"ID":"2eb52a80-9aca-4074-befb-3062309dfa71","Type":"ContainerStarted","Data":"cf39f34fbb11368b315c72124736693180fdba8818ec69fb8d4565e925fb9380"} Oct 08 14:27:56 crc kubenswrapper[4735]: I1008 14:27:56.509961 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-657648467c-2fvxc" Oct 08 14:27:56 crc kubenswrapper[4735]: I1008 14:27:56.546508 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-657648467c-2fvxc" podStartSLOduration=3.546459811 podStartE2EDuration="3.546459811s" podCreationTimestamp="2025-10-08 14:27:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:27:56.539504733 +0000 UTC m=+5526.347809287" watchObservedRunningTime="2025-10-08 14:27:56.546459811 +0000 UTC m=+5526.354764355" Oct 08 14:27:56 crc kubenswrapper[4735]: I1008 14:27:56.895772 4735 scope.go:117] "RemoveContainer" containerID="d6c737b897aa463dba6310d9840b3f5b8e05389b1e2bc8c1a8b7b77133f4dc28" Oct 08 14:27:56 crc kubenswrapper[4735]: E1008 14:27:56.896429 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:27:57 crc kubenswrapper[4735]: I1008 
14:27:57.875377 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-9zftk" Oct 08 14:27:58 crc kubenswrapper[4735]: I1008 14:27:58.019002 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38-logs\") pod \"c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38\" (UID: \"c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38\") " Oct 08 14:27:58 crc kubenswrapper[4735]: I1008 14:27:58.019274 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38-combined-ca-bundle\") pod \"c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38\" (UID: \"c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38\") " Oct 08 14:27:58 crc kubenswrapper[4735]: I1008 14:27:58.019448 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38-config-data\") pod \"c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38\" (UID: \"c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38\") " Oct 08 14:27:58 crc kubenswrapper[4735]: I1008 14:27:58.019532 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38-logs" (OuterVolumeSpecName: "logs") pod "c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38" (UID: "c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 14:27:58 crc kubenswrapper[4735]: I1008 14:27:58.019560 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38-scripts\") pod \"c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38\" (UID: \"c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38\") " Oct 08 14:27:58 crc kubenswrapper[4735]: I1008 14:27:58.019667 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqpjh\" (UniqueName: \"kubernetes.io/projected/c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38-kube-api-access-gqpjh\") pod \"c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38\" (UID: \"c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38\") " Oct 08 14:27:58 crc kubenswrapper[4735]: I1008 14:27:58.020533 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38-logs\") on node \"crc\" DevicePath \"\"" Oct 08 14:27:58 crc kubenswrapper[4735]: I1008 14:27:58.025966 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38-scripts" (OuterVolumeSpecName: "scripts") pod "c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38" (UID: "c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:27:58 crc kubenswrapper[4735]: I1008 14:27:58.033086 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38-kube-api-access-gqpjh" (OuterVolumeSpecName: "kube-api-access-gqpjh") pod "c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38" (UID: "c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38"). InnerVolumeSpecName "kube-api-access-gqpjh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:27:58 crc kubenswrapper[4735]: I1008 14:27:58.051332 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38" (UID: "c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:27:58 crc kubenswrapper[4735]: I1008 14:27:58.055316 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38-config-data" (OuterVolumeSpecName: "config-data") pod "c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38" (UID: "c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:27:58 crc kubenswrapper[4735]: I1008 14:27:58.122131 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 14:27:58 crc kubenswrapper[4735]: I1008 14:27:58.122164 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 14:27:58 crc kubenswrapper[4735]: I1008 14:27:58.122174 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 14:27:58 crc kubenswrapper[4735]: I1008 14:27:58.122182 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqpjh\" (UniqueName: \"kubernetes.io/projected/c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38-kube-api-access-gqpjh\") on node \"crc\" DevicePath \"\"" Oct 
08 14:27:58 crc kubenswrapper[4735]: I1008 14:27:58.527165 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-9zftk" Oct 08 14:27:58 crc kubenswrapper[4735]: I1008 14:27:58.527157 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-9zftk" event={"ID":"c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38","Type":"ContainerDied","Data":"8d7292edfd40bdf5f126b6664c59e76f08b824cb0a8395e4d914398e24cb07f8"} Oct 08 14:27:58 crc kubenswrapper[4735]: I1008 14:27:58.527304 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d7292edfd40bdf5f126b6664c59e76f08b824cb0a8395e4d914398e24cb07f8" Oct 08 14:27:58 crc kubenswrapper[4735]: I1008 14:27:58.707382 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5cf4d6ff98-g56r5"] Oct 08 14:27:58 crc kubenswrapper[4735]: E1008 14:27:58.708152 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38" containerName="placement-db-sync" Oct 08 14:27:58 crc kubenswrapper[4735]: I1008 14:27:58.708188 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38" containerName="placement-db-sync" Oct 08 14:27:58 crc kubenswrapper[4735]: I1008 14:27:58.708575 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38" containerName="placement-db-sync" Oct 08 14:27:58 crc kubenswrapper[4735]: I1008 14:27:58.710047 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5cf4d6ff98-g56r5" Oct 08 14:27:58 crc kubenswrapper[4735]: I1008 14:27:58.712221 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-8dfp5" Oct 08 14:27:58 crc kubenswrapper[4735]: I1008 14:27:58.712546 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 08 14:27:58 crc kubenswrapper[4735]: I1008 14:27:58.721874 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5cf4d6ff98-g56r5"] Oct 08 14:27:58 crc kubenswrapper[4735]: I1008 14:27:58.725514 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 08 14:27:58 crc kubenswrapper[4735]: I1008 14:27:58.833210 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6490668b-e4ec-4dc2-900d-5d161f49c526-config-data\") pod \"placement-5cf4d6ff98-g56r5\" (UID: \"6490668b-e4ec-4dc2-900d-5d161f49c526\") " pod="openstack/placement-5cf4d6ff98-g56r5" Oct 08 14:27:58 crc kubenswrapper[4735]: I1008 14:27:58.833311 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6490668b-e4ec-4dc2-900d-5d161f49c526-logs\") pod \"placement-5cf4d6ff98-g56r5\" (UID: \"6490668b-e4ec-4dc2-900d-5d161f49c526\") " pod="openstack/placement-5cf4d6ff98-g56r5" Oct 08 14:27:58 crc kubenswrapper[4735]: I1008 14:27:58.833355 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6490668b-e4ec-4dc2-900d-5d161f49c526-scripts\") pod \"placement-5cf4d6ff98-g56r5\" (UID: \"6490668b-e4ec-4dc2-900d-5d161f49c526\") " pod="openstack/placement-5cf4d6ff98-g56r5" Oct 08 14:27:58 crc kubenswrapper[4735]: I1008 14:27:58.833378 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6490668b-e4ec-4dc2-900d-5d161f49c526-combined-ca-bundle\") pod \"placement-5cf4d6ff98-g56r5\" (UID: \"6490668b-e4ec-4dc2-900d-5d161f49c526\") " pod="openstack/placement-5cf4d6ff98-g56r5" Oct 08 14:27:58 crc kubenswrapper[4735]: I1008 14:27:58.833450 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29vvf\" (UniqueName: \"kubernetes.io/projected/6490668b-e4ec-4dc2-900d-5d161f49c526-kube-api-access-29vvf\") pod \"placement-5cf4d6ff98-g56r5\" (UID: \"6490668b-e4ec-4dc2-900d-5d161f49c526\") " pod="openstack/placement-5cf4d6ff98-g56r5" Oct 08 14:27:58 crc kubenswrapper[4735]: I1008 14:27:58.935275 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6490668b-e4ec-4dc2-900d-5d161f49c526-scripts\") pod \"placement-5cf4d6ff98-g56r5\" (UID: \"6490668b-e4ec-4dc2-900d-5d161f49c526\") " pod="openstack/placement-5cf4d6ff98-g56r5" Oct 08 14:27:58 crc kubenswrapper[4735]: I1008 14:27:58.935329 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6490668b-e4ec-4dc2-900d-5d161f49c526-combined-ca-bundle\") pod \"placement-5cf4d6ff98-g56r5\" (UID: \"6490668b-e4ec-4dc2-900d-5d161f49c526\") " pod="openstack/placement-5cf4d6ff98-g56r5" Oct 08 14:27:58 crc kubenswrapper[4735]: I1008 14:27:58.935391 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29vvf\" (UniqueName: \"kubernetes.io/projected/6490668b-e4ec-4dc2-900d-5d161f49c526-kube-api-access-29vvf\") pod \"placement-5cf4d6ff98-g56r5\" (UID: \"6490668b-e4ec-4dc2-900d-5d161f49c526\") " pod="openstack/placement-5cf4d6ff98-g56r5" Oct 08 14:27:58 crc kubenswrapper[4735]: I1008 14:27:58.935459 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6490668b-e4ec-4dc2-900d-5d161f49c526-config-data\") pod \"placement-5cf4d6ff98-g56r5\" (UID: \"6490668b-e4ec-4dc2-900d-5d161f49c526\") " pod="openstack/placement-5cf4d6ff98-g56r5" Oct 08 14:27:58 crc kubenswrapper[4735]: I1008 14:27:58.935568 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6490668b-e4ec-4dc2-900d-5d161f49c526-logs\") pod \"placement-5cf4d6ff98-g56r5\" (UID: \"6490668b-e4ec-4dc2-900d-5d161f49c526\") " pod="openstack/placement-5cf4d6ff98-g56r5" Oct 08 14:27:58 crc kubenswrapper[4735]: I1008 14:27:58.936044 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6490668b-e4ec-4dc2-900d-5d161f49c526-logs\") pod \"placement-5cf4d6ff98-g56r5\" (UID: \"6490668b-e4ec-4dc2-900d-5d161f49c526\") " pod="openstack/placement-5cf4d6ff98-g56r5" Oct 08 14:27:58 crc kubenswrapper[4735]: I1008 14:27:58.938553 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6490668b-e4ec-4dc2-900d-5d161f49c526-scripts\") pod \"placement-5cf4d6ff98-g56r5\" (UID: \"6490668b-e4ec-4dc2-900d-5d161f49c526\") " pod="openstack/placement-5cf4d6ff98-g56r5" Oct 08 14:27:58 crc kubenswrapper[4735]: I1008 14:27:58.938700 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6490668b-e4ec-4dc2-900d-5d161f49c526-config-data\") pod \"placement-5cf4d6ff98-g56r5\" (UID: \"6490668b-e4ec-4dc2-900d-5d161f49c526\") " pod="openstack/placement-5cf4d6ff98-g56r5" Oct 08 14:27:58 crc kubenswrapper[4735]: I1008 14:27:58.940401 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6490668b-e4ec-4dc2-900d-5d161f49c526-combined-ca-bundle\") pod 
\"placement-5cf4d6ff98-g56r5\" (UID: \"6490668b-e4ec-4dc2-900d-5d161f49c526\") " pod="openstack/placement-5cf4d6ff98-g56r5" Oct 08 14:27:58 crc kubenswrapper[4735]: I1008 14:27:58.952961 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29vvf\" (UniqueName: \"kubernetes.io/projected/6490668b-e4ec-4dc2-900d-5d161f49c526-kube-api-access-29vvf\") pod \"placement-5cf4d6ff98-g56r5\" (UID: \"6490668b-e4ec-4dc2-900d-5d161f49c526\") " pod="openstack/placement-5cf4d6ff98-g56r5" Oct 08 14:27:59 crc kubenswrapper[4735]: I1008 14:27:59.029835 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5cf4d6ff98-g56r5" Oct 08 14:27:59 crc kubenswrapper[4735]: I1008 14:27:59.476644 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5cf4d6ff98-g56r5"] Oct 08 14:27:59 crc kubenswrapper[4735]: W1008 14:27:59.481691 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6490668b_e4ec_4dc2_900d_5d161f49c526.slice/crio-555178c14522e0c581c7b2acf588756bfba1fcf8551eba7aa29c9f658a667262 WatchSource:0}: Error finding container 555178c14522e0c581c7b2acf588756bfba1fcf8551eba7aa29c9f658a667262: Status 404 returned error can't find the container with id 555178c14522e0c581c7b2acf588756bfba1fcf8551eba7aa29c9f658a667262 Oct 08 14:27:59 crc kubenswrapper[4735]: I1008 14:27:59.536052 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5cf4d6ff98-g56r5" event={"ID":"6490668b-e4ec-4dc2-900d-5d161f49c526","Type":"ContainerStarted","Data":"555178c14522e0c581c7b2acf588756bfba1fcf8551eba7aa29c9f658a667262"} Oct 08 14:28:00 crc kubenswrapper[4735]: I1008 14:28:00.545931 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5cf4d6ff98-g56r5" 
event={"ID":"6490668b-e4ec-4dc2-900d-5d161f49c526","Type":"ContainerStarted","Data":"c271dd14d991d3f6ee4dda8e4e2201ab7aa426386192e0e074451ab5b2e5d509"} Oct 08 14:28:00 crc kubenswrapper[4735]: I1008 14:28:00.546430 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5cf4d6ff98-g56r5" Oct 08 14:28:00 crc kubenswrapper[4735]: I1008 14:28:00.546454 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5cf4d6ff98-g56r5" event={"ID":"6490668b-e4ec-4dc2-900d-5d161f49c526","Type":"ContainerStarted","Data":"825948698eb191890eaca5a74122e173e0f2e4add0e91a6dfb24cf500ba01050"} Oct 08 14:28:00 crc kubenswrapper[4735]: I1008 14:28:00.562574 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-5cf4d6ff98-g56r5" podStartSLOduration=2.562553653 podStartE2EDuration="2.562553653s" podCreationTimestamp="2025-10-08 14:27:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:28:00.559581757 +0000 UTC m=+5530.367886371" watchObservedRunningTime="2025-10-08 14:28:00.562553653 +0000 UTC m=+5530.370858207" Oct 08 14:28:01 crc kubenswrapper[4735]: I1008 14:28:01.553829 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5cf4d6ff98-g56r5" Oct 08 14:28:04 crc kubenswrapper[4735]: I1008 14:28:04.291635 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-657648467c-2fvxc" Oct 08 14:28:04 crc kubenswrapper[4735]: I1008 14:28:04.350226 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d54fbc7c9-srvcc"] Oct 08 14:28:04 crc kubenswrapper[4735]: I1008 14:28:04.350510 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d54fbc7c9-srvcc" podUID="f648625a-5d58-41b8-b4e9-1942b0941318" containerName="dnsmasq-dns" 
containerID="cri-o://b288b949edbc594ce98e2cd2ae8be6a23733f8908193b0408cda1e20065e3cc8" gracePeriod=10 Oct 08 14:28:04 crc kubenswrapper[4735]: I1008 14:28:04.582577 4735 generic.go:334] "Generic (PLEG): container finished" podID="f648625a-5d58-41b8-b4e9-1942b0941318" containerID="b288b949edbc594ce98e2cd2ae8be6a23733f8908193b0408cda1e20065e3cc8" exitCode=0 Oct 08 14:28:04 crc kubenswrapper[4735]: I1008 14:28:04.582685 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d54fbc7c9-srvcc" event={"ID":"f648625a-5d58-41b8-b4e9-1942b0941318","Type":"ContainerDied","Data":"b288b949edbc594ce98e2cd2ae8be6a23733f8908193b0408cda1e20065e3cc8"} Oct 08 14:28:04 crc kubenswrapper[4735]: I1008 14:28:04.807843 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d54fbc7c9-srvcc" Oct 08 14:28:04 crc kubenswrapper[4735]: I1008 14:28:04.934260 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f648625a-5d58-41b8-b4e9-1942b0941318-dns-svc\") pod \"f648625a-5d58-41b8-b4e9-1942b0941318\" (UID: \"f648625a-5d58-41b8-b4e9-1942b0941318\") " Oct 08 14:28:04 crc kubenswrapper[4735]: I1008 14:28:04.934353 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f648625a-5d58-41b8-b4e9-1942b0941318-ovsdbserver-nb\") pod \"f648625a-5d58-41b8-b4e9-1942b0941318\" (UID: \"f648625a-5d58-41b8-b4e9-1942b0941318\") " Oct 08 14:28:04 crc kubenswrapper[4735]: I1008 14:28:04.934418 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f648625a-5d58-41b8-b4e9-1942b0941318-config\") pod \"f648625a-5d58-41b8-b4e9-1942b0941318\" (UID: \"f648625a-5d58-41b8-b4e9-1942b0941318\") " Oct 08 14:28:04 crc kubenswrapper[4735]: I1008 14:28:04.934497 4735 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f648625a-5d58-41b8-b4e9-1942b0941318-ovsdbserver-sb\") pod \"f648625a-5d58-41b8-b4e9-1942b0941318\" (UID: \"f648625a-5d58-41b8-b4e9-1942b0941318\") " Oct 08 14:28:04 crc kubenswrapper[4735]: I1008 14:28:04.934549 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5xn6\" (UniqueName: \"kubernetes.io/projected/f648625a-5d58-41b8-b4e9-1942b0941318-kube-api-access-c5xn6\") pod \"f648625a-5d58-41b8-b4e9-1942b0941318\" (UID: \"f648625a-5d58-41b8-b4e9-1942b0941318\") " Oct 08 14:28:04 crc kubenswrapper[4735]: I1008 14:28:04.939321 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f648625a-5d58-41b8-b4e9-1942b0941318-kube-api-access-c5xn6" (OuterVolumeSpecName: "kube-api-access-c5xn6") pod "f648625a-5d58-41b8-b4e9-1942b0941318" (UID: "f648625a-5d58-41b8-b4e9-1942b0941318"). InnerVolumeSpecName "kube-api-access-c5xn6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:28:04 crc kubenswrapper[4735]: I1008 14:28:04.980880 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f648625a-5d58-41b8-b4e9-1942b0941318-config" (OuterVolumeSpecName: "config") pod "f648625a-5d58-41b8-b4e9-1942b0941318" (UID: "f648625a-5d58-41b8-b4e9-1942b0941318"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:28:04 crc kubenswrapper[4735]: I1008 14:28:04.982069 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f648625a-5d58-41b8-b4e9-1942b0941318-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f648625a-5d58-41b8-b4e9-1942b0941318" (UID: "f648625a-5d58-41b8-b4e9-1942b0941318"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:28:04 crc kubenswrapper[4735]: I1008 14:28:04.983044 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f648625a-5d58-41b8-b4e9-1942b0941318-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f648625a-5d58-41b8-b4e9-1942b0941318" (UID: "f648625a-5d58-41b8-b4e9-1942b0941318"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:28:04 crc kubenswrapper[4735]: I1008 14:28:04.989710 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f648625a-5d58-41b8-b4e9-1942b0941318-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f648625a-5d58-41b8-b4e9-1942b0941318" (UID: "f648625a-5d58-41b8-b4e9-1942b0941318"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:28:05 crc kubenswrapper[4735]: I1008 14:28:05.036356 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f648625a-5d58-41b8-b4e9-1942b0941318-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 14:28:05 crc kubenswrapper[4735]: I1008 14:28:05.036387 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f648625a-5d58-41b8-b4e9-1942b0941318-config\") on node \"crc\" DevicePath \"\"" Oct 08 14:28:05 crc kubenswrapper[4735]: I1008 14:28:05.036396 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f648625a-5d58-41b8-b4e9-1942b0941318-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 14:28:05 crc kubenswrapper[4735]: I1008 14:28:05.036405 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5xn6\" (UniqueName: \"kubernetes.io/projected/f648625a-5d58-41b8-b4e9-1942b0941318-kube-api-access-c5xn6\") on node \"crc\" DevicePath \"\"" Oct 08 
14:28:05 crc kubenswrapper[4735]: I1008 14:28:05.036416 4735 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f648625a-5d58-41b8-b4e9-1942b0941318-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 14:28:05 crc kubenswrapper[4735]: I1008 14:28:05.595722 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d54fbc7c9-srvcc" event={"ID":"f648625a-5d58-41b8-b4e9-1942b0941318","Type":"ContainerDied","Data":"a04b394fe57284c2429a104119cd0d6cb86833ea760cf629346b35510d352957"} Oct 08 14:28:05 crc kubenswrapper[4735]: I1008 14:28:05.595767 4735 scope.go:117] "RemoveContainer" containerID="b288b949edbc594ce98e2cd2ae8be6a23733f8908193b0408cda1e20065e3cc8" Oct 08 14:28:05 crc kubenswrapper[4735]: I1008 14:28:05.595861 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d54fbc7c9-srvcc" Oct 08 14:28:05 crc kubenswrapper[4735]: I1008 14:28:05.621267 4735 scope.go:117] "RemoveContainer" containerID="0bb47edf43339ed5daab5366ef76be4b4d30d7087bfae3f141bf31529422a3a6" Oct 08 14:28:05 crc kubenswrapper[4735]: I1008 14:28:05.630702 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d54fbc7c9-srvcc"] Oct 08 14:28:05 crc kubenswrapper[4735]: I1008 14:28:05.636735 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d54fbc7c9-srvcc"] Oct 08 14:28:06 crc kubenswrapper[4735]: I1008 14:28:06.906577 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f648625a-5d58-41b8-b4e9-1942b0941318" path="/var/lib/kubelet/pods/f648625a-5d58-41b8-b4e9-1942b0941318/volumes" Oct 08 14:28:08 crc kubenswrapper[4735]: I1008 14:28:08.376315 4735 scope.go:117] "RemoveContainer" containerID="990f4ba5ec3cd0d52f874f584429717e482226978afcf32c05f857bdd0b7be2e" Oct 08 14:28:08 crc kubenswrapper[4735]: I1008 14:28:08.393664 4735 scope.go:117] "RemoveContainer" 
containerID="9bb175713c4d9d6b792e2733a9fecfe2c2b546792b6b14bd9709fbd579ff9706" Oct 08 14:28:10 crc kubenswrapper[4735]: I1008 14:28:10.901564 4735 scope.go:117] "RemoveContainer" containerID="d6c737b897aa463dba6310d9840b3f5b8e05389b1e2bc8c1a8b7b77133f4dc28" Oct 08 14:28:10 crc kubenswrapper[4735]: E1008 14:28:10.902316 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:28:24 crc kubenswrapper[4735]: I1008 14:28:24.834420 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-n8vlb"] Oct 08 14:28:24 crc kubenswrapper[4735]: E1008 14:28:24.835583 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f648625a-5d58-41b8-b4e9-1942b0941318" containerName="init" Oct 08 14:28:24 crc kubenswrapper[4735]: I1008 14:28:24.835601 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f648625a-5d58-41b8-b4e9-1942b0941318" containerName="init" Oct 08 14:28:24 crc kubenswrapper[4735]: E1008 14:28:24.835622 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f648625a-5d58-41b8-b4e9-1942b0941318" containerName="dnsmasq-dns" Oct 08 14:28:24 crc kubenswrapper[4735]: I1008 14:28:24.835631 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f648625a-5d58-41b8-b4e9-1942b0941318" containerName="dnsmasq-dns" Oct 08 14:28:24 crc kubenswrapper[4735]: I1008 14:28:24.835853 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="f648625a-5d58-41b8-b4e9-1942b0941318" containerName="dnsmasq-dns" Oct 08 14:28:24 crc kubenswrapper[4735]: I1008 14:28:24.837613 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n8vlb" Oct 08 14:28:24 crc kubenswrapper[4735]: I1008 14:28:24.851908 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-n8vlb"] Oct 08 14:28:24 crc kubenswrapper[4735]: I1008 14:28:24.895512 4735 scope.go:117] "RemoveContainer" containerID="d6c737b897aa463dba6310d9840b3f5b8e05389b1e2bc8c1a8b7b77133f4dc28" Oct 08 14:28:24 crc kubenswrapper[4735]: E1008 14:28:24.895872 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:28:24 crc kubenswrapper[4735]: I1008 14:28:24.936039 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbw45\" (UniqueName: \"kubernetes.io/projected/7ff4135f-19d1-4d3c-9c86-4189f70317aa-kube-api-access-rbw45\") pod \"redhat-marketplace-n8vlb\" (UID: \"7ff4135f-19d1-4d3c-9c86-4189f70317aa\") " pod="openshift-marketplace/redhat-marketplace-n8vlb" Oct 08 14:28:24 crc kubenswrapper[4735]: I1008 14:28:24.936098 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ff4135f-19d1-4d3c-9c86-4189f70317aa-catalog-content\") pod \"redhat-marketplace-n8vlb\" (UID: \"7ff4135f-19d1-4d3c-9c86-4189f70317aa\") " pod="openshift-marketplace/redhat-marketplace-n8vlb" Oct 08 14:28:24 crc kubenswrapper[4735]: I1008 14:28:24.936310 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/7ff4135f-19d1-4d3c-9c86-4189f70317aa-utilities\") pod \"redhat-marketplace-n8vlb\" (UID: \"7ff4135f-19d1-4d3c-9c86-4189f70317aa\") " pod="openshift-marketplace/redhat-marketplace-n8vlb" Oct 08 14:28:25 crc kubenswrapper[4735]: I1008 14:28:25.038050 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ff4135f-19d1-4d3c-9c86-4189f70317aa-utilities\") pod \"redhat-marketplace-n8vlb\" (UID: \"7ff4135f-19d1-4d3c-9c86-4189f70317aa\") " pod="openshift-marketplace/redhat-marketplace-n8vlb" Oct 08 14:28:25 crc kubenswrapper[4735]: I1008 14:28:25.038259 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbw45\" (UniqueName: \"kubernetes.io/projected/7ff4135f-19d1-4d3c-9c86-4189f70317aa-kube-api-access-rbw45\") pod \"redhat-marketplace-n8vlb\" (UID: \"7ff4135f-19d1-4d3c-9c86-4189f70317aa\") " pod="openshift-marketplace/redhat-marketplace-n8vlb" Oct 08 14:28:25 crc kubenswrapper[4735]: I1008 14:28:25.038304 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ff4135f-19d1-4d3c-9c86-4189f70317aa-catalog-content\") pod \"redhat-marketplace-n8vlb\" (UID: \"7ff4135f-19d1-4d3c-9c86-4189f70317aa\") " pod="openshift-marketplace/redhat-marketplace-n8vlb" Oct 08 14:28:25 crc kubenswrapper[4735]: I1008 14:28:25.038584 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ff4135f-19d1-4d3c-9c86-4189f70317aa-utilities\") pod \"redhat-marketplace-n8vlb\" (UID: \"7ff4135f-19d1-4d3c-9c86-4189f70317aa\") " pod="openshift-marketplace/redhat-marketplace-n8vlb" Oct 08 14:28:25 crc kubenswrapper[4735]: I1008 14:28:25.038859 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/7ff4135f-19d1-4d3c-9c86-4189f70317aa-catalog-content\") pod \"redhat-marketplace-n8vlb\" (UID: \"7ff4135f-19d1-4d3c-9c86-4189f70317aa\") " pod="openshift-marketplace/redhat-marketplace-n8vlb" Oct 08 14:28:25 crc kubenswrapper[4735]: I1008 14:28:25.060439 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbw45\" (UniqueName: \"kubernetes.io/projected/7ff4135f-19d1-4d3c-9c86-4189f70317aa-kube-api-access-rbw45\") pod \"redhat-marketplace-n8vlb\" (UID: \"7ff4135f-19d1-4d3c-9c86-4189f70317aa\") " pod="openshift-marketplace/redhat-marketplace-n8vlb" Oct 08 14:28:25 crc kubenswrapper[4735]: I1008 14:28:25.165799 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n8vlb" Oct 08 14:28:25 crc kubenswrapper[4735]: I1008 14:28:25.589961 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-n8vlb"] Oct 08 14:28:25 crc kubenswrapper[4735]: I1008 14:28:25.764272 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n8vlb" event={"ID":"7ff4135f-19d1-4d3c-9c86-4189f70317aa","Type":"ContainerStarted","Data":"e465faa7638d86db6583cddbb656ff9841c34ad992fbc9a3f8e74f2816a6754b"} Oct 08 14:28:26 crc kubenswrapper[4735]: I1008 14:28:26.775458 4735 generic.go:334] "Generic (PLEG): container finished" podID="7ff4135f-19d1-4d3c-9c86-4189f70317aa" containerID="3df4b1aa765298c976c8365e2d6ac64664641e419f69cfda77903b35deaf5edc" exitCode=0 Oct 08 14:28:26 crc kubenswrapper[4735]: I1008 14:28:26.775568 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n8vlb" event={"ID":"7ff4135f-19d1-4d3c-9c86-4189f70317aa","Type":"ContainerDied","Data":"3df4b1aa765298c976c8365e2d6ac64664641e419f69cfda77903b35deaf5edc"} Oct 08 14:28:26 crc kubenswrapper[4735]: I1008 14:28:26.777968 4735 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Oct 08 14:28:28 crc kubenswrapper[4735]: I1008 14:28:28.805311 4735 generic.go:334] "Generic (PLEG): container finished" podID="7ff4135f-19d1-4d3c-9c86-4189f70317aa" containerID="ff8862e8dc08191fac6ea6ae9d86e42bddd62e6d268e2576d67f1aab0859dd17" exitCode=0 Oct 08 14:28:28 crc kubenswrapper[4735]: I1008 14:28:28.805628 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n8vlb" event={"ID":"7ff4135f-19d1-4d3c-9c86-4189f70317aa","Type":"ContainerDied","Data":"ff8862e8dc08191fac6ea6ae9d86e42bddd62e6d268e2576d67f1aab0859dd17"} Oct 08 14:28:29 crc kubenswrapper[4735]: I1008 14:28:29.814234 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n8vlb" event={"ID":"7ff4135f-19d1-4d3c-9c86-4189f70317aa","Type":"ContainerStarted","Data":"2523e85c70d4509d188005ce65a6f9127de5b0f810d2160a1b6ffccdd030a67f"} Oct 08 14:28:29 crc kubenswrapper[4735]: I1008 14:28:29.834730 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-n8vlb" podStartSLOduration=3.32184987 podStartE2EDuration="5.834712718s" podCreationTimestamp="2025-10-08 14:28:24 +0000 UTC" firstStartedPulling="2025-10-08 14:28:26.777761807 +0000 UTC m=+5556.586066351" lastFinishedPulling="2025-10-08 14:28:29.290624655 +0000 UTC m=+5559.098929199" observedRunningTime="2025-10-08 14:28:29.829396702 +0000 UTC m=+5559.637701246" watchObservedRunningTime="2025-10-08 14:28:29.834712718 +0000 UTC m=+5559.643017262" Oct 08 14:28:30 crc kubenswrapper[4735]: I1008 14:28:30.127919 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5cf4d6ff98-g56r5" Oct 08 14:28:30 crc kubenswrapper[4735]: I1008 14:28:30.137552 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5cf4d6ff98-g56r5" Oct 08 14:28:35 crc kubenswrapper[4735]: I1008 
14:28:35.166145 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-n8vlb" Oct 08 14:28:35 crc kubenswrapper[4735]: I1008 14:28:35.167809 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-n8vlb" Oct 08 14:28:35 crc kubenswrapper[4735]: I1008 14:28:35.217729 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-n8vlb" Oct 08 14:28:35 crc kubenswrapper[4735]: I1008 14:28:35.913435 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-n8vlb" Oct 08 14:28:35 crc kubenswrapper[4735]: I1008 14:28:35.968657 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-n8vlb"] Oct 08 14:28:37 crc kubenswrapper[4735]: I1008 14:28:37.883710 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-n8vlb" podUID="7ff4135f-19d1-4d3c-9c86-4189f70317aa" containerName="registry-server" containerID="cri-o://2523e85c70d4509d188005ce65a6f9127de5b0f810d2160a1b6ffccdd030a67f" gracePeriod=2 Oct 08 14:28:37 crc kubenswrapper[4735]: I1008 14:28:37.896289 4735 scope.go:117] "RemoveContainer" containerID="d6c737b897aa463dba6310d9840b3f5b8e05389b1e2bc8c1a8b7b77133f4dc28" Oct 08 14:28:37 crc kubenswrapper[4735]: E1008 14:28:37.896532 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:28:38 crc kubenswrapper[4735]: I1008 14:28:38.340392 4735 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n8vlb" Oct 08 14:28:38 crc kubenswrapper[4735]: I1008 14:28:38.492052 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbw45\" (UniqueName: \"kubernetes.io/projected/7ff4135f-19d1-4d3c-9c86-4189f70317aa-kube-api-access-rbw45\") pod \"7ff4135f-19d1-4d3c-9c86-4189f70317aa\" (UID: \"7ff4135f-19d1-4d3c-9c86-4189f70317aa\") " Oct 08 14:28:38 crc kubenswrapper[4735]: I1008 14:28:38.493258 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ff4135f-19d1-4d3c-9c86-4189f70317aa-utilities\") pod \"7ff4135f-19d1-4d3c-9c86-4189f70317aa\" (UID: \"7ff4135f-19d1-4d3c-9c86-4189f70317aa\") " Oct 08 14:28:38 crc kubenswrapper[4735]: I1008 14:28:38.493369 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ff4135f-19d1-4d3c-9c86-4189f70317aa-catalog-content\") pod \"7ff4135f-19d1-4d3c-9c86-4189f70317aa\" (UID: \"7ff4135f-19d1-4d3c-9c86-4189f70317aa\") " Oct 08 14:28:38 crc kubenswrapper[4735]: I1008 14:28:38.494219 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ff4135f-19d1-4d3c-9c86-4189f70317aa-utilities" (OuterVolumeSpecName: "utilities") pod "7ff4135f-19d1-4d3c-9c86-4189f70317aa" (UID: "7ff4135f-19d1-4d3c-9c86-4189f70317aa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 14:28:38 crc kubenswrapper[4735]: I1008 14:28:38.502081 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ff4135f-19d1-4d3c-9c86-4189f70317aa-kube-api-access-rbw45" (OuterVolumeSpecName: "kube-api-access-rbw45") pod "7ff4135f-19d1-4d3c-9c86-4189f70317aa" (UID: "7ff4135f-19d1-4d3c-9c86-4189f70317aa"). 
InnerVolumeSpecName "kube-api-access-rbw45". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:28:38 crc kubenswrapper[4735]: I1008 14:28:38.505290 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ff4135f-19d1-4d3c-9c86-4189f70317aa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7ff4135f-19d1-4d3c-9c86-4189f70317aa" (UID: "7ff4135f-19d1-4d3c-9c86-4189f70317aa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 14:28:38 crc kubenswrapper[4735]: I1008 14:28:38.596036 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ff4135f-19d1-4d3c-9c86-4189f70317aa-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 14:28:38 crc kubenswrapper[4735]: I1008 14:28:38.596290 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbw45\" (UniqueName: \"kubernetes.io/projected/7ff4135f-19d1-4d3c-9c86-4189f70317aa-kube-api-access-rbw45\") on node \"crc\" DevicePath \"\"" Oct 08 14:28:38 crc kubenswrapper[4735]: I1008 14:28:38.596363 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ff4135f-19d1-4d3c-9c86-4189f70317aa-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 14:28:38 crc kubenswrapper[4735]: I1008 14:28:38.906029 4735 generic.go:334] "Generic (PLEG): container finished" podID="7ff4135f-19d1-4d3c-9c86-4189f70317aa" containerID="2523e85c70d4509d188005ce65a6f9127de5b0f810d2160a1b6ffccdd030a67f" exitCode=0 Oct 08 14:28:38 crc kubenswrapper[4735]: I1008 14:28:38.906878 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n8vlb" Oct 08 14:28:38 crc kubenswrapper[4735]: I1008 14:28:38.933247 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n8vlb" event={"ID":"7ff4135f-19d1-4d3c-9c86-4189f70317aa","Type":"ContainerDied","Data":"2523e85c70d4509d188005ce65a6f9127de5b0f810d2160a1b6ffccdd030a67f"} Oct 08 14:28:38 crc kubenswrapper[4735]: I1008 14:28:38.933292 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n8vlb" event={"ID":"7ff4135f-19d1-4d3c-9c86-4189f70317aa","Type":"ContainerDied","Data":"e465faa7638d86db6583cddbb656ff9841c34ad992fbc9a3f8e74f2816a6754b"} Oct 08 14:28:38 crc kubenswrapper[4735]: I1008 14:28:38.933312 4735 scope.go:117] "RemoveContainer" containerID="2523e85c70d4509d188005ce65a6f9127de5b0f810d2160a1b6ffccdd030a67f" Oct 08 14:28:38 crc kubenswrapper[4735]: I1008 14:28:38.953919 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-n8vlb"] Oct 08 14:28:38 crc kubenswrapper[4735]: I1008 14:28:38.962238 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-n8vlb"] Oct 08 14:28:38 crc kubenswrapper[4735]: I1008 14:28:38.992445 4735 scope.go:117] "RemoveContainer" containerID="ff8862e8dc08191fac6ea6ae9d86e42bddd62e6d268e2576d67f1aab0859dd17" Oct 08 14:28:39 crc kubenswrapper[4735]: I1008 14:28:39.014423 4735 scope.go:117] "RemoveContainer" containerID="3df4b1aa765298c976c8365e2d6ac64664641e419f69cfda77903b35deaf5edc" Oct 08 14:28:39 crc kubenswrapper[4735]: I1008 14:28:39.059944 4735 scope.go:117] "RemoveContainer" containerID="2523e85c70d4509d188005ce65a6f9127de5b0f810d2160a1b6ffccdd030a67f" Oct 08 14:28:39 crc kubenswrapper[4735]: E1008 14:28:39.060905 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"2523e85c70d4509d188005ce65a6f9127de5b0f810d2160a1b6ffccdd030a67f\": container with ID starting with 2523e85c70d4509d188005ce65a6f9127de5b0f810d2160a1b6ffccdd030a67f not found: ID does not exist" containerID="2523e85c70d4509d188005ce65a6f9127de5b0f810d2160a1b6ffccdd030a67f" Oct 08 14:28:39 crc kubenswrapper[4735]: I1008 14:28:39.060990 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2523e85c70d4509d188005ce65a6f9127de5b0f810d2160a1b6ffccdd030a67f"} err="failed to get container status \"2523e85c70d4509d188005ce65a6f9127de5b0f810d2160a1b6ffccdd030a67f\": rpc error: code = NotFound desc = could not find container \"2523e85c70d4509d188005ce65a6f9127de5b0f810d2160a1b6ffccdd030a67f\": container with ID starting with 2523e85c70d4509d188005ce65a6f9127de5b0f810d2160a1b6ffccdd030a67f not found: ID does not exist" Oct 08 14:28:39 crc kubenswrapper[4735]: I1008 14:28:39.061022 4735 scope.go:117] "RemoveContainer" containerID="ff8862e8dc08191fac6ea6ae9d86e42bddd62e6d268e2576d67f1aab0859dd17" Oct 08 14:28:39 crc kubenswrapper[4735]: E1008 14:28:39.061562 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff8862e8dc08191fac6ea6ae9d86e42bddd62e6d268e2576d67f1aab0859dd17\": container with ID starting with ff8862e8dc08191fac6ea6ae9d86e42bddd62e6d268e2576d67f1aab0859dd17 not found: ID does not exist" containerID="ff8862e8dc08191fac6ea6ae9d86e42bddd62e6d268e2576d67f1aab0859dd17" Oct 08 14:28:39 crc kubenswrapper[4735]: I1008 14:28:39.061605 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff8862e8dc08191fac6ea6ae9d86e42bddd62e6d268e2576d67f1aab0859dd17"} err="failed to get container status \"ff8862e8dc08191fac6ea6ae9d86e42bddd62e6d268e2576d67f1aab0859dd17\": rpc error: code = NotFound desc = could not find container \"ff8862e8dc08191fac6ea6ae9d86e42bddd62e6d268e2576d67f1aab0859dd17\": container with ID 
starting with ff8862e8dc08191fac6ea6ae9d86e42bddd62e6d268e2576d67f1aab0859dd17 not found: ID does not exist" Oct 08 14:28:39 crc kubenswrapper[4735]: I1008 14:28:39.061633 4735 scope.go:117] "RemoveContainer" containerID="3df4b1aa765298c976c8365e2d6ac64664641e419f69cfda77903b35deaf5edc" Oct 08 14:28:39 crc kubenswrapper[4735]: E1008 14:28:39.061987 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3df4b1aa765298c976c8365e2d6ac64664641e419f69cfda77903b35deaf5edc\": container with ID starting with 3df4b1aa765298c976c8365e2d6ac64664641e419f69cfda77903b35deaf5edc not found: ID does not exist" containerID="3df4b1aa765298c976c8365e2d6ac64664641e419f69cfda77903b35deaf5edc" Oct 08 14:28:39 crc kubenswrapper[4735]: I1008 14:28:39.062010 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3df4b1aa765298c976c8365e2d6ac64664641e419f69cfda77903b35deaf5edc"} err="failed to get container status \"3df4b1aa765298c976c8365e2d6ac64664641e419f69cfda77903b35deaf5edc\": rpc error: code = NotFound desc = could not find container \"3df4b1aa765298c976c8365e2d6ac64664641e419f69cfda77903b35deaf5edc\": container with ID starting with 3df4b1aa765298c976c8365e2d6ac64664641e419f69cfda77903b35deaf5edc not found: ID does not exist" Oct 08 14:28:40 crc kubenswrapper[4735]: I1008 14:28:40.905179 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ff4135f-19d1-4d3c-9c86-4189f70317aa" path="/var/lib/kubelet/pods/7ff4135f-19d1-4d3c-9c86-4189f70317aa/volumes" Oct 08 14:28:48 crc kubenswrapper[4735]: I1008 14:28:48.894296 4735 scope.go:117] "RemoveContainer" containerID="d6c737b897aa463dba6310d9840b3f5b8e05389b1e2bc8c1a8b7b77133f4dc28" Oct 08 14:28:48 crc kubenswrapper[4735]: E1008 14:28:48.894939 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:28:53 crc kubenswrapper[4735]: I1008 14:28:53.499932 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-2hrm4"] Oct 08 14:28:53 crc kubenswrapper[4735]: E1008 14:28:53.500944 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ff4135f-19d1-4d3c-9c86-4189f70317aa" containerName="extract-utilities" Oct 08 14:28:53 crc kubenswrapper[4735]: I1008 14:28:53.500964 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ff4135f-19d1-4d3c-9c86-4189f70317aa" containerName="extract-utilities" Oct 08 14:28:53 crc kubenswrapper[4735]: E1008 14:28:53.501017 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ff4135f-19d1-4d3c-9c86-4189f70317aa" containerName="registry-server" Oct 08 14:28:53 crc kubenswrapper[4735]: I1008 14:28:53.501025 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ff4135f-19d1-4d3c-9c86-4189f70317aa" containerName="registry-server" Oct 08 14:28:53 crc kubenswrapper[4735]: E1008 14:28:53.501041 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ff4135f-19d1-4d3c-9c86-4189f70317aa" containerName="extract-content" Oct 08 14:28:53 crc kubenswrapper[4735]: I1008 14:28:53.501058 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ff4135f-19d1-4d3c-9c86-4189f70317aa" containerName="extract-content" Oct 08 14:28:53 crc kubenswrapper[4735]: I1008 14:28:53.501280 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ff4135f-19d1-4d3c-9c86-4189f70317aa" containerName="registry-server" Oct 08 14:28:53 crc kubenswrapper[4735]: I1008 14:28:53.502034 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-2hrm4" Oct 08 14:28:53 crc kubenswrapper[4735]: I1008 14:28:53.512490 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-2hrm4"] Oct 08 14:28:53 crc kubenswrapper[4735]: I1008 14:28:53.594522 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-mh2k9"] Oct 08 14:28:53 crc kubenswrapper[4735]: I1008 14:28:53.595852 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-mh2k9" Oct 08 14:28:53 crc kubenswrapper[4735]: I1008 14:28:53.603295 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-mh2k9"] Oct 08 14:28:53 crc kubenswrapper[4735]: I1008 14:28:53.679132 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gv86\" (UniqueName: \"kubernetes.io/projected/64f6de72-94b0-4fc8-b036-c53df917c0f8-kube-api-access-2gv86\") pod \"nova-api-db-create-2hrm4\" (UID: \"64f6de72-94b0-4fc8-b036-c53df917c0f8\") " pod="openstack/nova-api-db-create-2hrm4" Oct 08 14:28:53 crc kubenswrapper[4735]: I1008 14:28:53.709786 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-f6z9w"] Oct 08 14:28:53 crc kubenswrapper[4735]: I1008 14:28:53.712171 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-f6z9w" Oct 08 14:28:53 crc kubenswrapper[4735]: I1008 14:28:53.741146 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-f6z9w"] Oct 08 14:28:53 crc kubenswrapper[4735]: I1008 14:28:53.781208 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gv86\" (UniqueName: \"kubernetes.io/projected/64f6de72-94b0-4fc8-b036-c53df917c0f8-kube-api-access-2gv86\") pod \"nova-api-db-create-2hrm4\" (UID: \"64f6de72-94b0-4fc8-b036-c53df917c0f8\") " pod="openstack/nova-api-db-create-2hrm4" Oct 08 14:28:53 crc kubenswrapper[4735]: I1008 14:28:53.781378 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5h6x\" (UniqueName: \"kubernetes.io/projected/42c4c8f5-4b2e-438e-8b89-ec5eb3b9d01a-kube-api-access-p5h6x\") pod \"nova-cell0-db-create-mh2k9\" (UID: \"42c4c8f5-4b2e-438e-8b89-ec5eb3b9d01a\") " pod="openstack/nova-cell0-db-create-mh2k9" Oct 08 14:28:53 crc kubenswrapper[4735]: I1008 14:28:53.805112 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gv86\" (UniqueName: \"kubernetes.io/projected/64f6de72-94b0-4fc8-b036-c53df917c0f8-kube-api-access-2gv86\") pod \"nova-api-db-create-2hrm4\" (UID: \"64f6de72-94b0-4fc8-b036-c53df917c0f8\") " pod="openstack/nova-api-db-create-2hrm4" Oct 08 14:28:53 crc kubenswrapper[4735]: I1008 14:28:53.833877 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-2hrm4" Oct 08 14:28:53 crc kubenswrapper[4735]: I1008 14:28:53.884210 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5h6x\" (UniqueName: \"kubernetes.io/projected/42c4c8f5-4b2e-438e-8b89-ec5eb3b9d01a-kube-api-access-p5h6x\") pod \"nova-cell0-db-create-mh2k9\" (UID: \"42c4c8f5-4b2e-438e-8b89-ec5eb3b9d01a\") " pod="openstack/nova-cell0-db-create-mh2k9" Oct 08 14:28:53 crc kubenswrapper[4735]: I1008 14:28:53.884373 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w89wh\" (UniqueName: \"kubernetes.io/projected/42e95717-223a-491f-b378-be990176c7b0-kube-api-access-w89wh\") pod \"nova-cell1-db-create-f6z9w\" (UID: \"42e95717-223a-491f-b378-be990176c7b0\") " pod="openstack/nova-cell1-db-create-f6z9w" Oct 08 14:28:53 crc kubenswrapper[4735]: I1008 14:28:53.905002 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5h6x\" (UniqueName: \"kubernetes.io/projected/42c4c8f5-4b2e-438e-8b89-ec5eb3b9d01a-kube-api-access-p5h6x\") pod \"nova-cell0-db-create-mh2k9\" (UID: \"42c4c8f5-4b2e-438e-8b89-ec5eb3b9d01a\") " pod="openstack/nova-cell0-db-create-mh2k9" Oct 08 14:28:53 crc kubenswrapper[4735]: I1008 14:28:53.915203 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-mh2k9" Oct 08 14:28:53 crc kubenswrapper[4735]: I1008 14:28:53.985810 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w89wh\" (UniqueName: \"kubernetes.io/projected/42e95717-223a-491f-b378-be990176c7b0-kube-api-access-w89wh\") pod \"nova-cell1-db-create-f6z9w\" (UID: \"42e95717-223a-491f-b378-be990176c7b0\") " pod="openstack/nova-cell1-db-create-f6z9w" Oct 08 14:28:54 crc kubenswrapper[4735]: I1008 14:28:54.013723 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w89wh\" (UniqueName: \"kubernetes.io/projected/42e95717-223a-491f-b378-be990176c7b0-kube-api-access-w89wh\") pod \"nova-cell1-db-create-f6z9w\" (UID: \"42e95717-223a-491f-b378-be990176c7b0\") " pod="openstack/nova-cell1-db-create-f6z9w" Oct 08 14:28:54 crc kubenswrapper[4735]: I1008 14:28:54.040648 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-f6z9w" Oct 08 14:28:54 crc kubenswrapper[4735]: I1008 14:28:54.311759 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-2hrm4"] Oct 08 14:28:54 crc kubenswrapper[4735]: W1008 14:28:54.316469 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64f6de72_94b0_4fc8_b036_c53df917c0f8.slice/crio-3c6da8d8b092827f384a6318d348d67f919d96e1cbaf00b1e4e3a3806ccb9e9d WatchSource:0}: Error finding container 3c6da8d8b092827f384a6318d348d67f919d96e1cbaf00b1e4e3a3806ccb9e9d: Status 404 returned error can't find the container with id 3c6da8d8b092827f384a6318d348d67f919d96e1cbaf00b1e4e3a3806ccb9e9d Oct 08 14:28:54 crc kubenswrapper[4735]: I1008 14:28:54.431976 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-mh2k9"] Oct 08 14:28:54 crc kubenswrapper[4735]: W1008 14:28:54.433720 4735 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod42c4c8f5_4b2e_438e_8b89_ec5eb3b9d01a.slice/crio-3ce52f11576f000a7e2261c519a3fbc8a8069d5d25dc7be3d9dda6ffe27352e1 WatchSource:0}: Error finding container 3ce52f11576f000a7e2261c519a3fbc8a8069d5d25dc7be3d9dda6ffe27352e1: Status 404 returned error can't find the container with id 3ce52f11576f000a7e2261c519a3fbc8a8069d5d25dc7be3d9dda6ffe27352e1 Oct 08 14:28:54 crc kubenswrapper[4735]: I1008 14:28:54.532855 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-f6z9w"] Oct 08 14:28:54 crc kubenswrapper[4735]: W1008 14:28:54.592779 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod42e95717_223a_491f_b378_be990176c7b0.slice/crio-b218ec58a7d0896558362a0bc40ad97496a6db5754c49db244c8df33a8a15957 WatchSource:0}: Error finding container b218ec58a7d0896558362a0bc40ad97496a6db5754c49db244c8df33a8a15957: Status 404 returned error can't find the container with id b218ec58a7d0896558362a0bc40ad97496a6db5754c49db244c8df33a8a15957 Oct 08 14:28:55 crc kubenswrapper[4735]: I1008 14:28:55.044010 4735 generic.go:334] "Generic (PLEG): container finished" podID="64f6de72-94b0-4fc8-b036-c53df917c0f8" containerID="7f610a6ca52fde1a70cf0ee9f627c6049dc699a4631062a89e62c8e869bb3362" exitCode=0 Oct 08 14:28:55 crc kubenswrapper[4735]: I1008 14:28:55.044069 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-2hrm4" event={"ID":"64f6de72-94b0-4fc8-b036-c53df917c0f8","Type":"ContainerDied","Data":"7f610a6ca52fde1a70cf0ee9f627c6049dc699a4631062a89e62c8e869bb3362"} Oct 08 14:28:55 crc kubenswrapper[4735]: I1008 14:28:55.044132 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-2hrm4" 
event={"ID":"64f6de72-94b0-4fc8-b036-c53df917c0f8","Type":"ContainerStarted","Data":"3c6da8d8b092827f384a6318d348d67f919d96e1cbaf00b1e4e3a3806ccb9e9d"} Oct 08 14:28:55 crc kubenswrapper[4735]: I1008 14:28:55.045970 4735 generic.go:334] "Generic (PLEG): container finished" podID="42e95717-223a-491f-b378-be990176c7b0" containerID="98eb2a7a37aefaa11b22cfcb4dac9d1fe9e33a121b719a1c66c890a7e68d3884" exitCode=0 Oct 08 14:28:55 crc kubenswrapper[4735]: I1008 14:28:55.046031 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-f6z9w" event={"ID":"42e95717-223a-491f-b378-be990176c7b0","Type":"ContainerDied","Data":"98eb2a7a37aefaa11b22cfcb4dac9d1fe9e33a121b719a1c66c890a7e68d3884"} Oct 08 14:28:55 crc kubenswrapper[4735]: I1008 14:28:55.046102 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-f6z9w" event={"ID":"42e95717-223a-491f-b378-be990176c7b0","Type":"ContainerStarted","Data":"b218ec58a7d0896558362a0bc40ad97496a6db5754c49db244c8df33a8a15957"} Oct 08 14:28:55 crc kubenswrapper[4735]: I1008 14:28:55.048437 4735 generic.go:334] "Generic (PLEG): container finished" podID="42c4c8f5-4b2e-438e-8b89-ec5eb3b9d01a" containerID="39f5a32d43f9b17ff945b6cfb6210993702fa6aae857594cef2a83b0ab8c376b" exitCode=0 Oct 08 14:28:55 crc kubenswrapper[4735]: I1008 14:28:55.048469 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-mh2k9" event={"ID":"42c4c8f5-4b2e-438e-8b89-ec5eb3b9d01a","Type":"ContainerDied","Data":"39f5a32d43f9b17ff945b6cfb6210993702fa6aae857594cef2a83b0ab8c376b"} Oct 08 14:28:55 crc kubenswrapper[4735]: I1008 14:28:55.048531 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-mh2k9" event={"ID":"42c4c8f5-4b2e-438e-8b89-ec5eb3b9d01a","Type":"ContainerStarted","Data":"3ce52f11576f000a7e2261c519a3fbc8a8069d5d25dc7be3d9dda6ffe27352e1"} Oct 08 14:28:56 crc kubenswrapper[4735]: I1008 14:28:56.432017 4735 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-2hrm4" Oct 08 14:28:56 crc kubenswrapper[4735]: I1008 14:28:56.438435 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-mh2k9" Oct 08 14:28:56 crc kubenswrapper[4735]: I1008 14:28:56.446386 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-f6z9w" Oct 08 14:28:56 crc kubenswrapper[4735]: I1008 14:28:56.526813 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gv86\" (UniqueName: \"kubernetes.io/projected/64f6de72-94b0-4fc8-b036-c53df917c0f8-kube-api-access-2gv86\") pod \"64f6de72-94b0-4fc8-b036-c53df917c0f8\" (UID: \"64f6de72-94b0-4fc8-b036-c53df917c0f8\") " Oct 08 14:28:56 crc kubenswrapper[4735]: I1008 14:28:56.526958 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5h6x\" (UniqueName: \"kubernetes.io/projected/42c4c8f5-4b2e-438e-8b89-ec5eb3b9d01a-kube-api-access-p5h6x\") pod \"42c4c8f5-4b2e-438e-8b89-ec5eb3b9d01a\" (UID: \"42c4c8f5-4b2e-438e-8b89-ec5eb3b9d01a\") " Oct 08 14:28:56 crc kubenswrapper[4735]: I1008 14:28:56.532693 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64f6de72-94b0-4fc8-b036-c53df917c0f8-kube-api-access-2gv86" (OuterVolumeSpecName: "kube-api-access-2gv86") pod "64f6de72-94b0-4fc8-b036-c53df917c0f8" (UID: "64f6de72-94b0-4fc8-b036-c53df917c0f8"). InnerVolumeSpecName "kube-api-access-2gv86". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:28:56 crc kubenswrapper[4735]: I1008 14:28:56.532973 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42c4c8f5-4b2e-438e-8b89-ec5eb3b9d01a-kube-api-access-p5h6x" (OuterVolumeSpecName: "kube-api-access-p5h6x") pod "42c4c8f5-4b2e-438e-8b89-ec5eb3b9d01a" (UID: "42c4c8f5-4b2e-438e-8b89-ec5eb3b9d01a"). InnerVolumeSpecName "kube-api-access-p5h6x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:28:56 crc kubenswrapper[4735]: I1008 14:28:56.628012 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w89wh\" (UniqueName: \"kubernetes.io/projected/42e95717-223a-491f-b378-be990176c7b0-kube-api-access-w89wh\") pod \"42e95717-223a-491f-b378-be990176c7b0\" (UID: \"42e95717-223a-491f-b378-be990176c7b0\") " Oct 08 14:28:56 crc kubenswrapper[4735]: I1008 14:28:56.628584 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gv86\" (UniqueName: \"kubernetes.io/projected/64f6de72-94b0-4fc8-b036-c53df917c0f8-kube-api-access-2gv86\") on node \"crc\" DevicePath \"\"" Oct 08 14:28:56 crc kubenswrapper[4735]: I1008 14:28:56.628601 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5h6x\" (UniqueName: \"kubernetes.io/projected/42c4c8f5-4b2e-438e-8b89-ec5eb3b9d01a-kube-api-access-p5h6x\") on node \"crc\" DevicePath \"\"" Oct 08 14:28:56 crc kubenswrapper[4735]: I1008 14:28:56.630714 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42e95717-223a-491f-b378-be990176c7b0-kube-api-access-w89wh" (OuterVolumeSpecName: "kube-api-access-w89wh") pod "42e95717-223a-491f-b378-be990176c7b0" (UID: "42e95717-223a-491f-b378-be990176c7b0"). InnerVolumeSpecName "kube-api-access-w89wh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:28:56 crc kubenswrapper[4735]: I1008 14:28:56.730105 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w89wh\" (UniqueName: \"kubernetes.io/projected/42e95717-223a-491f-b378-be990176c7b0-kube-api-access-w89wh\") on node \"crc\" DevicePath \"\"" Oct 08 14:28:57 crc kubenswrapper[4735]: I1008 14:28:57.069017 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-2hrm4" event={"ID":"64f6de72-94b0-4fc8-b036-c53df917c0f8","Type":"ContainerDied","Data":"3c6da8d8b092827f384a6318d348d67f919d96e1cbaf00b1e4e3a3806ccb9e9d"} Oct 08 14:28:57 crc kubenswrapper[4735]: I1008 14:28:57.069067 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c6da8d8b092827f384a6318d348d67f919d96e1cbaf00b1e4e3a3806ccb9e9d" Oct 08 14:28:57 crc kubenswrapper[4735]: I1008 14:28:57.069133 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-2hrm4" Oct 08 14:28:57 crc kubenswrapper[4735]: I1008 14:28:57.072411 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-f6z9w" Oct 08 14:28:57 crc kubenswrapper[4735]: I1008 14:28:57.072814 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-f6z9w" event={"ID":"42e95717-223a-491f-b378-be990176c7b0","Type":"ContainerDied","Data":"b218ec58a7d0896558362a0bc40ad97496a6db5754c49db244c8df33a8a15957"} Oct 08 14:28:57 crc kubenswrapper[4735]: I1008 14:28:57.072848 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b218ec58a7d0896558362a0bc40ad97496a6db5754c49db244c8df33a8a15957" Oct 08 14:28:57 crc kubenswrapper[4735]: I1008 14:28:57.074212 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-mh2k9" event={"ID":"42c4c8f5-4b2e-438e-8b89-ec5eb3b9d01a","Type":"ContainerDied","Data":"3ce52f11576f000a7e2261c519a3fbc8a8069d5d25dc7be3d9dda6ffe27352e1"} Oct 08 14:28:57 crc kubenswrapper[4735]: I1008 14:28:57.074268 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ce52f11576f000a7e2261c519a3fbc8a8069d5d25dc7be3d9dda6ffe27352e1" Oct 08 14:28:57 crc kubenswrapper[4735]: I1008 14:28:57.074231 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-mh2k9" Oct 08 14:29:01 crc kubenswrapper[4735]: I1008 14:29:01.912880 4735 scope.go:117] "RemoveContainer" containerID="d6c737b897aa463dba6310d9840b3f5b8e05389b1e2bc8c1a8b7b77133f4dc28" Oct 08 14:29:03 crc kubenswrapper[4735]: I1008 14:29:03.134921 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" event={"ID":"2dfd0474-f64a-4d3b-9990-df182d079b10","Type":"ContainerStarted","Data":"d7e72bb2e03843dd08508e1aaf6943aa7ade5521224f4749387f79d3a9ac80f3"} Oct 08 14:29:03 crc kubenswrapper[4735]: I1008 14:29:03.738728 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-4102-account-create-pb929"] Oct 08 14:29:03 crc kubenswrapper[4735]: E1008 14:29:03.739215 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64f6de72-94b0-4fc8-b036-c53df917c0f8" containerName="mariadb-database-create" Oct 08 14:29:03 crc kubenswrapper[4735]: I1008 14:29:03.739236 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="64f6de72-94b0-4fc8-b036-c53df917c0f8" containerName="mariadb-database-create" Oct 08 14:29:03 crc kubenswrapper[4735]: E1008 14:29:03.739280 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42c4c8f5-4b2e-438e-8b89-ec5eb3b9d01a" containerName="mariadb-database-create" Oct 08 14:29:03 crc kubenswrapper[4735]: I1008 14:29:03.739290 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="42c4c8f5-4b2e-438e-8b89-ec5eb3b9d01a" containerName="mariadb-database-create" Oct 08 14:29:03 crc kubenswrapper[4735]: E1008 14:29:03.739312 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42e95717-223a-491f-b378-be990176c7b0" containerName="mariadb-database-create" Oct 08 14:29:03 crc kubenswrapper[4735]: I1008 14:29:03.739321 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="42e95717-223a-491f-b378-be990176c7b0" containerName="mariadb-database-create" Oct 08 
14:29:03 crc kubenswrapper[4735]: I1008 14:29:03.739543 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="42e95717-223a-491f-b378-be990176c7b0" containerName="mariadb-database-create" Oct 08 14:29:03 crc kubenswrapper[4735]: I1008 14:29:03.739568 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="42c4c8f5-4b2e-438e-8b89-ec5eb3b9d01a" containerName="mariadb-database-create" Oct 08 14:29:03 crc kubenswrapper[4735]: I1008 14:29:03.739585 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="64f6de72-94b0-4fc8-b036-c53df917c0f8" containerName="mariadb-database-create" Oct 08 14:29:03 crc kubenswrapper[4735]: I1008 14:29:03.740324 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-4102-account-create-pb929" Oct 08 14:29:03 crc kubenswrapper[4735]: I1008 14:29:03.744442 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 08 14:29:03 crc kubenswrapper[4735]: I1008 14:29:03.747198 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-4102-account-create-pb929"] Oct 08 14:29:03 crc kubenswrapper[4735]: I1008 14:29:03.855566 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmwd4\" (UniqueName: \"kubernetes.io/projected/5b895276-2adc-4f16-8a1f-04ed071342e7-kube-api-access-jmwd4\") pod \"nova-api-4102-account-create-pb929\" (UID: \"5b895276-2adc-4f16-8a1f-04ed071342e7\") " pod="openstack/nova-api-4102-account-create-pb929" Oct 08 14:29:03 crc kubenswrapper[4735]: I1008 14:29:03.924077 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-f8ac-account-create-x5vzj"] Oct 08 14:29:03 crc kubenswrapper[4735]: I1008 14:29:03.925656 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-f8ac-account-create-x5vzj" Oct 08 14:29:03 crc kubenswrapper[4735]: I1008 14:29:03.927671 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Oct 08 14:29:03 crc kubenswrapper[4735]: I1008 14:29:03.934110 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-f8ac-account-create-x5vzj"] Oct 08 14:29:03 crc kubenswrapper[4735]: I1008 14:29:03.958551 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmwd4\" (UniqueName: \"kubernetes.io/projected/5b895276-2adc-4f16-8a1f-04ed071342e7-kube-api-access-jmwd4\") pod \"nova-api-4102-account-create-pb929\" (UID: \"5b895276-2adc-4f16-8a1f-04ed071342e7\") " pod="openstack/nova-api-4102-account-create-pb929" Oct 08 14:29:03 crc kubenswrapper[4735]: I1008 14:29:03.988655 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmwd4\" (UniqueName: \"kubernetes.io/projected/5b895276-2adc-4f16-8a1f-04ed071342e7-kube-api-access-jmwd4\") pod \"nova-api-4102-account-create-pb929\" (UID: \"5b895276-2adc-4f16-8a1f-04ed071342e7\") " pod="openstack/nova-api-4102-account-create-pb929" Oct 08 14:29:04 crc kubenswrapper[4735]: I1008 14:29:04.060323 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwnml\" (UniqueName: \"kubernetes.io/projected/02e5378c-61f6-43b9-a69b-0a2004e44291-kube-api-access-jwnml\") pod \"nova-cell0-f8ac-account-create-x5vzj\" (UID: \"02e5378c-61f6-43b9-a69b-0a2004e44291\") " pod="openstack/nova-cell0-f8ac-account-create-x5vzj" Oct 08 14:29:04 crc kubenswrapper[4735]: I1008 14:29:04.065864 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-4102-account-create-pb929" Oct 08 14:29:04 crc kubenswrapper[4735]: I1008 14:29:04.132237 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-0b6c-account-create-wrs4w"] Oct 08 14:29:04 crc kubenswrapper[4735]: I1008 14:29:04.133694 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-0b6c-account-create-wrs4w" Oct 08 14:29:04 crc kubenswrapper[4735]: I1008 14:29:04.138730 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Oct 08 14:29:04 crc kubenswrapper[4735]: I1008 14:29:04.141750 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-0b6c-account-create-wrs4w"] Oct 08 14:29:04 crc kubenswrapper[4735]: I1008 14:29:04.162139 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwnml\" (UniqueName: \"kubernetes.io/projected/02e5378c-61f6-43b9-a69b-0a2004e44291-kube-api-access-jwnml\") pod \"nova-cell0-f8ac-account-create-x5vzj\" (UID: \"02e5378c-61f6-43b9-a69b-0a2004e44291\") " pod="openstack/nova-cell0-f8ac-account-create-x5vzj" Oct 08 14:29:04 crc kubenswrapper[4735]: I1008 14:29:04.190706 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwnml\" (UniqueName: \"kubernetes.io/projected/02e5378c-61f6-43b9-a69b-0a2004e44291-kube-api-access-jwnml\") pod \"nova-cell0-f8ac-account-create-x5vzj\" (UID: \"02e5378c-61f6-43b9-a69b-0a2004e44291\") " pod="openstack/nova-cell0-f8ac-account-create-x5vzj" Oct 08 14:29:04 crc kubenswrapper[4735]: I1008 14:29:04.244280 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-f8ac-account-create-x5vzj" Oct 08 14:29:04 crc kubenswrapper[4735]: I1008 14:29:04.264227 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m695g\" (UniqueName: \"kubernetes.io/projected/8728623f-2ab6-4156-b915-a376fe953042-kube-api-access-m695g\") pod \"nova-cell1-0b6c-account-create-wrs4w\" (UID: \"8728623f-2ab6-4156-b915-a376fe953042\") " pod="openstack/nova-cell1-0b6c-account-create-wrs4w" Oct 08 14:29:04 crc kubenswrapper[4735]: I1008 14:29:04.365985 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m695g\" (UniqueName: \"kubernetes.io/projected/8728623f-2ab6-4156-b915-a376fe953042-kube-api-access-m695g\") pod \"nova-cell1-0b6c-account-create-wrs4w\" (UID: \"8728623f-2ab6-4156-b915-a376fe953042\") " pod="openstack/nova-cell1-0b6c-account-create-wrs4w" Oct 08 14:29:04 crc kubenswrapper[4735]: I1008 14:29:04.385795 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m695g\" (UniqueName: \"kubernetes.io/projected/8728623f-2ab6-4156-b915-a376fe953042-kube-api-access-m695g\") pod \"nova-cell1-0b6c-account-create-wrs4w\" (UID: \"8728623f-2ab6-4156-b915-a376fe953042\") " pod="openstack/nova-cell1-0b6c-account-create-wrs4w" Oct 08 14:29:04 crc kubenswrapper[4735]: I1008 14:29:04.536817 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-4102-account-create-pb929"] Oct 08 14:29:04 crc kubenswrapper[4735]: I1008 14:29:04.545821 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-0b6c-account-create-wrs4w" Oct 08 14:29:04 crc kubenswrapper[4735]: I1008 14:29:04.730514 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-f8ac-account-create-x5vzj"] Oct 08 14:29:04 crc kubenswrapper[4735]: W1008 14:29:04.732845 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02e5378c_61f6_43b9_a69b_0a2004e44291.slice/crio-334e7db70c35981acf22d080e7d6c8891d3f60e9fa857924960ac11348506567 WatchSource:0}: Error finding container 334e7db70c35981acf22d080e7d6c8891d3f60e9fa857924960ac11348506567: Status 404 returned error can't find the container with id 334e7db70c35981acf22d080e7d6c8891d3f60e9fa857924960ac11348506567 Oct 08 14:29:04 crc kubenswrapper[4735]: I1008 14:29:04.768045 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-0b6c-account-create-wrs4w"] Oct 08 14:29:04 crc kubenswrapper[4735]: W1008 14:29:04.833746 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8728623f_2ab6_4156_b915_a376fe953042.slice/crio-11b5acb000d700588afe5c060fc91555bdb9f1370c60951a080571f40fceb79a WatchSource:0}: Error finding container 11b5acb000d700588afe5c060fc91555bdb9f1370c60951a080571f40fceb79a: Status 404 returned error can't find the container with id 11b5acb000d700588afe5c060fc91555bdb9f1370c60951a080571f40fceb79a Oct 08 14:29:05 crc kubenswrapper[4735]: I1008 14:29:05.183963 4735 generic.go:334] "Generic (PLEG): container finished" podID="8728623f-2ab6-4156-b915-a376fe953042" containerID="e277bbac2b51a462bdc21d57f77af0327826246317ad90fbd88ca1b0254596fe" exitCode=0 Oct 08 14:29:05 crc kubenswrapper[4735]: I1008 14:29:05.184451 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-0b6c-account-create-wrs4w" 
event={"ID":"8728623f-2ab6-4156-b915-a376fe953042","Type":"ContainerDied","Data":"e277bbac2b51a462bdc21d57f77af0327826246317ad90fbd88ca1b0254596fe"} Oct 08 14:29:05 crc kubenswrapper[4735]: I1008 14:29:05.184497 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-0b6c-account-create-wrs4w" event={"ID":"8728623f-2ab6-4156-b915-a376fe953042","Type":"ContainerStarted","Data":"11b5acb000d700588afe5c060fc91555bdb9f1370c60951a080571f40fceb79a"} Oct 08 14:29:05 crc kubenswrapper[4735]: I1008 14:29:05.188057 4735 generic.go:334] "Generic (PLEG): container finished" podID="02e5378c-61f6-43b9-a69b-0a2004e44291" containerID="b8b583e4f64b8ee9bcd8e88460dffe6d69471197ec74c19a97363d7f12516e33" exitCode=0 Oct 08 14:29:05 crc kubenswrapper[4735]: I1008 14:29:05.188125 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-f8ac-account-create-x5vzj" event={"ID":"02e5378c-61f6-43b9-a69b-0a2004e44291","Type":"ContainerDied","Data":"b8b583e4f64b8ee9bcd8e88460dffe6d69471197ec74c19a97363d7f12516e33"} Oct 08 14:29:05 crc kubenswrapper[4735]: I1008 14:29:05.188147 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-f8ac-account-create-x5vzj" event={"ID":"02e5378c-61f6-43b9-a69b-0a2004e44291","Type":"ContainerStarted","Data":"334e7db70c35981acf22d080e7d6c8891d3f60e9fa857924960ac11348506567"} Oct 08 14:29:05 crc kubenswrapper[4735]: I1008 14:29:05.190384 4735 generic.go:334] "Generic (PLEG): container finished" podID="5b895276-2adc-4f16-8a1f-04ed071342e7" containerID="fc36b142d01cc7ac85be46df09c036d26a2c51fa3edb11b78ba9db596eedb5b1" exitCode=0 Oct 08 14:29:05 crc kubenswrapper[4735]: I1008 14:29:05.190434 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-4102-account-create-pb929" event={"ID":"5b895276-2adc-4f16-8a1f-04ed071342e7","Type":"ContainerDied","Data":"fc36b142d01cc7ac85be46df09c036d26a2c51fa3edb11b78ba9db596eedb5b1"} Oct 08 14:29:05 crc kubenswrapper[4735]: I1008 
14:29:05.190457 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-4102-account-create-pb929" event={"ID":"5b895276-2adc-4f16-8a1f-04ed071342e7","Type":"ContainerStarted","Data":"20ca52c376ff17f4d490ccf4e02f40598a6b22ad5077ca918c72f1df285f2d87"} Oct 08 14:29:06 crc kubenswrapper[4735]: I1008 14:29:06.548842 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-0b6c-account-create-wrs4w" Oct 08 14:29:06 crc kubenswrapper[4735]: I1008 14:29:06.647945 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-4102-account-create-pb929" Oct 08 14:29:06 crc kubenswrapper[4735]: I1008 14:29:06.654151 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-f8ac-account-create-x5vzj" Oct 08 14:29:06 crc kubenswrapper[4735]: I1008 14:29:06.705322 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m695g\" (UniqueName: \"kubernetes.io/projected/8728623f-2ab6-4156-b915-a376fe953042-kube-api-access-m695g\") pod \"8728623f-2ab6-4156-b915-a376fe953042\" (UID: \"8728623f-2ab6-4156-b915-a376fe953042\") " Oct 08 14:29:06 crc kubenswrapper[4735]: I1008 14:29:06.713329 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8728623f-2ab6-4156-b915-a376fe953042-kube-api-access-m695g" (OuterVolumeSpecName: "kube-api-access-m695g") pod "8728623f-2ab6-4156-b915-a376fe953042" (UID: "8728623f-2ab6-4156-b915-a376fe953042"). InnerVolumeSpecName "kube-api-access-m695g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:29:06 crc kubenswrapper[4735]: I1008 14:29:06.806740 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwnml\" (UniqueName: \"kubernetes.io/projected/02e5378c-61f6-43b9-a69b-0a2004e44291-kube-api-access-jwnml\") pod \"02e5378c-61f6-43b9-a69b-0a2004e44291\" (UID: \"02e5378c-61f6-43b9-a69b-0a2004e44291\") " Oct 08 14:29:06 crc kubenswrapper[4735]: I1008 14:29:06.807234 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmwd4\" (UniqueName: \"kubernetes.io/projected/5b895276-2adc-4f16-8a1f-04ed071342e7-kube-api-access-jmwd4\") pod \"5b895276-2adc-4f16-8a1f-04ed071342e7\" (UID: \"5b895276-2adc-4f16-8a1f-04ed071342e7\") " Oct 08 14:29:06 crc kubenswrapper[4735]: I1008 14:29:06.807837 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m695g\" (UniqueName: \"kubernetes.io/projected/8728623f-2ab6-4156-b915-a376fe953042-kube-api-access-m695g\") on node \"crc\" DevicePath \"\"" Oct 08 14:29:06 crc kubenswrapper[4735]: I1008 14:29:06.809957 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02e5378c-61f6-43b9-a69b-0a2004e44291-kube-api-access-jwnml" (OuterVolumeSpecName: "kube-api-access-jwnml") pod "02e5378c-61f6-43b9-a69b-0a2004e44291" (UID: "02e5378c-61f6-43b9-a69b-0a2004e44291"). InnerVolumeSpecName "kube-api-access-jwnml". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:29:06 crc kubenswrapper[4735]: I1008 14:29:06.810046 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b895276-2adc-4f16-8a1f-04ed071342e7-kube-api-access-jmwd4" (OuterVolumeSpecName: "kube-api-access-jmwd4") pod "5b895276-2adc-4f16-8a1f-04ed071342e7" (UID: "5b895276-2adc-4f16-8a1f-04ed071342e7"). InnerVolumeSpecName "kube-api-access-jmwd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:29:06 crc kubenswrapper[4735]: I1008 14:29:06.909722 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmwd4\" (UniqueName: \"kubernetes.io/projected/5b895276-2adc-4f16-8a1f-04ed071342e7-kube-api-access-jmwd4\") on node \"crc\" DevicePath \"\"" Oct 08 14:29:06 crc kubenswrapper[4735]: I1008 14:29:06.910267 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwnml\" (UniqueName: \"kubernetes.io/projected/02e5378c-61f6-43b9-a69b-0a2004e44291-kube-api-access-jwnml\") on node \"crc\" DevicePath \"\"" Oct 08 14:29:07 crc kubenswrapper[4735]: I1008 14:29:07.209797 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-f8ac-account-create-x5vzj" Oct 08 14:29:07 crc kubenswrapper[4735]: I1008 14:29:07.209875 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-f8ac-account-create-x5vzj" event={"ID":"02e5378c-61f6-43b9-a69b-0a2004e44291","Type":"ContainerDied","Data":"334e7db70c35981acf22d080e7d6c8891d3f60e9fa857924960ac11348506567"} Oct 08 14:29:07 crc kubenswrapper[4735]: I1008 14:29:07.210153 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="334e7db70c35981acf22d080e7d6c8891d3f60e9fa857924960ac11348506567" Oct 08 14:29:07 crc kubenswrapper[4735]: I1008 14:29:07.212419 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-4102-account-create-pb929" Oct 08 14:29:07 crc kubenswrapper[4735]: I1008 14:29:07.212407 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-4102-account-create-pb929" event={"ID":"5b895276-2adc-4f16-8a1f-04ed071342e7","Type":"ContainerDied","Data":"20ca52c376ff17f4d490ccf4e02f40598a6b22ad5077ca918c72f1df285f2d87"} Oct 08 14:29:07 crc kubenswrapper[4735]: I1008 14:29:07.212656 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20ca52c376ff17f4d490ccf4e02f40598a6b22ad5077ca918c72f1df285f2d87" Oct 08 14:29:07 crc kubenswrapper[4735]: I1008 14:29:07.214014 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-0b6c-account-create-wrs4w" event={"ID":"8728623f-2ab6-4156-b915-a376fe953042","Type":"ContainerDied","Data":"11b5acb000d700588afe5c060fc91555bdb9f1370c60951a080571f40fceb79a"} Oct 08 14:29:07 crc kubenswrapper[4735]: I1008 14:29:07.214037 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11b5acb000d700588afe5c060fc91555bdb9f1370c60951a080571f40fceb79a" Oct 08 14:29:07 crc kubenswrapper[4735]: I1008 14:29:07.214151 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-0b6c-account-create-wrs4w" Oct 08 14:29:08 crc kubenswrapper[4735]: I1008 14:29:08.513046 4735 scope.go:117] "RemoveContainer" containerID="e4611f988abc1ce086cc4ac477391d8ab29b6b87fb214bae70fb6d991cdaf321" Oct 08 14:29:08 crc kubenswrapper[4735]: I1008 14:29:08.570841 4735 scope.go:117] "RemoveContainer" containerID="62f718a1ae614b628fd716a84d297856ce4cdae04cd5c92fba6be937fc144b52" Oct 08 14:29:08 crc kubenswrapper[4735]: I1008 14:29:08.587516 4735 scope.go:117] "RemoveContainer" containerID="c013178bb7d6c448b65afc7594c9fe14dfee395542fe7307d13cff538a917934" Oct 08 14:29:09 crc kubenswrapper[4735]: I1008 14:29:09.143946 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-p8gzl"] Oct 08 14:29:09 crc kubenswrapper[4735]: E1008 14:29:09.149505 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b895276-2adc-4f16-8a1f-04ed071342e7" containerName="mariadb-account-create" Oct 08 14:29:09 crc kubenswrapper[4735]: I1008 14:29:09.149538 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b895276-2adc-4f16-8a1f-04ed071342e7" containerName="mariadb-account-create" Oct 08 14:29:09 crc kubenswrapper[4735]: E1008 14:29:09.149575 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02e5378c-61f6-43b9-a69b-0a2004e44291" containerName="mariadb-account-create" Oct 08 14:29:09 crc kubenswrapper[4735]: I1008 14:29:09.149581 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="02e5378c-61f6-43b9-a69b-0a2004e44291" containerName="mariadb-account-create" Oct 08 14:29:09 crc kubenswrapper[4735]: E1008 14:29:09.149592 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8728623f-2ab6-4156-b915-a376fe953042" containerName="mariadb-account-create" Oct 08 14:29:09 crc kubenswrapper[4735]: I1008 14:29:09.149597 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="8728623f-2ab6-4156-b915-a376fe953042" 
containerName="mariadb-account-create" Oct 08 14:29:09 crc kubenswrapper[4735]: I1008 14:29:09.149853 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="8728623f-2ab6-4156-b915-a376fe953042" containerName="mariadb-account-create" Oct 08 14:29:09 crc kubenswrapper[4735]: I1008 14:29:09.149870 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="02e5378c-61f6-43b9-a69b-0a2004e44291" containerName="mariadb-account-create" Oct 08 14:29:09 crc kubenswrapper[4735]: I1008 14:29:09.149888 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b895276-2adc-4f16-8a1f-04ed071342e7" containerName="mariadb-account-create" Oct 08 14:29:09 crc kubenswrapper[4735]: I1008 14:29:09.150587 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-p8gzl" Oct 08 14:29:09 crc kubenswrapper[4735]: I1008 14:29:09.155673 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-nr4gc" Oct 08 14:29:09 crc kubenswrapper[4735]: I1008 14:29:09.155690 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 08 14:29:09 crc kubenswrapper[4735]: I1008 14:29:09.156195 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Oct 08 14:29:09 crc kubenswrapper[4735]: I1008 14:29:09.176098 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-p8gzl"] Oct 08 14:29:09 crc kubenswrapper[4735]: I1008 14:29:09.252693 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csdfl\" (UniqueName: \"kubernetes.io/projected/bc2f1bfd-72c0-4de9-a29e-a8258afbbf89-kube-api-access-csdfl\") pod \"nova-cell0-conductor-db-sync-p8gzl\" (UID: \"bc2f1bfd-72c0-4de9-a29e-a8258afbbf89\") " pod="openstack/nova-cell0-conductor-db-sync-p8gzl" Oct 08 14:29:09 
crc kubenswrapper[4735]: I1008 14:29:09.252789 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc2f1bfd-72c0-4de9-a29e-a8258afbbf89-config-data\") pod \"nova-cell0-conductor-db-sync-p8gzl\" (UID: \"bc2f1bfd-72c0-4de9-a29e-a8258afbbf89\") " pod="openstack/nova-cell0-conductor-db-sync-p8gzl" Oct 08 14:29:09 crc kubenswrapper[4735]: I1008 14:29:09.252815 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc2f1bfd-72c0-4de9-a29e-a8258afbbf89-scripts\") pod \"nova-cell0-conductor-db-sync-p8gzl\" (UID: \"bc2f1bfd-72c0-4de9-a29e-a8258afbbf89\") " pod="openstack/nova-cell0-conductor-db-sync-p8gzl" Oct 08 14:29:09 crc kubenswrapper[4735]: I1008 14:29:09.252838 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc2f1bfd-72c0-4de9-a29e-a8258afbbf89-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-p8gzl\" (UID: \"bc2f1bfd-72c0-4de9-a29e-a8258afbbf89\") " pod="openstack/nova-cell0-conductor-db-sync-p8gzl" Oct 08 14:29:09 crc kubenswrapper[4735]: I1008 14:29:09.354304 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csdfl\" (UniqueName: \"kubernetes.io/projected/bc2f1bfd-72c0-4de9-a29e-a8258afbbf89-kube-api-access-csdfl\") pod \"nova-cell0-conductor-db-sync-p8gzl\" (UID: \"bc2f1bfd-72c0-4de9-a29e-a8258afbbf89\") " pod="openstack/nova-cell0-conductor-db-sync-p8gzl" Oct 08 14:29:09 crc kubenswrapper[4735]: I1008 14:29:09.354631 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc2f1bfd-72c0-4de9-a29e-a8258afbbf89-config-data\") pod \"nova-cell0-conductor-db-sync-p8gzl\" (UID: \"bc2f1bfd-72c0-4de9-a29e-a8258afbbf89\") " 
pod="openstack/nova-cell0-conductor-db-sync-p8gzl" Oct 08 14:29:09 crc kubenswrapper[4735]: I1008 14:29:09.354738 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc2f1bfd-72c0-4de9-a29e-a8258afbbf89-scripts\") pod \"nova-cell0-conductor-db-sync-p8gzl\" (UID: \"bc2f1bfd-72c0-4de9-a29e-a8258afbbf89\") " pod="openstack/nova-cell0-conductor-db-sync-p8gzl" Oct 08 14:29:09 crc kubenswrapper[4735]: I1008 14:29:09.354820 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc2f1bfd-72c0-4de9-a29e-a8258afbbf89-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-p8gzl\" (UID: \"bc2f1bfd-72c0-4de9-a29e-a8258afbbf89\") " pod="openstack/nova-cell0-conductor-db-sync-p8gzl" Oct 08 14:29:09 crc kubenswrapper[4735]: I1008 14:29:09.360293 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc2f1bfd-72c0-4de9-a29e-a8258afbbf89-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-p8gzl\" (UID: \"bc2f1bfd-72c0-4de9-a29e-a8258afbbf89\") " pod="openstack/nova-cell0-conductor-db-sync-p8gzl" Oct 08 14:29:09 crc kubenswrapper[4735]: I1008 14:29:09.363052 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc2f1bfd-72c0-4de9-a29e-a8258afbbf89-scripts\") pod \"nova-cell0-conductor-db-sync-p8gzl\" (UID: \"bc2f1bfd-72c0-4de9-a29e-a8258afbbf89\") " pod="openstack/nova-cell0-conductor-db-sync-p8gzl" Oct 08 14:29:09 crc kubenswrapper[4735]: I1008 14:29:09.367958 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc2f1bfd-72c0-4de9-a29e-a8258afbbf89-config-data\") pod \"nova-cell0-conductor-db-sync-p8gzl\" (UID: \"bc2f1bfd-72c0-4de9-a29e-a8258afbbf89\") " pod="openstack/nova-cell0-conductor-db-sync-p8gzl" 
Oct 08 14:29:09 crc kubenswrapper[4735]: I1008 14:29:09.370613 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csdfl\" (UniqueName: \"kubernetes.io/projected/bc2f1bfd-72c0-4de9-a29e-a8258afbbf89-kube-api-access-csdfl\") pod \"nova-cell0-conductor-db-sync-p8gzl\" (UID: \"bc2f1bfd-72c0-4de9-a29e-a8258afbbf89\") " pod="openstack/nova-cell0-conductor-db-sync-p8gzl" Oct 08 14:29:09 crc kubenswrapper[4735]: I1008 14:29:09.474860 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-p8gzl" Oct 08 14:29:09 crc kubenswrapper[4735]: W1008 14:29:09.914610 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc2f1bfd_72c0_4de9_a29e_a8258afbbf89.slice/crio-ea11f39b160d6d3caebfaf87e0196e7869ebf54c6e255d7cafb945f67a6d86c3 WatchSource:0}: Error finding container ea11f39b160d6d3caebfaf87e0196e7869ebf54c6e255d7cafb945f67a6d86c3: Status 404 returned error can't find the container with id ea11f39b160d6d3caebfaf87e0196e7869ebf54c6e255d7cafb945f67a6d86c3 Oct 08 14:29:09 crc kubenswrapper[4735]: I1008 14:29:09.917962 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-p8gzl"] Oct 08 14:29:10 crc kubenswrapper[4735]: I1008 14:29:10.241579 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-p8gzl" event={"ID":"bc2f1bfd-72c0-4de9-a29e-a8258afbbf89","Type":"ContainerStarted","Data":"d6d14aa186181136687c60f46a9cce968c91550db2b7d4f2f3fc62a96587f37d"} Oct 08 14:29:10 crc kubenswrapper[4735]: I1008 14:29:10.241916 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-p8gzl" event={"ID":"bc2f1bfd-72c0-4de9-a29e-a8258afbbf89","Type":"ContainerStarted","Data":"ea11f39b160d6d3caebfaf87e0196e7869ebf54c6e255d7cafb945f67a6d86c3"} Oct 08 14:29:10 crc kubenswrapper[4735]: I1008 
14:29:10.266209 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-p8gzl" podStartSLOduration=1.266190558 podStartE2EDuration="1.266190558s" podCreationTimestamp="2025-10-08 14:29:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:29:10.260126702 +0000 UTC m=+5600.068431246" watchObservedRunningTime="2025-10-08 14:29:10.266190558 +0000 UTC m=+5600.074495102" Oct 08 14:29:16 crc kubenswrapper[4735]: I1008 14:29:16.305782 4735 generic.go:334] "Generic (PLEG): container finished" podID="bc2f1bfd-72c0-4de9-a29e-a8258afbbf89" containerID="d6d14aa186181136687c60f46a9cce968c91550db2b7d4f2f3fc62a96587f37d" exitCode=0 Oct 08 14:29:16 crc kubenswrapper[4735]: I1008 14:29:16.305839 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-p8gzl" event={"ID":"bc2f1bfd-72c0-4de9-a29e-a8258afbbf89","Type":"ContainerDied","Data":"d6d14aa186181136687c60f46a9cce968c91550db2b7d4f2f3fc62a96587f37d"} Oct 08 14:29:17 crc kubenswrapper[4735]: I1008 14:29:17.618949 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-p8gzl" Oct 08 14:29:17 crc kubenswrapper[4735]: I1008 14:29:17.719329 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-csdfl\" (UniqueName: \"kubernetes.io/projected/bc2f1bfd-72c0-4de9-a29e-a8258afbbf89-kube-api-access-csdfl\") pod \"bc2f1bfd-72c0-4de9-a29e-a8258afbbf89\" (UID: \"bc2f1bfd-72c0-4de9-a29e-a8258afbbf89\") " Oct 08 14:29:17 crc kubenswrapper[4735]: I1008 14:29:17.719438 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc2f1bfd-72c0-4de9-a29e-a8258afbbf89-combined-ca-bundle\") pod \"bc2f1bfd-72c0-4de9-a29e-a8258afbbf89\" (UID: \"bc2f1bfd-72c0-4de9-a29e-a8258afbbf89\") " Oct 08 14:29:17 crc kubenswrapper[4735]: I1008 14:29:17.719615 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc2f1bfd-72c0-4de9-a29e-a8258afbbf89-config-data\") pod \"bc2f1bfd-72c0-4de9-a29e-a8258afbbf89\" (UID: \"bc2f1bfd-72c0-4de9-a29e-a8258afbbf89\") " Oct 08 14:29:17 crc kubenswrapper[4735]: I1008 14:29:17.719786 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc2f1bfd-72c0-4de9-a29e-a8258afbbf89-scripts\") pod \"bc2f1bfd-72c0-4de9-a29e-a8258afbbf89\" (UID: \"bc2f1bfd-72c0-4de9-a29e-a8258afbbf89\") " Oct 08 14:29:17 crc kubenswrapper[4735]: I1008 14:29:17.725674 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc2f1bfd-72c0-4de9-a29e-a8258afbbf89-scripts" (OuterVolumeSpecName: "scripts") pod "bc2f1bfd-72c0-4de9-a29e-a8258afbbf89" (UID: "bc2f1bfd-72c0-4de9-a29e-a8258afbbf89"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:29:17 crc kubenswrapper[4735]: I1008 14:29:17.726814 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc2f1bfd-72c0-4de9-a29e-a8258afbbf89-kube-api-access-csdfl" (OuterVolumeSpecName: "kube-api-access-csdfl") pod "bc2f1bfd-72c0-4de9-a29e-a8258afbbf89" (UID: "bc2f1bfd-72c0-4de9-a29e-a8258afbbf89"). InnerVolumeSpecName "kube-api-access-csdfl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:29:17 crc kubenswrapper[4735]: I1008 14:29:17.748857 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc2f1bfd-72c0-4de9-a29e-a8258afbbf89-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bc2f1bfd-72c0-4de9-a29e-a8258afbbf89" (UID: "bc2f1bfd-72c0-4de9-a29e-a8258afbbf89"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:29:17 crc kubenswrapper[4735]: I1008 14:29:17.755684 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc2f1bfd-72c0-4de9-a29e-a8258afbbf89-config-data" (OuterVolumeSpecName: "config-data") pod "bc2f1bfd-72c0-4de9-a29e-a8258afbbf89" (UID: "bc2f1bfd-72c0-4de9-a29e-a8258afbbf89"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:29:17 crc kubenswrapper[4735]: I1008 14:29:17.821921 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc2f1bfd-72c0-4de9-a29e-a8258afbbf89-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 14:29:17 crc kubenswrapper[4735]: I1008 14:29:17.821963 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-csdfl\" (UniqueName: \"kubernetes.io/projected/bc2f1bfd-72c0-4de9-a29e-a8258afbbf89-kube-api-access-csdfl\") on node \"crc\" DevicePath \"\"" Oct 08 14:29:17 crc kubenswrapper[4735]: I1008 14:29:17.821980 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc2f1bfd-72c0-4de9-a29e-a8258afbbf89-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 14:29:17 crc kubenswrapper[4735]: I1008 14:29:17.821992 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc2f1bfd-72c0-4de9-a29e-a8258afbbf89-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 14:29:18 crc kubenswrapper[4735]: I1008 14:29:18.323720 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-p8gzl" event={"ID":"bc2f1bfd-72c0-4de9-a29e-a8258afbbf89","Type":"ContainerDied","Data":"ea11f39b160d6d3caebfaf87e0196e7869ebf54c6e255d7cafb945f67a6d86c3"} Oct 08 14:29:18 crc kubenswrapper[4735]: I1008 14:29:18.323768 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea11f39b160d6d3caebfaf87e0196e7869ebf54c6e255d7cafb945f67a6d86c3" Oct 08 14:29:18 crc kubenswrapper[4735]: I1008 14:29:18.323802 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-p8gzl" Oct 08 14:29:18 crc kubenswrapper[4735]: I1008 14:29:18.397421 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 08 14:29:18 crc kubenswrapper[4735]: E1008 14:29:18.397775 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc2f1bfd-72c0-4de9-a29e-a8258afbbf89" containerName="nova-cell0-conductor-db-sync" Oct 08 14:29:18 crc kubenswrapper[4735]: I1008 14:29:18.397794 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc2f1bfd-72c0-4de9-a29e-a8258afbbf89" containerName="nova-cell0-conductor-db-sync" Oct 08 14:29:18 crc kubenswrapper[4735]: I1008 14:29:18.397973 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc2f1bfd-72c0-4de9-a29e-a8258afbbf89" containerName="nova-cell0-conductor-db-sync" Oct 08 14:29:18 crc kubenswrapper[4735]: I1008 14:29:18.398857 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 08 14:29:18 crc kubenswrapper[4735]: I1008 14:29:18.401167 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-nr4gc" Oct 08 14:29:18 crc kubenswrapper[4735]: I1008 14:29:18.402114 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 08 14:29:18 crc kubenswrapper[4735]: I1008 14:29:18.406634 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 08 14:29:18 crc kubenswrapper[4735]: I1008 14:29:18.534405 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba01536e-ef13-4daf-a7a2-8405ee0bef12-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"ba01536e-ef13-4daf-a7a2-8405ee0bef12\") " pod="openstack/nova-cell0-conductor-0" Oct 08 14:29:18 crc kubenswrapper[4735]: I1008 
14:29:18.534760 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdknf\" (UniqueName: \"kubernetes.io/projected/ba01536e-ef13-4daf-a7a2-8405ee0bef12-kube-api-access-wdknf\") pod \"nova-cell0-conductor-0\" (UID: \"ba01536e-ef13-4daf-a7a2-8405ee0bef12\") " pod="openstack/nova-cell0-conductor-0" Oct 08 14:29:18 crc kubenswrapper[4735]: I1008 14:29:18.534859 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba01536e-ef13-4daf-a7a2-8405ee0bef12-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"ba01536e-ef13-4daf-a7a2-8405ee0bef12\") " pod="openstack/nova-cell0-conductor-0" Oct 08 14:29:18 crc kubenswrapper[4735]: I1008 14:29:18.636261 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba01536e-ef13-4daf-a7a2-8405ee0bef12-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"ba01536e-ef13-4daf-a7a2-8405ee0bef12\") " pod="openstack/nova-cell0-conductor-0" Oct 08 14:29:18 crc kubenswrapper[4735]: I1008 14:29:18.636334 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdknf\" (UniqueName: \"kubernetes.io/projected/ba01536e-ef13-4daf-a7a2-8405ee0bef12-kube-api-access-wdknf\") pod \"nova-cell0-conductor-0\" (UID: \"ba01536e-ef13-4daf-a7a2-8405ee0bef12\") " pod="openstack/nova-cell0-conductor-0" Oct 08 14:29:18 crc kubenswrapper[4735]: I1008 14:29:18.636367 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba01536e-ef13-4daf-a7a2-8405ee0bef12-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"ba01536e-ef13-4daf-a7a2-8405ee0bef12\") " pod="openstack/nova-cell0-conductor-0" Oct 08 14:29:18 crc kubenswrapper[4735]: I1008 14:29:18.642036 4735 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba01536e-ef13-4daf-a7a2-8405ee0bef12-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"ba01536e-ef13-4daf-a7a2-8405ee0bef12\") " pod="openstack/nova-cell0-conductor-0" Oct 08 14:29:18 crc kubenswrapper[4735]: I1008 14:29:18.642101 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba01536e-ef13-4daf-a7a2-8405ee0bef12-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"ba01536e-ef13-4daf-a7a2-8405ee0bef12\") " pod="openstack/nova-cell0-conductor-0" Oct 08 14:29:18 crc kubenswrapper[4735]: I1008 14:29:18.654051 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdknf\" (UniqueName: \"kubernetes.io/projected/ba01536e-ef13-4daf-a7a2-8405ee0bef12-kube-api-access-wdknf\") pod \"nova-cell0-conductor-0\" (UID: \"ba01536e-ef13-4daf-a7a2-8405ee0bef12\") " pod="openstack/nova-cell0-conductor-0" Oct 08 14:29:18 crc kubenswrapper[4735]: I1008 14:29:18.716975 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 08 14:29:19 crc kubenswrapper[4735]: I1008 14:29:19.155074 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 08 14:29:19 crc kubenswrapper[4735]: W1008 14:29:19.162684 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba01536e_ef13_4daf_a7a2_8405ee0bef12.slice/crio-3c2dd9a81c0e4ba67036d5009fceb6ed99477dd1b803877f02c63b341d2243b4 WatchSource:0}: Error finding container 3c2dd9a81c0e4ba67036d5009fceb6ed99477dd1b803877f02c63b341d2243b4: Status 404 returned error can't find the container with id 3c2dd9a81c0e4ba67036d5009fceb6ed99477dd1b803877f02c63b341d2243b4 Oct 08 14:29:19 crc kubenswrapper[4735]: I1008 14:29:19.331919 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"ba01536e-ef13-4daf-a7a2-8405ee0bef12","Type":"ContainerStarted","Data":"3c2dd9a81c0e4ba67036d5009fceb6ed99477dd1b803877f02c63b341d2243b4"} Oct 08 14:29:20 crc kubenswrapper[4735]: I1008 14:29:20.342296 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"ba01536e-ef13-4daf-a7a2-8405ee0bef12","Type":"ContainerStarted","Data":"66b8bfd99e6b442bb6d1b99758ed5461e8abf5762a1f9656067bc42332198452"} Oct 08 14:29:20 crc kubenswrapper[4735]: I1008 14:29:20.342736 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 08 14:29:20 crc kubenswrapper[4735]: I1008 14:29:20.372576 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.372561697 podStartE2EDuration="2.372561697s" podCreationTimestamp="2025-10-08 14:29:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 
14:29:20.365021904 +0000 UTC m=+5610.173326448" watchObservedRunningTime="2025-10-08 14:29:20.372561697 +0000 UTC m=+5610.180866241" Oct 08 14:29:28 crc kubenswrapper[4735]: I1008 14:29:28.760229 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.229441 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4s8fp"] Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.231662 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4s8fp" Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.239334 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4s8fp"] Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.317627 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6454p\" (UniqueName: \"kubernetes.io/projected/d4b43223-9847-4b56-b7f7-6a3062f4fb30-kube-api-access-6454p\") pod \"certified-operators-4s8fp\" (UID: \"d4b43223-9847-4b56-b7f7-6a3062f4fb30\") " pod="openshift-marketplace/certified-operators-4s8fp" Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.317674 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4b43223-9847-4b56-b7f7-6a3062f4fb30-catalog-content\") pod \"certified-operators-4s8fp\" (UID: \"d4b43223-9847-4b56-b7f7-6a3062f4fb30\") " pod="openshift-marketplace/certified-operators-4s8fp" Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.317705 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4b43223-9847-4b56-b7f7-6a3062f4fb30-utilities\") pod \"certified-operators-4s8fp\" (UID: 
\"d4b43223-9847-4b56-b7f7-6a3062f4fb30\") " pod="openshift-marketplace/certified-operators-4s8fp" Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.412645 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-j9m8t"] Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.414052 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-j9m8t" Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.419079 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.419283 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.420301 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6454p\" (UniqueName: \"kubernetes.io/projected/d4b43223-9847-4b56-b7f7-6a3062f4fb30-kube-api-access-6454p\") pod \"certified-operators-4s8fp\" (UID: \"d4b43223-9847-4b56-b7f7-6a3062f4fb30\") " pod="openshift-marketplace/certified-operators-4s8fp" Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.420340 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4b43223-9847-4b56-b7f7-6a3062f4fb30-catalog-content\") pod \"certified-operators-4s8fp\" (UID: \"d4b43223-9847-4b56-b7f7-6a3062f4fb30\") " pod="openshift-marketplace/certified-operators-4s8fp" Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.420375 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4b43223-9847-4b56-b7f7-6a3062f4fb30-utilities\") pod \"certified-operators-4s8fp\" (UID: \"d4b43223-9847-4b56-b7f7-6a3062f4fb30\") " pod="openshift-marketplace/certified-operators-4s8fp" Oct 08 14:29:29 crc 
kubenswrapper[4735]: I1008 14:29:29.420936 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4b43223-9847-4b56-b7f7-6a3062f4fb30-catalog-content\") pod \"certified-operators-4s8fp\" (UID: \"d4b43223-9847-4b56-b7f7-6a3062f4fb30\") " pod="openshift-marketplace/certified-operators-4s8fp" Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.420981 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4b43223-9847-4b56-b7f7-6a3062f4fb30-utilities\") pod \"certified-operators-4s8fp\" (UID: \"d4b43223-9847-4b56-b7f7-6a3062f4fb30\") " pod="openshift-marketplace/certified-operators-4s8fp" Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.426646 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-j9m8t"] Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.477924 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6454p\" (UniqueName: \"kubernetes.io/projected/d4b43223-9847-4b56-b7f7-6a3062f4fb30-kube-api-access-6454p\") pod \"certified-operators-4s8fp\" (UID: \"d4b43223-9847-4b56-b7f7-6a3062f4fb30\") " pod="openshift-marketplace/certified-operators-4s8fp" Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.528371 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4cd80f17-b1da-4eaa-a962-01cb94a5abe3-scripts\") pod \"nova-cell0-cell-mapping-j9m8t\" (UID: \"4cd80f17-b1da-4eaa-a962-01cb94a5abe3\") " pod="openstack/nova-cell0-cell-mapping-j9m8t" Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.528433 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cd80f17-b1da-4eaa-a962-01cb94a5abe3-combined-ca-bundle\") pod 
\"nova-cell0-cell-mapping-j9m8t\" (UID: \"4cd80f17-b1da-4eaa-a962-01cb94a5abe3\") " pod="openstack/nova-cell0-cell-mapping-j9m8t" Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.528511 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cd80f17-b1da-4eaa-a962-01cb94a5abe3-config-data\") pod \"nova-cell0-cell-mapping-j9m8t\" (UID: \"4cd80f17-b1da-4eaa-a962-01cb94a5abe3\") " pod="openstack/nova-cell0-cell-mapping-j9m8t" Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.528577 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrbdf\" (UniqueName: \"kubernetes.io/projected/4cd80f17-b1da-4eaa-a962-01cb94a5abe3-kube-api-access-zrbdf\") pod \"nova-cell0-cell-mapping-j9m8t\" (UID: \"4cd80f17-b1da-4eaa-a962-01cb94a5abe3\") " pod="openstack/nova-cell0-cell-mapping-j9m8t" Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.561890 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4s8fp" Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.631376 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cd80f17-b1da-4eaa-a962-01cb94a5abe3-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-j9m8t\" (UID: \"4cd80f17-b1da-4eaa-a962-01cb94a5abe3\") " pod="openstack/nova-cell0-cell-mapping-j9m8t" Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.631448 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cd80f17-b1da-4eaa-a962-01cb94a5abe3-config-data\") pod \"nova-cell0-cell-mapping-j9m8t\" (UID: \"4cd80f17-b1da-4eaa-a962-01cb94a5abe3\") " pod="openstack/nova-cell0-cell-mapping-j9m8t" Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.631505 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrbdf\" (UniqueName: \"kubernetes.io/projected/4cd80f17-b1da-4eaa-a962-01cb94a5abe3-kube-api-access-zrbdf\") pod \"nova-cell0-cell-mapping-j9m8t\" (UID: \"4cd80f17-b1da-4eaa-a962-01cb94a5abe3\") " pod="openstack/nova-cell0-cell-mapping-j9m8t" Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.631582 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4cd80f17-b1da-4eaa-a962-01cb94a5abe3-scripts\") pod \"nova-cell0-cell-mapping-j9m8t\" (UID: \"4cd80f17-b1da-4eaa-a962-01cb94a5abe3\") " pod="openstack/nova-cell0-cell-mapping-j9m8t" Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.647575 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cd80f17-b1da-4eaa-a962-01cb94a5abe3-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-j9m8t\" (UID: \"4cd80f17-b1da-4eaa-a962-01cb94a5abe3\") " 
pod="openstack/nova-cell0-cell-mapping-j9m8t" Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.651219 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.652088 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4cd80f17-b1da-4eaa-a962-01cb94a5abe3-scripts\") pod \"nova-cell0-cell-mapping-j9m8t\" (UID: \"4cd80f17-b1da-4eaa-a962-01cb94a5abe3\") " pod="openstack/nova-cell0-cell-mapping-j9m8t" Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.659225 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cd80f17-b1da-4eaa-a962-01cb94a5abe3-config-data\") pod \"nova-cell0-cell-mapping-j9m8t\" (UID: \"4cd80f17-b1da-4eaa-a962-01cb94a5abe3\") " pod="openstack/nova-cell0-cell-mapping-j9m8t" Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.659630 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.667681 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.687935 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrbdf\" (UniqueName: \"kubernetes.io/projected/4cd80f17-b1da-4eaa-a962-01cb94a5abe3-kube-api-access-zrbdf\") pod \"nova-cell0-cell-mapping-j9m8t\" (UID: \"4cd80f17-b1da-4eaa-a962-01cb94a5abe3\") " pod="openstack/nova-cell0-cell-mapping-j9m8t" Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.689205 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.733864 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec2fb996-87c8-4069-a4d0-af404918cda2-config-data\") pod \"nova-api-0\" (UID: \"ec2fb996-87c8-4069-a4d0-af404918cda2\") " pod="openstack/nova-api-0" Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.733909 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec2fb996-87c8-4069-a4d0-af404918cda2-logs\") pod \"nova-api-0\" (UID: \"ec2fb996-87c8-4069-a4d0-af404918cda2\") " pod="openstack/nova-api-0" Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.733962 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec2fb996-87c8-4069-a4d0-af404918cda2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ec2fb996-87c8-4069-a4d0-af404918cda2\") " pod="openstack/nova-api-0" Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.733995 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-p6lgg\" (UniqueName: \"kubernetes.io/projected/ec2fb996-87c8-4069-a4d0-af404918cda2-kube-api-access-p6lgg\") pod \"nova-api-0\" (UID: \"ec2fb996-87c8-4069-a4d0-af404918cda2\") " pod="openstack/nova-api-0" Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.750982 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-j9m8t" Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.773544 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.774717 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.776713 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.780291 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.781985 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.798574 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.806230 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.814928 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.836274 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec2fb996-87c8-4069-a4d0-af404918cda2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ec2fb996-87c8-4069-a4d0-af404918cda2\") " pod="openstack/nova-api-0" Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.836646 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtkvc\" (UniqueName: \"kubernetes.io/projected/2893a8bf-9eeb-4f6c-bde3-a798e72934fc-kube-api-access-jtkvc\") pod \"nova-metadata-0\" (UID: \"2893a8bf-9eeb-4f6c-bde3-a798e72934fc\") " pod="openstack/nova-metadata-0" Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.836674 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c7d3264-ed20-4cc1-ae4a-c61fa01b3451-config-data\") pod \"nova-scheduler-0\" (UID: \"4c7d3264-ed20-4cc1-ae4a-c61fa01b3451\") " pod="openstack/nova-scheduler-0" Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.836692 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2893a8bf-9eeb-4f6c-bde3-a798e72934fc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"2893a8bf-9eeb-4f6c-bde3-a798e72934fc\") " pod="openstack/nova-metadata-0" Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.836725 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2893a8bf-9eeb-4f6c-bde3-a798e72934fc-config-data\") pod \"nova-metadata-0\" (UID: \"2893a8bf-9eeb-4f6c-bde3-a798e72934fc\") " pod="openstack/nova-metadata-0" Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.836772 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6lgg\" (UniqueName: \"kubernetes.io/projected/ec2fb996-87c8-4069-a4d0-af404918cda2-kube-api-access-p6lgg\") pod \"nova-api-0\" (UID: \"ec2fb996-87c8-4069-a4d0-af404918cda2\") " pod="openstack/nova-api-0" Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.836794 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2893a8bf-9eeb-4f6c-bde3-a798e72934fc-logs\") pod \"nova-metadata-0\" (UID: \"2893a8bf-9eeb-4f6c-bde3-a798e72934fc\") " pod="openstack/nova-metadata-0" Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.836812 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8t2rh\" (UniqueName: \"kubernetes.io/projected/4c7d3264-ed20-4cc1-ae4a-c61fa01b3451-kube-api-access-8t2rh\") pod \"nova-scheduler-0\" (UID: \"4c7d3264-ed20-4cc1-ae4a-c61fa01b3451\") " pod="openstack/nova-scheduler-0" Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.836926 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec2fb996-87c8-4069-a4d0-af404918cda2-config-data\") pod \"nova-api-0\" (UID: \"ec2fb996-87c8-4069-a4d0-af404918cda2\") " pod="openstack/nova-api-0" Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.836949 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec2fb996-87c8-4069-a4d0-af404918cda2-logs\") pod \"nova-api-0\" (UID: \"ec2fb996-87c8-4069-a4d0-af404918cda2\") " pod="openstack/nova-api-0" Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.836990 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c7d3264-ed20-4cc1-ae4a-c61fa01b3451-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4c7d3264-ed20-4cc1-ae4a-c61fa01b3451\") " pod="openstack/nova-scheduler-0" Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.848276 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec2fb996-87c8-4069-a4d0-af404918cda2-logs\") pod \"nova-api-0\" (UID: \"ec2fb996-87c8-4069-a4d0-af404918cda2\") " pod="openstack/nova-api-0" Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.848734 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec2fb996-87c8-4069-a4d0-af404918cda2-config-data\") pod \"nova-api-0\" (UID: \"ec2fb996-87c8-4069-a4d0-af404918cda2\") " pod="openstack/nova-api-0" Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.879343 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec2fb996-87c8-4069-a4d0-af404918cda2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ec2fb996-87c8-4069-a4d0-af404918cda2\") " pod="openstack/nova-api-0" Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.889107 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6lgg\" (UniqueName: \"kubernetes.io/projected/ec2fb996-87c8-4069-a4d0-af404918cda2-kube-api-access-p6lgg\") pod \"nova-api-0\" (UID: \"ec2fb996-87c8-4069-a4d0-af404918cda2\") " 
pod="openstack/nova-api-0" Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.938697 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c7d3264-ed20-4cc1-ae4a-c61fa01b3451-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4c7d3264-ed20-4cc1-ae4a-c61fa01b3451\") " pod="openstack/nova-scheduler-0" Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.945629 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtkvc\" (UniqueName: \"kubernetes.io/projected/2893a8bf-9eeb-4f6c-bde3-a798e72934fc-kube-api-access-jtkvc\") pod \"nova-metadata-0\" (UID: \"2893a8bf-9eeb-4f6c-bde3-a798e72934fc\") " pod="openstack/nova-metadata-0" Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.959372 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c7d3264-ed20-4cc1-ae4a-c61fa01b3451-config-data\") pod \"nova-scheduler-0\" (UID: \"4c7d3264-ed20-4cc1-ae4a-c61fa01b3451\") " pod="openstack/nova-scheduler-0" Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.959643 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2893a8bf-9eeb-4f6c-bde3-a798e72934fc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2893a8bf-9eeb-4f6c-bde3-a798e72934fc\") " pod="openstack/nova-metadata-0" Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.959787 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2893a8bf-9eeb-4f6c-bde3-a798e72934fc-config-data\") pod \"nova-metadata-0\" (UID: \"2893a8bf-9eeb-4f6c-bde3-a798e72934fc\") " pod="openstack/nova-metadata-0" Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.959988 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/2893a8bf-9eeb-4f6c-bde3-a798e72934fc-logs\") pod \"nova-metadata-0\" (UID: \"2893a8bf-9eeb-4f6c-bde3-a798e72934fc\") " pod="openstack/nova-metadata-0" Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.960646 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8t2rh\" (UniqueName: \"kubernetes.io/projected/4c7d3264-ed20-4cc1-ae4a-c61fa01b3451-kube-api-access-8t2rh\") pod \"nova-scheduler-0\" (UID: \"4c7d3264-ed20-4cc1-ae4a-c61fa01b3451\") " pod="openstack/nova-scheduler-0" Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.943704 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75bb999bbc-tspnr"] Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.971220 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2893a8bf-9eeb-4f6c-bde3-a798e72934fc-logs\") pod \"nova-metadata-0\" (UID: \"2893a8bf-9eeb-4f6c-bde3-a798e72934fc\") " pod="openstack/nova-metadata-0" Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.975846 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75bb999bbc-tspnr" Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.978136 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c7d3264-ed20-4cc1-ae4a-c61fa01b3451-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4c7d3264-ed20-4cc1-ae4a-c61fa01b3451\") " pod="openstack/nova-scheduler-0" Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.982736 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2893a8bf-9eeb-4f6c-bde3-a798e72934fc-config-data\") pod \"nova-metadata-0\" (UID: \"2893a8bf-9eeb-4f6c-bde3-a798e72934fc\") " pod="openstack/nova-metadata-0" Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.982785 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c7d3264-ed20-4cc1-ae4a-c61fa01b3451-config-data\") pod \"nova-scheduler-0\" (UID: \"4c7d3264-ed20-4cc1-ae4a-c61fa01b3451\") " pod="openstack/nova-scheduler-0" Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.982842 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2893a8bf-9eeb-4f6c-bde3-a798e72934fc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2893a8bf-9eeb-4f6c-bde3-a798e72934fc\") " pod="openstack/nova-metadata-0" Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.984965 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtkvc\" (UniqueName: \"kubernetes.io/projected/2893a8bf-9eeb-4f6c-bde3-a798e72934fc-kube-api-access-jtkvc\") pod \"nova-metadata-0\" (UID: \"2893a8bf-9eeb-4f6c-bde3-a798e72934fc\") " pod="openstack/nova-metadata-0" Oct 08 14:29:29 crc kubenswrapper[4735]: I1008 14:29:29.987815 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-8t2rh\" (UniqueName: \"kubernetes.io/projected/4c7d3264-ed20-4cc1-ae4a-c61fa01b3451-kube-api-access-8t2rh\") pod \"nova-scheduler-0\" (UID: \"4c7d3264-ed20-4cc1-ae4a-c61fa01b3451\") " pod="openstack/nova-scheduler-0" Oct 08 14:29:30 crc kubenswrapper[4735]: I1008 14:29:30.070589 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba27f8b5-fea5-4552-b5ce-40415d883627-ovsdbserver-nb\") pod \"dnsmasq-dns-75bb999bbc-tspnr\" (UID: \"ba27f8b5-fea5-4552-b5ce-40415d883627\") " pod="openstack/dnsmasq-dns-75bb999bbc-tspnr" Oct 08 14:29:30 crc kubenswrapper[4735]: I1008 14:29:30.077450 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 14:29:30 crc kubenswrapper[4735]: I1008 14:29:30.086304 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba27f8b5-fea5-4552-b5ce-40415d883627-ovsdbserver-sb\") pod \"dnsmasq-dns-75bb999bbc-tspnr\" (UID: \"ba27f8b5-fea5-4552-b5ce-40415d883627\") " pod="openstack/dnsmasq-dns-75bb999bbc-tspnr" Oct 08 14:29:30 crc kubenswrapper[4735]: I1008 14:29:30.086326 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba27f8b5-fea5-4552-b5ce-40415d883627-config\") pod \"dnsmasq-dns-75bb999bbc-tspnr\" (UID: \"ba27f8b5-fea5-4552-b5ce-40415d883627\") " pod="openstack/dnsmasq-dns-75bb999bbc-tspnr" Oct 08 14:29:30 crc kubenswrapper[4735]: I1008 14:29:30.086456 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smpmv\" (UniqueName: \"kubernetes.io/projected/ba27f8b5-fea5-4552-b5ce-40415d883627-kube-api-access-smpmv\") pod \"dnsmasq-dns-75bb999bbc-tspnr\" (UID: \"ba27f8b5-fea5-4552-b5ce-40415d883627\") " 
pod="openstack/dnsmasq-dns-75bb999bbc-tspnr" Oct 08 14:29:30 crc kubenswrapper[4735]: I1008 14:29:30.086701 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba27f8b5-fea5-4552-b5ce-40415d883627-dns-svc\") pod \"dnsmasq-dns-75bb999bbc-tspnr\" (UID: \"ba27f8b5-fea5-4552-b5ce-40415d883627\") " pod="openstack/dnsmasq-dns-75bb999bbc-tspnr" Oct 08 14:29:30 crc kubenswrapper[4735]: I1008 14:29:30.105662 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 08 14:29:30 crc kubenswrapper[4735]: I1008 14:29:30.106323 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75bb999bbc-tspnr"] Oct 08 14:29:30 crc kubenswrapper[4735]: I1008 14:29:30.115321 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 14:29:30 crc kubenswrapper[4735]: I1008 14:29:30.127269 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 08 14:29:30 crc kubenswrapper[4735]: I1008 14:29:30.129468 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 08 14:29:30 crc kubenswrapper[4735]: I1008 14:29:30.178899 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4s8fp"] Oct 08 14:29:30 crc kubenswrapper[4735]: I1008 14:29:30.190673 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64032e23-f67c-4a29-85b3-ccbc2d2145f0-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"64032e23-f67c-4a29-85b3-ccbc2d2145f0\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 14:29:30 crc kubenswrapper[4735]: I1008 14:29:30.190758 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba27f8b5-fea5-4552-b5ce-40415d883627-dns-svc\") pod \"dnsmasq-dns-75bb999bbc-tspnr\" (UID: \"ba27f8b5-fea5-4552-b5ce-40415d883627\") " pod="openstack/dnsmasq-dns-75bb999bbc-tspnr" Oct 08 14:29:30 crc kubenswrapper[4735]: I1008 14:29:30.190799 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzl5t\" (UniqueName: \"kubernetes.io/projected/64032e23-f67c-4a29-85b3-ccbc2d2145f0-kube-api-access-bzl5t\") pod \"nova-cell1-novncproxy-0\" (UID: \"64032e23-f67c-4a29-85b3-ccbc2d2145f0\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 14:29:30 crc kubenswrapper[4735]: I1008 14:29:30.190836 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba27f8b5-fea5-4552-b5ce-40415d883627-ovsdbserver-nb\") pod \"dnsmasq-dns-75bb999bbc-tspnr\" (UID: \"ba27f8b5-fea5-4552-b5ce-40415d883627\") " pod="openstack/dnsmasq-dns-75bb999bbc-tspnr" Oct 08 14:29:30 crc kubenswrapper[4735]: I1008 14:29:30.190864 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/ba27f8b5-fea5-4552-b5ce-40415d883627-ovsdbserver-sb\") pod \"dnsmasq-dns-75bb999bbc-tspnr\" (UID: \"ba27f8b5-fea5-4552-b5ce-40415d883627\") " pod="openstack/dnsmasq-dns-75bb999bbc-tspnr" Oct 08 14:29:30 crc kubenswrapper[4735]: I1008 14:29:30.190882 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba27f8b5-fea5-4552-b5ce-40415d883627-config\") pod \"dnsmasq-dns-75bb999bbc-tspnr\" (UID: \"ba27f8b5-fea5-4552-b5ce-40415d883627\") " pod="openstack/dnsmasq-dns-75bb999bbc-tspnr" Oct 08 14:29:30 crc kubenswrapper[4735]: I1008 14:29:30.190924 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smpmv\" (UniqueName: \"kubernetes.io/projected/ba27f8b5-fea5-4552-b5ce-40415d883627-kube-api-access-smpmv\") pod \"dnsmasq-dns-75bb999bbc-tspnr\" (UID: \"ba27f8b5-fea5-4552-b5ce-40415d883627\") " pod="openstack/dnsmasq-dns-75bb999bbc-tspnr" Oct 08 14:29:30 crc kubenswrapper[4735]: I1008 14:29:30.190955 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64032e23-f67c-4a29-85b3-ccbc2d2145f0-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"64032e23-f67c-4a29-85b3-ccbc2d2145f0\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 14:29:30 crc kubenswrapper[4735]: I1008 14:29:30.191883 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba27f8b5-fea5-4552-b5ce-40415d883627-dns-svc\") pod \"dnsmasq-dns-75bb999bbc-tspnr\" (UID: \"ba27f8b5-fea5-4552-b5ce-40415d883627\") " pod="openstack/dnsmasq-dns-75bb999bbc-tspnr" Oct 08 14:29:30 crc kubenswrapper[4735]: I1008 14:29:30.192299 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba27f8b5-fea5-4552-b5ce-40415d883627-ovsdbserver-sb\") pod 
\"dnsmasq-dns-75bb999bbc-tspnr\" (UID: \"ba27f8b5-fea5-4552-b5ce-40415d883627\") " pod="openstack/dnsmasq-dns-75bb999bbc-tspnr" Oct 08 14:29:30 crc kubenswrapper[4735]: I1008 14:29:30.192542 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba27f8b5-fea5-4552-b5ce-40415d883627-config\") pod \"dnsmasq-dns-75bb999bbc-tspnr\" (UID: \"ba27f8b5-fea5-4552-b5ce-40415d883627\") " pod="openstack/dnsmasq-dns-75bb999bbc-tspnr" Oct 08 14:29:30 crc kubenswrapper[4735]: I1008 14:29:30.192990 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba27f8b5-fea5-4552-b5ce-40415d883627-ovsdbserver-nb\") pod \"dnsmasq-dns-75bb999bbc-tspnr\" (UID: \"ba27f8b5-fea5-4552-b5ce-40415d883627\") " pod="openstack/dnsmasq-dns-75bb999bbc-tspnr" Oct 08 14:29:30 crc kubenswrapper[4735]: I1008 14:29:30.226894 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smpmv\" (UniqueName: \"kubernetes.io/projected/ba27f8b5-fea5-4552-b5ce-40415d883627-kube-api-access-smpmv\") pod \"dnsmasq-dns-75bb999bbc-tspnr\" (UID: \"ba27f8b5-fea5-4552-b5ce-40415d883627\") " pod="openstack/dnsmasq-dns-75bb999bbc-tspnr" Oct 08 14:29:30 crc kubenswrapper[4735]: I1008 14:29:30.240499 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 14:29:30 crc kubenswrapper[4735]: I1008 14:29:30.285121 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 14:29:30 crc kubenswrapper[4735]: I1008 14:29:30.294511 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzl5t\" (UniqueName: \"kubernetes.io/projected/64032e23-f67c-4a29-85b3-ccbc2d2145f0-kube-api-access-bzl5t\") pod \"nova-cell1-novncproxy-0\" (UID: \"64032e23-f67c-4a29-85b3-ccbc2d2145f0\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 14:29:30 crc kubenswrapper[4735]: I1008 14:29:30.294653 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64032e23-f67c-4a29-85b3-ccbc2d2145f0-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"64032e23-f67c-4a29-85b3-ccbc2d2145f0\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 14:29:30 crc kubenswrapper[4735]: I1008 14:29:30.294715 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64032e23-f67c-4a29-85b3-ccbc2d2145f0-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"64032e23-f67c-4a29-85b3-ccbc2d2145f0\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 14:29:30 crc kubenswrapper[4735]: I1008 14:29:30.298950 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64032e23-f67c-4a29-85b3-ccbc2d2145f0-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"64032e23-f67c-4a29-85b3-ccbc2d2145f0\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 14:29:30 crc kubenswrapper[4735]: I1008 14:29:30.309286 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64032e23-f67c-4a29-85b3-ccbc2d2145f0-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"64032e23-f67c-4a29-85b3-ccbc2d2145f0\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 14:29:30 crc kubenswrapper[4735]: I1008 
14:29:30.320463 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzl5t\" (UniqueName: \"kubernetes.io/projected/64032e23-f67c-4a29-85b3-ccbc2d2145f0-kube-api-access-bzl5t\") pod \"nova-cell1-novncproxy-0\" (UID: \"64032e23-f67c-4a29-85b3-ccbc2d2145f0\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 14:29:30 crc kubenswrapper[4735]: I1008 14:29:30.352894 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75bb999bbc-tspnr" Oct 08 14:29:30 crc kubenswrapper[4735]: I1008 14:29:30.431955 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 08 14:29:30 crc kubenswrapper[4735]: I1008 14:29:30.467289 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4s8fp" event={"ID":"d4b43223-9847-4b56-b7f7-6a3062f4fb30","Type":"ContainerStarted","Data":"71b8f693be6728fe4ad0d0cb0503eaa337851d9bcd7e30e068672d5b63292147"} Oct 08 14:29:30 crc kubenswrapper[4735]: I1008 14:29:30.516235 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-j9m8t"] Oct 08 14:29:31 crc kubenswrapper[4735]: I1008 14:29:30.823208 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 08 14:29:31 crc kubenswrapper[4735]: I1008 14:29:31.376819 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-hvjqf"] Oct 08 14:29:31 crc kubenswrapper[4735]: I1008 14:29:31.378996 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-hvjqf" Oct 08 14:29:31 crc kubenswrapper[4735]: I1008 14:29:31.381462 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Oct 08 14:29:31 crc kubenswrapper[4735]: I1008 14:29:31.391237 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 08 14:29:31 crc kubenswrapper[4735]: I1008 14:29:31.392175 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-hvjqf"] Oct 08 14:29:31 crc kubenswrapper[4735]: I1008 14:29:31.492042 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ec2fb996-87c8-4069-a4d0-af404918cda2","Type":"ContainerStarted","Data":"7e948bf5a1af4b4d92fc8291a27aa63e6d141174f6a822b62aff3aac3d9d4e02"} Oct 08 14:29:31 crc kubenswrapper[4735]: I1008 14:29:31.492092 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ec2fb996-87c8-4069-a4d0-af404918cda2","Type":"ContainerStarted","Data":"a838c54121f4a71889f3c25b80f54c3e8724e811e8f2846ed125a17510a11c1a"} Oct 08 14:29:31 crc kubenswrapper[4735]: I1008 14:29:31.492104 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ec2fb996-87c8-4069-a4d0-af404918cda2","Type":"ContainerStarted","Data":"9ca0e080516bb30b5df4b5e6b7d392649d9acf10e5cd1005907847950c08dcdc"} Oct 08 14:29:31 crc kubenswrapper[4735]: I1008 14:29:31.499042 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-j9m8t" event={"ID":"4cd80f17-b1da-4eaa-a962-01cb94a5abe3","Type":"ContainerStarted","Data":"55ef97d3905a3568ca8237e879eca8c1eca6f098eeefb397bdbe6c5cc8a29e56"} Oct 08 14:29:31 crc kubenswrapper[4735]: I1008 14:29:31.499076 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-j9m8t" 
event={"ID":"4cd80f17-b1da-4eaa-a962-01cb94a5abe3","Type":"ContainerStarted","Data":"a7bc4db0a7a063f3efef04f6d3aa86f318e6d36847019200e681de959a42adc5"} Oct 08 14:29:31 crc kubenswrapper[4735]: I1008 14:29:31.502304 4735 generic.go:334] "Generic (PLEG): container finished" podID="d4b43223-9847-4b56-b7f7-6a3062f4fb30" containerID="f2ca4cb1d3d5e4a1acf19f5b6919031f4009d8d11aa3d4b4540ca0139d129537" exitCode=0 Oct 08 14:29:31 crc kubenswrapper[4735]: I1008 14:29:31.502334 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4s8fp" event={"ID":"d4b43223-9847-4b56-b7f7-6a3062f4fb30","Type":"ContainerDied","Data":"f2ca4cb1d3d5e4a1acf19f5b6919031f4009d8d11aa3d4b4540ca0139d129537"} Oct 08 14:29:31 crc kubenswrapper[4735]: I1008 14:29:31.516667 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q82sq\" (UniqueName: \"kubernetes.io/projected/6025ac38-ff1c-464a-84a4-75dfabd4bb94-kube-api-access-q82sq\") pod \"nova-cell1-conductor-db-sync-hvjqf\" (UID: \"6025ac38-ff1c-464a-84a4-75dfabd4bb94\") " pod="openstack/nova-cell1-conductor-db-sync-hvjqf" Oct 08 14:29:31 crc kubenswrapper[4735]: I1008 14:29:31.516748 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6025ac38-ff1c-464a-84a4-75dfabd4bb94-scripts\") pod \"nova-cell1-conductor-db-sync-hvjqf\" (UID: \"6025ac38-ff1c-464a-84a4-75dfabd4bb94\") " pod="openstack/nova-cell1-conductor-db-sync-hvjqf" Oct 08 14:29:31 crc kubenswrapper[4735]: I1008 14:29:31.516806 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6025ac38-ff1c-464a-84a4-75dfabd4bb94-config-data\") pod \"nova-cell1-conductor-db-sync-hvjqf\" (UID: \"6025ac38-ff1c-464a-84a4-75dfabd4bb94\") " pod="openstack/nova-cell1-conductor-db-sync-hvjqf" Oct 08 14:29:31 crc 
kubenswrapper[4735]: I1008 14:29:31.516906 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6025ac38-ff1c-464a-84a4-75dfabd4bb94-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-hvjqf\" (UID: \"6025ac38-ff1c-464a-84a4-75dfabd4bb94\") " pod="openstack/nova-cell1-conductor-db-sync-hvjqf" Oct 08 14:29:31 crc kubenswrapper[4735]: I1008 14:29:31.521197 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.521173105 podStartE2EDuration="2.521173105s" podCreationTimestamp="2025-10-08 14:29:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:29:31.516079705 +0000 UTC m=+5621.324384239" watchObservedRunningTime="2025-10-08 14:29:31.521173105 +0000 UTC m=+5621.329477649" Oct 08 14:29:31 crc kubenswrapper[4735]: I1008 14:29:31.545530 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-j9m8t" podStartSLOduration=2.545509098 podStartE2EDuration="2.545509098s" podCreationTimestamp="2025-10-08 14:29:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:29:31.541503356 +0000 UTC m=+5621.349807920" watchObservedRunningTime="2025-10-08 14:29:31.545509098 +0000 UTC m=+5621.353813652" Oct 08 14:29:31 crc kubenswrapper[4735]: I1008 14:29:31.618123 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6025ac38-ff1c-464a-84a4-75dfabd4bb94-config-data\") pod \"nova-cell1-conductor-db-sync-hvjqf\" (UID: \"6025ac38-ff1c-464a-84a4-75dfabd4bb94\") " pod="openstack/nova-cell1-conductor-db-sync-hvjqf" Oct 08 14:29:31 crc kubenswrapper[4735]: I1008 14:29:31.618447 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6025ac38-ff1c-464a-84a4-75dfabd4bb94-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-hvjqf\" (UID: \"6025ac38-ff1c-464a-84a4-75dfabd4bb94\") " pod="openstack/nova-cell1-conductor-db-sync-hvjqf" Oct 08 14:29:31 crc kubenswrapper[4735]: I1008 14:29:31.618613 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q82sq\" (UniqueName: \"kubernetes.io/projected/6025ac38-ff1c-464a-84a4-75dfabd4bb94-kube-api-access-q82sq\") pod \"nova-cell1-conductor-db-sync-hvjqf\" (UID: \"6025ac38-ff1c-464a-84a4-75dfabd4bb94\") " pod="openstack/nova-cell1-conductor-db-sync-hvjqf" Oct 08 14:29:31 crc kubenswrapper[4735]: I1008 14:29:31.619716 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6025ac38-ff1c-464a-84a4-75dfabd4bb94-scripts\") pod \"nova-cell1-conductor-db-sync-hvjqf\" (UID: \"6025ac38-ff1c-464a-84a4-75dfabd4bb94\") " pod="openstack/nova-cell1-conductor-db-sync-hvjqf" Oct 08 14:29:31 crc kubenswrapper[4735]: I1008 14:29:31.627599 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6025ac38-ff1c-464a-84a4-75dfabd4bb94-config-data\") pod \"nova-cell1-conductor-db-sync-hvjqf\" (UID: \"6025ac38-ff1c-464a-84a4-75dfabd4bb94\") " pod="openstack/nova-cell1-conductor-db-sync-hvjqf" Oct 08 14:29:31 crc kubenswrapper[4735]: I1008 14:29:31.637950 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6025ac38-ff1c-464a-84a4-75dfabd4bb94-scripts\") pod \"nova-cell1-conductor-db-sync-hvjqf\" (UID: \"6025ac38-ff1c-464a-84a4-75dfabd4bb94\") " pod="openstack/nova-cell1-conductor-db-sync-hvjqf" Oct 08 14:29:31 crc kubenswrapper[4735]: I1008 14:29:31.645715 4735 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-q82sq\" (UniqueName: \"kubernetes.io/projected/6025ac38-ff1c-464a-84a4-75dfabd4bb94-kube-api-access-q82sq\") pod \"nova-cell1-conductor-db-sync-hvjqf\" (UID: \"6025ac38-ff1c-464a-84a4-75dfabd4bb94\") " pod="openstack/nova-cell1-conductor-db-sync-hvjqf" Oct 08 14:29:31 crc kubenswrapper[4735]: I1008 14:29:31.646585 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6025ac38-ff1c-464a-84a4-75dfabd4bb94-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-hvjqf\" (UID: \"6025ac38-ff1c-464a-84a4-75dfabd4bb94\") " pod="openstack/nova-cell1-conductor-db-sync-hvjqf" Oct 08 14:29:31 crc kubenswrapper[4735]: I1008 14:29:31.656542 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75bb999bbc-tspnr"] Oct 08 14:29:31 crc kubenswrapper[4735]: I1008 14:29:31.682873 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 14:29:31 crc kubenswrapper[4735]: I1008 14:29:31.707925 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-hvjqf" Oct 08 14:29:31 crc kubenswrapper[4735]: I1008 14:29:31.849028 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 14:29:31 crc kubenswrapper[4735]: I1008 14:29:31.860068 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 14:29:32 crc kubenswrapper[4735]: I1008 14:29:32.235347 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-hvjqf"] Oct 08 14:29:32 crc kubenswrapper[4735]: W1008 14:29:32.235530 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6025ac38_ff1c_464a_84a4_75dfabd4bb94.slice/crio-4b17324e4e9a159e8b135a228a7509c0f1541dfd9a81688bb2e077afcbfb010c WatchSource:0}: Error finding container 4b17324e4e9a159e8b135a228a7509c0f1541dfd9a81688bb2e077afcbfb010c: Status 404 returned error can't find the container with id 4b17324e4e9a159e8b135a228a7509c0f1541dfd9a81688bb2e077afcbfb010c Oct 08 14:29:32 crc kubenswrapper[4735]: I1008 14:29:32.580846 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2893a8bf-9eeb-4f6c-bde3-a798e72934fc","Type":"ContainerStarted","Data":"962b71adb3cbc3a0fffe0d64e642551bf3d97acd2185a513dabf6b509218ebe6"} Oct 08 14:29:32 crc kubenswrapper[4735]: I1008 14:29:32.581038 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2893a8bf-9eeb-4f6c-bde3-a798e72934fc","Type":"ContainerStarted","Data":"212668e2aa32c27d8fbe787d29c11acd75e1fd7cae78eade44b15848fa2d88b6"} Oct 08 14:29:32 crc kubenswrapper[4735]: I1008 14:29:32.581054 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"2893a8bf-9eeb-4f6c-bde3-a798e72934fc","Type":"ContainerStarted","Data":"997bfc0e8d3dd5798976ec15d1ac47a89c8b6713909547bdf4fd2146a572a59b"} Oct 08 14:29:32 crc kubenswrapper[4735]: I1008 14:29:32.600949 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4c7d3264-ed20-4cc1-ae4a-c61fa01b3451","Type":"ContainerStarted","Data":"bb387538298d15636b8e064921a71f471b4f2e5911f8af470e6649138226a71a"} Oct 08 14:29:32 crc kubenswrapper[4735]: I1008 14:29:32.601017 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4c7d3264-ed20-4cc1-ae4a-c61fa01b3451","Type":"ContainerStarted","Data":"2e310303f7388214582368bdc4b46dea0281c0b7270b59edbeafe1b0e8683053"} Oct 08 14:29:32 crc kubenswrapper[4735]: I1008 14:29:32.613365 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-hvjqf" event={"ID":"6025ac38-ff1c-464a-84a4-75dfabd4bb94","Type":"ContainerStarted","Data":"c2e7af3ee59425c649664e213c8bbc570af803ac6017912034e3330595461a1c"} Oct 08 14:29:32 crc kubenswrapper[4735]: I1008 14:29:32.613413 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-hvjqf" event={"ID":"6025ac38-ff1c-464a-84a4-75dfabd4bb94","Type":"ContainerStarted","Data":"4b17324e4e9a159e8b135a228a7509c0f1541dfd9a81688bb2e077afcbfb010c"} Oct 08 14:29:32 crc kubenswrapper[4735]: I1008 14:29:32.626726 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"64032e23-f67c-4a29-85b3-ccbc2d2145f0","Type":"ContainerStarted","Data":"c1bc7e01ea32369ff95f0f3a1a651202e9da304c83d2d3234d584081b50064cc"} Oct 08 14:29:32 crc kubenswrapper[4735]: I1008 14:29:32.626768 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"64032e23-f67c-4a29-85b3-ccbc2d2145f0","Type":"ContainerStarted","Data":"2d42b6c84d58a83b723bc0114b0cfefad111f15505dab5a6f963fd4d529e9aa3"} Oct 08 14:29:32 crc kubenswrapper[4735]: I1008 14:29:32.652805 4735 generic.go:334] "Generic (PLEG): container finished" podID="ba27f8b5-fea5-4552-b5ce-40415d883627" containerID="0267607a538f33574467b59b6c1ee4cfcd59808229a1ac60dc6472b3c256316e" exitCode=0 Oct 08 14:29:32 crc kubenswrapper[4735]: I1008 14:29:32.653550 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bb999bbc-tspnr" event={"ID":"ba27f8b5-fea5-4552-b5ce-40415d883627","Type":"ContainerDied","Data":"0267607a538f33574467b59b6c1ee4cfcd59808229a1ac60dc6472b3c256316e"} Oct 08 14:29:32 crc kubenswrapper[4735]: I1008 14:29:32.653617 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bb999bbc-tspnr" event={"ID":"ba27f8b5-fea5-4552-b5ce-40415d883627","Type":"ContainerStarted","Data":"e254c8ab4cba14d447631690a5ad1226a032eb72a0e7ad0ee1e3905417a67300"} Oct 08 14:29:32 crc kubenswrapper[4735]: I1008 14:29:32.735366 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.735336817 podStartE2EDuration="3.735336817s" podCreationTimestamp="2025-10-08 14:29:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:29:32.621154923 +0000 UTC m=+5622.429459467" watchObservedRunningTime="2025-10-08 14:29:32.735336817 +0000 UTC m=+5622.543641361" Oct 08 14:29:32 crc kubenswrapper[4735]: I1008 14:29:32.808381 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.808360597 podStartE2EDuration="3.808360597s" podCreationTimestamp="2025-10-08 14:29:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-08 14:29:32.645030944 +0000 UTC m=+5622.453335488" watchObservedRunningTime="2025-10-08 14:29:32.808360597 +0000 UTC m=+5622.616665131" Oct 08 14:29:32 crc kubenswrapper[4735]: I1008 14:29:32.814955 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-hvjqf" podStartSLOduration=1.814936886 podStartE2EDuration="1.814936886s" podCreationTimestamp="2025-10-08 14:29:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:29:32.695391004 +0000 UTC m=+5622.503695568" watchObservedRunningTime="2025-10-08 14:29:32.814936886 +0000 UTC m=+5622.623241430" Oct 08 14:29:32 crc kubenswrapper[4735]: I1008 14:29:32.842419 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.842366028 podStartE2EDuration="3.842366028s" podCreationTimestamp="2025-10-08 14:29:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:29:32.756418667 +0000 UTC m=+5622.564723201" watchObservedRunningTime="2025-10-08 14:29:32.842366028 +0000 UTC m=+5622.650670572" Oct 08 14:29:33 crc kubenswrapper[4735]: I1008 14:29:33.667088 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bb999bbc-tspnr" event={"ID":"ba27f8b5-fea5-4552-b5ce-40415d883627","Type":"ContainerStarted","Data":"011af5768c65232d1025dd12d7b7598b8a636dfe23290854dae0231044287aa2"} Oct 08 14:29:33 crc kubenswrapper[4735]: I1008 14:29:33.668708 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-75bb999bbc-tspnr" Oct 08 14:29:33 crc kubenswrapper[4735]: I1008 14:29:33.671047 4735 generic.go:334] "Generic (PLEG): container finished" podID="d4b43223-9847-4b56-b7f7-6a3062f4fb30" 
containerID="ac023b0bd0b29b8120389e9b85c3275980098c22cc6b8961eb3421f2f432b908" exitCode=0 Oct 08 14:29:33 crc kubenswrapper[4735]: I1008 14:29:33.671931 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4s8fp" event={"ID":"d4b43223-9847-4b56-b7f7-6a3062f4fb30","Type":"ContainerDied","Data":"ac023b0bd0b29b8120389e9b85c3275980098c22cc6b8961eb3421f2f432b908"} Oct 08 14:29:33 crc kubenswrapper[4735]: I1008 14:29:33.699986 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75bb999bbc-tspnr" podStartSLOduration=4.699961478 podStartE2EDuration="4.699961478s" podCreationTimestamp="2025-10-08 14:29:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:29:33.69691174 +0000 UTC m=+5623.505216284" watchObservedRunningTime="2025-10-08 14:29:33.699961478 +0000 UTC m=+5623.508266022" Oct 08 14:29:34 crc kubenswrapper[4735]: I1008 14:29:34.683804 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4s8fp" event={"ID":"d4b43223-9847-4b56-b7f7-6a3062f4fb30","Type":"ContainerStarted","Data":"1d2d6c1cb8e8c7cbfa1960e182f55f7df4458384916377a0b06bc49e1bbd629d"} Oct 08 14:29:34 crc kubenswrapper[4735]: I1008 14:29:34.709083 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4s8fp" podStartSLOduration=2.9028086379999998 podStartE2EDuration="5.709061359s" podCreationTimestamp="2025-10-08 14:29:29 +0000 UTC" firstStartedPulling="2025-10-08 14:29:31.503894803 +0000 UTC m=+5621.312199347" lastFinishedPulling="2025-10-08 14:29:34.310147524 +0000 UTC m=+5624.118452068" observedRunningTime="2025-10-08 14:29:34.70170036 +0000 UTC m=+5624.510004914" watchObservedRunningTime="2025-10-08 14:29:34.709061359 +0000 UTC m=+5624.517365903" Oct 08 14:29:35 crc kubenswrapper[4735]: I1008 
14:29:35.241112 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 08 14:29:35 crc kubenswrapper[4735]: I1008 14:29:35.286947 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 08 14:29:35 crc kubenswrapper[4735]: I1008 14:29:35.287074 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 08 14:29:35 crc kubenswrapper[4735]: I1008 14:29:35.433255 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 08 14:29:35 crc kubenswrapper[4735]: I1008 14:29:35.692839 4735 generic.go:334] "Generic (PLEG): container finished" podID="6025ac38-ff1c-464a-84a4-75dfabd4bb94" containerID="c2e7af3ee59425c649664e213c8bbc570af803ac6017912034e3330595461a1c" exitCode=0 Oct 08 14:29:35 crc kubenswrapper[4735]: I1008 14:29:35.692961 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-hvjqf" event={"ID":"6025ac38-ff1c-464a-84a4-75dfabd4bb94","Type":"ContainerDied","Data":"c2e7af3ee59425c649664e213c8bbc570af803ac6017912034e3330595461a1c"} Oct 08 14:29:36 crc kubenswrapper[4735]: I1008 14:29:36.711227 4735 generic.go:334] "Generic (PLEG): container finished" podID="4cd80f17-b1da-4eaa-a962-01cb94a5abe3" containerID="55ef97d3905a3568ca8237e879eca8c1eca6f098eeefb397bdbe6c5cc8a29e56" exitCode=0 Oct 08 14:29:36 crc kubenswrapper[4735]: I1008 14:29:36.711309 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-j9m8t" event={"ID":"4cd80f17-b1da-4eaa-a962-01cb94a5abe3","Type":"ContainerDied","Data":"55ef97d3905a3568ca8237e879eca8c1eca6f098eeefb397bdbe6c5cc8a29e56"} Oct 08 14:29:37 crc kubenswrapper[4735]: I1008 14:29:37.155727 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-hvjqf" Oct 08 14:29:37 crc kubenswrapper[4735]: I1008 14:29:37.243063 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q82sq\" (UniqueName: \"kubernetes.io/projected/6025ac38-ff1c-464a-84a4-75dfabd4bb94-kube-api-access-q82sq\") pod \"6025ac38-ff1c-464a-84a4-75dfabd4bb94\" (UID: \"6025ac38-ff1c-464a-84a4-75dfabd4bb94\") " Oct 08 14:29:37 crc kubenswrapper[4735]: I1008 14:29:37.243146 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6025ac38-ff1c-464a-84a4-75dfabd4bb94-combined-ca-bundle\") pod \"6025ac38-ff1c-464a-84a4-75dfabd4bb94\" (UID: \"6025ac38-ff1c-464a-84a4-75dfabd4bb94\") " Oct 08 14:29:37 crc kubenswrapper[4735]: I1008 14:29:37.243178 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6025ac38-ff1c-464a-84a4-75dfabd4bb94-scripts\") pod \"6025ac38-ff1c-464a-84a4-75dfabd4bb94\" (UID: \"6025ac38-ff1c-464a-84a4-75dfabd4bb94\") " Oct 08 14:29:37 crc kubenswrapper[4735]: I1008 14:29:37.243251 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6025ac38-ff1c-464a-84a4-75dfabd4bb94-config-data\") pod \"6025ac38-ff1c-464a-84a4-75dfabd4bb94\" (UID: \"6025ac38-ff1c-464a-84a4-75dfabd4bb94\") " Oct 08 14:29:37 crc kubenswrapper[4735]: I1008 14:29:37.248509 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6025ac38-ff1c-464a-84a4-75dfabd4bb94-kube-api-access-q82sq" (OuterVolumeSpecName: "kube-api-access-q82sq") pod "6025ac38-ff1c-464a-84a4-75dfabd4bb94" (UID: "6025ac38-ff1c-464a-84a4-75dfabd4bb94"). InnerVolumeSpecName "kube-api-access-q82sq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:29:37 crc kubenswrapper[4735]: I1008 14:29:37.248631 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6025ac38-ff1c-464a-84a4-75dfabd4bb94-scripts" (OuterVolumeSpecName: "scripts") pod "6025ac38-ff1c-464a-84a4-75dfabd4bb94" (UID: "6025ac38-ff1c-464a-84a4-75dfabd4bb94"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:29:37 crc kubenswrapper[4735]: I1008 14:29:37.268293 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6025ac38-ff1c-464a-84a4-75dfabd4bb94-config-data" (OuterVolumeSpecName: "config-data") pod "6025ac38-ff1c-464a-84a4-75dfabd4bb94" (UID: "6025ac38-ff1c-464a-84a4-75dfabd4bb94"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:29:37 crc kubenswrapper[4735]: I1008 14:29:37.269440 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6025ac38-ff1c-464a-84a4-75dfabd4bb94-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6025ac38-ff1c-464a-84a4-75dfabd4bb94" (UID: "6025ac38-ff1c-464a-84a4-75dfabd4bb94"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:29:37 crc kubenswrapper[4735]: I1008 14:29:37.345078 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q82sq\" (UniqueName: \"kubernetes.io/projected/6025ac38-ff1c-464a-84a4-75dfabd4bb94-kube-api-access-q82sq\") on node \"crc\" DevicePath \"\"" Oct 08 14:29:37 crc kubenswrapper[4735]: I1008 14:29:37.345113 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6025ac38-ff1c-464a-84a4-75dfabd4bb94-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 14:29:37 crc kubenswrapper[4735]: I1008 14:29:37.345122 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6025ac38-ff1c-464a-84a4-75dfabd4bb94-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 14:29:37 crc kubenswrapper[4735]: I1008 14:29:37.345131 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6025ac38-ff1c-464a-84a4-75dfabd4bb94-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 14:29:37 crc kubenswrapper[4735]: I1008 14:29:37.722093 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-hvjqf" event={"ID":"6025ac38-ff1c-464a-84a4-75dfabd4bb94","Type":"ContainerDied","Data":"4b17324e4e9a159e8b135a228a7509c0f1541dfd9a81688bb2e077afcbfb010c"} Oct 08 14:29:37 crc kubenswrapper[4735]: I1008 14:29:37.722150 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b17324e4e9a159e8b135a228a7509c0f1541dfd9a81688bb2e077afcbfb010c" Oct 08 14:29:37 crc kubenswrapper[4735]: I1008 14:29:37.722120 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-hvjqf" Oct 08 14:29:37 crc kubenswrapper[4735]: I1008 14:29:37.794424 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 08 14:29:37 crc kubenswrapper[4735]: E1008 14:29:37.795329 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6025ac38-ff1c-464a-84a4-75dfabd4bb94" containerName="nova-cell1-conductor-db-sync" Oct 08 14:29:37 crc kubenswrapper[4735]: I1008 14:29:37.795356 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="6025ac38-ff1c-464a-84a4-75dfabd4bb94" containerName="nova-cell1-conductor-db-sync" Oct 08 14:29:37 crc kubenswrapper[4735]: I1008 14:29:37.795600 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="6025ac38-ff1c-464a-84a4-75dfabd4bb94" containerName="nova-cell1-conductor-db-sync" Oct 08 14:29:37 crc kubenswrapper[4735]: I1008 14:29:37.796419 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 08 14:29:37 crc kubenswrapper[4735]: I1008 14:29:37.798942 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 08 14:29:37 crc kubenswrapper[4735]: I1008 14:29:37.805234 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 08 14:29:37 crc kubenswrapper[4735]: I1008 14:29:37.854749 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c24bffc-5d3a-485a-8358-23a30149782b-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"0c24bffc-5d3a-485a-8358-23a30149782b\") " pod="openstack/nova-cell1-conductor-0" Oct 08 14:29:37 crc kubenswrapper[4735]: I1008 14:29:37.854809 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0c24bffc-5d3a-485a-8358-23a30149782b-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"0c24bffc-5d3a-485a-8358-23a30149782b\") " pod="openstack/nova-cell1-conductor-0" Oct 08 14:29:37 crc kubenswrapper[4735]: I1008 14:29:37.854872 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ng7hz\" (UniqueName: \"kubernetes.io/projected/0c24bffc-5d3a-485a-8358-23a30149782b-kube-api-access-ng7hz\") pod \"nova-cell1-conductor-0\" (UID: \"0c24bffc-5d3a-485a-8358-23a30149782b\") " pod="openstack/nova-cell1-conductor-0" Oct 08 14:29:37 crc kubenswrapper[4735]: I1008 14:29:37.957034 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c24bffc-5d3a-485a-8358-23a30149782b-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"0c24bffc-5d3a-485a-8358-23a30149782b\") " pod="openstack/nova-cell1-conductor-0" Oct 08 14:29:37 crc kubenswrapper[4735]: I1008 14:29:37.957131 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c24bffc-5d3a-485a-8358-23a30149782b-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"0c24bffc-5d3a-485a-8358-23a30149782b\") " pod="openstack/nova-cell1-conductor-0" Oct 08 14:29:37 crc kubenswrapper[4735]: I1008 14:29:37.957217 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ng7hz\" (UniqueName: \"kubernetes.io/projected/0c24bffc-5d3a-485a-8358-23a30149782b-kube-api-access-ng7hz\") pod \"nova-cell1-conductor-0\" (UID: \"0c24bffc-5d3a-485a-8358-23a30149782b\") " pod="openstack/nova-cell1-conductor-0" Oct 08 14:29:37 crc kubenswrapper[4735]: I1008 14:29:37.966253 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0c24bffc-5d3a-485a-8358-23a30149782b-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"0c24bffc-5d3a-485a-8358-23a30149782b\") " pod="openstack/nova-cell1-conductor-0" Oct 08 14:29:37 crc kubenswrapper[4735]: I1008 14:29:37.966781 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c24bffc-5d3a-485a-8358-23a30149782b-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"0c24bffc-5d3a-485a-8358-23a30149782b\") " pod="openstack/nova-cell1-conductor-0" Oct 08 14:29:37 crc kubenswrapper[4735]: I1008 14:29:37.973341 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ng7hz\" (UniqueName: \"kubernetes.io/projected/0c24bffc-5d3a-485a-8358-23a30149782b-kube-api-access-ng7hz\") pod \"nova-cell1-conductor-0\" (UID: \"0c24bffc-5d3a-485a-8358-23a30149782b\") " pod="openstack/nova-cell1-conductor-0" Oct 08 14:29:38 crc kubenswrapper[4735]: I1008 14:29:38.022150 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-j9m8t" Oct 08 14:29:38 crc kubenswrapper[4735]: I1008 14:29:38.115738 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 08 14:29:38 crc kubenswrapper[4735]: I1008 14:29:38.160467 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrbdf\" (UniqueName: \"kubernetes.io/projected/4cd80f17-b1da-4eaa-a962-01cb94a5abe3-kube-api-access-zrbdf\") pod \"4cd80f17-b1da-4eaa-a962-01cb94a5abe3\" (UID: \"4cd80f17-b1da-4eaa-a962-01cb94a5abe3\") " Oct 08 14:29:38 crc kubenswrapper[4735]: I1008 14:29:38.160590 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4cd80f17-b1da-4eaa-a962-01cb94a5abe3-scripts\") pod \"4cd80f17-b1da-4eaa-a962-01cb94a5abe3\" (UID: \"4cd80f17-b1da-4eaa-a962-01cb94a5abe3\") " Oct 08 14:29:38 crc kubenswrapper[4735]: I1008 14:29:38.160747 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cd80f17-b1da-4eaa-a962-01cb94a5abe3-config-data\") pod \"4cd80f17-b1da-4eaa-a962-01cb94a5abe3\" (UID: \"4cd80f17-b1da-4eaa-a962-01cb94a5abe3\") " Oct 08 14:29:38 crc kubenswrapper[4735]: I1008 14:29:38.160777 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cd80f17-b1da-4eaa-a962-01cb94a5abe3-combined-ca-bundle\") pod \"4cd80f17-b1da-4eaa-a962-01cb94a5abe3\" (UID: \"4cd80f17-b1da-4eaa-a962-01cb94a5abe3\") " Oct 08 14:29:38 crc kubenswrapper[4735]: I1008 14:29:38.164951 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cd80f17-b1da-4eaa-a962-01cb94a5abe3-scripts" (OuterVolumeSpecName: "scripts") pod "4cd80f17-b1da-4eaa-a962-01cb94a5abe3" (UID: "4cd80f17-b1da-4eaa-a962-01cb94a5abe3"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:29:38 crc kubenswrapper[4735]: I1008 14:29:38.165739 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cd80f17-b1da-4eaa-a962-01cb94a5abe3-kube-api-access-zrbdf" (OuterVolumeSpecName: "kube-api-access-zrbdf") pod "4cd80f17-b1da-4eaa-a962-01cb94a5abe3" (UID: "4cd80f17-b1da-4eaa-a962-01cb94a5abe3"). InnerVolumeSpecName "kube-api-access-zrbdf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:29:38 crc kubenswrapper[4735]: I1008 14:29:38.202395 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cd80f17-b1da-4eaa-a962-01cb94a5abe3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4cd80f17-b1da-4eaa-a962-01cb94a5abe3" (UID: "4cd80f17-b1da-4eaa-a962-01cb94a5abe3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:29:38 crc kubenswrapper[4735]: I1008 14:29:38.212294 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cd80f17-b1da-4eaa-a962-01cb94a5abe3-config-data" (OuterVolumeSpecName: "config-data") pod "4cd80f17-b1da-4eaa-a962-01cb94a5abe3" (UID: "4cd80f17-b1da-4eaa-a962-01cb94a5abe3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:29:38 crc kubenswrapper[4735]: I1008 14:29:38.262883 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrbdf\" (UniqueName: \"kubernetes.io/projected/4cd80f17-b1da-4eaa-a962-01cb94a5abe3-kube-api-access-zrbdf\") on node \"crc\" DevicePath \"\"" Oct 08 14:29:38 crc kubenswrapper[4735]: I1008 14:29:38.262916 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4cd80f17-b1da-4eaa-a962-01cb94a5abe3-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 14:29:38 crc kubenswrapper[4735]: I1008 14:29:38.262925 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cd80f17-b1da-4eaa-a962-01cb94a5abe3-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 14:29:38 crc kubenswrapper[4735]: I1008 14:29:38.262934 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cd80f17-b1da-4eaa-a962-01cb94a5abe3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 14:29:38 crc kubenswrapper[4735]: I1008 14:29:38.544828 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 08 14:29:38 crc kubenswrapper[4735]: I1008 14:29:38.753975 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-j9m8t" event={"ID":"4cd80f17-b1da-4eaa-a962-01cb94a5abe3","Type":"ContainerDied","Data":"a7bc4db0a7a063f3efef04f6d3aa86f318e6d36847019200e681de959a42adc5"} Oct 08 14:29:38 crc kubenswrapper[4735]: I1008 14:29:38.754173 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7bc4db0a7a063f3efef04f6d3aa86f318e6d36847019200e681de959a42adc5" Oct 08 14:29:38 crc kubenswrapper[4735]: I1008 14:29:38.754257 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-j9m8t" Oct 08 14:29:38 crc kubenswrapper[4735]: I1008 14:29:38.774138 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"0c24bffc-5d3a-485a-8358-23a30149782b","Type":"ContainerStarted","Data":"400d48daf969739a5708592b7fe3f566ea0c9ed5621d7efc758a3e0e4297a9d7"} Oct 08 14:29:38 crc kubenswrapper[4735]: I1008 14:29:38.774193 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"0c24bffc-5d3a-485a-8358-23a30149782b","Type":"ContainerStarted","Data":"83321369e60e7732f340945c3f9c466f0901fc468169b58e914257f6e57f0c5c"} Oct 08 14:29:38 crc kubenswrapper[4735]: I1008 14:29:38.776189 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 08 14:29:38 crc kubenswrapper[4735]: I1008 14:29:38.801553 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=1.801530337 podStartE2EDuration="1.801530337s" podCreationTimestamp="2025-10-08 14:29:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:29:38.789347115 +0000 UTC m=+5628.597651659" watchObservedRunningTime="2025-10-08 14:29:38.801530337 +0000 UTC m=+5628.609834881" Oct 08 14:29:38 crc kubenswrapper[4735]: I1008 14:29:38.944086 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 14:29:38 crc kubenswrapper[4735]: I1008 14:29:38.944369 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="4c7d3264-ed20-4cc1-ae4a-c61fa01b3451" containerName="nova-scheduler-scheduler" containerID="cri-o://bb387538298d15636b8e064921a71f471b4f2e5911f8af470e6649138226a71a" gracePeriod=30 Oct 08 14:29:38 crc kubenswrapper[4735]: I1008 14:29:38.972592 
4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 08 14:29:38 crc kubenswrapper[4735]: I1008 14:29:38.972863 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ec2fb996-87c8-4069-a4d0-af404918cda2" containerName="nova-api-log" containerID="cri-o://a838c54121f4a71889f3c25b80f54c3e8724e811e8f2846ed125a17510a11c1a" gracePeriod=30 Oct 08 14:29:38 crc kubenswrapper[4735]: I1008 14:29:38.973283 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ec2fb996-87c8-4069-a4d0-af404918cda2" containerName="nova-api-api" containerID="cri-o://7e948bf5a1af4b4d92fc8291a27aa63e6d141174f6a822b62aff3aac3d9d4e02" gracePeriod=30 Oct 08 14:29:38 crc kubenswrapper[4735]: I1008 14:29:38.980537 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 14:29:38 crc kubenswrapper[4735]: I1008 14:29:38.981597 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2893a8bf-9eeb-4f6c-bde3-a798e72934fc" containerName="nova-metadata-metadata" containerID="cri-o://962b71adb3cbc3a0fffe0d64e642551bf3d97acd2185a513dabf6b509218ebe6" gracePeriod=30 Oct 08 14:29:38 crc kubenswrapper[4735]: I1008 14:29:38.980737 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2893a8bf-9eeb-4f6c-bde3-a798e72934fc" containerName="nova-metadata-log" containerID="cri-o://212668e2aa32c27d8fbe787d29c11acd75e1fd7cae78eade44b15848fa2d88b6" gracePeriod=30 Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.563963 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4s8fp" Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.564037 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-4s8fp" Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.617622 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4s8fp" Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.623755 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.628670 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.721242 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec2fb996-87c8-4069-a4d0-af404918cda2-config-data\") pod \"ec2fb996-87c8-4069-a4d0-af404918cda2\" (UID: \"ec2fb996-87c8-4069-a4d0-af404918cda2\") " Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.721303 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2893a8bf-9eeb-4f6c-bde3-a798e72934fc-logs\") pod \"2893a8bf-9eeb-4f6c-bde3-a798e72934fc\" (UID: \"2893a8bf-9eeb-4f6c-bde3-a798e72934fc\") " Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.721389 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2893a8bf-9eeb-4f6c-bde3-a798e72934fc-config-data\") pod \"2893a8bf-9eeb-4f6c-bde3-a798e72934fc\" (UID: \"2893a8bf-9eeb-4f6c-bde3-a798e72934fc\") " Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.721447 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6lgg\" (UniqueName: \"kubernetes.io/projected/ec2fb996-87c8-4069-a4d0-af404918cda2-kube-api-access-p6lgg\") pod \"ec2fb996-87c8-4069-a4d0-af404918cda2\" (UID: \"ec2fb996-87c8-4069-a4d0-af404918cda2\") " Oct 08 
14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.721468 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2893a8bf-9eeb-4f6c-bde3-a798e72934fc-combined-ca-bundle\") pod \"2893a8bf-9eeb-4f6c-bde3-a798e72934fc\" (UID: \"2893a8bf-9eeb-4f6c-bde3-a798e72934fc\") " Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.721571 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtkvc\" (UniqueName: \"kubernetes.io/projected/2893a8bf-9eeb-4f6c-bde3-a798e72934fc-kube-api-access-jtkvc\") pod \"2893a8bf-9eeb-4f6c-bde3-a798e72934fc\" (UID: \"2893a8bf-9eeb-4f6c-bde3-a798e72934fc\") " Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.721601 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec2fb996-87c8-4069-a4d0-af404918cda2-combined-ca-bundle\") pod \"ec2fb996-87c8-4069-a4d0-af404918cda2\" (UID: \"ec2fb996-87c8-4069-a4d0-af404918cda2\") " Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.721654 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec2fb996-87c8-4069-a4d0-af404918cda2-logs\") pod \"ec2fb996-87c8-4069-a4d0-af404918cda2\" (UID: \"ec2fb996-87c8-4069-a4d0-af404918cda2\") " Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.721652 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2893a8bf-9eeb-4f6c-bde3-a798e72934fc-logs" (OuterVolumeSpecName: "logs") pod "2893a8bf-9eeb-4f6c-bde3-a798e72934fc" (UID: "2893a8bf-9eeb-4f6c-bde3-a798e72934fc"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.722234 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec2fb996-87c8-4069-a4d0-af404918cda2-logs" (OuterVolumeSpecName: "logs") pod "ec2fb996-87c8-4069-a4d0-af404918cda2" (UID: "ec2fb996-87c8-4069-a4d0-af404918cda2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.727486 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2893a8bf-9eeb-4f6c-bde3-a798e72934fc-kube-api-access-jtkvc" (OuterVolumeSpecName: "kube-api-access-jtkvc") pod "2893a8bf-9eeb-4f6c-bde3-a798e72934fc" (UID: "2893a8bf-9eeb-4f6c-bde3-a798e72934fc"). InnerVolumeSpecName "kube-api-access-jtkvc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.728162 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec2fb996-87c8-4069-a4d0-af404918cda2-kube-api-access-p6lgg" (OuterVolumeSpecName: "kube-api-access-p6lgg") pod "ec2fb996-87c8-4069-a4d0-af404918cda2" (UID: "ec2fb996-87c8-4069-a4d0-af404918cda2"). InnerVolumeSpecName "kube-api-access-p6lgg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.753863 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec2fb996-87c8-4069-a4d0-af404918cda2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ec2fb996-87c8-4069-a4d0-af404918cda2" (UID: "ec2fb996-87c8-4069-a4d0-af404918cda2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.753969 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2893a8bf-9eeb-4f6c-bde3-a798e72934fc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2893a8bf-9eeb-4f6c-bde3-a798e72934fc" (UID: "2893a8bf-9eeb-4f6c-bde3-a798e72934fc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.756683 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2893a8bf-9eeb-4f6c-bde3-a798e72934fc-config-data" (OuterVolumeSpecName: "config-data") pod "2893a8bf-9eeb-4f6c-bde3-a798e72934fc" (UID: "2893a8bf-9eeb-4f6c-bde3-a798e72934fc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.762866 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec2fb996-87c8-4069-a4d0-af404918cda2-config-data" (OuterVolumeSpecName: "config-data") pod "ec2fb996-87c8-4069-a4d0-af404918cda2" (UID: "ec2fb996-87c8-4069-a4d0-af404918cda2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.787319 4735 generic.go:334] "Generic (PLEG): container finished" podID="2893a8bf-9eeb-4f6c-bde3-a798e72934fc" containerID="962b71adb3cbc3a0fffe0d64e642551bf3d97acd2185a513dabf6b509218ebe6" exitCode=0 Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.787352 4735 generic.go:334] "Generic (PLEG): container finished" podID="2893a8bf-9eeb-4f6c-bde3-a798e72934fc" containerID="212668e2aa32c27d8fbe787d29c11acd75e1fd7cae78eade44b15848fa2d88b6" exitCode=143 Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.787392 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2893a8bf-9eeb-4f6c-bde3-a798e72934fc","Type":"ContainerDied","Data":"962b71adb3cbc3a0fffe0d64e642551bf3d97acd2185a513dabf6b509218ebe6"} Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.787422 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2893a8bf-9eeb-4f6c-bde3-a798e72934fc","Type":"ContainerDied","Data":"212668e2aa32c27d8fbe787d29c11acd75e1fd7cae78eade44b15848fa2d88b6"} Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.787436 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2893a8bf-9eeb-4f6c-bde3-a798e72934fc","Type":"ContainerDied","Data":"997bfc0e8d3dd5798976ec15d1ac47a89c8b6713909547bdf4fd2146a572a59b"} Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.787456 4735 scope.go:117] "RemoveContainer" containerID="962b71adb3cbc3a0fffe0d64e642551bf3d97acd2185a513dabf6b509218ebe6" Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.787625 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.795165 4735 generic.go:334] "Generic (PLEG): container finished" podID="4c7d3264-ed20-4cc1-ae4a-c61fa01b3451" containerID="bb387538298d15636b8e064921a71f471b4f2e5911f8af470e6649138226a71a" exitCode=0 Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.795237 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4c7d3264-ed20-4cc1-ae4a-c61fa01b3451","Type":"ContainerDied","Data":"bb387538298d15636b8e064921a71f471b4f2e5911f8af470e6649138226a71a"} Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.798532 4735 generic.go:334] "Generic (PLEG): container finished" podID="ec2fb996-87c8-4069-a4d0-af404918cda2" containerID="7e948bf5a1af4b4d92fc8291a27aa63e6d141174f6a822b62aff3aac3d9d4e02" exitCode=0 Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.798556 4735 generic.go:334] "Generic (PLEG): container finished" podID="ec2fb996-87c8-4069-a4d0-af404918cda2" containerID="a838c54121f4a71889f3c25b80f54c3e8724e811e8f2846ed125a17510a11c1a" exitCode=143 Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.798611 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ec2fb996-87c8-4069-a4d0-af404918cda2","Type":"ContainerDied","Data":"7e948bf5a1af4b4d92fc8291a27aa63e6d141174f6a822b62aff3aac3d9d4e02"} Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.798681 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ec2fb996-87c8-4069-a4d0-af404918cda2","Type":"ContainerDied","Data":"a838c54121f4a71889f3c25b80f54c3e8724e811e8f2846ed125a17510a11c1a"} Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.798699 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"ec2fb996-87c8-4069-a4d0-af404918cda2","Type":"ContainerDied","Data":"9ca0e080516bb30b5df4b5e6b7d392649d9acf10e5cd1005907847950c08dcdc"} Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.798969 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.823981 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec2fb996-87c8-4069-a4d0-af404918cda2-logs\") on node \"crc\" DevicePath \"\"" Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.824019 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec2fb996-87c8-4069-a4d0-af404918cda2-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.824035 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2893a8bf-9eeb-4f6c-bde3-a798e72934fc-logs\") on node \"crc\" DevicePath \"\"" Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.824048 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2893a8bf-9eeb-4f6c-bde3-a798e72934fc-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.824064 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6lgg\" (UniqueName: \"kubernetes.io/projected/ec2fb996-87c8-4069-a4d0-af404918cda2-kube-api-access-p6lgg\") on node \"crc\" DevicePath \"\"" Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.824077 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2893a8bf-9eeb-4f6c-bde3-a798e72934fc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.824093 4735 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-jtkvc\" (UniqueName: \"kubernetes.io/projected/2893a8bf-9eeb-4f6c-bde3-a798e72934fc-kube-api-access-jtkvc\") on node \"crc\" DevicePath \"\"" Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.824106 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec2fb996-87c8-4069-a4d0-af404918cda2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.827764 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.839363 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.848660 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.858512 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.868200 4735 scope.go:117] "RemoveContainer" containerID="212668e2aa32c27d8fbe787d29c11acd75e1fd7cae78eade44b15848fa2d88b6" Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.873110 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 08 14:29:39 crc kubenswrapper[4735]: E1008 14:29:39.873535 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec2fb996-87c8-4069-a4d0-af404918cda2" containerName="nova-api-api" Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.873550 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec2fb996-87c8-4069-a4d0-af404918cda2" containerName="nova-api-api" Oct 08 14:29:39 crc kubenswrapper[4735]: E1008 14:29:39.873562 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cd80f17-b1da-4eaa-a962-01cb94a5abe3" containerName="nova-manage" Oct 08 14:29:39 crc 
kubenswrapper[4735]: I1008 14:29:39.873568 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cd80f17-b1da-4eaa-a962-01cb94a5abe3" containerName="nova-manage" Oct 08 14:29:39 crc kubenswrapper[4735]: E1008 14:29:39.873585 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2893a8bf-9eeb-4f6c-bde3-a798e72934fc" containerName="nova-metadata-log" Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.873592 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="2893a8bf-9eeb-4f6c-bde3-a798e72934fc" containerName="nova-metadata-log" Oct 08 14:29:39 crc kubenswrapper[4735]: E1008 14:29:39.873600 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2893a8bf-9eeb-4f6c-bde3-a798e72934fc" containerName="nova-metadata-metadata" Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.873609 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="2893a8bf-9eeb-4f6c-bde3-a798e72934fc" containerName="nova-metadata-metadata" Oct 08 14:29:39 crc kubenswrapper[4735]: E1008 14:29:39.873635 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec2fb996-87c8-4069-a4d0-af404918cda2" containerName="nova-api-log" Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.873640 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec2fb996-87c8-4069-a4d0-af404918cda2" containerName="nova-api-log" Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.873793 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="2893a8bf-9eeb-4f6c-bde3-a798e72934fc" containerName="nova-metadata-log" Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.873807 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec2fb996-87c8-4069-a4d0-af404918cda2" containerName="nova-api-api" Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.873821 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="2893a8bf-9eeb-4f6c-bde3-a798e72934fc" containerName="nova-metadata-metadata" Oct 08 
14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.873832 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cd80f17-b1da-4eaa-a962-01cb94a5abe3" containerName="nova-manage" Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.873847 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec2fb996-87c8-4069-a4d0-af404918cda2" containerName="nova-api-log" Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.880957 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4s8fp" Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.881057 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.883330 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.887231 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.899363 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.901329 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.905609 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.916747 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.917155 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.919379 4735 scope.go:117] "RemoveContainer" containerID="962b71adb3cbc3a0fffe0d64e642551bf3d97acd2185a513dabf6b509218ebe6" Oct 08 14:29:39 crc kubenswrapper[4735]: E1008 14:29:39.919718 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"962b71adb3cbc3a0fffe0d64e642551bf3d97acd2185a513dabf6b509218ebe6\": container with ID starting with 962b71adb3cbc3a0fffe0d64e642551bf3d97acd2185a513dabf6b509218ebe6 not found: ID does not exist" containerID="962b71adb3cbc3a0fffe0d64e642551bf3d97acd2185a513dabf6b509218ebe6" Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.919744 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"962b71adb3cbc3a0fffe0d64e642551bf3d97acd2185a513dabf6b509218ebe6"} err="failed to get container status \"962b71adb3cbc3a0fffe0d64e642551bf3d97acd2185a513dabf6b509218ebe6\": rpc error: code = NotFound desc = could not find container \"962b71adb3cbc3a0fffe0d64e642551bf3d97acd2185a513dabf6b509218ebe6\": container with ID starting with 962b71adb3cbc3a0fffe0d64e642551bf3d97acd2185a513dabf6b509218ebe6 not found: ID does not exist" Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.919764 4735 scope.go:117] "RemoveContainer" containerID="212668e2aa32c27d8fbe787d29c11acd75e1fd7cae78eade44b15848fa2d88b6" Oct 08 14:29:39 crc kubenswrapper[4735]: E1008 14:29:39.920164 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"212668e2aa32c27d8fbe787d29c11acd75e1fd7cae78eade44b15848fa2d88b6\": container with ID starting with 212668e2aa32c27d8fbe787d29c11acd75e1fd7cae78eade44b15848fa2d88b6 not found: ID does not exist" containerID="212668e2aa32c27d8fbe787d29c11acd75e1fd7cae78eade44b15848fa2d88b6" Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 
14:29:39.920235 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"212668e2aa32c27d8fbe787d29c11acd75e1fd7cae78eade44b15848fa2d88b6"} err="failed to get container status \"212668e2aa32c27d8fbe787d29c11acd75e1fd7cae78eade44b15848fa2d88b6\": rpc error: code = NotFound desc = could not find container \"212668e2aa32c27d8fbe787d29c11acd75e1fd7cae78eade44b15848fa2d88b6\": container with ID starting with 212668e2aa32c27d8fbe787d29c11acd75e1fd7cae78eade44b15848fa2d88b6 not found: ID does not exist" Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.920286 4735 scope.go:117] "RemoveContainer" containerID="962b71adb3cbc3a0fffe0d64e642551bf3d97acd2185a513dabf6b509218ebe6" Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.920965 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"962b71adb3cbc3a0fffe0d64e642551bf3d97acd2185a513dabf6b509218ebe6"} err="failed to get container status \"962b71adb3cbc3a0fffe0d64e642551bf3d97acd2185a513dabf6b509218ebe6\": rpc error: code = NotFound desc = could not find container \"962b71adb3cbc3a0fffe0d64e642551bf3d97acd2185a513dabf6b509218ebe6\": container with ID starting with 962b71adb3cbc3a0fffe0d64e642551bf3d97acd2185a513dabf6b509218ebe6 not found: ID does not exist" Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.921013 4735 scope.go:117] "RemoveContainer" containerID="212668e2aa32c27d8fbe787d29c11acd75e1fd7cae78eade44b15848fa2d88b6" Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.921200 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"212668e2aa32c27d8fbe787d29c11acd75e1fd7cae78eade44b15848fa2d88b6"} err="failed to get container status \"212668e2aa32c27d8fbe787d29c11acd75e1fd7cae78eade44b15848fa2d88b6\": rpc error: code = NotFound desc = could not find container \"212668e2aa32c27d8fbe787d29c11acd75e1fd7cae78eade44b15848fa2d88b6\": container with ID starting with 
212668e2aa32c27d8fbe787d29c11acd75e1fd7cae78eade44b15848fa2d88b6 not found: ID does not exist" Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.921218 4735 scope.go:117] "RemoveContainer" containerID="7e948bf5a1af4b4d92fc8291a27aa63e6d141174f6a822b62aff3aac3d9d4e02" Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.955147 4735 scope.go:117] "RemoveContainer" containerID="a838c54121f4a71889f3c25b80f54c3e8724e811e8f2846ed125a17510a11c1a" Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.955369 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4s8fp"] Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.985578 4735 scope.go:117] "RemoveContainer" containerID="7e948bf5a1af4b4d92fc8291a27aa63e6d141174f6a822b62aff3aac3d9d4e02" Oct 08 14:29:39 crc kubenswrapper[4735]: E1008 14:29:39.986082 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e948bf5a1af4b4d92fc8291a27aa63e6d141174f6a822b62aff3aac3d9d4e02\": container with ID starting with 7e948bf5a1af4b4d92fc8291a27aa63e6d141174f6a822b62aff3aac3d9d4e02 not found: ID does not exist" containerID="7e948bf5a1af4b4d92fc8291a27aa63e6d141174f6a822b62aff3aac3d9d4e02" Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.986140 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e948bf5a1af4b4d92fc8291a27aa63e6d141174f6a822b62aff3aac3d9d4e02"} err="failed to get container status \"7e948bf5a1af4b4d92fc8291a27aa63e6d141174f6a822b62aff3aac3d9d4e02\": rpc error: code = NotFound desc = could not find container \"7e948bf5a1af4b4d92fc8291a27aa63e6d141174f6a822b62aff3aac3d9d4e02\": container with ID starting with 7e948bf5a1af4b4d92fc8291a27aa63e6d141174f6a822b62aff3aac3d9d4e02 not found: ID does not exist" Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.986173 4735 scope.go:117] "RemoveContainer" 
containerID="a838c54121f4a71889f3c25b80f54c3e8724e811e8f2846ed125a17510a11c1a" Oct 08 14:29:39 crc kubenswrapper[4735]: E1008 14:29:39.986709 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a838c54121f4a71889f3c25b80f54c3e8724e811e8f2846ed125a17510a11c1a\": container with ID starting with a838c54121f4a71889f3c25b80f54c3e8724e811e8f2846ed125a17510a11c1a not found: ID does not exist" containerID="a838c54121f4a71889f3c25b80f54c3e8724e811e8f2846ed125a17510a11c1a" Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.986757 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a838c54121f4a71889f3c25b80f54c3e8724e811e8f2846ed125a17510a11c1a"} err="failed to get container status \"a838c54121f4a71889f3c25b80f54c3e8724e811e8f2846ed125a17510a11c1a\": rpc error: code = NotFound desc = could not find container \"a838c54121f4a71889f3c25b80f54c3e8724e811e8f2846ed125a17510a11c1a\": container with ID starting with a838c54121f4a71889f3c25b80f54c3e8724e811e8f2846ed125a17510a11c1a not found: ID does not exist" Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.986794 4735 scope.go:117] "RemoveContainer" containerID="7e948bf5a1af4b4d92fc8291a27aa63e6d141174f6a822b62aff3aac3d9d4e02" Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.987160 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e948bf5a1af4b4d92fc8291a27aa63e6d141174f6a822b62aff3aac3d9d4e02"} err="failed to get container status \"7e948bf5a1af4b4d92fc8291a27aa63e6d141174f6a822b62aff3aac3d9d4e02\": rpc error: code = NotFound desc = could not find container \"7e948bf5a1af4b4d92fc8291a27aa63e6d141174f6a822b62aff3aac3d9d4e02\": container with ID starting with 7e948bf5a1af4b4d92fc8291a27aa63e6d141174f6a822b62aff3aac3d9d4e02 not found: ID does not exist" Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.987192 4735 scope.go:117] 
"RemoveContainer" containerID="a838c54121f4a71889f3c25b80f54c3e8724e811e8f2846ed125a17510a11c1a" Oct 08 14:29:39 crc kubenswrapper[4735]: I1008 14:29:39.987465 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a838c54121f4a71889f3c25b80f54c3e8724e811e8f2846ed125a17510a11c1a"} err="failed to get container status \"a838c54121f4a71889f3c25b80f54c3e8724e811e8f2846ed125a17510a11c1a\": rpc error: code = NotFound desc = could not find container \"a838c54121f4a71889f3c25b80f54c3e8724e811e8f2846ed125a17510a11c1a\": container with ID starting with a838c54121f4a71889f3c25b80f54c3e8724e811e8f2846ed125a17510a11c1a not found: ID does not exist" Oct 08 14:29:40 crc kubenswrapper[4735]: I1008 14:29:40.028068 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c7d3264-ed20-4cc1-ae4a-c61fa01b3451-combined-ca-bundle\") pod \"4c7d3264-ed20-4cc1-ae4a-c61fa01b3451\" (UID: \"4c7d3264-ed20-4cc1-ae4a-c61fa01b3451\") " Oct 08 14:29:40 crc kubenswrapper[4735]: I1008 14:29:40.028203 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c7d3264-ed20-4cc1-ae4a-c61fa01b3451-config-data\") pod \"4c7d3264-ed20-4cc1-ae4a-c61fa01b3451\" (UID: \"4c7d3264-ed20-4cc1-ae4a-c61fa01b3451\") " Oct 08 14:29:40 crc kubenswrapper[4735]: I1008 14:29:40.028238 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8t2rh\" (UniqueName: \"kubernetes.io/projected/4c7d3264-ed20-4cc1-ae4a-c61fa01b3451-kube-api-access-8t2rh\") pod \"4c7d3264-ed20-4cc1-ae4a-c61fa01b3451\" (UID: \"4c7d3264-ed20-4cc1-ae4a-c61fa01b3451\") " Oct 08 14:29:40 crc kubenswrapper[4735]: I1008 14:29:40.028542 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96zd7\" (UniqueName: 
\"kubernetes.io/projected/a928b768-635c-4556-97b9-822f7082aa6c-kube-api-access-96zd7\") pod \"nova-metadata-0\" (UID: \"a928b768-635c-4556-97b9-822f7082aa6c\") " pod="openstack/nova-metadata-0" Oct 08 14:29:40 crc kubenswrapper[4735]: I1008 14:29:40.028632 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/acf33775-4c29-4087-9e52-a0b1b1da41a0-logs\") pod \"nova-api-0\" (UID: \"acf33775-4c29-4087-9e52-a0b1b1da41a0\") " pod="openstack/nova-api-0" Oct 08 14:29:40 crc kubenswrapper[4735]: I1008 14:29:40.028654 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acf33775-4c29-4087-9e52-a0b1b1da41a0-config-data\") pod \"nova-api-0\" (UID: \"acf33775-4c29-4087-9e52-a0b1b1da41a0\") " pod="openstack/nova-api-0" Oct 08 14:29:40 crc kubenswrapper[4735]: I1008 14:29:40.028675 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acf33775-4c29-4087-9e52-a0b1b1da41a0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"acf33775-4c29-4087-9e52-a0b1b1da41a0\") " pod="openstack/nova-api-0" Oct 08 14:29:40 crc kubenswrapper[4735]: I1008 14:29:40.028711 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a928b768-635c-4556-97b9-822f7082aa6c-logs\") pod \"nova-metadata-0\" (UID: \"a928b768-635c-4556-97b9-822f7082aa6c\") " pod="openstack/nova-metadata-0" Oct 08 14:29:40 crc kubenswrapper[4735]: I1008 14:29:40.028764 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6hz9\" (UniqueName: \"kubernetes.io/projected/acf33775-4c29-4087-9e52-a0b1b1da41a0-kube-api-access-t6hz9\") pod \"nova-api-0\" (UID: 
\"acf33775-4c29-4087-9e52-a0b1b1da41a0\") " pod="openstack/nova-api-0" Oct 08 14:29:40 crc kubenswrapper[4735]: I1008 14:29:40.028808 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a928b768-635c-4556-97b9-822f7082aa6c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a928b768-635c-4556-97b9-822f7082aa6c\") " pod="openstack/nova-metadata-0" Oct 08 14:29:40 crc kubenswrapper[4735]: I1008 14:29:40.028831 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a928b768-635c-4556-97b9-822f7082aa6c-config-data\") pod \"nova-metadata-0\" (UID: \"a928b768-635c-4556-97b9-822f7082aa6c\") " pod="openstack/nova-metadata-0" Oct 08 14:29:40 crc kubenswrapper[4735]: I1008 14:29:40.031856 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c7d3264-ed20-4cc1-ae4a-c61fa01b3451-kube-api-access-8t2rh" (OuterVolumeSpecName: "kube-api-access-8t2rh") pod "4c7d3264-ed20-4cc1-ae4a-c61fa01b3451" (UID: "4c7d3264-ed20-4cc1-ae4a-c61fa01b3451"). InnerVolumeSpecName "kube-api-access-8t2rh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:29:40 crc kubenswrapper[4735]: I1008 14:29:40.050822 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c7d3264-ed20-4cc1-ae4a-c61fa01b3451-config-data" (OuterVolumeSpecName: "config-data") pod "4c7d3264-ed20-4cc1-ae4a-c61fa01b3451" (UID: "4c7d3264-ed20-4cc1-ae4a-c61fa01b3451"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:29:40 crc kubenswrapper[4735]: I1008 14:29:40.051221 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c7d3264-ed20-4cc1-ae4a-c61fa01b3451-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4c7d3264-ed20-4cc1-ae4a-c61fa01b3451" (UID: "4c7d3264-ed20-4cc1-ae4a-c61fa01b3451"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:29:40 crc kubenswrapper[4735]: I1008 14:29:40.130680 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a928b768-635c-4556-97b9-822f7082aa6c-config-data\") pod \"nova-metadata-0\" (UID: \"a928b768-635c-4556-97b9-822f7082aa6c\") " pod="openstack/nova-metadata-0" Oct 08 14:29:40 crc kubenswrapper[4735]: I1008 14:29:40.130818 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96zd7\" (UniqueName: \"kubernetes.io/projected/a928b768-635c-4556-97b9-822f7082aa6c-kube-api-access-96zd7\") pod \"nova-metadata-0\" (UID: \"a928b768-635c-4556-97b9-822f7082aa6c\") " pod="openstack/nova-metadata-0" Oct 08 14:29:40 crc kubenswrapper[4735]: I1008 14:29:40.130896 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/acf33775-4c29-4087-9e52-a0b1b1da41a0-logs\") pod \"nova-api-0\" (UID: \"acf33775-4c29-4087-9e52-a0b1b1da41a0\") " pod="openstack/nova-api-0" Oct 08 14:29:40 crc kubenswrapper[4735]: I1008 14:29:40.130924 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acf33775-4c29-4087-9e52-a0b1b1da41a0-config-data\") pod \"nova-api-0\" (UID: \"acf33775-4c29-4087-9e52-a0b1b1da41a0\") " pod="openstack/nova-api-0" Oct 08 14:29:40 crc kubenswrapper[4735]: I1008 14:29:40.130949 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acf33775-4c29-4087-9e52-a0b1b1da41a0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"acf33775-4c29-4087-9e52-a0b1b1da41a0\") " pod="openstack/nova-api-0" Oct 08 14:29:40 crc kubenswrapper[4735]: I1008 14:29:40.131001 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a928b768-635c-4556-97b9-822f7082aa6c-logs\") pod \"nova-metadata-0\" (UID: \"a928b768-635c-4556-97b9-822f7082aa6c\") " pod="openstack/nova-metadata-0" Oct 08 14:29:40 crc kubenswrapper[4735]: I1008 14:29:40.131039 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6hz9\" (UniqueName: \"kubernetes.io/projected/acf33775-4c29-4087-9e52-a0b1b1da41a0-kube-api-access-t6hz9\") pod \"nova-api-0\" (UID: \"acf33775-4c29-4087-9e52-a0b1b1da41a0\") " pod="openstack/nova-api-0" Oct 08 14:29:40 crc kubenswrapper[4735]: I1008 14:29:40.131082 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a928b768-635c-4556-97b9-822f7082aa6c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a928b768-635c-4556-97b9-822f7082aa6c\") " pod="openstack/nova-metadata-0" Oct 08 14:29:40 crc kubenswrapper[4735]: I1008 14:29:40.131144 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c7d3264-ed20-4cc1-ae4a-c61fa01b3451-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 14:29:40 crc kubenswrapper[4735]: I1008 14:29:40.131160 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8t2rh\" (UniqueName: \"kubernetes.io/projected/4c7d3264-ed20-4cc1-ae4a-c61fa01b3451-kube-api-access-8t2rh\") on node \"crc\" DevicePath \"\"" Oct 08 14:29:40 crc kubenswrapper[4735]: I1008 14:29:40.131176 4735 
reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c7d3264-ed20-4cc1-ae4a-c61fa01b3451-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 14:29:40 crc kubenswrapper[4735]: I1008 14:29:40.131931 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/acf33775-4c29-4087-9e52-a0b1b1da41a0-logs\") pod \"nova-api-0\" (UID: \"acf33775-4c29-4087-9e52-a0b1b1da41a0\") " pod="openstack/nova-api-0" Oct 08 14:29:40 crc kubenswrapper[4735]: I1008 14:29:40.132302 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a928b768-635c-4556-97b9-822f7082aa6c-logs\") pod \"nova-metadata-0\" (UID: \"a928b768-635c-4556-97b9-822f7082aa6c\") " pod="openstack/nova-metadata-0" Oct 08 14:29:40 crc kubenswrapper[4735]: I1008 14:29:40.134798 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acf33775-4c29-4087-9e52-a0b1b1da41a0-config-data\") pod \"nova-api-0\" (UID: \"acf33775-4c29-4087-9e52-a0b1b1da41a0\") " pod="openstack/nova-api-0" Oct 08 14:29:40 crc kubenswrapper[4735]: I1008 14:29:40.135573 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acf33775-4c29-4087-9e52-a0b1b1da41a0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"acf33775-4c29-4087-9e52-a0b1b1da41a0\") " pod="openstack/nova-api-0" Oct 08 14:29:40 crc kubenswrapper[4735]: I1008 14:29:40.136108 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a928b768-635c-4556-97b9-822f7082aa6c-config-data\") pod \"nova-metadata-0\" (UID: \"a928b768-635c-4556-97b9-822f7082aa6c\") " pod="openstack/nova-metadata-0" Oct 08 14:29:40 crc kubenswrapper[4735]: I1008 14:29:40.146548 4735 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-96zd7\" (UniqueName: \"kubernetes.io/projected/a928b768-635c-4556-97b9-822f7082aa6c-kube-api-access-96zd7\") pod \"nova-metadata-0\" (UID: \"a928b768-635c-4556-97b9-822f7082aa6c\") " pod="openstack/nova-metadata-0" Oct 08 14:29:40 crc kubenswrapper[4735]: I1008 14:29:40.147032 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a928b768-635c-4556-97b9-822f7082aa6c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a928b768-635c-4556-97b9-822f7082aa6c\") " pod="openstack/nova-metadata-0" Oct 08 14:29:40 crc kubenswrapper[4735]: I1008 14:29:40.151606 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6hz9\" (UniqueName: \"kubernetes.io/projected/acf33775-4c29-4087-9e52-a0b1b1da41a0-kube-api-access-t6hz9\") pod \"nova-api-0\" (UID: \"acf33775-4c29-4087-9e52-a0b1b1da41a0\") " pod="openstack/nova-api-0" Oct 08 14:29:40 crc kubenswrapper[4735]: I1008 14:29:40.208786 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 14:29:40 crc kubenswrapper[4735]: I1008 14:29:40.250643 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 08 14:29:40 crc kubenswrapper[4735]: I1008 14:29:40.354618 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-75bb999bbc-tspnr" Oct 08 14:29:40 crc kubenswrapper[4735]: I1008 14:29:40.435313 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 08 14:29:40 crc kubenswrapper[4735]: I1008 14:29:40.451736 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-657648467c-2fvxc"] Oct 08 14:29:40 crc kubenswrapper[4735]: I1008 14:29:40.452013 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-657648467c-2fvxc" podUID="2eb52a80-9aca-4074-befb-3062309dfa71" containerName="dnsmasq-dns" containerID="cri-o://cf39f34fbb11368b315c72124736693180fdba8818ec69fb8d4565e925fb9380" gracePeriod=10 Oct 08 14:29:40 crc kubenswrapper[4735]: I1008 14:29:40.466548 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Oct 08 14:29:40 crc kubenswrapper[4735]: I1008 14:29:40.683654 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 14:29:40 crc kubenswrapper[4735]: W1008 14:29:40.693828 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda928b768_635c_4556_97b9_822f7082aa6c.slice/crio-3ebdc0157e0867ad04f87c6ad365bea5340f32d828f3e58bca6ae7a1af318c97 WatchSource:0}: Error finding container 3ebdc0157e0867ad04f87c6ad365bea5340f32d828f3e58bca6ae7a1af318c97: Status 404 returned error can't find the container with id 3ebdc0157e0867ad04f87c6ad365bea5340f32d828f3e58bca6ae7a1af318c97 Oct 08 14:29:40 crc kubenswrapper[4735]: W1008 14:29:40.765649 4735 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podacf33775_4c29_4087_9e52_a0b1b1da41a0.slice/crio-e9bcc9acbf9581cd4ee3000ab80d9059daf2c5fbcb792be9bd660179647e18f9 WatchSource:0}: Error finding container e9bcc9acbf9581cd4ee3000ab80d9059daf2c5fbcb792be9bd660179647e18f9: Status 404 returned error can't find the container with id e9bcc9acbf9581cd4ee3000ab80d9059daf2c5fbcb792be9bd660179647e18f9 Oct 08 14:29:40 crc kubenswrapper[4735]: I1008 14:29:40.765973 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 08 14:29:40 crc kubenswrapper[4735]: I1008 14:29:40.820274 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a928b768-635c-4556-97b9-822f7082aa6c","Type":"ContainerStarted","Data":"3ebdc0157e0867ad04f87c6ad365bea5340f32d828f3e58bca6ae7a1af318c97"} Oct 08 14:29:40 crc kubenswrapper[4735]: I1008 14:29:40.825032 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4c7d3264-ed20-4cc1-ae4a-c61fa01b3451","Type":"ContainerDied","Data":"2e310303f7388214582368bdc4b46dea0281c0b7270b59edbeafe1b0e8683053"} Oct 08 14:29:40 crc kubenswrapper[4735]: I1008 14:29:40.825103 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 14:29:40 crc kubenswrapper[4735]: I1008 14:29:40.825111 4735 scope.go:117] "RemoveContainer" containerID="bb387538298d15636b8e064921a71f471b4f2e5911f8af470e6649138226a71a" Oct 08 14:29:40 crc kubenswrapper[4735]: I1008 14:29:40.838087 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"acf33775-4c29-4087-9e52-a0b1b1da41a0","Type":"ContainerStarted","Data":"e9bcc9acbf9581cd4ee3000ab80d9059daf2c5fbcb792be9bd660179647e18f9"} Oct 08 14:29:40 crc kubenswrapper[4735]: I1008 14:29:40.852590 4735 generic.go:334] "Generic (PLEG): container finished" podID="2eb52a80-9aca-4074-befb-3062309dfa71" containerID="cf39f34fbb11368b315c72124736693180fdba8818ec69fb8d4565e925fb9380" exitCode=0 Oct 08 14:29:40 crc kubenswrapper[4735]: I1008 14:29:40.853827 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-657648467c-2fvxc" event={"ID":"2eb52a80-9aca-4074-befb-3062309dfa71","Type":"ContainerDied","Data":"cf39f34fbb11368b315c72124736693180fdba8818ec69fb8d4565e925fb9380"} Oct 08 14:29:40 crc kubenswrapper[4735]: I1008 14:29:40.867771 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 14:29:40 crc kubenswrapper[4735]: I1008 14:29:40.890009 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Oct 08 14:29:40 crc kubenswrapper[4735]: I1008 14:29:40.915658 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-657648467c-2fvxc" Oct 08 14:29:40 crc kubenswrapper[4735]: I1008 14:29:40.930307 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2893a8bf-9eeb-4f6c-bde3-a798e72934fc" path="/var/lib/kubelet/pods/2893a8bf-9eeb-4f6c-bde3-a798e72934fc/volumes" Oct 08 14:29:40 crc kubenswrapper[4735]: I1008 14:29:40.930918 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec2fb996-87c8-4069-a4d0-af404918cda2" path="/var/lib/kubelet/pods/ec2fb996-87c8-4069-a4d0-af404918cda2/volumes" Oct 08 14:29:40 crc kubenswrapper[4735]: I1008 14:29:40.931537 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 14:29:40 crc kubenswrapper[4735]: I1008 14:29:40.931613 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 14:29:40 crc kubenswrapper[4735]: E1008 14:29:40.932054 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2eb52a80-9aca-4074-befb-3062309dfa71" containerName="dnsmasq-dns" Oct 08 14:29:40 crc kubenswrapper[4735]: I1008 14:29:40.932076 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="2eb52a80-9aca-4074-befb-3062309dfa71" containerName="dnsmasq-dns" Oct 08 14:29:40 crc kubenswrapper[4735]: E1008 14:29:40.932093 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c7d3264-ed20-4cc1-ae4a-c61fa01b3451" containerName="nova-scheduler-scheduler" Oct 08 14:29:40 crc kubenswrapper[4735]: I1008 14:29:40.932101 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c7d3264-ed20-4cc1-ae4a-c61fa01b3451" containerName="nova-scheduler-scheduler" Oct 08 14:29:40 crc kubenswrapper[4735]: E1008 14:29:40.932206 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2eb52a80-9aca-4074-befb-3062309dfa71" containerName="init" Oct 08 14:29:40 crc kubenswrapper[4735]: I1008 14:29:40.932219 4735 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2eb52a80-9aca-4074-befb-3062309dfa71" containerName="init" Oct 08 14:29:40 crc kubenswrapper[4735]: I1008 14:29:40.942178 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="2eb52a80-9aca-4074-befb-3062309dfa71" containerName="dnsmasq-dns" Oct 08 14:29:40 crc kubenswrapper[4735]: I1008 14:29:40.942513 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c7d3264-ed20-4cc1-ae4a-c61fa01b3451" containerName="nova-scheduler-scheduler" Oct 08 14:29:40 crc kubenswrapper[4735]: I1008 14:29:40.943180 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 14:29:40 crc kubenswrapper[4735]: I1008 14:29:40.943265 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 14:29:40 crc kubenswrapper[4735]: I1008 14:29:40.950083 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 08 14:29:41 crc kubenswrapper[4735]: I1008 14:29:41.052149 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2eb52a80-9aca-4074-befb-3062309dfa71-ovsdbserver-sb\") pod \"2eb52a80-9aca-4074-befb-3062309dfa71\" (UID: \"2eb52a80-9aca-4074-befb-3062309dfa71\") " Oct 08 14:29:41 crc kubenswrapper[4735]: I1008 14:29:41.052218 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2eb52a80-9aca-4074-befb-3062309dfa71-dns-svc\") pod \"2eb52a80-9aca-4074-befb-3062309dfa71\" (UID: \"2eb52a80-9aca-4074-befb-3062309dfa71\") " Oct 08 14:29:41 crc kubenswrapper[4735]: I1008 14:29:41.052261 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k94s2\" (UniqueName: \"kubernetes.io/projected/2eb52a80-9aca-4074-befb-3062309dfa71-kube-api-access-k94s2\") pod \"2eb52a80-9aca-4074-befb-3062309dfa71\" (UID: 
\"2eb52a80-9aca-4074-befb-3062309dfa71\") " Oct 08 14:29:41 crc kubenswrapper[4735]: I1008 14:29:41.052315 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2eb52a80-9aca-4074-befb-3062309dfa71-ovsdbserver-nb\") pod \"2eb52a80-9aca-4074-befb-3062309dfa71\" (UID: \"2eb52a80-9aca-4074-befb-3062309dfa71\") " Oct 08 14:29:41 crc kubenswrapper[4735]: I1008 14:29:41.052434 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2eb52a80-9aca-4074-befb-3062309dfa71-config\") pod \"2eb52a80-9aca-4074-befb-3062309dfa71\" (UID: \"2eb52a80-9aca-4074-befb-3062309dfa71\") " Oct 08 14:29:41 crc kubenswrapper[4735]: I1008 14:29:41.053112 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24l2d\" (UniqueName: \"kubernetes.io/projected/47f88b2f-dd56-4ceb-961a-d7816aef22c2-kube-api-access-24l2d\") pod \"nova-scheduler-0\" (UID: \"47f88b2f-dd56-4ceb-961a-d7816aef22c2\") " pod="openstack/nova-scheduler-0" Oct 08 14:29:41 crc kubenswrapper[4735]: I1008 14:29:41.053154 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47f88b2f-dd56-4ceb-961a-d7816aef22c2-config-data\") pod \"nova-scheduler-0\" (UID: \"47f88b2f-dd56-4ceb-961a-d7816aef22c2\") " pod="openstack/nova-scheduler-0" Oct 08 14:29:41 crc kubenswrapper[4735]: I1008 14:29:41.053211 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47f88b2f-dd56-4ceb-961a-d7816aef22c2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"47f88b2f-dd56-4ceb-961a-d7816aef22c2\") " pod="openstack/nova-scheduler-0" Oct 08 14:29:41 crc kubenswrapper[4735]: I1008 14:29:41.056599 4735 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2eb52a80-9aca-4074-befb-3062309dfa71-kube-api-access-k94s2" (OuterVolumeSpecName: "kube-api-access-k94s2") pod "2eb52a80-9aca-4074-befb-3062309dfa71" (UID: "2eb52a80-9aca-4074-befb-3062309dfa71"). InnerVolumeSpecName "kube-api-access-k94s2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:29:41 crc kubenswrapper[4735]: I1008 14:29:41.110875 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2eb52a80-9aca-4074-befb-3062309dfa71-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2eb52a80-9aca-4074-befb-3062309dfa71" (UID: "2eb52a80-9aca-4074-befb-3062309dfa71"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:29:41 crc kubenswrapper[4735]: I1008 14:29:41.117978 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2eb52a80-9aca-4074-befb-3062309dfa71-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2eb52a80-9aca-4074-befb-3062309dfa71" (UID: "2eb52a80-9aca-4074-befb-3062309dfa71"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:29:41 crc kubenswrapper[4735]: I1008 14:29:41.120589 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2eb52a80-9aca-4074-befb-3062309dfa71-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2eb52a80-9aca-4074-befb-3062309dfa71" (UID: "2eb52a80-9aca-4074-befb-3062309dfa71"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:29:41 crc kubenswrapper[4735]: I1008 14:29:41.127840 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2eb52a80-9aca-4074-befb-3062309dfa71-config" (OuterVolumeSpecName: "config") pod "2eb52a80-9aca-4074-befb-3062309dfa71" (UID: "2eb52a80-9aca-4074-befb-3062309dfa71"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:29:41 crc kubenswrapper[4735]: I1008 14:29:41.154519 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24l2d\" (UniqueName: \"kubernetes.io/projected/47f88b2f-dd56-4ceb-961a-d7816aef22c2-kube-api-access-24l2d\") pod \"nova-scheduler-0\" (UID: \"47f88b2f-dd56-4ceb-961a-d7816aef22c2\") " pod="openstack/nova-scheduler-0" Oct 08 14:29:41 crc kubenswrapper[4735]: I1008 14:29:41.154573 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47f88b2f-dd56-4ceb-961a-d7816aef22c2-config-data\") pod \"nova-scheduler-0\" (UID: \"47f88b2f-dd56-4ceb-961a-d7816aef22c2\") " pod="openstack/nova-scheduler-0" Oct 08 14:29:41 crc kubenswrapper[4735]: I1008 14:29:41.154637 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47f88b2f-dd56-4ceb-961a-d7816aef22c2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"47f88b2f-dd56-4ceb-961a-d7816aef22c2\") " pod="openstack/nova-scheduler-0" Oct 08 14:29:41 crc kubenswrapper[4735]: I1008 14:29:41.154806 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2eb52a80-9aca-4074-befb-3062309dfa71-config\") on node \"crc\" DevicePath \"\"" Oct 08 14:29:41 crc kubenswrapper[4735]: I1008 14:29:41.154820 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/2eb52a80-9aca-4074-befb-3062309dfa71-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 14:29:41 crc kubenswrapper[4735]: I1008 14:29:41.154831 4735 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2eb52a80-9aca-4074-befb-3062309dfa71-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 14:29:41 crc kubenswrapper[4735]: I1008 14:29:41.154839 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k94s2\" (UniqueName: \"kubernetes.io/projected/2eb52a80-9aca-4074-befb-3062309dfa71-kube-api-access-k94s2\") on node \"crc\" DevicePath \"\"" Oct 08 14:29:41 crc kubenswrapper[4735]: I1008 14:29:41.154847 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2eb52a80-9aca-4074-befb-3062309dfa71-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 14:29:41 crc kubenswrapper[4735]: I1008 14:29:41.158985 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47f88b2f-dd56-4ceb-961a-d7816aef22c2-config-data\") pod \"nova-scheduler-0\" (UID: \"47f88b2f-dd56-4ceb-961a-d7816aef22c2\") " pod="openstack/nova-scheduler-0" Oct 08 14:29:41 crc kubenswrapper[4735]: I1008 14:29:41.160534 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47f88b2f-dd56-4ceb-961a-d7816aef22c2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"47f88b2f-dd56-4ceb-961a-d7816aef22c2\") " pod="openstack/nova-scheduler-0" Oct 08 14:29:41 crc kubenswrapper[4735]: I1008 14:29:41.171401 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24l2d\" (UniqueName: \"kubernetes.io/projected/47f88b2f-dd56-4ceb-961a-d7816aef22c2-kube-api-access-24l2d\") pod \"nova-scheduler-0\" (UID: \"47f88b2f-dd56-4ceb-961a-d7816aef22c2\") " pod="openstack/nova-scheduler-0" 
Oct 08 14:29:41 crc kubenswrapper[4735]: I1008 14:29:41.270165 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 14:29:41 crc kubenswrapper[4735]: I1008 14:29:41.714909 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 14:29:41 crc kubenswrapper[4735]: W1008 14:29:41.716896 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47f88b2f_dd56_4ceb_961a_d7816aef22c2.slice/crio-a3e196768d8169c894ae791fc5ad86d4573b3d060cc14ca563fe45b6b3bba961 WatchSource:0}: Error finding container a3e196768d8169c894ae791fc5ad86d4573b3d060cc14ca563fe45b6b3bba961: Status 404 returned error can't find the container with id a3e196768d8169c894ae791fc5ad86d4573b3d060cc14ca563fe45b6b3bba961 Oct 08 14:29:41 crc kubenswrapper[4735]: I1008 14:29:41.865168 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"acf33775-4c29-4087-9e52-a0b1b1da41a0","Type":"ContainerStarted","Data":"d4f36a5ef091462fa2ef7d6d0bb25ab41a44c7f60572a01dbed7700b440842d2"} Oct 08 14:29:41 crc kubenswrapper[4735]: I1008 14:29:41.865217 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"acf33775-4c29-4087-9e52-a0b1b1da41a0","Type":"ContainerStarted","Data":"4ffb4bce246c36f0f0cf350b0b56cf06bcb71cb20c2b798eb44450b7edbfda4f"} Oct 08 14:29:41 crc kubenswrapper[4735]: I1008 14:29:41.869349 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-657648467c-2fvxc" event={"ID":"2eb52a80-9aca-4074-befb-3062309dfa71","Type":"ContainerDied","Data":"c029c69022d944233cdc4412e7a784cd87fbe7ded42cebeec72c0a78e806a8ee"} Oct 08 14:29:41 crc kubenswrapper[4735]: I1008 14:29:41.869740 4735 scope.go:117] "RemoveContainer" containerID="cf39f34fbb11368b315c72124736693180fdba8818ec69fb8d4565e925fb9380" Oct 08 14:29:41 crc kubenswrapper[4735]: I1008 
14:29:41.869611 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-657648467c-2fvxc" Oct 08 14:29:41 crc kubenswrapper[4735]: I1008 14:29:41.872845 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a928b768-635c-4556-97b9-822f7082aa6c","Type":"ContainerStarted","Data":"6cf7c8376a7c295e6ec3c8eebfadff23d41cd9df908c5b048ad87e34c83c935a"} Oct 08 14:29:41 crc kubenswrapper[4735]: I1008 14:29:41.872967 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a928b768-635c-4556-97b9-822f7082aa6c","Type":"ContainerStarted","Data":"7aa217da6e985cb80829336e0103f6ddd913d61e8c1598796a097328699aa24b"} Oct 08 14:29:41 crc kubenswrapper[4735]: I1008 14:29:41.875226 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"47f88b2f-dd56-4ceb-961a-d7816aef22c2","Type":"ContainerStarted","Data":"a3e196768d8169c894ae791fc5ad86d4573b3d060cc14ca563fe45b6b3bba961"} Oct 08 14:29:41 crc kubenswrapper[4735]: I1008 14:29:41.875589 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4s8fp" podUID="d4b43223-9847-4b56-b7f7-6a3062f4fb30" containerName="registry-server" containerID="cri-o://1d2d6c1cb8e8c7cbfa1960e182f55f7df4458384916377a0b06bc49e1bbd629d" gracePeriod=2 Oct 08 14:29:41 crc kubenswrapper[4735]: I1008 14:29:41.884108 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.884088004 podStartE2EDuration="2.884088004s" podCreationTimestamp="2025-10-08 14:29:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:29:41.881434776 +0000 UTC m=+5631.689739320" watchObservedRunningTime="2025-10-08 14:29:41.884088004 +0000 UTC m=+5631.692392548" Oct 08 14:29:41 crc 
kubenswrapper[4735]: I1008 14:29:41.900779 4735 scope.go:117] "RemoveContainer" containerID="df09f784f303318c9f4e0ff50599baede28201338b776ef7cb8a76bf936d578b" Oct 08 14:29:41 crc kubenswrapper[4735]: I1008 14:29:41.920261 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.92024272 podStartE2EDuration="2.92024272s" podCreationTimestamp="2025-10-08 14:29:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:29:41.911630559 +0000 UTC m=+5631.719935123" watchObservedRunningTime="2025-10-08 14:29:41.92024272 +0000 UTC m=+5631.728547264" Oct 08 14:29:41 crc kubenswrapper[4735]: I1008 14:29:41.937371 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-657648467c-2fvxc"] Oct 08 14:29:41 crc kubenswrapper[4735]: I1008 14:29:41.944051 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-657648467c-2fvxc"] Oct 08 14:29:42 crc kubenswrapper[4735]: I1008 14:29:42.319370 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4s8fp" Oct 08 14:29:42 crc kubenswrapper[4735]: I1008 14:29:42.377940 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4b43223-9847-4b56-b7f7-6a3062f4fb30-catalog-content\") pod \"d4b43223-9847-4b56-b7f7-6a3062f4fb30\" (UID: \"d4b43223-9847-4b56-b7f7-6a3062f4fb30\") " Oct 08 14:29:42 crc kubenswrapper[4735]: I1008 14:29:42.378050 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4b43223-9847-4b56-b7f7-6a3062f4fb30-utilities\") pod \"d4b43223-9847-4b56-b7f7-6a3062f4fb30\" (UID: \"d4b43223-9847-4b56-b7f7-6a3062f4fb30\") " Oct 08 14:29:42 crc kubenswrapper[4735]: I1008 14:29:42.378197 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6454p\" (UniqueName: \"kubernetes.io/projected/d4b43223-9847-4b56-b7f7-6a3062f4fb30-kube-api-access-6454p\") pod \"d4b43223-9847-4b56-b7f7-6a3062f4fb30\" (UID: \"d4b43223-9847-4b56-b7f7-6a3062f4fb30\") " Oct 08 14:29:42 crc kubenswrapper[4735]: I1008 14:29:42.379906 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4b43223-9847-4b56-b7f7-6a3062f4fb30-utilities" (OuterVolumeSpecName: "utilities") pod "d4b43223-9847-4b56-b7f7-6a3062f4fb30" (UID: "d4b43223-9847-4b56-b7f7-6a3062f4fb30"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 14:29:42 crc kubenswrapper[4735]: I1008 14:29:42.392687 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4b43223-9847-4b56-b7f7-6a3062f4fb30-kube-api-access-6454p" (OuterVolumeSpecName: "kube-api-access-6454p") pod "d4b43223-9847-4b56-b7f7-6a3062f4fb30" (UID: "d4b43223-9847-4b56-b7f7-6a3062f4fb30"). InnerVolumeSpecName "kube-api-access-6454p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:29:42 crc kubenswrapper[4735]: I1008 14:29:42.435681 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4b43223-9847-4b56-b7f7-6a3062f4fb30-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d4b43223-9847-4b56-b7f7-6a3062f4fb30" (UID: "d4b43223-9847-4b56-b7f7-6a3062f4fb30"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 14:29:42 crc kubenswrapper[4735]: I1008 14:29:42.481630 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4b43223-9847-4b56-b7f7-6a3062f4fb30-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 14:29:42 crc kubenswrapper[4735]: I1008 14:29:42.481677 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4b43223-9847-4b56-b7f7-6a3062f4fb30-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 14:29:42 crc kubenswrapper[4735]: I1008 14:29:42.481690 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6454p\" (UniqueName: \"kubernetes.io/projected/d4b43223-9847-4b56-b7f7-6a3062f4fb30-kube-api-access-6454p\") on node \"crc\" DevicePath \"\"" Oct 08 14:29:42 crc kubenswrapper[4735]: I1008 14:29:42.887666 4735 generic.go:334] "Generic (PLEG): container finished" podID="d4b43223-9847-4b56-b7f7-6a3062f4fb30" containerID="1d2d6c1cb8e8c7cbfa1960e182f55f7df4458384916377a0b06bc49e1bbd629d" exitCode=0 Oct 08 14:29:42 crc kubenswrapper[4735]: I1008 14:29:42.887748 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4s8fp" Oct 08 14:29:42 crc kubenswrapper[4735]: I1008 14:29:42.887763 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4s8fp" event={"ID":"d4b43223-9847-4b56-b7f7-6a3062f4fb30","Type":"ContainerDied","Data":"1d2d6c1cb8e8c7cbfa1960e182f55f7df4458384916377a0b06bc49e1bbd629d"} Oct 08 14:29:42 crc kubenswrapper[4735]: I1008 14:29:42.887806 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4s8fp" event={"ID":"d4b43223-9847-4b56-b7f7-6a3062f4fb30","Type":"ContainerDied","Data":"71b8f693be6728fe4ad0d0cb0503eaa337851d9bcd7e30e068672d5b63292147"} Oct 08 14:29:42 crc kubenswrapper[4735]: I1008 14:29:42.887833 4735 scope.go:117] "RemoveContainer" containerID="1d2d6c1cb8e8c7cbfa1960e182f55f7df4458384916377a0b06bc49e1bbd629d" Oct 08 14:29:42 crc kubenswrapper[4735]: I1008 14:29:42.915593 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2eb52a80-9aca-4074-befb-3062309dfa71" path="/var/lib/kubelet/pods/2eb52a80-9aca-4074-befb-3062309dfa71/volumes" Oct 08 14:29:42 crc kubenswrapper[4735]: I1008 14:29:42.921592 4735 scope.go:117] "RemoveContainer" containerID="ac023b0bd0b29b8120389e9b85c3275980098c22cc6b8961eb3421f2f432b908" Oct 08 14:29:42 crc kubenswrapper[4735]: I1008 14:29:42.923451 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c7d3264-ed20-4cc1-ae4a-c61fa01b3451" path="/var/lib/kubelet/pods/4c7d3264-ed20-4cc1-ae4a-c61fa01b3451/volumes" Oct 08 14:29:42 crc kubenswrapper[4735]: I1008 14:29:42.925039 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"47f88b2f-dd56-4ceb-961a-d7816aef22c2","Type":"ContainerStarted","Data":"2fa31a65f4236942d387bb641d47056e2bd0b9740b19c34b0562754c9f64e1a8"} Oct 08 14:29:42 crc kubenswrapper[4735]: I1008 14:29:42.935680 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-4s8fp"] Oct 08 14:29:42 crc kubenswrapper[4735]: I1008 14:29:42.943962 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4s8fp"] Oct 08 14:29:42 crc kubenswrapper[4735]: I1008 14:29:42.951122 4735 scope.go:117] "RemoveContainer" containerID="f2ca4cb1d3d5e4a1acf19f5b6919031f4009d8d11aa3d4b4540ca0139d129537" Oct 08 14:29:42 crc kubenswrapper[4735]: I1008 14:29:42.953927 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.95390666 podStartE2EDuration="2.95390666s" podCreationTimestamp="2025-10-08 14:29:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:29:42.949024845 +0000 UTC m=+5632.757329389" watchObservedRunningTime="2025-10-08 14:29:42.95390666 +0000 UTC m=+5632.762211194" Oct 08 14:29:42 crc kubenswrapper[4735]: I1008 14:29:42.991139 4735 scope.go:117] "RemoveContainer" containerID="1d2d6c1cb8e8c7cbfa1960e182f55f7df4458384916377a0b06bc49e1bbd629d" Oct 08 14:29:42 crc kubenswrapper[4735]: E1008 14:29:42.991939 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d2d6c1cb8e8c7cbfa1960e182f55f7df4458384916377a0b06bc49e1bbd629d\": container with ID starting with 1d2d6c1cb8e8c7cbfa1960e182f55f7df4458384916377a0b06bc49e1bbd629d not found: ID does not exist" containerID="1d2d6c1cb8e8c7cbfa1960e182f55f7df4458384916377a0b06bc49e1bbd629d" Oct 08 14:29:42 crc kubenswrapper[4735]: I1008 14:29:42.992005 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d2d6c1cb8e8c7cbfa1960e182f55f7df4458384916377a0b06bc49e1bbd629d"} err="failed to get container status \"1d2d6c1cb8e8c7cbfa1960e182f55f7df4458384916377a0b06bc49e1bbd629d\": rpc error: code = NotFound desc = could not find 
container \"1d2d6c1cb8e8c7cbfa1960e182f55f7df4458384916377a0b06bc49e1bbd629d\": container with ID starting with 1d2d6c1cb8e8c7cbfa1960e182f55f7df4458384916377a0b06bc49e1bbd629d not found: ID does not exist" Oct 08 14:29:42 crc kubenswrapper[4735]: I1008 14:29:42.992026 4735 scope.go:117] "RemoveContainer" containerID="ac023b0bd0b29b8120389e9b85c3275980098c22cc6b8961eb3421f2f432b908" Oct 08 14:29:42 crc kubenswrapper[4735]: E1008 14:29:42.993204 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac023b0bd0b29b8120389e9b85c3275980098c22cc6b8961eb3421f2f432b908\": container with ID starting with ac023b0bd0b29b8120389e9b85c3275980098c22cc6b8961eb3421f2f432b908 not found: ID does not exist" containerID="ac023b0bd0b29b8120389e9b85c3275980098c22cc6b8961eb3421f2f432b908" Oct 08 14:29:42 crc kubenswrapper[4735]: I1008 14:29:42.993237 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac023b0bd0b29b8120389e9b85c3275980098c22cc6b8961eb3421f2f432b908"} err="failed to get container status \"ac023b0bd0b29b8120389e9b85c3275980098c22cc6b8961eb3421f2f432b908\": rpc error: code = NotFound desc = could not find container \"ac023b0bd0b29b8120389e9b85c3275980098c22cc6b8961eb3421f2f432b908\": container with ID starting with ac023b0bd0b29b8120389e9b85c3275980098c22cc6b8961eb3421f2f432b908 not found: ID does not exist" Oct 08 14:29:42 crc kubenswrapper[4735]: I1008 14:29:42.993252 4735 scope.go:117] "RemoveContainer" containerID="f2ca4cb1d3d5e4a1acf19f5b6919031f4009d8d11aa3d4b4540ca0139d129537" Oct 08 14:29:42 crc kubenswrapper[4735]: E1008 14:29:42.993611 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2ca4cb1d3d5e4a1acf19f5b6919031f4009d8d11aa3d4b4540ca0139d129537\": container with ID starting with f2ca4cb1d3d5e4a1acf19f5b6919031f4009d8d11aa3d4b4540ca0139d129537 not found: ID does 
not exist" containerID="f2ca4cb1d3d5e4a1acf19f5b6919031f4009d8d11aa3d4b4540ca0139d129537" Oct 08 14:29:42 crc kubenswrapper[4735]: I1008 14:29:42.993646 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2ca4cb1d3d5e4a1acf19f5b6919031f4009d8d11aa3d4b4540ca0139d129537"} err="failed to get container status \"f2ca4cb1d3d5e4a1acf19f5b6919031f4009d8d11aa3d4b4540ca0139d129537\": rpc error: code = NotFound desc = could not find container \"f2ca4cb1d3d5e4a1acf19f5b6919031f4009d8d11aa3d4b4540ca0139d129537\": container with ID starting with f2ca4cb1d3d5e4a1acf19f5b6919031f4009d8d11aa3d4b4540ca0139d129537 not found: ID does not exist" Oct 08 14:29:43 crc kubenswrapper[4735]: I1008 14:29:43.151282 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 08 14:29:43 crc kubenswrapper[4735]: I1008 14:29:43.689564 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-pn8vf"] Oct 08 14:29:43 crc kubenswrapper[4735]: E1008 14:29:43.690315 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4b43223-9847-4b56-b7f7-6a3062f4fb30" containerName="extract-content" Oct 08 14:29:43 crc kubenswrapper[4735]: I1008 14:29:43.690342 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4b43223-9847-4b56-b7f7-6a3062f4fb30" containerName="extract-content" Oct 08 14:29:43 crc kubenswrapper[4735]: E1008 14:29:43.690354 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4b43223-9847-4b56-b7f7-6a3062f4fb30" containerName="registry-server" Oct 08 14:29:43 crc kubenswrapper[4735]: I1008 14:29:43.690362 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4b43223-9847-4b56-b7f7-6a3062f4fb30" containerName="registry-server" Oct 08 14:29:43 crc kubenswrapper[4735]: E1008 14:29:43.690410 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4b43223-9847-4b56-b7f7-6a3062f4fb30" 
containerName="extract-utilities" Oct 08 14:29:43 crc kubenswrapper[4735]: I1008 14:29:43.690418 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4b43223-9847-4b56-b7f7-6a3062f4fb30" containerName="extract-utilities" Oct 08 14:29:43 crc kubenswrapper[4735]: I1008 14:29:43.690675 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4b43223-9847-4b56-b7f7-6a3062f4fb30" containerName="registry-server" Oct 08 14:29:43 crc kubenswrapper[4735]: I1008 14:29:43.691438 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-pn8vf" Oct 08 14:29:43 crc kubenswrapper[4735]: I1008 14:29:43.694488 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Oct 08 14:29:43 crc kubenswrapper[4735]: I1008 14:29:43.694590 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Oct 08 14:29:43 crc kubenswrapper[4735]: I1008 14:29:43.702829 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-pn8vf"] Oct 08 14:29:43 crc kubenswrapper[4735]: I1008 14:29:43.803082 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gk7pl\" (UniqueName: \"kubernetes.io/projected/5112d677-fdd4-4ea0-8b93-fa7871c95519-kube-api-access-gk7pl\") pod \"nova-cell1-cell-mapping-pn8vf\" (UID: \"5112d677-fdd4-4ea0-8b93-fa7871c95519\") " pod="openstack/nova-cell1-cell-mapping-pn8vf" Oct 08 14:29:43 crc kubenswrapper[4735]: I1008 14:29:43.803169 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5112d677-fdd4-4ea0-8b93-fa7871c95519-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-pn8vf\" (UID: \"5112d677-fdd4-4ea0-8b93-fa7871c95519\") " pod="openstack/nova-cell1-cell-mapping-pn8vf" Oct 08 14:29:43 crc 
kubenswrapper[4735]: I1008 14:29:43.803237 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5112d677-fdd4-4ea0-8b93-fa7871c95519-config-data\") pod \"nova-cell1-cell-mapping-pn8vf\" (UID: \"5112d677-fdd4-4ea0-8b93-fa7871c95519\") " pod="openstack/nova-cell1-cell-mapping-pn8vf" Oct 08 14:29:43 crc kubenswrapper[4735]: I1008 14:29:43.803273 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5112d677-fdd4-4ea0-8b93-fa7871c95519-scripts\") pod \"nova-cell1-cell-mapping-pn8vf\" (UID: \"5112d677-fdd4-4ea0-8b93-fa7871c95519\") " pod="openstack/nova-cell1-cell-mapping-pn8vf" Oct 08 14:29:43 crc kubenswrapper[4735]: I1008 14:29:43.904292 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5112d677-fdd4-4ea0-8b93-fa7871c95519-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-pn8vf\" (UID: \"5112d677-fdd4-4ea0-8b93-fa7871c95519\") " pod="openstack/nova-cell1-cell-mapping-pn8vf" Oct 08 14:29:43 crc kubenswrapper[4735]: I1008 14:29:43.904363 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5112d677-fdd4-4ea0-8b93-fa7871c95519-config-data\") pod \"nova-cell1-cell-mapping-pn8vf\" (UID: \"5112d677-fdd4-4ea0-8b93-fa7871c95519\") " pod="openstack/nova-cell1-cell-mapping-pn8vf" Oct 08 14:29:43 crc kubenswrapper[4735]: I1008 14:29:43.904396 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5112d677-fdd4-4ea0-8b93-fa7871c95519-scripts\") pod \"nova-cell1-cell-mapping-pn8vf\" (UID: \"5112d677-fdd4-4ea0-8b93-fa7871c95519\") " pod="openstack/nova-cell1-cell-mapping-pn8vf" Oct 08 14:29:43 crc kubenswrapper[4735]: I1008 14:29:43.904535 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gk7pl\" (UniqueName: \"kubernetes.io/projected/5112d677-fdd4-4ea0-8b93-fa7871c95519-kube-api-access-gk7pl\") pod \"nova-cell1-cell-mapping-pn8vf\" (UID: \"5112d677-fdd4-4ea0-8b93-fa7871c95519\") " pod="openstack/nova-cell1-cell-mapping-pn8vf" Oct 08 14:29:43 crc kubenswrapper[4735]: I1008 14:29:43.910308 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5112d677-fdd4-4ea0-8b93-fa7871c95519-config-data\") pod \"nova-cell1-cell-mapping-pn8vf\" (UID: \"5112d677-fdd4-4ea0-8b93-fa7871c95519\") " pod="openstack/nova-cell1-cell-mapping-pn8vf" Oct 08 14:29:43 crc kubenswrapper[4735]: I1008 14:29:43.911333 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5112d677-fdd4-4ea0-8b93-fa7871c95519-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-pn8vf\" (UID: \"5112d677-fdd4-4ea0-8b93-fa7871c95519\") " pod="openstack/nova-cell1-cell-mapping-pn8vf" Oct 08 14:29:43 crc kubenswrapper[4735]: I1008 14:29:43.911828 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5112d677-fdd4-4ea0-8b93-fa7871c95519-scripts\") pod \"nova-cell1-cell-mapping-pn8vf\" (UID: \"5112d677-fdd4-4ea0-8b93-fa7871c95519\") " pod="openstack/nova-cell1-cell-mapping-pn8vf" Oct 08 14:29:43 crc kubenswrapper[4735]: I1008 14:29:43.922881 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gk7pl\" (UniqueName: \"kubernetes.io/projected/5112d677-fdd4-4ea0-8b93-fa7871c95519-kube-api-access-gk7pl\") pod \"nova-cell1-cell-mapping-pn8vf\" (UID: \"5112d677-fdd4-4ea0-8b93-fa7871c95519\") " pod="openstack/nova-cell1-cell-mapping-pn8vf" Oct 08 14:29:44 crc kubenswrapper[4735]: I1008 14:29:44.017359 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-pn8vf" Oct 08 14:29:44 crc kubenswrapper[4735]: I1008 14:29:44.474294 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-pn8vf"] Oct 08 14:29:44 crc kubenswrapper[4735]: I1008 14:29:44.904002 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4b43223-9847-4b56-b7f7-6a3062f4fb30" path="/var/lib/kubelet/pods/d4b43223-9847-4b56-b7f7-6a3062f4fb30/volumes" Oct 08 14:29:44 crc kubenswrapper[4735]: I1008 14:29:44.929078 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-pn8vf" event={"ID":"5112d677-fdd4-4ea0-8b93-fa7871c95519","Type":"ContainerStarted","Data":"f893b3a8762fb3a5354e9b9d2f77f696af7198c68561e8bc69d1aacafb8571e0"} Oct 08 14:29:44 crc kubenswrapper[4735]: I1008 14:29:44.929118 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-pn8vf" event={"ID":"5112d677-fdd4-4ea0-8b93-fa7871c95519","Type":"ContainerStarted","Data":"c88288abe16bca3e1f518d00b46dfacc77d5b2ee8d715a3f7f95b9ddb1090836"} Oct 08 14:29:44 crc kubenswrapper[4735]: I1008 14:29:44.942825 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-pn8vf" podStartSLOduration=1.94280904 podStartE2EDuration="1.94280904s" podCreationTimestamp="2025-10-08 14:29:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:29:44.942439201 +0000 UTC m=+5634.750743745" watchObservedRunningTime="2025-10-08 14:29:44.94280904 +0000 UTC m=+5634.751113584" Oct 08 14:29:45 crc kubenswrapper[4735]: I1008 14:29:45.209913 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 08 14:29:45 crc kubenswrapper[4735]: I1008 14:29:45.210334 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-metadata-0" Oct 08 14:29:46 crc kubenswrapper[4735]: I1008 14:29:46.270620 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 08 14:29:49 crc kubenswrapper[4735]: I1008 14:29:49.975268 4735 generic.go:334] "Generic (PLEG): container finished" podID="5112d677-fdd4-4ea0-8b93-fa7871c95519" containerID="f893b3a8762fb3a5354e9b9d2f77f696af7198c68561e8bc69d1aacafb8571e0" exitCode=0 Oct 08 14:29:49 crc kubenswrapper[4735]: I1008 14:29:49.975360 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-pn8vf" event={"ID":"5112d677-fdd4-4ea0-8b93-fa7871c95519","Type":"ContainerDied","Data":"f893b3a8762fb3a5354e9b9d2f77f696af7198c68561e8bc69d1aacafb8571e0"} Oct 08 14:29:50 crc kubenswrapper[4735]: I1008 14:29:50.209559 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 08 14:29:50 crc kubenswrapper[4735]: I1008 14:29:50.209681 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 08 14:29:50 crc kubenswrapper[4735]: I1008 14:29:50.251970 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 08 14:29:50 crc kubenswrapper[4735]: I1008 14:29:50.252069 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 08 14:29:51 crc kubenswrapper[4735]: I1008 14:29:51.271356 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 08 14:29:51 crc kubenswrapper[4735]: I1008 14:29:51.291716 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="a928b768-635c-4556-97b9-822f7082aa6c" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.65:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 08 14:29:51 
crc kubenswrapper[4735]: I1008 14:29:51.292537 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="a928b768-635c-4556-97b9-822f7082aa6c" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.65:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 08 14:29:51 crc kubenswrapper[4735]: I1008 14:29:51.318247 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 08 14:29:51 crc kubenswrapper[4735]: I1008 14:29:51.339373 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-pn8vf" Oct 08 14:29:51 crc kubenswrapper[4735]: I1008 14:29:51.373824 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="acf33775-4c29-4087-9e52-a0b1b1da41a0" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.66:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 08 14:29:51 crc kubenswrapper[4735]: I1008 14:29:51.374115 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="acf33775-4c29-4087-9e52-a0b1b1da41a0" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.66:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 08 14:29:51 crc kubenswrapper[4735]: I1008 14:29:51.461602 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5112d677-fdd4-4ea0-8b93-fa7871c95519-config-data\") pod \"5112d677-fdd4-4ea0-8b93-fa7871c95519\" (UID: \"5112d677-fdd4-4ea0-8b93-fa7871c95519\") " Oct 08 14:29:51 crc kubenswrapper[4735]: I1008 14:29:51.461904 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/5112d677-fdd4-4ea0-8b93-fa7871c95519-scripts\") pod \"5112d677-fdd4-4ea0-8b93-fa7871c95519\" (UID: \"5112d677-fdd4-4ea0-8b93-fa7871c95519\") " Oct 08 14:29:51 crc kubenswrapper[4735]: I1008 14:29:51.462593 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5112d677-fdd4-4ea0-8b93-fa7871c95519-combined-ca-bundle\") pod \"5112d677-fdd4-4ea0-8b93-fa7871c95519\" (UID: \"5112d677-fdd4-4ea0-8b93-fa7871c95519\") " Oct 08 14:29:51 crc kubenswrapper[4735]: I1008 14:29:51.462631 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gk7pl\" (UniqueName: \"kubernetes.io/projected/5112d677-fdd4-4ea0-8b93-fa7871c95519-kube-api-access-gk7pl\") pod \"5112d677-fdd4-4ea0-8b93-fa7871c95519\" (UID: \"5112d677-fdd4-4ea0-8b93-fa7871c95519\") " Oct 08 14:29:51 crc kubenswrapper[4735]: I1008 14:29:51.467977 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5112d677-fdd4-4ea0-8b93-fa7871c95519-scripts" (OuterVolumeSpecName: "scripts") pod "5112d677-fdd4-4ea0-8b93-fa7871c95519" (UID: "5112d677-fdd4-4ea0-8b93-fa7871c95519"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:29:51 crc kubenswrapper[4735]: I1008 14:29:51.480557 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5112d677-fdd4-4ea0-8b93-fa7871c95519-kube-api-access-gk7pl" (OuterVolumeSpecName: "kube-api-access-gk7pl") pod "5112d677-fdd4-4ea0-8b93-fa7871c95519" (UID: "5112d677-fdd4-4ea0-8b93-fa7871c95519"). InnerVolumeSpecName "kube-api-access-gk7pl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:29:51 crc kubenswrapper[4735]: I1008 14:29:51.486981 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5112d677-fdd4-4ea0-8b93-fa7871c95519-config-data" (OuterVolumeSpecName: "config-data") pod "5112d677-fdd4-4ea0-8b93-fa7871c95519" (UID: "5112d677-fdd4-4ea0-8b93-fa7871c95519"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:29:51 crc kubenswrapper[4735]: I1008 14:29:51.507799 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5112d677-fdd4-4ea0-8b93-fa7871c95519-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5112d677-fdd4-4ea0-8b93-fa7871c95519" (UID: "5112d677-fdd4-4ea0-8b93-fa7871c95519"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:29:51 crc kubenswrapper[4735]: I1008 14:29:51.565612 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5112d677-fdd4-4ea0-8b93-fa7871c95519-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 14:29:51 crc kubenswrapper[4735]: I1008 14:29:51.565879 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5112d677-fdd4-4ea0-8b93-fa7871c95519-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 14:29:51 crc kubenswrapper[4735]: I1008 14:29:51.565971 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gk7pl\" (UniqueName: \"kubernetes.io/projected/5112d677-fdd4-4ea0-8b93-fa7871c95519-kube-api-access-gk7pl\") on node \"crc\" DevicePath \"\"" Oct 08 14:29:51 crc kubenswrapper[4735]: I1008 14:29:51.566050 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5112d677-fdd4-4ea0-8b93-fa7871c95519-config-data\") on node \"crc\" DevicePath \"\"" Oct 
08 14:29:51 crc kubenswrapper[4735]: I1008 14:29:51.998637 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-pn8vf" Oct 08 14:29:51 crc kubenswrapper[4735]: I1008 14:29:51.998663 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-pn8vf" event={"ID":"5112d677-fdd4-4ea0-8b93-fa7871c95519","Type":"ContainerDied","Data":"c88288abe16bca3e1f518d00b46dfacc77d5b2ee8d715a3f7f95b9ddb1090836"} Oct 08 14:29:51 crc kubenswrapper[4735]: I1008 14:29:51.998744 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c88288abe16bca3e1f518d00b46dfacc77d5b2ee8d715a3f7f95b9ddb1090836" Oct 08 14:29:52 crc kubenswrapper[4735]: I1008 14:29:52.038347 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 08 14:29:52 crc kubenswrapper[4735]: I1008 14:29:52.174334 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 08 14:29:52 crc kubenswrapper[4735]: I1008 14:29:52.174674 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="acf33775-4c29-4087-9e52-a0b1b1da41a0" containerName="nova-api-log" containerID="cri-o://4ffb4bce246c36f0f0cf350b0b56cf06bcb71cb20c2b798eb44450b7edbfda4f" gracePeriod=30 Oct 08 14:29:52 crc kubenswrapper[4735]: I1008 14:29:52.174762 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="acf33775-4c29-4087-9e52-a0b1b1da41a0" containerName="nova-api-api" containerID="cri-o://d4f36a5ef091462fa2ef7d6d0bb25ab41a44c7f60572a01dbed7700b440842d2" gracePeriod=30 Oct 08 14:29:52 crc kubenswrapper[4735]: I1008 14:29:52.209543 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 14:29:52 crc kubenswrapper[4735]: I1008 14:29:52.209827 4735 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/nova-metadata-0" podUID="a928b768-635c-4556-97b9-822f7082aa6c" containerName="nova-metadata-log" containerID="cri-o://7aa217da6e985cb80829336e0103f6ddd913d61e8c1598796a097328699aa24b" gracePeriod=30 Oct 08 14:29:52 crc kubenswrapper[4735]: I1008 14:29:52.209860 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a928b768-635c-4556-97b9-822f7082aa6c" containerName="nova-metadata-metadata" containerID="cri-o://6cf7c8376a7c295e6ec3c8eebfadff23d41cd9df908c5b048ad87e34c83c935a" gracePeriod=30 Oct 08 14:29:52 crc kubenswrapper[4735]: I1008 14:29:52.586217 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 14:29:53 crc kubenswrapper[4735]: I1008 14:29:53.010240 4735 generic.go:334] "Generic (PLEG): container finished" podID="a928b768-635c-4556-97b9-822f7082aa6c" containerID="7aa217da6e985cb80829336e0103f6ddd913d61e8c1598796a097328699aa24b" exitCode=143 Oct 08 14:29:53 crc kubenswrapper[4735]: I1008 14:29:53.010295 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a928b768-635c-4556-97b9-822f7082aa6c","Type":"ContainerDied","Data":"7aa217da6e985cb80829336e0103f6ddd913d61e8c1598796a097328699aa24b"} Oct 08 14:29:53 crc kubenswrapper[4735]: I1008 14:29:53.012571 4735 generic.go:334] "Generic (PLEG): container finished" podID="acf33775-4c29-4087-9e52-a0b1b1da41a0" containerID="4ffb4bce246c36f0f0cf350b0b56cf06bcb71cb20c2b798eb44450b7edbfda4f" exitCode=143 Oct 08 14:29:53 crc kubenswrapper[4735]: I1008 14:29:53.012612 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"acf33775-4c29-4087-9e52-a0b1b1da41a0","Type":"ContainerDied","Data":"4ffb4bce246c36f0f0cf350b0b56cf06bcb71cb20c2b798eb44450b7edbfda4f"} Oct 08 14:29:54 crc kubenswrapper[4735]: I1008 14:29:54.020257 4735 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/nova-scheduler-0" podUID="47f88b2f-dd56-4ceb-961a-d7816aef22c2" containerName="nova-scheduler-scheduler" containerID="cri-o://2fa31a65f4236942d387bb641d47056e2bd0b9740b19c34b0562754c9f64e1a8" gracePeriod=30 Oct 08 14:29:55 crc kubenswrapper[4735]: I1008 14:29:55.953010 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 14:29:56 crc kubenswrapper[4735]: I1008 14:29:56.039932 4735 generic.go:334] "Generic (PLEG): container finished" podID="a928b768-635c-4556-97b9-822f7082aa6c" containerID="6cf7c8376a7c295e6ec3c8eebfadff23d41cd9df908c5b048ad87e34c83c935a" exitCode=0 Oct 08 14:29:56 crc kubenswrapper[4735]: I1008 14:29:56.039998 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a928b768-635c-4556-97b9-822f7082aa6c","Type":"ContainerDied","Data":"6cf7c8376a7c295e6ec3c8eebfadff23d41cd9df908c5b048ad87e34c83c935a"} Oct 08 14:29:56 crc kubenswrapper[4735]: I1008 14:29:56.040046 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a928b768-635c-4556-97b9-822f7082aa6c","Type":"ContainerDied","Data":"3ebdc0157e0867ad04f87c6ad365bea5340f32d828f3e58bca6ae7a1af318c97"} Oct 08 14:29:56 crc kubenswrapper[4735]: I1008 14:29:56.040071 4735 scope.go:117] "RemoveContainer" containerID="6cf7c8376a7c295e6ec3c8eebfadff23d41cd9df908c5b048ad87e34c83c935a" Oct 08 14:29:56 crc kubenswrapper[4735]: I1008 14:29:56.040002 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 14:29:56 crc kubenswrapper[4735]: I1008 14:29:56.046096 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a928b768-635c-4556-97b9-822f7082aa6c-combined-ca-bundle\") pod \"a928b768-635c-4556-97b9-822f7082aa6c\" (UID: \"a928b768-635c-4556-97b9-822f7082aa6c\") " Oct 08 14:29:56 crc kubenswrapper[4735]: I1008 14:29:56.046280 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a928b768-635c-4556-97b9-822f7082aa6c-config-data\") pod \"a928b768-635c-4556-97b9-822f7082aa6c\" (UID: \"a928b768-635c-4556-97b9-822f7082aa6c\") " Oct 08 14:29:56 crc kubenswrapper[4735]: I1008 14:29:56.046422 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a928b768-635c-4556-97b9-822f7082aa6c-logs\") pod \"a928b768-635c-4556-97b9-822f7082aa6c\" (UID: \"a928b768-635c-4556-97b9-822f7082aa6c\") " Oct 08 14:29:56 crc kubenswrapper[4735]: I1008 14:29:56.046891 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a928b768-635c-4556-97b9-822f7082aa6c-logs" (OuterVolumeSpecName: "logs") pod "a928b768-635c-4556-97b9-822f7082aa6c" (UID: "a928b768-635c-4556-97b9-822f7082aa6c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 14:29:56 crc kubenswrapper[4735]: I1008 14:29:56.047574 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96zd7\" (UniqueName: \"kubernetes.io/projected/a928b768-635c-4556-97b9-822f7082aa6c-kube-api-access-96zd7\") pod \"a928b768-635c-4556-97b9-822f7082aa6c\" (UID: \"a928b768-635c-4556-97b9-822f7082aa6c\") " Oct 08 14:29:56 crc kubenswrapper[4735]: I1008 14:29:56.049268 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a928b768-635c-4556-97b9-822f7082aa6c-logs\") on node \"crc\" DevicePath \"\"" Oct 08 14:29:56 crc kubenswrapper[4735]: I1008 14:29:56.051877 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a928b768-635c-4556-97b9-822f7082aa6c-kube-api-access-96zd7" (OuterVolumeSpecName: "kube-api-access-96zd7") pod "a928b768-635c-4556-97b9-822f7082aa6c" (UID: "a928b768-635c-4556-97b9-822f7082aa6c"). InnerVolumeSpecName "kube-api-access-96zd7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:29:56 crc kubenswrapper[4735]: I1008 14:29:56.063903 4735 scope.go:117] "RemoveContainer" containerID="7aa217da6e985cb80829336e0103f6ddd913d61e8c1598796a097328699aa24b" Oct 08 14:29:56 crc kubenswrapper[4735]: I1008 14:29:56.072463 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a928b768-635c-4556-97b9-822f7082aa6c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a928b768-635c-4556-97b9-822f7082aa6c" (UID: "a928b768-635c-4556-97b9-822f7082aa6c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:29:56 crc kubenswrapper[4735]: I1008 14:29:56.073438 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a928b768-635c-4556-97b9-822f7082aa6c-config-data" (OuterVolumeSpecName: "config-data") pod "a928b768-635c-4556-97b9-822f7082aa6c" (UID: "a928b768-635c-4556-97b9-822f7082aa6c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:29:56 crc kubenswrapper[4735]: I1008 14:29:56.145519 4735 scope.go:117] "RemoveContainer" containerID="6cf7c8376a7c295e6ec3c8eebfadff23d41cd9df908c5b048ad87e34c83c935a" Oct 08 14:29:56 crc kubenswrapper[4735]: E1008 14:29:56.146087 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cf7c8376a7c295e6ec3c8eebfadff23d41cd9df908c5b048ad87e34c83c935a\": container with ID starting with 6cf7c8376a7c295e6ec3c8eebfadff23d41cd9df908c5b048ad87e34c83c935a not found: ID does not exist" containerID="6cf7c8376a7c295e6ec3c8eebfadff23d41cd9df908c5b048ad87e34c83c935a" Oct 08 14:29:56 crc kubenswrapper[4735]: I1008 14:29:56.146140 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cf7c8376a7c295e6ec3c8eebfadff23d41cd9df908c5b048ad87e34c83c935a"} err="failed to get container status \"6cf7c8376a7c295e6ec3c8eebfadff23d41cd9df908c5b048ad87e34c83c935a\": rpc error: code = NotFound desc = could not find container \"6cf7c8376a7c295e6ec3c8eebfadff23d41cd9df908c5b048ad87e34c83c935a\": container with ID starting with 6cf7c8376a7c295e6ec3c8eebfadff23d41cd9df908c5b048ad87e34c83c935a not found: ID does not exist" Oct 08 14:29:56 crc kubenswrapper[4735]: I1008 14:29:56.146158 4735 scope.go:117] "RemoveContainer" containerID="7aa217da6e985cb80829336e0103f6ddd913d61e8c1598796a097328699aa24b" Oct 08 14:29:56 crc kubenswrapper[4735]: E1008 14:29:56.146534 4735 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7aa217da6e985cb80829336e0103f6ddd913d61e8c1598796a097328699aa24b\": container with ID starting with 7aa217da6e985cb80829336e0103f6ddd913d61e8c1598796a097328699aa24b not found: ID does not exist" containerID="7aa217da6e985cb80829336e0103f6ddd913d61e8c1598796a097328699aa24b" Oct 08 14:29:56 crc kubenswrapper[4735]: I1008 14:29:56.146606 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7aa217da6e985cb80829336e0103f6ddd913d61e8c1598796a097328699aa24b"} err="failed to get container status \"7aa217da6e985cb80829336e0103f6ddd913d61e8c1598796a097328699aa24b\": rpc error: code = NotFound desc = could not find container \"7aa217da6e985cb80829336e0103f6ddd913d61e8c1598796a097328699aa24b\": container with ID starting with 7aa217da6e985cb80829336e0103f6ddd913d61e8c1598796a097328699aa24b not found: ID does not exist" Oct 08 14:29:56 crc kubenswrapper[4735]: I1008 14:29:56.151140 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a928b768-635c-4556-97b9-822f7082aa6c-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 14:29:56 crc kubenswrapper[4735]: I1008 14:29:56.151188 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96zd7\" (UniqueName: \"kubernetes.io/projected/a928b768-635c-4556-97b9-822f7082aa6c-kube-api-access-96zd7\") on node \"crc\" DevicePath \"\"" Oct 08 14:29:56 crc kubenswrapper[4735]: I1008 14:29:56.151420 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a928b768-635c-4556-97b9-822f7082aa6c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 14:29:56 crc kubenswrapper[4735]: E1008 14:29:56.273398 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, 
stdout: , stderr: , exit code -1" containerID="2fa31a65f4236942d387bb641d47056e2bd0b9740b19c34b0562754c9f64e1a8" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 08 14:29:56 crc kubenswrapper[4735]: E1008 14:29:56.276341 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2fa31a65f4236942d387bb641d47056e2bd0b9740b19c34b0562754c9f64e1a8" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 08 14:29:56 crc kubenswrapper[4735]: E1008 14:29:56.279229 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2fa31a65f4236942d387bb641d47056e2bd0b9740b19c34b0562754c9f64e1a8" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 08 14:29:56 crc kubenswrapper[4735]: E1008 14:29:56.279312 4735 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="47f88b2f-dd56-4ceb-961a-d7816aef22c2" containerName="nova-scheduler-scheduler" Oct 08 14:29:56 crc kubenswrapper[4735]: I1008 14:29:56.382212 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 14:29:56 crc kubenswrapper[4735]: I1008 14:29:56.391540 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 14:29:56 crc kubenswrapper[4735]: I1008 14:29:56.405647 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 08 14:29:56 crc kubenswrapper[4735]: E1008 14:29:56.406196 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a928b768-635c-4556-97b9-822f7082aa6c" containerName="nova-metadata-metadata" Oct 08 
14:29:56 crc kubenswrapper[4735]: I1008 14:29:56.406229 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="a928b768-635c-4556-97b9-822f7082aa6c" containerName="nova-metadata-metadata" Oct 08 14:29:56 crc kubenswrapper[4735]: E1008 14:29:56.406282 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5112d677-fdd4-4ea0-8b93-fa7871c95519" containerName="nova-manage" Oct 08 14:29:56 crc kubenswrapper[4735]: I1008 14:29:56.406301 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5112d677-fdd4-4ea0-8b93-fa7871c95519" containerName="nova-manage" Oct 08 14:29:56 crc kubenswrapper[4735]: E1008 14:29:56.406347 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a928b768-635c-4556-97b9-822f7082aa6c" containerName="nova-metadata-log" Oct 08 14:29:56 crc kubenswrapper[4735]: I1008 14:29:56.406365 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="a928b768-635c-4556-97b9-822f7082aa6c" containerName="nova-metadata-log" Oct 08 14:29:56 crc kubenswrapper[4735]: I1008 14:29:56.406746 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="a928b768-635c-4556-97b9-822f7082aa6c" containerName="nova-metadata-metadata" Oct 08 14:29:56 crc kubenswrapper[4735]: I1008 14:29:56.406802 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="5112d677-fdd4-4ea0-8b93-fa7871c95519" containerName="nova-manage" Oct 08 14:29:56 crc kubenswrapper[4735]: I1008 14:29:56.406829 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="a928b768-635c-4556-97b9-822f7082aa6c" containerName="nova-metadata-log" Oct 08 14:29:56 crc kubenswrapper[4735]: I1008 14:29:56.409724 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 14:29:56 crc kubenswrapper[4735]: I1008 14:29:56.413963 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 14:29:56 crc kubenswrapper[4735]: I1008 14:29:56.415435 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 08 14:29:56 crc kubenswrapper[4735]: I1008 14:29:56.456395 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slkl2\" (UniqueName: \"kubernetes.io/projected/dd74581a-fd2f-464b-ab89-9a7af5bfa073-kube-api-access-slkl2\") pod \"nova-metadata-0\" (UID: \"dd74581a-fd2f-464b-ab89-9a7af5bfa073\") " pod="openstack/nova-metadata-0" Oct 08 14:29:56 crc kubenswrapper[4735]: I1008 14:29:56.456754 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd74581a-fd2f-464b-ab89-9a7af5bfa073-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"dd74581a-fd2f-464b-ab89-9a7af5bfa073\") " pod="openstack/nova-metadata-0" Oct 08 14:29:56 crc kubenswrapper[4735]: I1008 14:29:56.456891 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd74581a-fd2f-464b-ab89-9a7af5bfa073-logs\") pod \"nova-metadata-0\" (UID: \"dd74581a-fd2f-464b-ab89-9a7af5bfa073\") " pod="openstack/nova-metadata-0" Oct 08 14:29:56 crc kubenswrapper[4735]: I1008 14:29:56.457027 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd74581a-fd2f-464b-ab89-9a7af5bfa073-config-data\") pod \"nova-metadata-0\" (UID: \"dd74581a-fd2f-464b-ab89-9a7af5bfa073\") " pod="openstack/nova-metadata-0" Oct 08 14:29:56 crc kubenswrapper[4735]: I1008 14:29:56.558526 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-slkl2\" (UniqueName: \"kubernetes.io/projected/dd74581a-fd2f-464b-ab89-9a7af5bfa073-kube-api-access-slkl2\") pod \"nova-metadata-0\" (UID: \"dd74581a-fd2f-464b-ab89-9a7af5bfa073\") " pod="openstack/nova-metadata-0" Oct 08 14:29:56 crc kubenswrapper[4735]: I1008 14:29:56.558627 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd74581a-fd2f-464b-ab89-9a7af5bfa073-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"dd74581a-fd2f-464b-ab89-9a7af5bfa073\") " pod="openstack/nova-metadata-0" Oct 08 14:29:56 crc kubenswrapper[4735]: I1008 14:29:56.558664 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd74581a-fd2f-464b-ab89-9a7af5bfa073-logs\") pod \"nova-metadata-0\" (UID: \"dd74581a-fd2f-464b-ab89-9a7af5bfa073\") " pod="openstack/nova-metadata-0" Oct 08 14:29:56 crc kubenswrapper[4735]: I1008 14:29:56.558699 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd74581a-fd2f-464b-ab89-9a7af5bfa073-config-data\") pod \"nova-metadata-0\" (UID: \"dd74581a-fd2f-464b-ab89-9a7af5bfa073\") " pod="openstack/nova-metadata-0" Oct 08 14:29:56 crc kubenswrapper[4735]: I1008 14:29:56.559437 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd74581a-fd2f-464b-ab89-9a7af5bfa073-logs\") pod \"nova-metadata-0\" (UID: \"dd74581a-fd2f-464b-ab89-9a7af5bfa073\") " pod="openstack/nova-metadata-0" Oct 08 14:29:56 crc kubenswrapper[4735]: I1008 14:29:56.562463 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd74581a-fd2f-464b-ab89-9a7af5bfa073-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"dd74581a-fd2f-464b-ab89-9a7af5bfa073\") " pod="openstack/nova-metadata-0" Oct 08 14:29:56 crc kubenswrapper[4735]: I1008 14:29:56.562469 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd74581a-fd2f-464b-ab89-9a7af5bfa073-config-data\") pod \"nova-metadata-0\" (UID: \"dd74581a-fd2f-464b-ab89-9a7af5bfa073\") " pod="openstack/nova-metadata-0" Oct 08 14:29:56 crc kubenswrapper[4735]: I1008 14:29:56.579550 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slkl2\" (UniqueName: \"kubernetes.io/projected/dd74581a-fd2f-464b-ab89-9a7af5bfa073-kube-api-access-slkl2\") pod \"nova-metadata-0\" (UID: \"dd74581a-fd2f-464b-ab89-9a7af5bfa073\") " pod="openstack/nova-metadata-0" Oct 08 14:29:56 crc kubenswrapper[4735]: I1008 14:29:56.733615 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 14:29:56 crc kubenswrapper[4735]: I1008 14:29:56.872539 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 08 14:29:56 crc kubenswrapper[4735]: I1008 14:29:56.907883 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a928b768-635c-4556-97b9-822f7082aa6c" path="/var/lib/kubelet/pods/a928b768-635c-4556-97b9-822f7082aa6c/volumes" Oct 08 14:29:56 crc kubenswrapper[4735]: I1008 14:29:56.967141 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/acf33775-4c29-4087-9e52-a0b1b1da41a0-logs\") pod \"acf33775-4c29-4087-9e52-a0b1b1da41a0\" (UID: \"acf33775-4c29-4087-9e52-a0b1b1da41a0\") " Oct 08 14:29:56 crc kubenswrapper[4735]: I1008 14:29:56.967263 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acf33775-4c29-4087-9e52-a0b1b1da41a0-combined-ca-bundle\") pod \"acf33775-4c29-4087-9e52-a0b1b1da41a0\" (UID: \"acf33775-4c29-4087-9e52-a0b1b1da41a0\") " Oct 08 14:29:56 crc kubenswrapper[4735]: I1008 14:29:56.967306 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acf33775-4c29-4087-9e52-a0b1b1da41a0-config-data\") pod \"acf33775-4c29-4087-9e52-a0b1b1da41a0\" (UID: \"acf33775-4c29-4087-9e52-a0b1b1da41a0\") " Oct 08 14:29:56 crc kubenswrapper[4735]: I1008 14:29:56.967357 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6hz9\" (UniqueName: \"kubernetes.io/projected/acf33775-4c29-4087-9e52-a0b1b1da41a0-kube-api-access-t6hz9\") pod \"acf33775-4c29-4087-9e52-a0b1b1da41a0\" (UID: \"acf33775-4c29-4087-9e52-a0b1b1da41a0\") " Oct 08 14:29:56 crc kubenswrapper[4735]: I1008 14:29:56.967944 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acf33775-4c29-4087-9e52-a0b1b1da41a0-logs" (OuterVolumeSpecName: "logs") pod "acf33775-4c29-4087-9e52-a0b1b1da41a0" (UID: 
"acf33775-4c29-4087-9e52-a0b1b1da41a0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 14:29:56 crc kubenswrapper[4735]: I1008 14:29:56.968039 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/acf33775-4c29-4087-9e52-a0b1b1da41a0-logs\") on node \"crc\" DevicePath \"\"" Oct 08 14:29:56 crc kubenswrapper[4735]: I1008 14:29:56.982842 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acf33775-4c29-4087-9e52-a0b1b1da41a0-kube-api-access-t6hz9" (OuterVolumeSpecName: "kube-api-access-t6hz9") pod "acf33775-4c29-4087-9e52-a0b1b1da41a0" (UID: "acf33775-4c29-4087-9e52-a0b1b1da41a0"). InnerVolumeSpecName "kube-api-access-t6hz9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:29:56 crc kubenswrapper[4735]: I1008 14:29:56.997023 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acf33775-4c29-4087-9e52-a0b1b1da41a0-config-data" (OuterVolumeSpecName: "config-data") pod "acf33775-4c29-4087-9e52-a0b1b1da41a0" (UID: "acf33775-4c29-4087-9e52-a0b1b1da41a0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:29:57 crc kubenswrapper[4735]: I1008 14:29:56.999966 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acf33775-4c29-4087-9e52-a0b1b1da41a0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "acf33775-4c29-4087-9e52-a0b1b1da41a0" (UID: "acf33775-4c29-4087-9e52-a0b1b1da41a0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:29:57 crc kubenswrapper[4735]: I1008 14:29:57.073524 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acf33775-4c29-4087-9e52-a0b1b1da41a0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 14:29:57 crc kubenswrapper[4735]: I1008 14:29:57.073561 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acf33775-4c29-4087-9e52-a0b1b1da41a0-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 14:29:57 crc kubenswrapper[4735]: I1008 14:29:57.073571 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6hz9\" (UniqueName: \"kubernetes.io/projected/acf33775-4c29-4087-9e52-a0b1b1da41a0-kube-api-access-t6hz9\") on node \"crc\" DevicePath \"\"" Oct 08 14:29:57 crc kubenswrapper[4735]: I1008 14:29:57.075048 4735 generic.go:334] "Generic (PLEG): container finished" podID="acf33775-4c29-4087-9e52-a0b1b1da41a0" containerID="d4f36a5ef091462fa2ef7d6d0bb25ab41a44c7f60572a01dbed7700b440842d2" exitCode=0 Oct 08 14:29:57 crc kubenswrapper[4735]: I1008 14:29:57.075099 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 08 14:29:57 crc kubenswrapper[4735]: I1008 14:29:57.075110 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"acf33775-4c29-4087-9e52-a0b1b1da41a0","Type":"ContainerDied","Data":"d4f36a5ef091462fa2ef7d6d0bb25ab41a44c7f60572a01dbed7700b440842d2"} Oct 08 14:29:57 crc kubenswrapper[4735]: I1008 14:29:57.075136 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"acf33775-4c29-4087-9e52-a0b1b1da41a0","Type":"ContainerDied","Data":"e9bcc9acbf9581cd4ee3000ab80d9059daf2c5fbcb792be9bd660179647e18f9"} Oct 08 14:29:57 crc kubenswrapper[4735]: I1008 14:29:57.075152 4735 scope.go:117] "RemoveContainer" containerID="d4f36a5ef091462fa2ef7d6d0bb25ab41a44c7f60572a01dbed7700b440842d2" Oct 08 14:29:57 crc kubenswrapper[4735]: I1008 14:29:57.105526 4735 scope.go:117] "RemoveContainer" containerID="4ffb4bce246c36f0f0cf350b0b56cf06bcb71cb20c2b798eb44450b7edbfda4f" Oct 08 14:29:57 crc kubenswrapper[4735]: I1008 14:29:57.107142 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 08 14:29:57 crc kubenswrapper[4735]: I1008 14:29:57.123258 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 08 14:29:57 crc kubenswrapper[4735]: I1008 14:29:57.135526 4735 scope.go:117] "RemoveContainer" containerID="d4f36a5ef091462fa2ef7d6d0bb25ab41a44c7f60572a01dbed7700b440842d2" Oct 08 14:29:57 crc kubenswrapper[4735]: E1008 14:29:57.136154 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4f36a5ef091462fa2ef7d6d0bb25ab41a44c7f60572a01dbed7700b440842d2\": container with ID starting with d4f36a5ef091462fa2ef7d6d0bb25ab41a44c7f60572a01dbed7700b440842d2 not found: ID does not exist" containerID="d4f36a5ef091462fa2ef7d6d0bb25ab41a44c7f60572a01dbed7700b440842d2" Oct 08 14:29:57 crc kubenswrapper[4735]: I1008 14:29:57.136195 
4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4f36a5ef091462fa2ef7d6d0bb25ab41a44c7f60572a01dbed7700b440842d2"} err="failed to get container status \"d4f36a5ef091462fa2ef7d6d0bb25ab41a44c7f60572a01dbed7700b440842d2\": rpc error: code = NotFound desc = could not find container \"d4f36a5ef091462fa2ef7d6d0bb25ab41a44c7f60572a01dbed7700b440842d2\": container with ID starting with d4f36a5ef091462fa2ef7d6d0bb25ab41a44c7f60572a01dbed7700b440842d2 not found: ID does not exist" Oct 08 14:29:57 crc kubenswrapper[4735]: I1008 14:29:57.136222 4735 scope.go:117] "RemoveContainer" containerID="4ffb4bce246c36f0f0cf350b0b56cf06bcb71cb20c2b798eb44450b7edbfda4f" Oct 08 14:29:57 crc kubenswrapper[4735]: E1008 14:29:57.136542 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ffb4bce246c36f0f0cf350b0b56cf06bcb71cb20c2b798eb44450b7edbfda4f\": container with ID starting with 4ffb4bce246c36f0f0cf350b0b56cf06bcb71cb20c2b798eb44450b7edbfda4f not found: ID does not exist" containerID="4ffb4bce246c36f0f0cf350b0b56cf06bcb71cb20c2b798eb44450b7edbfda4f" Oct 08 14:29:57 crc kubenswrapper[4735]: I1008 14:29:57.136572 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ffb4bce246c36f0f0cf350b0b56cf06bcb71cb20c2b798eb44450b7edbfda4f"} err="failed to get container status \"4ffb4bce246c36f0f0cf350b0b56cf06bcb71cb20c2b798eb44450b7edbfda4f\": rpc error: code = NotFound desc = could not find container \"4ffb4bce246c36f0f0cf350b0b56cf06bcb71cb20c2b798eb44450b7edbfda4f\": container with ID starting with 4ffb4bce246c36f0f0cf350b0b56cf06bcb71cb20c2b798eb44450b7edbfda4f not found: ID does not exist" Oct 08 14:29:57 crc kubenswrapper[4735]: I1008 14:29:57.136804 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 08 14:29:57 crc kubenswrapper[4735]: E1008 14:29:57.137234 4735 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="acf33775-4c29-4087-9e52-a0b1b1da41a0" containerName="nova-api-log" Oct 08 14:29:57 crc kubenswrapper[4735]: I1008 14:29:57.137256 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="acf33775-4c29-4087-9e52-a0b1b1da41a0" containerName="nova-api-log" Oct 08 14:29:57 crc kubenswrapper[4735]: E1008 14:29:57.137276 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acf33775-4c29-4087-9e52-a0b1b1da41a0" containerName="nova-api-api" Oct 08 14:29:57 crc kubenswrapper[4735]: I1008 14:29:57.137284 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="acf33775-4c29-4087-9e52-a0b1b1da41a0" containerName="nova-api-api" Oct 08 14:29:57 crc kubenswrapper[4735]: I1008 14:29:57.137541 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="acf33775-4c29-4087-9e52-a0b1b1da41a0" containerName="nova-api-api" Oct 08 14:29:57 crc kubenswrapper[4735]: I1008 14:29:57.137575 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="acf33775-4c29-4087-9e52-a0b1b1da41a0" containerName="nova-api-log" Oct 08 14:29:57 crc kubenswrapper[4735]: I1008 14:29:57.138690 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 08 14:29:57 crc kubenswrapper[4735]: I1008 14:29:57.140936 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 08 14:29:57 crc kubenswrapper[4735]: I1008 14:29:57.147440 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 08 14:29:57 crc kubenswrapper[4735]: I1008 14:29:57.177340 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61c0fe9a-51c8-480e-bcc6-2ade6e36890e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"61c0fe9a-51c8-480e-bcc6-2ade6e36890e\") " pod="openstack/nova-api-0" Oct 08 14:29:57 crc kubenswrapper[4735]: I1008 14:29:57.177427 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvhqm\" (UniqueName: \"kubernetes.io/projected/61c0fe9a-51c8-480e-bcc6-2ade6e36890e-kube-api-access-kvhqm\") pod \"nova-api-0\" (UID: \"61c0fe9a-51c8-480e-bcc6-2ade6e36890e\") " pod="openstack/nova-api-0" Oct 08 14:29:57 crc kubenswrapper[4735]: I1008 14:29:57.177538 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61c0fe9a-51c8-480e-bcc6-2ade6e36890e-logs\") pod \"nova-api-0\" (UID: \"61c0fe9a-51c8-480e-bcc6-2ade6e36890e\") " pod="openstack/nova-api-0" Oct 08 14:29:57 crc kubenswrapper[4735]: I1008 14:29:57.177643 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61c0fe9a-51c8-480e-bcc6-2ade6e36890e-config-data\") pod \"nova-api-0\" (UID: \"61c0fe9a-51c8-480e-bcc6-2ade6e36890e\") " pod="openstack/nova-api-0" Oct 08 14:29:57 crc kubenswrapper[4735]: I1008 14:29:57.181798 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 
14:29:57 crc kubenswrapper[4735]: W1008 14:29:57.183241 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd74581a_fd2f_464b_ab89_9a7af5bfa073.slice/crio-e9b181c6154f70869dfb9eaddbea750568c4220895a406aecf7b6e5ee7896bf2 WatchSource:0}: Error finding container e9b181c6154f70869dfb9eaddbea750568c4220895a406aecf7b6e5ee7896bf2: Status 404 returned error can't find the container with id e9b181c6154f70869dfb9eaddbea750568c4220895a406aecf7b6e5ee7896bf2 Oct 08 14:29:57 crc kubenswrapper[4735]: I1008 14:29:57.279748 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61c0fe9a-51c8-480e-bcc6-2ade6e36890e-logs\") pod \"nova-api-0\" (UID: \"61c0fe9a-51c8-480e-bcc6-2ade6e36890e\") " pod="openstack/nova-api-0" Oct 08 14:29:57 crc kubenswrapper[4735]: I1008 14:29:57.279861 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61c0fe9a-51c8-480e-bcc6-2ade6e36890e-config-data\") pod \"nova-api-0\" (UID: \"61c0fe9a-51c8-480e-bcc6-2ade6e36890e\") " pod="openstack/nova-api-0" Oct 08 14:29:57 crc kubenswrapper[4735]: I1008 14:29:57.279889 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61c0fe9a-51c8-480e-bcc6-2ade6e36890e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"61c0fe9a-51c8-480e-bcc6-2ade6e36890e\") " pod="openstack/nova-api-0" Oct 08 14:29:57 crc kubenswrapper[4735]: I1008 14:29:57.279909 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvhqm\" (UniqueName: \"kubernetes.io/projected/61c0fe9a-51c8-480e-bcc6-2ade6e36890e-kube-api-access-kvhqm\") pod \"nova-api-0\" (UID: \"61c0fe9a-51c8-480e-bcc6-2ade6e36890e\") " pod="openstack/nova-api-0" Oct 08 14:29:57 crc kubenswrapper[4735]: I1008 14:29:57.280161 
4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61c0fe9a-51c8-480e-bcc6-2ade6e36890e-logs\") pod \"nova-api-0\" (UID: \"61c0fe9a-51c8-480e-bcc6-2ade6e36890e\") " pod="openstack/nova-api-0" Oct 08 14:29:57 crc kubenswrapper[4735]: I1008 14:29:57.283691 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61c0fe9a-51c8-480e-bcc6-2ade6e36890e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"61c0fe9a-51c8-480e-bcc6-2ade6e36890e\") " pod="openstack/nova-api-0" Oct 08 14:29:57 crc kubenswrapper[4735]: I1008 14:29:57.283806 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61c0fe9a-51c8-480e-bcc6-2ade6e36890e-config-data\") pod \"nova-api-0\" (UID: \"61c0fe9a-51c8-480e-bcc6-2ade6e36890e\") " pod="openstack/nova-api-0" Oct 08 14:29:57 crc kubenswrapper[4735]: I1008 14:29:57.298018 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvhqm\" (UniqueName: \"kubernetes.io/projected/61c0fe9a-51c8-480e-bcc6-2ade6e36890e-kube-api-access-kvhqm\") pod \"nova-api-0\" (UID: \"61c0fe9a-51c8-480e-bcc6-2ade6e36890e\") " pod="openstack/nova-api-0" Oct 08 14:29:57 crc kubenswrapper[4735]: I1008 14:29:57.468612 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 08 14:29:57 crc kubenswrapper[4735]: I1008 14:29:57.915352 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 08 14:29:58 crc kubenswrapper[4735]: I1008 14:29:58.090312 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"61c0fe9a-51c8-480e-bcc6-2ade6e36890e","Type":"ContainerStarted","Data":"f3d0e19decdda44db132cf3a302c4b2f03cd97f4945e803469db93627694a5a6"} Oct 08 14:29:58 crc kubenswrapper[4735]: I1008 14:29:58.096838 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dd74581a-fd2f-464b-ab89-9a7af5bfa073","Type":"ContainerStarted","Data":"505dee2ebead61c62f6a9b4c56e795f6076a57ce69ecf346a6086b822c595a60"} Oct 08 14:29:58 crc kubenswrapper[4735]: I1008 14:29:58.096893 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dd74581a-fd2f-464b-ab89-9a7af5bfa073","Type":"ContainerStarted","Data":"da3ec38bf3b245be383933aa838dc187fcb6c6aceda4a8f0fd9e6648c90b5319"} Oct 08 14:29:58 crc kubenswrapper[4735]: I1008 14:29:58.096903 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dd74581a-fd2f-464b-ab89-9a7af5bfa073","Type":"ContainerStarted","Data":"e9b181c6154f70869dfb9eaddbea750568c4220895a406aecf7b6e5ee7896bf2"} Oct 08 14:29:58 crc kubenswrapper[4735]: I1008 14:29:58.122610 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.122588522 podStartE2EDuration="2.122588522s" podCreationTimestamp="2025-10-08 14:29:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:29:58.114103335 +0000 UTC m=+5647.922407879" watchObservedRunningTime="2025-10-08 14:29:58.122588522 +0000 UTC m=+5647.930893066" Oct 08 14:29:58 crc kubenswrapper[4735]: 
I1008 14:29:58.391531 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 14:29:58 crc kubenswrapper[4735]: I1008 14:29:58.502384 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47f88b2f-dd56-4ceb-961a-d7816aef22c2-config-data\") pod \"47f88b2f-dd56-4ceb-961a-d7816aef22c2\" (UID: \"47f88b2f-dd56-4ceb-961a-d7816aef22c2\") " Oct 08 14:29:58 crc kubenswrapper[4735]: I1008 14:29:58.502785 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47f88b2f-dd56-4ceb-961a-d7816aef22c2-combined-ca-bundle\") pod \"47f88b2f-dd56-4ceb-961a-d7816aef22c2\" (UID: \"47f88b2f-dd56-4ceb-961a-d7816aef22c2\") " Oct 08 14:29:58 crc kubenswrapper[4735]: I1008 14:29:58.502922 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24l2d\" (UniqueName: \"kubernetes.io/projected/47f88b2f-dd56-4ceb-961a-d7816aef22c2-kube-api-access-24l2d\") pod \"47f88b2f-dd56-4ceb-961a-d7816aef22c2\" (UID: \"47f88b2f-dd56-4ceb-961a-d7816aef22c2\") " Oct 08 14:29:58 crc kubenswrapper[4735]: I1008 14:29:58.505505 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47f88b2f-dd56-4ceb-961a-d7816aef22c2-kube-api-access-24l2d" (OuterVolumeSpecName: "kube-api-access-24l2d") pod "47f88b2f-dd56-4ceb-961a-d7816aef22c2" (UID: "47f88b2f-dd56-4ceb-961a-d7816aef22c2"). InnerVolumeSpecName "kube-api-access-24l2d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:29:58 crc kubenswrapper[4735]: E1008 14:29:58.526792 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47f88b2f-dd56-4ceb-961a-d7816aef22c2-config-data podName:47f88b2f-dd56-4ceb-961a-d7816aef22c2 nodeName:}" failed. 
No retries permitted until 2025-10-08 14:29:59.026758201 +0000 UTC m=+5648.835062745 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "config-data" (UniqueName: "kubernetes.io/secret/47f88b2f-dd56-4ceb-961a-d7816aef22c2-config-data") pod "47f88b2f-dd56-4ceb-961a-d7816aef22c2" (UID: "47f88b2f-dd56-4ceb-961a-d7816aef22c2") : error deleting /var/lib/kubelet/pods/47f88b2f-dd56-4ceb-961a-d7816aef22c2/volume-subpaths: remove /var/lib/kubelet/pods/47f88b2f-dd56-4ceb-961a-d7816aef22c2/volume-subpaths: no such file or directory Oct 08 14:29:58 crc kubenswrapper[4735]: I1008 14:29:58.529791 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47f88b2f-dd56-4ceb-961a-d7816aef22c2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "47f88b2f-dd56-4ceb-961a-d7816aef22c2" (UID: "47f88b2f-dd56-4ceb-961a-d7816aef22c2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:29:58 crc kubenswrapper[4735]: I1008 14:29:58.604424 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24l2d\" (UniqueName: \"kubernetes.io/projected/47f88b2f-dd56-4ceb-961a-d7816aef22c2-kube-api-access-24l2d\") on node \"crc\" DevicePath \"\"" Oct 08 14:29:58 crc kubenswrapper[4735]: I1008 14:29:58.604455 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47f88b2f-dd56-4ceb-961a-d7816aef22c2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 14:29:58 crc kubenswrapper[4735]: I1008 14:29:58.908750 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acf33775-4c29-4087-9e52-a0b1b1da41a0" path="/var/lib/kubelet/pods/acf33775-4c29-4087-9e52-a0b1b1da41a0/volumes" Oct 08 14:29:59 crc kubenswrapper[4735]: I1008 14:29:59.106607 4735 generic.go:334] "Generic (PLEG): container finished" podID="47f88b2f-dd56-4ceb-961a-d7816aef22c2" 
containerID="2fa31a65f4236942d387bb641d47056e2bd0b9740b19c34b0562754c9f64e1a8" exitCode=0 Oct 08 14:29:59 crc kubenswrapper[4735]: I1008 14:29:59.106657 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 14:29:59 crc kubenswrapper[4735]: I1008 14:29:59.106699 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"47f88b2f-dd56-4ceb-961a-d7816aef22c2","Type":"ContainerDied","Data":"2fa31a65f4236942d387bb641d47056e2bd0b9740b19c34b0562754c9f64e1a8"} Oct 08 14:29:59 crc kubenswrapper[4735]: I1008 14:29:59.106779 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"47f88b2f-dd56-4ceb-961a-d7816aef22c2","Type":"ContainerDied","Data":"a3e196768d8169c894ae791fc5ad86d4573b3d060cc14ca563fe45b6b3bba961"} Oct 08 14:29:59 crc kubenswrapper[4735]: I1008 14:29:59.106803 4735 scope.go:117] "RemoveContainer" containerID="2fa31a65f4236942d387bb641d47056e2bd0b9740b19c34b0562754c9f64e1a8" Oct 08 14:29:59 crc kubenswrapper[4735]: I1008 14:29:59.108889 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"61c0fe9a-51c8-480e-bcc6-2ade6e36890e","Type":"ContainerStarted","Data":"d429c139b5d3f6c7679a3ef1d46c9ab5fa50c16c71cf50744808c1cd36391262"} Oct 08 14:29:59 crc kubenswrapper[4735]: I1008 14:29:59.108924 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"61c0fe9a-51c8-480e-bcc6-2ade6e36890e","Type":"ContainerStarted","Data":"03fc76a902d3e6b1a3bff19dba3564242f38f6df73747bb40ec88b3cc530b4c9"} Oct 08 14:29:59 crc kubenswrapper[4735]: I1008 14:29:59.114128 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47f88b2f-dd56-4ceb-961a-d7816aef22c2-config-data\") pod \"47f88b2f-dd56-4ceb-961a-d7816aef22c2\" (UID: \"47f88b2f-dd56-4ceb-961a-d7816aef22c2\") " Oct 08 
14:29:59 crc kubenswrapper[4735]: I1008 14:29:59.118621 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47f88b2f-dd56-4ceb-961a-d7816aef22c2-config-data" (OuterVolumeSpecName: "config-data") pod "47f88b2f-dd56-4ceb-961a-d7816aef22c2" (UID: "47f88b2f-dd56-4ceb-961a-d7816aef22c2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:29:59 crc kubenswrapper[4735]: I1008 14:29:59.131521 4735 scope.go:117] "RemoveContainer" containerID="2fa31a65f4236942d387bb641d47056e2bd0b9740b19c34b0562754c9f64e1a8" Oct 08 14:29:59 crc kubenswrapper[4735]: E1008 14:29:59.132298 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fa31a65f4236942d387bb641d47056e2bd0b9740b19c34b0562754c9f64e1a8\": container with ID starting with 2fa31a65f4236942d387bb641d47056e2bd0b9740b19c34b0562754c9f64e1a8 not found: ID does not exist" containerID="2fa31a65f4236942d387bb641d47056e2bd0b9740b19c34b0562754c9f64e1a8" Oct 08 14:29:59 crc kubenswrapper[4735]: I1008 14:29:59.132334 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fa31a65f4236942d387bb641d47056e2bd0b9740b19c34b0562754c9f64e1a8"} err="failed to get container status \"2fa31a65f4236942d387bb641d47056e2bd0b9740b19c34b0562754c9f64e1a8\": rpc error: code = NotFound desc = could not find container \"2fa31a65f4236942d387bb641d47056e2bd0b9740b19c34b0562754c9f64e1a8\": container with ID starting with 2fa31a65f4236942d387bb641d47056e2bd0b9740b19c34b0562754c9f64e1a8 not found: ID does not exist" Oct 08 14:29:59 crc kubenswrapper[4735]: I1008 14:29:59.136696 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.13667758 podStartE2EDuration="2.13667758s" podCreationTimestamp="2025-10-08 14:29:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:29:59.126455378 +0000 UTC m=+5648.934759942" watchObservedRunningTime="2025-10-08 14:29:59.13667758 +0000 UTC m=+5648.944982124" Oct 08 14:29:59 crc kubenswrapper[4735]: I1008 14:29:59.217849 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47f88b2f-dd56-4ceb-961a-d7816aef22c2-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 14:29:59 crc kubenswrapper[4735]: I1008 14:29:59.450777 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 14:29:59 crc kubenswrapper[4735]: I1008 14:29:59.472072 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 14:29:59 crc kubenswrapper[4735]: I1008 14:29:59.484256 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 14:29:59 crc kubenswrapper[4735]: E1008 14:29:59.485023 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47f88b2f-dd56-4ceb-961a-d7816aef22c2" containerName="nova-scheduler-scheduler" Oct 08 14:29:59 crc kubenswrapper[4735]: I1008 14:29:59.485044 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="47f88b2f-dd56-4ceb-961a-d7816aef22c2" containerName="nova-scheduler-scheduler" Oct 08 14:29:59 crc kubenswrapper[4735]: I1008 14:29:59.485249 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="47f88b2f-dd56-4ceb-961a-d7816aef22c2" containerName="nova-scheduler-scheduler" Oct 08 14:29:59 crc kubenswrapper[4735]: I1008 14:29:59.485995 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 14:29:59 crc kubenswrapper[4735]: I1008 14:29:59.498030 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 08 14:29:59 crc kubenswrapper[4735]: I1008 14:29:59.499418 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 14:29:59 crc kubenswrapper[4735]: I1008 14:29:59.523553 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnm9f\" (UniqueName: \"kubernetes.io/projected/112e1c50-8472-4502-b6f0-a3896dd1cbf1-kube-api-access-qnm9f\") pod \"nova-scheduler-0\" (UID: \"112e1c50-8472-4502-b6f0-a3896dd1cbf1\") " pod="openstack/nova-scheduler-0" Oct 08 14:29:59 crc kubenswrapper[4735]: I1008 14:29:59.523647 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/112e1c50-8472-4502-b6f0-a3896dd1cbf1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"112e1c50-8472-4502-b6f0-a3896dd1cbf1\") " pod="openstack/nova-scheduler-0" Oct 08 14:29:59 crc kubenswrapper[4735]: I1008 14:29:59.523675 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/112e1c50-8472-4502-b6f0-a3896dd1cbf1-config-data\") pod \"nova-scheduler-0\" (UID: \"112e1c50-8472-4502-b6f0-a3896dd1cbf1\") " pod="openstack/nova-scheduler-0" Oct 08 14:29:59 crc kubenswrapper[4735]: I1008 14:29:59.625186 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnm9f\" (UniqueName: \"kubernetes.io/projected/112e1c50-8472-4502-b6f0-a3896dd1cbf1-kube-api-access-qnm9f\") pod \"nova-scheduler-0\" (UID: \"112e1c50-8472-4502-b6f0-a3896dd1cbf1\") " pod="openstack/nova-scheduler-0" Oct 08 14:29:59 crc kubenswrapper[4735]: I1008 14:29:59.625260 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/112e1c50-8472-4502-b6f0-a3896dd1cbf1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"112e1c50-8472-4502-b6f0-a3896dd1cbf1\") " pod="openstack/nova-scheduler-0" Oct 08 14:29:59 crc kubenswrapper[4735]: I1008 14:29:59.625279 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/112e1c50-8472-4502-b6f0-a3896dd1cbf1-config-data\") pod \"nova-scheduler-0\" (UID: \"112e1c50-8472-4502-b6f0-a3896dd1cbf1\") " pod="openstack/nova-scheduler-0" Oct 08 14:29:59 crc kubenswrapper[4735]: I1008 14:29:59.631105 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/112e1c50-8472-4502-b6f0-a3896dd1cbf1-config-data\") pod \"nova-scheduler-0\" (UID: \"112e1c50-8472-4502-b6f0-a3896dd1cbf1\") " pod="openstack/nova-scheduler-0" Oct 08 14:29:59 crc kubenswrapper[4735]: I1008 14:29:59.637211 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/112e1c50-8472-4502-b6f0-a3896dd1cbf1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"112e1c50-8472-4502-b6f0-a3896dd1cbf1\") " pod="openstack/nova-scheduler-0" Oct 08 14:29:59 crc kubenswrapper[4735]: I1008 14:29:59.643703 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnm9f\" (UniqueName: \"kubernetes.io/projected/112e1c50-8472-4502-b6f0-a3896dd1cbf1-kube-api-access-qnm9f\") pod \"nova-scheduler-0\" (UID: \"112e1c50-8472-4502-b6f0-a3896dd1cbf1\") " pod="openstack/nova-scheduler-0" Oct 08 14:29:59 crc kubenswrapper[4735]: I1008 14:29:59.813683 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 14:30:00 crc kubenswrapper[4735]: I1008 14:30:00.136869 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332230-cb624"] Oct 08 14:30:00 crc kubenswrapper[4735]: I1008 14:30:00.139028 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332230-cb624" Oct 08 14:30:00 crc kubenswrapper[4735]: I1008 14:30:00.146620 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332230-cb624"] Oct 08 14:30:00 crc kubenswrapper[4735]: I1008 14:30:00.149138 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 08 14:30:00 crc kubenswrapper[4735]: I1008 14:30:00.149962 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 08 14:30:00 crc kubenswrapper[4735]: I1008 14:30:00.241090 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/619c1be9-6e51-4001-911d-8f706514c9ab-secret-volume\") pod \"collect-profiles-29332230-cb624\" (UID: \"619c1be9-6e51-4001-911d-8f706514c9ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332230-cb624" Oct 08 14:30:00 crc kubenswrapper[4735]: I1008 14:30:00.241386 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9t2s\" (UniqueName: \"kubernetes.io/projected/619c1be9-6e51-4001-911d-8f706514c9ab-kube-api-access-x9t2s\") pod \"collect-profiles-29332230-cb624\" (UID: \"619c1be9-6e51-4001-911d-8f706514c9ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332230-cb624" Oct 08 14:30:00 crc kubenswrapper[4735]: I1008 
14:30:00.241420 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/619c1be9-6e51-4001-911d-8f706514c9ab-config-volume\") pod \"collect-profiles-29332230-cb624\" (UID: \"619c1be9-6e51-4001-911d-8f706514c9ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332230-cb624" Oct 08 14:30:00 crc kubenswrapper[4735]: I1008 14:30:00.255306 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 14:30:00 crc kubenswrapper[4735]: W1008 14:30:00.259146 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod112e1c50_8472_4502_b6f0_a3896dd1cbf1.slice/crio-4c09fbd06cac93171a3bb67cef011e68afe39cca9e00a79be6b57faa5bffc5d3 WatchSource:0}: Error finding container 4c09fbd06cac93171a3bb67cef011e68afe39cca9e00a79be6b57faa5bffc5d3: Status 404 returned error can't find the container with id 4c09fbd06cac93171a3bb67cef011e68afe39cca9e00a79be6b57faa5bffc5d3 Oct 08 14:30:00 crc kubenswrapper[4735]: I1008 14:30:00.343407 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9t2s\" (UniqueName: \"kubernetes.io/projected/619c1be9-6e51-4001-911d-8f706514c9ab-kube-api-access-x9t2s\") pod \"collect-profiles-29332230-cb624\" (UID: \"619c1be9-6e51-4001-911d-8f706514c9ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332230-cb624" Oct 08 14:30:00 crc kubenswrapper[4735]: I1008 14:30:00.343463 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/619c1be9-6e51-4001-911d-8f706514c9ab-config-volume\") pod \"collect-profiles-29332230-cb624\" (UID: \"619c1be9-6e51-4001-911d-8f706514c9ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332230-cb624" Oct 08 14:30:00 crc kubenswrapper[4735]: I1008 
14:30:00.343529 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/619c1be9-6e51-4001-911d-8f706514c9ab-secret-volume\") pod \"collect-profiles-29332230-cb624\" (UID: \"619c1be9-6e51-4001-911d-8f706514c9ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332230-cb624" Oct 08 14:30:00 crc kubenswrapper[4735]: I1008 14:30:00.344747 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/619c1be9-6e51-4001-911d-8f706514c9ab-config-volume\") pod \"collect-profiles-29332230-cb624\" (UID: \"619c1be9-6e51-4001-911d-8f706514c9ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332230-cb624" Oct 08 14:30:00 crc kubenswrapper[4735]: I1008 14:30:00.349590 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/619c1be9-6e51-4001-911d-8f706514c9ab-secret-volume\") pod \"collect-profiles-29332230-cb624\" (UID: \"619c1be9-6e51-4001-911d-8f706514c9ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332230-cb624" Oct 08 14:30:00 crc kubenswrapper[4735]: I1008 14:30:00.361543 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9t2s\" (UniqueName: \"kubernetes.io/projected/619c1be9-6e51-4001-911d-8f706514c9ab-kube-api-access-x9t2s\") pod \"collect-profiles-29332230-cb624\" (UID: \"619c1be9-6e51-4001-911d-8f706514c9ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332230-cb624" Oct 08 14:30:00 crc kubenswrapper[4735]: I1008 14:30:00.470812 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332230-cb624" Oct 08 14:30:00 crc kubenswrapper[4735]: I1008 14:30:00.701769 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332230-cb624"] Oct 08 14:30:00 crc kubenswrapper[4735]: W1008 14:30:00.711115 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod619c1be9_6e51_4001_911d_8f706514c9ab.slice/crio-aa0d24c8ba22ec271ddb8aa7f8890ef0ea651487d3efa282dbb8fa299e719c87 WatchSource:0}: Error finding container aa0d24c8ba22ec271ddb8aa7f8890ef0ea651487d3efa282dbb8fa299e719c87: Status 404 returned error can't find the container with id aa0d24c8ba22ec271ddb8aa7f8890ef0ea651487d3efa282dbb8fa299e719c87 Oct 08 14:30:00 crc kubenswrapper[4735]: I1008 14:30:00.907701 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47f88b2f-dd56-4ceb-961a-d7816aef22c2" path="/var/lib/kubelet/pods/47f88b2f-dd56-4ceb-961a-d7816aef22c2/volumes" Oct 08 14:30:01 crc kubenswrapper[4735]: I1008 14:30:01.135757 4735 generic.go:334] "Generic (PLEG): container finished" podID="619c1be9-6e51-4001-911d-8f706514c9ab" containerID="cb74aa4282f4417baa1946a3cd3ff2771bbce1f851aff0996ae8603f35013977" exitCode=0 Oct 08 14:30:01 crc kubenswrapper[4735]: I1008 14:30:01.135842 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332230-cb624" event={"ID":"619c1be9-6e51-4001-911d-8f706514c9ab","Type":"ContainerDied","Data":"cb74aa4282f4417baa1946a3cd3ff2771bbce1f851aff0996ae8603f35013977"} Oct 08 14:30:01 crc kubenswrapper[4735]: I1008 14:30:01.135867 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332230-cb624" 
event={"ID":"619c1be9-6e51-4001-911d-8f706514c9ab","Type":"ContainerStarted","Data":"aa0d24c8ba22ec271ddb8aa7f8890ef0ea651487d3efa282dbb8fa299e719c87"} Oct 08 14:30:01 crc kubenswrapper[4735]: I1008 14:30:01.140932 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"112e1c50-8472-4502-b6f0-a3896dd1cbf1","Type":"ContainerStarted","Data":"298137c915b8b2a1e9ee074e5de91a06c04946fe5cccb1de050e4339d69164b7"} Oct 08 14:30:01 crc kubenswrapper[4735]: I1008 14:30:01.140979 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"112e1c50-8472-4502-b6f0-a3896dd1cbf1","Type":"ContainerStarted","Data":"4c09fbd06cac93171a3bb67cef011e68afe39cca9e00a79be6b57faa5bffc5d3"} Oct 08 14:30:01 crc kubenswrapper[4735]: I1008 14:30:01.170320 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.170301597 podStartE2EDuration="2.170301597s" podCreationTimestamp="2025-10-08 14:29:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:30:01.168519581 +0000 UTC m=+5650.976824125" watchObservedRunningTime="2025-10-08 14:30:01.170301597 +0000 UTC m=+5650.978606141" Oct 08 14:30:01 crc kubenswrapper[4735]: I1008 14:30:01.734225 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 08 14:30:01 crc kubenswrapper[4735]: I1008 14:30:01.734528 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 08 14:30:02 crc kubenswrapper[4735]: I1008 14:30:02.461242 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332230-cb624" Oct 08 14:30:02 crc kubenswrapper[4735]: I1008 14:30:02.490865 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/619c1be9-6e51-4001-911d-8f706514c9ab-secret-volume\") pod \"619c1be9-6e51-4001-911d-8f706514c9ab\" (UID: \"619c1be9-6e51-4001-911d-8f706514c9ab\") " Oct 08 14:30:02 crc kubenswrapper[4735]: I1008 14:30:02.496273 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/619c1be9-6e51-4001-911d-8f706514c9ab-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "619c1be9-6e51-4001-911d-8f706514c9ab" (UID: "619c1be9-6e51-4001-911d-8f706514c9ab"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:30:02 crc kubenswrapper[4735]: I1008 14:30:02.592823 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9t2s\" (UniqueName: \"kubernetes.io/projected/619c1be9-6e51-4001-911d-8f706514c9ab-kube-api-access-x9t2s\") pod \"619c1be9-6e51-4001-911d-8f706514c9ab\" (UID: \"619c1be9-6e51-4001-911d-8f706514c9ab\") " Oct 08 14:30:02 crc kubenswrapper[4735]: I1008 14:30:02.592885 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/619c1be9-6e51-4001-911d-8f706514c9ab-config-volume\") pod \"619c1be9-6e51-4001-911d-8f706514c9ab\" (UID: \"619c1be9-6e51-4001-911d-8f706514c9ab\") " Oct 08 14:30:02 crc kubenswrapper[4735]: I1008 14:30:02.593288 4735 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/619c1be9-6e51-4001-911d-8f706514c9ab-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 08 14:30:02 crc kubenswrapper[4735]: I1008 14:30:02.593739 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/configmap/619c1be9-6e51-4001-911d-8f706514c9ab-config-volume" (OuterVolumeSpecName: "config-volume") pod "619c1be9-6e51-4001-911d-8f706514c9ab" (UID: "619c1be9-6e51-4001-911d-8f706514c9ab"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:30:02 crc kubenswrapper[4735]: I1008 14:30:02.597231 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/619c1be9-6e51-4001-911d-8f706514c9ab-kube-api-access-x9t2s" (OuterVolumeSpecName: "kube-api-access-x9t2s") pod "619c1be9-6e51-4001-911d-8f706514c9ab" (UID: "619c1be9-6e51-4001-911d-8f706514c9ab"). InnerVolumeSpecName "kube-api-access-x9t2s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:30:02 crc kubenswrapper[4735]: I1008 14:30:02.694643 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9t2s\" (UniqueName: \"kubernetes.io/projected/619c1be9-6e51-4001-911d-8f706514c9ab-kube-api-access-x9t2s\") on node \"crc\" DevicePath \"\"" Oct 08 14:30:02 crc kubenswrapper[4735]: I1008 14:30:02.694679 4735 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/619c1be9-6e51-4001-911d-8f706514c9ab-config-volume\") on node \"crc\" DevicePath \"\"" Oct 08 14:30:03 crc kubenswrapper[4735]: I1008 14:30:03.178541 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332230-cb624" event={"ID":"619c1be9-6e51-4001-911d-8f706514c9ab","Type":"ContainerDied","Data":"aa0d24c8ba22ec271ddb8aa7f8890ef0ea651487d3efa282dbb8fa299e719c87"} Oct 08 14:30:03 crc kubenswrapper[4735]: I1008 14:30:03.178590 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa0d24c8ba22ec271ddb8aa7f8890ef0ea651487d3efa282dbb8fa299e719c87" Oct 08 14:30:03 crc kubenswrapper[4735]: I1008 14:30:03.178645 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332230-cb624" Oct 08 14:30:03 crc kubenswrapper[4735]: I1008 14:30:03.545132 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332185-8q22q"] Oct 08 14:30:03 crc kubenswrapper[4735]: I1008 14:30:03.556957 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332185-8q22q"] Oct 08 14:30:04 crc kubenswrapper[4735]: I1008 14:30:04.814099 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 08 14:30:04 crc kubenswrapper[4735]: I1008 14:30:04.907560 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29d927ff-0031-4725-a576-01a9f4cf5fa3" path="/var/lib/kubelet/pods/29d927ff-0031-4725-a576-01a9f4cf5fa3/volumes" Oct 08 14:30:06 crc kubenswrapper[4735]: I1008 14:30:06.734125 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 08 14:30:06 crc kubenswrapper[4735]: I1008 14:30:06.734470 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 08 14:30:07 crc kubenswrapper[4735]: I1008 14:30:07.469761 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 08 14:30:07 crc kubenswrapper[4735]: I1008 14:30:07.469825 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 08 14:30:07 crc kubenswrapper[4735]: I1008 14:30:07.816796 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="dd74581a-fd2f-464b-ab89-9a7af5bfa073" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.69:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 08 14:30:07 crc kubenswrapper[4735]: I1008 
14:30:07.817658 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="dd74581a-fd2f-464b-ab89-9a7af5bfa073" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.69:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 08 14:30:08 crc kubenswrapper[4735]: I1008 14:30:08.551716 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="61c0fe9a-51c8-480e-bcc6-2ade6e36890e" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.70:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 08 14:30:08 crc kubenswrapper[4735]: I1008 14:30:08.551713 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="61c0fe9a-51c8-480e-bcc6-2ade6e36890e" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.70:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 08 14:30:08 crc kubenswrapper[4735]: I1008 14:30:08.663427 4735 scope.go:117] "RemoveContainer" containerID="d0eed91976ed4bbff74fdaf27b347df9f7c2448c4ddefb25b4adb99d96d9e574" Oct 08 14:30:09 crc kubenswrapper[4735]: I1008 14:30:09.814982 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 08 14:30:09 crc kubenswrapper[4735]: I1008 14:30:09.846949 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 08 14:30:10 crc kubenswrapper[4735]: I1008 14:30:10.279969 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 08 14:30:16 crc kubenswrapper[4735]: I1008 14:30:16.735918 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 08 14:30:16 crc kubenswrapper[4735]: I1008 14:30:16.737469 4735 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 08 14:30:16 crc kubenswrapper[4735]: I1008 14:30:16.739977 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 08 14:30:16 crc kubenswrapper[4735]: I1008 14:30:16.740221 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 08 14:30:17 crc kubenswrapper[4735]: I1008 14:30:17.472801 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 08 14:30:17 crc kubenswrapper[4735]: I1008 14:30:17.473293 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 08 14:30:17 crc kubenswrapper[4735]: I1008 14:30:17.473463 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 08 14:30:17 crc kubenswrapper[4735]: I1008 14:30:17.476758 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 08 14:30:18 crc kubenswrapper[4735]: I1008 14:30:18.318087 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 08 14:30:18 crc kubenswrapper[4735]: I1008 14:30:18.321508 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 08 14:30:18 crc kubenswrapper[4735]: I1008 14:30:18.475542 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54ffb5d785-9mlw8"] Oct 08 14:30:18 crc kubenswrapper[4735]: E1008 14:30:18.476016 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="619c1be9-6e51-4001-911d-8f706514c9ab" containerName="collect-profiles" Oct 08 14:30:18 crc kubenswrapper[4735]: I1008 14:30:18.476033 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="619c1be9-6e51-4001-911d-8f706514c9ab" containerName="collect-profiles" Oct 08 14:30:18 crc kubenswrapper[4735]: I1008 
14:30:18.476319 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="619c1be9-6e51-4001-911d-8f706514c9ab" containerName="collect-profiles" Oct 08 14:30:18 crc kubenswrapper[4735]: I1008 14:30:18.477519 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54ffb5d785-9mlw8" Oct 08 14:30:18 crc kubenswrapper[4735]: I1008 14:30:18.502729 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54ffb5d785-9mlw8"] Oct 08 14:30:18 crc kubenswrapper[4735]: I1008 14:30:18.596626 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/478e165d-399c-4622-a891-707f27dfb3a9-ovsdbserver-nb\") pod \"dnsmasq-dns-54ffb5d785-9mlw8\" (UID: \"478e165d-399c-4622-a891-707f27dfb3a9\") " pod="openstack/dnsmasq-dns-54ffb5d785-9mlw8" Oct 08 14:30:18 crc kubenswrapper[4735]: I1008 14:30:18.596692 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/478e165d-399c-4622-a891-707f27dfb3a9-config\") pod \"dnsmasq-dns-54ffb5d785-9mlw8\" (UID: \"478e165d-399c-4622-a891-707f27dfb3a9\") " pod="openstack/dnsmasq-dns-54ffb5d785-9mlw8" Oct 08 14:30:18 crc kubenswrapper[4735]: I1008 14:30:18.596773 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/478e165d-399c-4622-a891-707f27dfb3a9-dns-svc\") pod \"dnsmasq-dns-54ffb5d785-9mlw8\" (UID: \"478e165d-399c-4622-a891-707f27dfb3a9\") " pod="openstack/dnsmasq-dns-54ffb5d785-9mlw8" Oct 08 14:30:18 crc kubenswrapper[4735]: I1008 14:30:18.596800 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/478e165d-399c-4622-a891-707f27dfb3a9-ovsdbserver-sb\") pod 
\"dnsmasq-dns-54ffb5d785-9mlw8\" (UID: \"478e165d-399c-4622-a891-707f27dfb3a9\") " pod="openstack/dnsmasq-dns-54ffb5d785-9mlw8" Oct 08 14:30:18 crc kubenswrapper[4735]: I1008 14:30:18.596993 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6frvb\" (UniqueName: \"kubernetes.io/projected/478e165d-399c-4622-a891-707f27dfb3a9-kube-api-access-6frvb\") pod \"dnsmasq-dns-54ffb5d785-9mlw8\" (UID: \"478e165d-399c-4622-a891-707f27dfb3a9\") " pod="openstack/dnsmasq-dns-54ffb5d785-9mlw8" Oct 08 14:30:18 crc kubenswrapper[4735]: I1008 14:30:18.698511 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/478e165d-399c-4622-a891-707f27dfb3a9-ovsdbserver-nb\") pod \"dnsmasq-dns-54ffb5d785-9mlw8\" (UID: \"478e165d-399c-4622-a891-707f27dfb3a9\") " pod="openstack/dnsmasq-dns-54ffb5d785-9mlw8" Oct 08 14:30:18 crc kubenswrapper[4735]: I1008 14:30:18.698794 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/478e165d-399c-4622-a891-707f27dfb3a9-config\") pod \"dnsmasq-dns-54ffb5d785-9mlw8\" (UID: \"478e165d-399c-4622-a891-707f27dfb3a9\") " pod="openstack/dnsmasq-dns-54ffb5d785-9mlw8" Oct 08 14:30:18 crc kubenswrapper[4735]: I1008 14:30:18.698926 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/478e165d-399c-4622-a891-707f27dfb3a9-dns-svc\") pod \"dnsmasq-dns-54ffb5d785-9mlw8\" (UID: \"478e165d-399c-4622-a891-707f27dfb3a9\") " pod="openstack/dnsmasq-dns-54ffb5d785-9mlw8" Oct 08 14:30:18 crc kubenswrapper[4735]: I1008 14:30:18.699017 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/478e165d-399c-4622-a891-707f27dfb3a9-ovsdbserver-sb\") pod \"dnsmasq-dns-54ffb5d785-9mlw8\" (UID: 
\"478e165d-399c-4622-a891-707f27dfb3a9\") " pod="openstack/dnsmasq-dns-54ffb5d785-9mlw8" Oct 08 14:30:18 crc kubenswrapper[4735]: I1008 14:30:18.699142 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6frvb\" (UniqueName: \"kubernetes.io/projected/478e165d-399c-4622-a891-707f27dfb3a9-kube-api-access-6frvb\") pod \"dnsmasq-dns-54ffb5d785-9mlw8\" (UID: \"478e165d-399c-4622-a891-707f27dfb3a9\") " pod="openstack/dnsmasq-dns-54ffb5d785-9mlw8" Oct 08 14:30:18 crc kubenswrapper[4735]: I1008 14:30:18.699659 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/478e165d-399c-4622-a891-707f27dfb3a9-config\") pod \"dnsmasq-dns-54ffb5d785-9mlw8\" (UID: \"478e165d-399c-4622-a891-707f27dfb3a9\") " pod="openstack/dnsmasq-dns-54ffb5d785-9mlw8" Oct 08 14:30:18 crc kubenswrapper[4735]: I1008 14:30:18.699712 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/478e165d-399c-4622-a891-707f27dfb3a9-ovsdbserver-nb\") pod \"dnsmasq-dns-54ffb5d785-9mlw8\" (UID: \"478e165d-399c-4622-a891-707f27dfb3a9\") " pod="openstack/dnsmasq-dns-54ffb5d785-9mlw8" Oct 08 14:30:18 crc kubenswrapper[4735]: I1008 14:30:18.699848 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/478e165d-399c-4622-a891-707f27dfb3a9-dns-svc\") pod \"dnsmasq-dns-54ffb5d785-9mlw8\" (UID: \"478e165d-399c-4622-a891-707f27dfb3a9\") " pod="openstack/dnsmasq-dns-54ffb5d785-9mlw8" Oct 08 14:30:18 crc kubenswrapper[4735]: I1008 14:30:18.699848 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/478e165d-399c-4622-a891-707f27dfb3a9-ovsdbserver-sb\") pod \"dnsmasq-dns-54ffb5d785-9mlw8\" (UID: \"478e165d-399c-4622-a891-707f27dfb3a9\") " pod="openstack/dnsmasq-dns-54ffb5d785-9mlw8" Oct 08 
14:30:18 crc kubenswrapper[4735]: I1008 14:30:18.717273 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6frvb\" (UniqueName: \"kubernetes.io/projected/478e165d-399c-4622-a891-707f27dfb3a9-kube-api-access-6frvb\") pod \"dnsmasq-dns-54ffb5d785-9mlw8\" (UID: \"478e165d-399c-4622-a891-707f27dfb3a9\") " pod="openstack/dnsmasq-dns-54ffb5d785-9mlw8" Oct 08 14:30:18 crc kubenswrapper[4735]: I1008 14:30:18.840930 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54ffb5d785-9mlw8" Oct 08 14:30:19 crc kubenswrapper[4735]: I1008 14:30:19.343305 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54ffb5d785-9mlw8"] Oct 08 14:30:20 crc kubenswrapper[4735]: I1008 14:30:20.347469 4735 generic.go:334] "Generic (PLEG): container finished" podID="478e165d-399c-4622-a891-707f27dfb3a9" containerID="2cfa2e7f5991883df2b2c37746c18be592eb125c9a18b7eb5e6f8f669a5ecb8e" exitCode=0 Oct 08 14:30:20 crc kubenswrapper[4735]: I1008 14:30:20.347633 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54ffb5d785-9mlw8" event={"ID":"478e165d-399c-4622-a891-707f27dfb3a9","Type":"ContainerDied","Data":"2cfa2e7f5991883df2b2c37746c18be592eb125c9a18b7eb5e6f8f669a5ecb8e"} Oct 08 14:30:20 crc kubenswrapper[4735]: I1008 14:30:20.348794 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54ffb5d785-9mlw8" event={"ID":"478e165d-399c-4622-a891-707f27dfb3a9","Type":"ContainerStarted","Data":"59c3c8090fad98d31f17c9dafd0412f9dc58c11ee17c3b54104cafee1a883ef1"} Oct 08 14:30:21 crc kubenswrapper[4735]: I1008 14:30:21.359589 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54ffb5d785-9mlw8" event={"ID":"478e165d-399c-4622-a891-707f27dfb3a9","Type":"ContainerStarted","Data":"0d879d06773e030be831f4aac39631588adbde3549f50938d9d9ab91de66e4d7"} Oct 08 14:30:21 crc kubenswrapper[4735]: I1008 14:30:21.360951 
4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-54ffb5d785-9mlw8" Oct 08 14:30:21 crc kubenswrapper[4735]: I1008 14:30:21.377842 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-54ffb5d785-9mlw8" podStartSLOduration=3.377826309 podStartE2EDuration="3.377826309s" podCreationTimestamp="2025-10-08 14:30:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:30:21.375294374 +0000 UTC m=+5671.183598918" watchObservedRunningTime="2025-10-08 14:30:21.377826309 +0000 UTC m=+5671.186130843" Oct 08 14:30:28 crc kubenswrapper[4735]: I1008 14:30:28.842824 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-54ffb5d785-9mlw8" Oct 08 14:30:28 crc kubenswrapper[4735]: I1008 14:30:28.931530 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75bb999bbc-tspnr"] Oct 08 14:30:28 crc kubenswrapper[4735]: I1008 14:30:28.931806 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75bb999bbc-tspnr" podUID="ba27f8b5-fea5-4552-b5ce-40415d883627" containerName="dnsmasq-dns" containerID="cri-o://011af5768c65232d1025dd12d7b7598b8a636dfe23290854dae0231044287aa2" gracePeriod=10 Oct 08 14:30:29 crc kubenswrapper[4735]: I1008 14:30:29.045471 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5vx7d"] Oct 08 14:30:29 crc kubenswrapper[4735]: I1008 14:30:29.048398 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5vx7d" Oct 08 14:30:29 crc kubenswrapper[4735]: I1008 14:30:29.061197 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5vx7d"] Oct 08 14:30:29 crc kubenswrapper[4735]: I1008 14:30:29.192436 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-798rc\" (UniqueName: \"kubernetes.io/projected/ea24f16f-d9cc-4498-8316-5464b301c963-kube-api-access-798rc\") pod \"redhat-operators-5vx7d\" (UID: \"ea24f16f-d9cc-4498-8316-5464b301c963\") " pod="openshift-marketplace/redhat-operators-5vx7d" Oct 08 14:30:29 crc kubenswrapper[4735]: I1008 14:30:29.192506 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea24f16f-d9cc-4498-8316-5464b301c963-utilities\") pod \"redhat-operators-5vx7d\" (UID: \"ea24f16f-d9cc-4498-8316-5464b301c963\") " pod="openshift-marketplace/redhat-operators-5vx7d" Oct 08 14:30:29 crc kubenswrapper[4735]: I1008 14:30:29.192545 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea24f16f-d9cc-4498-8316-5464b301c963-catalog-content\") pod \"redhat-operators-5vx7d\" (UID: \"ea24f16f-d9cc-4498-8316-5464b301c963\") " pod="openshift-marketplace/redhat-operators-5vx7d" Oct 08 14:30:29 crc kubenswrapper[4735]: I1008 14:30:29.293879 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea24f16f-d9cc-4498-8316-5464b301c963-catalog-content\") pod \"redhat-operators-5vx7d\" (UID: \"ea24f16f-d9cc-4498-8316-5464b301c963\") " pod="openshift-marketplace/redhat-operators-5vx7d" Oct 08 14:30:29 crc kubenswrapper[4735]: I1008 14:30:29.294242 4735 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-798rc\" (UniqueName: \"kubernetes.io/projected/ea24f16f-d9cc-4498-8316-5464b301c963-kube-api-access-798rc\") pod \"redhat-operators-5vx7d\" (UID: \"ea24f16f-d9cc-4498-8316-5464b301c963\") " pod="openshift-marketplace/redhat-operators-5vx7d" Oct 08 14:30:29 crc kubenswrapper[4735]: I1008 14:30:29.294349 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea24f16f-d9cc-4498-8316-5464b301c963-utilities\") pod \"redhat-operators-5vx7d\" (UID: \"ea24f16f-d9cc-4498-8316-5464b301c963\") " pod="openshift-marketplace/redhat-operators-5vx7d" Oct 08 14:30:29 crc kubenswrapper[4735]: I1008 14:30:29.294855 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea24f16f-d9cc-4498-8316-5464b301c963-utilities\") pod \"redhat-operators-5vx7d\" (UID: \"ea24f16f-d9cc-4498-8316-5464b301c963\") " pod="openshift-marketplace/redhat-operators-5vx7d" Oct 08 14:30:29 crc kubenswrapper[4735]: I1008 14:30:29.299725 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea24f16f-d9cc-4498-8316-5464b301c963-catalog-content\") pod \"redhat-operators-5vx7d\" (UID: \"ea24f16f-d9cc-4498-8316-5464b301c963\") " pod="openshift-marketplace/redhat-operators-5vx7d" Oct 08 14:30:29 crc kubenswrapper[4735]: I1008 14:30:29.318266 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-798rc\" (UniqueName: \"kubernetes.io/projected/ea24f16f-d9cc-4498-8316-5464b301c963-kube-api-access-798rc\") pod \"redhat-operators-5vx7d\" (UID: \"ea24f16f-d9cc-4498-8316-5464b301c963\") " pod="openshift-marketplace/redhat-operators-5vx7d" Oct 08 14:30:29 crc kubenswrapper[4735]: I1008 14:30:29.415172 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5vx7d" Oct 08 14:30:29 crc kubenswrapper[4735]: I1008 14:30:29.438186 4735 generic.go:334] "Generic (PLEG): container finished" podID="ba27f8b5-fea5-4552-b5ce-40415d883627" containerID="011af5768c65232d1025dd12d7b7598b8a636dfe23290854dae0231044287aa2" exitCode=0 Oct 08 14:30:29 crc kubenswrapper[4735]: I1008 14:30:29.438854 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bb999bbc-tspnr" event={"ID":"ba27f8b5-fea5-4552-b5ce-40415d883627","Type":"ContainerDied","Data":"011af5768c65232d1025dd12d7b7598b8a636dfe23290854dae0231044287aa2"} Oct 08 14:30:29 crc kubenswrapper[4735]: I1008 14:30:29.438893 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bb999bbc-tspnr" event={"ID":"ba27f8b5-fea5-4552-b5ce-40415d883627","Type":"ContainerDied","Data":"e254c8ab4cba14d447631690a5ad1226a032eb72a0e7ad0ee1e3905417a67300"} Oct 08 14:30:29 crc kubenswrapper[4735]: I1008 14:30:29.438935 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e254c8ab4cba14d447631690a5ad1226a032eb72a0e7ad0ee1e3905417a67300" Oct 08 14:30:29 crc kubenswrapper[4735]: I1008 14:30:29.568894 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75bb999bbc-tspnr" Oct 08 14:30:29 crc kubenswrapper[4735]: I1008 14:30:29.701609 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba27f8b5-fea5-4552-b5ce-40415d883627-ovsdbserver-nb\") pod \"ba27f8b5-fea5-4552-b5ce-40415d883627\" (UID: \"ba27f8b5-fea5-4552-b5ce-40415d883627\") " Oct 08 14:30:29 crc kubenswrapper[4735]: I1008 14:30:29.701727 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smpmv\" (UniqueName: \"kubernetes.io/projected/ba27f8b5-fea5-4552-b5ce-40415d883627-kube-api-access-smpmv\") pod \"ba27f8b5-fea5-4552-b5ce-40415d883627\" (UID: \"ba27f8b5-fea5-4552-b5ce-40415d883627\") " Oct 08 14:30:29 crc kubenswrapper[4735]: I1008 14:30:29.701807 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba27f8b5-fea5-4552-b5ce-40415d883627-ovsdbserver-sb\") pod \"ba27f8b5-fea5-4552-b5ce-40415d883627\" (UID: \"ba27f8b5-fea5-4552-b5ce-40415d883627\") " Oct 08 14:30:29 crc kubenswrapper[4735]: I1008 14:30:29.701892 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba27f8b5-fea5-4552-b5ce-40415d883627-dns-svc\") pod \"ba27f8b5-fea5-4552-b5ce-40415d883627\" (UID: \"ba27f8b5-fea5-4552-b5ce-40415d883627\") " Oct 08 14:30:29 crc kubenswrapper[4735]: I1008 14:30:29.701930 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba27f8b5-fea5-4552-b5ce-40415d883627-config\") pod \"ba27f8b5-fea5-4552-b5ce-40415d883627\" (UID: \"ba27f8b5-fea5-4552-b5ce-40415d883627\") " Oct 08 14:30:29 crc kubenswrapper[4735]: I1008 14:30:29.708329 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/ba27f8b5-fea5-4552-b5ce-40415d883627-kube-api-access-smpmv" (OuterVolumeSpecName: "kube-api-access-smpmv") pod "ba27f8b5-fea5-4552-b5ce-40415d883627" (UID: "ba27f8b5-fea5-4552-b5ce-40415d883627"). InnerVolumeSpecName "kube-api-access-smpmv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:30:29 crc kubenswrapper[4735]: I1008 14:30:29.765951 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba27f8b5-fea5-4552-b5ce-40415d883627-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ba27f8b5-fea5-4552-b5ce-40415d883627" (UID: "ba27f8b5-fea5-4552-b5ce-40415d883627"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:30:29 crc kubenswrapper[4735]: I1008 14:30:29.779398 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba27f8b5-fea5-4552-b5ce-40415d883627-config" (OuterVolumeSpecName: "config") pod "ba27f8b5-fea5-4552-b5ce-40415d883627" (UID: "ba27f8b5-fea5-4552-b5ce-40415d883627"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:30:29 crc kubenswrapper[4735]: I1008 14:30:29.784373 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba27f8b5-fea5-4552-b5ce-40415d883627-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ba27f8b5-fea5-4552-b5ce-40415d883627" (UID: "ba27f8b5-fea5-4552-b5ce-40415d883627"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:30:29 crc kubenswrapper[4735]: I1008 14:30:29.787053 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba27f8b5-fea5-4552-b5ce-40415d883627-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ba27f8b5-fea5-4552-b5ce-40415d883627" (UID: "ba27f8b5-fea5-4552-b5ce-40415d883627"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:30:29 crc kubenswrapper[4735]: I1008 14:30:29.809412 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba27f8b5-fea5-4552-b5ce-40415d883627-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 14:30:29 crc kubenswrapper[4735]: I1008 14:30:29.809445 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smpmv\" (UniqueName: \"kubernetes.io/projected/ba27f8b5-fea5-4552-b5ce-40415d883627-kube-api-access-smpmv\") on node \"crc\" DevicePath \"\"" Oct 08 14:30:29 crc kubenswrapper[4735]: I1008 14:30:29.809456 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba27f8b5-fea5-4552-b5ce-40415d883627-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 14:30:29 crc kubenswrapper[4735]: I1008 14:30:29.809465 4735 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba27f8b5-fea5-4552-b5ce-40415d883627-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 14:30:29 crc kubenswrapper[4735]: I1008 14:30:29.809494 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba27f8b5-fea5-4552-b5ce-40415d883627-config\") on node \"crc\" DevicePath \"\"" Oct 08 14:30:29 crc kubenswrapper[4735]: I1008 14:30:29.905059 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5vx7d"] Oct 08 14:30:30 crc kubenswrapper[4735]: I1008 14:30:30.449839 4735 generic.go:334] "Generic (PLEG): container finished" podID="ea24f16f-d9cc-4498-8316-5464b301c963" containerID="42f88e611be63eea34ac390cfe5a010d8462fc1fd5a48dacc6be742087d31eb6" exitCode=0 Oct 08 14:30:30 crc kubenswrapper[4735]: I1008 14:30:30.449942 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5vx7d" 
event={"ID":"ea24f16f-d9cc-4498-8316-5464b301c963","Type":"ContainerDied","Data":"42f88e611be63eea34ac390cfe5a010d8462fc1fd5a48dacc6be742087d31eb6"} Oct 08 14:30:30 crc kubenswrapper[4735]: I1008 14:30:30.449981 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75bb999bbc-tspnr" Oct 08 14:30:30 crc kubenswrapper[4735]: I1008 14:30:30.449993 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5vx7d" event={"ID":"ea24f16f-d9cc-4498-8316-5464b301c963","Type":"ContainerStarted","Data":"8f9e7a2a6f79e33aa43eba5a19e5617f2bfe1fed95b7535196a225892bb07a9a"} Oct 08 14:30:30 crc kubenswrapper[4735]: I1008 14:30:30.508219 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75bb999bbc-tspnr"] Oct 08 14:30:30 crc kubenswrapper[4735]: I1008 14:30:30.518114 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75bb999bbc-tspnr"] Oct 08 14:30:30 crc kubenswrapper[4735]: I1008 14:30:30.907153 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba27f8b5-fea5-4552-b5ce-40415d883627" path="/var/lib/kubelet/pods/ba27f8b5-fea5-4552-b5ce-40415d883627/volumes" Oct 08 14:30:31 crc kubenswrapper[4735]: I1008 14:30:31.458200 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5vx7d" event={"ID":"ea24f16f-d9cc-4498-8316-5464b301c963","Type":"ContainerStarted","Data":"d923ae28541e1beb43cd1a1cdb52035c80c57bc167ed32b4b6b41c3f2ade9984"} Oct 08 14:30:32 crc kubenswrapper[4735]: I1008 14:30:32.470259 4735 generic.go:334] "Generic (PLEG): container finished" podID="ea24f16f-d9cc-4498-8316-5464b301c963" containerID="d923ae28541e1beb43cd1a1cdb52035c80c57bc167ed32b4b6b41c3f2ade9984" exitCode=0 Oct 08 14:30:32 crc kubenswrapper[4735]: I1008 14:30:32.470320 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5vx7d" 
event={"ID":"ea24f16f-d9cc-4498-8316-5464b301c963","Type":"ContainerDied","Data":"d923ae28541e1beb43cd1a1cdb52035c80c57bc167ed32b4b6b41c3f2ade9984"} Oct 08 14:30:32 crc kubenswrapper[4735]: I1008 14:30:32.675461 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-ww4kk"] Oct 08 14:30:32 crc kubenswrapper[4735]: E1008 14:30:32.675878 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba27f8b5-fea5-4552-b5ce-40415d883627" containerName="dnsmasq-dns" Oct 08 14:30:32 crc kubenswrapper[4735]: I1008 14:30:32.675916 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba27f8b5-fea5-4552-b5ce-40415d883627" containerName="dnsmasq-dns" Oct 08 14:30:32 crc kubenswrapper[4735]: E1008 14:30:32.675949 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba27f8b5-fea5-4552-b5ce-40415d883627" containerName="init" Oct 08 14:30:32 crc kubenswrapper[4735]: I1008 14:30:32.675954 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba27f8b5-fea5-4552-b5ce-40415d883627" containerName="init" Oct 08 14:30:32 crc kubenswrapper[4735]: I1008 14:30:32.676122 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba27f8b5-fea5-4552-b5ce-40415d883627" containerName="dnsmasq-dns" Oct 08 14:30:32 crc kubenswrapper[4735]: I1008 14:30:32.676763 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-ww4kk"
Oct 08 14:30:32 crc kubenswrapper[4735]: I1008 14:30:32.683258 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-ww4kk"]
Oct 08 14:30:32 crc kubenswrapper[4735]: I1008 14:30:32.755298 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mh7z\" (UniqueName: \"kubernetes.io/projected/56bccda4-2090-4e33-9fc4-ff1f852c6fef-kube-api-access-8mh7z\") pod \"cinder-db-create-ww4kk\" (UID: \"56bccda4-2090-4e33-9fc4-ff1f852c6fef\") " pod="openstack/cinder-db-create-ww4kk"
Oct 08 14:30:32 crc kubenswrapper[4735]: I1008 14:30:32.857303 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mh7z\" (UniqueName: \"kubernetes.io/projected/56bccda4-2090-4e33-9fc4-ff1f852c6fef-kube-api-access-8mh7z\") pod \"cinder-db-create-ww4kk\" (UID: \"56bccda4-2090-4e33-9fc4-ff1f852c6fef\") " pod="openstack/cinder-db-create-ww4kk"
Oct 08 14:30:32 crc kubenswrapper[4735]: I1008 14:30:32.874938 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mh7z\" (UniqueName: \"kubernetes.io/projected/56bccda4-2090-4e33-9fc4-ff1f852c6fef-kube-api-access-8mh7z\") pod \"cinder-db-create-ww4kk\" (UID: \"56bccda4-2090-4e33-9fc4-ff1f852c6fef\") " pod="openstack/cinder-db-create-ww4kk"
Oct 08 14:30:32 crc kubenswrapper[4735]: I1008 14:30:32.997791 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-ww4kk"
Oct 08 14:30:33 crc kubenswrapper[4735]: W1008 14:30:33.462511 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod56bccda4_2090_4e33_9fc4_ff1f852c6fef.slice/crio-d238a78160e2d2eafeacfe1adc264e58a21274ec77f98bc2d27ce92388d36289 WatchSource:0}: Error finding container d238a78160e2d2eafeacfe1adc264e58a21274ec77f98bc2d27ce92388d36289: Status 404 returned error can't find the container with id d238a78160e2d2eafeacfe1adc264e58a21274ec77f98bc2d27ce92388d36289
Oct 08 14:30:33 crc kubenswrapper[4735]: I1008 14:30:33.462652 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-ww4kk"]
Oct 08 14:30:33 crc kubenswrapper[4735]: I1008 14:30:33.487961 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-ww4kk" event={"ID":"56bccda4-2090-4e33-9fc4-ff1f852c6fef","Type":"ContainerStarted","Data":"d238a78160e2d2eafeacfe1adc264e58a21274ec77f98bc2d27ce92388d36289"}
Oct 08 14:30:33 crc kubenswrapper[4735]: I1008 14:30:33.502612 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5vx7d" event={"ID":"ea24f16f-d9cc-4498-8316-5464b301c963","Type":"ContainerStarted","Data":"f1214d4860570a5c59c65a4ee2249b012fabb6f08e3f85c901548c52e23166b2"}
Oct 08 14:30:33 crc kubenswrapper[4735]: I1008 14:30:33.537587 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5vx7d" podStartSLOduration=1.974451986 podStartE2EDuration="4.537560301s" podCreationTimestamp="2025-10-08 14:30:29 +0000 UTC" firstStartedPulling="2025-10-08 14:30:30.451821312 +0000 UTC m=+5680.260125856" lastFinishedPulling="2025-10-08 14:30:33.014929627 +0000 UTC m=+5682.823234171" observedRunningTime="2025-10-08 14:30:33.535552949 +0000 UTC m=+5683.343857503" watchObservedRunningTime="2025-10-08 14:30:33.537560301 +0000 UTC m=+5683.345864855"
Oct 08 14:30:34 crc kubenswrapper[4735]: I1008 14:30:34.512357 4735 generic.go:334] "Generic (PLEG): container finished" podID="56bccda4-2090-4e33-9fc4-ff1f852c6fef" containerID="cf054f7ce53eabb98051e30c2b9837257155604b3a03de291a0866fea38c3cfa" exitCode=0
Oct 08 14:30:34 crc kubenswrapper[4735]: I1008 14:30:34.514098 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-ww4kk" event={"ID":"56bccda4-2090-4e33-9fc4-ff1f852c6fef","Type":"ContainerDied","Data":"cf054f7ce53eabb98051e30c2b9837257155604b3a03de291a0866fea38c3cfa"}
Oct 08 14:30:35 crc kubenswrapper[4735]: I1008 14:30:35.870025 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-ww4kk"
Oct 08 14:30:36 crc kubenswrapper[4735]: I1008 14:30:36.011940 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mh7z\" (UniqueName: \"kubernetes.io/projected/56bccda4-2090-4e33-9fc4-ff1f852c6fef-kube-api-access-8mh7z\") pod \"56bccda4-2090-4e33-9fc4-ff1f852c6fef\" (UID: \"56bccda4-2090-4e33-9fc4-ff1f852c6fef\") "
Oct 08 14:30:36 crc kubenswrapper[4735]: I1008 14:30:36.017903 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56bccda4-2090-4e33-9fc4-ff1f852c6fef-kube-api-access-8mh7z" (OuterVolumeSpecName: "kube-api-access-8mh7z") pod "56bccda4-2090-4e33-9fc4-ff1f852c6fef" (UID: "56bccda4-2090-4e33-9fc4-ff1f852c6fef"). InnerVolumeSpecName "kube-api-access-8mh7z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 14:30:36 crc kubenswrapper[4735]: I1008 14:30:36.114804 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mh7z\" (UniqueName: \"kubernetes.io/projected/56bccda4-2090-4e33-9fc4-ff1f852c6fef-kube-api-access-8mh7z\") on node \"crc\" DevicePath \"\""
Oct 08 14:30:36 crc kubenswrapper[4735]: I1008 14:30:36.529422 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-ww4kk" event={"ID":"56bccda4-2090-4e33-9fc4-ff1f852c6fef","Type":"ContainerDied","Data":"d238a78160e2d2eafeacfe1adc264e58a21274ec77f98bc2d27ce92388d36289"}
Oct 08 14:30:36 crc kubenswrapper[4735]: I1008 14:30:36.529462 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d238a78160e2d2eafeacfe1adc264e58a21274ec77f98bc2d27ce92388d36289"
Oct 08 14:30:36 crc kubenswrapper[4735]: I1008 14:30:36.529514 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-ww4kk"
Oct 08 14:30:39 crc kubenswrapper[4735]: I1008 14:30:39.416069 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5vx7d"
Oct 08 14:30:39 crc kubenswrapper[4735]: I1008 14:30:39.417053 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5vx7d"
Oct 08 14:30:39 crc kubenswrapper[4735]: I1008 14:30:39.456247 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5vx7d"
Oct 08 14:30:39 crc kubenswrapper[4735]: I1008 14:30:39.615986 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5vx7d"
Oct 08 14:30:39 crc kubenswrapper[4735]: I1008 14:30:39.692458 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5vx7d"]
Oct 08 14:30:41 crc kubenswrapper[4735]: I1008 14:30:41.571816 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5vx7d" podUID="ea24f16f-d9cc-4498-8316-5464b301c963" containerName="registry-server" containerID="cri-o://f1214d4860570a5c59c65a4ee2249b012fabb6f08e3f85c901548c52e23166b2" gracePeriod=2
Oct 08 14:30:42 crc kubenswrapper[4735]: I1008 14:30:42.057949 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5vx7d"
Oct 08 14:30:42 crc kubenswrapper[4735]: I1008 14:30:42.125540 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea24f16f-d9cc-4498-8316-5464b301c963-utilities\") pod \"ea24f16f-d9cc-4498-8316-5464b301c963\" (UID: \"ea24f16f-d9cc-4498-8316-5464b301c963\") "
Oct 08 14:30:42 crc kubenswrapper[4735]: I1008 14:30:42.125638 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea24f16f-d9cc-4498-8316-5464b301c963-catalog-content\") pod \"ea24f16f-d9cc-4498-8316-5464b301c963\" (UID: \"ea24f16f-d9cc-4498-8316-5464b301c963\") "
Oct 08 14:30:42 crc kubenswrapper[4735]: I1008 14:30:42.125744 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-798rc\" (UniqueName: \"kubernetes.io/projected/ea24f16f-d9cc-4498-8316-5464b301c963-kube-api-access-798rc\") pod \"ea24f16f-d9cc-4498-8316-5464b301c963\" (UID: \"ea24f16f-d9cc-4498-8316-5464b301c963\") "
Oct 08 14:30:42 crc kubenswrapper[4735]: I1008 14:30:42.126438 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea24f16f-d9cc-4498-8316-5464b301c963-utilities" (OuterVolumeSpecName: "utilities") pod "ea24f16f-d9cc-4498-8316-5464b301c963" (UID: "ea24f16f-d9cc-4498-8316-5464b301c963"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 08 14:30:42 crc kubenswrapper[4735]: I1008 14:30:42.132658 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea24f16f-d9cc-4498-8316-5464b301c963-kube-api-access-798rc" (OuterVolumeSpecName: "kube-api-access-798rc") pod "ea24f16f-d9cc-4498-8316-5464b301c963" (UID: "ea24f16f-d9cc-4498-8316-5464b301c963"). InnerVolumeSpecName "kube-api-access-798rc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 14:30:42 crc kubenswrapper[4735]: I1008 14:30:42.227712 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-798rc\" (UniqueName: \"kubernetes.io/projected/ea24f16f-d9cc-4498-8316-5464b301c963-kube-api-access-798rc\") on node \"crc\" DevicePath \"\""
Oct 08 14:30:42 crc kubenswrapper[4735]: I1008 14:30:42.227744 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea24f16f-d9cc-4498-8316-5464b301c963-utilities\") on node \"crc\" DevicePath \"\""
Oct 08 14:30:42 crc kubenswrapper[4735]: I1008 14:30:42.584531 4735 generic.go:334] "Generic (PLEG): container finished" podID="ea24f16f-d9cc-4498-8316-5464b301c963" containerID="f1214d4860570a5c59c65a4ee2249b012fabb6f08e3f85c901548c52e23166b2" exitCode=0
Oct 08 14:30:42 crc kubenswrapper[4735]: I1008 14:30:42.584587 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5vx7d" event={"ID":"ea24f16f-d9cc-4498-8316-5464b301c963","Type":"ContainerDied","Data":"f1214d4860570a5c59c65a4ee2249b012fabb6f08e3f85c901548c52e23166b2"}
Oct 08 14:30:42 crc kubenswrapper[4735]: I1008 14:30:42.584618 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5vx7d" event={"ID":"ea24f16f-d9cc-4498-8316-5464b301c963","Type":"ContainerDied","Data":"8f9e7a2a6f79e33aa43eba5a19e5617f2bfe1fed95b7535196a225892bb07a9a"}
Oct 08 14:30:42 crc kubenswrapper[4735]: I1008 14:30:42.584642 4735 scope.go:117] "RemoveContainer" containerID="f1214d4860570a5c59c65a4ee2249b012fabb6f08e3f85c901548c52e23166b2"
Oct 08 14:30:42 crc kubenswrapper[4735]: I1008 14:30:42.584742 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5vx7d"
Oct 08 14:30:42 crc kubenswrapper[4735]: I1008 14:30:42.609868 4735 scope.go:117] "RemoveContainer" containerID="d923ae28541e1beb43cd1a1cdb52035c80c57bc167ed32b4b6b41c3f2ade9984"
Oct 08 14:30:42 crc kubenswrapper[4735]: I1008 14:30:42.632694 4735 scope.go:117] "RemoveContainer" containerID="42f88e611be63eea34ac390cfe5a010d8462fc1fd5a48dacc6be742087d31eb6"
Oct 08 14:30:42 crc kubenswrapper[4735]: I1008 14:30:42.684637 4735 scope.go:117] "RemoveContainer" containerID="f1214d4860570a5c59c65a4ee2249b012fabb6f08e3f85c901548c52e23166b2"
Oct 08 14:30:42 crc kubenswrapper[4735]: E1008 14:30:42.687360 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1214d4860570a5c59c65a4ee2249b012fabb6f08e3f85c901548c52e23166b2\": container with ID starting with f1214d4860570a5c59c65a4ee2249b012fabb6f08e3f85c901548c52e23166b2 not found: ID does not exist" containerID="f1214d4860570a5c59c65a4ee2249b012fabb6f08e3f85c901548c52e23166b2"
Oct 08 14:30:42 crc kubenswrapper[4735]: I1008 14:30:42.687418 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1214d4860570a5c59c65a4ee2249b012fabb6f08e3f85c901548c52e23166b2"} err="failed to get container status \"f1214d4860570a5c59c65a4ee2249b012fabb6f08e3f85c901548c52e23166b2\": rpc error: code = NotFound desc = could not find container \"f1214d4860570a5c59c65a4ee2249b012fabb6f08e3f85c901548c52e23166b2\": container with ID starting with f1214d4860570a5c59c65a4ee2249b012fabb6f08e3f85c901548c52e23166b2 not found: ID does not exist"
Oct 08 14:30:42 crc kubenswrapper[4735]: I1008 14:30:42.687453 4735 scope.go:117] "RemoveContainer" containerID="d923ae28541e1beb43cd1a1cdb52035c80c57bc167ed32b4b6b41c3f2ade9984"
Oct 08 14:30:42 crc kubenswrapper[4735]: E1008 14:30:42.687971 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d923ae28541e1beb43cd1a1cdb52035c80c57bc167ed32b4b6b41c3f2ade9984\": container with ID starting with d923ae28541e1beb43cd1a1cdb52035c80c57bc167ed32b4b6b41c3f2ade9984 not found: ID does not exist" containerID="d923ae28541e1beb43cd1a1cdb52035c80c57bc167ed32b4b6b41c3f2ade9984"
Oct 08 14:30:42 crc kubenswrapper[4735]: I1008 14:30:42.687996 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d923ae28541e1beb43cd1a1cdb52035c80c57bc167ed32b4b6b41c3f2ade9984"} err="failed to get container status \"d923ae28541e1beb43cd1a1cdb52035c80c57bc167ed32b4b6b41c3f2ade9984\": rpc error: code = NotFound desc = could not find container \"d923ae28541e1beb43cd1a1cdb52035c80c57bc167ed32b4b6b41c3f2ade9984\": container with ID starting with d923ae28541e1beb43cd1a1cdb52035c80c57bc167ed32b4b6b41c3f2ade9984 not found: ID does not exist"
Oct 08 14:30:42 crc kubenswrapper[4735]: I1008 14:30:42.688014 4735 scope.go:117] "RemoveContainer" containerID="42f88e611be63eea34ac390cfe5a010d8462fc1fd5a48dacc6be742087d31eb6"
Oct 08 14:30:42 crc kubenswrapper[4735]: E1008 14:30:42.689181 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42f88e611be63eea34ac390cfe5a010d8462fc1fd5a48dacc6be742087d31eb6\": container with ID starting with 42f88e611be63eea34ac390cfe5a010d8462fc1fd5a48dacc6be742087d31eb6 not found: ID does not exist" containerID="42f88e611be63eea34ac390cfe5a010d8462fc1fd5a48dacc6be742087d31eb6"
Oct 08 14:30:42 crc kubenswrapper[4735]: I1008 14:30:42.689205 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42f88e611be63eea34ac390cfe5a010d8462fc1fd5a48dacc6be742087d31eb6"} err="failed to get container status \"42f88e611be63eea34ac390cfe5a010d8462fc1fd5a48dacc6be742087d31eb6\": rpc error: code = NotFound desc = could not find container \"42f88e611be63eea34ac390cfe5a010d8462fc1fd5a48dacc6be742087d31eb6\": container with ID starting with 42f88e611be63eea34ac390cfe5a010d8462fc1fd5a48dacc6be742087d31eb6 not found: ID does not exist"
Oct 08 14:30:42 crc kubenswrapper[4735]: I1008 14:30:42.801242 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-aa99-account-create-ssv2z"]
Oct 08 14:30:42 crc kubenswrapper[4735]: E1008 14:30:42.801911 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56bccda4-2090-4e33-9fc4-ff1f852c6fef" containerName="mariadb-database-create"
Oct 08 14:30:42 crc kubenswrapper[4735]: I1008 14:30:42.801937 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="56bccda4-2090-4e33-9fc4-ff1f852c6fef" containerName="mariadb-database-create"
Oct 08 14:30:42 crc kubenswrapper[4735]: E1008 14:30:42.801972 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea24f16f-d9cc-4498-8316-5464b301c963" containerName="extract-utilities"
Oct 08 14:30:42 crc kubenswrapper[4735]: I1008 14:30:42.801981 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea24f16f-d9cc-4498-8316-5464b301c963" containerName="extract-utilities"
Oct 08 14:30:42 crc kubenswrapper[4735]: E1008 14:30:42.801995 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea24f16f-d9cc-4498-8316-5464b301c963" containerName="extract-content"
Oct 08 14:30:42 crc kubenswrapper[4735]: I1008 14:30:42.802007 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea24f16f-d9cc-4498-8316-5464b301c963" containerName="extract-content"
Oct 08 14:30:42 crc kubenswrapper[4735]: E1008 14:30:42.802034 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea24f16f-d9cc-4498-8316-5464b301c963" containerName="registry-server"
Oct 08 14:30:42 crc kubenswrapper[4735]: I1008 14:30:42.802045 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea24f16f-d9cc-4498-8316-5464b301c963" containerName="registry-server"
Oct 08 14:30:42 crc kubenswrapper[4735]: I1008 14:30:42.802267 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea24f16f-d9cc-4498-8316-5464b301c963" containerName="registry-server"
Oct 08 14:30:42 crc kubenswrapper[4735]: I1008 14:30:42.802304 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="56bccda4-2090-4e33-9fc4-ff1f852c6fef" containerName="mariadb-database-create"
Oct 08 14:30:42 crc kubenswrapper[4735]: I1008 14:30:42.803137 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-aa99-account-create-ssv2z"
Oct 08 14:30:42 crc kubenswrapper[4735]: I1008 14:30:42.805879 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Oct 08 14:30:42 crc kubenswrapper[4735]: I1008 14:30:42.810552 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-aa99-account-create-ssv2z"]
Oct 08 14:30:42 crc kubenswrapper[4735]: I1008 14:30:42.838725 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hh676\" (UniqueName: \"kubernetes.io/projected/5576d690-ea6f-4fdc-aef8-936a637dfb11-kube-api-access-hh676\") pod \"cinder-aa99-account-create-ssv2z\" (UID: \"5576d690-ea6f-4fdc-aef8-936a637dfb11\") " pod="openstack/cinder-aa99-account-create-ssv2z"
Oct 08 14:30:42 crc kubenswrapper[4735]: I1008 14:30:42.940313 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hh676\" (UniqueName: \"kubernetes.io/projected/5576d690-ea6f-4fdc-aef8-936a637dfb11-kube-api-access-hh676\") pod \"cinder-aa99-account-create-ssv2z\" (UID: \"5576d690-ea6f-4fdc-aef8-936a637dfb11\") " pod="openstack/cinder-aa99-account-create-ssv2z"
Oct 08 14:30:42 crc kubenswrapper[4735]: I1008 14:30:42.957562 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hh676\" (UniqueName: \"kubernetes.io/projected/5576d690-ea6f-4fdc-aef8-936a637dfb11-kube-api-access-hh676\") pod \"cinder-aa99-account-create-ssv2z\" (UID: \"5576d690-ea6f-4fdc-aef8-936a637dfb11\") " pod="openstack/cinder-aa99-account-create-ssv2z"
Oct 08 14:30:43 crc kubenswrapper[4735]: I1008 14:30:43.050995 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea24f16f-d9cc-4498-8316-5464b301c963-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ea24f16f-d9cc-4498-8316-5464b301c963" (UID: "ea24f16f-d9cc-4498-8316-5464b301c963"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 08 14:30:43 crc kubenswrapper[4735]: I1008 14:30:43.133245 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-aa99-account-create-ssv2z"
Oct 08 14:30:43 crc kubenswrapper[4735]: I1008 14:30:43.144546 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea24f16f-d9cc-4498-8316-5464b301c963-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 08 14:30:43 crc kubenswrapper[4735]: I1008 14:30:43.250959 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5vx7d"]
Oct 08 14:30:43 crc kubenswrapper[4735]: I1008 14:30:43.260085 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5vx7d"]
Oct 08 14:30:43 crc kubenswrapper[4735]: I1008 14:30:43.580564 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-aa99-account-create-ssv2z"]
Oct 08 14:30:43 crc kubenswrapper[4735]: I1008 14:30:43.595374 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-aa99-account-create-ssv2z" event={"ID":"5576d690-ea6f-4fdc-aef8-936a637dfb11","Type":"ContainerStarted","Data":"74c3791df5801680c068eea461a8524966e68291e5d76950c3c17ccc37eb6e10"}
Oct 08 14:30:44 crc kubenswrapper[4735]: I1008 14:30:44.609165 4735 generic.go:334] "Generic (PLEG): container finished" podID="5576d690-ea6f-4fdc-aef8-936a637dfb11" containerID="4483a05a5ed63a95ca13598dcd4f9c3259ff569835b4aec1abaff98c9d82a6f2" exitCode=0
Oct 08 14:30:44 crc kubenswrapper[4735]: I1008 14:30:44.609325 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-aa99-account-create-ssv2z" event={"ID":"5576d690-ea6f-4fdc-aef8-936a637dfb11","Type":"ContainerDied","Data":"4483a05a5ed63a95ca13598dcd4f9c3259ff569835b4aec1abaff98c9d82a6f2"}
Oct 08 14:30:44 crc kubenswrapper[4735]: I1008 14:30:44.907103 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea24f16f-d9cc-4498-8316-5464b301c963" path="/var/lib/kubelet/pods/ea24f16f-d9cc-4498-8316-5464b301c963/volumes"
Oct 08 14:30:46 crc kubenswrapper[4735]: I1008 14:30:46.006084 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-aa99-account-create-ssv2z"
Oct 08 14:30:46 crc kubenswrapper[4735]: I1008 14:30:46.095920 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hh676\" (UniqueName: \"kubernetes.io/projected/5576d690-ea6f-4fdc-aef8-936a637dfb11-kube-api-access-hh676\") pod \"5576d690-ea6f-4fdc-aef8-936a637dfb11\" (UID: \"5576d690-ea6f-4fdc-aef8-936a637dfb11\") "
Oct 08 14:30:46 crc kubenswrapper[4735]: I1008 14:30:46.101802 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5576d690-ea6f-4fdc-aef8-936a637dfb11-kube-api-access-hh676" (OuterVolumeSpecName: "kube-api-access-hh676") pod "5576d690-ea6f-4fdc-aef8-936a637dfb11" (UID: "5576d690-ea6f-4fdc-aef8-936a637dfb11"). InnerVolumeSpecName "kube-api-access-hh676". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 14:30:46 crc kubenswrapper[4735]: I1008 14:30:46.197578 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hh676\" (UniqueName: \"kubernetes.io/projected/5576d690-ea6f-4fdc-aef8-936a637dfb11-kube-api-access-hh676\") on node \"crc\" DevicePath \"\""
Oct 08 14:30:46 crc kubenswrapper[4735]: I1008 14:30:46.628319 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-aa99-account-create-ssv2z" event={"ID":"5576d690-ea6f-4fdc-aef8-936a637dfb11","Type":"ContainerDied","Data":"74c3791df5801680c068eea461a8524966e68291e5d76950c3c17ccc37eb6e10"}
Oct 08 14:30:46 crc kubenswrapper[4735]: I1008 14:30:46.628368 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74c3791df5801680c068eea461a8524966e68291e5d76950c3c17ccc37eb6e10"
Oct 08 14:30:46 crc kubenswrapper[4735]: I1008 14:30:46.628376 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-aa99-account-create-ssv2z"
Oct 08 14:30:48 crc kubenswrapper[4735]: I1008 14:30:48.038353 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-g67m4"]
Oct 08 14:30:48 crc kubenswrapper[4735]: E1008 14:30:48.039151 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5576d690-ea6f-4fdc-aef8-936a637dfb11" containerName="mariadb-account-create"
Oct 08 14:30:48 crc kubenswrapper[4735]: I1008 14:30:48.039167 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5576d690-ea6f-4fdc-aef8-936a637dfb11" containerName="mariadb-account-create"
Oct 08 14:30:48 crc kubenswrapper[4735]: I1008 14:30:48.039389 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="5576d690-ea6f-4fdc-aef8-936a637dfb11" containerName="mariadb-account-create"
Oct 08 14:30:48 crc kubenswrapper[4735]: I1008 14:30:48.040176 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-g67m4"
Oct 08 14:30:48 crc kubenswrapper[4735]: I1008 14:30:48.044629 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Oct 08 14:30:48 crc kubenswrapper[4735]: I1008 14:30:48.044629 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-zsclw"
Oct 08 14:30:48 crc kubenswrapper[4735]: I1008 14:30:48.045127 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Oct 08 14:30:48 crc kubenswrapper[4735]: I1008 14:30:48.048364 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-g67m4"]
Oct 08 14:30:48 crc kubenswrapper[4735]: I1008 14:30:48.134152 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c01de476-2104-4081-a3b4-3bc6710d0d1a-combined-ca-bundle\") pod \"cinder-db-sync-g67m4\" (UID: \"c01de476-2104-4081-a3b4-3bc6710d0d1a\") " pod="openstack/cinder-db-sync-g67m4"
Oct 08 14:30:48 crc kubenswrapper[4735]: I1008 14:30:48.134194 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c01de476-2104-4081-a3b4-3bc6710d0d1a-scripts\") pod \"cinder-db-sync-g67m4\" (UID: \"c01de476-2104-4081-a3b4-3bc6710d0d1a\") " pod="openstack/cinder-db-sync-g67m4"
Oct 08 14:30:48 crc kubenswrapper[4735]: I1008 14:30:48.134342 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n2qn\" (UniqueName: \"kubernetes.io/projected/c01de476-2104-4081-a3b4-3bc6710d0d1a-kube-api-access-6n2qn\") pod \"cinder-db-sync-g67m4\" (UID: \"c01de476-2104-4081-a3b4-3bc6710d0d1a\") " pod="openstack/cinder-db-sync-g67m4"
Oct 08 14:30:48 crc kubenswrapper[4735]: I1008 14:30:48.134380 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c01de476-2104-4081-a3b4-3bc6710d0d1a-config-data\") pod \"cinder-db-sync-g67m4\" (UID: \"c01de476-2104-4081-a3b4-3bc6710d0d1a\") " pod="openstack/cinder-db-sync-g67m4"
Oct 08 14:30:48 crc kubenswrapper[4735]: I1008 14:30:48.134408 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c01de476-2104-4081-a3b4-3bc6710d0d1a-etc-machine-id\") pod \"cinder-db-sync-g67m4\" (UID: \"c01de476-2104-4081-a3b4-3bc6710d0d1a\") " pod="openstack/cinder-db-sync-g67m4"
Oct 08 14:30:48 crc kubenswrapper[4735]: I1008 14:30:48.134439 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c01de476-2104-4081-a3b4-3bc6710d0d1a-db-sync-config-data\") pod \"cinder-db-sync-g67m4\" (UID: \"c01de476-2104-4081-a3b4-3bc6710d0d1a\") " pod="openstack/cinder-db-sync-g67m4"
Oct 08 14:30:48 crc kubenswrapper[4735]: I1008 14:30:48.236537 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c01de476-2104-4081-a3b4-3bc6710d0d1a-scripts\") pod \"cinder-db-sync-g67m4\" (UID: \"c01de476-2104-4081-a3b4-3bc6710d0d1a\") " pod="openstack/cinder-db-sync-g67m4"
Oct 08 14:30:48 crc kubenswrapper[4735]: I1008 14:30:48.236869 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c01de476-2104-4081-a3b4-3bc6710d0d1a-combined-ca-bundle\") pod \"cinder-db-sync-g67m4\" (UID: \"c01de476-2104-4081-a3b4-3bc6710d0d1a\") " pod="openstack/cinder-db-sync-g67m4"
Oct 08 14:30:48 crc kubenswrapper[4735]: I1008 14:30:48.237157 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6n2qn\" (UniqueName: \"kubernetes.io/projected/c01de476-2104-4081-a3b4-3bc6710d0d1a-kube-api-access-6n2qn\") pod \"cinder-db-sync-g67m4\" (UID: \"c01de476-2104-4081-a3b4-3bc6710d0d1a\") " pod="openstack/cinder-db-sync-g67m4"
Oct 08 14:30:48 crc kubenswrapper[4735]: I1008 14:30:48.237292 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c01de476-2104-4081-a3b4-3bc6710d0d1a-config-data\") pod \"cinder-db-sync-g67m4\" (UID: \"c01de476-2104-4081-a3b4-3bc6710d0d1a\") " pod="openstack/cinder-db-sync-g67m4"
Oct 08 14:30:48 crc kubenswrapper[4735]: I1008 14:30:48.237409 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c01de476-2104-4081-a3b4-3bc6710d0d1a-etc-machine-id\") pod \"cinder-db-sync-g67m4\" (UID: \"c01de476-2104-4081-a3b4-3bc6710d0d1a\") " pod="openstack/cinder-db-sync-g67m4"
Oct 08 14:30:48 crc kubenswrapper[4735]: I1008 14:30:48.237537 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c01de476-2104-4081-a3b4-3bc6710d0d1a-etc-machine-id\") pod \"cinder-db-sync-g67m4\" (UID: \"c01de476-2104-4081-a3b4-3bc6710d0d1a\") " pod="openstack/cinder-db-sync-g67m4"
Oct 08 14:30:48 crc kubenswrapper[4735]: I1008 14:30:48.237697 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c01de476-2104-4081-a3b4-3bc6710d0d1a-db-sync-config-data\") pod \"cinder-db-sync-g67m4\" (UID: \"c01de476-2104-4081-a3b4-3bc6710d0d1a\") " pod="openstack/cinder-db-sync-g67m4"
Oct 08 14:30:48 crc kubenswrapper[4735]: I1008 14:30:48.246078 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c01de476-2104-4081-a3b4-3bc6710d0d1a-db-sync-config-data\") pod \"cinder-db-sync-g67m4\" (UID: \"c01de476-2104-4081-a3b4-3bc6710d0d1a\") " pod="openstack/cinder-db-sync-g67m4"
Oct 08 14:30:48 crc kubenswrapper[4735]: I1008 14:30:48.246229 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c01de476-2104-4081-a3b4-3bc6710d0d1a-scripts\") pod \"cinder-db-sync-g67m4\" (UID: \"c01de476-2104-4081-a3b4-3bc6710d0d1a\") " pod="openstack/cinder-db-sync-g67m4"
Oct 08 14:30:48 crc kubenswrapper[4735]: I1008 14:30:48.246251 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c01de476-2104-4081-a3b4-3bc6710d0d1a-combined-ca-bundle\") pod \"cinder-db-sync-g67m4\" (UID: \"c01de476-2104-4081-a3b4-3bc6710d0d1a\") " pod="openstack/cinder-db-sync-g67m4"
Oct 08 14:30:48 crc kubenswrapper[4735]: I1008 14:30:48.246800 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c01de476-2104-4081-a3b4-3bc6710d0d1a-config-data\") pod \"cinder-db-sync-g67m4\" (UID: \"c01de476-2104-4081-a3b4-3bc6710d0d1a\") " pod="openstack/cinder-db-sync-g67m4"
Oct 08 14:30:48 crc kubenswrapper[4735]: I1008 14:30:48.253700 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n2qn\" (UniqueName: \"kubernetes.io/projected/c01de476-2104-4081-a3b4-3bc6710d0d1a-kube-api-access-6n2qn\") pod \"cinder-db-sync-g67m4\" (UID: \"c01de476-2104-4081-a3b4-3bc6710d0d1a\") " pod="openstack/cinder-db-sync-g67m4"
Oct 08 14:30:48 crc kubenswrapper[4735]: I1008 14:30:48.367056 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-g67m4"
Oct 08 14:30:48 crc kubenswrapper[4735]: I1008 14:30:48.827426 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-g67m4"]
Oct 08 14:30:49 crc kubenswrapper[4735]: I1008 14:30:49.656457 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-g67m4" event={"ID":"c01de476-2104-4081-a3b4-3bc6710d0d1a","Type":"ContainerStarted","Data":"486c1fcf4c55ca53b748a7ee39a449db3fc7865cbd5460f7007ac6d5048850bb"}
Oct 08 14:30:49 crc kubenswrapper[4735]: I1008 14:30:49.658250 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-g67m4" event={"ID":"c01de476-2104-4081-a3b4-3bc6710d0d1a","Type":"ContainerStarted","Data":"76755c1ea5b5bf3e369c75cc8c1307f4fdbf8d87b2d6e224610f692649258923"}
Oct 08 14:30:49 crc kubenswrapper[4735]: I1008 14:30:49.674280 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-g67m4" podStartSLOduration=1.67425694 podStartE2EDuration="1.67425694s" podCreationTimestamp="2025-10-08 14:30:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:30:49.671926191 +0000 UTC m=+5699.480230775" watchObservedRunningTime="2025-10-08 14:30:49.67425694 +0000 UTC m=+5699.482561484"
Oct 08 14:30:52 crc kubenswrapper[4735]: I1008 14:30:52.686379 4735 generic.go:334] "Generic (PLEG): container finished" podID="c01de476-2104-4081-a3b4-3bc6710d0d1a" containerID="486c1fcf4c55ca53b748a7ee39a449db3fc7865cbd5460f7007ac6d5048850bb" exitCode=0
Oct 08 14:30:52 crc kubenswrapper[4735]: I1008 14:30:52.686465 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-g67m4" event={"ID":"c01de476-2104-4081-a3b4-3bc6710d0d1a","Type":"ContainerDied","Data":"486c1fcf4c55ca53b748a7ee39a449db3fc7865cbd5460f7007ac6d5048850bb"}
Oct 08 14:30:54 crc kubenswrapper[4735]: I1008 14:30:54.044445 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-g67m4"
Oct 08 14:30:54 crc kubenswrapper[4735]: I1008 14:30:54.149050 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6n2qn\" (UniqueName: \"kubernetes.io/projected/c01de476-2104-4081-a3b4-3bc6710d0d1a-kube-api-access-6n2qn\") pod \"c01de476-2104-4081-a3b4-3bc6710d0d1a\" (UID: \"c01de476-2104-4081-a3b4-3bc6710d0d1a\") "
Oct 08 14:30:54 crc kubenswrapper[4735]: I1008 14:30:54.149227 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c01de476-2104-4081-a3b4-3bc6710d0d1a-etc-machine-id\") pod \"c01de476-2104-4081-a3b4-3bc6710d0d1a\" (UID: \"c01de476-2104-4081-a3b4-3bc6710d0d1a\") "
Oct 08 14:30:54 crc kubenswrapper[4735]: I1008 14:30:54.149311 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c01de476-2104-4081-a3b4-3bc6710d0d1a-config-data\") pod \"c01de476-2104-4081-a3b4-3bc6710d0d1a\" (UID: \"c01de476-2104-4081-a3b4-3bc6710d0d1a\") "
Oct 08 14:30:54 crc kubenswrapper[4735]: I1008 14:30:54.149343 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c01de476-2104-4081-a3b4-3bc6710d0d1a-scripts\") pod \"c01de476-2104-4081-a3b4-3bc6710d0d1a\" (UID: \"c01de476-2104-4081-a3b4-3bc6710d0d1a\") "
Oct 08 14:30:54 crc kubenswrapper[4735]: I1008 14:30:54.149361 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c01de476-2104-4081-a3b4-3bc6710d0d1a-combined-ca-bundle\") pod \"c01de476-2104-4081-a3b4-3bc6710d0d1a\" (UID: \"c01de476-2104-4081-a3b4-3bc6710d0d1a\") "
Oct 08 14:30:54 crc kubenswrapper[4735]: I1008 14:30:54.149379 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c01de476-2104-4081-a3b4-3bc6710d0d1a-db-sync-config-data\") pod \"c01de476-2104-4081-a3b4-3bc6710d0d1a\" (UID: \"c01de476-2104-4081-a3b4-3bc6710d0d1a\") "
Oct 08 14:30:54 crc kubenswrapper[4735]: I1008 14:30:54.149390 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c01de476-2104-4081-a3b4-3bc6710d0d1a-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "c01de476-2104-4081-a3b4-3bc6710d0d1a" (UID: "c01de476-2104-4081-a3b4-3bc6710d0d1a"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 08 14:30:54 crc kubenswrapper[4735]: I1008 14:30:54.149759 4735 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c01de476-2104-4081-a3b4-3bc6710d0d1a-etc-machine-id\") on node \"crc\" DevicePath \"\""
Oct 08 14:30:54 crc kubenswrapper[4735]: I1008 14:30:54.154261 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c01de476-2104-4081-a3b4-3bc6710d0d1a-scripts" (OuterVolumeSpecName: "scripts") pod "c01de476-2104-4081-a3b4-3bc6710d0d1a" (UID: "c01de476-2104-4081-a3b4-3bc6710d0d1a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 14:30:54 crc kubenswrapper[4735]: I1008 14:30:54.154286 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c01de476-2104-4081-a3b4-3bc6710d0d1a-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "c01de476-2104-4081-a3b4-3bc6710d0d1a" (UID: "c01de476-2104-4081-a3b4-3bc6710d0d1a"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 14:30:54 crc kubenswrapper[4735]: I1008 14:30:54.155671 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c01de476-2104-4081-a3b4-3bc6710d0d1a-kube-api-access-6n2qn" (OuterVolumeSpecName: "kube-api-access-6n2qn") pod "c01de476-2104-4081-a3b4-3bc6710d0d1a" (UID: "c01de476-2104-4081-a3b4-3bc6710d0d1a"). InnerVolumeSpecName "kube-api-access-6n2qn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 14:30:54 crc kubenswrapper[4735]: I1008 14:30:54.182181 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c01de476-2104-4081-a3b4-3bc6710d0d1a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c01de476-2104-4081-a3b4-3bc6710d0d1a" (UID: "c01de476-2104-4081-a3b4-3bc6710d0d1a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 14:30:54 crc kubenswrapper[4735]: I1008 14:30:54.195413 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c01de476-2104-4081-a3b4-3bc6710d0d1a-config-data" (OuterVolumeSpecName: "config-data") pod "c01de476-2104-4081-a3b4-3bc6710d0d1a" (UID: "c01de476-2104-4081-a3b4-3bc6710d0d1a"). InnerVolumeSpecName "config-data".
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:30:54 crc kubenswrapper[4735]: I1008 14:30:54.251758 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c01de476-2104-4081-a3b4-3bc6710d0d1a-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 14:30:54 crc kubenswrapper[4735]: I1008 14:30:54.251806 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c01de476-2104-4081-a3b4-3bc6710d0d1a-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 14:30:54 crc kubenswrapper[4735]: I1008 14:30:54.251820 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c01de476-2104-4081-a3b4-3bc6710d0d1a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 14:30:54 crc kubenswrapper[4735]: I1008 14:30:54.251835 4735 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c01de476-2104-4081-a3b4-3bc6710d0d1a-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 14:30:54 crc kubenswrapper[4735]: I1008 14:30:54.251846 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6n2qn\" (UniqueName: \"kubernetes.io/projected/c01de476-2104-4081-a3b4-3bc6710d0d1a-kube-api-access-6n2qn\") on node \"crc\" DevicePath \"\"" Oct 08 14:30:54 crc kubenswrapper[4735]: I1008 14:30:54.706544 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-g67m4" event={"ID":"c01de476-2104-4081-a3b4-3bc6710d0d1a","Type":"ContainerDied","Data":"76755c1ea5b5bf3e369c75cc8c1307f4fdbf8d87b2d6e224610f692649258923"} Oct 08 14:30:54 crc kubenswrapper[4735]: I1008 14:30:54.706591 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76755c1ea5b5bf3e369c75cc8c1307f4fdbf8d87b2d6e224610f692649258923" Oct 08 14:30:54 crc kubenswrapper[4735]: I1008 14:30:54.706630 4735 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-g67m4" Oct 08 14:30:54 crc kubenswrapper[4735]: I1008 14:30:54.997457 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6cfc4cdccc-plxwb"] Oct 08 14:30:54 crc kubenswrapper[4735]: E1008 14:30:54.997985 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c01de476-2104-4081-a3b4-3bc6710d0d1a" containerName="cinder-db-sync" Oct 08 14:30:54 crc kubenswrapper[4735]: I1008 14:30:54.998007 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="c01de476-2104-4081-a3b4-3bc6710d0d1a" containerName="cinder-db-sync" Oct 08 14:30:54 crc kubenswrapper[4735]: I1008 14:30:54.998279 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="c01de476-2104-4081-a3b4-3bc6710d0d1a" containerName="cinder-db-sync" Oct 08 14:30:55 crc kubenswrapper[4735]: I1008 14:30:55.027217 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cfc4cdccc-plxwb" Oct 08 14:30:55 crc kubenswrapper[4735]: I1008 14:30:55.052929 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cfc4cdccc-plxwb"] Oct 08 14:30:55 crc kubenswrapper[4735]: I1008 14:30:55.067517 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdsrk\" (UniqueName: \"kubernetes.io/projected/b01de70e-d419-4718-9626-b1b3936427ff-kube-api-access-bdsrk\") pod \"dnsmasq-dns-6cfc4cdccc-plxwb\" (UID: \"b01de70e-d419-4718-9626-b1b3936427ff\") " pod="openstack/dnsmasq-dns-6cfc4cdccc-plxwb" Oct 08 14:30:55 crc kubenswrapper[4735]: I1008 14:30:55.067608 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b01de70e-d419-4718-9626-b1b3936427ff-dns-svc\") pod \"dnsmasq-dns-6cfc4cdccc-plxwb\" (UID: \"b01de70e-d419-4718-9626-b1b3936427ff\") " 
pod="openstack/dnsmasq-dns-6cfc4cdccc-plxwb" Oct 08 14:30:55 crc kubenswrapper[4735]: I1008 14:30:55.067720 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b01de70e-d419-4718-9626-b1b3936427ff-ovsdbserver-nb\") pod \"dnsmasq-dns-6cfc4cdccc-plxwb\" (UID: \"b01de70e-d419-4718-9626-b1b3936427ff\") " pod="openstack/dnsmasq-dns-6cfc4cdccc-plxwb" Oct 08 14:30:55 crc kubenswrapper[4735]: I1008 14:30:55.067867 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b01de70e-d419-4718-9626-b1b3936427ff-ovsdbserver-sb\") pod \"dnsmasq-dns-6cfc4cdccc-plxwb\" (UID: \"b01de70e-d419-4718-9626-b1b3936427ff\") " pod="openstack/dnsmasq-dns-6cfc4cdccc-plxwb" Oct 08 14:30:55 crc kubenswrapper[4735]: I1008 14:30:55.067939 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b01de70e-d419-4718-9626-b1b3936427ff-config\") pod \"dnsmasq-dns-6cfc4cdccc-plxwb\" (UID: \"b01de70e-d419-4718-9626-b1b3936427ff\") " pod="openstack/dnsmasq-dns-6cfc4cdccc-plxwb" Oct 08 14:30:55 crc kubenswrapper[4735]: I1008 14:30:55.122950 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 08 14:30:55 crc kubenswrapper[4735]: I1008 14:30:55.126405 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 08 14:30:55 crc kubenswrapper[4735]: I1008 14:30:55.130048 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 08 14:30:55 crc kubenswrapper[4735]: I1008 14:30:55.130372 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 08 14:30:55 crc kubenswrapper[4735]: I1008 14:30:55.130529 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-zsclw" Oct 08 14:30:55 crc kubenswrapper[4735]: I1008 14:30:55.130669 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 08 14:30:55 crc kubenswrapper[4735]: I1008 14:30:55.144224 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 08 14:30:55 crc kubenswrapper[4735]: I1008 14:30:55.169575 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80f53fca-b19a-4f27-a0fe-6debb5e9c063-logs\") pod \"cinder-api-0\" (UID: \"80f53fca-b19a-4f27-a0fe-6debb5e9c063\") " pod="openstack/cinder-api-0" Oct 08 14:30:55 crc kubenswrapper[4735]: I1008 14:30:55.169625 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80f53fca-b19a-4f27-a0fe-6debb5e9c063-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"80f53fca-b19a-4f27-a0fe-6debb5e9c063\") " pod="openstack/cinder-api-0" Oct 08 14:30:55 crc kubenswrapper[4735]: I1008 14:30:55.169653 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80f53fca-b19a-4f27-a0fe-6debb5e9c063-config-data\") pod \"cinder-api-0\" (UID: \"80f53fca-b19a-4f27-a0fe-6debb5e9c063\") " pod="openstack/cinder-api-0" Oct 08 14:30:55 crc 
kubenswrapper[4735]: I1008 14:30:55.169676 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/80f53fca-b19a-4f27-a0fe-6debb5e9c063-config-data-custom\") pod \"cinder-api-0\" (UID: \"80f53fca-b19a-4f27-a0fe-6debb5e9c063\") " pod="openstack/cinder-api-0" Oct 08 14:30:55 crc kubenswrapper[4735]: I1008 14:30:55.169761 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b01de70e-d419-4718-9626-b1b3936427ff-ovsdbserver-nb\") pod \"dnsmasq-dns-6cfc4cdccc-plxwb\" (UID: \"b01de70e-d419-4718-9626-b1b3936427ff\") " pod="openstack/dnsmasq-dns-6cfc4cdccc-plxwb" Oct 08 14:30:55 crc kubenswrapper[4735]: I1008 14:30:55.169812 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b01de70e-d419-4718-9626-b1b3936427ff-ovsdbserver-sb\") pod \"dnsmasq-dns-6cfc4cdccc-plxwb\" (UID: \"b01de70e-d419-4718-9626-b1b3936427ff\") " pod="openstack/dnsmasq-dns-6cfc4cdccc-plxwb" Oct 08 14:30:55 crc kubenswrapper[4735]: I1008 14:30:55.169854 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b01de70e-d419-4718-9626-b1b3936427ff-config\") pod \"dnsmasq-dns-6cfc4cdccc-plxwb\" (UID: \"b01de70e-d419-4718-9626-b1b3936427ff\") " pod="openstack/dnsmasq-dns-6cfc4cdccc-plxwb" Oct 08 14:30:55 crc kubenswrapper[4735]: I1008 14:30:55.169877 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvmsl\" (UniqueName: \"kubernetes.io/projected/80f53fca-b19a-4f27-a0fe-6debb5e9c063-kube-api-access-nvmsl\") pod \"cinder-api-0\" (UID: \"80f53fca-b19a-4f27-a0fe-6debb5e9c063\") " pod="openstack/cinder-api-0" Oct 08 14:30:55 crc kubenswrapper[4735]: I1008 14:30:55.169937 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-bdsrk\" (UniqueName: \"kubernetes.io/projected/b01de70e-d419-4718-9626-b1b3936427ff-kube-api-access-bdsrk\") pod \"dnsmasq-dns-6cfc4cdccc-plxwb\" (UID: \"b01de70e-d419-4718-9626-b1b3936427ff\") " pod="openstack/dnsmasq-dns-6cfc4cdccc-plxwb" Oct 08 14:30:55 crc kubenswrapper[4735]: I1008 14:30:55.170021 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/80f53fca-b19a-4f27-a0fe-6debb5e9c063-etc-machine-id\") pod \"cinder-api-0\" (UID: \"80f53fca-b19a-4f27-a0fe-6debb5e9c063\") " pod="openstack/cinder-api-0" Oct 08 14:30:55 crc kubenswrapper[4735]: I1008 14:30:55.170076 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80f53fca-b19a-4f27-a0fe-6debb5e9c063-scripts\") pod \"cinder-api-0\" (UID: \"80f53fca-b19a-4f27-a0fe-6debb5e9c063\") " pod="openstack/cinder-api-0" Oct 08 14:30:55 crc kubenswrapper[4735]: I1008 14:30:55.170106 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b01de70e-d419-4718-9626-b1b3936427ff-dns-svc\") pod \"dnsmasq-dns-6cfc4cdccc-plxwb\" (UID: \"b01de70e-d419-4718-9626-b1b3936427ff\") " pod="openstack/dnsmasq-dns-6cfc4cdccc-plxwb" Oct 08 14:30:55 crc kubenswrapper[4735]: I1008 14:30:55.170858 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b01de70e-d419-4718-9626-b1b3936427ff-config\") pod \"dnsmasq-dns-6cfc4cdccc-plxwb\" (UID: \"b01de70e-d419-4718-9626-b1b3936427ff\") " pod="openstack/dnsmasq-dns-6cfc4cdccc-plxwb" Oct 08 14:30:55 crc kubenswrapper[4735]: I1008 14:30:55.170870 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/b01de70e-d419-4718-9626-b1b3936427ff-ovsdbserver-nb\") pod \"dnsmasq-dns-6cfc4cdccc-plxwb\" (UID: \"b01de70e-d419-4718-9626-b1b3936427ff\") " pod="openstack/dnsmasq-dns-6cfc4cdccc-plxwb" Oct 08 14:30:55 crc kubenswrapper[4735]: I1008 14:30:55.170969 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b01de70e-d419-4718-9626-b1b3936427ff-dns-svc\") pod \"dnsmasq-dns-6cfc4cdccc-plxwb\" (UID: \"b01de70e-d419-4718-9626-b1b3936427ff\") " pod="openstack/dnsmasq-dns-6cfc4cdccc-plxwb" Oct 08 14:30:55 crc kubenswrapper[4735]: I1008 14:30:55.171663 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b01de70e-d419-4718-9626-b1b3936427ff-ovsdbserver-sb\") pod \"dnsmasq-dns-6cfc4cdccc-plxwb\" (UID: \"b01de70e-d419-4718-9626-b1b3936427ff\") " pod="openstack/dnsmasq-dns-6cfc4cdccc-plxwb" Oct 08 14:30:55 crc kubenswrapper[4735]: I1008 14:30:55.211613 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdsrk\" (UniqueName: \"kubernetes.io/projected/b01de70e-d419-4718-9626-b1b3936427ff-kube-api-access-bdsrk\") pod \"dnsmasq-dns-6cfc4cdccc-plxwb\" (UID: \"b01de70e-d419-4718-9626-b1b3936427ff\") " pod="openstack/dnsmasq-dns-6cfc4cdccc-plxwb" Oct 08 14:30:55 crc kubenswrapper[4735]: I1008 14:30:55.271612 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvmsl\" (UniqueName: \"kubernetes.io/projected/80f53fca-b19a-4f27-a0fe-6debb5e9c063-kube-api-access-nvmsl\") pod \"cinder-api-0\" (UID: \"80f53fca-b19a-4f27-a0fe-6debb5e9c063\") " pod="openstack/cinder-api-0" Oct 08 14:30:55 crc kubenswrapper[4735]: I1008 14:30:55.271718 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/80f53fca-b19a-4f27-a0fe-6debb5e9c063-etc-machine-id\") pod 
\"cinder-api-0\" (UID: \"80f53fca-b19a-4f27-a0fe-6debb5e9c063\") " pod="openstack/cinder-api-0" Oct 08 14:30:55 crc kubenswrapper[4735]: I1008 14:30:55.271754 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80f53fca-b19a-4f27-a0fe-6debb5e9c063-scripts\") pod \"cinder-api-0\" (UID: \"80f53fca-b19a-4f27-a0fe-6debb5e9c063\") " pod="openstack/cinder-api-0" Oct 08 14:30:55 crc kubenswrapper[4735]: I1008 14:30:55.271794 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80f53fca-b19a-4f27-a0fe-6debb5e9c063-logs\") pod \"cinder-api-0\" (UID: \"80f53fca-b19a-4f27-a0fe-6debb5e9c063\") " pod="openstack/cinder-api-0" Oct 08 14:30:55 crc kubenswrapper[4735]: I1008 14:30:55.271816 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80f53fca-b19a-4f27-a0fe-6debb5e9c063-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"80f53fca-b19a-4f27-a0fe-6debb5e9c063\") " pod="openstack/cinder-api-0" Oct 08 14:30:55 crc kubenswrapper[4735]: I1008 14:30:55.271832 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80f53fca-b19a-4f27-a0fe-6debb5e9c063-config-data\") pod \"cinder-api-0\" (UID: \"80f53fca-b19a-4f27-a0fe-6debb5e9c063\") " pod="openstack/cinder-api-0" Oct 08 14:30:55 crc kubenswrapper[4735]: I1008 14:30:55.271819 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/80f53fca-b19a-4f27-a0fe-6debb5e9c063-etc-machine-id\") pod \"cinder-api-0\" (UID: \"80f53fca-b19a-4f27-a0fe-6debb5e9c063\") " pod="openstack/cinder-api-0" Oct 08 14:30:55 crc kubenswrapper[4735]: I1008 14:30:55.271849 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/80f53fca-b19a-4f27-a0fe-6debb5e9c063-config-data-custom\") pod \"cinder-api-0\" (UID: \"80f53fca-b19a-4f27-a0fe-6debb5e9c063\") " pod="openstack/cinder-api-0" Oct 08 14:30:55 crc kubenswrapper[4735]: I1008 14:30:55.272550 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80f53fca-b19a-4f27-a0fe-6debb5e9c063-logs\") pod \"cinder-api-0\" (UID: \"80f53fca-b19a-4f27-a0fe-6debb5e9c063\") " pod="openstack/cinder-api-0" Oct 08 14:30:55 crc kubenswrapper[4735]: I1008 14:30:55.275815 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80f53fca-b19a-4f27-a0fe-6debb5e9c063-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"80f53fca-b19a-4f27-a0fe-6debb5e9c063\") " pod="openstack/cinder-api-0" Oct 08 14:30:55 crc kubenswrapper[4735]: I1008 14:30:55.275845 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/80f53fca-b19a-4f27-a0fe-6debb5e9c063-config-data-custom\") pod \"cinder-api-0\" (UID: \"80f53fca-b19a-4f27-a0fe-6debb5e9c063\") " pod="openstack/cinder-api-0" Oct 08 14:30:55 crc kubenswrapper[4735]: I1008 14:30:55.276420 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80f53fca-b19a-4f27-a0fe-6debb5e9c063-config-data\") pod \"cinder-api-0\" (UID: \"80f53fca-b19a-4f27-a0fe-6debb5e9c063\") " pod="openstack/cinder-api-0" Oct 08 14:30:55 crc kubenswrapper[4735]: I1008 14:30:55.279040 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80f53fca-b19a-4f27-a0fe-6debb5e9c063-scripts\") pod \"cinder-api-0\" (UID: \"80f53fca-b19a-4f27-a0fe-6debb5e9c063\") " pod="openstack/cinder-api-0" Oct 08 14:30:55 crc kubenswrapper[4735]: I1008 14:30:55.293010 4735 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-nvmsl\" (UniqueName: \"kubernetes.io/projected/80f53fca-b19a-4f27-a0fe-6debb5e9c063-kube-api-access-nvmsl\") pod \"cinder-api-0\" (UID: \"80f53fca-b19a-4f27-a0fe-6debb5e9c063\") " pod="openstack/cinder-api-0" Oct 08 14:30:55 crc kubenswrapper[4735]: I1008 14:30:55.366849 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cfc4cdccc-plxwb" Oct 08 14:30:55 crc kubenswrapper[4735]: I1008 14:30:55.454743 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 08 14:30:55 crc kubenswrapper[4735]: I1008 14:30:55.913989 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cfc4cdccc-plxwb"] Oct 08 14:30:55 crc kubenswrapper[4735]: W1008 14:30:55.916356 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb01de70e_d419_4718_9626_b1b3936427ff.slice/crio-ff18386b89e89e395c892a14945b94931e5d8c6aeeb5dad6b3173be669569b6e WatchSource:0}: Error finding container ff18386b89e89e395c892a14945b94931e5d8c6aeeb5dad6b3173be669569b6e: Status 404 returned error can't find the container with id ff18386b89e89e395c892a14945b94931e5d8c6aeeb5dad6b3173be669569b6e Oct 08 14:30:55 crc kubenswrapper[4735]: I1008 14:30:55.962676 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 08 14:30:56 crc kubenswrapper[4735]: I1008 14:30:56.731786 4735 generic.go:334] "Generic (PLEG): container finished" podID="b01de70e-d419-4718-9626-b1b3936427ff" containerID="6a43159a1659f1fe9db82ae9abc184fa5c02bdc29a107d5958696fdcceb386a6" exitCode=0 Oct 08 14:30:56 crc kubenswrapper[4735]: I1008 14:30:56.731837 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cfc4cdccc-plxwb" 
event={"ID":"b01de70e-d419-4718-9626-b1b3936427ff","Type":"ContainerDied","Data":"6a43159a1659f1fe9db82ae9abc184fa5c02bdc29a107d5958696fdcceb386a6"} Oct 08 14:30:56 crc kubenswrapper[4735]: I1008 14:30:56.732150 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cfc4cdccc-plxwb" event={"ID":"b01de70e-d419-4718-9626-b1b3936427ff","Type":"ContainerStarted","Data":"ff18386b89e89e395c892a14945b94931e5d8c6aeeb5dad6b3173be669569b6e"} Oct 08 14:30:56 crc kubenswrapper[4735]: I1008 14:30:56.739764 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"80f53fca-b19a-4f27-a0fe-6debb5e9c063","Type":"ContainerStarted","Data":"c0c9764f5cbc085ed79837d8691c7f2bd022148f0cf122b338e6ddc9c58ef57d"} Oct 08 14:30:56 crc kubenswrapper[4735]: I1008 14:30:56.739814 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"80f53fca-b19a-4f27-a0fe-6debb5e9c063","Type":"ContainerStarted","Data":"9906e6b67a275db40e2974e728c152232a09dffea9c3c04c26b1807b072d374e"} Oct 08 14:30:57 crc kubenswrapper[4735]: I1008 14:30:57.774116 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"80f53fca-b19a-4f27-a0fe-6debb5e9c063","Type":"ContainerStarted","Data":"211c92fea9d5cd3a8cedd9eea6c907d6271141cdc2f31de92ae00bafb6ef1915"} Oct 08 14:30:57 crc kubenswrapper[4735]: I1008 14:30:57.774604 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 08 14:30:57 crc kubenswrapper[4735]: I1008 14:30:57.777106 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cfc4cdccc-plxwb" event={"ID":"b01de70e-d419-4718-9626-b1b3936427ff","Type":"ContainerStarted","Data":"676cc7221d5df50059f74df21ce11cc07f6e07a30a635e8ca4e138a35ecaf47b"} Oct 08 14:30:57 crc kubenswrapper[4735]: I1008 14:30:57.777252 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-6cfc4cdccc-plxwb" Oct 08 14:30:57 crc kubenswrapper[4735]: I1008 14:30:57.797371 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=2.797347783 podStartE2EDuration="2.797347783s" podCreationTimestamp="2025-10-08 14:30:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:30:57.79058792 +0000 UTC m=+5707.598892484" watchObservedRunningTime="2025-10-08 14:30:57.797347783 +0000 UTC m=+5707.605652327" Oct 08 14:31:05 crc kubenswrapper[4735]: I1008 14:31:05.368697 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6cfc4cdccc-plxwb" Oct 08 14:31:05 crc kubenswrapper[4735]: I1008 14:31:05.399108 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6cfc4cdccc-plxwb" podStartSLOduration=11.399088805 podStartE2EDuration="11.399088805s" podCreationTimestamp="2025-10-08 14:30:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:30:57.822777794 +0000 UTC m=+5707.631082358" watchObservedRunningTime="2025-10-08 14:31:05.399088805 +0000 UTC m=+5715.207393349" Oct 08 14:31:05 crc kubenswrapper[4735]: I1008 14:31:05.457573 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54ffb5d785-9mlw8"] Oct 08 14:31:05 crc kubenswrapper[4735]: I1008 14:31:05.458286 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-54ffb5d785-9mlw8" podUID="478e165d-399c-4622-a891-707f27dfb3a9" containerName="dnsmasq-dns" containerID="cri-o://0d879d06773e030be831f4aac39631588adbde3549f50938d9d9ab91de66e4d7" gracePeriod=10 Oct 08 14:31:05 crc kubenswrapper[4735]: I1008 14:31:05.861485 4735 generic.go:334] "Generic (PLEG): container finished" 
podID="478e165d-399c-4622-a891-707f27dfb3a9" containerID="0d879d06773e030be831f4aac39631588adbde3549f50938d9d9ab91de66e4d7" exitCode=0 Oct 08 14:31:05 crc kubenswrapper[4735]: I1008 14:31:05.861783 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54ffb5d785-9mlw8" event={"ID":"478e165d-399c-4622-a891-707f27dfb3a9","Type":"ContainerDied","Data":"0d879d06773e030be831f4aac39631588adbde3549f50938d9d9ab91de66e4d7"} Oct 08 14:31:06 crc kubenswrapper[4735]: I1008 14:31:06.041559 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54ffb5d785-9mlw8" Oct 08 14:31:06 crc kubenswrapper[4735]: I1008 14:31:06.205883 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/478e165d-399c-4622-a891-707f27dfb3a9-config\") pod \"478e165d-399c-4622-a891-707f27dfb3a9\" (UID: \"478e165d-399c-4622-a891-707f27dfb3a9\") " Oct 08 14:31:06 crc kubenswrapper[4735]: I1008 14:31:06.205963 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/478e165d-399c-4622-a891-707f27dfb3a9-dns-svc\") pod \"478e165d-399c-4622-a891-707f27dfb3a9\" (UID: \"478e165d-399c-4622-a891-707f27dfb3a9\") " Oct 08 14:31:06 crc kubenswrapper[4735]: I1008 14:31:06.206156 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/478e165d-399c-4622-a891-707f27dfb3a9-ovsdbserver-nb\") pod \"478e165d-399c-4622-a891-707f27dfb3a9\" (UID: \"478e165d-399c-4622-a891-707f27dfb3a9\") " Oct 08 14:31:06 crc kubenswrapper[4735]: I1008 14:31:06.206203 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/478e165d-399c-4622-a891-707f27dfb3a9-ovsdbserver-sb\") pod \"478e165d-399c-4622-a891-707f27dfb3a9\" (UID: 
\"478e165d-399c-4622-a891-707f27dfb3a9\") " Oct 08 14:31:06 crc kubenswrapper[4735]: I1008 14:31:06.206285 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6frvb\" (UniqueName: \"kubernetes.io/projected/478e165d-399c-4622-a891-707f27dfb3a9-kube-api-access-6frvb\") pod \"478e165d-399c-4622-a891-707f27dfb3a9\" (UID: \"478e165d-399c-4622-a891-707f27dfb3a9\") " Oct 08 14:31:06 crc kubenswrapper[4735]: I1008 14:31:06.228897 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/478e165d-399c-4622-a891-707f27dfb3a9-kube-api-access-6frvb" (OuterVolumeSpecName: "kube-api-access-6frvb") pod "478e165d-399c-4622-a891-707f27dfb3a9" (UID: "478e165d-399c-4622-a891-707f27dfb3a9"). InnerVolumeSpecName "kube-api-access-6frvb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:31:06 crc kubenswrapper[4735]: I1008 14:31:06.276304 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/478e165d-399c-4622-a891-707f27dfb3a9-config" (OuterVolumeSpecName: "config") pod "478e165d-399c-4622-a891-707f27dfb3a9" (UID: "478e165d-399c-4622-a891-707f27dfb3a9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:31:06 crc kubenswrapper[4735]: I1008 14:31:06.294169 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/478e165d-399c-4622-a891-707f27dfb3a9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "478e165d-399c-4622-a891-707f27dfb3a9" (UID: "478e165d-399c-4622-a891-707f27dfb3a9"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:31:06 crc kubenswrapper[4735]: I1008 14:31:06.307923 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6frvb\" (UniqueName: \"kubernetes.io/projected/478e165d-399c-4622-a891-707f27dfb3a9-kube-api-access-6frvb\") on node \"crc\" DevicePath \"\"" Oct 08 14:31:06 crc kubenswrapper[4735]: I1008 14:31:06.307955 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/478e165d-399c-4622-a891-707f27dfb3a9-config\") on node \"crc\" DevicePath \"\"" Oct 08 14:31:06 crc kubenswrapper[4735]: I1008 14:31:06.307964 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/478e165d-399c-4622-a891-707f27dfb3a9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 14:31:06 crc kubenswrapper[4735]: I1008 14:31:06.309991 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/478e165d-399c-4622-a891-707f27dfb3a9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "478e165d-399c-4622-a891-707f27dfb3a9" (UID: "478e165d-399c-4622-a891-707f27dfb3a9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:31:06 crc kubenswrapper[4735]: I1008 14:31:06.314023 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/478e165d-399c-4622-a891-707f27dfb3a9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "478e165d-399c-4622-a891-707f27dfb3a9" (UID: "478e165d-399c-4622-a891-707f27dfb3a9"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:31:06 crc kubenswrapper[4735]: I1008 14:31:06.409974 4735 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/478e165d-399c-4622-a891-707f27dfb3a9-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 14:31:06 crc kubenswrapper[4735]: I1008 14:31:06.410014 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/478e165d-399c-4622-a891-707f27dfb3a9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 14:31:06 crc kubenswrapper[4735]: I1008 14:31:06.874650 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54ffb5d785-9mlw8" event={"ID":"478e165d-399c-4622-a891-707f27dfb3a9","Type":"ContainerDied","Data":"59c3c8090fad98d31f17c9dafd0412f9dc58c11ee17c3b54104cafee1a883ef1"} Oct 08 14:31:06 crc kubenswrapper[4735]: I1008 14:31:06.874710 4735 scope.go:117] "RemoveContainer" containerID="0d879d06773e030be831f4aac39631588adbde3549f50938d9d9ab91de66e4d7" Oct 08 14:31:06 crc kubenswrapper[4735]: I1008 14:31:06.874716 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54ffb5d785-9mlw8" Oct 08 14:31:06 crc kubenswrapper[4735]: I1008 14:31:06.925186 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54ffb5d785-9mlw8"] Oct 08 14:31:06 crc kubenswrapper[4735]: I1008 14:31:06.928653 4735 scope.go:117] "RemoveContainer" containerID="2cfa2e7f5991883df2b2c37746c18be592eb125c9a18b7eb5e6f8f669a5ecb8e" Oct 08 14:31:06 crc kubenswrapper[4735]: I1008 14:31:06.946994 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-54ffb5d785-9mlw8"] Oct 08 14:31:07 crc kubenswrapper[4735]: I1008 14:31:07.617327 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 14:31:07 crc kubenswrapper[4735]: I1008 14:31:07.617931 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="64032e23-f67c-4a29-85b3-ccbc2d2145f0" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://c1bc7e01ea32369ff95f0f3a1a651202e9da304c83d2d3234d584081b50064cc" gracePeriod=30 Oct 08 14:31:07 crc kubenswrapper[4735]: I1008 14:31:07.631671 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 14:31:07 crc kubenswrapper[4735]: I1008 14:31:07.631953 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="dd74581a-fd2f-464b-ab89-9a7af5bfa073" containerName="nova-metadata-log" containerID="cri-o://da3ec38bf3b245be383933aa838dc187fcb6c6aceda4a8f0fd9e6648c90b5319" gracePeriod=30 Oct 08 14:31:07 crc kubenswrapper[4735]: I1008 14:31:07.632456 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="dd74581a-fd2f-464b-ab89-9a7af5bfa073" containerName="nova-metadata-metadata" containerID="cri-o://505dee2ebead61c62f6a9b4c56e795f6076a57ce69ecf346a6086b822c595a60" gracePeriod=30 Oct 08 14:31:07 crc 
kubenswrapper[4735]: I1008 14:31:07.644663 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 14:31:07 crc kubenswrapper[4735]: I1008 14:31:07.645015 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="112e1c50-8472-4502-b6f0-a3896dd1cbf1" containerName="nova-scheduler-scheduler" containerID="cri-o://298137c915b8b2a1e9ee074e5de91a06c04946fe5cccb1de050e4339d69164b7" gracePeriod=30 Oct 08 14:31:07 crc kubenswrapper[4735]: I1008 14:31:07.659209 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 08 14:31:07 crc kubenswrapper[4735]: I1008 14:31:07.659714 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="ba01536e-ef13-4daf-a7a2-8405ee0bef12" containerName="nova-cell0-conductor-conductor" containerID="cri-o://66b8bfd99e6b442bb6d1b99758ed5461e8abf5762a1f9656067bc42332198452" gracePeriod=30 Oct 08 14:31:07 crc kubenswrapper[4735]: I1008 14:31:07.675452 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 08 14:31:07 crc kubenswrapper[4735]: I1008 14:31:07.675744 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="61c0fe9a-51c8-480e-bcc6-2ade6e36890e" containerName="nova-api-log" containerID="cri-o://03fc76a902d3e6b1a3bff19dba3564242f38f6df73747bb40ec88b3cc530b4c9" gracePeriod=30 Oct 08 14:31:07 crc kubenswrapper[4735]: I1008 14:31:07.675963 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="61c0fe9a-51c8-480e-bcc6-2ade6e36890e" containerName="nova-api-api" containerID="cri-o://d429c139b5d3f6c7679a3ef1d46c9ab5fa50c16c71cf50744808c1cd36391262" gracePeriod=30 Oct 08 14:31:07 crc kubenswrapper[4735]: I1008 14:31:07.886726 4735 generic.go:334] "Generic (PLEG): container finished" 
podID="dd74581a-fd2f-464b-ab89-9a7af5bfa073" containerID="da3ec38bf3b245be383933aa838dc187fcb6c6aceda4a8f0fd9e6648c90b5319" exitCode=143 Oct 08 14:31:07 crc kubenswrapper[4735]: I1008 14:31:07.886795 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dd74581a-fd2f-464b-ab89-9a7af5bfa073","Type":"ContainerDied","Data":"da3ec38bf3b245be383933aa838dc187fcb6c6aceda4a8f0fd9e6648c90b5319"} Oct 08 14:31:07 crc kubenswrapper[4735]: I1008 14:31:07.889152 4735 generic.go:334] "Generic (PLEG): container finished" podID="61c0fe9a-51c8-480e-bcc6-2ade6e36890e" containerID="03fc76a902d3e6b1a3bff19dba3564242f38f6df73747bb40ec88b3cc530b4c9" exitCode=143 Oct 08 14:31:07 crc kubenswrapper[4735]: I1008 14:31:07.889192 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"61c0fe9a-51c8-480e-bcc6-2ade6e36890e","Type":"ContainerDied","Data":"03fc76a902d3e6b1a3bff19dba3564242f38f6df73747bb40ec88b3cc530b4c9"} Oct 08 14:31:07 crc kubenswrapper[4735]: I1008 14:31:07.909044 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 08 14:31:08 crc kubenswrapper[4735]: I1008 14:31:08.507281 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 08 14:31:08 crc kubenswrapper[4735]: I1008 14:31:08.653965 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64032e23-f67c-4a29-85b3-ccbc2d2145f0-combined-ca-bundle\") pod \"64032e23-f67c-4a29-85b3-ccbc2d2145f0\" (UID: \"64032e23-f67c-4a29-85b3-ccbc2d2145f0\") " Oct 08 14:31:08 crc kubenswrapper[4735]: I1008 14:31:08.653997 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64032e23-f67c-4a29-85b3-ccbc2d2145f0-config-data\") pod \"64032e23-f67c-4a29-85b3-ccbc2d2145f0\" (UID: \"64032e23-f67c-4a29-85b3-ccbc2d2145f0\") " Oct 08 14:31:08 crc kubenswrapper[4735]: I1008 14:31:08.654099 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzl5t\" (UniqueName: \"kubernetes.io/projected/64032e23-f67c-4a29-85b3-ccbc2d2145f0-kube-api-access-bzl5t\") pod \"64032e23-f67c-4a29-85b3-ccbc2d2145f0\" (UID: \"64032e23-f67c-4a29-85b3-ccbc2d2145f0\") " Oct 08 14:31:08 crc kubenswrapper[4735]: I1008 14:31:08.660083 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64032e23-f67c-4a29-85b3-ccbc2d2145f0-kube-api-access-bzl5t" (OuterVolumeSpecName: "kube-api-access-bzl5t") pod "64032e23-f67c-4a29-85b3-ccbc2d2145f0" (UID: "64032e23-f67c-4a29-85b3-ccbc2d2145f0"). InnerVolumeSpecName "kube-api-access-bzl5t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:31:08 crc kubenswrapper[4735]: I1008 14:31:08.692796 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64032e23-f67c-4a29-85b3-ccbc2d2145f0-config-data" (OuterVolumeSpecName: "config-data") pod "64032e23-f67c-4a29-85b3-ccbc2d2145f0" (UID: "64032e23-f67c-4a29-85b3-ccbc2d2145f0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:31:08 crc kubenswrapper[4735]: I1008 14:31:08.694158 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64032e23-f67c-4a29-85b3-ccbc2d2145f0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "64032e23-f67c-4a29-85b3-ccbc2d2145f0" (UID: "64032e23-f67c-4a29-85b3-ccbc2d2145f0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:31:08 crc kubenswrapper[4735]: E1008 14:31:08.719345 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="66b8bfd99e6b442bb6d1b99758ed5461e8abf5762a1f9656067bc42332198452" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 08 14:31:08 crc kubenswrapper[4735]: E1008 14:31:08.721137 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="66b8bfd99e6b442bb6d1b99758ed5461e8abf5762a1f9656067bc42332198452" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 08 14:31:08 crc kubenswrapper[4735]: E1008 14:31:08.722593 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="66b8bfd99e6b442bb6d1b99758ed5461e8abf5762a1f9656067bc42332198452" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 08 14:31:08 crc kubenswrapper[4735]: E1008 14:31:08.722640 4735 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" 
podUID="ba01536e-ef13-4daf-a7a2-8405ee0bef12" containerName="nova-cell0-conductor-conductor" Oct 08 14:31:08 crc kubenswrapper[4735]: I1008 14:31:08.757749 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64032e23-f67c-4a29-85b3-ccbc2d2145f0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 14:31:08 crc kubenswrapper[4735]: I1008 14:31:08.757784 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64032e23-f67c-4a29-85b3-ccbc2d2145f0-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 14:31:08 crc kubenswrapper[4735]: I1008 14:31:08.757793 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzl5t\" (UniqueName: \"kubernetes.io/projected/64032e23-f67c-4a29-85b3-ccbc2d2145f0-kube-api-access-bzl5t\") on node \"crc\" DevicePath \"\"" Oct 08 14:31:08 crc kubenswrapper[4735]: I1008 14:31:08.902788 4735 generic.go:334] "Generic (PLEG): container finished" podID="112e1c50-8472-4502-b6f0-a3896dd1cbf1" containerID="298137c915b8b2a1e9ee074e5de91a06c04946fe5cccb1de050e4339d69164b7" exitCode=0 Oct 08 14:31:08 crc kubenswrapper[4735]: I1008 14:31:08.904309 4735 generic.go:334] "Generic (PLEG): container finished" podID="64032e23-f67c-4a29-85b3-ccbc2d2145f0" containerID="c1bc7e01ea32369ff95f0f3a1a651202e9da304c83d2d3234d584081b50064cc" exitCode=0 Oct 08 14:31:08 crc kubenswrapper[4735]: I1008 14:31:08.904375 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 08 14:31:08 crc kubenswrapper[4735]: I1008 14:31:08.907518 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="478e165d-399c-4622-a891-707f27dfb3a9" path="/var/lib/kubelet/pods/478e165d-399c-4622-a891-707f27dfb3a9/volumes" Oct 08 14:31:08 crc kubenswrapper[4735]: I1008 14:31:08.908224 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"112e1c50-8472-4502-b6f0-a3896dd1cbf1","Type":"ContainerDied","Data":"298137c915b8b2a1e9ee074e5de91a06c04946fe5cccb1de050e4339d69164b7"} Oct 08 14:31:08 crc kubenswrapper[4735]: I1008 14:31:08.908254 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"112e1c50-8472-4502-b6f0-a3896dd1cbf1","Type":"ContainerDied","Data":"4c09fbd06cac93171a3bb67cef011e68afe39cca9e00a79be6b57faa5bffc5d3"} Oct 08 14:31:08 crc kubenswrapper[4735]: I1008 14:31:08.908265 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c09fbd06cac93171a3bb67cef011e68afe39cca9e00a79be6b57faa5bffc5d3" Oct 08 14:31:08 crc kubenswrapper[4735]: I1008 14:31:08.908282 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"64032e23-f67c-4a29-85b3-ccbc2d2145f0","Type":"ContainerDied","Data":"c1bc7e01ea32369ff95f0f3a1a651202e9da304c83d2d3234d584081b50064cc"} Oct 08 14:31:08 crc kubenswrapper[4735]: I1008 14:31:08.908303 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"64032e23-f67c-4a29-85b3-ccbc2d2145f0","Type":"ContainerDied","Data":"2d42b6c84d58a83b723bc0114b0cfefad111f15505dab5a6f963fd4d529e9aa3"} Oct 08 14:31:08 crc kubenswrapper[4735]: I1008 14:31:08.908324 4735 scope.go:117] "RemoveContainer" containerID="c1bc7e01ea32369ff95f0f3a1a651202e9da304c83d2d3234d584081b50064cc" Oct 08 14:31:08 crc kubenswrapper[4735]: I1008 14:31:08.950904 4735 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 14:31:08 crc kubenswrapper[4735]: I1008 14:31:08.963185 4735 scope.go:117] "RemoveContainer" containerID="c1bc7e01ea32369ff95f0f3a1a651202e9da304c83d2d3234d584081b50064cc" Oct 08 14:31:08 crc kubenswrapper[4735]: E1008 14:31:08.964682 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1bc7e01ea32369ff95f0f3a1a651202e9da304c83d2d3234d584081b50064cc\": container with ID starting with c1bc7e01ea32369ff95f0f3a1a651202e9da304c83d2d3234d584081b50064cc not found: ID does not exist" containerID="c1bc7e01ea32369ff95f0f3a1a651202e9da304c83d2d3234d584081b50064cc" Oct 08 14:31:08 crc kubenswrapper[4735]: I1008 14:31:08.964726 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1bc7e01ea32369ff95f0f3a1a651202e9da304c83d2d3234d584081b50064cc"} err="failed to get container status \"c1bc7e01ea32369ff95f0f3a1a651202e9da304c83d2d3234d584081b50064cc\": rpc error: code = NotFound desc = could not find container \"c1bc7e01ea32369ff95f0f3a1a651202e9da304c83d2d3234d584081b50064cc\": container with ID starting with c1bc7e01ea32369ff95f0f3a1a651202e9da304c83d2d3234d584081b50064cc not found: ID does not exist" Oct 08 14:31:08 crc kubenswrapper[4735]: I1008 14:31:08.965871 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 14:31:08 crc kubenswrapper[4735]: I1008 14:31:08.984753 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 14:31:09 crc kubenswrapper[4735]: I1008 14:31:09.005536 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 14:31:09 crc kubenswrapper[4735]: E1008 14:31:09.006032 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="478e165d-399c-4622-a891-707f27dfb3a9" 
containerName="dnsmasq-dns" Oct 08 14:31:09 crc kubenswrapper[4735]: I1008 14:31:09.006056 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="478e165d-399c-4622-a891-707f27dfb3a9" containerName="dnsmasq-dns" Oct 08 14:31:09 crc kubenswrapper[4735]: E1008 14:31:09.006073 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="112e1c50-8472-4502-b6f0-a3896dd1cbf1" containerName="nova-scheduler-scheduler" Oct 08 14:31:09 crc kubenswrapper[4735]: I1008 14:31:09.006081 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="112e1c50-8472-4502-b6f0-a3896dd1cbf1" containerName="nova-scheduler-scheduler" Oct 08 14:31:09 crc kubenswrapper[4735]: E1008 14:31:09.006097 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="478e165d-399c-4622-a891-707f27dfb3a9" containerName="init" Oct 08 14:31:09 crc kubenswrapper[4735]: I1008 14:31:09.006103 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="478e165d-399c-4622-a891-707f27dfb3a9" containerName="init" Oct 08 14:31:09 crc kubenswrapper[4735]: E1008 14:31:09.006118 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64032e23-f67c-4a29-85b3-ccbc2d2145f0" containerName="nova-cell1-novncproxy-novncproxy" Oct 08 14:31:09 crc kubenswrapper[4735]: I1008 14:31:09.006123 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="64032e23-f67c-4a29-85b3-ccbc2d2145f0" containerName="nova-cell1-novncproxy-novncproxy" Oct 08 14:31:09 crc kubenswrapper[4735]: I1008 14:31:09.006329 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="478e165d-399c-4622-a891-707f27dfb3a9" containerName="dnsmasq-dns" Oct 08 14:31:09 crc kubenswrapper[4735]: I1008 14:31:09.006346 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="64032e23-f67c-4a29-85b3-ccbc2d2145f0" containerName="nova-cell1-novncproxy-novncproxy" Oct 08 14:31:09 crc kubenswrapper[4735]: I1008 14:31:09.006358 4735 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="112e1c50-8472-4502-b6f0-a3896dd1cbf1" containerName="nova-scheduler-scheduler" Oct 08 14:31:09 crc kubenswrapper[4735]: I1008 14:31:09.007059 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 08 14:31:09 crc kubenswrapper[4735]: I1008 14:31:09.013985 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 08 14:31:09 crc kubenswrapper[4735]: I1008 14:31:09.014978 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 14:31:09 crc kubenswrapper[4735]: I1008 14:31:09.061901 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/112e1c50-8472-4502-b6f0-a3896dd1cbf1-config-data\") pod \"112e1c50-8472-4502-b6f0-a3896dd1cbf1\" (UID: \"112e1c50-8472-4502-b6f0-a3896dd1cbf1\") " Oct 08 14:31:09 crc kubenswrapper[4735]: I1008 14:31:09.062193 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnm9f\" (UniqueName: \"kubernetes.io/projected/112e1c50-8472-4502-b6f0-a3896dd1cbf1-kube-api-access-qnm9f\") pod \"112e1c50-8472-4502-b6f0-a3896dd1cbf1\" (UID: \"112e1c50-8472-4502-b6f0-a3896dd1cbf1\") " Oct 08 14:31:09 crc kubenswrapper[4735]: I1008 14:31:09.062242 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/112e1c50-8472-4502-b6f0-a3896dd1cbf1-combined-ca-bundle\") pod \"112e1c50-8472-4502-b6f0-a3896dd1cbf1\" (UID: \"112e1c50-8472-4502-b6f0-a3896dd1cbf1\") " Oct 08 14:31:09 crc kubenswrapper[4735]: I1008 14:31:09.062648 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/799ff5fc-80ce-4ac8-8296-e133c2d2f1d7-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"799ff5fc-80ce-4ac8-8296-e133c2d2f1d7\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 14:31:09 crc kubenswrapper[4735]: I1008 14:31:09.062858 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh2qz\" (UniqueName: \"kubernetes.io/projected/799ff5fc-80ce-4ac8-8296-e133c2d2f1d7-kube-api-access-gh2qz\") pod \"nova-cell1-novncproxy-0\" (UID: \"799ff5fc-80ce-4ac8-8296-e133c2d2f1d7\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 14:31:09 crc kubenswrapper[4735]: I1008 14:31:09.062932 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/799ff5fc-80ce-4ac8-8296-e133c2d2f1d7-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"799ff5fc-80ce-4ac8-8296-e133c2d2f1d7\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 14:31:09 crc kubenswrapper[4735]: I1008 14:31:09.067870 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/112e1c50-8472-4502-b6f0-a3896dd1cbf1-kube-api-access-qnm9f" (OuterVolumeSpecName: "kube-api-access-qnm9f") pod "112e1c50-8472-4502-b6f0-a3896dd1cbf1" (UID: "112e1c50-8472-4502-b6f0-a3896dd1cbf1"). InnerVolumeSpecName "kube-api-access-qnm9f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:31:09 crc kubenswrapper[4735]: I1008 14:31:09.105599 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/112e1c50-8472-4502-b6f0-a3896dd1cbf1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "112e1c50-8472-4502-b6f0-a3896dd1cbf1" (UID: "112e1c50-8472-4502-b6f0-a3896dd1cbf1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:31:09 crc kubenswrapper[4735]: I1008 14:31:09.114669 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/112e1c50-8472-4502-b6f0-a3896dd1cbf1-config-data" (OuterVolumeSpecName: "config-data") pod "112e1c50-8472-4502-b6f0-a3896dd1cbf1" (UID: "112e1c50-8472-4502-b6f0-a3896dd1cbf1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:31:09 crc kubenswrapper[4735]: I1008 14:31:09.165347 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gh2qz\" (UniqueName: \"kubernetes.io/projected/799ff5fc-80ce-4ac8-8296-e133c2d2f1d7-kube-api-access-gh2qz\") pod \"nova-cell1-novncproxy-0\" (UID: \"799ff5fc-80ce-4ac8-8296-e133c2d2f1d7\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 14:31:09 crc kubenswrapper[4735]: I1008 14:31:09.165440 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/799ff5fc-80ce-4ac8-8296-e133c2d2f1d7-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"799ff5fc-80ce-4ac8-8296-e133c2d2f1d7\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 14:31:09 crc kubenswrapper[4735]: I1008 14:31:09.165573 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/799ff5fc-80ce-4ac8-8296-e133c2d2f1d7-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"799ff5fc-80ce-4ac8-8296-e133c2d2f1d7\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 14:31:09 crc kubenswrapper[4735]: I1008 14:31:09.165633 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/112e1c50-8472-4502-b6f0-a3896dd1cbf1-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 14:31:09 crc kubenswrapper[4735]: I1008 14:31:09.165647 4735 reconciler_common.go:293] "Volume detached 
for volume \"kube-api-access-qnm9f\" (UniqueName: \"kubernetes.io/projected/112e1c50-8472-4502-b6f0-a3896dd1cbf1-kube-api-access-qnm9f\") on node \"crc\" DevicePath \"\"" Oct 08 14:31:09 crc kubenswrapper[4735]: I1008 14:31:09.165660 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/112e1c50-8472-4502-b6f0-a3896dd1cbf1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 14:31:09 crc kubenswrapper[4735]: I1008 14:31:09.175246 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/799ff5fc-80ce-4ac8-8296-e133c2d2f1d7-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"799ff5fc-80ce-4ac8-8296-e133c2d2f1d7\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 14:31:09 crc kubenswrapper[4735]: I1008 14:31:09.179230 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/799ff5fc-80ce-4ac8-8296-e133c2d2f1d7-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"799ff5fc-80ce-4ac8-8296-e133c2d2f1d7\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 14:31:09 crc kubenswrapper[4735]: I1008 14:31:09.205110 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh2qz\" (UniqueName: \"kubernetes.io/projected/799ff5fc-80ce-4ac8-8296-e133c2d2f1d7-kube-api-access-gh2qz\") pod \"nova-cell1-novncproxy-0\" (UID: \"799ff5fc-80ce-4ac8-8296-e133c2d2f1d7\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 14:31:09 crc kubenswrapper[4735]: I1008 14:31:09.340583 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 08 14:31:09 crc kubenswrapper[4735]: I1008 14:31:09.817101 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 14:31:09 crc kubenswrapper[4735]: W1008 14:31:09.824262 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod799ff5fc_80ce_4ac8_8296_e133c2d2f1d7.slice/crio-d1b4e72dadb57210d1f5e5eb8892cca255c6f6836514bc76fc9a96a7bf73173d WatchSource:0}: Error finding container d1b4e72dadb57210d1f5e5eb8892cca255c6f6836514bc76fc9a96a7bf73173d: Status 404 returned error can't find the container with id d1b4e72dadb57210d1f5e5eb8892cca255c6f6836514bc76fc9a96a7bf73173d Oct 08 14:31:09 crc kubenswrapper[4735]: I1008 14:31:09.930235 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 14:31:09 crc kubenswrapper[4735]: I1008 14:31:09.930226 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"799ff5fc-80ce-4ac8-8296-e133c2d2f1d7","Type":"ContainerStarted","Data":"d1b4e72dadb57210d1f5e5eb8892cca255c6f6836514bc76fc9a96a7bf73173d"} Oct 08 14:31:09 crc kubenswrapper[4735]: I1008 14:31:09.992044 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 14:31:10 crc kubenswrapper[4735]: I1008 14:31:10.007859 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 14:31:10 crc kubenswrapper[4735]: I1008 14:31:10.019376 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 14:31:10 crc kubenswrapper[4735]: I1008 14:31:10.020950 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 14:31:10 crc kubenswrapper[4735]: I1008 14:31:10.023321 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 08 14:31:10 crc kubenswrapper[4735]: I1008 14:31:10.027609 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 14:31:10 crc kubenswrapper[4735]: I1008 14:31:10.081616 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkrk2\" (UniqueName: \"kubernetes.io/projected/17137a92-fc98-4065-b44d-ce076e9b69a6-kube-api-access-xkrk2\") pod \"nova-scheduler-0\" (UID: \"17137a92-fc98-4065-b44d-ce076e9b69a6\") " pod="openstack/nova-scheduler-0" Oct 08 14:31:10 crc kubenswrapper[4735]: I1008 14:31:10.081688 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17137a92-fc98-4065-b44d-ce076e9b69a6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"17137a92-fc98-4065-b44d-ce076e9b69a6\") " pod="openstack/nova-scheduler-0" Oct 08 14:31:10 crc kubenswrapper[4735]: I1008 14:31:10.081716 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17137a92-fc98-4065-b44d-ce076e9b69a6-config-data\") pod \"nova-scheduler-0\" (UID: \"17137a92-fc98-4065-b44d-ce076e9b69a6\") " pod="openstack/nova-scheduler-0" Oct 08 14:31:10 crc kubenswrapper[4735]: I1008 14:31:10.183311 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkrk2\" (UniqueName: \"kubernetes.io/projected/17137a92-fc98-4065-b44d-ce076e9b69a6-kube-api-access-xkrk2\") pod \"nova-scheduler-0\" (UID: \"17137a92-fc98-4065-b44d-ce076e9b69a6\") " pod="openstack/nova-scheduler-0" Oct 08 14:31:10 crc kubenswrapper[4735]: I1008 14:31:10.183751 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17137a92-fc98-4065-b44d-ce076e9b69a6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"17137a92-fc98-4065-b44d-ce076e9b69a6\") " pod="openstack/nova-scheduler-0" Oct 08 14:31:10 crc kubenswrapper[4735]: I1008 14:31:10.183938 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17137a92-fc98-4065-b44d-ce076e9b69a6-config-data\") pod \"nova-scheduler-0\" (UID: \"17137a92-fc98-4065-b44d-ce076e9b69a6\") " pod="openstack/nova-scheduler-0" Oct 08 14:31:10 crc kubenswrapper[4735]: I1008 14:31:10.201861 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17137a92-fc98-4065-b44d-ce076e9b69a6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"17137a92-fc98-4065-b44d-ce076e9b69a6\") " pod="openstack/nova-scheduler-0" Oct 08 14:31:10 crc kubenswrapper[4735]: I1008 14:31:10.201903 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17137a92-fc98-4065-b44d-ce076e9b69a6-config-data\") pod \"nova-scheduler-0\" (UID: \"17137a92-fc98-4065-b44d-ce076e9b69a6\") " pod="openstack/nova-scheduler-0" Oct 08 14:31:10 crc kubenswrapper[4735]: I1008 14:31:10.205577 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkrk2\" (UniqueName: \"kubernetes.io/projected/17137a92-fc98-4065-b44d-ce076e9b69a6-kube-api-access-xkrk2\") pod \"nova-scheduler-0\" (UID: \"17137a92-fc98-4065-b44d-ce076e9b69a6\") " pod="openstack/nova-scheduler-0" Oct 08 14:31:10 crc kubenswrapper[4735]: I1008 14:31:10.340611 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 14:31:10 crc kubenswrapper[4735]: I1008 14:31:10.809242 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 14:31:10 crc kubenswrapper[4735]: I1008 14:31:10.917227 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="112e1c50-8472-4502-b6f0-a3896dd1cbf1" path="/var/lib/kubelet/pods/112e1c50-8472-4502-b6f0-a3896dd1cbf1/volumes" Oct 08 14:31:10 crc kubenswrapper[4735]: I1008 14:31:10.917913 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64032e23-f67c-4a29-85b3-ccbc2d2145f0" path="/var/lib/kubelet/pods/64032e23-f67c-4a29-85b3-ccbc2d2145f0/volumes" Oct 08 14:31:10 crc kubenswrapper[4735]: I1008 14:31:10.920904 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 08 14:31:10 crc kubenswrapper[4735]: I1008 14:31:10.921168 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="0c24bffc-5d3a-485a-8358-23a30149782b" containerName="nova-cell1-conductor-conductor" containerID="cri-o://400d48daf969739a5708592b7fe3f566ea0c9ed5621d7efc758a3e0e4297a9d7" gracePeriod=30 Oct 08 14:31:10 crc kubenswrapper[4735]: I1008 14:31:10.952134 4735 generic.go:334] "Generic (PLEG): container finished" podID="dd74581a-fd2f-464b-ab89-9a7af5bfa073" containerID="505dee2ebead61c62f6a9b4c56e795f6076a57ce69ecf346a6086b822c595a60" exitCode=0 Oct 08 14:31:10 crc kubenswrapper[4735]: I1008 14:31:10.952239 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dd74581a-fd2f-464b-ab89-9a7af5bfa073","Type":"ContainerDied","Data":"505dee2ebead61c62f6a9b4c56e795f6076a57ce69ecf346a6086b822c595a60"} Oct 08 14:31:10 crc kubenswrapper[4735]: I1008 14:31:10.953590 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"17137a92-fc98-4065-b44d-ce076e9b69a6","Type":"ContainerStarted","Data":"1faf9b75e99d8726d3d48b3ee72526c14ed735e5b67e6060db5f21a37e622221"} Oct 08 14:31:10 crc kubenswrapper[4735]: I1008 14:31:10.954874 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"799ff5fc-80ce-4ac8-8296-e133c2d2f1d7","Type":"ContainerStarted","Data":"f414ac8ddc28c93aa742300a0a1c1e7d24a45d11490fbe9c577480e7f0d55b0a"} Oct 08 14:31:10 crc kubenswrapper[4735]: I1008 14:31:10.957706 4735 generic.go:334] "Generic (PLEG): container finished" podID="61c0fe9a-51c8-480e-bcc6-2ade6e36890e" containerID="d429c139b5d3f6c7679a3ef1d46c9ab5fa50c16c71cf50744808c1cd36391262" exitCode=0 Oct 08 14:31:10 crc kubenswrapper[4735]: I1008 14:31:10.957751 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"61c0fe9a-51c8-480e-bcc6-2ade6e36890e","Type":"ContainerDied","Data":"d429c139b5d3f6c7679a3ef1d46c9ab5fa50c16c71cf50744808c1cd36391262"} Oct 08 14:31:10 crc kubenswrapper[4735]: I1008 14:31:10.995831 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.995811833 podStartE2EDuration="2.995811833s" podCreationTimestamp="2025-10-08 14:31:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:31:10.988194828 +0000 UTC m=+5720.796499382" watchObservedRunningTime="2025-10-08 14:31:10.995811833 +0000 UTC m=+5720.804116377" Oct 08 14:31:11 crc kubenswrapper[4735]: I1008 14:31:11.312443 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 14:31:11 crc kubenswrapper[4735]: I1008 14:31:11.348754 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 08 14:31:11 crc kubenswrapper[4735]: I1008 14:31:11.412129 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd74581a-fd2f-464b-ab89-9a7af5bfa073-combined-ca-bundle\") pod \"dd74581a-fd2f-464b-ab89-9a7af5bfa073\" (UID: \"dd74581a-fd2f-464b-ab89-9a7af5bfa073\") " Oct 08 14:31:11 crc kubenswrapper[4735]: I1008 14:31:11.412240 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvhqm\" (UniqueName: \"kubernetes.io/projected/61c0fe9a-51c8-480e-bcc6-2ade6e36890e-kube-api-access-kvhqm\") pod \"61c0fe9a-51c8-480e-bcc6-2ade6e36890e\" (UID: \"61c0fe9a-51c8-480e-bcc6-2ade6e36890e\") " Oct 08 14:31:11 crc kubenswrapper[4735]: I1008 14:31:11.412270 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd74581a-fd2f-464b-ab89-9a7af5bfa073-config-data\") pod \"dd74581a-fd2f-464b-ab89-9a7af5bfa073\" (UID: \"dd74581a-fd2f-464b-ab89-9a7af5bfa073\") " Oct 08 14:31:11 crc kubenswrapper[4735]: I1008 14:31:11.412343 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61c0fe9a-51c8-480e-bcc6-2ade6e36890e-combined-ca-bundle\") pod \"61c0fe9a-51c8-480e-bcc6-2ade6e36890e\" (UID: \"61c0fe9a-51c8-480e-bcc6-2ade6e36890e\") " Oct 08 14:31:11 crc kubenswrapper[4735]: I1008 14:31:11.412394 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61c0fe9a-51c8-480e-bcc6-2ade6e36890e-logs\") pod \"61c0fe9a-51c8-480e-bcc6-2ade6e36890e\" (UID: \"61c0fe9a-51c8-480e-bcc6-2ade6e36890e\") " Oct 08 14:31:11 crc kubenswrapper[4735]: I1008 14:31:11.412436 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-slkl2\" 
(UniqueName: \"kubernetes.io/projected/dd74581a-fd2f-464b-ab89-9a7af5bfa073-kube-api-access-slkl2\") pod \"dd74581a-fd2f-464b-ab89-9a7af5bfa073\" (UID: \"dd74581a-fd2f-464b-ab89-9a7af5bfa073\") " Oct 08 14:31:11 crc kubenswrapper[4735]: I1008 14:31:11.412628 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61c0fe9a-51c8-480e-bcc6-2ade6e36890e-config-data\") pod \"61c0fe9a-51c8-480e-bcc6-2ade6e36890e\" (UID: \"61c0fe9a-51c8-480e-bcc6-2ade6e36890e\") " Oct 08 14:31:11 crc kubenswrapper[4735]: I1008 14:31:11.412658 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd74581a-fd2f-464b-ab89-9a7af5bfa073-logs\") pod \"dd74581a-fd2f-464b-ab89-9a7af5bfa073\" (UID: \"dd74581a-fd2f-464b-ab89-9a7af5bfa073\") " Oct 08 14:31:11 crc kubenswrapper[4735]: I1008 14:31:11.413548 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd74581a-fd2f-464b-ab89-9a7af5bfa073-logs" (OuterVolumeSpecName: "logs") pod "dd74581a-fd2f-464b-ab89-9a7af5bfa073" (UID: "dd74581a-fd2f-464b-ab89-9a7af5bfa073"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 14:31:11 crc kubenswrapper[4735]: I1008 14:31:11.413983 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61c0fe9a-51c8-480e-bcc6-2ade6e36890e-logs" (OuterVolumeSpecName: "logs") pod "61c0fe9a-51c8-480e-bcc6-2ade6e36890e" (UID: "61c0fe9a-51c8-480e-bcc6-2ade6e36890e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 14:31:11 crc kubenswrapper[4735]: I1008 14:31:11.423127 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61c0fe9a-51c8-480e-bcc6-2ade6e36890e-kube-api-access-kvhqm" (OuterVolumeSpecName: "kube-api-access-kvhqm") pod "61c0fe9a-51c8-480e-bcc6-2ade6e36890e" (UID: "61c0fe9a-51c8-480e-bcc6-2ade6e36890e"). InnerVolumeSpecName "kube-api-access-kvhqm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:31:11 crc kubenswrapper[4735]: I1008 14:31:11.425143 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd74581a-fd2f-464b-ab89-9a7af5bfa073-kube-api-access-slkl2" (OuterVolumeSpecName: "kube-api-access-slkl2") pod "dd74581a-fd2f-464b-ab89-9a7af5bfa073" (UID: "dd74581a-fd2f-464b-ab89-9a7af5bfa073"). InnerVolumeSpecName "kube-api-access-slkl2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:31:11 crc kubenswrapper[4735]: I1008 14:31:11.445752 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61c0fe9a-51c8-480e-bcc6-2ade6e36890e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "61c0fe9a-51c8-480e-bcc6-2ade6e36890e" (UID: "61c0fe9a-51c8-480e-bcc6-2ade6e36890e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:31:11 crc kubenswrapper[4735]: I1008 14:31:11.449277 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61c0fe9a-51c8-480e-bcc6-2ade6e36890e-config-data" (OuterVolumeSpecName: "config-data") pod "61c0fe9a-51c8-480e-bcc6-2ade6e36890e" (UID: "61c0fe9a-51c8-480e-bcc6-2ade6e36890e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:31:11 crc kubenswrapper[4735]: I1008 14:31:11.454893 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd74581a-fd2f-464b-ab89-9a7af5bfa073-config-data" (OuterVolumeSpecName: "config-data") pod "dd74581a-fd2f-464b-ab89-9a7af5bfa073" (UID: "dd74581a-fd2f-464b-ab89-9a7af5bfa073"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:31:11 crc kubenswrapper[4735]: I1008 14:31:11.465588 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd74581a-fd2f-464b-ab89-9a7af5bfa073-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dd74581a-fd2f-464b-ab89-9a7af5bfa073" (UID: "dd74581a-fd2f-464b-ab89-9a7af5bfa073"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:31:11 crc kubenswrapper[4735]: I1008 14:31:11.514795 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61c0fe9a-51c8-480e-bcc6-2ade6e36890e-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 14:31:11 crc kubenswrapper[4735]: I1008 14:31:11.514828 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd74581a-fd2f-464b-ab89-9a7af5bfa073-logs\") on node \"crc\" DevicePath \"\"" Oct 08 14:31:11 crc kubenswrapper[4735]: I1008 14:31:11.514841 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd74581a-fd2f-464b-ab89-9a7af5bfa073-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 14:31:11 crc kubenswrapper[4735]: I1008 14:31:11.514854 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvhqm\" (UniqueName: \"kubernetes.io/projected/61c0fe9a-51c8-480e-bcc6-2ade6e36890e-kube-api-access-kvhqm\") on node \"crc\" DevicePath \"\"" Oct 08 
14:31:11 crc kubenswrapper[4735]: I1008 14:31:11.514863 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd74581a-fd2f-464b-ab89-9a7af5bfa073-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 14:31:11 crc kubenswrapper[4735]: I1008 14:31:11.514872 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61c0fe9a-51c8-480e-bcc6-2ade6e36890e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 14:31:11 crc kubenswrapper[4735]: I1008 14:31:11.514880 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61c0fe9a-51c8-480e-bcc6-2ade6e36890e-logs\") on node \"crc\" DevicePath \"\"" Oct 08 14:31:11 crc kubenswrapper[4735]: I1008 14:31:11.514888 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-slkl2\" (UniqueName: \"kubernetes.io/projected/dd74581a-fd2f-464b-ab89-9a7af5bfa073-kube-api-access-slkl2\") on node \"crc\" DevicePath \"\"" Oct 08 14:31:11 crc kubenswrapper[4735]: I1008 14:31:11.979714 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 08 14:31:11 crc kubenswrapper[4735]: I1008 14:31:11.979706 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"61c0fe9a-51c8-480e-bcc6-2ade6e36890e","Type":"ContainerDied","Data":"f3d0e19decdda44db132cf3a302c4b2f03cd97f4945e803469db93627694a5a6"} Oct 08 14:31:11 crc kubenswrapper[4735]: I1008 14:31:11.980132 4735 scope.go:117] "RemoveContainer" containerID="d429c139b5d3f6c7679a3ef1d46c9ab5fa50c16c71cf50744808c1cd36391262" Oct 08 14:31:11 crc kubenswrapper[4735]: I1008 14:31:11.981676 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dd74581a-fd2f-464b-ab89-9a7af5bfa073","Type":"ContainerDied","Data":"e9b181c6154f70869dfb9eaddbea750568c4220895a406aecf7b6e5ee7896bf2"} Oct 08 14:31:11 crc kubenswrapper[4735]: I1008 14:31:11.981680 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 14:31:11 crc kubenswrapper[4735]: I1008 14:31:11.983668 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"17137a92-fc98-4065-b44d-ce076e9b69a6","Type":"ContainerStarted","Data":"35158a3309584ee6d574fff4dbb09b4b7351a878447ba39b54813f4499898649"} Oct 08 14:31:12 crc kubenswrapper[4735]: I1008 14:31:12.022524 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.022502424 podStartE2EDuration="3.022502424s" podCreationTimestamp="2025-10-08 14:31:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:31:12.021803616 +0000 UTC m=+5721.830108150" watchObservedRunningTime="2025-10-08 14:31:12.022502424 +0000 UTC m=+5721.830806968" Oct 08 14:31:12 crc kubenswrapper[4735]: I1008 14:31:12.029271 4735 scope.go:117] "RemoveContainer" 
containerID="03fc76a902d3e6b1a3bff19dba3564242f38f6df73747bb40ec88b3cc530b4c9" Oct 08 14:31:12 crc kubenswrapper[4735]: I1008 14:31:12.113647 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 14:31:12 crc kubenswrapper[4735]: I1008 14:31:12.126240 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 14:31:12 crc kubenswrapper[4735]: I1008 14:31:12.135851 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 08 14:31:12 crc kubenswrapper[4735]: I1008 14:31:12.142620 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 08 14:31:12 crc kubenswrapper[4735]: I1008 14:31:12.151629 4735 scope.go:117] "RemoveContainer" containerID="505dee2ebead61c62f6a9b4c56e795f6076a57ce69ecf346a6086b822c595a60" Oct 08 14:31:12 crc kubenswrapper[4735]: I1008 14:31:12.169955 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 08 14:31:12 crc kubenswrapper[4735]: E1008 14:31:12.170289 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd74581a-fd2f-464b-ab89-9a7af5bfa073" containerName="nova-metadata-log" Oct 08 14:31:12 crc kubenswrapper[4735]: I1008 14:31:12.170305 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd74581a-fd2f-464b-ab89-9a7af5bfa073" containerName="nova-metadata-log" Oct 08 14:31:12 crc kubenswrapper[4735]: E1008 14:31:12.170324 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61c0fe9a-51c8-480e-bcc6-2ade6e36890e" containerName="nova-api-log" Oct 08 14:31:12 crc kubenswrapper[4735]: I1008 14:31:12.170332 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="61c0fe9a-51c8-480e-bcc6-2ade6e36890e" containerName="nova-api-log" Oct 08 14:31:12 crc kubenswrapper[4735]: E1008 14:31:12.170350 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd74581a-fd2f-464b-ab89-9a7af5bfa073" 
containerName="nova-metadata-metadata" Oct 08 14:31:12 crc kubenswrapper[4735]: I1008 14:31:12.170356 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd74581a-fd2f-464b-ab89-9a7af5bfa073" containerName="nova-metadata-metadata" Oct 08 14:31:12 crc kubenswrapper[4735]: E1008 14:31:12.170371 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61c0fe9a-51c8-480e-bcc6-2ade6e36890e" containerName="nova-api-api" Oct 08 14:31:12 crc kubenswrapper[4735]: I1008 14:31:12.170377 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="61c0fe9a-51c8-480e-bcc6-2ade6e36890e" containerName="nova-api-api" Oct 08 14:31:12 crc kubenswrapper[4735]: I1008 14:31:12.175278 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="61c0fe9a-51c8-480e-bcc6-2ade6e36890e" containerName="nova-api-log" Oct 08 14:31:12 crc kubenswrapper[4735]: I1008 14:31:12.175302 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd74581a-fd2f-464b-ab89-9a7af5bfa073" containerName="nova-metadata-log" Oct 08 14:31:12 crc kubenswrapper[4735]: I1008 14:31:12.175325 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="61c0fe9a-51c8-480e-bcc6-2ade6e36890e" containerName="nova-api-api" Oct 08 14:31:12 crc kubenswrapper[4735]: I1008 14:31:12.175341 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd74581a-fd2f-464b-ab89-9a7af5bfa073" containerName="nova-metadata-metadata" Oct 08 14:31:12 crc kubenswrapper[4735]: I1008 14:31:12.176317 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 14:31:12 crc kubenswrapper[4735]: I1008 14:31:12.181878 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 08 14:31:12 crc kubenswrapper[4735]: I1008 14:31:12.217575 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 08 14:31:12 crc kubenswrapper[4735]: I1008 14:31:12.220382 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 08 14:31:12 crc kubenswrapper[4735]: I1008 14:31:12.230007 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb09392d-77b7-4a05-99f7-dec7aa054361-logs\") pod \"nova-metadata-0\" (UID: \"cb09392d-77b7-4a05-99f7-dec7aa054361\") " pod="openstack/nova-metadata-0" Oct 08 14:31:12 crc kubenswrapper[4735]: I1008 14:31:12.230143 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb09392d-77b7-4a05-99f7-dec7aa054361-config-data\") pod \"nova-metadata-0\" (UID: \"cb09392d-77b7-4a05-99f7-dec7aa054361\") " pod="openstack/nova-metadata-0" Oct 08 14:31:12 crc kubenswrapper[4735]: I1008 14:31:12.230196 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb09392d-77b7-4a05-99f7-dec7aa054361-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cb09392d-77b7-4a05-99f7-dec7aa054361\") " pod="openstack/nova-metadata-0" Oct 08 14:31:12 crc kubenswrapper[4735]: I1008 14:31:12.230261 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5wxc\" (UniqueName: \"kubernetes.io/projected/cb09392d-77b7-4a05-99f7-dec7aa054361-kube-api-access-h5wxc\") pod \"nova-metadata-0\" (UID: 
\"cb09392d-77b7-4a05-99f7-dec7aa054361\") " pod="openstack/nova-metadata-0" Oct 08 14:31:12 crc kubenswrapper[4735]: I1008 14:31:12.230911 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 08 14:31:12 crc kubenswrapper[4735]: I1008 14:31:12.250499 4735 scope.go:117] "RemoveContainer" containerID="da3ec38bf3b245be383933aa838dc187fcb6c6aceda4a8f0fd9e6648c90b5319" Oct 08 14:31:12 crc kubenswrapper[4735]: I1008 14:31:12.262242 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 08 14:31:12 crc kubenswrapper[4735]: I1008 14:31:12.291503 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 14:31:12 crc kubenswrapper[4735]: I1008 14:31:12.332503 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a498d75-0f33-45a0-845f-5e0b659dd821-config-data\") pod \"nova-api-0\" (UID: \"2a498d75-0f33-45a0-845f-5e0b659dd821\") " pod="openstack/nova-api-0" Oct 08 14:31:12 crc kubenswrapper[4735]: I1008 14:31:12.332581 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb09392d-77b7-4a05-99f7-dec7aa054361-config-data\") pod \"nova-metadata-0\" (UID: \"cb09392d-77b7-4a05-99f7-dec7aa054361\") " pod="openstack/nova-metadata-0" Oct 08 14:31:12 crc kubenswrapper[4735]: I1008 14:31:12.332621 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb09392d-77b7-4a05-99f7-dec7aa054361-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cb09392d-77b7-4a05-99f7-dec7aa054361\") " pod="openstack/nova-metadata-0" Oct 08 14:31:12 crc kubenswrapper[4735]: I1008 14:31:12.332638 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/2a498d75-0f33-45a0-845f-5e0b659dd821-logs\") pod \"nova-api-0\" (UID: \"2a498d75-0f33-45a0-845f-5e0b659dd821\") " pod="openstack/nova-api-0" Oct 08 14:31:12 crc kubenswrapper[4735]: I1008 14:31:12.332676 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a498d75-0f33-45a0-845f-5e0b659dd821-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2a498d75-0f33-45a0-845f-5e0b659dd821\") " pod="openstack/nova-api-0" Oct 08 14:31:12 crc kubenswrapper[4735]: I1008 14:31:12.332699 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5wxc\" (UniqueName: \"kubernetes.io/projected/cb09392d-77b7-4a05-99f7-dec7aa054361-kube-api-access-h5wxc\") pod \"nova-metadata-0\" (UID: \"cb09392d-77b7-4a05-99f7-dec7aa054361\") " pod="openstack/nova-metadata-0" Oct 08 14:31:12 crc kubenswrapper[4735]: I1008 14:31:12.332718 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfg7v\" (UniqueName: \"kubernetes.io/projected/2a498d75-0f33-45a0-845f-5e0b659dd821-kube-api-access-lfg7v\") pod \"nova-api-0\" (UID: \"2a498d75-0f33-45a0-845f-5e0b659dd821\") " pod="openstack/nova-api-0" Oct 08 14:31:12 crc kubenswrapper[4735]: I1008 14:31:12.332738 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb09392d-77b7-4a05-99f7-dec7aa054361-logs\") pod \"nova-metadata-0\" (UID: \"cb09392d-77b7-4a05-99f7-dec7aa054361\") " pod="openstack/nova-metadata-0" Oct 08 14:31:12 crc kubenswrapper[4735]: I1008 14:31:12.333124 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb09392d-77b7-4a05-99f7-dec7aa054361-logs\") pod \"nova-metadata-0\" (UID: \"cb09392d-77b7-4a05-99f7-dec7aa054361\") " pod="openstack/nova-metadata-0" Oct 08 
14:31:12 crc kubenswrapper[4735]: I1008 14:31:12.344251 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb09392d-77b7-4a05-99f7-dec7aa054361-config-data\") pod \"nova-metadata-0\" (UID: \"cb09392d-77b7-4a05-99f7-dec7aa054361\") " pod="openstack/nova-metadata-0" Oct 08 14:31:12 crc kubenswrapper[4735]: I1008 14:31:12.345358 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb09392d-77b7-4a05-99f7-dec7aa054361-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cb09392d-77b7-4a05-99f7-dec7aa054361\") " pod="openstack/nova-metadata-0" Oct 08 14:31:12 crc kubenswrapper[4735]: I1008 14:31:12.357628 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5wxc\" (UniqueName: \"kubernetes.io/projected/cb09392d-77b7-4a05-99f7-dec7aa054361-kube-api-access-h5wxc\") pod \"nova-metadata-0\" (UID: \"cb09392d-77b7-4a05-99f7-dec7aa054361\") " pod="openstack/nova-metadata-0" Oct 08 14:31:12 crc kubenswrapper[4735]: I1008 14:31:12.434569 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a498d75-0f33-45a0-845f-5e0b659dd821-config-data\") pod \"nova-api-0\" (UID: \"2a498d75-0f33-45a0-845f-5e0b659dd821\") " pod="openstack/nova-api-0" Oct 08 14:31:12 crc kubenswrapper[4735]: I1008 14:31:12.434672 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a498d75-0f33-45a0-845f-5e0b659dd821-logs\") pod \"nova-api-0\" (UID: \"2a498d75-0f33-45a0-845f-5e0b659dd821\") " pod="openstack/nova-api-0" Oct 08 14:31:12 crc kubenswrapper[4735]: I1008 14:31:12.434729 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2a498d75-0f33-45a0-845f-5e0b659dd821-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2a498d75-0f33-45a0-845f-5e0b659dd821\") " pod="openstack/nova-api-0" Oct 08 14:31:12 crc kubenswrapper[4735]: I1008 14:31:12.434768 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfg7v\" (UniqueName: \"kubernetes.io/projected/2a498d75-0f33-45a0-845f-5e0b659dd821-kube-api-access-lfg7v\") pod \"nova-api-0\" (UID: \"2a498d75-0f33-45a0-845f-5e0b659dd821\") " pod="openstack/nova-api-0" Oct 08 14:31:12 crc kubenswrapper[4735]: I1008 14:31:12.436600 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a498d75-0f33-45a0-845f-5e0b659dd821-logs\") pod \"nova-api-0\" (UID: \"2a498d75-0f33-45a0-845f-5e0b659dd821\") " pod="openstack/nova-api-0" Oct 08 14:31:12 crc kubenswrapper[4735]: I1008 14:31:12.439189 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a498d75-0f33-45a0-845f-5e0b659dd821-config-data\") pod \"nova-api-0\" (UID: \"2a498d75-0f33-45a0-845f-5e0b659dd821\") " pod="openstack/nova-api-0" Oct 08 14:31:12 crc kubenswrapper[4735]: I1008 14:31:12.439810 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a498d75-0f33-45a0-845f-5e0b659dd821-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2a498d75-0f33-45a0-845f-5e0b659dd821\") " pod="openstack/nova-api-0" Oct 08 14:31:12 crc kubenswrapper[4735]: I1008 14:31:12.485066 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfg7v\" (UniqueName: \"kubernetes.io/projected/2a498d75-0f33-45a0-845f-5e0b659dd821-kube-api-access-lfg7v\") pod \"nova-api-0\" (UID: \"2a498d75-0f33-45a0-845f-5e0b659dd821\") " pod="openstack/nova-api-0" Oct 08 14:31:12 crc kubenswrapper[4735]: I1008 14:31:12.539060 4735 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 14:31:12 crc kubenswrapper[4735]: I1008 14:31:12.563414 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 08 14:31:12 crc kubenswrapper[4735]: I1008 14:31:12.912239 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61c0fe9a-51c8-480e-bcc6-2ade6e36890e" path="/var/lib/kubelet/pods/61c0fe9a-51c8-480e-bcc6-2ade6e36890e/volumes" Oct 08 14:31:12 crc kubenswrapper[4735]: I1008 14:31:12.914533 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd74581a-fd2f-464b-ab89-9a7af5bfa073" path="/var/lib/kubelet/pods/dd74581a-fd2f-464b-ab89-9a7af5bfa073/volumes" Oct 08 14:31:13 crc kubenswrapper[4735]: I1008 14:31:13.019186 4735 generic.go:334] "Generic (PLEG): container finished" podID="ba01536e-ef13-4daf-a7a2-8405ee0bef12" containerID="66b8bfd99e6b442bb6d1b99758ed5461e8abf5762a1f9656067bc42332198452" exitCode=0 Oct 08 14:31:13 crc kubenswrapper[4735]: I1008 14:31:13.019269 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"ba01536e-ef13-4daf-a7a2-8405ee0bef12","Type":"ContainerDied","Data":"66b8bfd99e6b442bb6d1b99758ed5461e8abf5762a1f9656067bc42332198452"} Oct 08 14:31:13 crc kubenswrapper[4735]: I1008 14:31:13.083134 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 14:31:13 crc kubenswrapper[4735]: E1008 14:31:13.122791 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="400d48daf969739a5708592b7fe3f566ea0c9ed5621d7efc758a3e0e4297a9d7" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 08 14:31:13 crc kubenswrapper[4735]: E1008 14:31:13.128679 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: 
code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="400d48daf969739a5708592b7fe3f566ea0c9ed5621d7efc758a3e0e4297a9d7" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 08 14:31:13 crc kubenswrapper[4735]: I1008 14:31:13.139315 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 08 14:31:13 crc kubenswrapper[4735]: E1008 14:31:13.145656 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="400d48daf969739a5708592b7fe3f566ea0c9ed5621d7efc758a3e0e4297a9d7" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 08 14:31:13 crc kubenswrapper[4735]: E1008 14:31:13.145734 4735 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="0c24bffc-5d3a-485a-8358-23a30149782b" containerName="nova-cell1-conductor-conductor" Oct 08 14:31:13 crc kubenswrapper[4735]: I1008 14:31:13.368860 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 08 14:31:13 crc kubenswrapper[4735]: I1008 14:31:13.454805 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba01536e-ef13-4daf-a7a2-8405ee0bef12-config-data\") pod \"ba01536e-ef13-4daf-a7a2-8405ee0bef12\" (UID: \"ba01536e-ef13-4daf-a7a2-8405ee0bef12\") " Oct 08 14:31:13 crc kubenswrapper[4735]: I1008 14:31:13.454942 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba01536e-ef13-4daf-a7a2-8405ee0bef12-combined-ca-bundle\") pod \"ba01536e-ef13-4daf-a7a2-8405ee0bef12\" (UID: \"ba01536e-ef13-4daf-a7a2-8405ee0bef12\") " Oct 08 14:31:13 crc kubenswrapper[4735]: I1008 14:31:13.455006 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdknf\" (UniqueName: \"kubernetes.io/projected/ba01536e-ef13-4daf-a7a2-8405ee0bef12-kube-api-access-wdknf\") pod \"ba01536e-ef13-4daf-a7a2-8405ee0bef12\" (UID: \"ba01536e-ef13-4daf-a7a2-8405ee0bef12\") " Oct 08 14:31:13 crc kubenswrapper[4735]: I1008 14:31:13.462723 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba01536e-ef13-4daf-a7a2-8405ee0bef12-kube-api-access-wdknf" (OuterVolumeSpecName: "kube-api-access-wdknf") pod "ba01536e-ef13-4daf-a7a2-8405ee0bef12" (UID: "ba01536e-ef13-4daf-a7a2-8405ee0bef12"). InnerVolumeSpecName "kube-api-access-wdknf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:31:13 crc kubenswrapper[4735]: I1008 14:31:13.496042 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba01536e-ef13-4daf-a7a2-8405ee0bef12-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ba01536e-ef13-4daf-a7a2-8405ee0bef12" (UID: "ba01536e-ef13-4daf-a7a2-8405ee0bef12"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:31:13 crc kubenswrapper[4735]: I1008 14:31:13.497821 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba01536e-ef13-4daf-a7a2-8405ee0bef12-config-data" (OuterVolumeSpecName: "config-data") pod "ba01536e-ef13-4daf-a7a2-8405ee0bef12" (UID: "ba01536e-ef13-4daf-a7a2-8405ee0bef12"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:31:13 crc kubenswrapper[4735]: I1008 14:31:13.556991 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba01536e-ef13-4daf-a7a2-8405ee0bef12-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 14:31:13 crc kubenswrapper[4735]: I1008 14:31:13.557040 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdknf\" (UniqueName: \"kubernetes.io/projected/ba01536e-ef13-4daf-a7a2-8405ee0bef12-kube-api-access-wdknf\") on node \"crc\" DevicePath \"\"" Oct 08 14:31:13 crc kubenswrapper[4735]: I1008 14:31:13.557053 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba01536e-ef13-4daf-a7a2-8405ee0bef12-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 14:31:14 crc kubenswrapper[4735]: I1008 14:31:14.033599 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cb09392d-77b7-4a05-99f7-dec7aa054361","Type":"ContainerStarted","Data":"4efa4a32c0465a3d4e59dd4b1574f0f3d30c7a2d26b9556774c0af68538c5182"} Oct 08 14:31:14 crc kubenswrapper[4735]: I1008 14:31:14.033926 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cb09392d-77b7-4a05-99f7-dec7aa054361","Type":"ContainerStarted","Data":"b5d2bc10f8fb769401b9a08d4823f1d8003eb3df5936b9a6868827e5ccd88f7d"} Oct 08 14:31:14 crc kubenswrapper[4735]: I1008 14:31:14.033938 4735 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cb09392d-77b7-4a05-99f7-dec7aa054361","Type":"ContainerStarted","Data":"1bb9461878a54db9dcb9d310d1bab8b559afceb9a9749e75a07c4c8c53fe5d43"} Oct 08 14:31:14 crc kubenswrapper[4735]: I1008 14:31:14.035678 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"ba01536e-ef13-4daf-a7a2-8405ee0bef12","Type":"ContainerDied","Data":"3c2dd9a81c0e4ba67036d5009fceb6ed99477dd1b803877f02c63b341d2243b4"} Oct 08 14:31:14 crc kubenswrapper[4735]: I1008 14:31:14.035696 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 08 14:31:14 crc kubenswrapper[4735]: I1008 14:31:14.035730 4735 scope.go:117] "RemoveContainer" containerID="66b8bfd99e6b442bb6d1b99758ed5461e8abf5762a1f9656067bc42332198452" Oct 08 14:31:14 crc kubenswrapper[4735]: I1008 14:31:14.042157 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2a498d75-0f33-45a0-845f-5e0b659dd821","Type":"ContainerStarted","Data":"b5ff34be5e4f3d90afc298026af63b785213a8e36dcc381644c2ff8bb165f5ff"} Oct 08 14:31:14 crc kubenswrapper[4735]: I1008 14:31:14.042195 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2a498d75-0f33-45a0-845f-5e0b659dd821","Type":"ContainerStarted","Data":"12761247400a2b571664283dde8434822ff0bc66735a44f46a98715dcce46fa4"} Oct 08 14:31:14 crc kubenswrapper[4735]: I1008 14:31:14.042206 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2a498d75-0f33-45a0-845f-5e0b659dd821","Type":"ContainerStarted","Data":"d9f2306e73dcb2c162ceb3ef44366c534bdb8d9d0c3dc48fd6b292e273ea6957"} Oct 08 14:31:14 crc kubenswrapper[4735]: I1008 14:31:14.061395 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.061375544 
podStartE2EDuration="2.061375544s" podCreationTimestamp="2025-10-08 14:31:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:31:14.056668574 +0000 UTC m=+5723.864973128" watchObservedRunningTime="2025-10-08 14:31:14.061375544 +0000 UTC m=+5723.869680088" Oct 08 14:31:14 crc kubenswrapper[4735]: I1008 14:31:14.084702 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.08465085 podStartE2EDuration="2.08465085s" podCreationTimestamp="2025-10-08 14:31:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:31:14.079869878 +0000 UTC m=+5723.888174432" watchObservedRunningTime="2025-10-08 14:31:14.08465085 +0000 UTC m=+5723.892955404" Oct 08 14:31:14 crc kubenswrapper[4735]: I1008 14:31:14.101365 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 08 14:31:14 crc kubenswrapper[4735]: I1008 14:31:14.109099 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 08 14:31:14 crc kubenswrapper[4735]: I1008 14:31:14.118205 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 08 14:31:14 crc kubenswrapper[4735]: E1008 14:31:14.118572 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba01536e-ef13-4daf-a7a2-8405ee0bef12" containerName="nova-cell0-conductor-conductor" Oct 08 14:31:14 crc kubenswrapper[4735]: I1008 14:31:14.118587 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba01536e-ef13-4daf-a7a2-8405ee0bef12" containerName="nova-cell0-conductor-conductor" Oct 08 14:31:14 crc kubenswrapper[4735]: I1008 14:31:14.118773 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba01536e-ef13-4daf-a7a2-8405ee0bef12" 
containerName="nova-cell0-conductor-conductor" Oct 08 14:31:14 crc kubenswrapper[4735]: I1008 14:31:14.119351 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 08 14:31:14 crc kubenswrapper[4735]: I1008 14:31:14.121132 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 08 14:31:14 crc kubenswrapper[4735]: I1008 14:31:14.127855 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 08 14:31:14 crc kubenswrapper[4735]: I1008 14:31:14.168746 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gngkn\" (UniqueName: \"kubernetes.io/projected/a01d711d-4d74-4227-981c-1b44deaa2823-kube-api-access-gngkn\") pod \"nova-cell0-conductor-0\" (UID: \"a01d711d-4d74-4227-981c-1b44deaa2823\") " pod="openstack/nova-cell0-conductor-0" Oct 08 14:31:14 crc kubenswrapper[4735]: I1008 14:31:14.168932 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a01d711d-4d74-4227-981c-1b44deaa2823-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a01d711d-4d74-4227-981c-1b44deaa2823\") " pod="openstack/nova-cell0-conductor-0" Oct 08 14:31:14 crc kubenswrapper[4735]: I1008 14:31:14.168982 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a01d711d-4d74-4227-981c-1b44deaa2823-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a01d711d-4d74-4227-981c-1b44deaa2823\") " pod="openstack/nova-cell0-conductor-0" Oct 08 14:31:14 crc kubenswrapper[4735]: I1008 14:31:14.271076 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a01d711d-4d74-4227-981c-1b44deaa2823-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a01d711d-4d74-4227-981c-1b44deaa2823\") " pod="openstack/nova-cell0-conductor-0" Oct 08 14:31:14 crc kubenswrapper[4735]: I1008 14:31:14.271147 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gngkn\" (UniqueName: \"kubernetes.io/projected/a01d711d-4d74-4227-981c-1b44deaa2823-kube-api-access-gngkn\") pod \"nova-cell0-conductor-0\" (UID: \"a01d711d-4d74-4227-981c-1b44deaa2823\") " pod="openstack/nova-cell0-conductor-0" Oct 08 14:31:14 crc kubenswrapper[4735]: I1008 14:31:14.271296 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a01d711d-4d74-4227-981c-1b44deaa2823-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a01d711d-4d74-4227-981c-1b44deaa2823\") " pod="openstack/nova-cell0-conductor-0" Oct 08 14:31:14 crc kubenswrapper[4735]: I1008 14:31:14.275153 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a01d711d-4d74-4227-981c-1b44deaa2823-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a01d711d-4d74-4227-981c-1b44deaa2823\") " pod="openstack/nova-cell0-conductor-0" Oct 08 14:31:14 crc kubenswrapper[4735]: I1008 14:31:14.275739 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a01d711d-4d74-4227-981c-1b44deaa2823-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a01d711d-4d74-4227-981c-1b44deaa2823\") " pod="openstack/nova-cell0-conductor-0" Oct 08 14:31:14 crc kubenswrapper[4735]: I1008 14:31:14.290127 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gngkn\" (UniqueName: \"kubernetes.io/projected/a01d711d-4d74-4227-981c-1b44deaa2823-kube-api-access-gngkn\") pod \"nova-cell0-conductor-0\" (UID: 
\"a01d711d-4d74-4227-981c-1b44deaa2823\") " pod="openstack/nova-cell0-conductor-0" Oct 08 14:31:14 crc kubenswrapper[4735]: I1008 14:31:14.341688 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 08 14:31:14 crc kubenswrapper[4735]: I1008 14:31:14.437892 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 08 14:31:14 crc kubenswrapper[4735]: I1008 14:31:14.868957 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 08 14:31:14 crc kubenswrapper[4735]: W1008 14:31:14.875432 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda01d711d_4d74_4227_981c_1b44deaa2823.slice/crio-1e16236dd3485e944ae0914e1eb79debbd67e8fc478c5b805f2b10e511a54da3 WatchSource:0}: Error finding container 1e16236dd3485e944ae0914e1eb79debbd67e8fc478c5b805f2b10e511a54da3: Status 404 returned error can't find the container with id 1e16236dd3485e944ae0914e1eb79debbd67e8fc478c5b805f2b10e511a54da3 Oct 08 14:31:14 crc kubenswrapper[4735]: I1008 14:31:14.904042 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba01536e-ef13-4daf-a7a2-8405ee0bef12" path="/var/lib/kubelet/pods/ba01536e-ef13-4daf-a7a2-8405ee0bef12/volumes" Oct 08 14:31:15 crc kubenswrapper[4735]: I1008 14:31:15.064156 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a01d711d-4d74-4227-981c-1b44deaa2823","Type":"ContainerStarted","Data":"1e16236dd3485e944ae0914e1eb79debbd67e8fc478c5b805f2b10e511a54da3"} Oct 08 14:31:15 crc kubenswrapper[4735]: I1008 14:31:15.341739 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 08 14:31:16 crc kubenswrapper[4735]: I1008 14:31:16.075741 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-conductor-0" event={"ID":"a01d711d-4d74-4227-981c-1b44deaa2823","Type":"ContainerStarted","Data":"7daab2f7d553902c7b3419325124dde542ea91b919938e9e8c6c11dc122ecca5"} Oct 08 14:31:16 crc kubenswrapper[4735]: I1008 14:31:16.076146 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 08 14:31:16 crc kubenswrapper[4735]: I1008 14:31:16.093253 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.093237206 podStartE2EDuration="2.093237206s" podCreationTimestamp="2025-10-08 14:31:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:31:16.092855136 +0000 UTC m=+5725.901159700" watchObservedRunningTime="2025-10-08 14:31:16.093237206 +0000 UTC m=+5725.901541750" Oct 08 14:31:17 crc kubenswrapper[4735]: I1008 14:31:17.489010 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 08 14:31:17 crc kubenswrapper[4735]: I1008 14:31:17.539982 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 08 14:31:17 crc kubenswrapper[4735]: I1008 14:31:17.540027 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 08 14:31:17 crc kubenswrapper[4735]: I1008 14:31:17.554778 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c24bffc-5d3a-485a-8358-23a30149782b-config-data\") pod \"0c24bffc-5d3a-485a-8358-23a30149782b\" (UID: \"0c24bffc-5d3a-485a-8358-23a30149782b\") " Oct 08 14:31:17 crc kubenswrapper[4735]: I1008 14:31:17.555754 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ng7hz\" (UniqueName: \"kubernetes.io/projected/0c24bffc-5d3a-485a-8358-23a30149782b-kube-api-access-ng7hz\") pod \"0c24bffc-5d3a-485a-8358-23a30149782b\" (UID: \"0c24bffc-5d3a-485a-8358-23a30149782b\") " Oct 08 14:31:17 crc kubenswrapper[4735]: I1008 14:31:17.555909 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c24bffc-5d3a-485a-8358-23a30149782b-combined-ca-bundle\") pod \"0c24bffc-5d3a-485a-8358-23a30149782b\" (UID: \"0c24bffc-5d3a-485a-8358-23a30149782b\") " Oct 08 14:31:17 crc kubenswrapper[4735]: I1008 14:31:17.561507 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c24bffc-5d3a-485a-8358-23a30149782b-kube-api-access-ng7hz" (OuterVolumeSpecName: "kube-api-access-ng7hz") pod "0c24bffc-5d3a-485a-8358-23a30149782b" (UID: "0c24bffc-5d3a-485a-8358-23a30149782b"). InnerVolumeSpecName "kube-api-access-ng7hz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:31:17 crc kubenswrapper[4735]: I1008 14:31:17.596210 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c24bffc-5d3a-485a-8358-23a30149782b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0c24bffc-5d3a-485a-8358-23a30149782b" (UID: "0c24bffc-5d3a-485a-8358-23a30149782b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:31:17 crc kubenswrapper[4735]: I1008 14:31:17.629614 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c24bffc-5d3a-485a-8358-23a30149782b-config-data" (OuterVolumeSpecName: "config-data") pod "0c24bffc-5d3a-485a-8358-23a30149782b" (UID: "0c24bffc-5d3a-485a-8358-23a30149782b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:31:17 crc kubenswrapper[4735]: I1008 14:31:17.657866 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c24bffc-5d3a-485a-8358-23a30149782b-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 14:31:17 crc kubenswrapper[4735]: I1008 14:31:17.657896 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ng7hz\" (UniqueName: \"kubernetes.io/projected/0c24bffc-5d3a-485a-8358-23a30149782b-kube-api-access-ng7hz\") on node \"crc\" DevicePath \"\"" Oct 08 14:31:17 crc kubenswrapper[4735]: I1008 14:31:17.657910 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c24bffc-5d3a-485a-8358-23a30149782b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 14:31:18 crc kubenswrapper[4735]: I1008 14:31:18.092825 4735 generic.go:334] "Generic (PLEG): container finished" podID="0c24bffc-5d3a-485a-8358-23a30149782b" containerID="400d48daf969739a5708592b7fe3f566ea0c9ed5621d7efc758a3e0e4297a9d7" 
exitCode=0 Oct 08 14:31:18 crc kubenswrapper[4735]: I1008 14:31:18.092886 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 08 14:31:18 crc kubenswrapper[4735]: I1008 14:31:18.092897 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"0c24bffc-5d3a-485a-8358-23a30149782b","Type":"ContainerDied","Data":"400d48daf969739a5708592b7fe3f566ea0c9ed5621d7efc758a3e0e4297a9d7"} Oct 08 14:31:18 crc kubenswrapper[4735]: I1008 14:31:18.092943 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"0c24bffc-5d3a-485a-8358-23a30149782b","Type":"ContainerDied","Data":"83321369e60e7732f340945c3f9c466f0901fc468169b58e914257f6e57f0c5c"} Oct 08 14:31:18 crc kubenswrapper[4735]: I1008 14:31:18.092969 4735 scope.go:117] "RemoveContainer" containerID="400d48daf969739a5708592b7fe3f566ea0c9ed5621d7efc758a3e0e4297a9d7" Oct 08 14:31:18 crc kubenswrapper[4735]: I1008 14:31:18.117980 4735 scope.go:117] "RemoveContainer" containerID="400d48daf969739a5708592b7fe3f566ea0c9ed5621d7efc758a3e0e4297a9d7" Oct 08 14:31:18 crc kubenswrapper[4735]: E1008 14:31:18.118417 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"400d48daf969739a5708592b7fe3f566ea0c9ed5621d7efc758a3e0e4297a9d7\": container with ID starting with 400d48daf969739a5708592b7fe3f566ea0c9ed5621d7efc758a3e0e4297a9d7 not found: ID does not exist" containerID="400d48daf969739a5708592b7fe3f566ea0c9ed5621d7efc758a3e0e4297a9d7" Oct 08 14:31:18 crc kubenswrapper[4735]: I1008 14:31:18.118460 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"400d48daf969739a5708592b7fe3f566ea0c9ed5621d7efc758a3e0e4297a9d7"} err="failed to get container status \"400d48daf969739a5708592b7fe3f566ea0c9ed5621d7efc758a3e0e4297a9d7\": rpc error: code = NotFound desc 
= could not find container \"400d48daf969739a5708592b7fe3f566ea0c9ed5621d7efc758a3e0e4297a9d7\": container with ID starting with 400d48daf969739a5708592b7fe3f566ea0c9ed5621d7efc758a3e0e4297a9d7 not found: ID does not exist" Oct 08 14:31:18 crc kubenswrapper[4735]: I1008 14:31:18.124079 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 08 14:31:18 crc kubenswrapper[4735]: I1008 14:31:18.131405 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 08 14:31:18 crc kubenswrapper[4735]: I1008 14:31:18.175050 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 08 14:31:18 crc kubenswrapper[4735]: E1008 14:31:18.175572 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c24bffc-5d3a-485a-8358-23a30149782b" containerName="nova-cell1-conductor-conductor" Oct 08 14:31:18 crc kubenswrapper[4735]: I1008 14:31:18.175596 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c24bffc-5d3a-485a-8358-23a30149782b" containerName="nova-cell1-conductor-conductor" Oct 08 14:31:18 crc kubenswrapper[4735]: I1008 14:31:18.175854 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c24bffc-5d3a-485a-8358-23a30149782b" containerName="nova-cell1-conductor-conductor" Oct 08 14:31:18 crc kubenswrapper[4735]: I1008 14:31:18.176623 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 08 14:31:18 crc kubenswrapper[4735]: I1008 14:31:18.179100 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 08 14:31:18 crc kubenswrapper[4735]: I1008 14:31:18.186398 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 08 14:31:18 crc kubenswrapper[4735]: I1008 14:31:18.267020 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1824d43f-2acc-4235-a6fd-5a985f2ba4cc-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"1824d43f-2acc-4235-a6fd-5a985f2ba4cc\") " pod="openstack/nova-cell1-conductor-0" Oct 08 14:31:18 crc kubenswrapper[4735]: I1008 14:31:18.267120 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksktv\" (UniqueName: \"kubernetes.io/projected/1824d43f-2acc-4235-a6fd-5a985f2ba4cc-kube-api-access-ksktv\") pod \"nova-cell1-conductor-0\" (UID: \"1824d43f-2acc-4235-a6fd-5a985f2ba4cc\") " pod="openstack/nova-cell1-conductor-0" Oct 08 14:31:18 crc kubenswrapper[4735]: I1008 14:31:18.267203 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1824d43f-2acc-4235-a6fd-5a985f2ba4cc-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"1824d43f-2acc-4235-a6fd-5a985f2ba4cc\") " pod="openstack/nova-cell1-conductor-0" Oct 08 14:31:18 crc kubenswrapper[4735]: I1008 14:31:18.368386 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1824d43f-2acc-4235-a6fd-5a985f2ba4cc-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"1824d43f-2acc-4235-a6fd-5a985f2ba4cc\") " pod="openstack/nova-cell1-conductor-0" Oct 08 14:31:18 crc 
kubenswrapper[4735]: I1008 14:31:18.368470 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksktv\" (UniqueName: \"kubernetes.io/projected/1824d43f-2acc-4235-a6fd-5a985f2ba4cc-kube-api-access-ksktv\") pod \"nova-cell1-conductor-0\" (UID: \"1824d43f-2acc-4235-a6fd-5a985f2ba4cc\") " pod="openstack/nova-cell1-conductor-0" Oct 08 14:31:18 crc kubenswrapper[4735]: I1008 14:31:18.368543 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1824d43f-2acc-4235-a6fd-5a985f2ba4cc-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"1824d43f-2acc-4235-a6fd-5a985f2ba4cc\") " pod="openstack/nova-cell1-conductor-0" Oct 08 14:31:18 crc kubenswrapper[4735]: I1008 14:31:18.374497 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1824d43f-2acc-4235-a6fd-5a985f2ba4cc-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"1824d43f-2acc-4235-a6fd-5a985f2ba4cc\") " pod="openstack/nova-cell1-conductor-0" Oct 08 14:31:18 crc kubenswrapper[4735]: I1008 14:31:18.375547 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1824d43f-2acc-4235-a6fd-5a985f2ba4cc-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"1824d43f-2acc-4235-a6fd-5a985f2ba4cc\") " pod="openstack/nova-cell1-conductor-0" Oct 08 14:31:18 crc kubenswrapper[4735]: I1008 14:31:18.385841 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksktv\" (UniqueName: \"kubernetes.io/projected/1824d43f-2acc-4235-a6fd-5a985f2ba4cc-kube-api-access-ksktv\") pod \"nova-cell1-conductor-0\" (UID: \"1824d43f-2acc-4235-a6fd-5a985f2ba4cc\") " pod="openstack/nova-cell1-conductor-0" Oct 08 14:31:18 crc kubenswrapper[4735]: I1008 14:31:18.494768 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 08 14:31:18 crc kubenswrapper[4735]: I1008 14:31:18.906693 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c24bffc-5d3a-485a-8358-23a30149782b" path="/var/lib/kubelet/pods/0c24bffc-5d3a-485a-8358-23a30149782b/volumes" Oct 08 14:31:18 crc kubenswrapper[4735]: I1008 14:31:18.961469 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 08 14:31:18 crc kubenswrapper[4735]: W1008 14:31:18.966081 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1824d43f_2acc_4235_a6fd_5a985f2ba4cc.slice/crio-2cf3838149b123776ef4b1d84aed3a4939b1c501321edc152e58a06f023e60f6 WatchSource:0}: Error finding container 2cf3838149b123776ef4b1d84aed3a4939b1c501321edc152e58a06f023e60f6: Status 404 returned error can't find the container with id 2cf3838149b123776ef4b1d84aed3a4939b1c501321edc152e58a06f023e60f6 Oct 08 14:31:19 crc kubenswrapper[4735]: I1008 14:31:19.102156 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"1824d43f-2acc-4235-a6fd-5a985f2ba4cc","Type":"ContainerStarted","Data":"2cf3838149b123776ef4b1d84aed3a4939b1c501321edc152e58a06f023e60f6"} Oct 08 14:31:19 crc kubenswrapper[4735]: I1008 14:31:19.341459 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 08 14:31:19 crc kubenswrapper[4735]: I1008 14:31:19.352279 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Oct 08 14:31:20 crc kubenswrapper[4735]: I1008 14:31:20.113739 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"1824d43f-2acc-4235-a6fd-5a985f2ba4cc","Type":"ContainerStarted","Data":"17126566d5230176b8fc2aa4a2872639ab225b37bb4f90478e814f78374d8eb6"} Oct 08 
14:31:20 crc kubenswrapper[4735]: I1008 14:31:20.113984 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 08 14:31:20 crc kubenswrapper[4735]: I1008 14:31:20.121375 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Oct 08 14:31:20 crc kubenswrapper[4735]: I1008 14:31:20.133638 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.13362132 podStartE2EDuration="2.13362132s" podCreationTimestamp="2025-10-08 14:31:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:31:20.126172989 +0000 UTC m=+5729.934477543" watchObservedRunningTime="2025-10-08 14:31:20.13362132 +0000 UTC m=+5729.941925864" Oct 08 14:31:20 crc kubenswrapper[4735]: I1008 14:31:20.341223 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 08 14:31:20 crc kubenswrapper[4735]: I1008 14:31:20.375008 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 08 14:31:21 crc kubenswrapper[4735]: I1008 14:31:21.149333 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 08 14:31:22 crc kubenswrapper[4735]: I1008 14:31:22.543282 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 08 14:31:22 crc kubenswrapper[4735]: I1008 14:31:22.543339 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 08 14:31:22 crc kubenswrapper[4735]: I1008 14:31:22.564455 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 08 14:31:22 crc kubenswrapper[4735]: I1008 14:31:22.565511 4735 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 08 14:31:23 crc kubenswrapper[4735]: I1008 14:31:23.663706 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2a498d75-0f33-45a0-845f-5e0b659dd821" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.83:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 08 14:31:23 crc kubenswrapper[4735]: I1008 14:31:23.704652 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="cb09392d-77b7-4a05-99f7-dec7aa054361" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.82:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 08 14:31:23 crc kubenswrapper[4735]: I1008 14:31:23.704660 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2a498d75-0f33-45a0-845f-5e0b659dd821" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.83:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 08 14:31:23 crc kubenswrapper[4735]: I1008 14:31:23.704660 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="cb09392d-77b7-4a05-99f7-dec7aa054361" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.82:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 08 14:31:24 crc kubenswrapper[4735]: I1008 14:31:24.466144 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 08 14:31:24 crc kubenswrapper[4735]: I1008 14:31:24.878121 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 14:31:24 crc kubenswrapper[4735]: I1008 14:31:24.878179 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 14:31:27 crc kubenswrapper[4735]: I1008 14:31:27.021665 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 08 14:31:27 crc kubenswrapper[4735]: I1008 14:31:27.024064 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 08 14:31:27 crc kubenswrapper[4735]: I1008 14:31:27.026880 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 08 14:31:27 crc kubenswrapper[4735]: I1008 14:31:27.049458 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 08 14:31:27 crc kubenswrapper[4735]: I1008 14:31:27.143147 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/21ac301d-ca51-4091-a1b2-6f86dce8b9e8-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"21ac301d-ca51-4091-a1b2-6f86dce8b9e8\") " pod="openstack/cinder-scheduler-0" Oct 08 14:31:27 crc kubenswrapper[4735]: I1008 14:31:27.143231 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-452ff\" (UniqueName: \"kubernetes.io/projected/21ac301d-ca51-4091-a1b2-6f86dce8b9e8-kube-api-access-452ff\") pod \"cinder-scheduler-0\" (UID: \"21ac301d-ca51-4091-a1b2-6f86dce8b9e8\") " pod="openstack/cinder-scheduler-0" Oct 08 14:31:27 crc kubenswrapper[4735]: I1008 
14:31:27.143280 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21ac301d-ca51-4091-a1b2-6f86dce8b9e8-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"21ac301d-ca51-4091-a1b2-6f86dce8b9e8\") " pod="openstack/cinder-scheduler-0" Oct 08 14:31:27 crc kubenswrapper[4735]: I1008 14:31:27.143310 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/21ac301d-ca51-4091-a1b2-6f86dce8b9e8-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"21ac301d-ca51-4091-a1b2-6f86dce8b9e8\") " pod="openstack/cinder-scheduler-0" Oct 08 14:31:27 crc kubenswrapper[4735]: I1008 14:31:27.143376 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21ac301d-ca51-4091-a1b2-6f86dce8b9e8-config-data\") pod \"cinder-scheduler-0\" (UID: \"21ac301d-ca51-4091-a1b2-6f86dce8b9e8\") " pod="openstack/cinder-scheduler-0" Oct 08 14:31:27 crc kubenswrapper[4735]: I1008 14:31:27.143427 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21ac301d-ca51-4091-a1b2-6f86dce8b9e8-scripts\") pod \"cinder-scheduler-0\" (UID: \"21ac301d-ca51-4091-a1b2-6f86dce8b9e8\") " pod="openstack/cinder-scheduler-0" Oct 08 14:31:27 crc kubenswrapper[4735]: I1008 14:31:27.245355 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21ac301d-ca51-4091-a1b2-6f86dce8b9e8-scripts\") pod \"cinder-scheduler-0\" (UID: \"21ac301d-ca51-4091-a1b2-6f86dce8b9e8\") " pod="openstack/cinder-scheduler-0" Oct 08 14:31:27 crc kubenswrapper[4735]: I1008 14:31:27.245430 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/21ac301d-ca51-4091-a1b2-6f86dce8b9e8-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"21ac301d-ca51-4091-a1b2-6f86dce8b9e8\") " pod="openstack/cinder-scheduler-0" Oct 08 14:31:27 crc kubenswrapper[4735]: I1008 14:31:27.245489 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-452ff\" (UniqueName: \"kubernetes.io/projected/21ac301d-ca51-4091-a1b2-6f86dce8b9e8-kube-api-access-452ff\") pod \"cinder-scheduler-0\" (UID: \"21ac301d-ca51-4091-a1b2-6f86dce8b9e8\") " pod="openstack/cinder-scheduler-0" Oct 08 14:31:27 crc kubenswrapper[4735]: I1008 14:31:27.245534 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21ac301d-ca51-4091-a1b2-6f86dce8b9e8-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"21ac301d-ca51-4091-a1b2-6f86dce8b9e8\") " pod="openstack/cinder-scheduler-0" Oct 08 14:31:27 crc kubenswrapper[4735]: I1008 14:31:27.245568 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/21ac301d-ca51-4091-a1b2-6f86dce8b9e8-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"21ac301d-ca51-4091-a1b2-6f86dce8b9e8\") " pod="openstack/cinder-scheduler-0" Oct 08 14:31:27 crc kubenswrapper[4735]: I1008 14:31:27.245586 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/21ac301d-ca51-4091-a1b2-6f86dce8b9e8-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"21ac301d-ca51-4091-a1b2-6f86dce8b9e8\") " pod="openstack/cinder-scheduler-0" Oct 08 14:31:27 crc kubenswrapper[4735]: I1008 14:31:27.245647 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21ac301d-ca51-4091-a1b2-6f86dce8b9e8-config-data\") pod \"cinder-scheduler-0\" (UID: 
\"21ac301d-ca51-4091-a1b2-6f86dce8b9e8\") " pod="openstack/cinder-scheduler-0" Oct 08 14:31:27 crc kubenswrapper[4735]: I1008 14:31:27.251331 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/21ac301d-ca51-4091-a1b2-6f86dce8b9e8-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"21ac301d-ca51-4091-a1b2-6f86dce8b9e8\") " pod="openstack/cinder-scheduler-0" Oct 08 14:31:27 crc kubenswrapper[4735]: I1008 14:31:27.251838 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21ac301d-ca51-4091-a1b2-6f86dce8b9e8-config-data\") pod \"cinder-scheduler-0\" (UID: \"21ac301d-ca51-4091-a1b2-6f86dce8b9e8\") " pod="openstack/cinder-scheduler-0" Oct 08 14:31:27 crc kubenswrapper[4735]: I1008 14:31:27.253044 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21ac301d-ca51-4091-a1b2-6f86dce8b9e8-scripts\") pod \"cinder-scheduler-0\" (UID: \"21ac301d-ca51-4091-a1b2-6f86dce8b9e8\") " pod="openstack/cinder-scheduler-0" Oct 08 14:31:27 crc kubenswrapper[4735]: I1008 14:31:27.253161 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21ac301d-ca51-4091-a1b2-6f86dce8b9e8-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"21ac301d-ca51-4091-a1b2-6f86dce8b9e8\") " pod="openstack/cinder-scheduler-0" Oct 08 14:31:27 crc kubenswrapper[4735]: I1008 14:31:27.262079 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-452ff\" (UniqueName: \"kubernetes.io/projected/21ac301d-ca51-4091-a1b2-6f86dce8b9e8-kube-api-access-452ff\") pod \"cinder-scheduler-0\" (UID: \"21ac301d-ca51-4091-a1b2-6f86dce8b9e8\") " pod="openstack/cinder-scheduler-0" Oct 08 14:31:27 crc kubenswrapper[4735]: I1008 14:31:27.347981 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 08 14:31:27 crc kubenswrapper[4735]: I1008 14:31:27.829351 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 08 14:31:27 crc kubenswrapper[4735]: W1008 14:31:27.831392 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21ac301d_ca51_4091_a1b2_6f86dce8b9e8.slice/crio-1a4b1c8299128c4cf0ffdec3112d3cf5f144de0135744b0356c33bb5b457a359 WatchSource:0}: Error finding container 1a4b1c8299128c4cf0ffdec3112d3cf5f144de0135744b0356c33bb5b457a359: Status 404 returned error can't find the container with id 1a4b1c8299128c4cf0ffdec3112d3cf5f144de0135744b0356c33bb5b457a359 Oct 08 14:31:28 crc kubenswrapper[4735]: I1008 14:31:28.208472 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"21ac301d-ca51-4091-a1b2-6f86dce8b9e8","Type":"ContainerStarted","Data":"1a4b1c8299128c4cf0ffdec3112d3cf5f144de0135744b0356c33bb5b457a359"} Oct 08 14:31:28 crc kubenswrapper[4735]: I1008 14:31:28.529985 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 08 14:31:28 crc kubenswrapper[4735]: I1008 14:31:28.554191 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 08 14:31:28 crc kubenswrapper[4735]: I1008 14:31:28.554477 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="80f53fca-b19a-4f27-a0fe-6debb5e9c063" containerName="cinder-api-log" containerID="cri-o://c0c9764f5cbc085ed79837d8691c7f2bd022148f0cf122b338e6ddc9c58ef57d" gracePeriod=30 Oct 08 14:31:28 crc kubenswrapper[4735]: I1008 14:31:28.554736 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="80f53fca-b19a-4f27-a0fe-6debb5e9c063" containerName="cinder-api" 
containerID="cri-o://211c92fea9d5cd3a8cedd9eea6c907d6271141cdc2f31de92ae00bafb6ef1915" gracePeriod=30 Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.202374 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.204625 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.210179 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.218281 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.249471 4735 generic.go:334] "Generic (PLEG): container finished" podID="80f53fca-b19a-4f27-a0fe-6debb5e9c063" containerID="c0c9764f5cbc085ed79837d8691c7f2bd022148f0cf122b338e6ddc9c58ef57d" exitCode=143 Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.249531 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"80f53fca-b19a-4f27-a0fe-6debb5e9c063","Type":"ContainerDied","Data":"c0c9764f5cbc085ed79837d8691c7f2bd022148f0cf122b338e6ddc9c58ef57d"} Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.251997 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"21ac301d-ca51-4091-a1b2-6f86dce8b9e8","Type":"ContainerStarted","Data":"b2e14f02808a10747777240f1c23db706fcce99836a6aad5b11e831ceefbbcb1"} Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.252042 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"21ac301d-ca51-4091-a1b2-6f86dce8b9e8","Type":"ContainerStarted","Data":"b86bf4e6a594d0d236584a838674a0fe4f4b2b6d8cbfe4eefa41c8e16ca178a4"} Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 
14:31:29.287295 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d87f7873-ab8a-4ac5-ab3f-f3613f8dec25-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"d87f7873-ab8a-4ac5-ab3f-f3613f8dec25\") " pod="openstack/cinder-volume-volume1-0" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.287353 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d87f7873-ab8a-4ac5-ab3f-f3613f8dec25-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"d87f7873-ab8a-4ac5-ab3f-f3613f8dec25\") " pod="openstack/cinder-volume-volume1-0" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.287376 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d87f7873-ab8a-4ac5-ab3f-f3613f8dec25-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"d87f7873-ab8a-4ac5-ab3f-f3613f8dec25\") " pod="openstack/cinder-volume-volume1-0" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.287424 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d87f7873-ab8a-4ac5-ab3f-f3613f8dec25-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"d87f7873-ab8a-4ac5-ab3f-f3613f8dec25\") " pod="openstack/cinder-volume-volume1-0" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.287463 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d87f7873-ab8a-4ac5-ab3f-f3613f8dec25-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"d87f7873-ab8a-4ac5-ab3f-f3613f8dec25\") " pod="openstack/cinder-volume-volume1-0" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.287489 4735 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d87f7873-ab8a-4ac5-ab3f-f3613f8dec25-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"d87f7873-ab8a-4ac5-ab3f-f3613f8dec25\") " pod="openstack/cinder-volume-volume1-0" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.287529 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d87f7873-ab8a-4ac5-ab3f-f3613f8dec25-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"d87f7873-ab8a-4ac5-ab3f-f3613f8dec25\") " pod="openstack/cinder-volume-volume1-0" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.287551 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rn9lw\" (UniqueName: \"kubernetes.io/projected/d87f7873-ab8a-4ac5-ab3f-f3613f8dec25-kube-api-access-rn9lw\") pod \"cinder-volume-volume1-0\" (UID: \"d87f7873-ab8a-4ac5-ab3f-f3613f8dec25\") " pod="openstack/cinder-volume-volume1-0" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.287587 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d87f7873-ab8a-4ac5-ab3f-f3613f8dec25-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"d87f7873-ab8a-4ac5-ab3f-f3613f8dec25\") " pod="openstack/cinder-volume-volume1-0" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.287617 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d87f7873-ab8a-4ac5-ab3f-f3613f8dec25-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"d87f7873-ab8a-4ac5-ab3f-f3613f8dec25\") " pod="openstack/cinder-volume-volume1-0" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.287668 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d87f7873-ab8a-4ac5-ab3f-f3613f8dec25-run\") pod \"cinder-volume-volume1-0\" (UID: \"d87f7873-ab8a-4ac5-ab3f-f3613f8dec25\") " pod="openstack/cinder-volume-volume1-0" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.287704 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/d87f7873-ab8a-4ac5-ab3f-f3613f8dec25-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"d87f7873-ab8a-4ac5-ab3f-f3613f8dec25\") " pod="openstack/cinder-volume-volume1-0" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.287732 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d87f7873-ab8a-4ac5-ab3f-f3613f8dec25-dev\") pod \"cinder-volume-volume1-0\" (UID: \"d87f7873-ab8a-4ac5-ab3f-f3613f8dec25\") " pod="openstack/cinder-volume-volume1-0" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.287754 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d87f7873-ab8a-4ac5-ab3f-f3613f8dec25-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"d87f7873-ab8a-4ac5-ab3f-f3613f8dec25\") " pod="openstack/cinder-volume-volume1-0" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.287776 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d87f7873-ab8a-4ac5-ab3f-f3613f8dec25-sys\") pod \"cinder-volume-volume1-0\" (UID: \"d87f7873-ab8a-4ac5-ab3f-f3613f8dec25\") " pod="openstack/cinder-volume-volume1-0" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.287884 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/d87f7873-ab8a-4ac5-ab3f-f3613f8dec25-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"d87f7873-ab8a-4ac5-ab3f-f3613f8dec25\") " pod="openstack/cinder-volume-volume1-0" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.288788 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.288775819 podStartE2EDuration="3.288775819s" podCreationTimestamp="2025-10-08 14:31:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:31:29.281087963 +0000 UTC m=+5739.089392507" watchObservedRunningTime="2025-10-08 14:31:29.288775819 +0000 UTC m=+5739.097080363" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.389654 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d87f7873-ab8a-4ac5-ab3f-f3613f8dec25-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"d87f7873-ab8a-4ac5-ab3f-f3613f8dec25\") " pod="openstack/cinder-volume-volume1-0" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.389736 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d87f7873-ab8a-4ac5-ab3f-f3613f8dec25-run\") pod \"cinder-volume-volume1-0\" (UID: \"d87f7873-ab8a-4ac5-ab3f-f3613f8dec25\") " pod="openstack/cinder-volume-volume1-0" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.389776 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/d87f7873-ab8a-4ac5-ab3f-f3613f8dec25-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"d87f7873-ab8a-4ac5-ab3f-f3613f8dec25\") " pod="openstack/cinder-volume-volume1-0" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.389795 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d87f7873-ab8a-4ac5-ab3f-f3613f8dec25-dev\") pod \"cinder-volume-volume1-0\" (UID: \"d87f7873-ab8a-4ac5-ab3f-f3613f8dec25\") " pod="openstack/cinder-volume-volume1-0" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.389824 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d87f7873-ab8a-4ac5-ab3f-f3613f8dec25-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"d87f7873-ab8a-4ac5-ab3f-f3613f8dec25\") " pod="openstack/cinder-volume-volume1-0" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.389842 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d87f7873-ab8a-4ac5-ab3f-f3613f8dec25-sys\") pod \"cinder-volume-volume1-0\" (UID: \"d87f7873-ab8a-4ac5-ab3f-f3613f8dec25\") " pod="openstack/cinder-volume-volume1-0" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.389871 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/d87f7873-ab8a-4ac5-ab3f-f3613f8dec25-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"d87f7873-ab8a-4ac5-ab3f-f3613f8dec25\") " pod="openstack/cinder-volume-volume1-0" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.389937 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d87f7873-ab8a-4ac5-ab3f-f3613f8dec25-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"d87f7873-ab8a-4ac5-ab3f-f3613f8dec25\") " pod="openstack/cinder-volume-volume1-0" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.389953 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d87f7873-ab8a-4ac5-ab3f-f3613f8dec25-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"d87f7873-ab8a-4ac5-ab3f-f3613f8dec25\") " pod="openstack/cinder-volume-volume1-0" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.389967 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d87f7873-ab8a-4ac5-ab3f-f3613f8dec25-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"d87f7873-ab8a-4ac5-ab3f-f3613f8dec25\") " pod="openstack/cinder-volume-volume1-0" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.389997 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d87f7873-ab8a-4ac5-ab3f-f3613f8dec25-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"d87f7873-ab8a-4ac5-ab3f-f3613f8dec25\") " pod="openstack/cinder-volume-volume1-0" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.390019 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d87f7873-ab8a-4ac5-ab3f-f3613f8dec25-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"d87f7873-ab8a-4ac5-ab3f-f3613f8dec25\") " pod="openstack/cinder-volume-volume1-0" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.390044 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d87f7873-ab8a-4ac5-ab3f-f3613f8dec25-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"d87f7873-ab8a-4ac5-ab3f-f3613f8dec25\") " pod="openstack/cinder-volume-volume1-0" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.390063 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d87f7873-ab8a-4ac5-ab3f-f3613f8dec25-ceph\") pod \"cinder-volume-volume1-0\" (UID: 
\"d87f7873-ab8a-4ac5-ab3f-f3613f8dec25\") " pod="openstack/cinder-volume-volume1-0" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.390078 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rn9lw\" (UniqueName: \"kubernetes.io/projected/d87f7873-ab8a-4ac5-ab3f-f3613f8dec25-kube-api-access-rn9lw\") pod \"cinder-volume-volume1-0\" (UID: \"d87f7873-ab8a-4ac5-ab3f-f3613f8dec25\") " pod="openstack/cinder-volume-volume1-0" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.390100 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d87f7873-ab8a-4ac5-ab3f-f3613f8dec25-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"d87f7873-ab8a-4ac5-ab3f-f3613f8dec25\") " pod="openstack/cinder-volume-volume1-0" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.390171 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d87f7873-ab8a-4ac5-ab3f-f3613f8dec25-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"d87f7873-ab8a-4ac5-ab3f-f3613f8dec25\") " pod="openstack/cinder-volume-volume1-0" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.390205 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d87f7873-ab8a-4ac5-ab3f-f3613f8dec25-run\") pod \"cinder-volume-volume1-0\" (UID: \"d87f7873-ab8a-4ac5-ab3f-f3613f8dec25\") " pod="openstack/cinder-volume-volume1-0" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.390735 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d87f7873-ab8a-4ac5-ab3f-f3613f8dec25-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"d87f7873-ab8a-4ac5-ab3f-f3613f8dec25\") " pod="openstack/cinder-volume-volume1-0" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.390786 4735 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d87f7873-ab8a-4ac5-ab3f-f3613f8dec25-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"d87f7873-ab8a-4ac5-ab3f-f3613f8dec25\") " pod="openstack/cinder-volume-volume1-0" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.390813 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d87f7873-ab8a-4ac5-ab3f-f3613f8dec25-dev\") pod \"cinder-volume-volume1-0\" (UID: \"d87f7873-ab8a-4ac5-ab3f-f3613f8dec25\") " pod="openstack/cinder-volume-volume1-0" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.390840 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/d87f7873-ab8a-4ac5-ab3f-f3613f8dec25-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"d87f7873-ab8a-4ac5-ab3f-f3613f8dec25\") " pod="openstack/cinder-volume-volume1-0" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.391740 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d87f7873-ab8a-4ac5-ab3f-f3613f8dec25-sys\") pod \"cinder-volume-volume1-0\" (UID: \"d87f7873-ab8a-4ac5-ab3f-f3613f8dec25\") " pod="openstack/cinder-volume-volume1-0" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.392038 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d87f7873-ab8a-4ac5-ab3f-f3613f8dec25-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"d87f7873-ab8a-4ac5-ab3f-f3613f8dec25\") " pod="openstack/cinder-volume-volume1-0" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.392936 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d87f7873-ab8a-4ac5-ab3f-f3613f8dec25-var-locks-brick\") pod 
\"cinder-volume-volume1-0\" (UID: \"d87f7873-ab8a-4ac5-ab3f-f3613f8dec25\") " pod="openstack/cinder-volume-volume1-0" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.392981 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/d87f7873-ab8a-4ac5-ab3f-f3613f8dec25-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"d87f7873-ab8a-4ac5-ab3f-f3613f8dec25\") " pod="openstack/cinder-volume-volume1-0" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.395449 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d87f7873-ab8a-4ac5-ab3f-f3613f8dec25-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"d87f7873-ab8a-4ac5-ab3f-f3613f8dec25\") " pod="openstack/cinder-volume-volume1-0" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.396409 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d87f7873-ab8a-4ac5-ab3f-f3613f8dec25-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"d87f7873-ab8a-4ac5-ab3f-f3613f8dec25\") " pod="openstack/cinder-volume-volume1-0" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.397291 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d87f7873-ab8a-4ac5-ab3f-f3613f8dec25-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"d87f7873-ab8a-4ac5-ab3f-f3613f8dec25\") " pod="openstack/cinder-volume-volume1-0" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.398095 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d87f7873-ab8a-4ac5-ab3f-f3613f8dec25-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"d87f7873-ab8a-4ac5-ab3f-f3613f8dec25\") " pod="openstack/cinder-volume-volume1-0" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.399359 
4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d87f7873-ab8a-4ac5-ab3f-f3613f8dec25-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"d87f7873-ab8a-4ac5-ab3f-f3613f8dec25\") " pod="openstack/cinder-volume-volume1-0" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.415193 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rn9lw\" (UniqueName: \"kubernetes.io/projected/d87f7873-ab8a-4ac5-ab3f-f3613f8dec25-kube-api-access-rn9lw\") pod \"cinder-volume-volume1-0\" (UID: \"d87f7873-ab8a-4ac5-ab3f-f3613f8dec25\") " pod="openstack/cinder-volume-volume1-0" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.534166 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.764708 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.771911 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.778665 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.817293 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.902340 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/8afe5a8e-fa52-4d98-a549-284ff784f779-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"8afe5a8e-fa52-4d98-a549-284ff784f779\") " pod="openstack/cinder-backup-0" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.903549 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8afe5a8e-fa52-4d98-a549-284ff784f779-lib-modules\") pod \"cinder-backup-0\" (UID: \"8afe5a8e-fa52-4d98-a549-284ff784f779\") " pod="openstack/cinder-backup-0" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.903656 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8afe5a8e-fa52-4d98-a549-284ff784f779-scripts\") pod \"cinder-backup-0\" (UID: \"8afe5a8e-fa52-4d98-a549-284ff784f779\") " pod="openstack/cinder-backup-0" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.903701 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8afe5a8e-fa52-4d98-a549-284ff784f779-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"8afe5a8e-fa52-4d98-a549-284ff784f779\") " pod="openstack/cinder-backup-0" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.903722 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8afe5a8e-fa52-4d98-a549-284ff784f779-ceph\") pod \"cinder-backup-0\" (UID: \"8afe5a8e-fa52-4d98-a549-284ff784f779\") " pod="openstack/cinder-backup-0" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.903791 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vl9t8\" (UniqueName: \"kubernetes.io/projected/8afe5a8e-fa52-4d98-a549-284ff784f779-kube-api-access-vl9t8\") pod \"cinder-backup-0\" (UID: \"8afe5a8e-fa52-4d98-a549-284ff784f779\") " pod="openstack/cinder-backup-0" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.903870 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8afe5a8e-fa52-4d98-a549-284ff784f779-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"8afe5a8e-fa52-4d98-a549-284ff784f779\") " pod="openstack/cinder-backup-0" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.904020 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8afe5a8e-fa52-4d98-a549-284ff784f779-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"8afe5a8e-fa52-4d98-a549-284ff784f779\") " pod="openstack/cinder-backup-0" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.904090 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/8afe5a8e-fa52-4d98-a549-284ff784f779-dev\") pod \"cinder-backup-0\" (UID: \"8afe5a8e-fa52-4d98-a549-284ff784f779\") " pod="openstack/cinder-backup-0" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.904221 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/8afe5a8e-fa52-4d98-a549-284ff784f779-config-data-custom\") pod \"cinder-backup-0\" (UID: \"8afe5a8e-fa52-4d98-a549-284ff784f779\") " pod="openstack/cinder-backup-0" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.904248 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8afe5a8e-fa52-4d98-a549-284ff784f779-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"8afe5a8e-fa52-4d98-a549-284ff784f779\") " pod="openstack/cinder-backup-0" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.904277 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8afe5a8e-fa52-4d98-a549-284ff784f779-sys\") pod \"cinder-backup-0\" (UID: \"8afe5a8e-fa52-4d98-a549-284ff784f779\") " pod="openstack/cinder-backup-0" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.904322 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8afe5a8e-fa52-4d98-a549-284ff784f779-config-data\") pod \"cinder-backup-0\" (UID: \"8afe5a8e-fa52-4d98-a549-284ff784f779\") " pod="openstack/cinder-backup-0" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.904379 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/8afe5a8e-fa52-4d98-a549-284ff784f779-etc-nvme\") pod \"cinder-backup-0\" (UID: \"8afe5a8e-fa52-4d98-a549-284ff784f779\") " pod="openstack/cinder-backup-0" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.905520 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8afe5a8e-fa52-4d98-a549-284ff784f779-run\") pod \"cinder-backup-0\" (UID: 
\"8afe5a8e-fa52-4d98-a549-284ff784f779\") " pod="openstack/cinder-backup-0" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.905667 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/8afe5a8e-fa52-4d98-a549-284ff784f779-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"8afe5a8e-fa52-4d98-a549-284ff784f779\") " pod="openstack/cinder-backup-0" Oct 08 14:31:29 crc kubenswrapper[4735]: I1008 14:31:29.988211 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Oct 08 14:31:30 crc kubenswrapper[4735]: I1008 14:31:30.007240 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8afe5a8e-fa52-4d98-a549-284ff784f779-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"8afe5a8e-fa52-4d98-a549-284ff784f779\") " pod="openstack/cinder-backup-0" Oct 08 14:31:30 crc kubenswrapper[4735]: I1008 14:31:30.007313 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/8afe5a8e-fa52-4d98-a549-284ff784f779-dev\") pod \"cinder-backup-0\" (UID: \"8afe5a8e-fa52-4d98-a549-284ff784f779\") " pod="openstack/cinder-backup-0" Oct 08 14:31:30 crc kubenswrapper[4735]: I1008 14:31:30.007401 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8afe5a8e-fa52-4d98-a549-284ff784f779-config-data-custom\") pod \"cinder-backup-0\" (UID: \"8afe5a8e-fa52-4d98-a549-284ff784f779\") " pod="openstack/cinder-backup-0" Oct 08 14:31:30 crc kubenswrapper[4735]: I1008 14:31:30.007441 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8afe5a8e-fa52-4d98-a549-284ff784f779-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: 
\"8afe5a8e-fa52-4d98-a549-284ff784f779\") " pod="openstack/cinder-backup-0" Oct 08 14:31:30 crc kubenswrapper[4735]: I1008 14:31:30.007513 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8afe5a8e-fa52-4d98-a549-284ff784f779-sys\") pod \"cinder-backup-0\" (UID: \"8afe5a8e-fa52-4d98-a549-284ff784f779\") " pod="openstack/cinder-backup-0" Oct 08 14:31:30 crc kubenswrapper[4735]: I1008 14:31:30.007563 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8afe5a8e-fa52-4d98-a549-284ff784f779-config-data\") pod \"cinder-backup-0\" (UID: \"8afe5a8e-fa52-4d98-a549-284ff784f779\") " pod="openstack/cinder-backup-0" Oct 08 14:31:30 crc kubenswrapper[4735]: I1008 14:31:30.007591 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/8afe5a8e-fa52-4d98-a549-284ff784f779-etc-nvme\") pod \"cinder-backup-0\" (UID: \"8afe5a8e-fa52-4d98-a549-284ff784f779\") " pod="openstack/cinder-backup-0" Oct 08 14:31:30 crc kubenswrapper[4735]: I1008 14:31:30.007614 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8afe5a8e-fa52-4d98-a549-284ff784f779-run\") pod \"cinder-backup-0\" (UID: \"8afe5a8e-fa52-4d98-a549-284ff784f779\") " pod="openstack/cinder-backup-0" Oct 08 14:31:30 crc kubenswrapper[4735]: I1008 14:31:30.007746 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/8afe5a8e-fa52-4d98-a549-284ff784f779-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"8afe5a8e-fa52-4d98-a549-284ff784f779\") " pod="openstack/cinder-backup-0" Oct 08 14:31:30 crc kubenswrapper[4735]: I1008 14:31:30.007783 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" 
(UniqueName: \"kubernetes.io/host-path/8afe5a8e-fa52-4d98-a549-284ff784f779-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"8afe5a8e-fa52-4d98-a549-284ff784f779\") " pod="openstack/cinder-backup-0" Oct 08 14:31:30 crc kubenswrapper[4735]: I1008 14:31:30.007833 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8afe5a8e-fa52-4d98-a549-284ff784f779-lib-modules\") pod \"cinder-backup-0\" (UID: \"8afe5a8e-fa52-4d98-a549-284ff784f779\") " pod="openstack/cinder-backup-0" Oct 08 14:31:30 crc kubenswrapper[4735]: I1008 14:31:30.007876 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8afe5a8e-fa52-4d98-a549-284ff784f779-scripts\") pod \"cinder-backup-0\" (UID: \"8afe5a8e-fa52-4d98-a549-284ff784f779\") " pod="openstack/cinder-backup-0" Oct 08 14:31:30 crc kubenswrapper[4735]: I1008 14:31:30.007930 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8afe5a8e-fa52-4d98-a549-284ff784f779-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"8afe5a8e-fa52-4d98-a549-284ff784f779\") " pod="openstack/cinder-backup-0" Oct 08 14:31:30 crc kubenswrapper[4735]: I1008 14:31:30.007956 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8afe5a8e-fa52-4d98-a549-284ff784f779-ceph\") pod \"cinder-backup-0\" (UID: \"8afe5a8e-fa52-4d98-a549-284ff784f779\") " pod="openstack/cinder-backup-0" Oct 08 14:31:30 crc kubenswrapper[4735]: I1008 14:31:30.008006 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vl9t8\" (UniqueName: \"kubernetes.io/projected/8afe5a8e-fa52-4d98-a549-284ff784f779-kube-api-access-vl9t8\") pod \"cinder-backup-0\" (UID: \"8afe5a8e-fa52-4d98-a549-284ff784f779\") " pod="openstack/cinder-backup-0" Oct 08 
14:31:30 crc kubenswrapper[4735]: I1008 14:31:30.008044 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8afe5a8e-fa52-4d98-a549-284ff784f779-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"8afe5a8e-fa52-4d98-a549-284ff784f779\") " pod="openstack/cinder-backup-0" Oct 08 14:31:30 crc kubenswrapper[4735]: I1008 14:31:30.008837 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8afe5a8e-fa52-4d98-a549-284ff784f779-run\") pod \"cinder-backup-0\" (UID: \"8afe5a8e-fa52-4d98-a549-284ff784f779\") " pod="openstack/cinder-backup-0" Oct 08 14:31:30 crc kubenswrapper[4735]: I1008 14:31:30.009369 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8afe5a8e-fa52-4d98-a549-284ff784f779-sys\") pod \"cinder-backup-0\" (UID: \"8afe5a8e-fa52-4d98-a549-284ff784f779\") " pod="openstack/cinder-backup-0" Oct 08 14:31:30 crc kubenswrapper[4735]: I1008 14:31:30.009393 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/8afe5a8e-fa52-4d98-a549-284ff784f779-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"8afe5a8e-fa52-4d98-a549-284ff784f779\") " pod="openstack/cinder-backup-0" Oct 08 14:31:30 crc kubenswrapper[4735]: I1008 14:31:30.009433 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/8afe5a8e-fa52-4d98-a549-284ff784f779-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"8afe5a8e-fa52-4d98-a549-284ff784f779\") " pod="openstack/cinder-backup-0" Oct 08 14:31:30 crc kubenswrapper[4735]: I1008 14:31:30.009479 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8afe5a8e-fa52-4d98-a549-284ff784f779-lib-modules\") pod \"cinder-backup-0\" (UID: 
\"8afe5a8e-fa52-4d98-a549-284ff784f779\") " pod="openstack/cinder-backup-0" Oct 08 14:31:30 crc kubenswrapper[4735]: I1008 14:31:30.009514 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8afe5a8e-fa52-4d98-a549-284ff784f779-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"8afe5a8e-fa52-4d98-a549-284ff784f779\") " pod="openstack/cinder-backup-0" Oct 08 14:31:30 crc kubenswrapper[4735]: I1008 14:31:30.009591 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/8afe5a8e-fa52-4d98-a549-284ff784f779-dev\") pod \"cinder-backup-0\" (UID: \"8afe5a8e-fa52-4d98-a549-284ff784f779\") " pod="openstack/cinder-backup-0" Oct 08 14:31:30 crc kubenswrapper[4735]: I1008 14:31:30.010308 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/8afe5a8e-fa52-4d98-a549-284ff784f779-etc-nvme\") pod \"cinder-backup-0\" (UID: \"8afe5a8e-fa52-4d98-a549-284ff784f779\") " pod="openstack/cinder-backup-0" Oct 08 14:31:30 crc kubenswrapper[4735]: I1008 14:31:30.010770 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8afe5a8e-fa52-4d98-a549-284ff784f779-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"8afe5a8e-fa52-4d98-a549-284ff784f779\") " pod="openstack/cinder-backup-0" Oct 08 14:31:30 crc kubenswrapper[4735]: I1008 14:31:30.010824 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8afe5a8e-fa52-4d98-a549-284ff784f779-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"8afe5a8e-fa52-4d98-a549-284ff784f779\") " pod="openstack/cinder-backup-0" Oct 08 14:31:30 crc kubenswrapper[4735]: I1008 14:31:30.015428 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8afe5a8e-fa52-4d98-a549-284ff784f779-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"8afe5a8e-fa52-4d98-a549-284ff784f779\") " pod="openstack/cinder-backup-0" Oct 08 14:31:30 crc kubenswrapper[4735]: I1008 14:31:30.015380 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8afe5a8e-fa52-4d98-a549-284ff784f779-ceph\") pod \"cinder-backup-0\" (UID: \"8afe5a8e-fa52-4d98-a549-284ff784f779\") " pod="openstack/cinder-backup-0" Oct 08 14:31:30 crc kubenswrapper[4735]: I1008 14:31:30.017431 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8afe5a8e-fa52-4d98-a549-284ff784f779-config-data\") pod \"cinder-backup-0\" (UID: \"8afe5a8e-fa52-4d98-a549-284ff784f779\") " pod="openstack/cinder-backup-0" Oct 08 14:31:30 crc kubenswrapper[4735]: I1008 14:31:30.018627 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8afe5a8e-fa52-4d98-a549-284ff784f779-scripts\") pod \"cinder-backup-0\" (UID: \"8afe5a8e-fa52-4d98-a549-284ff784f779\") " pod="openstack/cinder-backup-0" Oct 08 14:31:30 crc kubenswrapper[4735]: I1008 14:31:30.019359 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8afe5a8e-fa52-4d98-a549-284ff784f779-config-data-custom\") pod \"cinder-backup-0\" (UID: \"8afe5a8e-fa52-4d98-a549-284ff784f779\") " pod="openstack/cinder-backup-0" Oct 08 14:31:30 crc kubenswrapper[4735]: I1008 14:31:30.029322 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vl9t8\" (UniqueName: \"kubernetes.io/projected/8afe5a8e-fa52-4d98-a549-284ff784f779-kube-api-access-vl9t8\") pod \"cinder-backup-0\" (UID: \"8afe5a8e-fa52-4d98-a549-284ff784f779\") " pod="openstack/cinder-backup-0" Oct 08 14:31:30 crc kubenswrapper[4735]: I1008 14:31:30.117990 
4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Oct 08 14:31:30 crc kubenswrapper[4735]: I1008 14:31:30.264921 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"d87f7873-ab8a-4ac5-ab3f-f3613f8dec25","Type":"ContainerStarted","Data":"3c4d5d91f164d2fe9540fa31bca510e5be6c401d94881afb52d3e0e0f0abe17f"} Oct 08 14:31:30 crc kubenswrapper[4735]: I1008 14:31:30.655492 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Oct 08 14:31:30 crc kubenswrapper[4735]: W1008 14:31:30.664159 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8afe5a8e_fa52_4d98_a549_284ff784f779.slice/crio-254ad3e440282f08ad8c864fcccf78ac7ad004a5645c6b2d8157d590d80da5e1 WatchSource:0}: Error finding container 254ad3e440282f08ad8c864fcccf78ac7ad004a5645c6b2d8157d590d80da5e1: Status 404 returned error can't find the container with id 254ad3e440282f08ad8c864fcccf78ac7ad004a5645c6b2d8157d590d80da5e1 Oct 08 14:31:31 crc kubenswrapper[4735]: I1008 14:31:31.274449 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"d87f7873-ab8a-4ac5-ab3f-f3613f8dec25","Type":"ContainerStarted","Data":"55264cfeea149075e7399138a50aac485e1e0d24e82f8e959a8e0c413aa3cf23"} Oct 08 14:31:31 crc kubenswrapper[4735]: I1008 14:31:31.278536 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"8afe5a8e-fa52-4d98-a549-284ff784f779","Type":"ContainerStarted","Data":"254ad3e440282f08ad8c864fcccf78ac7ad004a5645c6b2d8157d590d80da5e1"} Oct 08 14:31:31 crc kubenswrapper[4735]: I1008 14:31:31.729986 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="80f53fca-b19a-4f27-a0fe-6debb5e9c063" containerName="cinder-api" probeResult="failure" output="Get 
\"http://10.217.1.79:8776/healthcheck\": read tcp 10.217.0.2:34328->10.217.1.79:8776: read: connection reset by peer" Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.075689 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.158676 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80f53fca-b19a-4f27-a0fe-6debb5e9c063-combined-ca-bundle\") pod \"80f53fca-b19a-4f27-a0fe-6debb5e9c063\" (UID: \"80f53fca-b19a-4f27-a0fe-6debb5e9c063\") " Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.158735 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80f53fca-b19a-4f27-a0fe-6debb5e9c063-logs\") pod \"80f53fca-b19a-4f27-a0fe-6debb5e9c063\" (UID: \"80f53fca-b19a-4f27-a0fe-6debb5e9c063\") " Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.158782 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80f53fca-b19a-4f27-a0fe-6debb5e9c063-scripts\") pod \"80f53fca-b19a-4f27-a0fe-6debb5e9c063\" (UID: \"80f53fca-b19a-4f27-a0fe-6debb5e9c063\") " Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.158827 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/80f53fca-b19a-4f27-a0fe-6debb5e9c063-config-data-custom\") pod \"80f53fca-b19a-4f27-a0fe-6debb5e9c063\" (UID: \"80f53fca-b19a-4f27-a0fe-6debb5e9c063\") " Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.158877 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/80f53fca-b19a-4f27-a0fe-6debb5e9c063-etc-machine-id\") pod \"80f53fca-b19a-4f27-a0fe-6debb5e9c063\" (UID: 
\"80f53fca-b19a-4f27-a0fe-6debb5e9c063\") " Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.159162 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvmsl\" (UniqueName: \"kubernetes.io/projected/80f53fca-b19a-4f27-a0fe-6debb5e9c063-kube-api-access-nvmsl\") pod \"80f53fca-b19a-4f27-a0fe-6debb5e9c063\" (UID: \"80f53fca-b19a-4f27-a0fe-6debb5e9c063\") " Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.159229 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80f53fca-b19a-4f27-a0fe-6debb5e9c063-config-data\") pod \"80f53fca-b19a-4f27-a0fe-6debb5e9c063\" (UID: \"80f53fca-b19a-4f27-a0fe-6debb5e9c063\") " Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.163437 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/80f53fca-b19a-4f27-a0fe-6debb5e9c063-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "80f53fca-b19a-4f27-a0fe-6debb5e9c063" (UID: "80f53fca-b19a-4f27-a0fe-6debb5e9c063"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.164286 4735 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/80f53fca-b19a-4f27-a0fe-6debb5e9c063-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.165472 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80f53fca-b19a-4f27-a0fe-6debb5e9c063-logs" (OuterVolumeSpecName: "logs") pod "80f53fca-b19a-4f27-a0fe-6debb5e9c063" (UID: "80f53fca-b19a-4f27-a0fe-6debb5e9c063"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.169407 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80f53fca-b19a-4f27-a0fe-6debb5e9c063-scripts" (OuterVolumeSpecName: "scripts") pod "80f53fca-b19a-4f27-a0fe-6debb5e9c063" (UID: "80f53fca-b19a-4f27-a0fe-6debb5e9c063"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.169663 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80f53fca-b19a-4f27-a0fe-6debb5e9c063-kube-api-access-nvmsl" (OuterVolumeSpecName: "kube-api-access-nvmsl") pod "80f53fca-b19a-4f27-a0fe-6debb5e9c063" (UID: "80f53fca-b19a-4f27-a0fe-6debb5e9c063"). InnerVolumeSpecName "kube-api-access-nvmsl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.178606 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80f53fca-b19a-4f27-a0fe-6debb5e9c063-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "80f53fca-b19a-4f27-a0fe-6debb5e9c063" (UID: "80f53fca-b19a-4f27-a0fe-6debb5e9c063"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.244061 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80f53fca-b19a-4f27-a0fe-6debb5e9c063-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "80f53fca-b19a-4f27-a0fe-6debb5e9c063" (UID: "80f53fca-b19a-4f27-a0fe-6debb5e9c063"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.259145 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80f53fca-b19a-4f27-a0fe-6debb5e9c063-config-data" (OuterVolumeSpecName: "config-data") pod "80f53fca-b19a-4f27-a0fe-6debb5e9c063" (UID: "80f53fca-b19a-4f27-a0fe-6debb5e9c063"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.266184 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvmsl\" (UniqueName: \"kubernetes.io/projected/80f53fca-b19a-4f27-a0fe-6debb5e9c063-kube-api-access-nvmsl\") on node \"crc\" DevicePath \"\"" Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.266220 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80f53fca-b19a-4f27-a0fe-6debb5e9c063-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.266233 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80f53fca-b19a-4f27-a0fe-6debb5e9c063-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.266245 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80f53fca-b19a-4f27-a0fe-6debb5e9c063-logs\") on node \"crc\" DevicePath \"\"" Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.266256 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80f53fca-b19a-4f27-a0fe-6debb5e9c063-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.266266 4735 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/80f53fca-b19a-4f27-a0fe-6debb5e9c063-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.295886 4735 generic.go:334] "Generic (PLEG): container finished" podID="80f53fca-b19a-4f27-a0fe-6debb5e9c063" containerID="211c92fea9d5cd3a8cedd9eea6c907d6271141cdc2f31de92ae00bafb6ef1915" exitCode=0 Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.296047 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.296801 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"80f53fca-b19a-4f27-a0fe-6debb5e9c063","Type":"ContainerDied","Data":"211c92fea9d5cd3a8cedd9eea6c907d6271141cdc2f31de92ae00bafb6ef1915"} Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.296840 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"80f53fca-b19a-4f27-a0fe-6debb5e9c063","Type":"ContainerDied","Data":"9906e6b67a275db40e2974e728c152232a09dffea9c3c04c26b1807b072d374e"} Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.296860 4735 scope.go:117] "RemoveContainer" containerID="211c92fea9d5cd3a8cedd9eea6c907d6271141cdc2f31de92ae00bafb6ef1915" Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.299526 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"d87f7873-ab8a-4ac5-ab3f-f3613f8dec25","Type":"ContainerStarted","Data":"5f018b97f17e508df6cb6d28fb4571c35dc6abb28d96ea60eccfb44032ea4fe4"} Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.304939 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"8afe5a8e-fa52-4d98-a549-284ff784f779","Type":"ContainerStarted","Data":"b893f786a4bd592e108f652911c5d1220232a917454634502cedff702d541dc4"} Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.304979 4735 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"8afe5a8e-fa52-4d98-a549-284ff784f779","Type":"ContainerStarted","Data":"8f23702cb568b25d692fe297decc4fa02aa90167394a45e391fbafe6d4cc6060"} Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.317442 4735 scope.go:117] "RemoveContainer" containerID="c0c9764f5cbc085ed79837d8691c7f2bd022148f0cf122b338e6ddc9c58ef57d" Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.336058 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=2.462902202 podStartE2EDuration="3.336038982s" podCreationTimestamp="2025-10-08 14:31:29 +0000 UTC" firstStartedPulling="2025-10-08 14:31:29.999967331 +0000 UTC m=+5739.808271875" lastFinishedPulling="2025-10-08 14:31:30.873104111 +0000 UTC m=+5740.681408655" observedRunningTime="2025-10-08 14:31:32.324794294 +0000 UTC m=+5742.133098838" watchObservedRunningTime="2025-10-08 14:31:32.336038982 +0000 UTC m=+5742.144343526" Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.350752 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.356628 4735 scope.go:117] "RemoveContainer" containerID="211c92fea9d5cd3a8cedd9eea6c907d6271141cdc2f31de92ae00bafb6ef1915" Oct 08 14:31:32 crc kubenswrapper[4735]: E1008 14:31:32.357159 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"211c92fea9d5cd3a8cedd9eea6c907d6271141cdc2f31de92ae00bafb6ef1915\": container with ID starting with 211c92fea9d5cd3a8cedd9eea6c907d6271141cdc2f31de92ae00bafb6ef1915 not found: ID does not exist" containerID="211c92fea9d5cd3a8cedd9eea6c907d6271141cdc2f31de92ae00bafb6ef1915" Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.357210 4735 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"211c92fea9d5cd3a8cedd9eea6c907d6271141cdc2f31de92ae00bafb6ef1915"} err="failed to get container status \"211c92fea9d5cd3a8cedd9eea6c907d6271141cdc2f31de92ae00bafb6ef1915\": rpc error: code = NotFound desc = could not find container \"211c92fea9d5cd3a8cedd9eea6c907d6271141cdc2f31de92ae00bafb6ef1915\": container with ID starting with 211c92fea9d5cd3a8cedd9eea6c907d6271141cdc2f31de92ae00bafb6ef1915 not found: ID does not exist" Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.357238 4735 scope.go:117] "RemoveContainer" containerID="c0c9764f5cbc085ed79837d8691c7f2bd022148f0cf122b338e6ddc9c58ef57d" Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.357869 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=2.4209625900000002 podStartE2EDuration="3.35783484s" podCreationTimestamp="2025-10-08 14:31:29 +0000 UTC" firstStartedPulling="2025-10-08 14:31:30.666725416 +0000 UTC m=+5740.475029960" lastFinishedPulling="2025-10-08 14:31:31.603597656 +0000 UTC m=+5741.411902210" observedRunningTime="2025-10-08 14:31:32.352007281 +0000 UTC m=+5742.160311845" watchObservedRunningTime="2025-10-08 14:31:32.35783484 +0000 UTC m=+5742.166139384" Oct 08 14:31:32 crc kubenswrapper[4735]: E1008 14:31:32.358174 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0c9764f5cbc085ed79837d8691c7f2bd022148f0cf122b338e6ddc9c58ef57d\": container with ID starting with c0c9764f5cbc085ed79837d8691c7f2bd022148f0cf122b338e6ddc9c58ef57d not found: ID does not exist" containerID="c0c9764f5cbc085ed79837d8691c7f2bd022148f0cf122b338e6ddc9c58ef57d" Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.358208 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0c9764f5cbc085ed79837d8691c7f2bd022148f0cf122b338e6ddc9c58ef57d"} err="failed to get container status 
\"c0c9764f5cbc085ed79837d8691c7f2bd022148f0cf122b338e6ddc9c58ef57d\": rpc error: code = NotFound desc = could not find container \"c0c9764f5cbc085ed79837d8691c7f2bd022148f0cf122b338e6ddc9c58ef57d\": container with ID starting with c0c9764f5cbc085ed79837d8691c7f2bd022148f0cf122b338e6ddc9c58ef57d not found: ID does not exist" Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.379128 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.395537 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.403190 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 08 14:31:32 crc kubenswrapper[4735]: E1008 14:31:32.403722 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80f53fca-b19a-4f27-a0fe-6debb5e9c063" containerName="cinder-api" Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.403738 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="80f53fca-b19a-4f27-a0fe-6debb5e9c063" containerName="cinder-api" Oct 08 14:31:32 crc kubenswrapper[4735]: E1008 14:31:32.403761 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80f53fca-b19a-4f27-a0fe-6debb5e9c063" containerName="cinder-api-log" Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.403769 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="80f53fca-b19a-4f27-a0fe-6debb5e9c063" containerName="cinder-api-log" Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.404040 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="80f53fca-b19a-4f27-a0fe-6debb5e9c063" containerName="cinder-api-log" Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.404064 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="80f53fca-b19a-4f27-a0fe-6debb5e9c063" containerName="cinder-api" Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 
14:31:32.405385 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.408579 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.424845 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.470535 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cab1c165-0d71-4e95-8e2a-ee9bbd5b650c-scripts\") pod \"cinder-api-0\" (UID: \"cab1c165-0d71-4e95-8e2a-ee9bbd5b650c\") " pod="openstack/cinder-api-0" Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.470936 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qc6gj\" (UniqueName: \"kubernetes.io/projected/cab1c165-0d71-4e95-8e2a-ee9bbd5b650c-kube-api-access-qc6gj\") pod \"cinder-api-0\" (UID: \"cab1c165-0d71-4e95-8e2a-ee9bbd5b650c\") " pod="openstack/cinder-api-0" Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.470990 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cab1c165-0d71-4e95-8e2a-ee9bbd5b650c-config-data\") pod \"cinder-api-0\" (UID: \"cab1c165-0d71-4e95-8e2a-ee9bbd5b650c\") " pod="openstack/cinder-api-0" Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.471049 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cab1c165-0d71-4e95-8e2a-ee9bbd5b650c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"cab1c165-0d71-4e95-8e2a-ee9bbd5b650c\") " pod="openstack/cinder-api-0" Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.471132 
4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cab1c165-0d71-4e95-8e2a-ee9bbd5b650c-logs\") pod \"cinder-api-0\" (UID: \"cab1c165-0d71-4e95-8e2a-ee9bbd5b650c\") " pod="openstack/cinder-api-0" Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.471210 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cab1c165-0d71-4e95-8e2a-ee9bbd5b650c-config-data-custom\") pod \"cinder-api-0\" (UID: \"cab1c165-0d71-4e95-8e2a-ee9bbd5b650c\") " pod="openstack/cinder-api-0" Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.471434 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cab1c165-0d71-4e95-8e2a-ee9bbd5b650c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"cab1c165-0d71-4e95-8e2a-ee9bbd5b650c\") " pod="openstack/cinder-api-0" Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.549786 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.550111 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.554629 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.573179 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.574158 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cab1c165-0d71-4e95-8e2a-ee9bbd5b650c-etc-machine-id\") pod \"cinder-api-0\" 
(UID: \"cab1c165-0d71-4e95-8e2a-ee9bbd5b650c\") " pod="openstack/cinder-api-0" Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.574226 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cab1c165-0d71-4e95-8e2a-ee9bbd5b650c-scripts\") pod \"cinder-api-0\" (UID: \"cab1c165-0d71-4e95-8e2a-ee9bbd5b650c\") " pod="openstack/cinder-api-0" Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.574285 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qc6gj\" (UniqueName: \"kubernetes.io/projected/cab1c165-0d71-4e95-8e2a-ee9bbd5b650c-kube-api-access-qc6gj\") pod \"cinder-api-0\" (UID: \"cab1c165-0d71-4e95-8e2a-ee9bbd5b650c\") " pod="openstack/cinder-api-0" Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.574317 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cab1c165-0d71-4e95-8e2a-ee9bbd5b650c-config-data\") pod \"cinder-api-0\" (UID: \"cab1c165-0d71-4e95-8e2a-ee9bbd5b650c\") " pod="openstack/cinder-api-0" Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.574324 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.574345 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cab1c165-0d71-4e95-8e2a-ee9bbd5b650c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"cab1c165-0d71-4e95-8e2a-ee9bbd5b650c\") " pod="openstack/cinder-api-0" Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.574389 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cab1c165-0d71-4e95-8e2a-ee9bbd5b650c-logs\") pod \"cinder-api-0\" (UID: \"cab1c165-0d71-4e95-8e2a-ee9bbd5b650c\") " pod="openstack/cinder-api-0" Oct 
08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.574446 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cab1c165-0d71-4e95-8e2a-ee9bbd5b650c-config-data-custom\") pod \"cinder-api-0\" (UID: \"cab1c165-0d71-4e95-8e2a-ee9bbd5b650c\") " pod="openstack/cinder-api-0" Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.574647 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cab1c165-0d71-4e95-8e2a-ee9bbd5b650c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"cab1c165-0d71-4e95-8e2a-ee9bbd5b650c\") " pod="openstack/cinder-api-0" Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.575244 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.575591 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cab1c165-0d71-4e95-8e2a-ee9bbd5b650c-logs\") pod \"cinder-api-0\" (UID: \"cab1c165-0d71-4e95-8e2a-ee9bbd5b650c\") " pod="openstack/cinder-api-0" Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.581054 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cab1c165-0d71-4e95-8e2a-ee9bbd5b650c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"cab1c165-0d71-4e95-8e2a-ee9bbd5b650c\") " pod="openstack/cinder-api-0" Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.581317 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cab1c165-0d71-4e95-8e2a-ee9bbd5b650c-config-data\") pod \"cinder-api-0\" (UID: \"cab1c165-0d71-4e95-8e2a-ee9bbd5b650c\") " pod="openstack/cinder-api-0" Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.581466 4735 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.582105 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cab1c165-0d71-4e95-8e2a-ee9bbd5b650c-scripts\") pod \"cinder-api-0\" (UID: \"cab1c165-0d71-4e95-8e2a-ee9bbd5b650c\") " pod="openstack/cinder-api-0" Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.586154 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cab1c165-0d71-4e95-8e2a-ee9bbd5b650c-config-data-custom\") pod \"cinder-api-0\" (UID: \"cab1c165-0d71-4e95-8e2a-ee9bbd5b650c\") " pod="openstack/cinder-api-0" Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.613818 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qc6gj\" (UniqueName: \"kubernetes.io/projected/cab1c165-0d71-4e95-8e2a-ee9bbd5b650c-kube-api-access-qc6gj\") pod \"cinder-api-0\" (UID: \"cab1c165-0d71-4e95-8e2a-ee9bbd5b650c\") " pod="openstack/cinder-api-0" Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.797560 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 08 14:31:32 crc kubenswrapper[4735]: I1008 14:31:32.924838 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80f53fca-b19a-4f27-a0fe-6debb5e9c063" path="/var/lib/kubelet/pods/80f53fca-b19a-4f27-a0fe-6debb5e9c063/volumes" Oct 08 14:31:33 crc kubenswrapper[4735]: I1008 14:31:33.244964 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 08 14:31:33 crc kubenswrapper[4735]: W1008 14:31:33.250194 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcab1c165_0d71_4e95_8e2a_ee9bbd5b650c.slice/crio-eed5d21e413b59e264f5bc189a3d5a5464adbc1b522b24f9fb02f89392b9c80c WatchSource:0}: Error finding container eed5d21e413b59e264f5bc189a3d5a5464adbc1b522b24f9fb02f89392b9c80c: Status 404 returned error can't find the container with id eed5d21e413b59e264f5bc189a3d5a5464adbc1b522b24f9fb02f89392b9c80c Oct 08 14:31:33 crc kubenswrapper[4735]: I1008 14:31:33.323107 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"cab1c165-0d71-4e95-8e2a-ee9bbd5b650c","Type":"ContainerStarted","Data":"eed5d21e413b59e264f5bc189a3d5a5464adbc1b522b24f9fb02f89392b9c80c"} Oct 08 14:31:33 crc kubenswrapper[4735]: I1008 14:31:33.326016 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 08 14:31:33 crc kubenswrapper[4735]: I1008 14:31:33.327556 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 08 14:31:33 crc kubenswrapper[4735]: I1008 14:31:33.328707 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 08 14:31:34 crc kubenswrapper[4735]: I1008 14:31:34.331928 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"cab1c165-0d71-4e95-8e2a-ee9bbd5b650c","Type":"ContainerStarted","Data":"3198fdcd85f9cfa5e2de5d416536da897045f9db9b8641b118e47b62e62371ac"} Oct 08 14:31:34 crc kubenswrapper[4735]: I1008 14:31:34.534687 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Oct 08 14:31:35 crc kubenswrapper[4735]: I1008 14:31:35.119236 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Oct 08 14:31:35 crc kubenswrapper[4735]: I1008 14:31:35.349723 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"cab1c165-0d71-4e95-8e2a-ee9bbd5b650c","Type":"ContainerStarted","Data":"29fc084f2e118875d425593ddf17fc2ca9f4bd77078cb14794403406428814f3"} Oct 08 14:31:35 crc kubenswrapper[4735]: I1008 14:31:35.350188 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 08 14:31:35 crc kubenswrapper[4735]: I1008 14:31:35.372110 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.372090068 podStartE2EDuration="3.372090068s" podCreationTimestamp="2025-10-08 14:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:31:35.367563982 +0000 UTC m=+5745.175868536" watchObservedRunningTime="2025-10-08 14:31:35.372090068 +0000 UTC m=+5745.180394612" Oct 08 14:31:37 crc kubenswrapper[4735]: I1008 14:31:37.536594 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 08 14:31:37 crc kubenswrapper[4735]: I1008 14:31:37.604606 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 08 14:31:38 crc kubenswrapper[4735]: I1008 14:31:38.375874 4735 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/cinder-scheduler-0" podUID="21ac301d-ca51-4091-a1b2-6f86dce8b9e8" containerName="cinder-scheduler" containerID="cri-o://b86bf4e6a594d0d236584a838674a0fe4f4b2b6d8cbfe4eefa41c8e16ca178a4" gracePeriod=30 Oct 08 14:31:38 crc kubenswrapper[4735]: I1008 14:31:38.375985 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="21ac301d-ca51-4091-a1b2-6f86dce8b9e8" containerName="probe" containerID="cri-o://b2e14f02808a10747777240f1c23db706fcce99836a6aad5b11e831ceefbbcb1" gracePeriod=30 Oct 08 14:31:39 crc kubenswrapper[4735]: I1008 14:31:39.390440 4735 generic.go:334] "Generic (PLEG): container finished" podID="21ac301d-ca51-4091-a1b2-6f86dce8b9e8" containerID="b2e14f02808a10747777240f1c23db706fcce99836a6aad5b11e831ceefbbcb1" exitCode=0 Oct 08 14:31:39 crc kubenswrapper[4735]: I1008 14:31:39.390554 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"21ac301d-ca51-4091-a1b2-6f86dce8b9e8","Type":"ContainerDied","Data":"b2e14f02808a10747777240f1c23db706fcce99836a6aad5b11e831ceefbbcb1"} Oct 08 14:31:39 crc kubenswrapper[4735]: I1008 14:31:39.746751 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Oct 08 14:31:40 crc kubenswrapper[4735]: I1008 14:31:40.303787 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 08 14:31:40 crc kubenswrapper[4735]: I1008 14:31:40.321756 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Oct 08 14:31:40 crc kubenswrapper[4735]: I1008 14:31:40.338452 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/21ac301d-ca51-4091-a1b2-6f86dce8b9e8-etc-machine-id\") pod \"21ac301d-ca51-4091-a1b2-6f86dce8b9e8\" (UID: \"21ac301d-ca51-4091-a1b2-6f86dce8b9e8\") " Oct 08 14:31:40 crc kubenswrapper[4735]: I1008 14:31:40.338572 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21ac301d-ca51-4091-a1b2-6f86dce8b9e8-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "21ac301d-ca51-4091-a1b2-6f86dce8b9e8" (UID: "21ac301d-ca51-4091-a1b2-6f86dce8b9e8"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 14:31:40 crc kubenswrapper[4735]: I1008 14:31:40.338932 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21ac301d-ca51-4091-a1b2-6f86dce8b9e8-config-data\") pod \"21ac301d-ca51-4091-a1b2-6f86dce8b9e8\" (UID: \"21ac301d-ca51-4091-a1b2-6f86dce8b9e8\") " Oct 08 14:31:40 crc kubenswrapper[4735]: I1008 14:31:40.339014 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21ac301d-ca51-4091-a1b2-6f86dce8b9e8-combined-ca-bundle\") pod \"21ac301d-ca51-4091-a1b2-6f86dce8b9e8\" (UID: \"21ac301d-ca51-4091-a1b2-6f86dce8b9e8\") " Oct 08 14:31:40 crc kubenswrapper[4735]: I1008 14:31:40.339074 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-452ff\" (UniqueName: 
\"kubernetes.io/projected/21ac301d-ca51-4091-a1b2-6f86dce8b9e8-kube-api-access-452ff\") pod \"21ac301d-ca51-4091-a1b2-6f86dce8b9e8\" (UID: \"21ac301d-ca51-4091-a1b2-6f86dce8b9e8\") " Oct 08 14:31:40 crc kubenswrapper[4735]: I1008 14:31:40.339131 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21ac301d-ca51-4091-a1b2-6f86dce8b9e8-scripts\") pod \"21ac301d-ca51-4091-a1b2-6f86dce8b9e8\" (UID: \"21ac301d-ca51-4091-a1b2-6f86dce8b9e8\") " Oct 08 14:31:40 crc kubenswrapper[4735]: I1008 14:31:40.339196 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/21ac301d-ca51-4091-a1b2-6f86dce8b9e8-config-data-custom\") pod \"21ac301d-ca51-4091-a1b2-6f86dce8b9e8\" (UID: \"21ac301d-ca51-4091-a1b2-6f86dce8b9e8\") " Oct 08 14:31:40 crc kubenswrapper[4735]: I1008 14:31:40.340511 4735 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/21ac301d-ca51-4091-a1b2-6f86dce8b9e8-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 08 14:31:40 crc kubenswrapper[4735]: I1008 14:31:40.368945 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21ac301d-ca51-4091-a1b2-6f86dce8b9e8-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "21ac301d-ca51-4091-a1b2-6f86dce8b9e8" (UID: "21ac301d-ca51-4091-a1b2-6f86dce8b9e8"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:31:40 crc kubenswrapper[4735]: I1008 14:31:40.369021 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21ac301d-ca51-4091-a1b2-6f86dce8b9e8-scripts" (OuterVolumeSpecName: "scripts") pod "21ac301d-ca51-4091-a1b2-6f86dce8b9e8" (UID: "21ac301d-ca51-4091-a1b2-6f86dce8b9e8"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:31:40 crc kubenswrapper[4735]: I1008 14:31:40.369239 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21ac301d-ca51-4091-a1b2-6f86dce8b9e8-kube-api-access-452ff" (OuterVolumeSpecName: "kube-api-access-452ff") pod "21ac301d-ca51-4091-a1b2-6f86dce8b9e8" (UID: "21ac301d-ca51-4091-a1b2-6f86dce8b9e8"). InnerVolumeSpecName "kube-api-access-452ff". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:31:40 crc kubenswrapper[4735]: I1008 14:31:40.410078 4735 generic.go:334] "Generic (PLEG): container finished" podID="21ac301d-ca51-4091-a1b2-6f86dce8b9e8" containerID="b86bf4e6a594d0d236584a838674a0fe4f4b2b6d8cbfe4eefa41c8e16ca178a4" exitCode=0 Oct 08 14:31:40 crc kubenswrapper[4735]: I1008 14:31:40.410120 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"21ac301d-ca51-4091-a1b2-6f86dce8b9e8","Type":"ContainerDied","Data":"b86bf4e6a594d0d236584a838674a0fe4f4b2b6d8cbfe4eefa41c8e16ca178a4"} Oct 08 14:31:40 crc kubenswrapper[4735]: I1008 14:31:40.410146 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"21ac301d-ca51-4091-a1b2-6f86dce8b9e8","Type":"ContainerDied","Data":"1a4b1c8299128c4cf0ffdec3112d3cf5f144de0135744b0356c33bb5b457a359"} Oct 08 14:31:40 crc kubenswrapper[4735]: I1008 14:31:40.410164 4735 scope.go:117] "RemoveContainer" containerID="b2e14f02808a10747777240f1c23db706fcce99836a6aad5b11e831ceefbbcb1" Oct 08 14:31:40 crc kubenswrapper[4735]: I1008 14:31:40.410305 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 08 14:31:40 crc kubenswrapper[4735]: I1008 14:31:40.434320 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21ac301d-ca51-4091-a1b2-6f86dce8b9e8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "21ac301d-ca51-4091-a1b2-6f86dce8b9e8" (UID: "21ac301d-ca51-4091-a1b2-6f86dce8b9e8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:31:40 crc kubenswrapper[4735]: I1008 14:31:40.442535 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21ac301d-ca51-4091-a1b2-6f86dce8b9e8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 14:31:40 crc kubenswrapper[4735]: I1008 14:31:40.442569 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-452ff\" (UniqueName: \"kubernetes.io/projected/21ac301d-ca51-4091-a1b2-6f86dce8b9e8-kube-api-access-452ff\") on node \"crc\" DevicePath \"\"" Oct 08 14:31:40 crc kubenswrapper[4735]: I1008 14:31:40.442580 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21ac301d-ca51-4091-a1b2-6f86dce8b9e8-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 14:31:40 crc kubenswrapper[4735]: I1008 14:31:40.442587 4735 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/21ac301d-ca51-4091-a1b2-6f86dce8b9e8-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 08 14:31:40 crc kubenswrapper[4735]: I1008 14:31:40.462030 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21ac301d-ca51-4091-a1b2-6f86dce8b9e8-config-data" (OuterVolumeSpecName: "config-data") pod "21ac301d-ca51-4091-a1b2-6f86dce8b9e8" (UID: "21ac301d-ca51-4091-a1b2-6f86dce8b9e8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:31:40 crc kubenswrapper[4735]: I1008 14:31:40.503569 4735 scope.go:117] "RemoveContainer" containerID="b86bf4e6a594d0d236584a838674a0fe4f4b2b6d8cbfe4eefa41c8e16ca178a4" Oct 08 14:31:40 crc kubenswrapper[4735]: I1008 14:31:40.541056 4735 scope.go:117] "RemoveContainer" containerID="b2e14f02808a10747777240f1c23db706fcce99836a6aad5b11e831ceefbbcb1" Oct 08 14:31:40 crc kubenswrapper[4735]: E1008 14:31:40.542939 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2e14f02808a10747777240f1c23db706fcce99836a6aad5b11e831ceefbbcb1\": container with ID starting with b2e14f02808a10747777240f1c23db706fcce99836a6aad5b11e831ceefbbcb1 not found: ID does not exist" containerID="b2e14f02808a10747777240f1c23db706fcce99836a6aad5b11e831ceefbbcb1" Oct 08 14:31:40 crc kubenswrapper[4735]: I1008 14:31:40.543084 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2e14f02808a10747777240f1c23db706fcce99836a6aad5b11e831ceefbbcb1"} err="failed to get container status \"b2e14f02808a10747777240f1c23db706fcce99836a6aad5b11e831ceefbbcb1\": rpc error: code = NotFound desc = could not find container \"b2e14f02808a10747777240f1c23db706fcce99836a6aad5b11e831ceefbbcb1\": container with ID starting with b2e14f02808a10747777240f1c23db706fcce99836a6aad5b11e831ceefbbcb1 not found: ID does not exist" Oct 08 14:31:40 crc kubenswrapper[4735]: I1008 14:31:40.543203 4735 scope.go:117] "RemoveContainer" containerID="b86bf4e6a594d0d236584a838674a0fe4f4b2b6d8cbfe4eefa41c8e16ca178a4" Oct 08 14:31:40 crc kubenswrapper[4735]: I1008 14:31:40.543838 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21ac301d-ca51-4091-a1b2-6f86dce8b9e8-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 14:31:40 crc kubenswrapper[4735]: E1008 14:31:40.546914 4735 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b86bf4e6a594d0d236584a838674a0fe4f4b2b6d8cbfe4eefa41c8e16ca178a4\": container with ID starting with b86bf4e6a594d0d236584a838674a0fe4f4b2b6d8cbfe4eefa41c8e16ca178a4 not found: ID does not exist" containerID="b86bf4e6a594d0d236584a838674a0fe4f4b2b6d8cbfe4eefa41c8e16ca178a4" Oct 08 14:31:40 crc kubenswrapper[4735]: I1008 14:31:40.547138 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b86bf4e6a594d0d236584a838674a0fe4f4b2b6d8cbfe4eefa41c8e16ca178a4"} err="failed to get container status \"b86bf4e6a594d0d236584a838674a0fe4f4b2b6d8cbfe4eefa41c8e16ca178a4\": rpc error: code = NotFound desc = could not find container \"b86bf4e6a594d0d236584a838674a0fe4f4b2b6d8cbfe4eefa41c8e16ca178a4\": container with ID starting with b86bf4e6a594d0d236584a838674a0fe4f4b2b6d8cbfe4eefa41c8e16ca178a4 not found: ID does not exist" Oct 08 14:31:40 crc kubenswrapper[4735]: I1008 14:31:40.765384 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 08 14:31:40 crc kubenswrapper[4735]: I1008 14:31:40.774129 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 08 14:31:40 crc kubenswrapper[4735]: I1008 14:31:40.791080 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 08 14:31:40 crc kubenswrapper[4735]: E1008 14:31:40.791648 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21ac301d-ca51-4091-a1b2-6f86dce8b9e8" containerName="probe" Oct 08 14:31:40 crc kubenswrapper[4735]: I1008 14:31:40.791670 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="21ac301d-ca51-4091-a1b2-6f86dce8b9e8" containerName="probe" Oct 08 14:31:40 crc kubenswrapper[4735]: E1008 14:31:40.791684 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21ac301d-ca51-4091-a1b2-6f86dce8b9e8" 
containerName="cinder-scheduler" Oct 08 14:31:40 crc kubenswrapper[4735]: I1008 14:31:40.791691 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="21ac301d-ca51-4091-a1b2-6f86dce8b9e8" containerName="cinder-scheduler" Oct 08 14:31:40 crc kubenswrapper[4735]: I1008 14:31:40.791889 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="21ac301d-ca51-4091-a1b2-6f86dce8b9e8" containerName="cinder-scheduler" Oct 08 14:31:40 crc kubenswrapper[4735]: I1008 14:31:40.791931 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="21ac301d-ca51-4091-a1b2-6f86dce8b9e8" containerName="probe" Oct 08 14:31:40 crc kubenswrapper[4735]: I1008 14:31:40.793106 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 08 14:31:40 crc kubenswrapper[4735]: I1008 14:31:40.795112 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 08 14:31:40 crc kubenswrapper[4735]: I1008 14:31:40.809018 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 08 14:31:40 crc kubenswrapper[4735]: I1008 14:31:40.849158 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3017100-96d4-4e28-ade2-3c0d3fe5462b-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b3017100-96d4-4e28-ade2-3c0d3fe5462b\") " pod="openstack/cinder-scheduler-0" Oct 08 14:31:40 crc kubenswrapper[4735]: I1008 14:31:40.849217 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b3017100-96d4-4e28-ade2-3c0d3fe5462b-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b3017100-96d4-4e28-ade2-3c0d3fe5462b\") " pod="openstack/cinder-scheduler-0" Oct 08 14:31:40 crc kubenswrapper[4735]: I1008 14:31:40.849292 4735 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hljhk\" (UniqueName: \"kubernetes.io/projected/b3017100-96d4-4e28-ade2-3c0d3fe5462b-kube-api-access-hljhk\") pod \"cinder-scheduler-0\" (UID: \"b3017100-96d4-4e28-ade2-3c0d3fe5462b\") " pod="openstack/cinder-scheduler-0" Oct 08 14:31:40 crc kubenswrapper[4735]: I1008 14:31:40.849322 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3017100-96d4-4e28-ade2-3c0d3fe5462b-scripts\") pod \"cinder-scheduler-0\" (UID: \"b3017100-96d4-4e28-ade2-3c0d3fe5462b\") " pod="openstack/cinder-scheduler-0" Oct 08 14:31:40 crc kubenswrapper[4735]: I1008 14:31:40.849352 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3017100-96d4-4e28-ade2-3c0d3fe5462b-config-data\") pod \"cinder-scheduler-0\" (UID: \"b3017100-96d4-4e28-ade2-3c0d3fe5462b\") " pod="openstack/cinder-scheduler-0" Oct 08 14:31:40 crc kubenswrapper[4735]: I1008 14:31:40.849391 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b3017100-96d4-4e28-ade2-3c0d3fe5462b-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b3017100-96d4-4e28-ade2-3c0d3fe5462b\") " pod="openstack/cinder-scheduler-0" Oct 08 14:31:40 crc kubenswrapper[4735]: I1008 14:31:40.908767 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21ac301d-ca51-4091-a1b2-6f86dce8b9e8" path="/var/lib/kubelet/pods/21ac301d-ca51-4091-a1b2-6f86dce8b9e8/volumes" Oct 08 14:31:40 crc kubenswrapper[4735]: I1008 14:31:40.950357 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hljhk\" (UniqueName: \"kubernetes.io/projected/b3017100-96d4-4e28-ade2-3c0d3fe5462b-kube-api-access-hljhk\") pod 
\"cinder-scheduler-0\" (UID: \"b3017100-96d4-4e28-ade2-3c0d3fe5462b\") " pod="openstack/cinder-scheduler-0" Oct 08 14:31:40 crc kubenswrapper[4735]: I1008 14:31:40.950402 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3017100-96d4-4e28-ade2-3c0d3fe5462b-scripts\") pod \"cinder-scheduler-0\" (UID: \"b3017100-96d4-4e28-ade2-3c0d3fe5462b\") " pod="openstack/cinder-scheduler-0" Oct 08 14:31:40 crc kubenswrapper[4735]: I1008 14:31:40.950430 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3017100-96d4-4e28-ade2-3c0d3fe5462b-config-data\") pod \"cinder-scheduler-0\" (UID: \"b3017100-96d4-4e28-ade2-3c0d3fe5462b\") " pod="openstack/cinder-scheduler-0" Oct 08 14:31:40 crc kubenswrapper[4735]: I1008 14:31:40.950470 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b3017100-96d4-4e28-ade2-3c0d3fe5462b-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b3017100-96d4-4e28-ade2-3c0d3fe5462b\") " pod="openstack/cinder-scheduler-0" Oct 08 14:31:40 crc kubenswrapper[4735]: I1008 14:31:40.950558 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3017100-96d4-4e28-ade2-3c0d3fe5462b-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b3017100-96d4-4e28-ade2-3c0d3fe5462b\") " pod="openstack/cinder-scheduler-0" Oct 08 14:31:40 crc kubenswrapper[4735]: I1008 14:31:40.950589 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b3017100-96d4-4e28-ade2-3c0d3fe5462b-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b3017100-96d4-4e28-ade2-3c0d3fe5462b\") " pod="openstack/cinder-scheduler-0" Oct 08 14:31:40 crc kubenswrapper[4735]: I1008 14:31:40.950789 
4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b3017100-96d4-4e28-ade2-3c0d3fe5462b-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b3017100-96d4-4e28-ade2-3c0d3fe5462b\") " pod="openstack/cinder-scheduler-0" Oct 08 14:31:40 crc kubenswrapper[4735]: I1008 14:31:40.954013 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3017100-96d4-4e28-ade2-3c0d3fe5462b-scripts\") pod \"cinder-scheduler-0\" (UID: \"b3017100-96d4-4e28-ade2-3c0d3fe5462b\") " pod="openstack/cinder-scheduler-0" Oct 08 14:31:40 crc kubenswrapper[4735]: I1008 14:31:40.954033 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b3017100-96d4-4e28-ade2-3c0d3fe5462b-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b3017100-96d4-4e28-ade2-3c0d3fe5462b\") " pod="openstack/cinder-scheduler-0" Oct 08 14:31:40 crc kubenswrapper[4735]: I1008 14:31:40.954339 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3017100-96d4-4e28-ade2-3c0d3fe5462b-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b3017100-96d4-4e28-ade2-3c0d3fe5462b\") " pod="openstack/cinder-scheduler-0" Oct 08 14:31:40 crc kubenswrapper[4735]: I1008 14:31:40.955352 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3017100-96d4-4e28-ade2-3c0d3fe5462b-config-data\") pod \"cinder-scheduler-0\" (UID: \"b3017100-96d4-4e28-ade2-3c0d3fe5462b\") " pod="openstack/cinder-scheduler-0" Oct 08 14:31:40 crc kubenswrapper[4735]: I1008 14:31:40.966762 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hljhk\" (UniqueName: \"kubernetes.io/projected/b3017100-96d4-4e28-ade2-3c0d3fe5462b-kube-api-access-hljhk\") pod 
\"cinder-scheduler-0\" (UID: \"b3017100-96d4-4e28-ade2-3c0d3fe5462b\") " pod="openstack/cinder-scheduler-0" Oct 08 14:31:41 crc kubenswrapper[4735]: I1008 14:31:41.113518 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 08 14:31:41 crc kubenswrapper[4735]: I1008 14:31:41.590337 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 08 14:31:41 crc kubenswrapper[4735]: W1008 14:31:41.597207 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb3017100_96d4_4e28_ade2_3c0d3fe5462b.slice/crio-0a4e76a3c79aaea840d0158b50c34bf489976b0b53db5cc521c7e1b04eedfd8a WatchSource:0}: Error finding container 0a4e76a3c79aaea840d0158b50c34bf489976b0b53db5cc521c7e1b04eedfd8a: Status 404 returned error can't find the container with id 0a4e76a3c79aaea840d0158b50c34bf489976b0b53db5cc521c7e1b04eedfd8a Oct 08 14:31:42 crc kubenswrapper[4735]: I1008 14:31:42.440706 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b3017100-96d4-4e28-ade2-3c0d3fe5462b","Type":"ContainerStarted","Data":"eb3f195fc46b3f62d08a19e7b1922ca859b032070ae9876235c0174f23803f6d"} Oct 08 14:31:42 crc kubenswrapper[4735]: I1008 14:31:42.441152 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b3017100-96d4-4e28-ade2-3c0d3fe5462b","Type":"ContainerStarted","Data":"0a4e76a3c79aaea840d0158b50c34bf489976b0b53db5cc521c7e1b04eedfd8a"} Oct 08 14:31:43 crc kubenswrapper[4735]: I1008 14:31:43.453124 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b3017100-96d4-4e28-ade2-3c0d3fe5462b","Type":"ContainerStarted","Data":"326e881b281dda288c5112f96d3a079f17887dba27e372b02b71d9d6926a3bd9"} Oct 08 14:31:43 crc kubenswrapper[4735]: I1008 14:31:43.487500 4735 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.487457813 podStartE2EDuration="3.487457813s" podCreationTimestamp="2025-10-08 14:31:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:31:43.47718577 +0000 UTC m=+5753.285490334" watchObservedRunningTime="2025-10-08 14:31:43.487457813 +0000 UTC m=+5753.295762357" Oct 08 14:31:44 crc kubenswrapper[4735]: I1008 14:31:44.697901 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 08 14:31:46 crc kubenswrapper[4735]: I1008 14:31:46.114035 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 08 14:31:51 crc kubenswrapper[4735]: I1008 14:31:51.337164 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 08 14:31:54 crc kubenswrapper[4735]: I1008 14:31:54.878309 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 14:31:54 crc kubenswrapper[4735]: I1008 14:31:54.878916 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 14:32:10 crc kubenswrapper[4735]: I1008 14:32:10.062934 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-5j8pg"] Oct 08 14:32:10 crc kubenswrapper[4735]: I1008 14:32:10.072629 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/keystone-db-create-5j8pg"] Oct 08 14:32:10 crc kubenswrapper[4735]: I1008 14:32:10.904891 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca290902-5f0a-4309-8a8e-ce058661f43b" path="/var/lib/kubelet/pods/ca290902-5f0a-4309-8a8e-ce058661f43b/volumes" Oct 08 14:32:20 crc kubenswrapper[4735]: I1008 14:32:20.036981 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-02b6-account-create-l9ft7"] Oct 08 14:32:20 crc kubenswrapper[4735]: I1008 14:32:20.045131 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-02b6-account-create-l9ft7"] Oct 08 14:32:20 crc kubenswrapper[4735]: I1008 14:32:20.913061 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87efdaa7-f0fd-4b41-9bd8-0a95aa9dce0a" path="/var/lib/kubelet/pods/87efdaa7-f0fd-4b41-9bd8-0a95aa9dce0a/volumes" Oct 08 14:32:24 crc kubenswrapper[4735]: I1008 14:32:24.879097 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 14:32:24 crc kubenswrapper[4735]: I1008 14:32:24.879374 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 14:32:24 crc kubenswrapper[4735]: I1008 14:32:24.879423 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" Oct 08 14:32:24 crc kubenswrapper[4735]: I1008 14:32:24.880170 4735 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d7e72bb2e03843dd08508e1aaf6943aa7ade5521224f4749387f79d3a9ac80f3"} pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 14:32:24 crc kubenswrapper[4735]: I1008 14:32:24.880229 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" containerID="cri-o://d7e72bb2e03843dd08508e1aaf6943aa7ade5521224f4749387f79d3a9ac80f3" gracePeriod=600 Oct 08 14:32:25 crc kubenswrapper[4735]: I1008 14:32:25.887793 4735 generic.go:334] "Generic (PLEG): container finished" podID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerID="d7e72bb2e03843dd08508e1aaf6943aa7ade5521224f4749387f79d3a9ac80f3" exitCode=0 Oct 08 14:32:25 crc kubenswrapper[4735]: I1008 14:32:25.887854 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" event={"ID":"2dfd0474-f64a-4d3b-9990-df182d079b10","Type":"ContainerDied","Data":"d7e72bb2e03843dd08508e1aaf6943aa7ade5521224f4749387f79d3a9ac80f3"} Oct 08 14:32:25 crc kubenswrapper[4735]: I1008 14:32:25.888335 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" event={"ID":"2dfd0474-f64a-4d3b-9990-df182d079b10","Type":"ContainerStarted","Data":"0ef7ce1bb2f88774957446f907a55957b94dad288444fb6f22993a47ba31f8b0"} Oct 08 14:32:25 crc kubenswrapper[4735]: I1008 14:32:25.888355 4735 scope.go:117] "RemoveContainer" containerID="d6c737b897aa463dba6310d9840b3f5b8e05389b1e2bc8c1a8b7b77133f4dc28" Oct 08 14:32:27 crc kubenswrapper[4735]: I1008 14:32:27.033995 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-2x86g"] Oct 08 14:32:27 crc 
kubenswrapper[4735]: I1008 14:32:27.045501 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-2x86g"] Oct 08 14:32:28 crc kubenswrapper[4735]: I1008 14:32:28.905829 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e5b0450-418e-4c96-8cda-c8086a3d6981" path="/var/lib/kubelet/pods/4e5b0450-418e-4c96-8cda-c8086a3d6981/volumes" Oct 08 14:32:40 crc kubenswrapper[4735]: I1008 14:32:40.031964 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-wl5lz"] Oct 08 14:32:40 crc kubenswrapper[4735]: I1008 14:32:40.041298 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-wl5lz"] Oct 08 14:32:40 crc kubenswrapper[4735]: I1008 14:32:40.905138 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0714d73-4ad0-4b09-a5e6-9890133d3915" path="/var/lib/kubelet/pods/e0714d73-4ad0-4b09-a5e6-9890133d3915/volumes" Oct 08 14:33:09 crc kubenswrapper[4735]: I1008 14:33:09.022137 4735 scope.go:117] "RemoveContainer" containerID="6089c013b279a7aaeb6b15cb8b1757b367fb3cad8b0c694b7e57bcdad4ae6a58" Oct 08 14:33:09 crc kubenswrapper[4735]: I1008 14:33:09.051258 4735 scope.go:117] "RemoveContainer" containerID="c732b0515033cc45cb7cfe96ba63551437730707fcc37386391e7a4d53f6cc04" Oct 08 14:33:09 crc kubenswrapper[4735]: I1008 14:33:09.082695 4735 scope.go:117] "RemoveContainer" containerID="a854b918e0f96f331795a3a94ed44712126896647447a0e726c430bb8469b6be" Oct 08 14:33:09 crc kubenswrapper[4735]: I1008 14:33:09.139993 4735 scope.go:117] "RemoveContainer" containerID="2f7f2090dcdea4d809fea56bb2f4782fce17d783c9044840f899cc1a4468d737" Oct 08 14:33:33 crc kubenswrapper[4735]: I1008 14:33:33.155817 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-8jxgq"] Oct 08 14:33:33 crc kubenswrapper[4735]: I1008 14:33:33.157767 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-8jxgq" Oct 08 14:33:33 crc kubenswrapper[4735]: I1008 14:33:33.162046 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-p24xp" Oct 08 14:33:33 crc kubenswrapper[4735]: I1008 14:33:33.162105 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Oct 08 14:33:33 crc kubenswrapper[4735]: I1008 14:33:33.176806 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-8jxgq"] Oct 08 14:33:33 crc kubenswrapper[4735]: I1008 14:33:33.190658 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-z897z"] Oct 08 14:33:33 crc kubenswrapper[4735]: I1008 14:33:33.193178 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-z897z" Oct 08 14:33:33 crc kubenswrapper[4735]: I1008 14:33:33.196027 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vvtn\" (UniqueName: \"kubernetes.io/projected/e962c965-3794-4e43-9e2d-a34b94810817-kube-api-access-4vvtn\") pod \"ovn-controller-8jxgq\" (UID: \"e962c965-3794-4e43-9e2d-a34b94810817\") " pod="openstack/ovn-controller-8jxgq" Oct 08 14:33:33 crc kubenswrapper[4735]: I1008 14:33:33.196117 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e962c965-3794-4e43-9e2d-a34b94810817-var-log-ovn\") pod \"ovn-controller-8jxgq\" (UID: \"e962c965-3794-4e43-9e2d-a34b94810817\") " pod="openstack/ovn-controller-8jxgq" Oct 08 14:33:33 crc kubenswrapper[4735]: I1008 14:33:33.196197 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e962c965-3794-4e43-9e2d-a34b94810817-scripts\") pod \"ovn-controller-8jxgq\" 
(UID: \"e962c965-3794-4e43-9e2d-a34b94810817\") " pod="openstack/ovn-controller-8jxgq" Oct 08 14:33:33 crc kubenswrapper[4735]: I1008 14:33:33.196263 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e962c965-3794-4e43-9e2d-a34b94810817-var-run\") pod \"ovn-controller-8jxgq\" (UID: \"e962c965-3794-4e43-9e2d-a34b94810817\") " pod="openstack/ovn-controller-8jxgq" Oct 08 14:33:33 crc kubenswrapper[4735]: I1008 14:33:33.196448 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e962c965-3794-4e43-9e2d-a34b94810817-var-run-ovn\") pod \"ovn-controller-8jxgq\" (UID: \"e962c965-3794-4e43-9e2d-a34b94810817\") " pod="openstack/ovn-controller-8jxgq" Oct 08 14:33:33 crc kubenswrapper[4735]: I1008 14:33:33.205095 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-z897z"] Oct 08 14:33:33 crc kubenswrapper[4735]: I1008 14:33:33.298853 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e962c965-3794-4e43-9e2d-a34b94810817-var-log-ovn\") pod \"ovn-controller-8jxgq\" (UID: \"e962c965-3794-4e43-9e2d-a34b94810817\") " pod="openstack/ovn-controller-8jxgq" Oct 08 14:33:33 crc kubenswrapper[4735]: I1008 14:33:33.298909 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/7479271a-126c-460b-8275-f194ac2fff0f-etc-ovs\") pod \"ovn-controller-ovs-z897z\" (UID: \"7479271a-126c-460b-8275-f194ac2fff0f\") " pod="openstack/ovn-controller-ovs-z897z" Oct 08 14:33:33 crc kubenswrapper[4735]: I1008 14:33:33.298956 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/e962c965-3794-4e43-9e2d-a34b94810817-scripts\") pod \"ovn-controller-8jxgq\" (UID: \"e962c965-3794-4e43-9e2d-a34b94810817\") " pod="openstack/ovn-controller-8jxgq" Oct 08 14:33:33 crc kubenswrapper[4735]: I1008 14:33:33.298975 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e962c965-3794-4e43-9e2d-a34b94810817-var-run\") pod \"ovn-controller-8jxgq\" (UID: \"e962c965-3794-4e43-9e2d-a34b94810817\") " pod="openstack/ovn-controller-8jxgq" Oct 08 14:33:33 crc kubenswrapper[4735]: I1008 14:33:33.299019 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7479271a-126c-460b-8275-f194ac2fff0f-scripts\") pod \"ovn-controller-ovs-z897z\" (UID: \"7479271a-126c-460b-8275-f194ac2fff0f\") " pod="openstack/ovn-controller-ovs-z897z" Oct 08 14:33:33 crc kubenswrapper[4735]: I1008 14:33:33.299044 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e962c965-3794-4e43-9e2d-a34b94810817-var-run-ovn\") pod \"ovn-controller-8jxgq\" (UID: \"e962c965-3794-4e43-9e2d-a34b94810817\") " pod="openstack/ovn-controller-8jxgq" Oct 08 14:33:33 crc kubenswrapper[4735]: I1008 14:33:33.299071 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7479271a-126c-460b-8275-f194ac2fff0f-var-run\") pod \"ovn-controller-ovs-z897z\" (UID: \"7479271a-126c-460b-8275-f194ac2fff0f\") " pod="openstack/ovn-controller-ovs-z897z" Oct 08 14:33:33 crc kubenswrapper[4735]: I1008 14:33:33.299107 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qffc5\" (UniqueName: \"kubernetes.io/projected/7479271a-126c-460b-8275-f194ac2fff0f-kube-api-access-qffc5\") pod 
\"ovn-controller-ovs-z897z\" (UID: \"7479271a-126c-460b-8275-f194ac2fff0f\") " pod="openstack/ovn-controller-ovs-z897z" Oct 08 14:33:33 crc kubenswrapper[4735]: I1008 14:33:33.299126 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/7479271a-126c-460b-8275-f194ac2fff0f-var-log\") pod \"ovn-controller-ovs-z897z\" (UID: \"7479271a-126c-460b-8275-f194ac2fff0f\") " pod="openstack/ovn-controller-ovs-z897z" Oct 08 14:33:33 crc kubenswrapper[4735]: I1008 14:33:33.299150 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/7479271a-126c-460b-8275-f194ac2fff0f-var-lib\") pod \"ovn-controller-ovs-z897z\" (UID: \"7479271a-126c-460b-8275-f194ac2fff0f\") " pod="openstack/ovn-controller-ovs-z897z" Oct 08 14:33:33 crc kubenswrapper[4735]: I1008 14:33:33.299220 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vvtn\" (UniqueName: \"kubernetes.io/projected/e962c965-3794-4e43-9e2d-a34b94810817-kube-api-access-4vvtn\") pod \"ovn-controller-8jxgq\" (UID: \"e962c965-3794-4e43-9e2d-a34b94810817\") " pod="openstack/ovn-controller-8jxgq" Oct 08 14:33:33 crc kubenswrapper[4735]: I1008 14:33:33.299233 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e962c965-3794-4e43-9e2d-a34b94810817-var-log-ovn\") pod \"ovn-controller-8jxgq\" (UID: \"e962c965-3794-4e43-9e2d-a34b94810817\") " pod="openstack/ovn-controller-8jxgq" Oct 08 14:33:33 crc kubenswrapper[4735]: I1008 14:33:33.299319 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e962c965-3794-4e43-9e2d-a34b94810817-var-run-ovn\") pod \"ovn-controller-8jxgq\" (UID: \"e962c965-3794-4e43-9e2d-a34b94810817\") " pod="openstack/ovn-controller-8jxgq" 
Oct 08 14:33:33 crc kubenswrapper[4735]: I1008 14:33:33.299670 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e962c965-3794-4e43-9e2d-a34b94810817-var-run\") pod \"ovn-controller-8jxgq\" (UID: \"e962c965-3794-4e43-9e2d-a34b94810817\") " pod="openstack/ovn-controller-8jxgq" Oct 08 14:33:33 crc kubenswrapper[4735]: I1008 14:33:33.301090 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e962c965-3794-4e43-9e2d-a34b94810817-scripts\") pod \"ovn-controller-8jxgq\" (UID: \"e962c965-3794-4e43-9e2d-a34b94810817\") " pod="openstack/ovn-controller-8jxgq" Oct 08 14:33:33 crc kubenswrapper[4735]: I1008 14:33:33.327438 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vvtn\" (UniqueName: \"kubernetes.io/projected/e962c965-3794-4e43-9e2d-a34b94810817-kube-api-access-4vvtn\") pod \"ovn-controller-8jxgq\" (UID: \"e962c965-3794-4e43-9e2d-a34b94810817\") " pod="openstack/ovn-controller-8jxgq" Oct 08 14:33:33 crc kubenswrapper[4735]: I1008 14:33:33.400716 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7479271a-126c-460b-8275-f194ac2fff0f-scripts\") pod \"ovn-controller-ovs-z897z\" (UID: \"7479271a-126c-460b-8275-f194ac2fff0f\") " pod="openstack/ovn-controller-ovs-z897z" Oct 08 14:33:33 crc kubenswrapper[4735]: I1008 14:33:33.400776 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7479271a-126c-460b-8275-f194ac2fff0f-var-run\") pod \"ovn-controller-ovs-z897z\" (UID: \"7479271a-126c-460b-8275-f194ac2fff0f\") " pod="openstack/ovn-controller-ovs-z897z" Oct 08 14:33:33 crc kubenswrapper[4735]: I1008 14:33:33.400819 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qffc5\" (UniqueName: 
\"kubernetes.io/projected/7479271a-126c-460b-8275-f194ac2fff0f-kube-api-access-qffc5\") pod \"ovn-controller-ovs-z897z\" (UID: \"7479271a-126c-460b-8275-f194ac2fff0f\") " pod="openstack/ovn-controller-ovs-z897z" Oct 08 14:33:33 crc kubenswrapper[4735]: I1008 14:33:33.400863 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/7479271a-126c-460b-8275-f194ac2fff0f-var-log\") pod \"ovn-controller-ovs-z897z\" (UID: \"7479271a-126c-460b-8275-f194ac2fff0f\") " pod="openstack/ovn-controller-ovs-z897z" Oct 08 14:33:33 crc kubenswrapper[4735]: I1008 14:33:33.400923 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/7479271a-126c-460b-8275-f194ac2fff0f-var-lib\") pod \"ovn-controller-ovs-z897z\" (UID: \"7479271a-126c-460b-8275-f194ac2fff0f\") " pod="openstack/ovn-controller-ovs-z897z" Oct 08 14:33:33 crc kubenswrapper[4735]: I1008 14:33:33.401002 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/7479271a-126c-460b-8275-f194ac2fff0f-etc-ovs\") pod \"ovn-controller-ovs-z897z\" (UID: \"7479271a-126c-460b-8275-f194ac2fff0f\") " pod="openstack/ovn-controller-ovs-z897z" Oct 08 14:33:33 crc kubenswrapper[4735]: I1008 14:33:33.401045 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7479271a-126c-460b-8275-f194ac2fff0f-var-run\") pod \"ovn-controller-ovs-z897z\" (UID: \"7479271a-126c-460b-8275-f194ac2fff0f\") " pod="openstack/ovn-controller-ovs-z897z" Oct 08 14:33:33 crc kubenswrapper[4735]: I1008 14:33:33.401132 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/7479271a-126c-460b-8275-f194ac2fff0f-var-log\") pod \"ovn-controller-ovs-z897z\" (UID: \"7479271a-126c-460b-8275-f194ac2fff0f\") " 
pod="openstack/ovn-controller-ovs-z897z" Oct 08 14:33:33 crc kubenswrapper[4735]: I1008 14:33:33.401143 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/7479271a-126c-460b-8275-f194ac2fff0f-var-lib\") pod \"ovn-controller-ovs-z897z\" (UID: \"7479271a-126c-460b-8275-f194ac2fff0f\") " pod="openstack/ovn-controller-ovs-z897z" Oct 08 14:33:33 crc kubenswrapper[4735]: I1008 14:33:33.401197 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/7479271a-126c-460b-8275-f194ac2fff0f-etc-ovs\") pod \"ovn-controller-ovs-z897z\" (UID: \"7479271a-126c-460b-8275-f194ac2fff0f\") " pod="openstack/ovn-controller-ovs-z897z" Oct 08 14:33:33 crc kubenswrapper[4735]: I1008 14:33:33.403166 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7479271a-126c-460b-8275-f194ac2fff0f-scripts\") pod \"ovn-controller-ovs-z897z\" (UID: \"7479271a-126c-460b-8275-f194ac2fff0f\") " pod="openstack/ovn-controller-ovs-z897z" Oct 08 14:33:33 crc kubenswrapper[4735]: I1008 14:33:33.418349 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qffc5\" (UniqueName: \"kubernetes.io/projected/7479271a-126c-460b-8275-f194ac2fff0f-kube-api-access-qffc5\") pod \"ovn-controller-ovs-z897z\" (UID: \"7479271a-126c-460b-8275-f194ac2fff0f\") " pod="openstack/ovn-controller-ovs-z897z" Oct 08 14:33:33 crc kubenswrapper[4735]: I1008 14:33:33.487735 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-8jxgq" Oct 08 14:33:33 crc kubenswrapper[4735]: I1008 14:33:33.527199 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-z897z" Oct 08 14:33:33 crc kubenswrapper[4735]: I1008 14:33:33.995499 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-8jxgq"] Oct 08 14:33:34 crc kubenswrapper[4735]: I1008 14:33:34.402760 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-z897z"] Oct 08 14:33:34 crc kubenswrapper[4735]: I1008 14:33:34.536456 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-d9b2b"] Oct 08 14:33:34 crc kubenswrapper[4735]: I1008 14:33:34.540825 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-d9b2b" Oct 08 14:33:34 crc kubenswrapper[4735]: I1008 14:33:34.544420 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Oct 08 14:33:34 crc kubenswrapper[4735]: I1008 14:33:34.557745 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-z897z" event={"ID":"7479271a-126c-460b-8275-f194ac2fff0f","Type":"ContainerStarted","Data":"9a913feccf8d3a38574dab0b2486791e0b6fdee26f2032e59f89753900276aa4"} Oct 08 14:33:34 crc kubenswrapper[4735]: I1008 14:33:34.566792 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-8jxgq" event={"ID":"e962c965-3794-4e43-9e2d-a34b94810817","Type":"ContainerStarted","Data":"b668585dbcb5abfc8d73500b920f098ccfb6c2598a0c28c82fe0d5b45ab5eb04"} Oct 08 14:33:34 crc kubenswrapper[4735]: I1008 14:33:34.566841 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-8jxgq" event={"ID":"e962c965-3794-4e43-9e2d-a34b94810817","Type":"ContainerStarted","Data":"09c9db8adae375ee3de5ac5d39a1d91331ccf86fd414f258fc2a29947eba1474"} Oct 08 14:33:34 crc kubenswrapper[4735]: I1008 14:33:34.568022 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-8jxgq" 
Oct 08 14:33:34 crc kubenswrapper[4735]: I1008 14:33:34.574745 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-d9b2b"] Oct 08 14:33:34 crc kubenswrapper[4735]: I1008 14:33:34.598967 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-8jxgq" podStartSLOduration=1.598946429 podStartE2EDuration="1.598946429s" podCreationTimestamp="2025-10-08 14:33:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:33:34.584375496 +0000 UTC m=+5864.392680040" watchObservedRunningTime="2025-10-08 14:33:34.598946429 +0000 UTC m=+5864.407250973" Oct 08 14:33:34 crc kubenswrapper[4735]: I1008 14:33:34.641680 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/242476e6-2f91-4031-aba8-9fe336a6f44a-config\") pod \"ovn-controller-metrics-d9b2b\" (UID: \"242476e6-2f91-4031-aba8-9fe336a6f44a\") " pod="openstack/ovn-controller-metrics-d9b2b" Oct 08 14:33:34 crc kubenswrapper[4735]: I1008 14:33:34.641747 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/242476e6-2f91-4031-aba8-9fe336a6f44a-ovn-rundir\") pod \"ovn-controller-metrics-d9b2b\" (UID: \"242476e6-2f91-4031-aba8-9fe336a6f44a\") " pod="openstack/ovn-controller-metrics-d9b2b" Oct 08 14:33:34 crc kubenswrapper[4735]: I1008 14:33:34.641834 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/242476e6-2f91-4031-aba8-9fe336a6f44a-ovs-rundir\") pod \"ovn-controller-metrics-d9b2b\" (UID: \"242476e6-2f91-4031-aba8-9fe336a6f44a\") " pod="openstack/ovn-controller-metrics-d9b2b" Oct 08 14:33:34 crc kubenswrapper[4735]: I1008 14:33:34.642035 4735 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7hsv\" (UniqueName: \"kubernetes.io/projected/242476e6-2f91-4031-aba8-9fe336a6f44a-kube-api-access-l7hsv\") pod \"ovn-controller-metrics-d9b2b\" (UID: \"242476e6-2f91-4031-aba8-9fe336a6f44a\") " pod="openstack/ovn-controller-metrics-d9b2b" Oct 08 14:33:34 crc kubenswrapper[4735]: I1008 14:33:34.743722 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/242476e6-2f91-4031-aba8-9fe336a6f44a-config\") pod \"ovn-controller-metrics-d9b2b\" (UID: \"242476e6-2f91-4031-aba8-9fe336a6f44a\") " pod="openstack/ovn-controller-metrics-d9b2b" Oct 08 14:33:34 crc kubenswrapper[4735]: I1008 14:33:34.744446 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/242476e6-2f91-4031-aba8-9fe336a6f44a-config\") pod \"ovn-controller-metrics-d9b2b\" (UID: \"242476e6-2f91-4031-aba8-9fe336a6f44a\") " pod="openstack/ovn-controller-metrics-d9b2b" Oct 08 14:33:34 crc kubenswrapper[4735]: I1008 14:33:34.744572 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/242476e6-2f91-4031-aba8-9fe336a6f44a-ovn-rundir\") pod \"ovn-controller-metrics-d9b2b\" (UID: \"242476e6-2f91-4031-aba8-9fe336a6f44a\") " pod="openstack/ovn-controller-metrics-d9b2b" Oct 08 14:33:34 crc kubenswrapper[4735]: I1008 14:33:34.744847 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/242476e6-2f91-4031-aba8-9fe336a6f44a-ovn-rundir\") pod \"ovn-controller-metrics-d9b2b\" (UID: \"242476e6-2f91-4031-aba8-9fe336a6f44a\") " pod="openstack/ovn-controller-metrics-d9b2b" Oct 08 14:33:34 crc kubenswrapper[4735]: I1008 14:33:34.744910 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/242476e6-2f91-4031-aba8-9fe336a6f44a-ovs-rundir\") pod \"ovn-controller-metrics-d9b2b\" (UID: \"242476e6-2f91-4031-aba8-9fe336a6f44a\") " pod="openstack/ovn-controller-metrics-d9b2b" Oct 08 14:33:34 crc kubenswrapper[4735]: I1008 14:33:34.745002 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/242476e6-2f91-4031-aba8-9fe336a6f44a-ovs-rundir\") pod \"ovn-controller-metrics-d9b2b\" (UID: \"242476e6-2f91-4031-aba8-9fe336a6f44a\") " pod="openstack/ovn-controller-metrics-d9b2b" Oct 08 14:33:34 crc kubenswrapper[4735]: I1008 14:33:34.745176 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7hsv\" (UniqueName: \"kubernetes.io/projected/242476e6-2f91-4031-aba8-9fe336a6f44a-kube-api-access-l7hsv\") pod \"ovn-controller-metrics-d9b2b\" (UID: \"242476e6-2f91-4031-aba8-9fe336a6f44a\") " pod="openstack/ovn-controller-metrics-d9b2b" Oct 08 14:33:34 crc kubenswrapper[4735]: I1008 14:33:34.766022 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7hsv\" (UniqueName: \"kubernetes.io/projected/242476e6-2f91-4031-aba8-9fe336a6f44a-kube-api-access-l7hsv\") pod \"ovn-controller-metrics-d9b2b\" (UID: \"242476e6-2f91-4031-aba8-9fe336a6f44a\") " pod="openstack/ovn-controller-metrics-d9b2b" Oct 08 14:33:34 crc kubenswrapper[4735]: I1008 14:33:34.879158 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-d9b2b" Oct 08 14:33:35 crc kubenswrapper[4735]: I1008 14:33:35.370890 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-d9b2b"] Oct 08 14:33:35 crc kubenswrapper[4735]: I1008 14:33:35.578595 4735 generic.go:334] "Generic (PLEG): container finished" podID="7479271a-126c-460b-8275-f194ac2fff0f" containerID="94a4761b40ab36bb476b3ddb4e936f59188347fc0ea12995ed2e57688130f885" exitCode=0 Oct 08 14:33:35 crc kubenswrapper[4735]: I1008 14:33:35.578699 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-z897z" event={"ID":"7479271a-126c-460b-8275-f194ac2fff0f","Type":"ContainerDied","Data":"94a4761b40ab36bb476b3ddb4e936f59188347fc0ea12995ed2e57688130f885"} Oct 08 14:33:35 crc kubenswrapper[4735]: I1008 14:33:35.580628 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-d9b2b" event={"ID":"242476e6-2f91-4031-aba8-9fe336a6f44a","Type":"ContainerStarted","Data":"3cd1acb3fca6e03fe343090eca9386bf6887036700da8365cad7d5d8ee592562"} Oct 08 14:33:35 crc kubenswrapper[4735]: I1008 14:33:35.776319 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-create-p4mlz"] Oct 08 14:33:35 crc kubenswrapper[4735]: I1008 14:33:35.779643 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-create-p4mlz" Oct 08 14:33:35 crc kubenswrapper[4735]: I1008 14:33:35.789242 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-p4mlz"] Oct 08 14:33:35 crc kubenswrapper[4735]: I1008 14:33:35.877793 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xq5x9\" (UniqueName: \"kubernetes.io/projected/0b98b6a4-048b-4af1-9661-3a99bc46599e-kube-api-access-xq5x9\") pod \"octavia-db-create-p4mlz\" (UID: \"0b98b6a4-048b-4af1-9661-3a99bc46599e\") " pod="openstack/octavia-db-create-p4mlz" Oct 08 14:33:35 crc kubenswrapper[4735]: I1008 14:33:35.980611 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xq5x9\" (UniqueName: \"kubernetes.io/projected/0b98b6a4-048b-4af1-9661-3a99bc46599e-kube-api-access-xq5x9\") pod \"octavia-db-create-p4mlz\" (UID: \"0b98b6a4-048b-4af1-9661-3a99bc46599e\") " pod="openstack/octavia-db-create-p4mlz" Oct 08 14:33:36 crc kubenswrapper[4735]: I1008 14:33:36.006882 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xq5x9\" (UniqueName: \"kubernetes.io/projected/0b98b6a4-048b-4af1-9661-3a99bc46599e-kube-api-access-xq5x9\") pod \"octavia-db-create-p4mlz\" (UID: \"0b98b6a4-048b-4af1-9661-3a99bc46599e\") " pod="openstack/octavia-db-create-p4mlz" Oct 08 14:33:36 crc kubenswrapper[4735]: I1008 14:33:36.108360 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-create-p4mlz" Oct 08 14:33:36 crc kubenswrapper[4735]: I1008 14:33:36.604119 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-z897z" event={"ID":"7479271a-126c-460b-8275-f194ac2fff0f","Type":"ContainerStarted","Data":"5304e3ef8a01afd12b24f78c6545c4117f564e4c1d5cc3dbf5207cea84de1273"} Oct 08 14:33:36 crc kubenswrapper[4735]: I1008 14:33:36.604381 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-z897z" event={"ID":"7479271a-126c-460b-8275-f194ac2fff0f","Type":"ContainerStarted","Data":"d42183d9b138cdc354aa828c9a4542bdeb624f385501ff39fabb6f2dd61bbf6a"} Oct 08 14:33:36 crc kubenswrapper[4735]: I1008 14:33:36.605174 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-z897z" Oct 08 14:33:36 crc kubenswrapper[4735]: I1008 14:33:36.608082 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-d9b2b" event={"ID":"242476e6-2f91-4031-aba8-9fe336a6f44a","Type":"ContainerStarted","Data":"8f64bc9109db9e84b2a66d863f2a26f30f4fa827bb2365c422600f58f29018cd"} Oct 08 14:33:36 crc kubenswrapper[4735]: I1008 14:33:36.609225 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-p4mlz"] Oct 08 14:33:36 crc kubenswrapper[4735]: W1008 14:33:36.619900 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b98b6a4_048b_4af1_9661_3a99bc46599e.slice/crio-a989840e552c478ebabfd2a45d191c0f21e21d83209d9c173fe8615a075bde2e WatchSource:0}: Error finding container a989840e552c478ebabfd2a45d191c0f21e21d83209d9c173fe8615a075bde2e: Status 404 returned error can't find the container with id a989840e552c478ebabfd2a45d191c0f21e21d83209d9c173fe8615a075bde2e Oct 08 14:33:36 crc kubenswrapper[4735]: I1008 14:33:36.639071 4735 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/ovn-controller-ovs-z897z" podStartSLOduration=3.639052581 podStartE2EDuration="3.639052581s" podCreationTimestamp="2025-10-08 14:33:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:33:36.628316876 +0000 UTC m=+5866.436621420" watchObservedRunningTime="2025-10-08 14:33:36.639052581 +0000 UTC m=+5866.447357125" Oct 08 14:33:36 crc kubenswrapper[4735]: I1008 14:33:36.667288 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-d9b2b" podStartSLOduration=2.667265503 podStartE2EDuration="2.667265503s" podCreationTimestamp="2025-10-08 14:33:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:33:36.643933676 +0000 UTC m=+5866.452238220" watchObservedRunningTime="2025-10-08 14:33:36.667265503 +0000 UTC m=+5866.475570047" Oct 08 14:33:37 crc kubenswrapper[4735]: I1008 14:33:37.617939 4735 generic.go:334] "Generic (PLEG): container finished" podID="0b98b6a4-048b-4af1-9661-3a99bc46599e" containerID="3dcc0a917e17d897f3e17f6a7be8577f9e3f2be9dd1fb73345313bff9ac5117b" exitCode=0 Oct 08 14:33:37 crc kubenswrapper[4735]: I1008 14:33:37.618010 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-p4mlz" event={"ID":"0b98b6a4-048b-4af1-9661-3a99bc46599e","Type":"ContainerDied","Data":"3dcc0a917e17d897f3e17f6a7be8577f9e3f2be9dd1fb73345313bff9ac5117b"} Oct 08 14:33:37 crc kubenswrapper[4735]: I1008 14:33:37.618303 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-p4mlz" event={"ID":"0b98b6a4-048b-4af1-9661-3a99bc46599e","Type":"ContainerStarted","Data":"a989840e552c478ebabfd2a45d191c0f21e21d83209d9c173fe8615a075bde2e"} Oct 08 14:33:37 crc kubenswrapper[4735]: I1008 14:33:37.618536 4735 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/ovn-controller-ovs-z897z" Oct 08 14:33:38 crc kubenswrapper[4735]: I1008 14:33:38.963632 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-p4mlz" Oct 08 14:33:39 crc kubenswrapper[4735]: I1008 14:33:39.040197 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xq5x9\" (UniqueName: \"kubernetes.io/projected/0b98b6a4-048b-4af1-9661-3a99bc46599e-kube-api-access-xq5x9\") pod \"0b98b6a4-048b-4af1-9661-3a99bc46599e\" (UID: \"0b98b6a4-048b-4af1-9661-3a99bc46599e\") " Oct 08 14:33:39 crc kubenswrapper[4735]: I1008 14:33:39.045298 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b98b6a4-048b-4af1-9661-3a99bc46599e-kube-api-access-xq5x9" (OuterVolumeSpecName: "kube-api-access-xq5x9") pod "0b98b6a4-048b-4af1-9661-3a99bc46599e" (UID: "0b98b6a4-048b-4af1-9661-3a99bc46599e"). InnerVolumeSpecName "kube-api-access-xq5x9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:33:39 crc kubenswrapper[4735]: I1008 14:33:39.142264 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xq5x9\" (UniqueName: \"kubernetes.io/projected/0b98b6a4-048b-4af1-9661-3a99bc46599e-kube-api-access-xq5x9\") on node \"crc\" DevicePath \"\"" Oct 08 14:33:39 crc kubenswrapper[4735]: I1008 14:33:39.638062 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-p4mlz" event={"ID":"0b98b6a4-048b-4af1-9661-3a99bc46599e","Type":"ContainerDied","Data":"a989840e552c478ebabfd2a45d191c0f21e21d83209d9c173fe8615a075bde2e"} Oct 08 14:33:39 crc kubenswrapper[4735]: I1008 14:33:39.638104 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a989840e552c478ebabfd2a45d191c0f21e21d83209d9c173fe8615a075bde2e" Oct 08 14:33:39 crc kubenswrapper[4735]: I1008 14:33:39.638178 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-p4mlz" Oct 08 14:33:48 crc kubenswrapper[4735]: I1008 14:33:48.164906 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-5fa0-account-create-m4dtb"] Oct 08 14:33:48 crc kubenswrapper[4735]: E1008 14:33:48.166044 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b98b6a4-048b-4af1-9661-3a99bc46599e" containerName="mariadb-database-create" Oct 08 14:33:48 crc kubenswrapper[4735]: I1008 14:33:48.166063 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b98b6a4-048b-4af1-9661-3a99bc46599e" containerName="mariadb-database-create" Oct 08 14:33:48 crc kubenswrapper[4735]: I1008 14:33:48.166284 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b98b6a4-048b-4af1-9661-3a99bc46599e" containerName="mariadb-database-create" Oct 08 14:33:48 crc kubenswrapper[4735]: I1008 14:33:48.167096 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-5fa0-account-create-m4dtb" Oct 08 14:33:48 crc kubenswrapper[4735]: I1008 14:33:48.171766 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-db-secret" Oct 08 14:33:48 crc kubenswrapper[4735]: I1008 14:33:48.176255 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-5fa0-account-create-m4dtb"] Oct 08 14:33:48 crc kubenswrapper[4735]: I1008 14:33:48.317466 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gmjb\" (UniqueName: \"kubernetes.io/projected/b61c0f13-2922-426f-a4e7-f88721437e91-kube-api-access-7gmjb\") pod \"octavia-5fa0-account-create-m4dtb\" (UID: \"b61c0f13-2922-426f-a4e7-f88721437e91\") " pod="openstack/octavia-5fa0-account-create-m4dtb" Oct 08 14:33:48 crc kubenswrapper[4735]: I1008 14:33:48.419511 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gmjb\" (UniqueName: \"kubernetes.io/projected/b61c0f13-2922-426f-a4e7-f88721437e91-kube-api-access-7gmjb\") pod \"octavia-5fa0-account-create-m4dtb\" (UID: \"b61c0f13-2922-426f-a4e7-f88721437e91\") " pod="openstack/octavia-5fa0-account-create-m4dtb" Oct 08 14:33:48 crc kubenswrapper[4735]: I1008 14:33:48.458499 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gmjb\" (UniqueName: \"kubernetes.io/projected/b61c0f13-2922-426f-a4e7-f88721437e91-kube-api-access-7gmjb\") pod \"octavia-5fa0-account-create-m4dtb\" (UID: \"b61c0f13-2922-426f-a4e7-f88721437e91\") " pod="openstack/octavia-5fa0-account-create-m4dtb" Oct 08 14:33:48 crc kubenswrapper[4735]: I1008 14:33:48.490876 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-5fa0-account-create-m4dtb" Oct 08 14:33:49 crc kubenswrapper[4735]: I1008 14:33:49.002701 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-5fa0-account-create-m4dtb"] Oct 08 14:33:49 crc kubenswrapper[4735]: W1008 14:33:49.006070 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb61c0f13_2922_426f_a4e7_f88721437e91.slice/crio-00bd48d9e817eb207514ebf4bba0d28012a54ee29ee31f579aae1513687e3c73 WatchSource:0}: Error finding container 00bd48d9e817eb207514ebf4bba0d28012a54ee29ee31f579aae1513687e3c73: Status 404 returned error can't find the container with id 00bd48d9e817eb207514ebf4bba0d28012a54ee29ee31f579aae1513687e3c73 Oct 08 14:33:49 crc kubenswrapper[4735]: I1008 14:33:49.754301 4735 generic.go:334] "Generic (PLEG): container finished" podID="b61c0f13-2922-426f-a4e7-f88721437e91" containerID="86e5922a0d217fc9d4d9d64965ed5bc8690c63ce590f8bfd81745372811af5a0" exitCode=0 Oct 08 14:33:49 crc kubenswrapper[4735]: I1008 14:33:49.754397 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-5fa0-account-create-m4dtb" event={"ID":"b61c0f13-2922-426f-a4e7-f88721437e91","Type":"ContainerDied","Data":"86e5922a0d217fc9d4d9d64965ed5bc8690c63ce590f8bfd81745372811af5a0"} Oct 08 14:33:49 crc kubenswrapper[4735]: I1008 14:33:49.754608 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-5fa0-account-create-m4dtb" event={"ID":"b61c0f13-2922-426f-a4e7-f88721437e91","Type":"ContainerStarted","Data":"00bd48d9e817eb207514ebf4bba0d28012a54ee29ee31f579aae1513687e3c73"} Oct 08 14:33:51 crc kubenswrapper[4735]: I1008 14:33:51.112466 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-5fa0-account-create-m4dtb" Oct 08 14:33:51 crc kubenswrapper[4735]: I1008 14:33:51.279430 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gmjb\" (UniqueName: \"kubernetes.io/projected/b61c0f13-2922-426f-a4e7-f88721437e91-kube-api-access-7gmjb\") pod \"b61c0f13-2922-426f-a4e7-f88721437e91\" (UID: \"b61c0f13-2922-426f-a4e7-f88721437e91\") " Oct 08 14:33:51 crc kubenswrapper[4735]: I1008 14:33:51.284364 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b61c0f13-2922-426f-a4e7-f88721437e91-kube-api-access-7gmjb" (OuterVolumeSpecName: "kube-api-access-7gmjb") pod "b61c0f13-2922-426f-a4e7-f88721437e91" (UID: "b61c0f13-2922-426f-a4e7-f88721437e91"). InnerVolumeSpecName "kube-api-access-7gmjb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:33:51 crc kubenswrapper[4735]: I1008 14:33:51.381817 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gmjb\" (UniqueName: \"kubernetes.io/projected/b61c0f13-2922-426f-a4e7-f88721437e91-kube-api-access-7gmjb\") on node \"crc\" DevicePath \"\"" Oct 08 14:33:51 crc kubenswrapper[4735]: I1008 14:33:51.774226 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-5fa0-account-create-m4dtb" event={"ID":"b61c0f13-2922-426f-a4e7-f88721437e91","Type":"ContainerDied","Data":"00bd48d9e817eb207514ebf4bba0d28012a54ee29ee31f579aae1513687e3c73"} Oct 08 14:33:51 crc kubenswrapper[4735]: I1008 14:33:51.774607 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00bd48d9e817eb207514ebf4bba0d28012a54ee29ee31f579aae1513687e3c73" Oct 08 14:33:51 crc kubenswrapper[4735]: I1008 14:33:51.774439 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-5fa0-account-create-m4dtb" Oct 08 14:33:54 crc kubenswrapper[4735]: I1008 14:33:54.234663 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-persistence-db-create-g28rx"] Oct 08 14:33:54 crc kubenswrapper[4735]: E1008 14:33:54.235434 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b61c0f13-2922-426f-a4e7-f88721437e91" containerName="mariadb-account-create" Oct 08 14:33:54 crc kubenswrapper[4735]: I1008 14:33:54.235448 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="b61c0f13-2922-426f-a4e7-f88721437e91" containerName="mariadb-account-create" Oct 08 14:33:54 crc kubenswrapper[4735]: I1008 14:33:54.235669 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="b61c0f13-2922-426f-a4e7-f88721437e91" containerName="mariadb-account-create" Oct 08 14:33:54 crc kubenswrapper[4735]: I1008 14:33:54.236380 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-g28rx" Oct 08 14:33:54 crc kubenswrapper[4735]: I1008 14:33:54.238876 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-g28rx"] Oct 08 14:33:54 crc kubenswrapper[4735]: I1008 14:33:54.249910 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xwcm\" (UniqueName: \"kubernetes.io/projected/ea6310b4-41f4-4bbe-b12f-cad91f7e549a-kube-api-access-8xwcm\") pod \"octavia-persistence-db-create-g28rx\" (UID: \"ea6310b4-41f4-4bbe-b12f-cad91f7e549a\") " pod="openstack/octavia-persistence-db-create-g28rx" Oct 08 14:33:54 crc kubenswrapper[4735]: I1008 14:33:54.353630 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xwcm\" (UniqueName: \"kubernetes.io/projected/ea6310b4-41f4-4bbe-b12f-cad91f7e549a-kube-api-access-8xwcm\") pod \"octavia-persistence-db-create-g28rx\" (UID: 
\"ea6310b4-41f4-4bbe-b12f-cad91f7e549a\") " pod="openstack/octavia-persistence-db-create-g28rx" Oct 08 14:33:54 crc kubenswrapper[4735]: I1008 14:33:54.375129 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xwcm\" (UniqueName: \"kubernetes.io/projected/ea6310b4-41f4-4bbe-b12f-cad91f7e549a-kube-api-access-8xwcm\") pod \"octavia-persistence-db-create-g28rx\" (UID: \"ea6310b4-41f4-4bbe-b12f-cad91f7e549a\") " pod="openstack/octavia-persistence-db-create-g28rx" Oct 08 14:33:54 crc kubenswrapper[4735]: I1008 14:33:54.553299 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-g28rx" Oct 08 14:33:54 crc kubenswrapper[4735]: I1008 14:33:54.815509 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-g28rx"] Oct 08 14:33:54 crc kubenswrapper[4735]: W1008 14:33:54.823376 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea6310b4_41f4_4bbe_b12f_cad91f7e549a.slice/crio-c12152bd59ad833937e1ffad66bed0d8eeec92474430bbda6c9c5cd7a791e600 WatchSource:0}: Error finding container c12152bd59ad833937e1ffad66bed0d8eeec92474430bbda6c9c5cd7a791e600: Status 404 returned error can't find the container with id c12152bd59ad833937e1ffad66bed0d8eeec92474430bbda6c9c5cd7a791e600 Oct 08 14:33:55 crc kubenswrapper[4735]: I1008 14:33:55.822207 4735 generic.go:334] "Generic (PLEG): container finished" podID="ea6310b4-41f4-4bbe-b12f-cad91f7e549a" containerID="31582749a417ae53c1c895566ac150b241900bbdc069db02fa3fc4230d497a72" exitCode=0 Oct 08 14:33:55 crc kubenswrapper[4735]: I1008 14:33:55.822321 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-g28rx" event={"ID":"ea6310b4-41f4-4bbe-b12f-cad91f7e549a","Type":"ContainerDied","Data":"31582749a417ae53c1c895566ac150b241900bbdc069db02fa3fc4230d497a72"} Oct 08 14:33:55 
crc kubenswrapper[4735]: I1008 14:33:55.822540 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-g28rx" event={"ID":"ea6310b4-41f4-4bbe-b12f-cad91f7e549a","Type":"ContainerStarted","Data":"c12152bd59ad833937e1ffad66bed0d8eeec92474430bbda6c9c5cd7a791e600"} Oct 08 14:33:57 crc kubenswrapper[4735]: I1008 14:33:57.200591 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-g28rx" Oct 08 14:33:57 crc kubenswrapper[4735]: I1008 14:33:57.321006 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xwcm\" (UniqueName: \"kubernetes.io/projected/ea6310b4-41f4-4bbe-b12f-cad91f7e549a-kube-api-access-8xwcm\") pod \"ea6310b4-41f4-4bbe-b12f-cad91f7e549a\" (UID: \"ea6310b4-41f4-4bbe-b12f-cad91f7e549a\") " Oct 08 14:33:57 crc kubenswrapper[4735]: I1008 14:33:57.330827 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea6310b4-41f4-4bbe-b12f-cad91f7e549a-kube-api-access-8xwcm" (OuterVolumeSpecName: "kube-api-access-8xwcm") pod "ea6310b4-41f4-4bbe-b12f-cad91f7e549a" (UID: "ea6310b4-41f4-4bbe-b12f-cad91f7e549a"). InnerVolumeSpecName "kube-api-access-8xwcm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:33:57 crc kubenswrapper[4735]: I1008 14:33:57.423044 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xwcm\" (UniqueName: \"kubernetes.io/projected/ea6310b4-41f4-4bbe-b12f-cad91f7e549a-kube-api-access-8xwcm\") on node \"crc\" DevicePath \"\"" Oct 08 14:33:57 crc kubenswrapper[4735]: I1008 14:33:57.844124 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-g28rx" event={"ID":"ea6310b4-41f4-4bbe-b12f-cad91f7e549a","Type":"ContainerDied","Data":"c12152bd59ad833937e1ffad66bed0d8eeec92474430bbda6c9c5cd7a791e600"} Oct 08 14:33:57 crc kubenswrapper[4735]: I1008 14:33:57.844369 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c12152bd59ad833937e1ffad66bed0d8eeec92474430bbda6c9c5cd7a791e600" Oct 08 14:33:57 crc kubenswrapper[4735]: I1008 14:33:57.844173 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-g28rx" Oct 08 14:34:05 crc kubenswrapper[4735]: I1008 14:34:05.396882 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-ac36-account-create-6fs96"] Oct 08 14:34:05 crc kubenswrapper[4735]: E1008 14:34:05.397623 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea6310b4-41f4-4bbe-b12f-cad91f7e549a" containerName="mariadb-database-create" Oct 08 14:34:05 crc kubenswrapper[4735]: I1008 14:34:05.397637 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea6310b4-41f4-4bbe-b12f-cad91f7e549a" containerName="mariadb-database-create" Oct 08 14:34:05 crc kubenswrapper[4735]: I1008 14:34:05.397823 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea6310b4-41f4-4bbe-b12f-cad91f7e549a" containerName="mariadb-database-create" Oct 08 14:34:05 crc kubenswrapper[4735]: I1008 14:34:05.398400 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-ac36-account-create-6fs96" Oct 08 14:34:05 crc kubenswrapper[4735]: I1008 14:34:05.400623 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-persistence-db-secret" Oct 08 14:34:05 crc kubenswrapper[4735]: I1008 14:34:05.431793 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-ac36-account-create-6fs96"] Oct 08 14:34:05 crc kubenswrapper[4735]: I1008 14:34:05.478021 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cvjc\" (UniqueName: \"kubernetes.io/projected/48c567dd-7884-4d1b-aa0b-6f1c14704612-kube-api-access-4cvjc\") pod \"octavia-ac36-account-create-6fs96\" (UID: \"48c567dd-7884-4d1b-aa0b-6f1c14704612\") " pod="openstack/octavia-ac36-account-create-6fs96" Oct 08 14:34:05 crc kubenswrapper[4735]: I1008 14:34:05.580595 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cvjc\" (UniqueName: \"kubernetes.io/projected/48c567dd-7884-4d1b-aa0b-6f1c14704612-kube-api-access-4cvjc\") pod \"octavia-ac36-account-create-6fs96\" (UID: \"48c567dd-7884-4d1b-aa0b-6f1c14704612\") " pod="openstack/octavia-ac36-account-create-6fs96" Oct 08 14:34:05 crc kubenswrapper[4735]: I1008 14:34:05.608784 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cvjc\" (UniqueName: \"kubernetes.io/projected/48c567dd-7884-4d1b-aa0b-6f1c14704612-kube-api-access-4cvjc\") pod \"octavia-ac36-account-create-6fs96\" (UID: \"48c567dd-7884-4d1b-aa0b-6f1c14704612\") " pod="openstack/octavia-ac36-account-create-6fs96" Oct 08 14:34:05 crc kubenswrapper[4735]: I1008 14:34:05.747928 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-ac36-account-create-6fs96" Oct 08 14:34:06 crc kubenswrapper[4735]: I1008 14:34:06.275266 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-ac36-account-create-6fs96"] Oct 08 14:34:06 crc kubenswrapper[4735]: I1008 14:34:06.939280 4735 generic.go:334] "Generic (PLEG): container finished" podID="48c567dd-7884-4d1b-aa0b-6f1c14704612" containerID="ed1432aa81a6b69b15bd099f9048535fcef2c81fd832c21ce73435b398b00d2e" exitCode=0 Oct 08 14:34:06 crc kubenswrapper[4735]: I1008 14:34:06.939323 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-ac36-account-create-6fs96" event={"ID":"48c567dd-7884-4d1b-aa0b-6f1c14704612","Type":"ContainerDied","Data":"ed1432aa81a6b69b15bd099f9048535fcef2c81fd832c21ce73435b398b00d2e"} Oct 08 14:34:06 crc kubenswrapper[4735]: I1008 14:34:06.939348 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-ac36-account-create-6fs96" event={"ID":"48c567dd-7884-4d1b-aa0b-6f1c14704612","Type":"ContainerStarted","Data":"85c4ace795eb3a41c807444f72654d774ade26749caf3bbfcd017d74fc5994dc"} Oct 08 14:34:08 crc kubenswrapper[4735]: I1008 14:34:08.384656 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-ac36-account-create-6fs96" Oct 08 14:34:08 crc kubenswrapper[4735]: I1008 14:34:08.478208 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cvjc\" (UniqueName: \"kubernetes.io/projected/48c567dd-7884-4d1b-aa0b-6f1c14704612-kube-api-access-4cvjc\") pod \"48c567dd-7884-4d1b-aa0b-6f1c14704612\" (UID: \"48c567dd-7884-4d1b-aa0b-6f1c14704612\") " Oct 08 14:34:08 crc kubenswrapper[4735]: I1008 14:34:08.484280 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48c567dd-7884-4d1b-aa0b-6f1c14704612-kube-api-access-4cvjc" (OuterVolumeSpecName: "kube-api-access-4cvjc") pod "48c567dd-7884-4d1b-aa0b-6f1c14704612" (UID: "48c567dd-7884-4d1b-aa0b-6f1c14704612"). InnerVolumeSpecName "kube-api-access-4cvjc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:34:08 crc kubenswrapper[4735]: I1008 14:34:08.528567 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-8jxgq" podUID="e962c965-3794-4e43-9e2d-a34b94810817" containerName="ovn-controller" probeResult="failure" output=< Oct 08 14:34:08 crc kubenswrapper[4735]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 08 14:34:08 crc kubenswrapper[4735]: > Oct 08 14:34:08 crc kubenswrapper[4735]: I1008 14:34:08.574003 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-z897z" Oct 08 14:34:08 crc kubenswrapper[4735]: I1008 14:34:08.576328 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-z897z" Oct 08 14:34:08 crc kubenswrapper[4735]: I1008 14:34:08.580784 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4cvjc\" (UniqueName: \"kubernetes.io/projected/48c567dd-7884-4d1b-aa0b-6f1c14704612-kube-api-access-4cvjc\") on node \"crc\" DevicePath \"\"" Oct 08 14:34:08 
crc kubenswrapper[4735]: I1008 14:34:08.701146 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-8jxgq-config-7xcxp"] Oct 08 14:34:08 crc kubenswrapper[4735]: E1008 14:34:08.701764 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48c567dd-7884-4d1b-aa0b-6f1c14704612" containerName="mariadb-account-create" Oct 08 14:34:08 crc kubenswrapper[4735]: I1008 14:34:08.701795 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="48c567dd-7884-4d1b-aa0b-6f1c14704612" containerName="mariadb-account-create" Oct 08 14:34:08 crc kubenswrapper[4735]: I1008 14:34:08.702158 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="48c567dd-7884-4d1b-aa0b-6f1c14704612" containerName="mariadb-account-create" Oct 08 14:34:08 crc kubenswrapper[4735]: I1008 14:34:08.703232 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-8jxgq-config-7xcxp" Oct 08 14:34:08 crc kubenswrapper[4735]: I1008 14:34:08.710941 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-8jxgq-config-7xcxp"] Oct 08 14:34:08 crc kubenswrapper[4735]: I1008 14:34:08.711225 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 08 14:34:08 crc kubenswrapper[4735]: I1008 14:34:08.783590 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5v9z\" (UniqueName: \"kubernetes.io/projected/83c22748-91a9-4d05-b4dd-aa6327aaee03-kube-api-access-j5v9z\") pod \"ovn-controller-8jxgq-config-7xcxp\" (UID: \"83c22748-91a9-4d05-b4dd-aa6327aaee03\") " pod="openstack/ovn-controller-8jxgq-config-7xcxp" Oct 08 14:34:08 crc kubenswrapper[4735]: I1008 14:34:08.783640 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/83c22748-91a9-4d05-b4dd-aa6327aaee03-scripts\") 
pod \"ovn-controller-8jxgq-config-7xcxp\" (UID: \"83c22748-91a9-4d05-b4dd-aa6327aaee03\") " pod="openstack/ovn-controller-8jxgq-config-7xcxp" Oct 08 14:34:08 crc kubenswrapper[4735]: I1008 14:34:08.783682 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/83c22748-91a9-4d05-b4dd-aa6327aaee03-var-log-ovn\") pod \"ovn-controller-8jxgq-config-7xcxp\" (UID: \"83c22748-91a9-4d05-b4dd-aa6327aaee03\") " pod="openstack/ovn-controller-8jxgq-config-7xcxp" Oct 08 14:34:08 crc kubenswrapper[4735]: I1008 14:34:08.783756 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/83c22748-91a9-4d05-b4dd-aa6327aaee03-var-run\") pod \"ovn-controller-8jxgq-config-7xcxp\" (UID: \"83c22748-91a9-4d05-b4dd-aa6327aaee03\") " pod="openstack/ovn-controller-8jxgq-config-7xcxp" Oct 08 14:34:08 crc kubenswrapper[4735]: I1008 14:34:08.783820 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/83c22748-91a9-4d05-b4dd-aa6327aaee03-additional-scripts\") pod \"ovn-controller-8jxgq-config-7xcxp\" (UID: \"83c22748-91a9-4d05-b4dd-aa6327aaee03\") " pod="openstack/ovn-controller-8jxgq-config-7xcxp" Oct 08 14:34:08 crc kubenswrapper[4735]: I1008 14:34:08.784044 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/83c22748-91a9-4d05-b4dd-aa6327aaee03-var-run-ovn\") pod \"ovn-controller-8jxgq-config-7xcxp\" (UID: \"83c22748-91a9-4d05-b4dd-aa6327aaee03\") " pod="openstack/ovn-controller-8jxgq-config-7xcxp" Oct 08 14:34:08 crc kubenswrapper[4735]: I1008 14:34:08.886327 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/83c22748-91a9-4d05-b4dd-aa6327aaee03-scripts\") pod \"ovn-controller-8jxgq-config-7xcxp\" (UID: \"83c22748-91a9-4d05-b4dd-aa6327aaee03\") " pod="openstack/ovn-controller-8jxgq-config-7xcxp" Oct 08 14:34:08 crc kubenswrapper[4735]: I1008 14:34:08.886706 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/83c22748-91a9-4d05-b4dd-aa6327aaee03-var-log-ovn\") pod \"ovn-controller-8jxgq-config-7xcxp\" (UID: \"83c22748-91a9-4d05-b4dd-aa6327aaee03\") " pod="openstack/ovn-controller-8jxgq-config-7xcxp" Oct 08 14:34:08 crc kubenswrapper[4735]: I1008 14:34:08.886865 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/83c22748-91a9-4d05-b4dd-aa6327aaee03-var-run\") pod \"ovn-controller-8jxgq-config-7xcxp\" (UID: \"83c22748-91a9-4d05-b4dd-aa6327aaee03\") " pod="openstack/ovn-controller-8jxgq-config-7xcxp" Oct 08 14:34:08 crc kubenswrapper[4735]: I1008 14:34:08.886980 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/83c22748-91a9-4d05-b4dd-aa6327aaee03-var-log-ovn\") pod \"ovn-controller-8jxgq-config-7xcxp\" (UID: \"83c22748-91a9-4d05-b4dd-aa6327aaee03\") " pod="openstack/ovn-controller-8jxgq-config-7xcxp" Oct 08 14:34:08 crc kubenswrapper[4735]: I1008 14:34:08.887120 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/83c22748-91a9-4d05-b4dd-aa6327aaee03-var-run\") pod \"ovn-controller-8jxgq-config-7xcxp\" (UID: \"83c22748-91a9-4d05-b4dd-aa6327aaee03\") " pod="openstack/ovn-controller-8jxgq-config-7xcxp" Oct 08 14:34:08 crc kubenswrapper[4735]: I1008 14:34:08.887130 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: 
\"kubernetes.io/configmap/83c22748-91a9-4d05-b4dd-aa6327aaee03-additional-scripts\") pod \"ovn-controller-8jxgq-config-7xcxp\" (UID: \"83c22748-91a9-4d05-b4dd-aa6327aaee03\") " pod="openstack/ovn-controller-8jxgq-config-7xcxp" Oct 08 14:34:08 crc kubenswrapper[4735]: I1008 14:34:08.887390 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/83c22748-91a9-4d05-b4dd-aa6327aaee03-var-run-ovn\") pod \"ovn-controller-8jxgq-config-7xcxp\" (UID: \"83c22748-91a9-4d05-b4dd-aa6327aaee03\") " pod="openstack/ovn-controller-8jxgq-config-7xcxp" Oct 08 14:34:08 crc kubenswrapper[4735]: I1008 14:34:08.887574 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5v9z\" (UniqueName: \"kubernetes.io/projected/83c22748-91a9-4d05-b4dd-aa6327aaee03-kube-api-access-j5v9z\") pod \"ovn-controller-8jxgq-config-7xcxp\" (UID: \"83c22748-91a9-4d05-b4dd-aa6327aaee03\") " pod="openstack/ovn-controller-8jxgq-config-7xcxp" Oct 08 14:34:08 crc kubenswrapper[4735]: I1008 14:34:08.887660 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/83c22748-91a9-4d05-b4dd-aa6327aaee03-var-run-ovn\") pod \"ovn-controller-8jxgq-config-7xcxp\" (UID: \"83c22748-91a9-4d05-b4dd-aa6327aaee03\") " pod="openstack/ovn-controller-8jxgq-config-7xcxp" Oct 08 14:34:08 crc kubenswrapper[4735]: I1008 14:34:08.887778 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/83c22748-91a9-4d05-b4dd-aa6327aaee03-additional-scripts\") pod \"ovn-controller-8jxgq-config-7xcxp\" (UID: \"83c22748-91a9-4d05-b4dd-aa6327aaee03\") " pod="openstack/ovn-controller-8jxgq-config-7xcxp" Oct 08 14:34:08 crc kubenswrapper[4735]: I1008 14:34:08.888573 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/83c22748-91a9-4d05-b4dd-aa6327aaee03-scripts\") pod \"ovn-controller-8jxgq-config-7xcxp\" (UID: \"83c22748-91a9-4d05-b4dd-aa6327aaee03\") " pod="openstack/ovn-controller-8jxgq-config-7xcxp" Oct 08 14:34:08 crc kubenswrapper[4735]: I1008 14:34:08.914244 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5v9z\" (UniqueName: \"kubernetes.io/projected/83c22748-91a9-4d05-b4dd-aa6327aaee03-kube-api-access-j5v9z\") pod \"ovn-controller-8jxgq-config-7xcxp\" (UID: \"83c22748-91a9-4d05-b4dd-aa6327aaee03\") " pod="openstack/ovn-controller-8jxgq-config-7xcxp" Oct 08 14:34:08 crc kubenswrapper[4735]: I1008 14:34:08.958115 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-ac36-account-create-6fs96" Oct 08 14:34:08 crc kubenswrapper[4735]: I1008 14:34:08.958808 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-ac36-account-create-6fs96" event={"ID":"48c567dd-7884-4d1b-aa0b-6f1c14704612","Type":"ContainerDied","Data":"85c4ace795eb3a41c807444f72654d774ade26749caf3bbfcd017d74fc5994dc"} Oct 08 14:34:08 crc kubenswrapper[4735]: I1008 14:34:08.958847 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85c4ace795eb3a41c807444f72654d774ade26749caf3bbfcd017d74fc5994dc" Oct 08 14:34:09 crc kubenswrapper[4735]: I1008 14:34:09.034805 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-8jxgq-config-7xcxp" Oct 08 14:34:09 crc kubenswrapper[4735]: I1008 14:34:09.569736 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-8jxgq-config-7xcxp"] Oct 08 14:34:09 crc kubenswrapper[4735]: I1008 14:34:09.969242 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-8jxgq-config-7xcxp" event={"ID":"83c22748-91a9-4d05-b4dd-aa6327aaee03","Type":"ContainerStarted","Data":"7578f4de0aee4f8f1b7587793b16ae16e3d132fb2dda07dbfe782d0bef4bc7d3"} Oct 08 14:34:09 crc kubenswrapper[4735]: I1008 14:34:09.969567 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-8jxgq-config-7xcxp" event={"ID":"83c22748-91a9-4d05-b4dd-aa6327aaee03","Type":"ContainerStarted","Data":"f5ac748a7237f7bf81e12574ba9dca171010bf9b1798a48317456c7e907773eb"} Oct 08 14:34:09 crc kubenswrapper[4735]: I1008 14:34:09.990702 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-8jxgq-config-7xcxp" podStartSLOduration=1.990680683 podStartE2EDuration="1.990680683s" podCreationTimestamp="2025-10-08 14:34:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:34:09.984803113 +0000 UTC m=+5899.793107667" watchObservedRunningTime="2025-10-08 14:34:09.990680683 +0000 UTC m=+5899.798985247" Oct 08 14:34:10 crc kubenswrapper[4735]: I1008 14:34:10.983072 4735 generic.go:334] "Generic (PLEG): container finished" podID="83c22748-91a9-4d05-b4dd-aa6327aaee03" containerID="7578f4de0aee4f8f1b7587793b16ae16e3d132fb2dda07dbfe782d0bef4bc7d3" exitCode=0 Oct 08 14:34:10 crc kubenswrapper[4735]: I1008 14:34:10.983212 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-8jxgq-config-7xcxp" 
event={"ID":"83c22748-91a9-4d05-b4dd-aa6327aaee03","Type":"ContainerDied","Data":"7578f4de0aee4f8f1b7587793b16ae16e3d132fb2dda07dbfe782d0bef4bc7d3"} Oct 08 14:34:12 crc kubenswrapper[4735]: I1008 14:34:12.323738 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-api-854f6bb796-5qdwc"] Oct 08 14:34:12 crc kubenswrapper[4735]: I1008 14:34:12.326281 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-854f6bb796-5qdwc" Oct 08 14:34:12 crc kubenswrapper[4735]: I1008 14:34:12.330380 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-scripts" Oct 08 14:34:12 crc kubenswrapper[4735]: I1008 14:34:12.330736 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-octavia-dockercfg-782lh" Oct 08 14:34:12 crc kubenswrapper[4735]: I1008 14:34:12.330908 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-config-data" Oct 08 14:34:12 crc kubenswrapper[4735]: I1008 14:34:12.346133 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-854f6bb796-5qdwc"] Oct 08 14:34:12 crc kubenswrapper[4735]: I1008 14:34:12.369117 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-8jxgq-config-7xcxp" Oct 08 14:34:12 crc kubenswrapper[4735]: I1008 14:34:12.474382 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/83c22748-91a9-4d05-b4dd-aa6327aaee03-var-log-ovn\") pod \"83c22748-91a9-4d05-b4dd-aa6327aaee03\" (UID: \"83c22748-91a9-4d05-b4dd-aa6327aaee03\") " Oct 08 14:34:12 crc kubenswrapper[4735]: I1008 14:34:12.474453 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5v9z\" (UniqueName: \"kubernetes.io/projected/83c22748-91a9-4d05-b4dd-aa6327aaee03-kube-api-access-j5v9z\") pod \"83c22748-91a9-4d05-b4dd-aa6327aaee03\" (UID: \"83c22748-91a9-4d05-b4dd-aa6327aaee03\") " Oct 08 14:34:12 crc kubenswrapper[4735]: I1008 14:34:12.474615 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/83c22748-91a9-4d05-b4dd-aa6327aaee03-scripts\") pod \"83c22748-91a9-4d05-b4dd-aa6327aaee03\" (UID: \"83c22748-91a9-4d05-b4dd-aa6327aaee03\") " Oct 08 14:34:12 crc kubenswrapper[4735]: I1008 14:34:12.474647 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/83c22748-91a9-4d05-b4dd-aa6327aaee03-var-run\") pod \"83c22748-91a9-4d05-b4dd-aa6327aaee03\" (UID: \"83c22748-91a9-4d05-b4dd-aa6327aaee03\") " Oct 08 14:34:12 crc kubenswrapper[4735]: I1008 14:34:12.474687 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/83c22748-91a9-4d05-b4dd-aa6327aaee03-var-run-ovn\") pod \"83c22748-91a9-4d05-b4dd-aa6327aaee03\" (UID: \"83c22748-91a9-4d05-b4dd-aa6327aaee03\") " Oct 08 14:34:12 crc kubenswrapper[4735]: I1008 14:34:12.474753 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" 
(UniqueName: \"kubernetes.io/configmap/83c22748-91a9-4d05-b4dd-aa6327aaee03-additional-scripts\") pod \"83c22748-91a9-4d05-b4dd-aa6327aaee03\" (UID: \"83c22748-91a9-4d05-b4dd-aa6327aaee03\") " Oct 08 14:34:12 crc kubenswrapper[4735]: I1008 14:34:12.475053 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/83c22748-91a9-4d05-b4dd-aa6327aaee03-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "83c22748-91a9-4d05-b4dd-aa6327aaee03" (UID: "83c22748-91a9-4d05-b4dd-aa6327aaee03"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 14:34:12 crc kubenswrapper[4735]: I1008 14:34:12.475134 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/83c22748-91a9-4d05-b4dd-aa6327aaee03-var-run" (OuterVolumeSpecName: "var-run") pod "83c22748-91a9-4d05-b4dd-aa6327aaee03" (UID: "83c22748-91a9-4d05-b4dd-aa6327aaee03"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 14:34:12 crc kubenswrapper[4735]: I1008 14:34:12.475432 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/83c22748-91a9-4d05-b4dd-aa6327aaee03-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "83c22748-91a9-4d05-b4dd-aa6327aaee03" (UID: "83c22748-91a9-4d05-b4dd-aa6327aaee03"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 14:34:12 crc kubenswrapper[4735]: I1008 14:34:12.475314 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0667386c-3ac2-4d41-9ead-7c87d40e8b1b-combined-ca-bundle\") pod \"octavia-api-854f6bb796-5qdwc\" (UID: \"0667386c-3ac2-4d41-9ead-7c87d40e8b1b\") " pod="openstack/octavia-api-854f6bb796-5qdwc" Oct 08 14:34:12 crc kubenswrapper[4735]: I1008 14:34:12.475999 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83c22748-91a9-4d05-b4dd-aa6327aaee03-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "83c22748-91a9-4d05-b4dd-aa6327aaee03" (UID: "83c22748-91a9-4d05-b4dd-aa6327aaee03"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:34:12 crc kubenswrapper[4735]: I1008 14:34:12.476145 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/0667386c-3ac2-4d41-9ead-7c87d40e8b1b-config-data-merged\") pod \"octavia-api-854f6bb796-5qdwc\" (UID: \"0667386c-3ac2-4d41-9ead-7c87d40e8b1b\") " pod="openstack/octavia-api-854f6bb796-5qdwc" Oct 08 14:34:12 crc kubenswrapper[4735]: I1008 14:34:12.476236 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83c22748-91a9-4d05-b4dd-aa6327aaee03-scripts" (OuterVolumeSpecName: "scripts") pod "83c22748-91a9-4d05-b4dd-aa6327aaee03" (UID: "83c22748-91a9-4d05-b4dd-aa6327aaee03"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:34:12 crc kubenswrapper[4735]: I1008 14:34:12.476249 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0667386c-3ac2-4d41-9ead-7c87d40e8b1b-scripts\") pod \"octavia-api-854f6bb796-5qdwc\" (UID: \"0667386c-3ac2-4d41-9ead-7c87d40e8b1b\") " pod="openstack/octavia-api-854f6bb796-5qdwc" Oct 08 14:34:12 crc kubenswrapper[4735]: I1008 14:34:12.476399 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0667386c-3ac2-4d41-9ead-7c87d40e8b1b-config-data\") pod \"octavia-api-854f6bb796-5qdwc\" (UID: \"0667386c-3ac2-4d41-9ead-7c87d40e8b1b\") " pod="openstack/octavia-api-854f6bb796-5qdwc" Oct 08 14:34:12 crc kubenswrapper[4735]: I1008 14:34:12.476564 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/0667386c-3ac2-4d41-9ead-7c87d40e8b1b-octavia-run\") pod \"octavia-api-854f6bb796-5qdwc\" (UID: \"0667386c-3ac2-4d41-9ead-7c87d40e8b1b\") " pod="openstack/octavia-api-854f6bb796-5qdwc" Oct 08 14:34:12 crc kubenswrapper[4735]: I1008 14:34:12.476713 4735 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/83c22748-91a9-4d05-b4dd-aa6327aaee03-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 08 14:34:12 crc kubenswrapper[4735]: I1008 14:34:12.476741 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/83c22748-91a9-4d05-b4dd-aa6327aaee03-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 14:34:12 crc kubenswrapper[4735]: I1008 14:34:12.476749 4735 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/83c22748-91a9-4d05-b4dd-aa6327aaee03-var-run\") on node 
\"crc\" DevicePath \"\"" Oct 08 14:34:12 crc kubenswrapper[4735]: I1008 14:34:12.476758 4735 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/83c22748-91a9-4d05-b4dd-aa6327aaee03-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 08 14:34:12 crc kubenswrapper[4735]: I1008 14:34:12.476766 4735 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/83c22748-91a9-4d05-b4dd-aa6327aaee03-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 14:34:12 crc kubenswrapper[4735]: I1008 14:34:12.479972 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83c22748-91a9-4d05-b4dd-aa6327aaee03-kube-api-access-j5v9z" (OuterVolumeSpecName: "kube-api-access-j5v9z") pod "83c22748-91a9-4d05-b4dd-aa6327aaee03" (UID: "83c22748-91a9-4d05-b4dd-aa6327aaee03"). InnerVolumeSpecName "kube-api-access-j5v9z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:34:12 crc kubenswrapper[4735]: I1008 14:34:12.578461 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/0667386c-3ac2-4d41-9ead-7c87d40e8b1b-config-data-merged\") pod \"octavia-api-854f6bb796-5qdwc\" (UID: \"0667386c-3ac2-4d41-9ead-7c87d40e8b1b\") " pod="openstack/octavia-api-854f6bb796-5qdwc" Oct 08 14:34:12 crc kubenswrapper[4735]: I1008 14:34:12.578988 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/0667386c-3ac2-4d41-9ead-7c87d40e8b1b-config-data-merged\") pod \"octavia-api-854f6bb796-5qdwc\" (UID: \"0667386c-3ac2-4d41-9ead-7c87d40e8b1b\") " pod="openstack/octavia-api-854f6bb796-5qdwc" Oct 08 14:34:12 crc kubenswrapper[4735]: I1008 14:34:12.579211 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/0667386c-3ac2-4d41-9ead-7c87d40e8b1b-scripts\") pod \"octavia-api-854f6bb796-5qdwc\" (UID: \"0667386c-3ac2-4d41-9ead-7c87d40e8b1b\") " pod="openstack/octavia-api-854f6bb796-5qdwc" Oct 08 14:34:12 crc kubenswrapper[4735]: I1008 14:34:12.579451 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0667386c-3ac2-4d41-9ead-7c87d40e8b1b-config-data\") pod \"octavia-api-854f6bb796-5qdwc\" (UID: \"0667386c-3ac2-4d41-9ead-7c87d40e8b1b\") " pod="openstack/octavia-api-854f6bb796-5qdwc" Oct 08 14:34:12 crc kubenswrapper[4735]: I1008 14:34:12.579622 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/0667386c-3ac2-4d41-9ead-7c87d40e8b1b-octavia-run\") pod \"octavia-api-854f6bb796-5qdwc\" (UID: \"0667386c-3ac2-4d41-9ead-7c87d40e8b1b\") " pod="openstack/octavia-api-854f6bb796-5qdwc" Oct 08 14:34:12 crc kubenswrapper[4735]: I1008 14:34:12.579740 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0667386c-3ac2-4d41-9ead-7c87d40e8b1b-combined-ca-bundle\") pod \"octavia-api-854f6bb796-5qdwc\" (UID: \"0667386c-3ac2-4d41-9ead-7c87d40e8b1b\") " pod="openstack/octavia-api-854f6bb796-5qdwc" Oct 08 14:34:12 crc kubenswrapper[4735]: I1008 14:34:12.579928 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5v9z\" (UniqueName: \"kubernetes.io/projected/83c22748-91a9-4d05-b4dd-aa6327aaee03-kube-api-access-j5v9z\") on node \"crc\" DevicePath \"\"" Oct 08 14:34:12 crc kubenswrapper[4735]: I1008 14:34:12.580188 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/0667386c-3ac2-4d41-9ead-7c87d40e8b1b-octavia-run\") pod \"octavia-api-854f6bb796-5qdwc\" (UID: \"0667386c-3ac2-4d41-9ead-7c87d40e8b1b\") " 
pod="openstack/octavia-api-854f6bb796-5qdwc" Oct 08 14:34:12 crc kubenswrapper[4735]: I1008 14:34:12.584246 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0667386c-3ac2-4d41-9ead-7c87d40e8b1b-combined-ca-bundle\") pod \"octavia-api-854f6bb796-5qdwc\" (UID: \"0667386c-3ac2-4d41-9ead-7c87d40e8b1b\") " pod="openstack/octavia-api-854f6bb796-5qdwc" Oct 08 14:34:12 crc kubenswrapper[4735]: I1008 14:34:12.589341 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0667386c-3ac2-4d41-9ead-7c87d40e8b1b-scripts\") pod \"octavia-api-854f6bb796-5qdwc\" (UID: \"0667386c-3ac2-4d41-9ead-7c87d40e8b1b\") " pod="openstack/octavia-api-854f6bb796-5qdwc" Oct 08 14:34:12 crc kubenswrapper[4735]: I1008 14:34:12.590825 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0667386c-3ac2-4d41-9ead-7c87d40e8b1b-config-data\") pod \"octavia-api-854f6bb796-5qdwc\" (UID: \"0667386c-3ac2-4d41-9ead-7c87d40e8b1b\") " pod="openstack/octavia-api-854f6bb796-5qdwc" Oct 08 14:34:12 crc kubenswrapper[4735]: I1008 14:34:12.684989 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-api-854f6bb796-5qdwc" Oct 08 14:34:13 crc kubenswrapper[4735]: I1008 14:34:13.024149 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-8jxgq-config-7xcxp" event={"ID":"83c22748-91a9-4d05-b4dd-aa6327aaee03","Type":"ContainerDied","Data":"f5ac748a7237f7bf81e12574ba9dca171010bf9b1798a48317456c7e907773eb"} Oct 08 14:34:13 crc kubenswrapper[4735]: I1008 14:34:13.024571 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5ac748a7237f7bf81e12574ba9dca171010bf9b1798a48317456c7e907773eb" Oct 08 14:34:13 crc kubenswrapper[4735]: I1008 14:34:13.024304 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-8jxgq-config-7xcxp" Oct 08 14:34:13 crc kubenswrapper[4735]: I1008 14:34:13.078128 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-8jxgq-config-7xcxp"] Oct 08 14:34:13 crc kubenswrapper[4735]: I1008 14:34:13.089608 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-8jxgq-config-7xcxp"] Oct 08 14:34:13 crc kubenswrapper[4735]: I1008 14:34:13.172624 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-854f6bb796-5qdwc"] Oct 08 14:34:13 crc kubenswrapper[4735]: W1008 14:34:13.178675 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0667386c_3ac2_4d41_9ead_7c87d40e8b1b.slice/crio-2d60c72bc98b8dd56a2733ad727064b1f247ef8b27d359ed419e966b98b01c13 WatchSource:0}: Error finding container 2d60c72bc98b8dd56a2733ad727064b1f247ef8b27d359ed419e966b98b01c13: Status 404 returned error can't find the container with id 2d60c72bc98b8dd56a2733ad727064b1f247ef8b27d359ed419e966b98b01c13 Oct 08 14:34:13 crc kubenswrapper[4735]: I1008 14:34:13.181530 4735 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Oct 08 14:34:13 crc kubenswrapper[4735]: I1008 14:34:13.548931 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-8jxgq" Oct 08 14:34:14 crc kubenswrapper[4735]: I1008 14:34:14.039680 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-854f6bb796-5qdwc" event={"ID":"0667386c-3ac2-4d41-9ead-7c87d40e8b1b","Type":"ContainerStarted","Data":"2d60c72bc98b8dd56a2733ad727064b1f247ef8b27d359ed419e966b98b01c13"} Oct 08 14:34:14 crc kubenswrapper[4735]: I1008 14:34:14.909738 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83c22748-91a9-4d05-b4dd-aa6327aaee03" path="/var/lib/kubelet/pods/83c22748-91a9-4d05-b4dd-aa6327aaee03/volumes" Oct 08 14:34:22 crc kubenswrapper[4735]: I1008 14:34:22.135512 4735 generic.go:334] "Generic (PLEG): container finished" podID="0667386c-3ac2-4d41-9ead-7c87d40e8b1b" containerID="5899ed575d0b3e13409cba8a823c7959ac743d0d5a457d7642a972b1b7de22da" exitCode=0 Oct 08 14:34:22 crc kubenswrapper[4735]: I1008 14:34:22.135634 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-854f6bb796-5qdwc" event={"ID":"0667386c-3ac2-4d41-9ead-7c87d40e8b1b","Type":"ContainerDied","Data":"5899ed575d0b3e13409cba8a823c7959ac743d0d5a457d7642a972b1b7de22da"} Oct 08 14:34:23 crc kubenswrapper[4735]: I1008 14:34:23.150218 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-854f6bb796-5qdwc" event={"ID":"0667386c-3ac2-4d41-9ead-7c87d40e8b1b","Type":"ContainerStarted","Data":"1f27b263952c7f8f6484252a65b67e05e1e34b8791e6c8cf1be1c0995a0d434b"} Oct 08 14:34:23 crc kubenswrapper[4735]: I1008 14:34:23.150774 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-854f6bb796-5qdwc" event={"ID":"0667386c-3ac2-4d41-9ead-7c87d40e8b1b","Type":"ContainerStarted","Data":"04283cece81ac5840bb040ed27667910ac4b1c383502bd39e10c1a70853fd273"} Oct 08 
14:34:23 crc kubenswrapper[4735]: I1008 14:34:23.150840 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-854f6bb796-5qdwc" Oct 08 14:34:23 crc kubenswrapper[4735]: I1008 14:34:23.150867 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-854f6bb796-5qdwc" Oct 08 14:34:23 crc kubenswrapper[4735]: I1008 14:34:23.181253 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-api-854f6bb796-5qdwc" podStartSLOduration=2.756505624 podStartE2EDuration="11.181230191s" podCreationTimestamp="2025-10-08 14:34:12 +0000 UTC" firstStartedPulling="2025-10-08 14:34:13.181245916 +0000 UTC m=+5902.989550460" lastFinishedPulling="2025-10-08 14:34:21.605970483 +0000 UTC m=+5911.414275027" observedRunningTime="2025-10-08 14:34:23.175576956 +0000 UTC m=+5912.983881510" watchObservedRunningTime="2025-10-08 14:34:23.181230191 +0000 UTC m=+5912.989534735" Oct 08 14:34:32 crc kubenswrapper[4735]: I1008 14:34:32.395976 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-854f6bb796-5qdwc" Oct 08 14:34:32 crc kubenswrapper[4735]: I1008 14:34:32.435361 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-854f6bb796-5qdwc" Oct 08 14:34:44 crc kubenswrapper[4735]: I1008 14:34:44.137292 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-rsyslog-zv4fl"] Oct 08 14:34:44 crc kubenswrapper[4735]: E1008 14:34:44.138267 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83c22748-91a9-4d05-b4dd-aa6327aaee03" containerName="ovn-config" Oct 08 14:34:44 crc kubenswrapper[4735]: I1008 14:34:44.138283 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="83c22748-91a9-4d05-b4dd-aa6327aaee03" containerName="ovn-config" Oct 08 14:34:44 crc kubenswrapper[4735]: I1008 14:34:44.138500 4735 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="83c22748-91a9-4d05-b4dd-aa6327aaee03" containerName="ovn-config" Oct 08 14:34:44 crc kubenswrapper[4735]: I1008 14:34:44.139570 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-rsyslog-zv4fl" Oct 08 14:34:44 crc kubenswrapper[4735]: I1008 14:34:44.141109 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-scripts" Oct 08 14:34:44 crc kubenswrapper[4735]: I1008 14:34:44.141329 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"octavia-hmport-map" Oct 08 14:34:44 crc kubenswrapper[4735]: I1008 14:34:44.141444 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-config-data" Oct 08 14:34:44 crc kubenswrapper[4735]: I1008 14:34:44.170391 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-zv4fl"] Oct 08 14:34:44 crc kubenswrapper[4735]: I1008 14:34:44.266016 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a639a26-f084-43c7-8a61-1f317337c3b1-scripts\") pod \"octavia-rsyslog-zv4fl\" (UID: \"6a639a26-f084-43c7-8a61-1f317337c3b1\") " pod="openstack/octavia-rsyslog-zv4fl" Oct 08 14:34:44 crc kubenswrapper[4735]: I1008 14:34:44.266078 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a639a26-f084-43c7-8a61-1f317337c3b1-config-data\") pod \"octavia-rsyslog-zv4fl\" (UID: \"6a639a26-f084-43c7-8a61-1f317337c3b1\") " pod="openstack/octavia-rsyslog-zv4fl" Oct 08 14:34:44 crc kubenswrapper[4735]: I1008 14:34:44.266220 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/6a639a26-f084-43c7-8a61-1f317337c3b1-hm-ports\") pod \"octavia-rsyslog-zv4fl\" (UID: 
\"6a639a26-f084-43c7-8a61-1f317337c3b1\") " pod="openstack/octavia-rsyslog-zv4fl" Oct 08 14:34:44 crc kubenswrapper[4735]: I1008 14:34:44.266289 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/6a639a26-f084-43c7-8a61-1f317337c3b1-config-data-merged\") pod \"octavia-rsyslog-zv4fl\" (UID: \"6a639a26-f084-43c7-8a61-1f317337c3b1\") " pod="openstack/octavia-rsyslog-zv4fl" Oct 08 14:34:44 crc kubenswrapper[4735]: I1008 14:34:44.367911 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/6a639a26-f084-43c7-8a61-1f317337c3b1-hm-ports\") pod \"octavia-rsyslog-zv4fl\" (UID: \"6a639a26-f084-43c7-8a61-1f317337c3b1\") " pod="openstack/octavia-rsyslog-zv4fl" Oct 08 14:34:44 crc kubenswrapper[4735]: I1008 14:34:44.368240 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/6a639a26-f084-43c7-8a61-1f317337c3b1-config-data-merged\") pod \"octavia-rsyslog-zv4fl\" (UID: \"6a639a26-f084-43c7-8a61-1f317337c3b1\") " pod="openstack/octavia-rsyslog-zv4fl" Oct 08 14:34:44 crc kubenswrapper[4735]: I1008 14:34:44.368410 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a639a26-f084-43c7-8a61-1f317337c3b1-scripts\") pod \"octavia-rsyslog-zv4fl\" (UID: \"6a639a26-f084-43c7-8a61-1f317337c3b1\") " pod="openstack/octavia-rsyslog-zv4fl" Oct 08 14:34:44 crc kubenswrapper[4735]: I1008 14:34:44.368549 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a639a26-f084-43c7-8a61-1f317337c3b1-config-data\") pod \"octavia-rsyslog-zv4fl\" (UID: \"6a639a26-f084-43c7-8a61-1f317337c3b1\") " pod="openstack/octavia-rsyslog-zv4fl" Oct 08 14:34:44 crc kubenswrapper[4735]: 
I1008 14:34:44.368669 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/6a639a26-f084-43c7-8a61-1f317337c3b1-config-data-merged\") pod \"octavia-rsyslog-zv4fl\" (UID: \"6a639a26-f084-43c7-8a61-1f317337c3b1\") " pod="openstack/octavia-rsyslog-zv4fl" Oct 08 14:34:44 crc kubenswrapper[4735]: I1008 14:34:44.369041 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/6a639a26-f084-43c7-8a61-1f317337c3b1-hm-ports\") pod \"octavia-rsyslog-zv4fl\" (UID: \"6a639a26-f084-43c7-8a61-1f317337c3b1\") " pod="openstack/octavia-rsyslog-zv4fl" Oct 08 14:34:44 crc kubenswrapper[4735]: I1008 14:34:44.375038 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a639a26-f084-43c7-8a61-1f317337c3b1-config-data\") pod \"octavia-rsyslog-zv4fl\" (UID: \"6a639a26-f084-43c7-8a61-1f317337c3b1\") " pod="openstack/octavia-rsyslog-zv4fl" Oct 08 14:34:44 crc kubenswrapper[4735]: I1008 14:34:44.375140 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a639a26-f084-43c7-8a61-1f317337c3b1-scripts\") pod \"octavia-rsyslog-zv4fl\" (UID: \"6a639a26-f084-43c7-8a61-1f317337c3b1\") " pod="openstack/octavia-rsyslog-zv4fl" Oct 08 14:34:44 crc kubenswrapper[4735]: I1008 14:34:44.464759 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-rsyslog-zv4fl" Oct 08 14:34:45 crc kubenswrapper[4735]: I1008 14:34:45.011548 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-image-upload-678599687f-wtg2c"] Oct 08 14:34:45 crc kubenswrapper[4735]: I1008 14:34:45.014188 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-678599687f-wtg2c" Oct 08 14:34:45 crc kubenswrapper[4735]: I1008 14:34:45.017220 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-678599687f-wtg2c"] Oct 08 14:34:45 crc kubenswrapper[4735]: I1008 14:34:45.017878 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-config-data" Oct 08 14:34:45 crc kubenswrapper[4735]: I1008 14:34:45.026171 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-zv4fl"] Oct 08 14:34:45 crc kubenswrapper[4735]: I1008 14:34:45.192264 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ec1b5d17-d369-48de-91b5-5fed7a0b186e-httpd-config\") pod \"octavia-image-upload-678599687f-wtg2c\" (UID: \"ec1b5d17-d369-48de-91b5-5fed7a0b186e\") " pod="openstack/octavia-image-upload-678599687f-wtg2c" Oct 08 14:34:45 crc kubenswrapper[4735]: I1008 14:34:45.192439 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/ec1b5d17-d369-48de-91b5-5fed7a0b186e-amphora-image\") pod \"octavia-image-upload-678599687f-wtg2c\" (UID: \"ec1b5d17-d369-48de-91b5-5fed7a0b186e\") " pod="openstack/octavia-image-upload-678599687f-wtg2c" Oct 08 14:34:45 crc kubenswrapper[4735]: I1008 14:34:45.295001 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ec1b5d17-d369-48de-91b5-5fed7a0b186e-httpd-config\") pod \"octavia-image-upload-678599687f-wtg2c\" (UID: \"ec1b5d17-d369-48de-91b5-5fed7a0b186e\") " pod="openstack/octavia-image-upload-678599687f-wtg2c" Oct 08 14:34:45 crc kubenswrapper[4735]: I1008 14:34:45.295140 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-image\" (UniqueName: 
\"kubernetes.io/empty-dir/ec1b5d17-d369-48de-91b5-5fed7a0b186e-amphora-image\") pod \"octavia-image-upload-678599687f-wtg2c\" (UID: \"ec1b5d17-d369-48de-91b5-5fed7a0b186e\") " pod="openstack/octavia-image-upload-678599687f-wtg2c" Oct 08 14:34:45 crc kubenswrapper[4735]: I1008 14:34:45.295753 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/ec1b5d17-d369-48de-91b5-5fed7a0b186e-amphora-image\") pod \"octavia-image-upload-678599687f-wtg2c\" (UID: \"ec1b5d17-d369-48de-91b5-5fed7a0b186e\") " pod="openstack/octavia-image-upload-678599687f-wtg2c" Oct 08 14:34:45 crc kubenswrapper[4735]: I1008 14:34:45.304035 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ec1b5d17-d369-48de-91b5-5fed7a0b186e-httpd-config\") pod \"octavia-image-upload-678599687f-wtg2c\" (UID: \"ec1b5d17-d369-48de-91b5-5fed7a0b186e\") " pod="openstack/octavia-image-upload-678599687f-wtg2c" Oct 08 14:34:45 crc kubenswrapper[4735]: I1008 14:34:45.411818 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-678599687f-wtg2c" Oct 08 14:34:45 crc kubenswrapper[4735]: I1008 14:34:45.422045 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-zv4fl" event={"ID":"6a639a26-f084-43c7-8a61-1f317337c3b1","Type":"ContainerStarted","Data":"44b93a0892c13b7a6c28ecf9ee0482cd23eac0d257ea32d81e2487c8bfeaf96d"} Oct 08 14:34:45 crc kubenswrapper[4735]: I1008 14:34:45.849581 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-678599687f-wtg2c"] Oct 08 14:34:45 crc kubenswrapper[4735]: W1008 14:34:45.852283 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec1b5d17_d369_48de_91b5_5fed7a0b186e.slice/crio-df9c526ae974ffe6f8c097633514a46a7c873dfde0ef63df47362ae9bc763af2 WatchSource:0}: Error finding container df9c526ae974ffe6f8c097633514a46a7c873dfde0ef63df47362ae9bc763af2: Status 404 returned error can't find the container with id df9c526ae974ffe6f8c097633514a46a7c873dfde0ef63df47362ae9bc763af2 Oct 08 14:34:46 crc kubenswrapper[4735]: I1008 14:34:46.437971 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-678599687f-wtg2c" event={"ID":"ec1b5d17-d369-48de-91b5-5fed7a0b186e","Type":"ContainerStarted","Data":"df9c526ae974ffe6f8c097633514a46a7c873dfde0ef63df47362ae9bc763af2"} Oct 08 14:34:47 crc kubenswrapper[4735]: I1008 14:34:47.449408 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-zv4fl" event={"ID":"6a639a26-f084-43c7-8a61-1f317337c3b1","Type":"ContainerStarted","Data":"a5acc2b2c1b26db155b573285ab8406011d13b842c7cd566582ec79fa033989f"} Oct 08 14:34:49 crc kubenswrapper[4735]: I1008 14:34:49.476811 4735 generic.go:334] "Generic (PLEG): container finished" podID="6a639a26-f084-43c7-8a61-1f317337c3b1" containerID="a5acc2b2c1b26db155b573285ab8406011d13b842c7cd566582ec79fa033989f" exitCode=0 Oct 08 
14:34:49 crc kubenswrapper[4735]: I1008 14:34:49.477092 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-zv4fl" event={"ID":"6a639a26-f084-43c7-8a61-1f317337c3b1","Type":"ContainerDied","Data":"a5acc2b2c1b26db155b573285ab8406011d13b842c7cd566582ec79fa033989f"} Oct 08 14:34:51 crc kubenswrapper[4735]: I1008 14:34:51.157095 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-sync-kdqg9"] Oct 08 14:34:51 crc kubenswrapper[4735]: I1008 14:34:51.160265 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-kdqg9" Oct 08 14:34:51 crc kubenswrapper[4735]: I1008 14:34:51.162700 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-scripts" Oct 08 14:34:51 crc kubenswrapper[4735]: I1008 14:34:51.167161 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-kdqg9"] Oct 08 14:34:51 crc kubenswrapper[4735]: I1008 14:34:51.341291 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45f809f1-c832-4c74-babb-397dce1a02e1-scripts\") pod \"octavia-db-sync-kdqg9\" (UID: \"45f809f1-c832-4c74-babb-397dce1a02e1\") " pod="openstack/octavia-db-sync-kdqg9" Oct 08 14:34:51 crc kubenswrapper[4735]: I1008 14:34:51.341402 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/45f809f1-c832-4c74-babb-397dce1a02e1-config-data-merged\") pod \"octavia-db-sync-kdqg9\" (UID: \"45f809f1-c832-4c74-babb-397dce1a02e1\") " pod="openstack/octavia-db-sync-kdqg9" Oct 08 14:34:51 crc kubenswrapper[4735]: I1008 14:34:51.341425 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/45f809f1-c832-4c74-babb-397dce1a02e1-combined-ca-bundle\") pod \"octavia-db-sync-kdqg9\" (UID: \"45f809f1-c832-4c74-babb-397dce1a02e1\") " pod="openstack/octavia-db-sync-kdqg9" Oct 08 14:34:51 crc kubenswrapper[4735]: I1008 14:34:51.341460 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45f809f1-c832-4c74-babb-397dce1a02e1-config-data\") pod \"octavia-db-sync-kdqg9\" (UID: \"45f809f1-c832-4c74-babb-397dce1a02e1\") " pod="openstack/octavia-db-sync-kdqg9" Oct 08 14:34:51 crc kubenswrapper[4735]: I1008 14:34:51.443033 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45f809f1-c832-4c74-babb-397dce1a02e1-scripts\") pod \"octavia-db-sync-kdqg9\" (UID: \"45f809f1-c832-4c74-babb-397dce1a02e1\") " pod="openstack/octavia-db-sync-kdqg9" Oct 08 14:34:51 crc kubenswrapper[4735]: I1008 14:34:51.443107 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/45f809f1-c832-4c74-babb-397dce1a02e1-config-data-merged\") pod \"octavia-db-sync-kdqg9\" (UID: \"45f809f1-c832-4c74-babb-397dce1a02e1\") " pod="openstack/octavia-db-sync-kdqg9" Oct 08 14:34:51 crc kubenswrapper[4735]: I1008 14:34:51.443134 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45f809f1-c832-4c74-babb-397dce1a02e1-combined-ca-bundle\") pod \"octavia-db-sync-kdqg9\" (UID: \"45f809f1-c832-4c74-babb-397dce1a02e1\") " pod="openstack/octavia-db-sync-kdqg9" Oct 08 14:34:51 crc kubenswrapper[4735]: I1008 14:34:51.443509 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45f809f1-c832-4c74-babb-397dce1a02e1-config-data\") pod \"octavia-db-sync-kdqg9\" (UID: 
\"45f809f1-c832-4c74-babb-397dce1a02e1\") " pod="openstack/octavia-db-sync-kdqg9" Oct 08 14:34:51 crc kubenswrapper[4735]: I1008 14:34:51.443691 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/45f809f1-c832-4c74-babb-397dce1a02e1-config-data-merged\") pod \"octavia-db-sync-kdqg9\" (UID: \"45f809f1-c832-4c74-babb-397dce1a02e1\") " pod="openstack/octavia-db-sync-kdqg9" Oct 08 14:34:51 crc kubenswrapper[4735]: I1008 14:34:51.450128 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45f809f1-c832-4c74-babb-397dce1a02e1-scripts\") pod \"octavia-db-sync-kdqg9\" (UID: \"45f809f1-c832-4c74-babb-397dce1a02e1\") " pod="openstack/octavia-db-sync-kdqg9" Oct 08 14:34:51 crc kubenswrapper[4735]: I1008 14:34:51.450332 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45f809f1-c832-4c74-babb-397dce1a02e1-combined-ca-bundle\") pod \"octavia-db-sync-kdqg9\" (UID: \"45f809f1-c832-4c74-babb-397dce1a02e1\") " pod="openstack/octavia-db-sync-kdqg9" Oct 08 14:34:51 crc kubenswrapper[4735]: I1008 14:34:51.450647 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45f809f1-c832-4c74-babb-397dce1a02e1-config-data\") pod \"octavia-db-sync-kdqg9\" (UID: \"45f809f1-c832-4c74-babb-397dce1a02e1\") " pod="openstack/octavia-db-sync-kdqg9" Oct 08 14:34:51 crc kubenswrapper[4735]: I1008 14:34:51.501629 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-kdqg9" Oct 08 14:34:51 crc kubenswrapper[4735]: I1008 14:34:51.518109 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-healthmanager-wqg8h"] Oct 08 14:34:51 crc kubenswrapper[4735]: I1008 14:34:51.520194 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-healthmanager-wqg8h" Oct 08 14:34:51 crc kubenswrapper[4735]: I1008 14:34:51.522252 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-scripts" Oct 08 14:34:51 crc kubenswrapper[4735]: I1008 14:34:51.522597 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-certs-secret" Oct 08 14:34:51 crc kubenswrapper[4735]: I1008 14:34:51.522788 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-config-data" Oct 08 14:34:51 crc kubenswrapper[4735]: I1008 14:34:51.530034 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-wqg8h"] Oct 08 14:34:51 crc kubenswrapper[4735]: I1008 14:34:51.657912 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/009e57c9-9dec-4afb-b6b7-053082bfd574-amphora-certs\") pod \"octavia-healthmanager-wqg8h\" (UID: \"009e57c9-9dec-4afb-b6b7-053082bfd574\") " pod="openstack/octavia-healthmanager-wqg8h" Oct 08 14:34:51 crc kubenswrapper[4735]: I1008 14:34:51.658017 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/009e57c9-9dec-4afb-b6b7-053082bfd574-hm-ports\") pod \"octavia-healthmanager-wqg8h\" (UID: \"009e57c9-9dec-4afb-b6b7-053082bfd574\") " pod="openstack/octavia-healthmanager-wqg8h" Oct 08 14:34:51 crc kubenswrapper[4735]: I1008 14:34:51.658961 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/009e57c9-9dec-4afb-b6b7-053082bfd574-config-data-merged\") pod \"octavia-healthmanager-wqg8h\" (UID: \"009e57c9-9dec-4afb-b6b7-053082bfd574\") " pod="openstack/octavia-healthmanager-wqg8h" Oct 08 14:34:51 crc kubenswrapper[4735]: 
I1008 14:34:51.659066 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/009e57c9-9dec-4afb-b6b7-053082bfd574-scripts\") pod \"octavia-healthmanager-wqg8h\" (UID: \"009e57c9-9dec-4afb-b6b7-053082bfd574\") " pod="openstack/octavia-healthmanager-wqg8h" Oct 08 14:34:51 crc kubenswrapper[4735]: I1008 14:34:51.659614 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/009e57c9-9dec-4afb-b6b7-053082bfd574-combined-ca-bundle\") pod \"octavia-healthmanager-wqg8h\" (UID: \"009e57c9-9dec-4afb-b6b7-053082bfd574\") " pod="openstack/octavia-healthmanager-wqg8h" Oct 08 14:34:51 crc kubenswrapper[4735]: I1008 14:34:51.660223 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/009e57c9-9dec-4afb-b6b7-053082bfd574-config-data\") pod \"octavia-healthmanager-wqg8h\" (UID: \"009e57c9-9dec-4afb-b6b7-053082bfd574\") " pod="openstack/octavia-healthmanager-wqg8h" Oct 08 14:34:51 crc kubenswrapper[4735]: I1008 14:34:51.762362 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/009e57c9-9dec-4afb-b6b7-053082bfd574-scripts\") pod \"octavia-healthmanager-wqg8h\" (UID: \"009e57c9-9dec-4afb-b6b7-053082bfd574\") " pod="openstack/octavia-healthmanager-wqg8h" Oct 08 14:34:51 crc kubenswrapper[4735]: I1008 14:34:51.762461 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/009e57c9-9dec-4afb-b6b7-053082bfd574-combined-ca-bundle\") pod \"octavia-healthmanager-wqg8h\" (UID: \"009e57c9-9dec-4afb-b6b7-053082bfd574\") " pod="openstack/octavia-healthmanager-wqg8h" Oct 08 14:34:51 crc kubenswrapper[4735]: I1008 14:34:51.762559 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/009e57c9-9dec-4afb-b6b7-053082bfd574-config-data\") pod \"octavia-healthmanager-wqg8h\" (UID: \"009e57c9-9dec-4afb-b6b7-053082bfd574\") " pod="openstack/octavia-healthmanager-wqg8h" Oct 08 14:34:51 crc kubenswrapper[4735]: I1008 14:34:51.762619 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/009e57c9-9dec-4afb-b6b7-053082bfd574-amphora-certs\") pod \"octavia-healthmanager-wqg8h\" (UID: \"009e57c9-9dec-4afb-b6b7-053082bfd574\") " pod="openstack/octavia-healthmanager-wqg8h" Oct 08 14:34:51 crc kubenswrapper[4735]: I1008 14:34:51.762716 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/009e57c9-9dec-4afb-b6b7-053082bfd574-hm-ports\") pod \"octavia-healthmanager-wqg8h\" (UID: \"009e57c9-9dec-4afb-b6b7-053082bfd574\") " pod="openstack/octavia-healthmanager-wqg8h" Oct 08 14:34:51 crc kubenswrapper[4735]: I1008 14:34:51.762759 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/009e57c9-9dec-4afb-b6b7-053082bfd574-config-data-merged\") pod \"octavia-healthmanager-wqg8h\" (UID: \"009e57c9-9dec-4afb-b6b7-053082bfd574\") " pod="openstack/octavia-healthmanager-wqg8h" Oct 08 14:34:51 crc kubenswrapper[4735]: I1008 14:34:51.763672 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/009e57c9-9dec-4afb-b6b7-053082bfd574-config-data-merged\") pod \"octavia-healthmanager-wqg8h\" (UID: \"009e57c9-9dec-4afb-b6b7-053082bfd574\") " pod="openstack/octavia-healthmanager-wqg8h" Oct 08 14:34:51 crc kubenswrapper[4735]: I1008 14:34:51.764864 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" 
(UniqueName: \"kubernetes.io/configmap/009e57c9-9dec-4afb-b6b7-053082bfd574-hm-ports\") pod \"octavia-healthmanager-wqg8h\" (UID: \"009e57c9-9dec-4afb-b6b7-053082bfd574\") " pod="openstack/octavia-healthmanager-wqg8h" Oct 08 14:34:51 crc kubenswrapper[4735]: I1008 14:34:51.767811 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/009e57c9-9dec-4afb-b6b7-053082bfd574-amphora-certs\") pod \"octavia-healthmanager-wqg8h\" (UID: \"009e57c9-9dec-4afb-b6b7-053082bfd574\") " pod="openstack/octavia-healthmanager-wqg8h" Oct 08 14:34:51 crc kubenswrapper[4735]: I1008 14:34:51.770769 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/009e57c9-9dec-4afb-b6b7-053082bfd574-config-data\") pod \"octavia-healthmanager-wqg8h\" (UID: \"009e57c9-9dec-4afb-b6b7-053082bfd574\") " pod="openstack/octavia-healthmanager-wqg8h" Oct 08 14:34:51 crc kubenswrapper[4735]: I1008 14:34:51.771257 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/009e57c9-9dec-4afb-b6b7-053082bfd574-combined-ca-bundle\") pod \"octavia-healthmanager-wqg8h\" (UID: \"009e57c9-9dec-4afb-b6b7-053082bfd574\") " pod="openstack/octavia-healthmanager-wqg8h" Oct 08 14:34:51 crc kubenswrapper[4735]: I1008 14:34:51.783205 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/009e57c9-9dec-4afb-b6b7-053082bfd574-scripts\") pod \"octavia-healthmanager-wqg8h\" (UID: \"009e57c9-9dec-4afb-b6b7-053082bfd574\") " pod="openstack/octavia-healthmanager-wqg8h" Oct 08 14:34:51 crc kubenswrapper[4735]: I1008 14:34:51.864819 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-healthmanager-wqg8h" Oct 08 14:34:53 crc kubenswrapper[4735]: I1008 14:34:53.138461 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-housekeeping-q84kl"] Oct 08 14:34:53 crc kubenswrapper[4735]: I1008 14:34:53.144235 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-housekeeping-q84kl" Oct 08 14:34:53 crc kubenswrapper[4735]: I1008 14:34:53.146594 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-scripts" Oct 08 14:34:53 crc kubenswrapper[4735]: I1008 14:34:53.146648 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-config-data" Oct 08 14:34:53 crc kubenswrapper[4735]: I1008 14:34:53.153740 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-q84kl"] Oct 08 14:34:53 crc kubenswrapper[4735]: I1008 14:34:53.289465 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/e1d048d3-16c4-4e6f-8433-0680bff87bdc-amphora-certs\") pod \"octavia-housekeeping-q84kl\" (UID: \"e1d048d3-16c4-4e6f-8433-0680bff87bdc\") " pod="openstack/octavia-housekeeping-q84kl" Oct 08 14:34:53 crc kubenswrapper[4735]: I1008 14:34:53.289824 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1d048d3-16c4-4e6f-8433-0680bff87bdc-combined-ca-bundle\") pod \"octavia-housekeeping-q84kl\" (UID: \"e1d048d3-16c4-4e6f-8433-0680bff87bdc\") " pod="openstack/octavia-housekeeping-q84kl" Oct 08 14:34:53 crc kubenswrapper[4735]: I1008 14:34:53.289934 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/e1d048d3-16c4-4e6f-8433-0680bff87bdc-hm-ports\") pod 
\"octavia-housekeeping-q84kl\" (UID: \"e1d048d3-16c4-4e6f-8433-0680bff87bdc\") " pod="openstack/octavia-housekeeping-q84kl" Oct 08 14:34:53 crc kubenswrapper[4735]: I1008 14:34:53.289963 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1d048d3-16c4-4e6f-8433-0680bff87bdc-config-data\") pod \"octavia-housekeeping-q84kl\" (UID: \"e1d048d3-16c4-4e6f-8433-0680bff87bdc\") " pod="openstack/octavia-housekeeping-q84kl" Oct 08 14:34:53 crc kubenswrapper[4735]: I1008 14:34:53.290043 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/e1d048d3-16c4-4e6f-8433-0680bff87bdc-config-data-merged\") pod \"octavia-housekeeping-q84kl\" (UID: \"e1d048d3-16c4-4e6f-8433-0680bff87bdc\") " pod="openstack/octavia-housekeeping-q84kl" Oct 08 14:34:53 crc kubenswrapper[4735]: I1008 14:34:53.290143 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1d048d3-16c4-4e6f-8433-0680bff87bdc-scripts\") pod \"octavia-housekeeping-q84kl\" (UID: \"e1d048d3-16c4-4e6f-8433-0680bff87bdc\") " pod="openstack/octavia-housekeeping-q84kl" Oct 08 14:34:53 crc kubenswrapper[4735]: I1008 14:34:53.391427 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1d048d3-16c4-4e6f-8433-0680bff87bdc-combined-ca-bundle\") pod \"octavia-housekeeping-q84kl\" (UID: \"e1d048d3-16c4-4e6f-8433-0680bff87bdc\") " pod="openstack/octavia-housekeeping-q84kl" Oct 08 14:34:53 crc kubenswrapper[4735]: I1008 14:34:53.391503 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/e1d048d3-16c4-4e6f-8433-0680bff87bdc-hm-ports\") pod \"octavia-housekeeping-q84kl\" (UID: 
\"e1d048d3-16c4-4e6f-8433-0680bff87bdc\") " pod="openstack/octavia-housekeeping-q84kl" Oct 08 14:34:53 crc kubenswrapper[4735]: I1008 14:34:53.391520 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1d048d3-16c4-4e6f-8433-0680bff87bdc-config-data\") pod \"octavia-housekeeping-q84kl\" (UID: \"e1d048d3-16c4-4e6f-8433-0680bff87bdc\") " pod="openstack/octavia-housekeeping-q84kl" Oct 08 14:34:53 crc kubenswrapper[4735]: I1008 14:34:53.391544 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/e1d048d3-16c4-4e6f-8433-0680bff87bdc-config-data-merged\") pod \"octavia-housekeeping-q84kl\" (UID: \"e1d048d3-16c4-4e6f-8433-0680bff87bdc\") " pod="openstack/octavia-housekeeping-q84kl" Oct 08 14:34:53 crc kubenswrapper[4735]: I1008 14:34:53.391571 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1d048d3-16c4-4e6f-8433-0680bff87bdc-scripts\") pod \"octavia-housekeeping-q84kl\" (UID: \"e1d048d3-16c4-4e6f-8433-0680bff87bdc\") " pod="openstack/octavia-housekeeping-q84kl" Oct 08 14:34:53 crc kubenswrapper[4735]: I1008 14:34:53.391636 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/e1d048d3-16c4-4e6f-8433-0680bff87bdc-amphora-certs\") pod \"octavia-housekeeping-q84kl\" (UID: \"e1d048d3-16c4-4e6f-8433-0680bff87bdc\") " pod="openstack/octavia-housekeeping-q84kl" Oct 08 14:34:53 crc kubenswrapper[4735]: I1008 14:34:53.392898 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/e1d048d3-16c4-4e6f-8433-0680bff87bdc-config-data-merged\") pod \"octavia-housekeeping-q84kl\" (UID: \"e1d048d3-16c4-4e6f-8433-0680bff87bdc\") " pod="openstack/octavia-housekeeping-q84kl" Oct 08 
14:34:53 crc kubenswrapper[4735]: I1008 14:34:53.393725 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/e1d048d3-16c4-4e6f-8433-0680bff87bdc-hm-ports\") pod \"octavia-housekeeping-q84kl\" (UID: \"e1d048d3-16c4-4e6f-8433-0680bff87bdc\") " pod="openstack/octavia-housekeeping-q84kl" Oct 08 14:34:53 crc kubenswrapper[4735]: I1008 14:34:53.397689 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1d048d3-16c4-4e6f-8433-0680bff87bdc-combined-ca-bundle\") pod \"octavia-housekeeping-q84kl\" (UID: \"e1d048d3-16c4-4e6f-8433-0680bff87bdc\") " pod="openstack/octavia-housekeeping-q84kl" Oct 08 14:34:53 crc kubenswrapper[4735]: I1008 14:34:53.397695 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1d048d3-16c4-4e6f-8433-0680bff87bdc-scripts\") pod \"octavia-housekeeping-q84kl\" (UID: \"e1d048d3-16c4-4e6f-8433-0680bff87bdc\") " pod="openstack/octavia-housekeeping-q84kl" Oct 08 14:34:53 crc kubenswrapper[4735]: I1008 14:34:53.411330 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1d048d3-16c4-4e6f-8433-0680bff87bdc-config-data\") pod \"octavia-housekeeping-q84kl\" (UID: \"e1d048d3-16c4-4e6f-8433-0680bff87bdc\") " pod="openstack/octavia-housekeeping-q84kl" Oct 08 14:34:53 crc kubenswrapper[4735]: I1008 14:34:53.411863 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/e1d048d3-16c4-4e6f-8433-0680bff87bdc-amphora-certs\") pod \"octavia-housekeeping-q84kl\" (UID: \"e1d048d3-16c4-4e6f-8433-0680bff87bdc\") " pod="openstack/octavia-housekeeping-q84kl" Oct 08 14:34:53 crc kubenswrapper[4735]: I1008 14:34:53.497689 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-housekeeping-q84kl" Oct 08 14:34:54 crc kubenswrapper[4735]: I1008 14:34:54.878974 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 14:34:54 crc kubenswrapper[4735]: I1008 14:34:54.879039 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 14:34:55 crc kubenswrapper[4735]: I1008 14:34:55.588694 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-worker-l7p4s"] Oct 08 14:34:55 crc kubenswrapper[4735]: I1008 14:34:55.591333 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-worker-l7p4s" Oct 08 14:34:55 crc kubenswrapper[4735]: I1008 14:34:55.593140 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-scripts" Oct 08 14:34:55 crc kubenswrapper[4735]: I1008 14:34:55.595335 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-config-data" Oct 08 14:34:55 crc kubenswrapper[4735]: I1008 14:34:55.597284 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-l7p4s"] Oct 08 14:34:55 crc kubenswrapper[4735]: I1008 14:34:55.662648 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-kdqg9"] Oct 08 14:34:55 crc kubenswrapper[4735]: W1008 14:34:55.668068 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45f809f1_c832_4c74_babb_397dce1a02e1.slice/crio-f4757dc9692f2016068c94e2191c429aed12b118383a65c1e9a32396397f7e1f WatchSource:0}: Error finding container f4757dc9692f2016068c94e2191c429aed12b118383a65c1e9a32396397f7e1f: Status 404 returned error can't find the container with id f4757dc9692f2016068c94e2191c429aed12b118383a65c1e9a32396397f7e1f Oct 08 14:34:55 crc kubenswrapper[4735]: I1008 14:34:55.730410 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d768f095-c98f-4232-ac5d-2ecd4b479621-scripts\") pod \"octavia-worker-l7p4s\" (UID: \"d768f095-c98f-4232-ac5d-2ecd4b479621\") " pod="openstack/octavia-worker-l7p4s" Oct 08 14:34:55 crc kubenswrapper[4735]: I1008 14:34:55.730448 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/d768f095-c98f-4232-ac5d-2ecd4b479621-config-data-merged\") pod \"octavia-worker-l7p4s\" (UID: \"d768f095-c98f-4232-ac5d-2ecd4b479621\") " 
pod="openstack/octavia-worker-l7p4s" Oct 08 14:34:55 crc kubenswrapper[4735]: I1008 14:34:55.730488 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d768f095-c98f-4232-ac5d-2ecd4b479621-config-data\") pod \"octavia-worker-l7p4s\" (UID: \"d768f095-c98f-4232-ac5d-2ecd4b479621\") " pod="openstack/octavia-worker-l7p4s" Oct 08 14:34:55 crc kubenswrapper[4735]: I1008 14:34:55.730593 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d768f095-c98f-4232-ac5d-2ecd4b479621-combined-ca-bundle\") pod \"octavia-worker-l7p4s\" (UID: \"d768f095-c98f-4232-ac5d-2ecd4b479621\") " pod="openstack/octavia-worker-l7p4s" Oct 08 14:34:55 crc kubenswrapper[4735]: I1008 14:34:55.730697 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/d768f095-c98f-4232-ac5d-2ecd4b479621-amphora-certs\") pod \"octavia-worker-l7p4s\" (UID: \"d768f095-c98f-4232-ac5d-2ecd4b479621\") " pod="openstack/octavia-worker-l7p4s" Oct 08 14:34:55 crc kubenswrapper[4735]: I1008 14:34:55.730766 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/d768f095-c98f-4232-ac5d-2ecd4b479621-hm-ports\") pod \"octavia-worker-l7p4s\" (UID: \"d768f095-c98f-4232-ac5d-2ecd4b479621\") " pod="openstack/octavia-worker-l7p4s" Oct 08 14:34:55 crc kubenswrapper[4735]: I1008 14:34:55.795306 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-wqg8h"] Oct 08 14:34:55 crc kubenswrapper[4735]: I1008 14:34:55.832092 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d768f095-c98f-4232-ac5d-2ecd4b479621-scripts\") pod 
\"octavia-worker-l7p4s\" (UID: \"d768f095-c98f-4232-ac5d-2ecd4b479621\") " pod="openstack/octavia-worker-l7p4s" Oct 08 14:34:55 crc kubenswrapper[4735]: I1008 14:34:55.832142 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/d768f095-c98f-4232-ac5d-2ecd4b479621-config-data-merged\") pod \"octavia-worker-l7p4s\" (UID: \"d768f095-c98f-4232-ac5d-2ecd4b479621\") " pod="openstack/octavia-worker-l7p4s" Oct 08 14:34:55 crc kubenswrapper[4735]: I1008 14:34:55.832173 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d768f095-c98f-4232-ac5d-2ecd4b479621-config-data\") pod \"octavia-worker-l7p4s\" (UID: \"d768f095-c98f-4232-ac5d-2ecd4b479621\") " pod="openstack/octavia-worker-l7p4s" Oct 08 14:34:55 crc kubenswrapper[4735]: I1008 14:34:55.832195 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d768f095-c98f-4232-ac5d-2ecd4b479621-combined-ca-bundle\") pod \"octavia-worker-l7p4s\" (UID: \"d768f095-c98f-4232-ac5d-2ecd4b479621\") " pod="openstack/octavia-worker-l7p4s" Oct 08 14:34:55 crc kubenswrapper[4735]: I1008 14:34:55.832224 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/d768f095-c98f-4232-ac5d-2ecd4b479621-amphora-certs\") pod \"octavia-worker-l7p4s\" (UID: \"d768f095-c98f-4232-ac5d-2ecd4b479621\") " pod="openstack/octavia-worker-l7p4s" Oct 08 14:34:55 crc kubenswrapper[4735]: I1008 14:34:55.832250 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/d768f095-c98f-4232-ac5d-2ecd4b479621-hm-ports\") pod \"octavia-worker-l7p4s\" (UID: \"d768f095-c98f-4232-ac5d-2ecd4b479621\") " pod="openstack/octavia-worker-l7p4s" Oct 08 14:34:55 crc 
kubenswrapper[4735]: I1008 14:34:55.833443 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/d768f095-c98f-4232-ac5d-2ecd4b479621-hm-ports\") pod \"octavia-worker-l7p4s\" (UID: \"d768f095-c98f-4232-ac5d-2ecd4b479621\") " pod="openstack/octavia-worker-l7p4s"
Oct 08 14:34:55 crc kubenswrapper[4735]: I1008 14:34:55.833646 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/d768f095-c98f-4232-ac5d-2ecd4b479621-config-data-merged\") pod \"octavia-worker-l7p4s\" (UID: \"d768f095-c98f-4232-ac5d-2ecd4b479621\") " pod="openstack/octavia-worker-l7p4s"
Oct 08 14:34:55 crc kubenswrapper[4735]: I1008 14:34:55.837844 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d768f095-c98f-4232-ac5d-2ecd4b479621-combined-ca-bundle\") pod \"octavia-worker-l7p4s\" (UID: \"d768f095-c98f-4232-ac5d-2ecd4b479621\") " pod="openstack/octavia-worker-l7p4s"
Oct 08 14:34:55 crc kubenswrapper[4735]: I1008 14:34:55.837885 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/d768f095-c98f-4232-ac5d-2ecd4b479621-amphora-certs\") pod \"octavia-worker-l7p4s\" (UID: \"d768f095-c98f-4232-ac5d-2ecd4b479621\") " pod="openstack/octavia-worker-l7p4s"
Oct 08 14:34:55 crc kubenswrapper[4735]: I1008 14:34:55.838643 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d768f095-c98f-4232-ac5d-2ecd4b479621-scripts\") pod \"octavia-worker-l7p4s\" (UID: \"d768f095-c98f-4232-ac5d-2ecd4b479621\") " pod="openstack/octavia-worker-l7p4s"
Oct 08 14:34:55 crc kubenswrapper[4735]: I1008 14:34:55.839837 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d768f095-c98f-4232-ac5d-2ecd4b479621-config-data\") pod \"octavia-worker-l7p4s\" (UID: \"d768f095-c98f-4232-ac5d-2ecd4b479621\") " pod="openstack/octavia-worker-l7p4s"
Oct 08 14:34:55 crc kubenswrapper[4735]: I1008 14:34:55.934254 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-worker-l7p4s"
Oct 08 14:34:56 crc kubenswrapper[4735]: I1008 14:34:56.429259 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-q84kl"]
Oct 08 14:34:56 crc kubenswrapper[4735]: I1008 14:34:56.529706 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-l7p4s"]
Oct 08 14:34:56 crc kubenswrapper[4735]: I1008 14:34:56.606743 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-kdqg9" event={"ID":"45f809f1-c832-4c74-babb-397dce1a02e1","Type":"ContainerStarted","Data":"f4757dc9692f2016068c94e2191c429aed12b118383a65c1e9a32396397f7e1f"}
Oct 08 14:34:56 crc kubenswrapper[4735]: I1008 14:34:56.608172 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-l7p4s" event={"ID":"d768f095-c98f-4232-ac5d-2ecd4b479621","Type":"ContainerStarted","Data":"d5a2afbce32f0e3945e25f1bdfe3f1d196fb5aba4b52b304e4ba3a0be444b15f"}
Oct 08 14:34:56 crc kubenswrapper[4735]: I1008 14:34:56.609310 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-q84kl" event={"ID":"e1d048d3-16c4-4e6f-8433-0680bff87bdc","Type":"ContainerStarted","Data":"3c4b2b8431d2efaac442ae3b816d1db8a811b5ee4f4b99da31caa497647f0cd4"}
Oct 08 14:34:56 crc kubenswrapper[4735]: I1008 14:34:56.611792 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-wqg8h" event={"ID":"009e57c9-9dec-4afb-b6b7-053082bfd574","Type":"ContainerStarted","Data":"7034f15fe70c1e197ec64426dbd8dc19e344ce862a89ad672ec7ed2172424a66"}
Oct 08 14:34:57 crc kubenswrapper[4735]: I1008 14:34:57.624713 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-678599687f-wtg2c" event={"ID":"ec1b5d17-d369-48de-91b5-5fed7a0b186e","Type":"ContainerStarted","Data":"12172b7728cd3c764ba931fbfc4d09992d4cc6aac24a6ab8a4a7d8ab861f9af3"}
Oct 08 14:34:57 crc kubenswrapper[4735]: I1008 14:34:57.628305 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-zv4fl" event={"ID":"6a639a26-f084-43c7-8a61-1f317337c3b1","Type":"ContainerStarted","Data":"30ea01f2b01b3b8e72fcd4b03a2726cb6195254163cf3e37952f2c2f6085f785"}
Oct 08 14:34:57 crc kubenswrapper[4735]: I1008 14:34:57.628513 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-rsyslog-zv4fl"
Oct 08 14:34:57 crc kubenswrapper[4735]: I1008 14:34:57.630706 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-wqg8h" event={"ID":"009e57c9-9dec-4afb-b6b7-053082bfd574","Type":"ContainerStarted","Data":"16158dac91c760a92968d3f4e59349c41f94755eeffab3ff87fbe893cb2c1efe"}
Oct 08 14:34:57 crc kubenswrapper[4735]: I1008 14:34:57.635892 4735 generic.go:334] "Generic (PLEG): container finished" podID="45f809f1-c832-4c74-babb-397dce1a02e1" containerID="bb283f4a3d845ed4841fe92b53719f217593684919e9e9447fcf8527a515607e" exitCode=0
Oct 08 14:34:57 crc kubenswrapper[4735]: I1008 14:34:57.635943 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-kdqg9" event={"ID":"45f809f1-c832-4c74-babb-397dce1a02e1","Type":"ContainerDied","Data":"bb283f4a3d845ed4841fe92b53719f217593684919e9e9447fcf8527a515607e"}
Oct 08 14:34:57 crc kubenswrapper[4735]: I1008 14:34:57.654695 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-rsyslog-zv4fl" podStartSLOduration=3.507289631 podStartE2EDuration="13.654678691s" podCreationTimestamp="2025-10-08 14:34:44 +0000 UTC" firstStartedPulling="2025-10-08 14:34:45.061613223 +0000 UTC m=+5934.869917777" lastFinishedPulling="2025-10-08 14:34:55.209002293 +0000 UTC m=+5945.017306837" observedRunningTime="2025-10-08 14:34:57.653301566 +0000 UTC m=+5947.461606110" watchObservedRunningTime="2025-10-08 14:34:57.654678691 +0000 UTC m=+5947.462983235"
Oct 08 14:34:58 crc kubenswrapper[4735]: I1008 14:34:58.655624 4735 generic.go:334] "Generic (PLEG): container finished" podID="ec1b5d17-d369-48de-91b5-5fed7a0b186e" containerID="12172b7728cd3c764ba931fbfc4d09992d4cc6aac24a6ab8a4a7d8ab861f9af3" exitCode=0
Oct 08 14:34:58 crc kubenswrapper[4735]: I1008 14:34:58.655805 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-678599687f-wtg2c" event={"ID":"ec1b5d17-d369-48de-91b5-5fed7a0b186e","Type":"ContainerDied","Data":"12172b7728cd3c764ba931fbfc4d09992d4cc6aac24a6ab8a4a7d8ab861f9af3"}
Oct 08 14:34:58 crc kubenswrapper[4735]: I1008 14:34:58.663210 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-kdqg9" event={"ID":"45f809f1-c832-4c74-babb-397dce1a02e1","Type":"ContainerStarted","Data":"6e01e96bcd030e2576793cec38d354ec3a2f7c8daa126ac0c997692a5be0f14c"}
Oct 08 14:34:58 crc kubenswrapper[4735]: I1008 14:34:58.696868 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-db-sync-kdqg9" podStartSLOduration=7.696848849 podStartE2EDuration="7.696848849s" podCreationTimestamp="2025-10-08 14:34:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:34:58.690796724 +0000 UTC m=+5948.499101268" watchObservedRunningTime="2025-10-08 14:34:58.696848849 +0000 UTC m=+5948.505153393"
Oct 08 14:34:59 crc kubenswrapper[4735]: I1008 14:34:59.674030 4735 generic.go:334] "Generic (PLEG): container finished" podID="009e57c9-9dec-4afb-b6b7-053082bfd574" containerID="16158dac91c760a92968d3f4e59349c41f94755eeffab3ff87fbe893cb2c1efe" exitCode=0
Oct 08 14:34:59 crc kubenswrapper[4735]: I1008 14:34:59.674129 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-wqg8h" event={"ID":"009e57c9-9dec-4afb-b6b7-053082bfd574","Type":"ContainerDied","Data":"16158dac91c760a92968d3f4e59349c41f94755eeffab3ff87fbe893cb2c1efe"}
Oct 08 14:34:59 crc kubenswrapper[4735]: I1008 14:34:59.676280 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-678599687f-wtg2c" event={"ID":"ec1b5d17-d369-48de-91b5-5fed7a0b186e","Type":"ContainerStarted","Data":"ab96a4f389f1fd1b380b032245996fb6aa2708d303ef419d8a230c24119dfdc9"}
Oct 08 14:34:59 crc kubenswrapper[4735]: I1008 14:34:59.677995 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-l7p4s" event={"ID":"d768f095-c98f-4232-ac5d-2ecd4b479621","Type":"ContainerStarted","Data":"a0e18ea2ca73c12a30442e4ed471457ab6b5c253e86c61fe3fcbb2f4e3cf1118"}
Oct 08 14:34:59 crc kubenswrapper[4735]: I1008 14:34:59.679656 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-q84kl" event={"ID":"e1d048d3-16c4-4e6f-8433-0680bff87bdc","Type":"ContainerStarted","Data":"9bf26635da89cb5a446c8deaf1a95ae4bf61643d84c492040a3ca1725bfbce97"}
Oct 08 14:34:59 crc kubenswrapper[4735]: I1008 14:34:59.770849 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-image-upload-678599687f-wtg2c" podStartSLOduration=6.185795401 podStartE2EDuration="15.77083063s" podCreationTimestamp="2025-10-08 14:34:44 +0000 UTC" firstStartedPulling="2025-10-08 14:34:45.8550193 +0000 UTC m=+5935.663323844" lastFinishedPulling="2025-10-08 14:34:55.440054529 +0000 UTC m=+5945.248359073" observedRunningTime="2025-10-08 14:34:59.732996341 +0000 UTC m=+5949.541300895" watchObservedRunningTime="2025-10-08 14:34:59.77083063 +0000 UTC m=+5949.579135174"
Oct 08 14:35:00 crc kubenswrapper[4735]: I1008 14:35:00.690705 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-wqg8h" event={"ID":"009e57c9-9dec-4afb-b6b7-053082bfd574","Type":"ContainerStarted","Data":"ad662e8e930514d021b4ae586f4dc9262f4bb5f40528fff75ce38df19c7b2819"}
Oct 08 14:35:00 crc kubenswrapper[4735]: I1008 14:35:00.691471 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-healthmanager-wqg8h"
Oct 08 14:35:00 crc kubenswrapper[4735]: I1008 14:35:00.719827 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-healthmanager-wqg8h" podStartSLOduration=9.719808341 podStartE2EDuration="9.719808341s" podCreationTimestamp="2025-10-08 14:34:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:35:00.707156887 +0000 UTC m=+5950.515461421" watchObservedRunningTime="2025-10-08 14:35:00.719808341 +0000 UTC m=+5950.528112885"
Oct 08 14:35:01 crc kubenswrapper[4735]: I1008 14:35:01.702986 4735 generic.go:334] "Generic (PLEG): container finished" podID="d768f095-c98f-4232-ac5d-2ecd4b479621" containerID="a0e18ea2ca73c12a30442e4ed471457ab6b5c253e86c61fe3fcbb2f4e3cf1118" exitCode=0
Oct 08 14:35:01 crc kubenswrapper[4735]: I1008 14:35:01.703035 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-l7p4s" event={"ID":"d768f095-c98f-4232-ac5d-2ecd4b479621","Type":"ContainerDied","Data":"a0e18ea2ca73c12a30442e4ed471457ab6b5c253e86c61fe3fcbb2f4e3cf1118"}
Oct 08 14:35:01 crc kubenswrapper[4735]: I1008 14:35:01.707022 4735 generic.go:334] "Generic (PLEG): container finished" podID="e1d048d3-16c4-4e6f-8433-0680bff87bdc" containerID="9bf26635da89cb5a446c8deaf1a95ae4bf61643d84c492040a3ca1725bfbce97" exitCode=0
Oct 08 14:35:01 crc kubenswrapper[4735]: I1008 14:35:01.707729 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-q84kl" event={"ID":"e1d048d3-16c4-4e6f-8433-0680bff87bdc","Type":"ContainerDied","Data":"9bf26635da89cb5a446c8deaf1a95ae4bf61643d84c492040a3ca1725bfbce97"}
Oct 08 14:35:03 crc kubenswrapper[4735]: I1008 14:35:03.741587 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-l7p4s" event={"ID":"d768f095-c98f-4232-ac5d-2ecd4b479621","Type":"ContainerStarted","Data":"9645f5eb0fa2db1f8e7275aaa9c6b1150d88b9b96f2ae48052837ffcf803f422"}
Oct 08 14:35:03 crc kubenswrapper[4735]: I1008 14:35:03.772447 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-q84kl" event={"ID":"e1d048d3-16c4-4e6f-8433-0680bff87bdc","Type":"ContainerStarted","Data":"e4917612c27fb96a7cba2ece389eec3c3132a5a8d0ddfe5d67c600b5c96c56a2"}
Oct 08 14:35:03 crc kubenswrapper[4735]: I1008 14:35:03.773830 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-worker-l7p4s" podStartSLOduration=6.878296537 podStartE2EDuration="8.773812967s" podCreationTimestamp="2025-10-08 14:34:55 +0000 UTC" firstStartedPulling="2025-10-08 14:34:56.532686019 +0000 UTC m=+5946.340990563" lastFinishedPulling="2025-10-08 14:34:58.428202439 +0000 UTC m=+5948.236506993" observedRunningTime="2025-10-08 14:35:03.764340494 +0000 UTC m=+5953.572645078" watchObservedRunningTime="2025-10-08 14:35:03.773812967 +0000 UTC m=+5953.582117511"
Oct 08 14:35:03 crc kubenswrapper[4735]: I1008 14:35:03.774053 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-housekeeping-q84kl"
Oct 08 14:35:03 crc kubenswrapper[4735]: I1008 14:35:03.798942 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-housekeeping-q84kl" podStartSLOduration=8.806889488 podStartE2EDuration="10.79892276s" podCreationTimestamp="2025-10-08 14:34:53 +0000 UTC" firstStartedPulling="2025-10-08 14:34:56.441461193 +0000 UTC m=+5946.249765737" lastFinishedPulling="2025-10-08 14:34:58.433494465 +0000 UTC m=+5948.241799009" observedRunningTime="2025-10-08 14:35:03.796654332 +0000 UTC m=+5953.604958886" watchObservedRunningTime="2025-10-08 14:35:03.79892276 +0000 UTC m=+5953.607227304"
Oct 08 14:35:05 crc kubenswrapper[4735]: I1008 14:35:05.035332 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-lgl8l"]
Oct 08 14:35:05 crc kubenswrapper[4735]: I1008 14:35:05.043717 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-lgl8l"]
Oct 08 14:35:05 crc kubenswrapper[4735]: I1008 14:35:05.792321 4735 generic.go:334] "Generic (PLEG): container finished" podID="45f809f1-c832-4c74-babb-397dce1a02e1" containerID="6e01e96bcd030e2576793cec38d354ec3a2f7c8daa126ac0c997692a5be0f14c" exitCode=0
Oct 08 14:35:05 crc kubenswrapper[4735]: I1008 14:35:05.792529 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-kdqg9" event={"ID":"45f809f1-c832-4c74-babb-397dce1a02e1","Type":"ContainerDied","Data":"6e01e96bcd030e2576793cec38d354ec3a2f7c8daa126ac0c997692a5be0f14c"}
Oct 08 14:35:06 crc kubenswrapper[4735]: I1008 14:35:06.916070 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c70df4ed-68de-4ffa-b7b5-ed86b503a3f5" path="/var/lib/kubelet/pods/c70df4ed-68de-4ffa-b7b5-ed86b503a3f5/volumes"
Oct 08 14:35:06 crc kubenswrapper[4735]: I1008 14:35:06.917403 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-healthmanager-wqg8h"
Oct 08 14:35:07 crc kubenswrapper[4735]: I1008 14:35:07.219320 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-kdqg9"
Oct 08 14:35:07 crc kubenswrapper[4735]: I1008 14:35:07.280597 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45f809f1-c832-4c74-babb-397dce1a02e1-config-data\") pod \"45f809f1-c832-4c74-babb-397dce1a02e1\" (UID: \"45f809f1-c832-4c74-babb-397dce1a02e1\") "
Oct 08 14:35:07 crc kubenswrapper[4735]: I1008 14:35:07.280755 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45f809f1-c832-4c74-babb-397dce1a02e1-combined-ca-bundle\") pod \"45f809f1-c832-4c74-babb-397dce1a02e1\" (UID: \"45f809f1-c832-4c74-babb-397dce1a02e1\") "
Oct 08 14:35:07 crc kubenswrapper[4735]: I1008 14:35:07.280832 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45f809f1-c832-4c74-babb-397dce1a02e1-scripts\") pod \"45f809f1-c832-4c74-babb-397dce1a02e1\" (UID: \"45f809f1-c832-4c74-babb-397dce1a02e1\") "
Oct 08 14:35:07 crc kubenswrapper[4735]: I1008 14:35:07.280993 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/45f809f1-c832-4c74-babb-397dce1a02e1-config-data-merged\") pod \"45f809f1-c832-4c74-babb-397dce1a02e1\" (UID: \"45f809f1-c832-4c74-babb-397dce1a02e1\") "
Oct 08 14:35:07 crc kubenswrapper[4735]: I1008 14:35:07.286540 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45f809f1-c832-4c74-babb-397dce1a02e1-config-data" (OuterVolumeSpecName: "config-data") pod "45f809f1-c832-4c74-babb-397dce1a02e1" (UID: "45f809f1-c832-4c74-babb-397dce1a02e1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 14:35:07 crc kubenswrapper[4735]: I1008 14:35:07.286570 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45f809f1-c832-4c74-babb-397dce1a02e1-scripts" (OuterVolumeSpecName: "scripts") pod "45f809f1-c832-4c74-babb-397dce1a02e1" (UID: "45f809f1-c832-4c74-babb-397dce1a02e1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 14:35:07 crc kubenswrapper[4735]: I1008 14:35:07.308837 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45f809f1-c832-4c74-babb-397dce1a02e1-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "45f809f1-c832-4c74-babb-397dce1a02e1" (UID: "45f809f1-c832-4c74-babb-397dce1a02e1"). InnerVolumeSpecName "config-data-merged". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 08 14:35:07 crc kubenswrapper[4735]: I1008 14:35:07.310735 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45f809f1-c832-4c74-babb-397dce1a02e1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "45f809f1-c832-4c74-babb-397dce1a02e1" (UID: "45f809f1-c832-4c74-babb-397dce1a02e1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 14:35:07 crc kubenswrapper[4735]: I1008 14:35:07.382928 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45f809f1-c832-4c74-babb-397dce1a02e1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 08 14:35:07 crc kubenswrapper[4735]: I1008 14:35:07.383223 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45f809f1-c832-4c74-babb-397dce1a02e1-scripts\") on node \"crc\" DevicePath \"\""
Oct 08 14:35:07 crc kubenswrapper[4735]: I1008 14:35:07.383315 4735 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/45f809f1-c832-4c74-babb-397dce1a02e1-config-data-merged\") on node \"crc\" DevicePath \"\""
Oct 08 14:35:07 crc kubenswrapper[4735]: I1008 14:35:07.383388 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45f809f1-c832-4c74-babb-397dce1a02e1-config-data\") on node \"crc\" DevicePath \"\""
Oct 08 14:35:07 crc kubenswrapper[4735]: I1008 14:35:07.815100 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-kdqg9" event={"ID":"45f809f1-c832-4c74-babb-397dce1a02e1","Type":"ContainerDied","Data":"f4757dc9692f2016068c94e2191c429aed12b118383a65c1e9a32396397f7e1f"}
Oct 08 14:35:07 crc kubenswrapper[4735]: I1008 14:35:07.815425 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4757dc9692f2016068c94e2191c429aed12b118383a65c1e9a32396397f7e1f"
Oct 08 14:35:07 crc kubenswrapper[4735]: I1008 14:35:07.815197 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-kdqg9"
Oct 08 14:35:08 crc kubenswrapper[4735]: I1008 14:35:08.543181 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-housekeeping-q84kl"
Oct 08 14:35:09 crc kubenswrapper[4735]: I1008 14:35:09.278460 4735 scope.go:117] "RemoveContainer" containerID="bd8afd9e8bc03b690278ea08d7d53f443645ebe8c540be4725b19fe324630780"
Oct 08 14:35:10 crc kubenswrapper[4735]: I1008 14:35:10.934706 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-worker-l7p4s"
Oct 08 14:35:10 crc kubenswrapper[4735]: I1008 14:35:10.968390 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-worker-l7p4s"
Oct 08 14:35:14 crc kubenswrapper[4735]: I1008 14:35:14.497677 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-rsyslog-zv4fl"
Oct 08 14:35:15 crc kubenswrapper[4735]: I1008 14:35:15.042131 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-a7f8-account-create-ccjmt"]
Oct 08 14:35:15 crc kubenswrapper[4735]: I1008 14:35:15.053610 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-a7f8-account-create-ccjmt"]
Oct 08 14:35:16 crc kubenswrapper[4735]: I1008 14:35:16.112274 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vwfkp"]
Oct 08 14:35:16 crc kubenswrapper[4735]: E1008 14:35:16.113241 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45f809f1-c832-4c74-babb-397dce1a02e1" containerName="init"
Oct 08 14:35:16 crc kubenswrapper[4735]: I1008 14:35:16.113258 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="45f809f1-c832-4c74-babb-397dce1a02e1" containerName="init"
Oct 08 14:35:16 crc kubenswrapper[4735]: E1008 14:35:16.113274 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45f809f1-c832-4c74-babb-397dce1a02e1" containerName="octavia-db-sync"
Oct 08 14:35:16 crc kubenswrapper[4735]: I1008 14:35:16.113284 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="45f809f1-c832-4c74-babb-397dce1a02e1" containerName="octavia-db-sync"
Oct 08 14:35:16 crc kubenswrapper[4735]: I1008 14:35:16.113555 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="45f809f1-c832-4c74-babb-397dce1a02e1" containerName="octavia-db-sync"
Oct 08 14:35:16 crc kubenswrapper[4735]: I1008 14:35:16.115424 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vwfkp"
Oct 08 14:35:16 crc kubenswrapper[4735]: I1008 14:35:16.125259 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vwfkp"]
Oct 08 14:35:16 crc kubenswrapper[4735]: I1008 14:35:16.278737 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a137853-08fe-4a8b-b976-a08cc6660b76-catalog-content\") pod \"community-operators-vwfkp\" (UID: \"8a137853-08fe-4a8b-b976-a08cc6660b76\") " pod="openshift-marketplace/community-operators-vwfkp"
Oct 08 14:35:16 crc kubenswrapper[4735]: I1008 14:35:16.278937 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a137853-08fe-4a8b-b976-a08cc6660b76-utilities\") pod \"community-operators-vwfkp\" (UID: \"8a137853-08fe-4a8b-b976-a08cc6660b76\") " pod="openshift-marketplace/community-operators-vwfkp"
Oct 08 14:35:16 crc kubenswrapper[4735]: I1008 14:35:16.279030 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvm5t\" (UniqueName: \"kubernetes.io/projected/8a137853-08fe-4a8b-b976-a08cc6660b76-kube-api-access-lvm5t\") pod \"community-operators-vwfkp\" (UID: \"8a137853-08fe-4a8b-b976-a08cc6660b76\") " pod="openshift-marketplace/community-operators-vwfkp"
Oct 08 14:35:16 crc kubenswrapper[4735]: I1008 14:35:16.381403 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a137853-08fe-4a8b-b976-a08cc6660b76-catalog-content\") pod \"community-operators-vwfkp\" (UID: \"8a137853-08fe-4a8b-b976-a08cc6660b76\") " pod="openshift-marketplace/community-operators-vwfkp"
Oct 08 14:35:16 crc kubenswrapper[4735]: I1008 14:35:16.381629 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a137853-08fe-4a8b-b976-a08cc6660b76-utilities\") pod \"community-operators-vwfkp\" (UID: \"8a137853-08fe-4a8b-b976-a08cc6660b76\") " pod="openshift-marketplace/community-operators-vwfkp"
Oct 08 14:35:16 crc kubenswrapper[4735]: I1008 14:35:16.381682 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvm5t\" (UniqueName: \"kubernetes.io/projected/8a137853-08fe-4a8b-b976-a08cc6660b76-kube-api-access-lvm5t\") pod \"community-operators-vwfkp\" (UID: \"8a137853-08fe-4a8b-b976-a08cc6660b76\") " pod="openshift-marketplace/community-operators-vwfkp"
Oct 08 14:35:16 crc kubenswrapper[4735]: I1008 14:35:16.382456 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a137853-08fe-4a8b-b976-a08cc6660b76-catalog-content\") pod \"community-operators-vwfkp\" (UID: \"8a137853-08fe-4a8b-b976-a08cc6660b76\") " pod="openshift-marketplace/community-operators-vwfkp"
Oct 08 14:35:16 crc kubenswrapper[4735]: I1008 14:35:16.382747 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a137853-08fe-4a8b-b976-a08cc6660b76-utilities\") pod \"community-operators-vwfkp\" (UID: \"8a137853-08fe-4a8b-b976-a08cc6660b76\") " pod="openshift-marketplace/community-operators-vwfkp"
Oct 08 14:35:16 crc kubenswrapper[4735]: I1008 14:35:16.420227 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvm5t\" (UniqueName: \"kubernetes.io/projected/8a137853-08fe-4a8b-b976-a08cc6660b76-kube-api-access-lvm5t\") pod \"community-operators-vwfkp\" (UID: \"8a137853-08fe-4a8b-b976-a08cc6660b76\") " pod="openshift-marketplace/community-operators-vwfkp"
Oct 08 14:35:16 crc kubenswrapper[4735]: I1008 14:35:16.448897 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vwfkp"
Oct 08 14:35:16 crc kubenswrapper[4735]: I1008 14:35:16.906921 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d20f8ae-b94e-46da-b87a-d0f1dd60446a" path="/var/lib/kubelet/pods/5d20f8ae-b94e-46da-b87a-d0f1dd60446a/volumes"
Oct 08 14:35:16 crc kubenswrapper[4735]: I1008 14:35:16.961927 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vwfkp"]
Oct 08 14:35:17 crc kubenswrapper[4735]: I1008 14:35:17.916239 4735 generic.go:334] "Generic (PLEG): container finished" podID="8a137853-08fe-4a8b-b976-a08cc6660b76" containerID="03b6b85fd7c1d47bacacf21dc3e624a725e89e1ea7000ac3c0a24573323d488c" exitCode=0
Oct 08 14:35:17 crc kubenswrapper[4735]: I1008 14:35:17.916314 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vwfkp" event={"ID":"8a137853-08fe-4a8b-b976-a08cc6660b76","Type":"ContainerDied","Data":"03b6b85fd7c1d47bacacf21dc3e624a725e89e1ea7000ac3c0a24573323d488c"}
Oct 08 14:35:17 crc kubenswrapper[4735]: I1008 14:35:17.916886 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vwfkp" event={"ID":"8a137853-08fe-4a8b-b976-a08cc6660b76","Type":"ContainerStarted","Data":"c7af3805d70108cb24e274cd0ac20844abcafb4e16ceff90e6c1d014e96a413b"}
Oct 08 14:35:18 crc kubenswrapper[4735]: I1008 14:35:18.929632 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vwfkp" event={"ID":"8a137853-08fe-4a8b-b976-a08cc6660b76","Type":"ContainerStarted","Data":"3bae9346962ec45303a87945b16c3e730d834f7bff2eea1b2b2894d5b3e29ad1"}
Oct 08 14:35:19 crc kubenswrapper[4735]: I1008 14:35:19.939433 4735 generic.go:334] "Generic (PLEG): container finished" podID="8a137853-08fe-4a8b-b976-a08cc6660b76" containerID="3bae9346962ec45303a87945b16c3e730d834f7bff2eea1b2b2894d5b3e29ad1" exitCode=0
Oct 08 14:35:19 crc kubenswrapper[4735]: I1008 14:35:19.939588 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vwfkp" event={"ID":"8a137853-08fe-4a8b-b976-a08cc6660b76","Type":"ContainerDied","Data":"3bae9346962ec45303a87945b16c3e730d834f7bff2eea1b2b2894d5b3e29ad1"}
Oct 08 14:35:20 crc kubenswrapper[4735]: I1008 14:35:20.951782 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vwfkp" event={"ID":"8a137853-08fe-4a8b-b976-a08cc6660b76","Type":"ContainerStarted","Data":"fcc698ef1dc9154f34a85271df9059a3c60f4afe647073c8d9e5618e31af0880"}
Oct 08 14:35:20 crc kubenswrapper[4735]: I1008 14:35:20.972224 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vwfkp" podStartSLOduration=2.399208817 podStartE2EDuration="4.972205404s" podCreationTimestamp="2025-10-08 14:35:16 +0000 UTC" firstStartedPulling="2025-10-08 14:35:17.918427145 +0000 UTC m=+5967.726731699" lastFinishedPulling="2025-10-08 14:35:20.491423732 +0000 UTC m=+5970.299728286" observedRunningTime="2025-10-08 14:35:20.966037256 +0000 UTC m=+5970.774341810" watchObservedRunningTime="2025-10-08 14:35:20.972205404 +0000 UTC m=+5970.780509948"
Oct 08 14:35:22 crc kubenswrapper[4735]: I1008 14:35:22.034948 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-gdwxj"]
Oct 08 14:35:22 crc kubenswrapper[4735]: I1008 14:35:22.045052 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-gdwxj"]
Oct 08 14:35:22 crc kubenswrapper[4735]: I1008 14:35:22.906923 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb6a010f-d416-40b7-9abf-f6f5d75e57bc" path="/var/lib/kubelet/pods/fb6a010f-d416-40b7-9abf-f6f5d75e57bc/volumes"
Oct 08 14:35:24 crc kubenswrapper[4735]: I1008 14:35:24.878146 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 08 14:35:24 crc kubenswrapper[4735]: I1008 14:35:24.878545 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 08 14:35:26 crc kubenswrapper[4735]: I1008 14:35:26.231104 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-678599687f-wtg2c"]
Oct 08 14:35:26 crc kubenswrapper[4735]: I1008 14:35:26.231721 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/octavia-image-upload-678599687f-wtg2c" podUID="ec1b5d17-d369-48de-91b5-5fed7a0b186e" containerName="octavia-amphora-httpd" containerID="cri-o://ab96a4f389f1fd1b380b032245996fb6aa2708d303ef419d8a230c24119dfdc9" gracePeriod=30
Oct 08 14:35:26 crc kubenswrapper[4735]: I1008 14:35:26.449430 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vwfkp"
Oct 08 14:35:26 crc kubenswrapper[4735]: I1008 14:35:26.449474 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vwfkp"
Oct 08 14:35:26 crc kubenswrapper[4735]: I1008 14:35:26.976752 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-678599687f-wtg2c"
Oct 08 14:35:27 crc kubenswrapper[4735]: I1008 14:35:27.013349 4735 generic.go:334] "Generic (PLEG): container finished" podID="ec1b5d17-d369-48de-91b5-5fed7a0b186e" containerID="ab96a4f389f1fd1b380b032245996fb6aa2708d303ef419d8a230c24119dfdc9" exitCode=0
Oct 08 14:35:27 crc kubenswrapper[4735]: I1008 14:35:27.013396 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-678599687f-wtg2c" event={"ID":"ec1b5d17-d369-48de-91b5-5fed7a0b186e","Type":"ContainerDied","Data":"ab96a4f389f1fd1b380b032245996fb6aa2708d303ef419d8a230c24119dfdc9"}
Oct 08 14:35:27 crc kubenswrapper[4735]: I1008 14:35:27.013421 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-678599687f-wtg2c" event={"ID":"ec1b5d17-d369-48de-91b5-5fed7a0b186e","Type":"ContainerDied","Data":"df9c526ae974ffe6f8c097633514a46a7c873dfde0ef63df47362ae9bc763af2"}
Oct 08 14:35:27 crc kubenswrapper[4735]: I1008 14:35:27.013437 4735 scope.go:117] "RemoveContainer" containerID="ab96a4f389f1fd1b380b032245996fb6aa2708d303ef419d8a230c24119dfdc9"
Oct 08 14:35:27 crc kubenswrapper[4735]: I1008 14:35:27.013609 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-678599687f-wtg2c"
Oct 08 14:35:27 crc kubenswrapper[4735]: I1008 14:35:27.041920 4735 scope.go:117] "RemoveContainer" containerID="12172b7728cd3c764ba931fbfc4d09992d4cc6aac24a6ab8a4a7d8ab861f9af3"
Oct 08 14:35:27 crc kubenswrapper[4735]: I1008 14:35:27.066631 4735 scope.go:117] "RemoveContainer" containerID="ab96a4f389f1fd1b380b032245996fb6aa2708d303ef419d8a230c24119dfdc9"
Oct 08 14:35:27 crc kubenswrapper[4735]: E1008 14:35:27.067361 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab96a4f389f1fd1b380b032245996fb6aa2708d303ef419d8a230c24119dfdc9\": container with ID starting with ab96a4f389f1fd1b380b032245996fb6aa2708d303ef419d8a230c24119dfdc9 not found: ID does not exist" containerID="ab96a4f389f1fd1b380b032245996fb6aa2708d303ef419d8a230c24119dfdc9"
Oct 08 14:35:27 crc kubenswrapper[4735]: I1008 14:35:27.067417 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab96a4f389f1fd1b380b032245996fb6aa2708d303ef419d8a230c24119dfdc9"} err="failed to get container status \"ab96a4f389f1fd1b380b032245996fb6aa2708d303ef419d8a230c24119dfdc9\": rpc error: code = NotFound desc = could not find container \"ab96a4f389f1fd1b380b032245996fb6aa2708d303ef419d8a230c24119dfdc9\": container with ID starting with ab96a4f389f1fd1b380b032245996fb6aa2708d303ef419d8a230c24119dfdc9 not found: ID does not exist"
Oct 08 14:35:27 crc kubenswrapper[4735]: I1008 14:35:27.067452 4735 scope.go:117] "RemoveContainer" containerID="12172b7728cd3c764ba931fbfc4d09992d4cc6aac24a6ab8a4a7d8ab861f9af3"
Oct 08 14:35:27 crc kubenswrapper[4735]: E1008 14:35:27.070849 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12172b7728cd3c764ba931fbfc4d09992d4cc6aac24a6ab8a4a7d8ab861f9af3\": container with ID starting with 12172b7728cd3c764ba931fbfc4d09992d4cc6aac24a6ab8a4a7d8ab861f9af3 not found: ID does not exist" containerID="12172b7728cd3c764ba931fbfc4d09992d4cc6aac24a6ab8a4a7d8ab861f9af3"
Oct 08 14:35:27 crc kubenswrapper[4735]: I1008 14:35:27.070886 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12172b7728cd3c764ba931fbfc4d09992d4cc6aac24a6ab8a4a7d8ab861f9af3"} err="failed to get container status \"12172b7728cd3c764ba931fbfc4d09992d4cc6aac24a6ab8a4a7d8ab861f9af3\": rpc error: code = NotFound desc = could not find container \"12172b7728cd3c764ba931fbfc4d09992d4cc6aac24a6ab8a4a7d8ab861f9af3\": container with ID starting with 12172b7728cd3c764ba931fbfc4d09992d4cc6aac24a6ab8a4a7d8ab861f9af3 not found: ID does not exist"
Oct 08 14:35:27 crc kubenswrapper[4735]: I1008 14:35:27.144811 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ec1b5d17-d369-48de-91b5-5fed7a0b186e-httpd-config\") pod \"ec1b5d17-d369-48de-91b5-5fed7a0b186e\" (UID: \"ec1b5d17-d369-48de-91b5-5fed7a0b186e\") "
Oct 08 14:35:27 crc kubenswrapper[4735]: I1008 14:35:27.145056 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/ec1b5d17-d369-48de-91b5-5fed7a0b186e-amphora-image\") pod \"ec1b5d17-d369-48de-91b5-5fed7a0b186e\" (UID: \"ec1b5d17-d369-48de-91b5-5fed7a0b186e\") "
Oct 08 14:35:27 crc kubenswrapper[4735]: I1008 14:35:27.182629 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec1b5d17-d369-48de-91b5-5fed7a0b186e-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "ec1b5d17-d369-48de-91b5-5fed7a0b186e" (UID: "ec1b5d17-d369-48de-91b5-5fed7a0b186e"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 14:35:27 crc kubenswrapper[4735]: I1008 14:35:27.219673 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec1b5d17-d369-48de-91b5-5fed7a0b186e-amphora-image" (OuterVolumeSpecName: "amphora-image") pod "ec1b5d17-d369-48de-91b5-5fed7a0b186e" (UID: "ec1b5d17-d369-48de-91b5-5fed7a0b186e"). InnerVolumeSpecName "amphora-image". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 08 14:35:27 crc kubenswrapper[4735]: I1008 14:35:27.248274 4735 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ec1b5d17-d369-48de-91b5-5fed7a0b186e-httpd-config\") on node \"crc\" DevicePath \"\""
Oct 08 14:35:27 crc kubenswrapper[4735]: I1008 14:35:27.250013 4735 reconciler_common.go:293] "Volume detached for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/ec1b5d17-d369-48de-91b5-5fed7a0b186e-amphora-image\") on node \"crc\" DevicePath \"\""
Oct 08 14:35:27 crc kubenswrapper[4735]: I1008 14:35:27.346632 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-678599687f-wtg2c"]
Oct 08 14:35:27 crc kubenswrapper[4735]: I1008 14:35:27.353980 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-image-upload-678599687f-wtg2c"]
Oct 08 14:35:27 crc kubenswrapper[4735]: I1008 14:35:27.493771 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-vwfkp" podUID="8a137853-08fe-4a8b-b976-a08cc6660b76" containerName="registry-server" probeResult="failure" output=<
Oct 08 14:35:27 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s
Oct 08 14:35:27 crc kubenswrapper[4735]: >
Oct 08 14:35:28 crc kubenswrapper[4735]: I1008 14:35:28.907314 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec1b5d17-d369-48de-91b5-5fed7a0b186e" path="/var/lib/kubelet/pods/ec1b5d17-d369-48de-91b5-5fed7a0b186e/volumes"
Oct 08 14:35:31 crc kubenswrapper[4735]: I1008 14:35:31.986018 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-image-upload-678599687f-6nfqr"]
Oct 08 14:35:31 crc kubenswrapper[4735]: E1008 14:35:31.986970 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec1b5d17-d369-48de-91b5-5fed7a0b186e" containerName="init"
Oct 08 14:35:31 crc kubenswrapper[4735]: I1008 14:35:31.986988 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec1b5d17-d369-48de-91b5-5fed7a0b186e" containerName="init"
Oct 08 14:35:31 crc kubenswrapper[4735]: E1008 14:35:31.987025 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec1b5d17-d369-48de-91b5-5fed7a0b186e" containerName="octavia-amphora-httpd"
Oct 08 14:35:31 crc kubenswrapper[4735]: I1008 14:35:31.987033 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec1b5d17-d369-48de-91b5-5fed7a0b186e" containerName="octavia-amphora-httpd"
Oct 08 14:35:31 crc kubenswrapper[4735]: I1008 14:35:31.987276 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec1b5d17-d369-48de-91b5-5fed7a0b186e" containerName="octavia-amphora-httpd"
Oct 08 14:35:31 crc kubenswrapper[4735]: I1008 14:35:31.988708 4735 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/octavia-image-upload-678599687f-6nfqr" Oct 08 14:35:31 crc kubenswrapper[4735]: I1008 14:35:31.992512 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-config-data" Oct 08 14:35:31 crc kubenswrapper[4735]: I1008 14:35:31.993975 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-678599687f-6nfqr"] Oct 08 14:35:32 crc kubenswrapper[4735]: I1008 14:35:32.143207 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/5a052b4b-df0e-4abe-a6ca-fee0a0c9bbff-amphora-image\") pod \"octavia-image-upload-678599687f-6nfqr\" (UID: \"5a052b4b-df0e-4abe-a6ca-fee0a0c9bbff\") " pod="openstack/octavia-image-upload-678599687f-6nfqr" Oct 08 14:35:32 crc kubenswrapper[4735]: I1008 14:35:32.143716 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5a052b4b-df0e-4abe-a6ca-fee0a0c9bbff-httpd-config\") pod \"octavia-image-upload-678599687f-6nfqr\" (UID: \"5a052b4b-df0e-4abe-a6ca-fee0a0c9bbff\") " pod="openstack/octavia-image-upload-678599687f-6nfqr" Oct 08 14:35:32 crc kubenswrapper[4735]: I1008 14:35:32.246378 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/5a052b4b-df0e-4abe-a6ca-fee0a0c9bbff-amphora-image\") pod \"octavia-image-upload-678599687f-6nfqr\" (UID: \"5a052b4b-df0e-4abe-a6ca-fee0a0c9bbff\") " pod="openstack/octavia-image-upload-678599687f-6nfqr" Oct 08 14:35:32 crc kubenswrapper[4735]: I1008 14:35:32.246445 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5a052b4b-df0e-4abe-a6ca-fee0a0c9bbff-httpd-config\") pod \"octavia-image-upload-678599687f-6nfqr\" (UID: 
\"5a052b4b-df0e-4abe-a6ca-fee0a0c9bbff\") " pod="openstack/octavia-image-upload-678599687f-6nfqr" Oct 08 14:35:32 crc kubenswrapper[4735]: I1008 14:35:32.246847 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/5a052b4b-df0e-4abe-a6ca-fee0a0c9bbff-amphora-image\") pod \"octavia-image-upload-678599687f-6nfqr\" (UID: \"5a052b4b-df0e-4abe-a6ca-fee0a0c9bbff\") " pod="openstack/octavia-image-upload-678599687f-6nfqr" Oct 08 14:35:32 crc kubenswrapper[4735]: I1008 14:35:32.253612 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5a052b4b-df0e-4abe-a6ca-fee0a0c9bbff-httpd-config\") pod \"octavia-image-upload-678599687f-6nfqr\" (UID: \"5a052b4b-df0e-4abe-a6ca-fee0a0c9bbff\") " pod="openstack/octavia-image-upload-678599687f-6nfqr" Oct 08 14:35:32 crc kubenswrapper[4735]: I1008 14:35:32.313414 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-678599687f-6nfqr" Oct 08 14:35:32 crc kubenswrapper[4735]: I1008 14:35:32.787071 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-678599687f-6nfqr"] Oct 08 14:35:33 crc kubenswrapper[4735]: I1008 14:35:33.102808 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-678599687f-6nfqr" event={"ID":"5a052b4b-df0e-4abe-a6ca-fee0a0c9bbff","Type":"ContainerStarted","Data":"ad292b6f42324c52f6cd692d0f97d33b9009ea8034d9dc85ef686d065750f009"} Oct 08 14:35:35 crc kubenswrapper[4735]: I1008 14:35:35.127203 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-678599687f-6nfqr" event={"ID":"5a052b4b-df0e-4abe-a6ca-fee0a0c9bbff","Type":"ContainerStarted","Data":"7855e79be5c412e27e03847a0aa8025dd9e43060d0050a53f0279688ebc9666b"} Oct 08 14:35:36 crc kubenswrapper[4735]: I1008 14:35:36.149608 4735 generic.go:334] "Generic 
(PLEG): container finished" podID="5a052b4b-df0e-4abe-a6ca-fee0a0c9bbff" containerID="7855e79be5c412e27e03847a0aa8025dd9e43060d0050a53f0279688ebc9666b" exitCode=0 Oct 08 14:35:36 crc kubenswrapper[4735]: I1008 14:35:36.150062 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-678599687f-6nfqr" event={"ID":"5a052b4b-df0e-4abe-a6ca-fee0a0c9bbff","Type":"ContainerDied","Data":"7855e79be5c412e27e03847a0aa8025dd9e43060d0050a53f0279688ebc9666b"} Oct 08 14:35:37 crc kubenswrapper[4735]: I1008 14:35:37.171703 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-678599687f-6nfqr" event={"ID":"5a052b4b-df0e-4abe-a6ca-fee0a0c9bbff","Type":"ContainerStarted","Data":"91fd0593515bd829e06465f55d1eb51cc35f877bff157796694290eaedb8af89"} Oct 08 14:35:37 crc kubenswrapper[4735]: I1008 14:35:37.189605 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-image-upload-678599687f-6nfqr" podStartSLOduration=4.2584309000000005 podStartE2EDuration="6.189585252s" podCreationTimestamp="2025-10-08 14:35:31 +0000 UTC" firstStartedPulling="2025-10-08 14:35:32.809609341 +0000 UTC m=+5982.617913885" lastFinishedPulling="2025-10-08 14:35:34.740763693 +0000 UTC m=+5984.549068237" observedRunningTime="2025-10-08 14:35:37.186432881 +0000 UTC m=+5986.994737435" watchObservedRunningTime="2025-10-08 14:35:37.189585252 +0000 UTC m=+5986.997889796" Oct 08 14:35:37 crc kubenswrapper[4735]: I1008 14:35:37.502202 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-vwfkp" podUID="8a137853-08fe-4a8b-b976-a08cc6660b76" containerName="registry-server" probeResult="failure" output=< Oct 08 14:35:37 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Oct 08 14:35:37 crc kubenswrapper[4735]: > Oct 08 14:35:46 crc kubenswrapper[4735]: I1008 14:35:46.505637 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-marketplace/community-operators-vwfkp" Oct 08 14:35:46 crc kubenswrapper[4735]: I1008 14:35:46.559124 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vwfkp" Oct 08 14:35:46 crc kubenswrapper[4735]: I1008 14:35:46.742848 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vwfkp"] Oct 08 14:35:48 crc kubenswrapper[4735]: I1008 14:35:48.287576 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vwfkp" podUID="8a137853-08fe-4a8b-b976-a08cc6660b76" containerName="registry-server" containerID="cri-o://fcc698ef1dc9154f34a85271df9059a3c60f4afe647073c8d9e5618e31af0880" gracePeriod=2 Oct 08 14:35:48 crc kubenswrapper[4735]: I1008 14:35:48.782110 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vwfkp" Oct 08 14:35:48 crc kubenswrapper[4735]: I1008 14:35:48.882223 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a137853-08fe-4a8b-b976-a08cc6660b76-catalog-content\") pod \"8a137853-08fe-4a8b-b976-a08cc6660b76\" (UID: \"8a137853-08fe-4a8b-b976-a08cc6660b76\") " Oct 08 14:35:48 crc kubenswrapper[4735]: I1008 14:35:48.882438 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvm5t\" (UniqueName: \"kubernetes.io/projected/8a137853-08fe-4a8b-b976-a08cc6660b76-kube-api-access-lvm5t\") pod \"8a137853-08fe-4a8b-b976-a08cc6660b76\" (UID: \"8a137853-08fe-4a8b-b976-a08cc6660b76\") " Oct 08 14:35:48 crc kubenswrapper[4735]: I1008 14:35:48.882606 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a137853-08fe-4a8b-b976-a08cc6660b76-utilities\") pod 
\"8a137853-08fe-4a8b-b976-a08cc6660b76\" (UID: \"8a137853-08fe-4a8b-b976-a08cc6660b76\") " Oct 08 14:35:48 crc kubenswrapper[4735]: I1008 14:35:48.883449 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a137853-08fe-4a8b-b976-a08cc6660b76-utilities" (OuterVolumeSpecName: "utilities") pod "8a137853-08fe-4a8b-b976-a08cc6660b76" (UID: "8a137853-08fe-4a8b-b976-a08cc6660b76"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 14:35:48 crc kubenswrapper[4735]: I1008 14:35:48.884012 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a137853-08fe-4a8b-b976-a08cc6660b76-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 14:35:48 crc kubenswrapper[4735]: I1008 14:35:48.887835 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a137853-08fe-4a8b-b976-a08cc6660b76-kube-api-access-lvm5t" (OuterVolumeSpecName: "kube-api-access-lvm5t") pod "8a137853-08fe-4a8b-b976-a08cc6660b76" (UID: "8a137853-08fe-4a8b-b976-a08cc6660b76"). InnerVolumeSpecName "kube-api-access-lvm5t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:35:48 crc kubenswrapper[4735]: I1008 14:35:48.940185 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a137853-08fe-4a8b-b976-a08cc6660b76-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8a137853-08fe-4a8b-b976-a08cc6660b76" (UID: "8a137853-08fe-4a8b-b976-a08cc6660b76"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 14:35:48 crc kubenswrapper[4735]: I1008 14:35:48.985582 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvm5t\" (UniqueName: \"kubernetes.io/projected/8a137853-08fe-4a8b-b976-a08cc6660b76-kube-api-access-lvm5t\") on node \"crc\" DevicePath \"\"" Oct 08 14:35:48 crc kubenswrapper[4735]: I1008 14:35:48.985617 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a137853-08fe-4a8b-b976-a08cc6660b76-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 14:35:49 crc kubenswrapper[4735]: I1008 14:35:49.315625 4735 generic.go:334] "Generic (PLEG): container finished" podID="8a137853-08fe-4a8b-b976-a08cc6660b76" containerID="fcc698ef1dc9154f34a85271df9059a3c60f4afe647073c8d9e5618e31af0880" exitCode=0 Oct 08 14:35:49 crc kubenswrapper[4735]: I1008 14:35:49.315674 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vwfkp" event={"ID":"8a137853-08fe-4a8b-b976-a08cc6660b76","Type":"ContainerDied","Data":"fcc698ef1dc9154f34a85271df9059a3c60f4afe647073c8d9e5618e31af0880"} Oct 08 14:35:49 crc kubenswrapper[4735]: I1008 14:35:49.315708 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vwfkp" event={"ID":"8a137853-08fe-4a8b-b976-a08cc6660b76","Type":"ContainerDied","Data":"c7af3805d70108cb24e274cd0ac20844abcafb4e16ceff90e6c1d014e96a413b"} Oct 08 14:35:49 crc kubenswrapper[4735]: I1008 14:35:49.315727 4735 scope.go:117] "RemoveContainer" containerID="fcc698ef1dc9154f34a85271df9059a3c60f4afe647073c8d9e5618e31af0880" Oct 08 14:35:49 crc kubenswrapper[4735]: I1008 14:35:49.315973 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vwfkp" Oct 08 14:35:49 crc kubenswrapper[4735]: I1008 14:35:49.344935 4735 scope.go:117] "RemoveContainer" containerID="3bae9346962ec45303a87945b16c3e730d834f7bff2eea1b2b2894d5b3e29ad1" Oct 08 14:35:49 crc kubenswrapper[4735]: I1008 14:35:49.357400 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vwfkp"] Oct 08 14:35:49 crc kubenswrapper[4735]: I1008 14:35:49.366113 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vwfkp"] Oct 08 14:35:49 crc kubenswrapper[4735]: I1008 14:35:49.370159 4735 scope.go:117] "RemoveContainer" containerID="03b6b85fd7c1d47bacacf21dc3e624a725e89e1ea7000ac3c0a24573323d488c" Oct 08 14:35:49 crc kubenswrapper[4735]: I1008 14:35:49.417805 4735 scope.go:117] "RemoveContainer" containerID="fcc698ef1dc9154f34a85271df9059a3c60f4afe647073c8d9e5618e31af0880" Oct 08 14:35:49 crc kubenswrapper[4735]: E1008 14:35:49.418573 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fcc698ef1dc9154f34a85271df9059a3c60f4afe647073c8d9e5618e31af0880\": container with ID starting with fcc698ef1dc9154f34a85271df9059a3c60f4afe647073c8d9e5618e31af0880 not found: ID does not exist" containerID="fcc698ef1dc9154f34a85271df9059a3c60f4afe647073c8d9e5618e31af0880" Oct 08 14:35:49 crc kubenswrapper[4735]: I1008 14:35:49.418611 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcc698ef1dc9154f34a85271df9059a3c60f4afe647073c8d9e5618e31af0880"} err="failed to get container status \"fcc698ef1dc9154f34a85271df9059a3c60f4afe647073c8d9e5618e31af0880\": rpc error: code = NotFound desc = could not find container \"fcc698ef1dc9154f34a85271df9059a3c60f4afe647073c8d9e5618e31af0880\": container with ID starting with fcc698ef1dc9154f34a85271df9059a3c60f4afe647073c8d9e5618e31af0880 not 
found: ID does not exist" Oct 08 14:35:49 crc kubenswrapper[4735]: I1008 14:35:49.418637 4735 scope.go:117] "RemoveContainer" containerID="3bae9346962ec45303a87945b16c3e730d834f7bff2eea1b2b2894d5b3e29ad1" Oct 08 14:35:49 crc kubenswrapper[4735]: E1008 14:35:49.419068 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bae9346962ec45303a87945b16c3e730d834f7bff2eea1b2b2894d5b3e29ad1\": container with ID starting with 3bae9346962ec45303a87945b16c3e730d834f7bff2eea1b2b2894d5b3e29ad1 not found: ID does not exist" containerID="3bae9346962ec45303a87945b16c3e730d834f7bff2eea1b2b2894d5b3e29ad1" Oct 08 14:35:49 crc kubenswrapper[4735]: I1008 14:35:49.419136 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bae9346962ec45303a87945b16c3e730d834f7bff2eea1b2b2894d5b3e29ad1"} err="failed to get container status \"3bae9346962ec45303a87945b16c3e730d834f7bff2eea1b2b2894d5b3e29ad1\": rpc error: code = NotFound desc = could not find container \"3bae9346962ec45303a87945b16c3e730d834f7bff2eea1b2b2894d5b3e29ad1\": container with ID starting with 3bae9346962ec45303a87945b16c3e730d834f7bff2eea1b2b2894d5b3e29ad1 not found: ID does not exist" Oct 08 14:35:49 crc kubenswrapper[4735]: I1008 14:35:49.419171 4735 scope.go:117] "RemoveContainer" containerID="03b6b85fd7c1d47bacacf21dc3e624a725e89e1ea7000ac3c0a24573323d488c" Oct 08 14:35:49 crc kubenswrapper[4735]: E1008 14:35:49.419586 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03b6b85fd7c1d47bacacf21dc3e624a725e89e1ea7000ac3c0a24573323d488c\": container with ID starting with 03b6b85fd7c1d47bacacf21dc3e624a725e89e1ea7000ac3c0a24573323d488c not found: ID does not exist" containerID="03b6b85fd7c1d47bacacf21dc3e624a725e89e1ea7000ac3c0a24573323d488c" Oct 08 14:35:49 crc kubenswrapper[4735]: I1008 14:35:49.419618 4735 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03b6b85fd7c1d47bacacf21dc3e624a725e89e1ea7000ac3c0a24573323d488c"} err="failed to get container status \"03b6b85fd7c1d47bacacf21dc3e624a725e89e1ea7000ac3c0a24573323d488c\": rpc error: code = NotFound desc = could not find container \"03b6b85fd7c1d47bacacf21dc3e624a725e89e1ea7000ac3c0a24573323d488c\": container with ID starting with 03b6b85fd7c1d47bacacf21dc3e624a725e89e1ea7000ac3c0a24573323d488c not found: ID does not exist" Oct 08 14:35:50 crc kubenswrapper[4735]: I1008 14:35:50.916959 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a137853-08fe-4a8b-b976-a08cc6660b76" path="/var/lib/kubelet/pods/8a137853-08fe-4a8b-b976-a08cc6660b76/volumes" Oct 08 14:35:53 crc kubenswrapper[4735]: I1008 14:35:53.048631 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-7h8rr"] Oct 08 14:35:53 crc kubenswrapper[4735]: I1008 14:35:53.057766 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-7h8rr"] Oct 08 14:35:54 crc kubenswrapper[4735]: I1008 14:35:54.879326 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 14:35:54 crc kubenswrapper[4735]: I1008 14:35:54.879730 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 14:35:54 crc kubenswrapper[4735]: I1008 14:35:54.879800 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" Oct 08 14:35:54 crc kubenswrapper[4735]: I1008 14:35:54.880712 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0ef7ce1bb2f88774957446f907a55957b94dad288444fb6f22993a47ba31f8b0"} pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 14:35:54 crc kubenswrapper[4735]: I1008 14:35:54.880773 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" containerID="cri-o://0ef7ce1bb2f88774957446f907a55957b94dad288444fb6f22993a47ba31f8b0" gracePeriod=600 Oct 08 14:35:54 crc kubenswrapper[4735]: I1008 14:35:54.907372 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e95023bc-dd24-4f91-83d1-2e871d8c7ef3" path="/var/lib/kubelet/pods/e95023bc-dd24-4f91-83d1-2e871d8c7ef3/volumes" Oct 08 14:35:55 crc kubenswrapper[4735]: E1008 14:35:55.028747 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:35:55 crc kubenswrapper[4735]: I1008 14:35:55.379321 4735 generic.go:334] "Generic (PLEG): container finished" podID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerID="0ef7ce1bb2f88774957446f907a55957b94dad288444fb6f22993a47ba31f8b0" exitCode=0 Oct 08 14:35:55 crc kubenswrapper[4735]: I1008 14:35:55.379362 4735 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" event={"ID":"2dfd0474-f64a-4d3b-9990-df182d079b10","Type":"ContainerDied","Data":"0ef7ce1bb2f88774957446f907a55957b94dad288444fb6f22993a47ba31f8b0"} Oct 08 14:35:55 crc kubenswrapper[4735]: I1008 14:35:55.379394 4735 scope.go:117] "RemoveContainer" containerID="d7e72bb2e03843dd08508e1aaf6943aa7ade5521224f4749387f79d3a9ac80f3" Oct 08 14:35:55 crc kubenswrapper[4735]: I1008 14:35:55.380170 4735 scope.go:117] "RemoveContainer" containerID="0ef7ce1bb2f88774957446f907a55957b94dad288444fb6f22993a47ba31f8b0" Oct 08 14:35:55 crc kubenswrapper[4735]: E1008 14:35:55.380521 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:36:03 crc kubenswrapper[4735]: I1008 14:36:03.030549 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-41fb-account-create-d42cb"] Oct 08 14:36:03 crc kubenswrapper[4735]: I1008 14:36:03.041614 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-41fb-account-create-d42cb"] Oct 08 14:36:04 crc kubenswrapper[4735]: I1008 14:36:04.906700 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4338f1c-35c7-401d-b01c-891688326e5e" path="/var/lib/kubelet/pods/f4338f1c-35c7-401d-b01c-891688326e5e/volumes" Oct 08 14:36:06 crc kubenswrapper[4735]: I1008 14:36:06.894876 4735 scope.go:117] "RemoveContainer" containerID="0ef7ce1bb2f88774957446f907a55957b94dad288444fb6f22993a47ba31f8b0" Oct 08 14:36:06 crc kubenswrapper[4735]: E1008 14:36:06.895536 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:36:09 crc kubenswrapper[4735]: I1008 14:36:09.343394 4735 scope.go:117] "RemoveContainer" containerID="75526589dd25e9e9ec4baed21ad6d1a3e1107cc5edb87e014d09c97ee157b319" Oct 08 14:36:09 crc kubenswrapper[4735]: I1008 14:36:09.375193 4735 scope.go:117] "RemoveContainer" containerID="298137c915b8b2a1e9ee074e5de91a06c04946fe5cccb1de050e4339d69164b7" Oct 08 14:36:09 crc kubenswrapper[4735]: I1008 14:36:09.432042 4735 scope.go:117] "RemoveContainer" containerID="fe621d50861b9567c7bec747ef591bca2bee4db8ac640a3d80b99d26346164b4" Oct 08 14:36:09 crc kubenswrapper[4735]: I1008 14:36:09.455579 4735 scope.go:117] "RemoveContainer" containerID="011af5768c65232d1025dd12d7b7598b8a636dfe23290854dae0231044287aa2" Oct 08 14:36:09 crc kubenswrapper[4735]: I1008 14:36:09.506758 4735 scope.go:117] "RemoveContainer" containerID="0267607a538f33574467b59b6c1ee4cfcd59808229a1ac60dc6472b3c256316e" Oct 08 14:36:09 crc kubenswrapper[4735]: I1008 14:36:09.533792 4735 scope.go:117] "RemoveContainer" containerID="061a05e4ab595661e755871e5fb20d9078cc948dcba83a6ed3cf961ddd51ff3a" Oct 08 14:36:09 crc kubenswrapper[4735]: I1008 14:36:09.597268 4735 scope.go:117] "RemoveContainer" containerID="9fb8f0a9e183de25224e93ab0a66cf487fd7c63f33f507a14f089a22f9bd4e18" Oct 08 14:36:12 crc kubenswrapper[4735]: I1008 14:36:12.042050 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-vtn5d"] Oct 08 14:36:12 crc kubenswrapper[4735]: I1008 14:36:12.051048 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-vtn5d"] Oct 08 14:36:12 crc kubenswrapper[4735]: I1008 14:36:12.906290 4735 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="417cb1c4-d696-47ca-bab5-8ade27c31581" path="/var/lib/kubelet/pods/417cb1c4-d696-47ca-bab5-8ade27c31581/volumes" Oct 08 14:36:14 crc kubenswrapper[4735]: I1008 14:36:14.687507 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7755d99645-dbp28"] Oct 08 14:36:14 crc kubenswrapper[4735]: E1008 14:36:14.688421 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a137853-08fe-4a8b-b976-a08cc6660b76" containerName="extract-utilities" Oct 08 14:36:14 crc kubenswrapper[4735]: I1008 14:36:14.688434 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a137853-08fe-4a8b-b976-a08cc6660b76" containerName="extract-utilities" Oct 08 14:36:14 crc kubenswrapper[4735]: E1008 14:36:14.688449 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a137853-08fe-4a8b-b976-a08cc6660b76" containerName="registry-server" Oct 08 14:36:14 crc kubenswrapper[4735]: I1008 14:36:14.688454 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a137853-08fe-4a8b-b976-a08cc6660b76" containerName="registry-server" Oct 08 14:36:14 crc kubenswrapper[4735]: E1008 14:36:14.688497 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a137853-08fe-4a8b-b976-a08cc6660b76" containerName="extract-content" Oct 08 14:36:14 crc kubenswrapper[4735]: I1008 14:36:14.688503 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a137853-08fe-4a8b-b976-a08cc6660b76" containerName="extract-content" Oct 08 14:36:14 crc kubenswrapper[4735]: I1008 14:36:14.688678 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a137853-08fe-4a8b-b976-a08cc6660b76" containerName="registry-server" Oct 08 14:36:14 crc kubenswrapper[4735]: I1008 14:36:14.705962 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7755d99645-dbp28" Oct 08 14:36:14 crc kubenswrapper[4735]: I1008 14:36:14.708147 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Oct 08 14:36:14 crc kubenswrapper[4735]: I1008 14:36:14.708418 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-9cnn4" Oct 08 14:36:14 crc kubenswrapper[4735]: I1008 14:36:14.708706 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Oct 08 14:36:14 crc kubenswrapper[4735]: I1008 14:36:14.708770 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Oct 08 14:36:14 crc kubenswrapper[4735]: I1008 14:36:14.730343 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7755d99645-dbp28"] Oct 08 14:36:14 crc kubenswrapper[4735]: I1008 14:36:14.783534 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 14:36:14 crc kubenswrapper[4735]: I1008 14:36:14.783782 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="7f2c54f5-b89d-4293-86a5-f4bd5e3eff41" containerName="glance-log" containerID="cri-o://1ba2ce7a3b0d5486acc32c051e9dc2b90b774da758c77049913033aec1b40467" gracePeriod=30 Oct 08 14:36:14 crc kubenswrapper[4735]: I1008 14:36:14.783952 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="7f2c54f5-b89d-4293-86a5-f4bd5e3eff41" containerName="glance-httpd" containerID="cri-o://196ee201bcd7a12b0b11edb03f309cec38878c0e0166164b57faeefdd66bbfaa" gracePeriod=30 Oct 08 14:36:14 crc kubenswrapper[4735]: I1008 14:36:14.803776 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/3059d038-73f6-4388-a434-9cf8043323b5-config-data\") pod \"horizon-7755d99645-dbp28\" (UID: \"3059d038-73f6-4388-a434-9cf8043323b5\") " pod="openstack/horizon-7755d99645-dbp28" Oct 08 14:36:14 crc kubenswrapper[4735]: I1008 14:36:14.803914 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3059d038-73f6-4388-a434-9cf8043323b5-scripts\") pod \"horizon-7755d99645-dbp28\" (UID: \"3059d038-73f6-4388-a434-9cf8043323b5\") " pod="openstack/horizon-7755d99645-dbp28" Oct 08 14:36:14 crc kubenswrapper[4735]: I1008 14:36:14.803942 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3059d038-73f6-4388-a434-9cf8043323b5-logs\") pod \"horizon-7755d99645-dbp28\" (UID: \"3059d038-73f6-4388-a434-9cf8043323b5\") " pod="openstack/horizon-7755d99645-dbp28" Oct 08 14:36:14 crc kubenswrapper[4735]: I1008 14:36:14.804065 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3059d038-73f6-4388-a434-9cf8043323b5-horizon-secret-key\") pod \"horizon-7755d99645-dbp28\" (UID: \"3059d038-73f6-4388-a434-9cf8043323b5\") " pod="openstack/horizon-7755d99645-dbp28" Oct 08 14:36:14 crc kubenswrapper[4735]: I1008 14:36:14.804183 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56ljm\" (UniqueName: \"kubernetes.io/projected/3059d038-73f6-4388-a434-9cf8043323b5-kube-api-access-56ljm\") pod \"horizon-7755d99645-dbp28\" (UID: \"3059d038-73f6-4388-a434-9cf8043323b5\") " pod="openstack/horizon-7755d99645-dbp28" Oct 08 14:36:14 crc kubenswrapper[4735]: I1008 14:36:14.838681 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 14:36:14 crc kubenswrapper[4735]: I1008 
14:36:14.838986 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="45b47def-e0a2-4cd5-9ac7-fa036a784742" containerName="glance-log" containerID="cri-o://753016faca547e220e2e186780b247b2730ccd2124f4fb68a5f03aaa08372743" gracePeriod=30 Oct 08 14:36:14 crc kubenswrapper[4735]: I1008 14:36:14.840187 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="45b47def-e0a2-4cd5-9ac7-fa036a784742" containerName="glance-httpd" containerID="cri-o://30cdc921ca53cb001ea6e691a19df660f3a1910550fc8b9c3a020ec44842e0a1" gracePeriod=30 Oct 08 14:36:14 crc kubenswrapper[4735]: I1008 14:36:14.876411 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7b88cbb49c-lcpt2"] Oct 08 14:36:14 crc kubenswrapper[4735]: I1008 14:36:14.878222 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7b88cbb49c-lcpt2" Oct 08 14:36:14 crc kubenswrapper[4735]: I1008 14:36:14.890358 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7b88cbb49c-lcpt2"] Oct 08 14:36:14 crc kubenswrapper[4735]: I1008 14:36:14.926550 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3059d038-73f6-4388-a434-9cf8043323b5-horizon-secret-key\") pod \"horizon-7755d99645-dbp28\" (UID: \"3059d038-73f6-4388-a434-9cf8043323b5\") " pod="openstack/horizon-7755d99645-dbp28" Oct 08 14:36:14 crc kubenswrapper[4735]: I1008 14:36:14.926801 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56ljm\" (UniqueName: \"kubernetes.io/projected/3059d038-73f6-4388-a434-9cf8043323b5-kube-api-access-56ljm\") pod \"horizon-7755d99645-dbp28\" (UID: \"3059d038-73f6-4388-a434-9cf8043323b5\") " pod="openstack/horizon-7755d99645-dbp28" Oct 08 14:36:14 crc kubenswrapper[4735]: I1008 
14:36:14.926911 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3059d038-73f6-4388-a434-9cf8043323b5-config-data\") pod \"horizon-7755d99645-dbp28\" (UID: \"3059d038-73f6-4388-a434-9cf8043323b5\") " pod="openstack/horizon-7755d99645-dbp28" Oct 08 14:36:14 crc kubenswrapper[4735]: I1008 14:36:14.927056 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3059d038-73f6-4388-a434-9cf8043323b5-scripts\") pod \"horizon-7755d99645-dbp28\" (UID: \"3059d038-73f6-4388-a434-9cf8043323b5\") " pod="openstack/horizon-7755d99645-dbp28" Oct 08 14:36:14 crc kubenswrapper[4735]: I1008 14:36:14.927091 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3059d038-73f6-4388-a434-9cf8043323b5-logs\") pod \"horizon-7755d99645-dbp28\" (UID: \"3059d038-73f6-4388-a434-9cf8043323b5\") " pod="openstack/horizon-7755d99645-dbp28" Oct 08 14:36:14 crc kubenswrapper[4735]: I1008 14:36:14.927602 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3059d038-73f6-4388-a434-9cf8043323b5-logs\") pod \"horizon-7755d99645-dbp28\" (UID: \"3059d038-73f6-4388-a434-9cf8043323b5\") " pod="openstack/horizon-7755d99645-dbp28" Oct 08 14:36:14 crc kubenswrapper[4735]: I1008 14:36:14.929445 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3059d038-73f6-4388-a434-9cf8043323b5-config-data\") pod \"horizon-7755d99645-dbp28\" (UID: \"3059d038-73f6-4388-a434-9cf8043323b5\") " pod="openstack/horizon-7755d99645-dbp28" Oct 08 14:36:14 crc kubenswrapper[4735]: I1008 14:36:14.929957 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/3059d038-73f6-4388-a434-9cf8043323b5-scripts\") pod \"horizon-7755d99645-dbp28\" (UID: \"3059d038-73f6-4388-a434-9cf8043323b5\") " pod="openstack/horizon-7755d99645-dbp28" Oct 08 14:36:14 crc kubenswrapper[4735]: I1008 14:36:14.939615 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3059d038-73f6-4388-a434-9cf8043323b5-horizon-secret-key\") pod \"horizon-7755d99645-dbp28\" (UID: \"3059d038-73f6-4388-a434-9cf8043323b5\") " pod="openstack/horizon-7755d99645-dbp28" Oct 08 14:36:14 crc kubenswrapper[4735]: I1008 14:36:14.945657 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56ljm\" (UniqueName: \"kubernetes.io/projected/3059d038-73f6-4388-a434-9cf8043323b5-kube-api-access-56ljm\") pod \"horizon-7755d99645-dbp28\" (UID: \"3059d038-73f6-4388-a434-9cf8043323b5\") " pod="openstack/horizon-7755d99645-dbp28" Oct 08 14:36:15 crc kubenswrapper[4735]: I1008 14:36:15.028614 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a947221d-27b4-4177-8ac7-26a005f8995c-config-data\") pod \"horizon-7b88cbb49c-lcpt2\" (UID: \"a947221d-27b4-4177-8ac7-26a005f8995c\") " pod="openstack/horizon-7b88cbb49c-lcpt2" Oct 08 14:36:15 crc kubenswrapper[4735]: I1008 14:36:15.028726 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a947221d-27b4-4177-8ac7-26a005f8995c-scripts\") pod \"horizon-7b88cbb49c-lcpt2\" (UID: \"a947221d-27b4-4177-8ac7-26a005f8995c\") " pod="openstack/horizon-7b88cbb49c-lcpt2" Oct 08 14:36:15 crc kubenswrapper[4735]: I1008 14:36:15.028747 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s64hc\" (UniqueName: 
\"kubernetes.io/projected/a947221d-27b4-4177-8ac7-26a005f8995c-kube-api-access-s64hc\") pod \"horizon-7b88cbb49c-lcpt2\" (UID: \"a947221d-27b4-4177-8ac7-26a005f8995c\") " pod="openstack/horizon-7b88cbb49c-lcpt2" Oct 08 14:36:15 crc kubenswrapper[4735]: I1008 14:36:15.028793 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a947221d-27b4-4177-8ac7-26a005f8995c-horizon-secret-key\") pod \"horizon-7b88cbb49c-lcpt2\" (UID: \"a947221d-27b4-4177-8ac7-26a005f8995c\") " pod="openstack/horizon-7b88cbb49c-lcpt2" Oct 08 14:36:15 crc kubenswrapper[4735]: I1008 14:36:15.028819 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a947221d-27b4-4177-8ac7-26a005f8995c-logs\") pod \"horizon-7b88cbb49c-lcpt2\" (UID: \"a947221d-27b4-4177-8ac7-26a005f8995c\") " pod="openstack/horizon-7b88cbb49c-lcpt2" Oct 08 14:36:15 crc kubenswrapper[4735]: I1008 14:36:15.045784 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7755d99645-dbp28" Oct 08 14:36:15 crc kubenswrapper[4735]: I1008 14:36:15.130854 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a947221d-27b4-4177-8ac7-26a005f8995c-config-data\") pod \"horizon-7b88cbb49c-lcpt2\" (UID: \"a947221d-27b4-4177-8ac7-26a005f8995c\") " pod="openstack/horizon-7b88cbb49c-lcpt2" Oct 08 14:36:15 crc kubenswrapper[4735]: I1008 14:36:15.130973 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a947221d-27b4-4177-8ac7-26a005f8995c-scripts\") pod \"horizon-7b88cbb49c-lcpt2\" (UID: \"a947221d-27b4-4177-8ac7-26a005f8995c\") " pod="openstack/horizon-7b88cbb49c-lcpt2" Oct 08 14:36:15 crc kubenswrapper[4735]: I1008 14:36:15.130994 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s64hc\" (UniqueName: \"kubernetes.io/projected/a947221d-27b4-4177-8ac7-26a005f8995c-kube-api-access-s64hc\") pod \"horizon-7b88cbb49c-lcpt2\" (UID: \"a947221d-27b4-4177-8ac7-26a005f8995c\") " pod="openstack/horizon-7b88cbb49c-lcpt2" Oct 08 14:36:15 crc kubenswrapper[4735]: I1008 14:36:15.131045 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a947221d-27b4-4177-8ac7-26a005f8995c-horizon-secret-key\") pod \"horizon-7b88cbb49c-lcpt2\" (UID: \"a947221d-27b4-4177-8ac7-26a005f8995c\") " pod="openstack/horizon-7b88cbb49c-lcpt2" Oct 08 14:36:15 crc kubenswrapper[4735]: I1008 14:36:15.131079 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a947221d-27b4-4177-8ac7-26a005f8995c-logs\") pod \"horizon-7b88cbb49c-lcpt2\" (UID: \"a947221d-27b4-4177-8ac7-26a005f8995c\") " pod="openstack/horizon-7b88cbb49c-lcpt2" Oct 08 14:36:15 crc kubenswrapper[4735]: I1008 
14:36:15.131643 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a947221d-27b4-4177-8ac7-26a005f8995c-logs\") pod \"horizon-7b88cbb49c-lcpt2\" (UID: \"a947221d-27b4-4177-8ac7-26a005f8995c\") " pod="openstack/horizon-7b88cbb49c-lcpt2" Oct 08 14:36:15 crc kubenswrapper[4735]: I1008 14:36:15.131880 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a947221d-27b4-4177-8ac7-26a005f8995c-scripts\") pod \"horizon-7b88cbb49c-lcpt2\" (UID: \"a947221d-27b4-4177-8ac7-26a005f8995c\") " pod="openstack/horizon-7b88cbb49c-lcpt2" Oct 08 14:36:15 crc kubenswrapper[4735]: I1008 14:36:15.132355 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a947221d-27b4-4177-8ac7-26a005f8995c-config-data\") pod \"horizon-7b88cbb49c-lcpt2\" (UID: \"a947221d-27b4-4177-8ac7-26a005f8995c\") " pod="openstack/horizon-7b88cbb49c-lcpt2" Oct 08 14:36:15 crc kubenswrapper[4735]: I1008 14:36:15.137016 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a947221d-27b4-4177-8ac7-26a005f8995c-horizon-secret-key\") pod \"horizon-7b88cbb49c-lcpt2\" (UID: \"a947221d-27b4-4177-8ac7-26a005f8995c\") " pod="openstack/horizon-7b88cbb49c-lcpt2" Oct 08 14:36:15 crc kubenswrapper[4735]: I1008 14:36:15.153966 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s64hc\" (UniqueName: \"kubernetes.io/projected/a947221d-27b4-4177-8ac7-26a005f8995c-kube-api-access-s64hc\") pod \"horizon-7b88cbb49c-lcpt2\" (UID: \"a947221d-27b4-4177-8ac7-26a005f8995c\") " pod="openstack/horizon-7b88cbb49c-lcpt2" Oct 08 14:36:15 crc kubenswrapper[4735]: I1008 14:36:15.339538 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7b88cbb49c-lcpt2" Oct 08 14:36:15 crc kubenswrapper[4735]: I1008 14:36:15.346365 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7755d99645-dbp28"] Oct 08 14:36:15 crc kubenswrapper[4735]: I1008 14:36:15.389207 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-85dff5fbdf-tjmzh"] Oct 08 14:36:15 crc kubenswrapper[4735]: I1008 14:36:15.391286 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-85dff5fbdf-tjmzh" Oct 08 14:36:15 crc kubenswrapper[4735]: I1008 14:36:15.415797 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-85dff5fbdf-tjmzh"] Oct 08 14:36:15 crc kubenswrapper[4735]: I1008 14:36:15.437123 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7-config-data\") pod \"horizon-85dff5fbdf-tjmzh\" (UID: \"0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7\") " pod="openstack/horizon-85dff5fbdf-tjmzh" Oct 08 14:36:15 crc kubenswrapper[4735]: I1008 14:36:15.437230 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7-horizon-secret-key\") pod \"horizon-85dff5fbdf-tjmzh\" (UID: \"0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7\") " pod="openstack/horizon-85dff5fbdf-tjmzh" Oct 08 14:36:15 crc kubenswrapper[4735]: I1008 14:36:15.437272 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7-scripts\") pod \"horizon-85dff5fbdf-tjmzh\" (UID: \"0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7\") " pod="openstack/horizon-85dff5fbdf-tjmzh" Oct 08 14:36:15 crc kubenswrapper[4735]: I1008 14:36:15.437318 4735 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8986\" (UniqueName: \"kubernetes.io/projected/0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7-kube-api-access-h8986\") pod \"horizon-85dff5fbdf-tjmzh\" (UID: \"0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7\") " pod="openstack/horizon-85dff5fbdf-tjmzh" Oct 08 14:36:15 crc kubenswrapper[4735]: I1008 14:36:15.437353 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7-logs\") pod \"horizon-85dff5fbdf-tjmzh\" (UID: \"0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7\") " pod="openstack/horizon-85dff5fbdf-tjmzh" Oct 08 14:36:15 crc kubenswrapper[4735]: I1008 14:36:15.523281 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7755d99645-dbp28"] Oct 08 14:36:15 crc kubenswrapper[4735]: W1008 14:36:15.534747 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3059d038_73f6_4388_a434_9cf8043323b5.slice/crio-e6e7f494363d990afb62b8ef997b44dec8c150e26f2d8775480577c5b0efec7b WatchSource:0}: Error finding container e6e7f494363d990afb62b8ef997b44dec8c150e26f2d8775480577c5b0efec7b: Status 404 returned error can't find the container with id e6e7f494363d990afb62b8ef997b44dec8c150e26f2d8775480577c5b0efec7b Oct 08 14:36:15 crc kubenswrapper[4735]: I1008 14:36:15.538991 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7-config-data\") pod \"horizon-85dff5fbdf-tjmzh\" (UID: \"0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7\") " pod="openstack/horizon-85dff5fbdf-tjmzh" Oct 08 14:36:15 crc kubenswrapper[4735]: I1008 14:36:15.539126 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7-horizon-secret-key\") pod \"horizon-85dff5fbdf-tjmzh\" (UID: \"0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7\") " pod="openstack/horizon-85dff5fbdf-tjmzh" Oct 08 14:36:15 crc kubenswrapper[4735]: I1008 14:36:15.539163 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7-scripts\") pod \"horizon-85dff5fbdf-tjmzh\" (UID: \"0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7\") " pod="openstack/horizon-85dff5fbdf-tjmzh" Oct 08 14:36:15 crc kubenswrapper[4735]: I1008 14:36:15.539198 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8986\" (UniqueName: \"kubernetes.io/projected/0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7-kube-api-access-h8986\") pod \"horizon-85dff5fbdf-tjmzh\" (UID: \"0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7\") " pod="openstack/horizon-85dff5fbdf-tjmzh" Oct 08 14:36:15 crc kubenswrapper[4735]: I1008 14:36:15.539239 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7-logs\") pod \"horizon-85dff5fbdf-tjmzh\" (UID: \"0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7\") " pod="openstack/horizon-85dff5fbdf-tjmzh" Oct 08 14:36:15 crc kubenswrapper[4735]: I1008 14:36:15.541537 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7-logs\") pod \"horizon-85dff5fbdf-tjmzh\" (UID: \"0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7\") " pod="openstack/horizon-85dff5fbdf-tjmzh" Oct 08 14:36:15 crc kubenswrapper[4735]: I1008 14:36:15.541637 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7-config-data\") pod \"horizon-85dff5fbdf-tjmzh\" (UID: 
\"0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7\") " pod="openstack/horizon-85dff5fbdf-tjmzh" Oct 08 14:36:15 crc kubenswrapper[4735]: I1008 14:36:15.543263 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7-scripts\") pod \"horizon-85dff5fbdf-tjmzh\" (UID: \"0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7\") " pod="openstack/horizon-85dff5fbdf-tjmzh" Oct 08 14:36:15 crc kubenswrapper[4735]: I1008 14:36:15.544411 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7-horizon-secret-key\") pod \"horizon-85dff5fbdf-tjmzh\" (UID: \"0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7\") " pod="openstack/horizon-85dff5fbdf-tjmzh" Oct 08 14:36:15 crc kubenswrapper[4735]: I1008 14:36:15.558053 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8986\" (UniqueName: \"kubernetes.io/projected/0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7-kube-api-access-h8986\") pod \"horizon-85dff5fbdf-tjmzh\" (UID: \"0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7\") " pod="openstack/horizon-85dff5fbdf-tjmzh" Oct 08 14:36:15 crc kubenswrapper[4735]: I1008 14:36:15.598543 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7755d99645-dbp28" event={"ID":"3059d038-73f6-4388-a434-9cf8043323b5","Type":"ContainerStarted","Data":"e6e7f494363d990afb62b8ef997b44dec8c150e26f2d8775480577c5b0efec7b"} Oct 08 14:36:15 crc kubenswrapper[4735]: I1008 14:36:15.609261 4735 generic.go:334] "Generic (PLEG): container finished" podID="7f2c54f5-b89d-4293-86a5-f4bd5e3eff41" containerID="1ba2ce7a3b0d5486acc32c051e9dc2b90b774da758c77049913033aec1b40467" exitCode=143 Oct 08 14:36:15 crc kubenswrapper[4735]: I1008 14:36:15.609587 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"7f2c54f5-b89d-4293-86a5-f4bd5e3eff41","Type":"ContainerDied","Data":"1ba2ce7a3b0d5486acc32c051e9dc2b90b774da758c77049913033aec1b40467"} Oct 08 14:36:15 crc kubenswrapper[4735]: I1008 14:36:15.615421 4735 generic.go:334] "Generic (PLEG): container finished" podID="45b47def-e0a2-4cd5-9ac7-fa036a784742" containerID="753016faca547e220e2e186780b247b2730ccd2124f4fb68a5f03aaa08372743" exitCode=143 Oct 08 14:36:15 crc kubenswrapper[4735]: I1008 14:36:15.615488 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"45b47def-e0a2-4cd5-9ac7-fa036a784742","Type":"ContainerDied","Data":"753016faca547e220e2e186780b247b2730ccd2124f4fb68a5f03aaa08372743"} Oct 08 14:36:15 crc kubenswrapper[4735]: I1008 14:36:15.714374 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-85dff5fbdf-tjmzh" Oct 08 14:36:15 crc kubenswrapper[4735]: I1008 14:36:15.892496 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7b88cbb49c-lcpt2"] Oct 08 14:36:16 crc kubenswrapper[4735]: I1008 14:36:16.308176 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-85dff5fbdf-tjmzh"] Oct 08 14:36:16 crc kubenswrapper[4735]: W1008 14:36:16.314679 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0fbe512e_e83e_4cb7_b38b_c50d6f32c8f7.slice/crio-b56abd8ef005541dbbd9133437482f04f3480782ad9dd137dc728e052c1e4a0f WatchSource:0}: Error finding container b56abd8ef005541dbbd9133437482f04f3480782ad9dd137dc728e052c1e4a0f: Status 404 returned error can't find the container with id b56abd8ef005541dbbd9133437482f04f3480782ad9dd137dc728e052c1e4a0f Oct 08 14:36:16 crc kubenswrapper[4735]: I1008 14:36:16.628309 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b88cbb49c-lcpt2" 
event={"ID":"a947221d-27b4-4177-8ac7-26a005f8995c","Type":"ContainerStarted","Data":"9847a23056808cae3f7d1b3f770dcb92bd6b1691022c3f07c6e88b385af8153a"} Oct 08 14:36:16 crc kubenswrapper[4735]: I1008 14:36:16.630666 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-85dff5fbdf-tjmzh" event={"ID":"0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7","Type":"ContainerStarted","Data":"b56abd8ef005541dbbd9133437482f04f3480782ad9dd137dc728e052c1e4a0f"} Oct 08 14:36:17 crc kubenswrapper[4735]: I1008 14:36:17.968315 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="7f2c54f5-b89d-4293-86a5-f4bd5e3eff41" containerName="glance-httpd" probeResult="failure" output="Get \"http://10.217.1.40:9292/healthcheck\": read tcp 10.217.0.2:43776->10.217.1.40:9292: read: connection reset by peer" Oct 08 14:36:17 crc kubenswrapper[4735]: I1008 14:36:17.968351 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="7f2c54f5-b89d-4293-86a5-f4bd5e3eff41" containerName="glance-log" probeResult="failure" output="Get \"http://10.217.1.40:9292/healthcheck\": read tcp 10.217.0.2:43790->10.217.1.40:9292: read: connection reset by peer" Oct 08 14:36:18 crc kubenswrapper[4735]: I1008 14:36:18.664508 4735 generic.go:334] "Generic (PLEG): container finished" podID="7f2c54f5-b89d-4293-86a5-f4bd5e3eff41" containerID="196ee201bcd7a12b0b11edb03f309cec38878c0e0166164b57faeefdd66bbfaa" exitCode=0 Oct 08 14:36:18 crc kubenswrapper[4735]: I1008 14:36:18.664577 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7f2c54f5-b89d-4293-86a5-f4bd5e3eff41","Type":"ContainerDied","Data":"196ee201bcd7a12b0b11edb03f309cec38878c0e0166164b57faeefdd66bbfaa"} Oct 08 14:36:18 crc kubenswrapper[4735]: I1008 14:36:18.674615 4735 generic.go:334] "Generic (PLEG): container finished" podID="45b47def-e0a2-4cd5-9ac7-fa036a784742" 
containerID="30cdc921ca53cb001ea6e691a19df660f3a1910550fc8b9c3a020ec44842e0a1" exitCode=0 Oct 08 14:36:18 crc kubenswrapper[4735]: I1008 14:36:18.674673 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"45b47def-e0a2-4cd5-9ac7-fa036a784742","Type":"ContainerDied","Data":"30cdc921ca53cb001ea6e691a19df660f3a1910550fc8b9c3a020ec44842e0a1"} Oct 08 14:36:19 crc kubenswrapper[4735]: I1008 14:36:19.505622 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="45b47def-e0a2-4cd5-9ac7-fa036a784742" containerName="glance-httpd" probeResult="failure" output="Get \"http://10.217.1.41:9292/healthcheck\": dial tcp 10.217.1.41:9292: connect: connection refused" Oct 08 14:36:19 crc kubenswrapper[4735]: I1008 14:36:19.505923 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="45b47def-e0a2-4cd5-9ac7-fa036a784742" containerName="glance-log" probeResult="failure" output="Get \"http://10.217.1.41:9292/healthcheck\": dial tcp 10.217.1.41:9292: connect: connection refused" Oct 08 14:36:21 crc kubenswrapper[4735]: I1008 14:36:21.894938 4735 scope.go:117] "RemoveContainer" containerID="0ef7ce1bb2f88774957446f907a55957b94dad288444fb6f22993a47ba31f8b0" Oct 08 14:36:21 crc kubenswrapper[4735]: E1008 14:36:21.895597 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.046342 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.200285 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7f2c54f5-b89d-4293-86a5-f4bd5e3eff41-httpd-run\") pod \"7f2c54f5-b89d-4293-86a5-f4bd5e3eff41\" (UID: \"7f2c54f5-b89d-4293-86a5-f4bd5e3eff41\") " Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.201365 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f2c54f5-b89d-4293-86a5-f4bd5e3eff41-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7f2c54f5-b89d-4293-86a5-f4bd5e3eff41" (UID: "7f2c54f5-b89d-4293-86a5-f4bd5e3eff41"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.202326 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f2c54f5-b89d-4293-86a5-f4bd5e3eff41-config-data\") pod \"7f2c54f5-b89d-4293-86a5-f4bd5e3eff41\" (UID: \"7f2c54f5-b89d-4293-86a5-f4bd5e3eff41\") " Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.202501 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f2c54f5-b89d-4293-86a5-f4bd5e3eff41-combined-ca-bundle\") pod \"7f2c54f5-b89d-4293-86a5-f4bd5e3eff41\" (UID: \"7f2c54f5-b89d-4293-86a5-f4bd5e3eff41\") " Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.202553 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kvpx\" (UniqueName: \"kubernetes.io/projected/7f2c54f5-b89d-4293-86a5-f4bd5e3eff41-kube-api-access-6kvpx\") pod \"7f2c54f5-b89d-4293-86a5-f4bd5e3eff41\" (UID: \"7f2c54f5-b89d-4293-86a5-f4bd5e3eff41\") " Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.202585 4735 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f2c54f5-b89d-4293-86a5-f4bd5e3eff41-scripts\") pod \"7f2c54f5-b89d-4293-86a5-f4bd5e3eff41\" (UID: \"7f2c54f5-b89d-4293-86a5-f4bd5e3eff41\") " Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.203315 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f2c54f5-b89d-4293-86a5-f4bd5e3eff41-logs\") pod \"7f2c54f5-b89d-4293-86a5-f4bd5e3eff41\" (UID: \"7f2c54f5-b89d-4293-86a5-f4bd5e3eff41\") " Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.203347 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7f2c54f5-b89d-4293-86a5-f4bd5e3eff41-ceph\") pod \"7f2c54f5-b89d-4293-86a5-f4bd5e3eff41\" (UID: \"7f2c54f5-b89d-4293-86a5-f4bd5e3eff41\") " Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.204067 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f2c54f5-b89d-4293-86a5-f4bd5e3eff41-logs" (OuterVolumeSpecName: "logs") pod "7f2c54f5-b89d-4293-86a5-f4bd5e3eff41" (UID: "7f2c54f5-b89d-4293-86a5-f4bd5e3eff41"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.204251 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f2c54f5-b89d-4293-86a5-f4bd5e3eff41-logs\") on node \"crc\" DevicePath \"\"" Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.204278 4735 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7f2c54f5-b89d-4293-86a5-f4bd5e3eff41-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.211298 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f2c54f5-b89d-4293-86a5-f4bd5e3eff41-kube-api-access-6kvpx" (OuterVolumeSpecName: "kube-api-access-6kvpx") pod "7f2c54f5-b89d-4293-86a5-f4bd5e3eff41" (UID: "7f2c54f5-b89d-4293-86a5-f4bd5e3eff41"). InnerVolumeSpecName "kube-api-access-6kvpx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.221947 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f2c54f5-b89d-4293-86a5-f4bd5e3eff41-scripts" (OuterVolumeSpecName: "scripts") pod "7f2c54f5-b89d-4293-86a5-f4bd5e3eff41" (UID: "7f2c54f5-b89d-4293-86a5-f4bd5e3eff41"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.225298 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f2c54f5-b89d-4293-86a5-f4bd5e3eff41-ceph" (OuterVolumeSpecName: "ceph") pod "7f2c54f5-b89d-4293-86a5-f4bd5e3eff41" (UID: "7f2c54f5-b89d-4293-86a5-f4bd5e3eff41"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.251393 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f2c54f5-b89d-4293-86a5-f4bd5e3eff41-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7f2c54f5-b89d-4293-86a5-f4bd5e3eff41" (UID: "7f2c54f5-b89d-4293-86a5-f4bd5e3eff41"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.307869 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f2c54f5-b89d-4293-86a5-f4bd5e3eff41-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.307903 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kvpx\" (UniqueName: \"kubernetes.io/projected/7f2c54f5-b89d-4293-86a5-f4bd5e3eff41-kube-api-access-6kvpx\") on node \"crc\" DevicePath \"\"" Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.307919 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f2c54f5-b89d-4293-86a5-f4bd5e3eff41-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.307932 4735 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7f2c54f5-b89d-4293-86a5-f4bd5e3eff41-ceph\") on node \"crc\" DevicePath \"\"" Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.310741 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f2c54f5-b89d-4293-86a5-f4bd5e3eff41-config-data" (OuterVolumeSpecName: "config-data") pod "7f2c54f5-b89d-4293-86a5-f4bd5e3eff41" (UID: "7f2c54f5-b89d-4293-86a5-f4bd5e3eff41"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.409740 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f2c54f5-b89d-4293-86a5-f4bd5e3eff41-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.460342 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.613161 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/45b47def-e0a2-4cd5-9ac7-fa036a784742-httpd-run\") pod \"45b47def-e0a2-4cd5-9ac7-fa036a784742\" (UID: \"45b47def-e0a2-4cd5-9ac7-fa036a784742\") " Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.613323 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45b47def-e0a2-4cd5-9ac7-fa036a784742-scripts\") pod \"45b47def-e0a2-4cd5-9ac7-fa036a784742\" (UID: \"45b47def-e0a2-4cd5-9ac7-fa036a784742\") " Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.613376 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjsgf\" (UniqueName: \"kubernetes.io/projected/45b47def-e0a2-4cd5-9ac7-fa036a784742-kube-api-access-mjsgf\") pod \"45b47def-e0a2-4cd5-9ac7-fa036a784742\" (UID: \"45b47def-e0a2-4cd5-9ac7-fa036a784742\") " Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.613443 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45b47def-e0a2-4cd5-9ac7-fa036a784742-logs\") pod \"45b47def-e0a2-4cd5-9ac7-fa036a784742\" (UID: \"45b47def-e0a2-4cd5-9ac7-fa036a784742\") " Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.613492 4735 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45b47def-e0a2-4cd5-9ac7-fa036a784742-config-data\") pod \"45b47def-e0a2-4cd5-9ac7-fa036a784742\" (UID: \"45b47def-e0a2-4cd5-9ac7-fa036a784742\") " Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.613538 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45b47def-e0a2-4cd5-9ac7-fa036a784742-combined-ca-bundle\") pod \"45b47def-e0a2-4cd5-9ac7-fa036a784742\" (UID: \"45b47def-e0a2-4cd5-9ac7-fa036a784742\") " Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.613583 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/45b47def-e0a2-4cd5-9ac7-fa036a784742-ceph\") pod \"45b47def-e0a2-4cd5-9ac7-fa036a784742\" (UID: \"45b47def-e0a2-4cd5-9ac7-fa036a784742\") " Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.613981 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45b47def-e0a2-4cd5-9ac7-fa036a784742-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "45b47def-e0a2-4cd5-9ac7-fa036a784742" (UID: "45b47def-e0a2-4cd5-9ac7-fa036a784742"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.614431 4735 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/45b47def-e0a2-4cd5-9ac7-fa036a784742-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.615285 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45b47def-e0a2-4cd5-9ac7-fa036a784742-logs" (OuterVolumeSpecName: "logs") pod "45b47def-e0a2-4cd5-9ac7-fa036a784742" (UID: "45b47def-e0a2-4cd5-9ac7-fa036a784742"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.617443 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45b47def-e0a2-4cd5-9ac7-fa036a784742-scripts" (OuterVolumeSpecName: "scripts") pod "45b47def-e0a2-4cd5-9ac7-fa036a784742" (UID: "45b47def-e0a2-4cd5-9ac7-fa036a784742"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.618208 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45b47def-e0a2-4cd5-9ac7-fa036a784742-ceph" (OuterVolumeSpecName: "ceph") pod "45b47def-e0a2-4cd5-9ac7-fa036a784742" (UID: "45b47def-e0a2-4cd5-9ac7-fa036a784742"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.620137 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45b47def-e0a2-4cd5-9ac7-fa036a784742-kube-api-access-mjsgf" (OuterVolumeSpecName: "kube-api-access-mjsgf") pod "45b47def-e0a2-4cd5-9ac7-fa036a784742" (UID: "45b47def-e0a2-4cd5-9ac7-fa036a784742"). InnerVolumeSpecName "kube-api-access-mjsgf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.654200 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45b47def-e0a2-4cd5-9ac7-fa036a784742-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "45b47def-e0a2-4cd5-9ac7-fa036a784742" (UID: "45b47def-e0a2-4cd5-9ac7-fa036a784742"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.710588 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45b47def-e0a2-4cd5-9ac7-fa036a784742-config-data" (OuterVolumeSpecName: "config-data") pod "45b47def-e0a2-4cd5-9ac7-fa036a784742" (UID: "45b47def-e0a2-4cd5-9ac7-fa036a784742"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.716260 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45b47def-e0a2-4cd5-9ac7-fa036a784742-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.716289 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjsgf\" (UniqueName: \"kubernetes.io/projected/45b47def-e0a2-4cd5-9ac7-fa036a784742-kube-api-access-mjsgf\") on node \"crc\" DevicePath \"\"" Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.716309 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45b47def-e0a2-4cd5-9ac7-fa036a784742-logs\") on node \"crc\" DevicePath \"\"" Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.716318 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45b47def-e0a2-4cd5-9ac7-fa036a784742-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.716326 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45b47def-e0a2-4cd5-9ac7-fa036a784742-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.716335 4735 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/45b47def-e0a2-4cd5-9ac7-fa036a784742-ceph\") on node \"crc\" DevicePath \"\"" Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.728540 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b88cbb49c-lcpt2" event={"ID":"a947221d-27b4-4177-8ac7-26a005f8995c","Type":"ContainerStarted","Data":"35300fc03b3eed040e927910001a3d073d55c60d6dd7d227d421c3a8296977e2"} Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.728589 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b88cbb49c-lcpt2" event={"ID":"a947221d-27b4-4177-8ac7-26a005f8995c","Type":"ContainerStarted","Data":"ccc337960433d2a19babae59e11e2078e5f1b3b7043b599a411b1290984b0348"} Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.733401 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-85dff5fbdf-tjmzh" event={"ID":"0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7","Type":"ContainerStarted","Data":"ffe9a1b9bbaea760fdf0f412d097502671e78f51a2c446953cb757782335add7"} Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.733446 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-85dff5fbdf-tjmzh" event={"ID":"0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7","Type":"ContainerStarted","Data":"0994015ce49c050452ad175f68c7d3cbc0ae3b9f7474645c82e3b42c402cb60e"} Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.737522 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7755d99645-dbp28" event={"ID":"3059d038-73f6-4388-a434-9cf8043323b5","Type":"ContainerStarted","Data":"ea3c674dd07a7a490244b1ba45ce7c391f1a68dab45cda4090844c5a69df20df"} Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.737560 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7755d99645-dbp28" event={"ID":"3059d038-73f6-4388-a434-9cf8043323b5","Type":"ContainerStarted","Data":"b3b1ed0ec94eb2dddd47a1086884ee8d7a3fd7f7a15bd620067b0524eecbb276"} Oct 08 14:36:23 crc 
kubenswrapper[4735]: I1008 14:36:23.737672 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7755d99645-dbp28" podUID="3059d038-73f6-4388-a434-9cf8043323b5" containerName="horizon-log" containerID="cri-o://b3b1ed0ec94eb2dddd47a1086884ee8d7a3fd7f7a15bd620067b0524eecbb276" gracePeriod=30 Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.737930 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7755d99645-dbp28" podUID="3059d038-73f6-4388-a434-9cf8043323b5" containerName="horizon" containerID="cri-o://ea3c674dd07a7a490244b1ba45ce7c391f1a68dab45cda4090844c5a69df20df" gracePeriod=30 Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.740890 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7f2c54f5-b89d-4293-86a5-f4bd5e3eff41","Type":"ContainerDied","Data":"048eac7772ed4f7112ef49e2c9571e0d5bd1029cab4c85b81379e8c70ec0d44c"} Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.740939 4735 scope.go:117] "RemoveContainer" containerID="196ee201bcd7a12b0b11edb03f309cec38878c0e0166164b57faeefdd66bbfaa" Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.741082 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.748830 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"45b47def-e0a2-4cd5-9ac7-fa036a784742","Type":"ContainerDied","Data":"68602bd2d3ae69acca9771553e2489bcbaae3a33d7a5b5abaa74c86c88c6979c"} Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.748936 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.763333 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7b88cbb49c-lcpt2" podStartSLOduration=2.54606842 podStartE2EDuration="9.763305627s" podCreationTimestamp="2025-10-08 14:36:14 +0000 UTC" firstStartedPulling="2025-10-08 14:36:15.930294439 +0000 UTC m=+6025.738598983" lastFinishedPulling="2025-10-08 14:36:23.147531646 +0000 UTC m=+6032.955836190" observedRunningTime="2025-10-08 14:36:23.753711002 +0000 UTC m=+6033.562015546" watchObservedRunningTime="2025-10-08 14:36:23.763305627 +0000 UTC m=+6033.571610181" Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.782880 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-85dff5fbdf-tjmzh" podStartSLOduration=2.016934614 podStartE2EDuration="8.782857007s" podCreationTimestamp="2025-10-08 14:36:15 +0000 UTC" firstStartedPulling="2025-10-08 14:36:16.318384886 +0000 UTC m=+6026.126689430" lastFinishedPulling="2025-10-08 14:36:23.084307289 +0000 UTC m=+6032.892611823" observedRunningTime="2025-10-08 14:36:23.774798901 +0000 UTC m=+6033.583103465" watchObservedRunningTime="2025-10-08 14:36:23.782857007 +0000 UTC m=+6033.591161551" Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.800197 4735 scope.go:117] "RemoveContainer" containerID="1ba2ce7a3b0d5486acc32c051e9dc2b90b774da758c77049913033aec1b40467" Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.812368 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7755d99645-dbp28" podStartSLOduration=2.288751898 podStartE2EDuration="9.812348272s" podCreationTimestamp="2025-10-08 14:36:14 +0000 UTC" firstStartedPulling="2025-10-08 14:36:15.537302976 +0000 UTC m=+6025.345607520" lastFinishedPulling="2025-10-08 14:36:23.06089935 +0000 UTC m=+6032.869203894" observedRunningTime="2025-10-08 14:36:23.798248041 
+0000 UTC m=+6033.606552585" watchObservedRunningTime="2025-10-08 14:36:23.812348272 +0000 UTC m=+6033.620652816" Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.839271 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.852410 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.864300 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.871713 4735 scope.go:117] "RemoveContainer" containerID="30cdc921ca53cb001ea6e691a19df660f3a1910550fc8b9c3a020ec44842e0a1" Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.882889 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.911107 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 14:36:23 crc kubenswrapper[4735]: E1008 14:36:23.911663 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45b47def-e0a2-4cd5-9ac7-fa036a784742" containerName="glance-log" Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.911682 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="45b47def-e0a2-4cd5-9ac7-fa036a784742" containerName="glance-log" Oct 08 14:36:23 crc kubenswrapper[4735]: E1008 14:36:23.911713 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45b47def-e0a2-4cd5-9ac7-fa036a784742" containerName="glance-httpd" Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.911722 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="45b47def-e0a2-4cd5-9ac7-fa036a784742" containerName="glance-httpd" Oct 08 14:36:23 crc kubenswrapper[4735]: E1008 14:36:23.911734 4735 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="7f2c54f5-b89d-4293-86a5-f4bd5e3eff41" containerName="glance-httpd" Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.911743 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f2c54f5-b89d-4293-86a5-f4bd5e3eff41" containerName="glance-httpd" Oct 08 14:36:23 crc kubenswrapper[4735]: E1008 14:36:23.911789 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f2c54f5-b89d-4293-86a5-f4bd5e3eff41" containerName="glance-log" Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.911798 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f2c54f5-b89d-4293-86a5-f4bd5e3eff41" containerName="glance-log" Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.912044 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f2c54f5-b89d-4293-86a5-f4bd5e3eff41" containerName="glance-log" Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.912064 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="45b47def-e0a2-4cd5-9ac7-fa036a784742" containerName="glance-httpd" Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.912087 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f2c54f5-b89d-4293-86a5-f4bd5e3eff41" containerName="glance-httpd" Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.912097 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="45b47def-e0a2-4cd5-9ac7-fa036a784742" containerName="glance-log" Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.913284 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.916712 4735 scope.go:117] "RemoveContainer" containerID="753016faca547e220e2e186780b247b2730ccd2124f4fb68a5f03aaa08372743" Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.922574 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.923252 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.923359 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-796xw" Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.924190 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.933868 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.936020 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.947447 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 14:36:23 crc kubenswrapper[4735]: I1008 14:36:23.953605 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 08 14:36:23 crc kubenswrapper[4735]: E1008 14:36:23.989079 4735 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f2c54f5_b89d_4293_86a5_f4bd5e3eff41.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f2c54f5_b89d_4293_86a5_f4bd5e3eff41.slice/crio-048eac7772ed4f7112ef49e2c9571e0d5bd1029cab4c85b81379e8c70ec0d44c\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45b47def_e0a2_4cd5_9ac7_fa036a784742.slice\": RecentStats: unable to find data in memory cache]" Oct 08 14:36:24 crc kubenswrapper[4735]: I1008 14:36:24.028555 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx58r\" (UniqueName: \"kubernetes.io/projected/fd641231-7640-4c7b-8e6f-dfa24dc31650-kube-api-access-xx58r\") pod \"glance-default-internal-api-0\" (UID: \"fd641231-7640-4c7b-8e6f-dfa24dc31650\") " pod="openstack/glance-default-internal-api-0" Oct 08 14:36:24 crc kubenswrapper[4735]: I1008 14:36:24.028803 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd641231-7640-4c7b-8e6f-dfa24dc31650-config-data\") pod \"glance-default-internal-api-0\" (UID: \"fd641231-7640-4c7b-8e6f-dfa24dc31650\") " pod="openstack/glance-default-internal-api-0" Oct 08 14:36:24 crc 
kubenswrapper[4735]: I1008 14:36:24.028877 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c742b20d-b837-4af4-9bd0-61fe5b2799f0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c742b20d-b837-4af4-9bd0-61fe5b2799f0\") " pod="openstack/glance-default-external-api-0" Oct 08 14:36:24 crc kubenswrapper[4735]: I1008 14:36:24.028997 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dmnk\" (UniqueName: \"kubernetes.io/projected/c742b20d-b837-4af4-9bd0-61fe5b2799f0-kube-api-access-7dmnk\") pod \"glance-default-external-api-0\" (UID: \"c742b20d-b837-4af4-9bd0-61fe5b2799f0\") " pod="openstack/glance-default-external-api-0" Oct 08 14:36:24 crc kubenswrapper[4735]: I1008 14:36:24.029050 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd641231-7640-4c7b-8e6f-dfa24dc31650-logs\") pod \"glance-default-internal-api-0\" (UID: \"fd641231-7640-4c7b-8e6f-dfa24dc31650\") " pod="openstack/glance-default-internal-api-0" Oct 08 14:36:24 crc kubenswrapper[4735]: I1008 14:36:24.029098 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c742b20d-b837-4af4-9bd0-61fe5b2799f0-logs\") pod \"glance-default-external-api-0\" (UID: \"c742b20d-b837-4af4-9bd0-61fe5b2799f0\") " pod="openstack/glance-default-external-api-0" Oct 08 14:36:24 crc kubenswrapper[4735]: I1008 14:36:24.029145 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd641231-7640-4c7b-8e6f-dfa24dc31650-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"fd641231-7640-4c7b-8e6f-dfa24dc31650\") " 
pod="openstack/glance-default-internal-api-0" Oct 08 14:36:24 crc kubenswrapper[4735]: I1008 14:36:24.029173 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c742b20d-b837-4af4-9bd0-61fe5b2799f0-ceph\") pod \"glance-default-external-api-0\" (UID: \"c742b20d-b837-4af4-9bd0-61fe5b2799f0\") " pod="openstack/glance-default-external-api-0" Oct 08 14:36:24 crc kubenswrapper[4735]: I1008 14:36:24.029204 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fd641231-7640-4c7b-8e6f-dfa24dc31650-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"fd641231-7640-4c7b-8e6f-dfa24dc31650\") " pod="openstack/glance-default-internal-api-0" Oct 08 14:36:24 crc kubenswrapper[4735]: I1008 14:36:24.029227 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c742b20d-b837-4af4-9bd0-61fe5b2799f0-config-data\") pod \"glance-default-external-api-0\" (UID: \"c742b20d-b837-4af4-9bd0-61fe5b2799f0\") " pod="openstack/glance-default-external-api-0" Oct 08 14:36:24 crc kubenswrapper[4735]: I1008 14:36:24.029258 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c742b20d-b837-4af4-9bd0-61fe5b2799f0-scripts\") pod \"glance-default-external-api-0\" (UID: \"c742b20d-b837-4af4-9bd0-61fe5b2799f0\") " pod="openstack/glance-default-external-api-0" Oct 08 14:36:24 crc kubenswrapper[4735]: I1008 14:36:24.029306 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/fd641231-7640-4c7b-8e6f-dfa24dc31650-ceph\") pod \"glance-default-internal-api-0\" (UID: \"fd641231-7640-4c7b-8e6f-dfa24dc31650\") " 
pod="openstack/glance-default-internal-api-0" Oct 08 14:36:24 crc kubenswrapper[4735]: I1008 14:36:24.029336 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c742b20d-b837-4af4-9bd0-61fe5b2799f0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c742b20d-b837-4af4-9bd0-61fe5b2799f0\") " pod="openstack/glance-default-external-api-0" Oct 08 14:36:24 crc kubenswrapper[4735]: I1008 14:36:24.029423 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd641231-7640-4c7b-8e6f-dfa24dc31650-scripts\") pod \"glance-default-internal-api-0\" (UID: \"fd641231-7640-4c7b-8e6f-dfa24dc31650\") " pod="openstack/glance-default-internal-api-0" Oct 08 14:36:24 crc kubenswrapper[4735]: I1008 14:36:24.131623 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c742b20d-b837-4af4-9bd0-61fe5b2799f0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c742b20d-b837-4af4-9bd0-61fe5b2799f0\") " pod="openstack/glance-default-external-api-0" Oct 08 14:36:24 crc kubenswrapper[4735]: I1008 14:36:24.132019 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dmnk\" (UniqueName: \"kubernetes.io/projected/c742b20d-b837-4af4-9bd0-61fe5b2799f0-kube-api-access-7dmnk\") pod \"glance-default-external-api-0\" (UID: \"c742b20d-b837-4af4-9bd0-61fe5b2799f0\") " pod="openstack/glance-default-external-api-0" Oct 08 14:36:24 crc kubenswrapper[4735]: I1008 14:36:24.132052 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd641231-7640-4c7b-8e6f-dfa24dc31650-logs\") pod \"glance-default-internal-api-0\" (UID: \"fd641231-7640-4c7b-8e6f-dfa24dc31650\") " 
pod="openstack/glance-default-internal-api-0" Oct 08 14:36:24 crc kubenswrapper[4735]: I1008 14:36:24.132093 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c742b20d-b837-4af4-9bd0-61fe5b2799f0-logs\") pod \"glance-default-external-api-0\" (UID: \"c742b20d-b837-4af4-9bd0-61fe5b2799f0\") " pod="openstack/glance-default-external-api-0" Oct 08 14:36:24 crc kubenswrapper[4735]: I1008 14:36:24.132139 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd641231-7640-4c7b-8e6f-dfa24dc31650-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"fd641231-7640-4c7b-8e6f-dfa24dc31650\") " pod="openstack/glance-default-internal-api-0" Oct 08 14:36:24 crc kubenswrapper[4735]: I1008 14:36:24.132168 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c742b20d-b837-4af4-9bd0-61fe5b2799f0-ceph\") pod \"glance-default-external-api-0\" (UID: \"c742b20d-b837-4af4-9bd0-61fe5b2799f0\") " pod="openstack/glance-default-external-api-0" Oct 08 14:36:24 crc kubenswrapper[4735]: I1008 14:36:24.132197 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fd641231-7640-4c7b-8e6f-dfa24dc31650-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"fd641231-7640-4c7b-8e6f-dfa24dc31650\") " pod="openstack/glance-default-internal-api-0" Oct 08 14:36:24 crc kubenswrapper[4735]: I1008 14:36:24.132234 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c742b20d-b837-4af4-9bd0-61fe5b2799f0-config-data\") pod \"glance-default-external-api-0\" (UID: \"c742b20d-b837-4af4-9bd0-61fe5b2799f0\") " pod="openstack/glance-default-external-api-0" Oct 08 14:36:24 crc kubenswrapper[4735]: I1008 
14:36:24.132266 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c742b20d-b837-4af4-9bd0-61fe5b2799f0-scripts\") pod \"glance-default-external-api-0\" (UID: \"c742b20d-b837-4af4-9bd0-61fe5b2799f0\") " pod="openstack/glance-default-external-api-0" Oct 08 14:36:24 crc kubenswrapper[4735]: I1008 14:36:24.132297 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/fd641231-7640-4c7b-8e6f-dfa24dc31650-ceph\") pod \"glance-default-internal-api-0\" (UID: \"fd641231-7640-4c7b-8e6f-dfa24dc31650\") " pod="openstack/glance-default-internal-api-0" Oct 08 14:36:24 crc kubenswrapper[4735]: I1008 14:36:24.132323 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c742b20d-b837-4af4-9bd0-61fe5b2799f0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c742b20d-b837-4af4-9bd0-61fe5b2799f0\") " pod="openstack/glance-default-external-api-0" Oct 08 14:36:24 crc kubenswrapper[4735]: I1008 14:36:24.132389 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd641231-7640-4c7b-8e6f-dfa24dc31650-scripts\") pod \"glance-default-internal-api-0\" (UID: \"fd641231-7640-4c7b-8e6f-dfa24dc31650\") " pod="openstack/glance-default-internal-api-0" Oct 08 14:36:24 crc kubenswrapper[4735]: I1008 14:36:24.132469 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xx58r\" (UniqueName: \"kubernetes.io/projected/fd641231-7640-4c7b-8e6f-dfa24dc31650-kube-api-access-xx58r\") pod \"glance-default-internal-api-0\" (UID: \"fd641231-7640-4c7b-8e6f-dfa24dc31650\") " pod="openstack/glance-default-internal-api-0" Oct 08 14:36:24 crc kubenswrapper[4735]: I1008 14:36:24.132611 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd641231-7640-4c7b-8e6f-dfa24dc31650-config-data\") pod \"glance-default-internal-api-0\" (UID: \"fd641231-7640-4c7b-8e6f-dfa24dc31650\") " pod="openstack/glance-default-internal-api-0" Oct 08 14:36:24 crc kubenswrapper[4735]: I1008 14:36:24.132971 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fd641231-7640-4c7b-8e6f-dfa24dc31650-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"fd641231-7640-4c7b-8e6f-dfa24dc31650\") " pod="openstack/glance-default-internal-api-0" Oct 08 14:36:24 crc kubenswrapper[4735]: I1008 14:36:24.133268 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd641231-7640-4c7b-8e6f-dfa24dc31650-logs\") pod \"glance-default-internal-api-0\" (UID: \"fd641231-7640-4c7b-8e6f-dfa24dc31650\") " pod="openstack/glance-default-internal-api-0" Oct 08 14:36:24 crc kubenswrapper[4735]: I1008 14:36:24.133724 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c742b20d-b837-4af4-9bd0-61fe5b2799f0-logs\") pod \"glance-default-external-api-0\" (UID: \"c742b20d-b837-4af4-9bd0-61fe5b2799f0\") " pod="openstack/glance-default-external-api-0" Oct 08 14:36:24 crc kubenswrapper[4735]: I1008 14:36:24.137013 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c742b20d-b837-4af4-9bd0-61fe5b2799f0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c742b20d-b837-4af4-9bd0-61fe5b2799f0\") " pod="openstack/glance-default-external-api-0" Oct 08 14:36:24 crc kubenswrapper[4735]: I1008 14:36:24.140231 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd641231-7640-4c7b-8e6f-dfa24dc31650-scripts\") pod \"glance-default-internal-api-0\" (UID: 
\"fd641231-7640-4c7b-8e6f-dfa24dc31650\") " pod="openstack/glance-default-internal-api-0" Oct 08 14:36:24 crc kubenswrapper[4735]: I1008 14:36:24.141041 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/fd641231-7640-4c7b-8e6f-dfa24dc31650-ceph\") pod \"glance-default-internal-api-0\" (UID: \"fd641231-7640-4c7b-8e6f-dfa24dc31650\") " pod="openstack/glance-default-internal-api-0" Oct 08 14:36:24 crc kubenswrapper[4735]: I1008 14:36:24.141830 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c742b20d-b837-4af4-9bd0-61fe5b2799f0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c742b20d-b837-4af4-9bd0-61fe5b2799f0\") " pod="openstack/glance-default-external-api-0" Oct 08 14:36:24 crc kubenswrapper[4735]: I1008 14:36:24.143630 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c742b20d-b837-4af4-9bd0-61fe5b2799f0-ceph\") pod \"glance-default-external-api-0\" (UID: \"c742b20d-b837-4af4-9bd0-61fe5b2799f0\") " pod="openstack/glance-default-external-api-0" Oct 08 14:36:24 crc kubenswrapper[4735]: I1008 14:36:24.144347 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c742b20d-b837-4af4-9bd0-61fe5b2799f0-scripts\") pod \"glance-default-external-api-0\" (UID: \"c742b20d-b837-4af4-9bd0-61fe5b2799f0\") " pod="openstack/glance-default-external-api-0" Oct 08 14:36:24 crc kubenswrapper[4735]: I1008 14:36:24.145558 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd641231-7640-4c7b-8e6f-dfa24dc31650-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"fd641231-7640-4c7b-8e6f-dfa24dc31650\") " pod="openstack/glance-default-internal-api-0" Oct 08 14:36:24 crc kubenswrapper[4735]: 
I1008 14:36:24.146671 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c742b20d-b837-4af4-9bd0-61fe5b2799f0-config-data\") pod \"glance-default-external-api-0\" (UID: \"c742b20d-b837-4af4-9bd0-61fe5b2799f0\") " pod="openstack/glance-default-external-api-0" Oct 08 14:36:24 crc kubenswrapper[4735]: I1008 14:36:24.146855 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd641231-7640-4c7b-8e6f-dfa24dc31650-config-data\") pod \"glance-default-internal-api-0\" (UID: \"fd641231-7640-4c7b-8e6f-dfa24dc31650\") " pod="openstack/glance-default-internal-api-0" Oct 08 14:36:24 crc kubenswrapper[4735]: I1008 14:36:24.155330 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dmnk\" (UniqueName: \"kubernetes.io/projected/c742b20d-b837-4af4-9bd0-61fe5b2799f0-kube-api-access-7dmnk\") pod \"glance-default-external-api-0\" (UID: \"c742b20d-b837-4af4-9bd0-61fe5b2799f0\") " pod="openstack/glance-default-external-api-0" Oct 08 14:36:24 crc kubenswrapper[4735]: I1008 14:36:24.163656 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx58r\" (UniqueName: \"kubernetes.io/projected/fd641231-7640-4c7b-8e6f-dfa24dc31650-kube-api-access-xx58r\") pod \"glance-default-internal-api-0\" (UID: \"fd641231-7640-4c7b-8e6f-dfa24dc31650\") " pod="openstack/glance-default-internal-api-0" Oct 08 14:36:24 crc kubenswrapper[4735]: I1008 14:36:24.359804 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 08 14:36:24 crc kubenswrapper[4735]: I1008 14:36:24.373127 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 14:36:24 crc kubenswrapper[4735]: I1008 14:36:24.906187 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45b47def-e0a2-4cd5-9ac7-fa036a784742" path="/var/lib/kubelet/pods/45b47def-e0a2-4cd5-9ac7-fa036a784742/volumes" Oct 08 14:36:24 crc kubenswrapper[4735]: I1008 14:36:24.907156 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f2c54f5-b89d-4293-86a5-f4bd5e3eff41" path="/var/lib/kubelet/pods/7f2c54f5-b89d-4293-86a5-f4bd5e3eff41/volumes" Oct 08 14:36:25 crc kubenswrapper[4735]: I1008 14:36:25.036096 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 14:36:25 crc kubenswrapper[4735]: I1008 14:36:25.051829 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7755d99645-dbp28" Oct 08 14:36:25 crc kubenswrapper[4735]: I1008 14:36:25.158056 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 14:36:25 crc kubenswrapper[4735]: I1008 14:36:25.340371 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7b88cbb49c-lcpt2" Oct 08 14:36:25 crc kubenswrapper[4735]: I1008 14:36:25.340424 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7b88cbb49c-lcpt2" Oct 08 14:36:25 crc kubenswrapper[4735]: I1008 14:36:25.715544 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-85dff5fbdf-tjmzh" Oct 08 14:36:25 crc kubenswrapper[4735]: I1008 14:36:25.715792 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-85dff5fbdf-tjmzh" Oct 08 14:36:25 crc kubenswrapper[4735]: I1008 14:36:25.780588 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"c742b20d-b837-4af4-9bd0-61fe5b2799f0","Type":"ContainerStarted","Data":"59f394f09d33e3e26d61fcccedcb31c04753ab2be5b317fc1ef06a3646991703"} Oct 08 14:36:25 crc kubenswrapper[4735]: I1008 14:36:25.793106 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"fd641231-7640-4c7b-8e6f-dfa24dc31650","Type":"ContainerStarted","Data":"89eed51c79191f1332a41bc0ab5f94e8680ebe48cf63a0763749071284e2ce22"} Oct 08 14:36:25 crc kubenswrapper[4735]: I1008 14:36:25.793334 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"fd641231-7640-4c7b-8e6f-dfa24dc31650","Type":"ContainerStarted","Data":"9a80ff49cbfb702612049da47bf5dd4ac676df0d908a8d41c3ea81d02ec2bea4"} Oct 08 14:36:26 crc kubenswrapper[4735]: I1008 14:36:26.813186 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c742b20d-b837-4af4-9bd0-61fe5b2799f0","Type":"ContainerStarted","Data":"fd23fcfec827605de1c83f29725e78ae1f33f05a87b266ac8693894cbce19306"} Oct 08 14:36:26 crc kubenswrapper[4735]: I1008 14:36:26.814288 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c742b20d-b837-4af4-9bd0-61fe5b2799f0","Type":"ContainerStarted","Data":"0057dc9c753d33144dd4252171056d0f70b9f1160a0767ff98bdf6910644154c"} Oct 08 14:36:26 crc kubenswrapper[4735]: I1008 14:36:26.817056 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"fd641231-7640-4c7b-8e6f-dfa24dc31650","Type":"ContainerStarted","Data":"35ddbc618c04bc59ebfe629ef461d8773c6df8564d7873272683fb31287c7a37"} Oct 08 14:36:26 crc kubenswrapper[4735]: I1008 14:36:26.840171 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.8401515489999998 podStartE2EDuration="3.840151549s" 
podCreationTimestamp="2025-10-08 14:36:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:36:26.833512329 +0000 UTC m=+6036.641816873" watchObservedRunningTime="2025-10-08 14:36:26.840151549 +0000 UTC m=+6036.648456093" Oct 08 14:36:26 crc kubenswrapper[4735]: I1008 14:36:26.860144 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.86012595 podStartE2EDuration="3.86012595s" podCreationTimestamp="2025-10-08 14:36:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:36:26.850493583 +0000 UTC m=+6036.658798147" watchObservedRunningTime="2025-10-08 14:36:26.86012595 +0000 UTC m=+6036.668430494" Oct 08 14:36:32 crc kubenswrapper[4735]: I1008 14:36:32.899998 4735 scope.go:117] "RemoveContainer" containerID="0ef7ce1bb2f88774957446f907a55957b94dad288444fb6f22993a47ba31f8b0" Oct 08 14:36:32 crc kubenswrapper[4735]: E1008 14:36:32.901058 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:36:34 crc kubenswrapper[4735]: I1008 14:36:34.360925 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 08 14:36:34 crc kubenswrapper[4735]: I1008 14:36:34.361280 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 08 14:36:34 crc kubenswrapper[4735]: I1008 14:36:34.374201 4735 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 08 14:36:34 crc kubenswrapper[4735]: I1008 14:36:34.374249 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 08 14:36:34 crc kubenswrapper[4735]: I1008 14:36:34.394045 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 08 14:36:34 crc kubenswrapper[4735]: I1008 14:36:34.419180 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 08 14:36:34 crc kubenswrapper[4735]: I1008 14:36:34.419586 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 08 14:36:34 crc kubenswrapper[4735]: I1008 14:36:34.425800 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 08 14:36:34 crc kubenswrapper[4735]: I1008 14:36:34.912538 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 08 14:36:34 crc kubenswrapper[4735]: I1008 14:36:34.912575 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 08 14:36:34 crc kubenswrapper[4735]: I1008 14:36:34.912589 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 08 14:36:34 crc kubenswrapper[4735]: I1008 14:36:34.912599 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 08 14:36:35 crc kubenswrapper[4735]: I1008 14:36:35.343324 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7b88cbb49c-lcpt2" podUID="a947221d-27b4-4177-8ac7-26a005f8995c" containerName="horizon" probeResult="failure" 
output="Get \"http://10.217.1.109:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.109:8080: connect: connection refused" Oct 08 14:36:35 crc kubenswrapper[4735]: I1008 14:36:35.717829 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-85dff5fbdf-tjmzh" podUID="0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.110:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.110:8080: connect: connection refused" Oct 08 14:36:36 crc kubenswrapper[4735]: I1008 14:36:36.929072 4735 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 08 14:36:36 crc kubenswrapper[4735]: I1008 14:36:36.929408 4735 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 08 14:36:36 crc kubenswrapper[4735]: I1008 14:36:36.929353 4735 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 08 14:36:36 crc kubenswrapper[4735]: I1008 14:36:36.929711 4735 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 08 14:36:37 crc kubenswrapper[4735]: I1008 14:36:37.454406 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 08 14:36:37 crc kubenswrapper[4735]: I1008 14:36:37.457354 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 08 14:36:37 crc kubenswrapper[4735]: I1008 14:36:37.631103 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 08 14:36:37 crc kubenswrapper[4735]: I1008 14:36:37.804595 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 08 14:36:46 crc kubenswrapper[4735]: I1008 14:36:46.895920 4735 scope.go:117] "RemoveContainer" 
containerID="0ef7ce1bb2f88774957446f907a55957b94dad288444fb6f22993a47ba31f8b0" Oct 08 14:36:46 crc kubenswrapper[4735]: E1008 14:36:46.898366 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:36:47 crc kubenswrapper[4735]: I1008 14:36:47.279513 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7b88cbb49c-lcpt2" Oct 08 14:36:47 crc kubenswrapper[4735]: I1008 14:36:47.627071 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-85dff5fbdf-tjmzh" Oct 08 14:36:49 crc kubenswrapper[4735]: I1008 14:36:49.095820 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-7b88cbb49c-lcpt2" Oct 08 14:36:49 crc kubenswrapper[4735]: I1008 14:36:49.445412 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-85dff5fbdf-tjmzh" Oct 08 14:36:49 crc kubenswrapper[4735]: I1008 14:36:49.504510 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7b88cbb49c-lcpt2"] Oct 08 14:36:50 crc kubenswrapper[4735]: I1008 14:36:50.074129 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7b88cbb49c-lcpt2" podUID="a947221d-27b4-4177-8ac7-26a005f8995c" containerName="horizon-log" containerID="cri-o://ccc337960433d2a19babae59e11e2078e5f1b3b7043b599a411b1290984b0348" gracePeriod=30 Oct 08 14:36:50 crc kubenswrapper[4735]: I1008 14:36:50.074201 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7b88cbb49c-lcpt2" 
podUID="a947221d-27b4-4177-8ac7-26a005f8995c" containerName="horizon" containerID="cri-o://35300fc03b3eed040e927910001a3d073d55c60d6dd7d227d421c3a8296977e2" gracePeriod=30 Oct 08 14:36:54 crc kubenswrapper[4735]: I1008 14:36:54.045458 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-xxsxt"] Oct 08 14:36:54 crc kubenswrapper[4735]: I1008 14:36:54.060119 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-xxsxt"] Oct 08 14:36:54 crc kubenswrapper[4735]: I1008 14:36:54.114463 4735 generic.go:334] "Generic (PLEG): container finished" podID="3059d038-73f6-4388-a434-9cf8043323b5" containerID="ea3c674dd07a7a490244b1ba45ce7c391f1a68dab45cda4090844c5a69df20df" exitCode=137 Oct 08 14:36:54 crc kubenswrapper[4735]: I1008 14:36:54.114534 4735 generic.go:334] "Generic (PLEG): container finished" podID="3059d038-73f6-4388-a434-9cf8043323b5" containerID="b3b1ed0ec94eb2dddd47a1086884ee8d7a3fd7f7a15bd620067b0524eecbb276" exitCode=137 Oct 08 14:36:54 crc kubenswrapper[4735]: I1008 14:36:54.114584 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7755d99645-dbp28" event={"ID":"3059d038-73f6-4388-a434-9cf8043323b5","Type":"ContainerDied","Data":"ea3c674dd07a7a490244b1ba45ce7c391f1a68dab45cda4090844c5a69df20df"} Oct 08 14:36:54 crc kubenswrapper[4735]: I1008 14:36:54.114616 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7755d99645-dbp28" event={"ID":"3059d038-73f6-4388-a434-9cf8043323b5","Type":"ContainerDied","Data":"b3b1ed0ec94eb2dddd47a1086884ee8d7a3fd7f7a15bd620067b0524eecbb276"} Oct 08 14:36:54 crc kubenswrapper[4735]: I1008 14:36:54.114630 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7755d99645-dbp28" event={"ID":"3059d038-73f6-4388-a434-9cf8043323b5","Type":"ContainerDied","Data":"e6e7f494363d990afb62b8ef997b44dec8c150e26f2d8775480577c5b0efec7b"} Oct 08 14:36:54 crc kubenswrapper[4735]: I1008 14:36:54.114646 4735 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6e7f494363d990afb62b8ef997b44dec8c150e26f2d8775480577c5b0efec7b" Oct 08 14:36:54 crc kubenswrapper[4735]: I1008 14:36:54.118285 4735 generic.go:334] "Generic (PLEG): container finished" podID="a947221d-27b4-4177-8ac7-26a005f8995c" containerID="35300fc03b3eed040e927910001a3d073d55c60d6dd7d227d421c3a8296977e2" exitCode=0 Oct 08 14:36:54 crc kubenswrapper[4735]: I1008 14:36:54.118332 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b88cbb49c-lcpt2" event={"ID":"a947221d-27b4-4177-8ac7-26a005f8995c","Type":"ContainerDied","Data":"35300fc03b3eed040e927910001a3d073d55c60d6dd7d227d421c3a8296977e2"} Oct 08 14:36:54 crc kubenswrapper[4735]: I1008 14:36:54.191126 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7755d99645-dbp28" Oct 08 14:36:54 crc kubenswrapper[4735]: I1008 14:36:54.279695 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3059d038-73f6-4388-a434-9cf8043323b5-config-data\") pod \"3059d038-73f6-4388-a434-9cf8043323b5\" (UID: \"3059d038-73f6-4388-a434-9cf8043323b5\") " Oct 08 14:36:54 crc kubenswrapper[4735]: I1008 14:36:54.279788 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3059d038-73f6-4388-a434-9cf8043323b5-scripts\") pod \"3059d038-73f6-4388-a434-9cf8043323b5\" (UID: \"3059d038-73f6-4388-a434-9cf8043323b5\") " Oct 08 14:36:54 crc kubenswrapper[4735]: I1008 14:36:54.279859 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3059d038-73f6-4388-a434-9cf8043323b5-logs\") pod \"3059d038-73f6-4388-a434-9cf8043323b5\" (UID: \"3059d038-73f6-4388-a434-9cf8043323b5\") " Oct 08 14:36:54 crc kubenswrapper[4735]: I1008 14:36:54.279911 4735 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56ljm\" (UniqueName: \"kubernetes.io/projected/3059d038-73f6-4388-a434-9cf8043323b5-kube-api-access-56ljm\") pod \"3059d038-73f6-4388-a434-9cf8043323b5\" (UID: \"3059d038-73f6-4388-a434-9cf8043323b5\") " Oct 08 14:36:54 crc kubenswrapper[4735]: I1008 14:36:54.279930 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3059d038-73f6-4388-a434-9cf8043323b5-horizon-secret-key\") pod \"3059d038-73f6-4388-a434-9cf8043323b5\" (UID: \"3059d038-73f6-4388-a434-9cf8043323b5\") " Oct 08 14:36:54 crc kubenswrapper[4735]: I1008 14:36:54.280739 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3059d038-73f6-4388-a434-9cf8043323b5-logs" (OuterVolumeSpecName: "logs") pod "3059d038-73f6-4388-a434-9cf8043323b5" (UID: "3059d038-73f6-4388-a434-9cf8043323b5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 14:36:54 crc kubenswrapper[4735]: I1008 14:36:54.285807 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3059d038-73f6-4388-a434-9cf8043323b5-kube-api-access-56ljm" (OuterVolumeSpecName: "kube-api-access-56ljm") pod "3059d038-73f6-4388-a434-9cf8043323b5" (UID: "3059d038-73f6-4388-a434-9cf8043323b5"). InnerVolumeSpecName "kube-api-access-56ljm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:36:54 crc kubenswrapper[4735]: I1008 14:36:54.286276 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3059d038-73f6-4388-a434-9cf8043323b5-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "3059d038-73f6-4388-a434-9cf8043323b5" (UID: "3059d038-73f6-4388-a434-9cf8043323b5"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:36:54 crc kubenswrapper[4735]: I1008 14:36:54.304283 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3059d038-73f6-4388-a434-9cf8043323b5-scripts" (OuterVolumeSpecName: "scripts") pod "3059d038-73f6-4388-a434-9cf8043323b5" (UID: "3059d038-73f6-4388-a434-9cf8043323b5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:36:54 crc kubenswrapper[4735]: I1008 14:36:54.304321 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3059d038-73f6-4388-a434-9cf8043323b5-config-data" (OuterVolumeSpecName: "config-data") pod "3059d038-73f6-4388-a434-9cf8043323b5" (UID: "3059d038-73f6-4388-a434-9cf8043323b5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:36:54 crc kubenswrapper[4735]: I1008 14:36:54.381425 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3059d038-73f6-4388-a434-9cf8043323b5-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 14:36:54 crc kubenswrapper[4735]: I1008 14:36:54.381463 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3059d038-73f6-4388-a434-9cf8043323b5-logs\") on node \"crc\" DevicePath \"\"" Oct 08 14:36:54 crc kubenswrapper[4735]: I1008 14:36:54.381529 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56ljm\" (UniqueName: \"kubernetes.io/projected/3059d038-73f6-4388-a434-9cf8043323b5-kube-api-access-56ljm\") on node \"crc\" DevicePath \"\"" Oct 08 14:36:54 crc kubenswrapper[4735]: I1008 14:36:54.381546 4735 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3059d038-73f6-4388-a434-9cf8043323b5-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 08 14:36:54 crc 
kubenswrapper[4735]: I1008 14:36:54.381556 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3059d038-73f6-4388-a434-9cf8043323b5-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 14:36:54 crc kubenswrapper[4735]: I1008 14:36:54.906736 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63b01c62-6479-4841-b2ce-34272ef86e2f" path="/var/lib/kubelet/pods/63b01c62-6479-4841-b2ce-34272ef86e2f/volumes" Oct 08 14:36:55 crc kubenswrapper[4735]: I1008 14:36:55.128504 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7755d99645-dbp28" Oct 08 14:36:55 crc kubenswrapper[4735]: I1008 14:36:55.167478 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7755d99645-dbp28"] Oct 08 14:36:55 crc kubenswrapper[4735]: I1008 14:36:55.175930 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7755d99645-dbp28"] Oct 08 14:36:55 crc kubenswrapper[4735]: I1008 14:36:55.341438 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7b88cbb49c-lcpt2" podUID="a947221d-27b4-4177-8ac7-26a005f8995c" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.109:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.109:8080: connect: connection refused" Oct 08 14:36:56 crc kubenswrapper[4735]: I1008 14:36:56.908929 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3059d038-73f6-4388-a434-9cf8043323b5" path="/var/lib/kubelet/pods/3059d038-73f6-4388-a434-9cf8043323b5/volumes" Oct 08 14:36:57 crc kubenswrapper[4735]: I1008 14:36:57.293904 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7b46fd549c-8kjz4"] Oct 08 14:36:57 crc kubenswrapper[4735]: E1008 14:36:57.294346 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3059d038-73f6-4388-a434-9cf8043323b5" containerName="horizon-log" 
Oct 08 14:36:57 crc kubenswrapper[4735]: I1008 14:36:57.294369 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="3059d038-73f6-4388-a434-9cf8043323b5" containerName="horizon-log" Oct 08 14:36:57 crc kubenswrapper[4735]: E1008 14:36:57.294411 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3059d038-73f6-4388-a434-9cf8043323b5" containerName="horizon" Oct 08 14:36:57 crc kubenswrapper[4735]: I1008 14:36:57.294418 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="3059d038-73f6-4388-a434-9cf8043323b5" containerName="horizon" Oct 08 14:36:57 crc kubenswrapper[4735]: I1008 14:36:57.294675 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="3059d038-73f6-4388-a434-9cf8043323b5" containerName="horizon-log" Oct 08 14:36:57 crc kubenswrapper[4735]: I1008 14:36:57.294709 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="3059d038-73f6-4388-a434-9cf8043323b5" containerName="horizon" Oct 08 14:36:57 crc kubenswrapper[4735]: I1008 14:36:57.295978 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7b46fd549c-8kjz4" Oct 08 14:36:57 crc kubenswrapper[4735]: I1008 14:36:57.313067 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7b46fd549c-8kjz4"] Oct 08 14:36:57 crc kubenswrapper[4735]: I1008 14:36:57.364603 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f5f29de7-461a-4ece-b951-e8842bc684a9-scripts\") pod \"horizon-7b46fd549c-8kjz4\" (UID: \"f5f29de7-461a-4ece-b951-e8842bc684a9\") " pod="openstack/horizon-7b46fd549c-8kjz4" Oct 08 14:36:57 crc kubenswrapper[4735]: I1008 14:36:57.364652 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f5f29de7-461a-4ece-b951-e8842bc684a9-config-data\") pod \"horizon-7b46fd549c-8kjz4\" (UID: \"f5f29de7-461a-4ece-b951-e8842bc684a9\") " pod="openstack/horizon-7b46fd549c-8kjz4" Oct 08 14:36:57 crc kubenswrapper[4735]: I1008 14:36:57.365076 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f5f29de7-461a-4ece-b951-e8842bc684a9-horizon-secret-key\") pod \"horizon-7b46fd549c-8kjz4\" (UID: \"f5f29de7-461a-4ece-b951-e8842bc684a9\") " pod="openstack/horizon-7b46fd549c-8kjz4" Oct 08 14:36:57 crc kubenswrapper[4735]: I1008 14:36:57.365196 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxp6h\" (UniqueName: \"kubernetes.io/projected/f5f29de7-461a-4ece-b951-e8842bc684a9-kube-api-access-dxp6h\") pod \"horizon-7b46fd549c-8kjz4\" (UID: \"f5f29de7-461a-4ece-b951-e8842bc684a9\") " pod="openstack/horizon-7b46fd549c-8kjz4" Oct 08 14:36:57 crc kubenswrapper[4735]: I1008 14:36:57.365240 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/f5f29de7-461a-4ece-b951-e8842bc684a9-logs\") pod \"horizon-7b46fd549c-8kjz4\" (UID: \"f5f29de7-461a-4ece-b951-e8842bc684a9\") " pod="openstack/horizon-7b46fd549c-8kjz4" Oct 08 14:36:57 crc kubenswrapper[4735]: I1008 14:36:57.466603 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f5f29de7-461a-4ece-b951-e8842bc684a9-horizon-secret-key\") pod \"horizon-7b46fd549c-8kjz4\" (UID: \"f5f29de7-461a-4ece-b951-e8842bc684a9\") " pod="openstack/horizon-7b46fd549c-8kjz4" Oct 08 14:36:57 crc kubenswrapper[4735]: I1008 14:36:57.466670 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxp6h\" (UniqueName: \"kubernetes.io/projected/f5f29de7-461a-4ece-b951-e8842bc684a9-kube-api-access-dxp6h\") pod \"horizon-7b46fd549c-8kjz4\" (UID: \"f5f29de7-461a-4ece-b951-e8842bc684a9\") " pod="openstack/horizon-7b46fd549c-8kjz4" Oct 08 14:36:57 crc kubenswrapper[4735]: I1008 14:36:57.466698 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f5f29de7-461a-4ece-b951-e8842bc684a9-logs\") pod \"horizon-7b46fd549c-8kjz4\" (UID: \"f5f29de7-461a-4ece-b951-e8842bc684a9\") " pod="openstack/horizon-7b46fd549c-8kjz4" Oct 08 14:36:57 crc kubenswrapper[4735]: I1008 14:36:57.466767 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f5f29de7-461a-4ece-b951-e8842bc684a9-scripts\") pod \"horizon-7b46fd549c-8kjz4\" (UID: \"f5f29de7-461a-4ece-b951-e8842bc684a9\") " pod="openstack/horizon-7b46fd549c-8kjz4" Oct 08 14:36:57 crc kubenswrapper[4735]: I1008 14:36:57.466787 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f5f29de7-461a-4ece-b951-e8842bc684a9-config-data\") pod 
\"horizon-7b46fd549c-8kjz4\" (UID: \"f5f29de7-461a-4ece-b951-e8842bc684a9\") " pod="openstack/horizon-7b46fd549c-8kjz4" Oct 08 14:36:57 crc kubenswrapper[4735]: I1008 14:36:57.467311 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f5f29de7-461a-4ece-b951-e8842bc684a9-logs\") pod \"horizon-7b46fd549c-8kjz4\" (UID: \"f5f29de7-461a-4ece-b951-e8842bc684a9\") " pod="openstack/horizon-7b46fd549c-8kjz4" Oct 08 14:36:57 crc kubenswrapper[4735]: I1008 14:36:57.468074 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f5f29de7-461a-4ece-b951-e8842bc684a9-scripts\") pod \"horizon-7b46fd549c-8kjz4\" (UID: \"f5f29de7-461a-4ece-b951-e8842bc684a9\") " pod="openstack/horizon-7b46fd549c-8kjz4" Oct 08 14:36:57 crc kubenswrapper[4735]: I1008 14:36:57.468217 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f5f29de7-461a-4ece-b951-e8842bc684a9-config-data\") pod \"horizon-7b46fd549c-8kjz4\" (UID: \"f5f29de7-461a-4ece-b951-e8842bc684a9\") " pod="openstack/horizon-7b46fd549c-8kjz4" Oct 08 14:36:57 crc kubenswrapper[4735]: I1008 14:36:57.472569 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f5f29de7-461a-4ece-b951-e8842bc684a9-horizon-secret-key\") pod \"horizon-7b46fd549c-8kjz4\" (UID: \"f5f29de7-461a-4ece-b951-e8842bc684a9\") " pod="openstack/horizon-7b46fd549c-8kjz4" Oct 08 14:36:57 crc kubenswrapper[4735]: I1008 14:36:57.483343 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxp6h\" (UniqueName: \"kubernetes.io/projected/f5f29de7-461a-4ece-b951-e8842bc684a9-kube-api-access-dxp6h\") pod \"horizon-7b46fd549c-8kjz4\" (UID: \"f5f29de7-461a-4ece-b951-e8842bc684a9\") " pod="openstack/horizon-7b46fd549c-8kjz4" Oct 08 14:36:57 crc 
kubenswrapper[4735]: I1008 14:36:57.615296 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7b46fd549c-8kjz4" Oct 08 14:36:58 crc kubenswrapper[4735]: I1008 14:36:58.101780 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7b46fd549c-8kjz4"] Oct 08 14:36:58 crc kubenswrapper[4735]: W1008 14:36:58.107153 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5f29de7_461a_4ece_b951_e8842bc684a9.slice/crio-a55cd82e0f848600bfbade90761741c769b78ca07780ef20588ee5f0a4c9b678 WatchSource:0}: Error finding container a55cd82e0f848600bfbade90761741c769b78ca07780ef20588ee5f0a4c9b678: Status 404 returned error can't find the container with id a55cd82e0f848600bfbade90761741c769b78ca07780ef20588ee5f0a4c9b678 Oct 08 14:36:58 crc kubenswrapper[4735]: I1008 14:36:58.161276 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b46fd549c-8kjz4" event={"ID":"f5f29de7-461a-4ece-b951-e8842bc684a9","Type":"ContainerStarted","Data":"a55cd82e0f848600bfbade90761741c769b78ca07780ef20588ee5f0a4c9b678"} Oct 08 14:36:58 crc kubenswrapper[4735]: I1008 14:36:58.657279 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-pz79f"] Oct 08 14:36:58 crc kubenswrapper[4735]: I1008 14:36:58.658768 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-pz79f" Oct 08 14:36:58 crc kubenswrapper[4735]: I1008 14:36:58.674442 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-pz79f"] Oct 08 14:36:58 crc kubenswrapper[4735]: I1008 14:36:58.695256 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmlgn\" (UniqueName: \"kubernetes.io/projected/f6a1cc9c-ece5-4ac2-b809-31be10f2828e-kube-api-access-wmlgn\") pod \"heat-db-create-pz79f\" (UID: \"f6a1cc9c-ece5-4ac2-b809-31be10f2828e\") " pod="openstack/heat-db-create-pz79f" Oct 08 14:36:58 crc kubenswrapper[4735]: I1008 14:36:58.797581 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmlgn\" (UniqueName: \"kubernetes.io/projected/f6a1cc9c-ece5-4ac2-b809-31be10f2828e-kube-api-access-wmlgn\") pod \"heat-db-create-pz79f\" (UID: \"f6a1cc9c-ece5-4ac2-b809-31be10f2828e\") " pod="openstack/heat-db-create-pz79f" Oct 08 14:36:58 crc kubenswrapper[4735]: I1008 14:36:58.814939 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmlgn\" (UniqueName: \"kubernetes.io/projected/f6a1cc9c-ece5-4ac2-b809-31be10f2828e-kube-api-access-wmlgn\") pod \"heat-db-create-pz79f\" (UID: \"f6a1cc9c-ece5-4ac2-b809-31be10f2828e\") " pod="openstack/heat-db-create-pz79f" Oct 08 14:36:58 crc kubenswrapper[4735]: I1008 14:36:58.894404 4735 scope.go:117] "RemoveContainer" containerID="0ef7ce1bb2f88774957446f907a55957b94dad288444fb6f22993a47ba31f8b0" Oct 08 14:36:58 crc kubenswrapper[4735]: E1008 14:36:58.898377 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:36:58 crc kubenswrapper[4735]: I1008 14:36:58.980183 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-pz79f" Oct 08 14:36:59 crc kubenswrapper[4735]: I1008 14:36:59.199432 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b46fd549c-8kjz4" event={"ID":"f5f29de7-461a-4ece-b951-e8842bc684a9","Type":"ContainerStarted","Data":"2a60d9b4783ae75a2374765579024e06d824e06c3d96ebde3fae21d45a9acf48"} Oct 08 14:36:59 crc kubenswrapper[4735]: I1008 14:36:59.199522 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b46fd549c-8kjz4" event={"ID":"f5f29de7-461a-4ece-b951-e8842bc684a9","Type":"ContainerStarted","Data":"5db5862fa9478d981347fb5e2b5a12fb7e673cdb8620a9c949f89638c616b9d2"} Oct 08 14:36:59 crc kubenswrapper[4735]: I1008 14:36:59.242153 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7b46fd549c-8kjz4" podStartSLOduration=2.242127128 podStartE2EDuration="2.242127128s" podCreationTimestamp="2025-10-08 14:36:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:36:59.232716627 +0000 UTC m=+6069.041021171" watchObservedRunningTime="2025-10-08 14:36:59.242127128 +0000 UTC m=+6069.050431672" Oct 08 14:36:59 crc kubenswrapper[4735]: I1008 14:36:59.466548 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-pz79f"] Oct 08 14:36:59 crc kubenswrapper[4735]: W1008 14:36:59.473810 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6a1cc9c_ece5_4ac2_b809_31be10f2828e.slice/crio-fa9123ba14e3752ba00d306ed48967655cab9795a7e868faa55d087a196f5ad9 WatchSource:0}: Error finding container 
fa9123ba14e3752ba00d306ed48967655cab9795a7e868faa55d087a196f5ad9: Status 404 returned error can't find the container with id fa9123ba14e3752ba00d306ed48967655cab9795a7e868faa55d087a196f5ad9 Oct 08 14:37:00 crc kubenswrapper[4735]: I1008 14:37:00.213169 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-pz79f" event={"ID":"f6a1cc9c-ece5-4ac2-b809-31be10f2828e","Type":"ContainerStarted","Data":"857e21b3a89ae6fda3cbf59889d3327a6d657b92b0a11539350cbc79efe82fdf"} Oct 08 14:37:00 crc kubenswrapper[4735]: I1008 14:37:00.213639 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-pz79f" event={"ID":"f6a1cc9c-ece5-4ac2-b809-31be10f2828e","Type":"ContainerStarted","Data":"fa9123ba14e3752ba00d306ed48967655cab9795a7e868faa55d087a196f5ad9"} Oct 08 14:37:00 crc kubenswrapper[4735]: I1008 14:37:00.241636 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-create-pz79f" podStartSLOduration=2.241612474 podStartE2EDuration="2.241612474s" podCreationTimestamp="2025-10-08 14:36:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:37:00.234152123 +0000 UTC m=+6070.042456667" watchObservedRunningTime="2025-10-08 14:37:00.241612474 +0000 UTC m=+6070.049917018" Oct 08 14:37:01 crc kubenswrapper[4735]: I1008 14:37:01.222563 4735 generic.go:334] "Generic (PLEG): container finished" podID="f6a1cc9c-ece5-4ac2-b809-31be10f2828e" containerID="857e21b3a89ae6fda3cbf59889d3327a6d657b92b0a11539350cbc79efe82fdf" exitCode=0 Oct 08 14:37:01 crc kubenswrapper[4735]: I1008 14:37:01.222624 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-pz79f" event={"ID":"f6a1cc9c-ece5-4ac2-b809-31be10f2828e","Type":"ContainerDied","Data":"857e21b3a89ae6fda3cbf59889d3327a6d657b92b0a11539350cbc79efe82fdf"} Oct 08 14:37:02 crc kubenswrapper[4735]: I1008 14:37:02.657505 4735 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-pz79f" Oct 08 14:37:02 crc kubenswrapper[4735]: I1008 14:37:02.688078 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmlgn\" (UniqueName: \"kubernetes.io/projected/f6a1cc9c-ece5-4ac2-b809-31be10f2828e-kube-api-access-wmlgn\") pod \"f6a1cc9c-ece5-4ac2-b809-31be10f2828e\" (UID: \"f6a1cc9c-ece5-4ac2-b809-31be10f2828e\") " Oct 08 14:37:02 crc kubenswrapper[4735]: I1008 14:37:02.696753 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6a1cc9c-ece5-4ac2-b809-31be10f2828e-kube-api-access-wmlgn" (OuterVolumeSpecName: "kube-api-access-wmlgn") pod "f6a1cc9c-ece5-4ac2-b809-31be10f2828e" (UID: "f6a1cc9c-ece5-4ac2-b809-31be10f2828e"). InnerVolumeSpecName "kube-api-access-wmlgn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:37:02 crc kubenswrapper[4735]: I1008 14:37:02.791003 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmlgn\" (UniqueName: \"kubernetes.io/projected/f6a1cc9c-ece5-4ac2-b809-31be10f2828e-kube-api-access-wmlgn\") on node \"crc\" DevicePath \"\"" Oct 08 14:37:03 crc kubenswrapper[4735]: I1008 14:37:03.239652 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-pz79f" event={"ID":"f6a1cc9c-ece5-4ac2-b809-31be10f2828e","Type":"ContainerDied","Data":"fa9123ba14e3752ba00d306ed48967655cab9795a7e868faa55d087a196f5ad9"} Oct 08 14:37:03 crc kubenswrapper[4735]: I1008 14:37:03.240013 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa9123ba14e3752ba00d306ed48967655cab9795a7e868faa55d087a196f5ad9" Oct 08 14:37:03 crc kubenswrapper[4735]: I1008 14:37:03.239688 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-pz79f" Oct 08 14:37:04 crc kubenswrapper[4735]: I1008 14:37:04.043765 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-f91b-account-create-qqfhb"] Oct 08 14:37:04 crc kubenswrapper[4735]: I1008 14:37:04.060267 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-f91b-account-create-qqfhb"] Oct 08 14:37:04 crc kubenswrapper[4735]: I1008 14:37:04.908218 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18797497-e1df-4609-989b-b25a50cef42d" path="/var/lib/kubelet/pods/18797497-e1df-4609-989b-b25a50cef42d/volumes" Oct 08 14:37:05 crc kubenswrapper[4735]: I1008 14:37:05.341254 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7b88cbb49c-lcpt2" podUID="a947221d-27b4-4177-8ac7-26a005f8995c" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.109:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.109:8080: connect: connection refused" Oct 08 14:37:07 crc kubenswrapper[4735]: I1008 14:37:07.616741 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7b46fd549c-8kjz4" Oct 08 14:37:07 crc kubenswrapper[4735]: I1008 14:37:07.617052 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7b46fd549c-8kjz4" Oct 08 14:37:08 crc kubenswrapper[4735]: I1008 14:37:08.906996 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-621e-account-create-8qpk6"] Oct 08 14:37:08 crc kubenswrapper[4735]: E1008 14:37:08.907707 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6a1cc9c-ece5-4ac2-b809-31be10f2828e" containerName="mariadb-database-create" Oct 08 14:37:08 crc kubenswrapper[4735]: I1008 14:37:08.907723 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6a1cc9c-ece5-4ac2-b809-31be10f2828e" containerName="mariadb-database-create" Oct 08 14:37:08 crc 
kubenswrapper[4735]: I1008 14:37:08.908001 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6a1cc9c-ece5-4ac2-b809-31be10f2828e" containerName="mariadb-database-create" Oct 08 14:37:08 crc kubenswrapper[4735]: I1008 14:37:08.908872 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-621e-account-create-8qpk6" Oct 08 14:37:08 crc kubenswrapper[4735]: I1008 14:37:08.911075 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Oct 08 14:37:08 crc kubenswrapper[4735]: I1008 14:37:08.915779 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-621e-account-create-8qpk6"] Oct 08 14:37:09 crc kubenswrapper[4735]: I1008 14:37:09.018661 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kbqh\" (UniqueName: \"kubernetes.io/projected/01938786-d5c0-40ba-857f-edcb7ae6477e-kube-api-access-6kbqh\") pod \"heat-621e-account-create-8qpk6\" (UID: \"01938786-d5c0-40ba-857f-edcb7ae6477e\") " pod="openstack/heat-621e-account-create-8qpk6" Oct 08 14:37:09 crc kubenswrapper[4735]: I1008 14:37:09.120836 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kbqh\" (UniqueName: \"kubernetes.io/projected/01938786-d5c0-40ba-857f-edcb7ae6477e-kube-api-access-6kbqh\") pod \"heat-621e-account-create-8qpk6\" (UID: \"01938786-d5c0-40ba-857f-edcb7ae6477e\") " pod="openstack/heat-621e-account-create-8qpk6" Oct 08 14:37:09 crc kubenswrapper[4735]: I1008 14:37:09.151584 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kbqh\" (UniqueName: \"kubernetes.io/projected/01938786-d5c0-40ba-857f-edcb7ae6477e-kube-api-access-6kbqh\") pod \"heat-621e-account-create-8qpk6\" (UID: \"01938786-d5c0-40ba-857f-edcb7ae6477e\") " pod="openstack/heat-621e-account-create-8qpk6" Oct 08 14:37:09 crc kubenswrapper[4735]: I1008 14:37:09.231510 
4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-621e-account-create-8qpk6" Oct 08 14:37:09 crc kubenswrapper[4735]: I1008 14:37:09.754647 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-621e-account-create-8qpk6"] Oct 08 14:37:09 crc kubenswrapper[4735]: I1008 14:37:09.781249 4735 scope.go:117] "RemoveContainer" containerID="d57af3ab4184733cf538abae0a95240cdf5a30452f8bff100d7dc1586bf11b2d" Oct 08 14:37:09 crc kubenswrapper[4735]: I1008 14:37:09.904530 4735 scope.go:117] "RemoveContainer" containerID="1247abbe2fd70efaba12ea959fe092fe23cf79aea99b71a0f7bec6382b9e28b2" Oct 08 14:37:09 crc kubenswrapper[4735]: I1008 14:37:09.934228 4735 scope.go:117] "RemoveContainer" containerID="91d2d5ec822b4469af663d2189584e590eba0141f23bbed98b64989e890edf48" Oct 08 14:37:10 crc kubenswrapper[4735]: I1008 14:37:10.301633 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-621e-account-create-8qpk6" event={"ID":"01938786-d5c0-40ba-857f-edcb7ae6477e","Type":"ContainerStarted","Data":"fa3ea6651eae869a0462a13e01adee05d3815d3d5e2ac8c21692941c916b27e1"} Oct 08 14:37:10 crc kubenswrapper[4735]: I1008 14:37:10.301934 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-621e-account-create-8qpk6" event={"ID":"01938786-d5c0-40ba-857f-edcb7ae6477e","Type":"ContainerStarted","Data":"38bdefd4f00a60f08c16994e6e98ecda9a83635692b9e024dfdb51a5502693d1"} Oct 08 14:37:10 crc kubenswrapper[4735]: I1008 14:37:10.387218 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-621e-account-create-8qpk6" podStartSLOduration=2.387195555 podStartE2EDuration="2.387195555s" podCreationTimestamp="2025-10-08 14:37:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:37:10.366871125 +0000 UTC m=+6080.175175669" watchObservedRunningTime="2025-10-08 
14:37:10.387195555 +0000 UTC m=+6080.195500099" Oct 08 14:37:10 crc kubenswrapper[4735]: I1008 14:37:10.902581 4735 scope.go:117] "RemoveContainer" containerID="0ef7ce1bb2f88774957446f907a55957b94dad288444fb6f22993a47ba31f8b0" Oct 08 14:37:10 crc kubenswrapper[4735]: E1008 14:37:10.903309 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:37:11 crc kubenswrapper[4735]: I1008 14:37:11.312109 4735 generic.go:334] "Generic (PLEG): container finished" podID="01938786-d5c0-40ba-857f-edcb7ae6477e" containerID="fa3ea6651eae869a0462a13e01adee05d3815d3d5e2ac8c21692941c916b27e1" exitCode=0 Oct 08 14:37:11 crc kubenswrapper[4735]: I1008 14:37:11.312223 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-621e-account-create-8qpk6" event={"ID":"01938786-d5c0-40ba-857f-edcb7ae6477e","Type":"ContainerDied","Data":"fa3ea6651eae869a0462a13e01adee05d3815d3d5e2ac8c21692941c916b27e1"} Oct 08 14:37:12 crc kubenswrapper[4735]: I1008 14:37:12.060603 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-4kftk"] Oct 08 14:37:12 crc kubenswrapper[4735]: I1008 14:37:12.075372 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-4kftk"] Oct 08 14:37:12 crc kubenswrapper[4735]: I1008 14:37:12.726093 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-621e-account-create-8qpk6" Oct 08 14:37:12 crc kubenswrapper[4735]: I1008 14:37:12.832504 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kbqh\" (UniqueName: \"kubernetes.io/projected/01938786-d5c0-40ba-857f-edcb7ae6477e-kube-api-access-6kbqh\") pod \"01938786-d5c0-40ba-857f-edcb7ae6477e\" (UID: \"01938786-d5c0-40ba-857f-edcb7ae6477e\") " Oct 08 14:37:12 crc kubenswrapper[4735]: I1008 14:37:12.839667 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01938786-d5c0-40ba-857f-edcb7ae6477e-kube-api-access-6kbqh" (OuterVolumeSpecName: "kube-api-access-6kbqh") pod "01938786-d5c0-40ba-857f-edcb7ae6477e" (UID: "01938786-d5c0-40ba-857f-edcb7ae6477e"). InnerVolumeSpecName "kube-api-access-6kbqh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:37:12 crc kubenswrapper[4735]: I1008 14:37:12.908228 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34a77a37-d98c-41ee-aa9e-e1b814507e2b" path="/var/lib/kubelet/pods/34a77a37-d98c-41ee-aa9e-e1b814507e2b/volumes" Oct 08 14:37:12 crc kubenswrapper[4735]: I1008 14:37:12.935427 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kbqh\" (UniqueName: \"kubernetes.io/projected/01938786-d5c0-40ba-857f-edcb7ae6477e-kube-api-access-6kbqh\") on node \"crc\" DevicePath \"\"" Oct 08 14:37:13 crc kubenswrapper[4735]: I1008 14:37:13.334238 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-621e-account-create-8qpk6" event={"ID":"01938786-d5c0-40ba-857f-edcb7ae6477e","Type":"ContainerDied","Data":"38bdefd4f00a60f08c16994e6e98ecda9a83635692b9e024dfdb51a5502693d1"} Oct 08 14:37:13 crc kubenswrapper[4735]: I1008 14:37:13.334584 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38bdefd4f00a60f08c16994e6e98ecda9a83635692b9e024dfdb51a5502693d1" Oct 08 14:37:13 crc 
kubenswrapper[4735]: I1008 14:37:13.334661 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-621e-account-create-8qpk6" Oct 08 14:37:14 crc kubenswrapper[4735]: I1008 14:37:14.063024 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-ffzjx"] Oct 08 14:37:14 crc kubenswrapper[4735]: E1008 14:37:14.063406 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01938786-d5c0-40ba-857f-edcb7ae6477e" containerName="mariadb-account-create" Oct 08 14:37:14 crc kubenswrapper[4735]: I1008 14:37:14.063418 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="01938786-d5c0-40ba-857f-edcb7ae6477e" containerName="mariadb-account-create" Oct 08 14:37:14 crc kubenswrapper[4735]: I1008 14:37:14.063681 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="01938786-d5c0-40ba-857f-edcb7ae6477e" containerName="mariadb-account-create" Oct 08 14:37:14 crc kubenswrapper[4735]: I1008 14:37:14.064552 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-ffzjx" Oct 08 14:37:14 crc kubenswrapper[4735]: I1008 14:37:14.068706 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Oct 08 14:37:14 crc kubenswrapper[4735]: I1008 14:37:14.068923 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-njfr2" Oct 08 14:37:14 crc kubenswrapper[4735]: I1008 14:37:14.084540 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-ffzjx"] Oct 08 14:37:14 crc kubenswrapper[4735]: I1008 14:37:14.157439 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ab0b01a-9c11-478e-825f-c2d2fc24b9e8-combined-ca-bundle\") pod \"heat-db-sync-ffzjx\" (UID: \"9ab0b01a-9c11-478e-825f-c2d2fc24b9e8\") " pod="openstack/heat-db-sync-ffzjx" Oct 08 14:37:14 crc kubenswrapper[4735]: I1008 14:37:14.157508 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ab0b01a-9c11-478e-825f-c2d2fc24b9e8-config-data\") pod \"heat-db-sync-ffzjx\" (UID: \"9ab0b01a-9c11-478e-825f-c2d2fc24b9e8\") " pod="openstack/heat-db-sync-ffzjx" Oct 08 14:37:14 crc kubenswrapper[4735]: I1008 14:37:14.157656 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhhbg\" (UniqueName: \"kubernetes.io/projected/9ab0b01a-9c11-478e-825f-c2d2fc24b9e8-kube-api-access-xhhbg\") pod \"heat-db-sync-ffzjx\" (UID: \"9ab0b01a-9c11-478e-825f-c2d2fc24b9e8\") " pod="openstack/heat-db-sync-ffzjx" Oct 08 14:37:14 crc kubenswrapper[4735]: I1008 14:37:14.259608 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhhbg\" (UniqueName: \"kubernetes.io/projected/9ab0b01a-9c11-478e-825f-c2d2fc24b9e8-kube-api-access-xhhbg\") pod 
\"heat-db-sync-ffzjx\" (UID: \"9ab0b01a-9c11-478e-825f-c2d2fc24b9e8\") " pod="openstack/heat-db-sync-ffzjx" Oct 08 14:37:14 crc kubenswrapper[4735]: I1008 14:37:14.260036 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ab0b01a-9c11-478e-825f-c2d2fc24b9e8-combined-ca-bundle\") pod \"heat-db-sync-ffzjx\" (UID: \"9ab0b01a-9c11-478e-825f-c2d2fc24b9e8\") " pod="openstack/heat-db-sync-ffzjx" Oct 08 14:37:14 crc kubenswrapper[4735]: I1008 14:37:14.260083 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ab0b01a-9c11-478e-825f-c2d2fc24b9e8-config-data\") pod \"heat-db-sync-ffzjx\" (UID: \"9ab0b01a-9c11-478e-825f-c2d2fc24b9e8\") " pod="openstack/heat-db-sync-ffzjx" Oct 08 14:37:14 crc kubenswrapper[4735]: I1008 14:37:14.271788 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ab0b01a-9c11-478e-825f-c2d2fc24b9e8-config-data\") pod \"heat-db-sync-ffzjx\" (UID: \"9ab0b01a-9c11-478e-825f-c2d2fc24b9e8\") " pod="openstack/heat-db-sync-ffzjx" Oct 08 14:37:14 crc kubenswrapper[4735]: I1008 14:37:14.273942 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ab0b01a-9c11-478e-825f-c2d2fc24b9e8-combined-ca-bundle\") pod \"heat-db-sync-ffzjx\" (UID: \"9ab0b01a-9c11-478e-825f-c2d2fc24b9e8\") " pod="openstack/heat-db-sync-ffzjx" Oct 08 14:37:14 crc kubenswrapper[4735]: I1008 14:37:14.279696 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhhbg\" (UniqueName: \"kubernetes.io/projected/9ab0b01a-9c11-478e-825f-c2d2fc24b9e8-kube-api-access-xhhbg\") pod \"heat-db-sync-ffzjx\" (UID: \"9ab0b01a-9c11-478e-825f-c2d2fc24b9e8\") " pod="openstack/heat-db-sync-ffzjx" Oct 08 14:37:14 crc kubenswrapper[4735]: I1008 14:37:14.388436 
4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-ffzjx" Oct 08 14:37:14 crc kubenswrapper[4735]: I1008 14:37:14.854194 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-ffzjx"] Oct 08 14:37:15 crc kubenswrapper[4735]: I1008 14:37:15.346004 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7b88cbb49c-lcpt2" podUID="a947221d-27b4-4177-8ac7-26a005f8995c" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.109:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.109:8080: connect: connection refused" Oct 08 14:37:15 crc kubenswrapper[4735]: I1008 14:37:15.346531 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7b88cbb49c-lcpt2" Oct 08 14:37:15 crc kubenswrapper[4735]: I1008 14:37:15.361658 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-ffzjx" event={"ID":"9ab0b01a-9c11-478e-825f-c2d2fc24b9e8","Type":"ContainerStarted","Data":"c18dea644eef6721646d405a97dcb7af7616a2a2d5b2dac7cdc27f7853955464"} Oct 08 14:37:17 crc kubenswrapper[4735]: I1008 14:37:17.619254 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7b46fd549c-8kjz4" podUID="f5f29de7-461a-4ece-b951-e8842bc684a9" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.113:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.113:8080: connect: connection refused" Oct 08 14:37:20 crc kubenswrapper[4735]: I1008 14:37:20.419406 4735 generic.go:334] "Generic (PLEG): container finished" podID="a947221d-27b4-4177-8ac7-26a005f8995c" containerID="ccc337960433d2a19babae59e11e2078e5f1b3b7043b599a411b1290984b0348" exitCode=137 Oct 08 14:37:20 crc kubenswrapper[4735]: I1008 14:37:20.419514 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b88cbb49c-lcpt2" 
event={"ID":"a947221d-27b4-4177-8ac7-26a005f8995c","Type":"ContainerDied","Data":"ccc337960433d2a19babae59e11e2078e5f1b3b7043b599a411b1290984b0348"} Oct 08 14:37:22 crc kubenswrapper[4735]: I1008 14:37:22.954426 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7b88cbb49c-lcpt2" Oct 08 14:37:23 crc kubenswrapper[4735]: I1008 14:37:23.065197 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a947221d-27b4-4177-8ac7-26a005f8995c-horizon-secret-key\") pod \"a947221d-27b4-4177-8ac7-26a005f8995c\" (UID: \"a947221d-27b4-4177-8ac7-26a005f8995c\") " Oct 08 14:37:23 crc kubenswrapper[4735]: I1008 14:37:23.066511 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a947221d-27b4-4177-8ac7-26a005f8995c-config-data\") pod \"a947221d-27b4-4177-8ac7-26a005f8995c\" (UID: \"a947221d-27b4-4177-8ac7-26a005f8995c\") " Oct 08 14:37:23 crc kubenswrapper[4735]: I1008 14:37:23.066723 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a947221d-27b4-4177-8ac7-26a005f8995c-logs\") pod \"a947221d-27b4-4177-8ac7-26a005f8995c\" (UID: \"a947221d-27b4-4177-8ac7-26a005f8995c\") " Oct 08 14:37:23 crc kubenswrapper[4735]: I1008 14:37:23.066860 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a947221d-27b4-4177-8ac7-26a005f8995c-scripts\") pod \"a947221d-27b4-4177-8ac7-26a005f8995c\" (UID: \"a947221d-27b4-4177-8ac7-26a005f8995c\") " Oct 08 14:37:23 crc kubenswrapper[4735]: I1008 14:37:23.067262 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s64hc\" (UniqueName: \"kubernetes.io/projected/a947221d-27b4-4177-8ac7-26a005f8995c-kube-api-access-s64hc\") pod 
\"a947221d-27b4-4177-8ac7-26a005f8995c\" (UID: \"a947221d-27b4-4177-8ac7-26a005f8995c\") " Oct 08 14:37:23 crc kubenswrapper[4735]: I1008 14:37:23.067283 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a947221d-27b4-4177-8ac7-26a005f8995c-logs" (OuterVolumeSpecName: "logs") pod "a947221d-27b4-4177-8ac7-26a005f8995c" (UID: "a947221d-27b4-4177-8ac7-26a005f8995c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 14:37:23 crc kubenswrapper[4735]: I1008 14:37:23.068184 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a947221d-27b4-4177-8ac7-26a005f8995c-logs\") on node \"crc\" DevicePath \"\"" Oct 08 14:37:23 crc kubenswrapper[4735]: I1008 14:37:23.075309 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a947221d-27b4-4177-8ac7-26a005f8995c-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "a947221d-27b4-4177-8ac7-26a005f8995c" (UID: "a947221d-27b4-4177-8ac7-26a005f8995c"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:37:23 crc kubenswrapper[4735]: I1008 14:37:23.077709 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a947221d-27b4-4177-8ac7-26a005f8995c-kube-api-access-s64hc" (OuterVolumeSpecName: "kube-api-access-s64hc") pod "a947221d-27b4-4177-8ac7-26a005f8995c" (UID: "a947221d-27b4-4177-8ac7-26a005f8995c"). InnerVolumeSpecName "kube-api-access-s64hc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:37:23 crc kubenswrapper[4735]: I1008 14:37:23.090878 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a947221d-27b4-4177-8ac7-26a005f8995c-scripts" (OuterVolumeSpecName: "scripts") pod "a947221d-27b4-4177-8ac7-26a005f8995c" (UID: "a947221d-27b4-4177-8ac7-26a005f8995c"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:37:23 crc kubenswrapper[4735]: I1008 14:37:23.092825 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a947221d-27b4-4177-8ac7-26a005f8995c-config-data" (OuterVolumeSpecName: "config-data") pod "a947221d-27b4-4177-8ac7-26a005f8995c" (UID: "a947221d-27b4-4177-8ac7-26a005f8995c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:37:23 crc kubenswrapper[4735]: I1008 14:37:23.169894 4735 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a947221d-27b4-4177-8ac7-26a005f8995c-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 08 14:37:23 crc kubenswrapper[4735]: I1008 14:37:23.169929 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a947221d-27b4-4177-8ac7-26a005f8995c-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 14:37:23 crc kubenswrapper[4735]: I1008 14:37:23.169937 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a947221d-27b4-4177-8ac7-26a005f8995c-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 14:37:23 crc kubenswrapper[4735]: I1008 14:37:23.169947 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s64hc\" (UniqueName: \"kubernetes.io/projected/a947221d-27b4-4177-8ac7-26a005f8995c-kube-api-access-s64hc\") on node \"crc\" DevicePath \"\"" Oct 08 14:37:23 crc kubenswrapper[4735]: I1008 14:37:23.449723 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-ffzjx" event={"ID":"9ab0b01a-9c11-478e-825f-c2d2fc24b9e8","Type":"ContainerStarted","Data":"726a5619111e4b895e246ae0b2a93c8d3e781779c317a92f79b7d83812170e16"} Oct 08 14:37:23 crc kubenswrapper[4735]: I1008 14:37:23.452760 4735 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/horizon-7b88cbb49c-lcpt2" event={"ID":"a947221d-27b4-4177-8ac7-26a005f8995c","Type":"ContainerDied","Data":"9847a23056808cae3f7d1b3f770dcb92bd6b1691022c3f07c6e88b385af8153a"} Oct 08 14:37:23 crc kubenswrapper[4735]: I1008 14:37:23.452796 4735 scope.go:117] "RemoveContainer" containerID="35300fc03b3eed040e927910001a3d073d55c60d6dd7d227d421c3a8296977e2" Oct 08 14:37:23 crc kubenswrapper[4735]: I1008 14:37:23.452959 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7b88cbb49c-lcpt2" Oct 08 14:37:23 crc kubenswrapper[4735]: I1008 14:37:23.505110 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-ffzjx" podStartSLOduration=1.601697686 podStartE2EDuration="9.505084183s" podCreationTimestamp="2025-10-08 14:37:14 +0000 UTC" firstStartedPulling="2025-10-08 14:37:14.87432645 +0000 UTC m=+6084.682630994" lastFinishedPulling="2025-10-08 14:37:22.777712947 +0000 UTC m=+6092.586017491" observedRunningTime="2025-10-08 14:37:23.478232486 +0000 UTC m=+6093.286537040" watchObservedRunningTime="2025-10-08 14:37:23.505084183 +0000 UTC m=+6093.313388727" Oct 08 14:37:23 crc kubenswrapper[4735]: I1008 14:37:23.505456 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7b88cbb49c-lcpt2"] Oct 08 14:37:23 crc kubenswrapper[4735]: I1008 14:37:23.526084 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7b88cbb49c-lcpt2"] Oct 08 14:37:23 crc kubenswrapper[4735]: I1008 14:37:23.632663 4735 scope.go:117] "RemoveContainer" containerID="ccc337960433d2a19babae59e11e2078e5f1b3b7043b599a411b1290984b0348" Oct 08 14:37:24 crc kubenswrapper[4735]: I1008 14:37:24.905366 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a947221d-27b4-4177-8ac7-26a005f8995c" path="/var/lib/kubelet/pods/a947221d-27b4-4177-8ac7-26a005f8995c/volumes" Oct 08 14:37:25 crc kubenswrapper[4735]: I1008 14:37:25.477146 
4735 generic.go:334] "Generic (PLEG): container finished" podID="9ab0b01a-9c11-478e-825f-c2d2fc24b9e8" containerID="726a5619111e4b895e246ae0b2a93c8d3e781779c317a92f79b7d83812170e16" exitCode=0 Oct 08 14:37:25 crc kubenswrapper[4735]: I1008 14:37:25.477194 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-ffzjx" event={"ID":"9ab0b01a-9c11-478e-825f-c2d2fc24b9e8","Type":"ContainerDied","Data":"726a5619111e4b895e246ae0b2a93c8d3e781779c317a92f79b7d83812170e16"} Oct 08 14:37:25 crc kubenswrapper[4735]: E1008 14:37:25.557059 4735 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ab0b01a_9c11_478e_825f_c2d2fc24b9e8.slice/crio-conmon-726a5619111e4b895e246ae0b2a93c8d3e781779c317a92f79b7d83812170e16.scope\": RecentStats: unable to find data in memory cache]" Oct 08 14:37:25 crc kubenswrapper[4735]: I1008 14:37:25.895398 4735 scope.go:117] "RemoveContainer" containerID="0ef7ce1bb2f88774957446f907a55957b94dad288444fb6f22993a47ba31f8b0" Oct 08 14:37:25 crc kubenswrapper[4735]: E1008 14:37:25.895704 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:37:26 crc kubenswrapper[4735]: I1008 14:37:26.865711 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-ffzjx" Oct 08 14:37:26 crc kubenswrapper[4735]: I1008 14:37:26.937167 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ab0b01a-9c11-478e-825f-c2d2fc24b9e8-combined-ca-bundle\") pod \"9ab0b01a-9c11-478e-825f-c2d2fc24b9e8\" (UID: \"9ab0b01a-9c11-478e-825f-c2d2fc24b9e8\") " Oct 08 14:37:26 crc kubenswrapper[4735]: I1008 14:37:26.937318 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhhbg\" (UniqueName: \"kubernetes.io/projected/9ab0b01a-9c11-478e-825f-c2d2fc24b9e8-kube-api-access-xhhbg\") pod \"9ab0b01a-9c11-478e-825f-c2d2fc24b9e8\" (UID: \"9ab0b01a-9c11-478e-825f-c2d2fc24b9e8\") " Oct 08 14:37:26 crc kubenswrapper[4735]: I1008 14:37:26.937399 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ab0b01a-9c11-478e-825f-c2d2fc24b9e8-config-data\") pod \"9ab0b01a-9c11-478e-825f-c2d2fc24b9e8\" (UID: \"9ab0b01a-9c11-478e-825f-c2d2fc24b9e8\") " Oct 08 14:37:26 crc kubenswrapper[4735]: I1008 14:37:26.942757 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ab0b01a-9c11-478e-825f-c2d2fc24b9e8-kube-api-access-xhhbg" (OuterVolumeSpecName: "kube-api-access-xhhbg") pod "9ab0b01a-9c11-478e-825f-c2d2fc24b9e8" (UID: "9ab0b01a-9c11-478e-825f-c2d2fc24b9e8"). InnerVolumeSpecName "kube-api-access-xhhbg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:37:26 crc kubenswrapper[4735]: I1008 14:37:26.966318 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ab0b01a-9c11-478e-825f-c2d2fc24b9e8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9ab0b01a-9c11-478e-825f-c2d2fc24b9e8" (UID: "9ab0b01a-9c11-478e-825f-c2d2fc24b9e8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:37:27 crc kubenswrapper[4735]: I1008 14:37:27.007345 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ab0b01a-9c11-478e-825f-c2d2fc24b9e8-config-data" (OuterVolumeSpecName: "config-data") pod "9ab0b01a-9c11-478e-825f-c2d2fc24b9e8" (UID: "9ab0b01a-9c11-478e-825f-c2d2fc24b9e8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:37:27 crc kubenswrapper[4735]: I1008 14:37:27.040200 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ab0b01a-9c11-478e-825f-c2d2fc24b9e8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 14:37:27 crc kubenswrapper[4735]: I1008 14:37:27.040240 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhhbg\" (UniqueName: \"kubernetes.io/projected/9ab0b01a-9c11-478e-825f-c2d2fc24b9e8-kube-api-access-xhhbg\") on node \"crc\" DevicePath \"\"" Oct 08 14:37:27 crc kubenswrapper[4735]: I1008 14:37:27.040253 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ab0b01a-9c11-478e-825f-c2d2fc24b9e8-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 14:37:27 crc kubenswrapper[4735]: I1008 14:37:27.504707 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-ffzjx" event={"ID":"9ab0b01a-9c11-478e-825f-c2d2fc24b9e8","Type":"ContainerDied","Data":"c18dea644eef6721646d405a97dcb7af7616a2a2d5b2dac7cdc27f7853955464"} Oct 08 14:37:27 crc kubenswrapper[4735]: I1008 14:37:27.504749 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c18dea644eef6721646d405a97dcb7af7616a2a2d5b2dac7cdc27f7853955464" Oct 08 14:37:27 crc kubenswrapper[4735]: I1008 14:37:27.504827 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-ffzjx" Oct 08 14:37:28 crc kubenswrapper[4735]: I1008 14:37:28.720893 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-5764f7c644-f9hgd"] Oct 08 14:37:28 crc kubenswrapper[4735]: E1008 14:37:28.721664 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a947221d-27b4-4177-8ac7-26a005f8995c" containerName="horizon" Oct 08 14:37:28 crc kubenswrapper[4735]: I1008 14:37:28.721681 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="a947221d-27b4-4177-8ac7-26a005f8995c" containerName="horizon" Oct 08 14:37:28 crc kubenswrapper[4735]: E1008 14:37:28.721702 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ab0b01a-9c11-478e-825f-c2d2fc24b9e8" containerName="heat-db-sync" Oct 08 14:37:28 crc kubenswrapper[4735]: I1008 14:37:28.721710 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ab0b01a-9c11-478e-825f-c2d2fc24b9e8" containerName="heat-db-sync" Oct 08 14:37:28 crc kubenswrapper[4735]: E1008 14:37:28.721741 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a947221d-27b4-4177-8ac7-26a005f8995c" containerName="horizon-log" Oct 08 14:37:28 crc kubenswrapper[4735]: I1008 14:37:28.721752 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="a947221d-27b4-4177-8ac7-26a005f8995c" containerName="horizon-log" Oct 08 14:37:28 crc kubenswrapper[4735]: I1008 14:37:28.721986 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="a947221d-27b4-4177-8ac7-26a005f8995c" containerName="horizon" Oct 08 14:37:28 crc kubenswrapper[4735]: I1008 14:37:28.722006 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="a947221d-27b4-4177-8ac7-26a005f8995c" containerName="horizon-log" Oct 08 14:37:28 crc kubenswrapper[4735]: I1008 14:37:28.722029 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ab0b01a-9c11-478e-825f-c2d2fc24b9e8" containerName="heat-db-sync" Oct 08 14:37:28 crc 
kubenswrapper[4735]: I1008 14:37:28.722921 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-5764f7c644-f9hgd" Oct 08 14:37:28 crc kubenswrapper[4735]: I1008 14:37:28.728285 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Oct 08 14:37:28 crc kubenswrapper[4735]: I1008 14:37:28.729673 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Oct 08 14:37:28 crc kubenswrapper[4735]: I1008 14:37:28.729938 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-njfr2" Oct 08 14:37:28 crc kubenswrapper[4735]: I1008 14:37:28.742709 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-5764f7c644-f9hgd"] Oct 08 14:37:28 crc kubenswrapper[4735]: I1008 14:37:28.822398 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-68df5d85c7-wngwm"] Oct 08 14:37:28 crc kubenswrapper[4735]: I1008 14:37:28.823860 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-68df5d85c7-wngwm" Oct 08 14:37:28 crc kubenswrapper[4735]: I1008 14:37:28.826251 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Oct 08 14:37:28 crc kubenswrapper[4735]: I1008 14:37:28.835701 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-68df5d85c7-wngwm"] Oct 08 14:37:28 crc kubenswrapper[4735]: I1008 14:37:28.879431 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-6fd9c587dd-m8gj4"] Oct 08 14:37:28 crc kubenswrapper[4735]: I1008 14:37:28.880906 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-6fd9c587dd-m8gj4" Oct 08 14:37:28 crc kubenswrapper[4735]: I1008 14:37:28.884678 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Oct 08 14:37:28 crc kubenswrapper[4735]: I1008 14:37:28.887282 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa238d2c-c63c-43cb-9e0c-6387abad2e1a-config-data\") pod \"heat-engine-5764f7c644-f9hgd\" (UID: \"fa238d2c-c63c-43cb-9e0c-6387abad2e1a\") " pod="openstack/heat-engine-5764f7c644-f9hgd" Oct 08 14:37:28 crc kubenswrapper[4735]: I1008 14:37:28.887330 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2w576\" (UniqueName: \"kubernetes.io/projected/fa238d2c-c63c-43cb-9e0c-6387abad2e1a-kube-api-access-2w576\") pod \"heat-engine-5764f7c644-f9hgd\" (UID: \"fa238d2c-c63c-43cb-9e0c-6387abad2e1a\") " pod="openstack/heat-engine-5764f7c644-f9hgd" Oct 08 14:37:28 crc kubenswrapper[4735]: I1008 14:37:28.887575 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fa238d2c-c63c-43cb-9e0c-6387abad2e1a-config-data-custom\") pod \"heat-engine-5764f7c644-f9hgd\" (UID: \"fa238d2c-c63c-43cb-9e0c-6387abad2e1a\") " pod="openstack/heat-engine-5764f7c644-f9hgd" Oct 08 14:37:28 crc kubenswrapper[4735]: I1008 14:37:28.887641 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa238d2c-c63c-43cb-9e0c-6387abad2e1a-combined-ca-bundle\") pod \"heat-engine-5764f7c644-f9hgd\" (UID: \"fa238d2c-c63c-43cb-9e0c-6387abad2e1a\") " pod="openstack/heat-engine-5764f7c644-f9hgd" Oct 08 14:37:28 crc kubenswrapper[4735]: I1008 14:37:28.892465 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/heat-api-6fd9c587dd-m8gj4"] Oct 08 14:37:28 crc kubenswrapper[4735]: I1008 14:37:28.989546 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfmn6\" (UniqueName: \"kubernetes.io/projected/0c5050eb-31e7-479f-8669-7c97bb659b55-kube-api-access-dfmn6\") pod \"heat-api-6fd9c587dd-m8gj4\" (UID: \"0c5050eb-31e7-479f-8669-7c97bb659b55\") " pod="openstack/heat-api-6fd9c587dd-m8gj4" Oct 08 14:37:28 crc kubenswrapper[4735]: I1008 14:37:28.989649 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa238d2c-c63c-43cb-9e0c-6387abad2e1a-config-data\") pod \"heat-engine-5764f7c644-f9hgd\" (UID: \"fa238d2c-c63c-43cb-9e0c-6387abad2e1a\") " pod="openstack/heat-engine-5764f7c644-f9hgd" Oct 08 14:37:28 crc kubenswrapper[4735]: I1008 14:37:28.989673 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2w576\" (UniqueName: \"kubernetes.io/projected/fa238d2c-c63c-43cb-9e0c-6387abad2e1a-kube-api-access-2w576\") pod \"heat-engine-5764f7c644-f9hgd\" (UID: \"fa238d2c-c63c-43cb-9e0c-6387abad2e1a\") " pod="openstack/heat-engine-5764f7c644-f9hgd" Oct 08 14:37:28 crc kubenswrapper[4735]: I1008 14:37:28.989721 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkl45\" (UniqueName: \"kubernetes.io/projected/9f1aca0c-3f7f-4f0b-937f-56ba721f7f13-kube-api-access-hkl45\") pod \"heat-cfnapi-68df5d85c7-wngwm\" (UID: \"9f1aca0c-3f7f-4f0b-937f-56ba721f7f13\") " pod="openstack/heat-cfnapi-68df5d85c7-wngwm" Oct 08 14:37:28 crc kubenswrapper[4735]: I1008 14:37:28.989777 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f1aca0c-3f7f-4f0b-937f-56ba721f7f13-config-data\") pod \"heat-cfnapi-68df5d85c7-wngwm\" (UID: 
\"9f1aca0c-3f7f-4f0b-937f-56ba721f7f13\") " pod="openstack/heat-cfnapi-68df5d85c7-wngwm" Oct 08 14:37:28 crc kubenswrapper[4735]: I1008 14:37:28.989824 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c5050eb-31e7-479f-8669-7c97bb659b55-config-data\") pod \"heat-api-6fd9c587dd-m8gj4\" (UID: \"0c5050eb-31e7-479f-8669-7c97bb659b55\") " pod="openstack/heat-api-6fd9c587dd-m8gj4" Oct 08 14:37:28 crc kubenswrapper[4735]: I1008 14:37:28.989840 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f1aca0c-3f7f-4f0b-937f-56ba721f7f13-combined-ca-bundle\") pod \"heat-cfnapi-68df5d85c7-wngwm\" (UID: \"9f1aca0c-3f7f-4f0b-937f-56ba721f7f13\") " pod="openstack/heat-cfnapi-68df5d85c7-wngwm" Oct 08 14:37:28 crc kubenswrapper[4735]: I1008 14:37:28.989889 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fa238d2c-c63c-43cb-9e0c-6387abad2e1a-config-data-custom\") pod \"heat-engine-5764f7c644-f9hgd\" (UID: \"fa238d2c-c63c-43cb-9e0c-6387abad2e1a\") " pod="openstack/heat-engine-5764f7c644-f9hgd" Oct 08 14:37:28 crc kubenswrapper[4735]: I1008 14:37:28.989903 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9f1aca0c-3f7f-4f0b-937f-56ba721f7f13-config-data-custom\") pod \"heat-cfnapi-68df5d85c7-wngwm\" (UID: \"9f1aca0c-3f7f-4f0b-937f-56ba721f7f13\") " pod="openstack/heat-cfnapi-68df5d85c7-wngwm" Oct 08 14:37:28 crc kubenswrapper[4735]: I1008 14:37:28.989923 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c5050eb-31e7-479f-8669-7c97bb659b55-combined-ca-bundle\") pod 
\"heat-api-6fd9c587dd-m8gj4\" (UID: \"0c5050eb-31e7-479f-8669-7c97bb659b55\") " pod="openstack/heat-api-6fd9c587dd-m8gj4" Oct 08 14:37:28 crc kubenswrapper[4735]: I1008 14:37:28.989961 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa238d2c-c63c-43cb-9e0c-6387abad2e1a-combined-ca-bundle\") pod \"heat-engine-5764f7c644-f9hgd\" (UID: \"fa238d2c-c63c-43cb-9e0c-6387abad2e1a\") " pod="openstack/heat-engine-5764f7c644-f9hgd" Oct 08 14:37:28 crc kubenswrapper[4735]: I1008 14:37:28.989984 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0c5050eb-31e7-479f-8669-7c97bb659b55-config-data-custom\") pod \"heat-api-6fd9c587dd-m8gj4\" (UID: \"0c5050eb-31e7-479f-8669-7c97bb659b55\") " pod="openstack/heat-api-6fd9c587dd-m8gj4" Oct 08 14:37:28 crc kubenswrapper[4735]: I1008 14:37:28.995693 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa238d2c-c63c-43cb-9e0c-6387abad2e1a-combined-ca-bundle\") pod \"heat-engine-5764f7c644-f9hgd\" (UID: \"fa238d2c-c63c-43cb-9e0c-6387abad2e1a\") " pod="openstack/heat-engine-5764f7c644-f9hgd" Oct 08 14:37:28 crc kubenswrapper[4735]: I1008 14:37:28.995745 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fa238d2c-c63c-43cb-9e0c-6387abad2e1a-config-data-custom\") pod \"heat-engine-5764f7c644-f9hgd\" (UID: \"fa238d2c-c63c-43cb-9e0c-6387abad2e1a\") " pod="openstack/heat-engine-5764f7c644-f9hgd" Oct 08 14:37:29 crc kubenswrapper[4735]: I1008 14:37:29.005679 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2w576\" (UniqueName: \"kubernetes.io/projected/fa238d2c-c63c-43cb-9e0c-6387abad2e1a-kube-api-access-2w576\") pod \"heat-engine-5764f7c644-f9hgd\" 
(UID: \"fa238d2c-c63c-43cb-9e0c-6387abad2e1a\") " pod="openstack/heat-engine-5764f7c644-f9hgd" Oct 08 14:37:29 crc kubenswrapper[4735]: I1008 14:37:29.007819 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa238d2c-c63c-43cb-9e0c-6387abad2e1a-config-data\") pod \"heat-engine-5764f7c644-f9hgd\" (UID: \"fa238d2c-c63c-43cb-9e0c-6387abad2e1a\") " pod="openstack/heat-engine-5764f7c644-f9hgd" Oct 08 14:37:29 crc kubenswrapper[4735]: I1008 14:37:29.059747 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-5764f7c644-f9hgd" Oct 08 14:37:29 crc kubenswrapper[4735]: I1008 14:37:29.092589 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfmn6\" (UniqueName: \"kubernetes.io/projected/0c5050eb-31e7-479f-8669-7c97bb659b55-kube-api-access-dfmn6\") pod \"heat-api-6fd9c587dd-m8gj4\" (UID: \"0c5050eb-31e7-479f-8669-7c97bb659b55\") " pod="openstack/heat-api-6fd9c587dd-m8gj4" Oct 08 14:37:29 crc kubenswrapper[4735]: I1008 14:37:29.093166 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkl45\" (UniqueName: \"kubernetes.io/projected/9f1aca0c-3f7f-4f0b-937f-56ba721f7f13-kube-api-access-hkl45\") pod \"heat-cfnapi-68df5d85c7-wngwm\" (UID: \"9f1aca0c-3f7f-4f0b-937f-56ba721f7f13\") " pod="openstack/heat-cfnapi-68df5d85c7-wngwm" Oct 08 14:37:29 crc kubenswrapper[4735]: I1008 14:37:29.093508 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f1aca0c-3f7f-4f0b-937f-56ba721f7f13-config-data\") pod \"heat-cfnapi-68df5d85c7-wngwm\" (UID: \"9f1aca0c-3f7f-4f0b-937f-56ba721f7f13\") " pod="openstack/heat-cfnapi-68df5d85c7-wngwm" Oct 08 14:37:29 crc kubenswrapper[4735]: I1008 14:37:29.093602 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/0c5050eb-31e7-479f-8669-7c97bb659b55-config-data\") pod \"heat-api-6fd9c587dd-m8gj4\" (UID: \"0c5050eb-31e7-479f-8669-7c97bb659b55\") " pod="openstack/heat-api-6fd9c587dd-m8gj4" Oct 08 14:37:29 crc kubenswrapper[4735]: I1008 14:37:29.093644 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f1aca0c-3f7f-4f0b-937f-56ba721f7f13-combined-ca-bundle\") pod \"heat-cfnapi-68df5d85c7-wngwm\" (UID: \"9f1aca0c-3f7f-4f0b-937f-56ba721f7f13\") " pod="openstack/heat-cfnapi-68df5d85c7-wngwm" Oct 08 14:37:29 crc kubenswrapper[4735]: I1008 14:37:29.093735 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9f1aca0c-3f7f-4f0b-937f-56ba721f7f13-config-data-custom\") pod \"heat-cfnapi-68df5d85c7-wngwm\" (UID: \"9f1aca0c-3f7f-4f0b-937f-56ba721f7f13\") " pod="openstack/heat-cfnapi-68df5d85c7-wngwm" Oct 08 14:37:29 crc kubenswrapper[4735]: I1008 14:37:29.093763 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c5050eb-31e7-479f-8669-7c97bb659b55-combined-ca-bundle\") pod \"heat-api-6fd9c587dd-m8gj4\" (UID: \"0c5050eb-31e7-479f-8669-7c97bb659b55\") " pod="openstack/heat-api-6fd9c587dd-m8gj4" Oct 08 14:37:29 crc kubenswrapper[4735]: I1008 14:37:29.093830 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0c5050eb-31e7-479f-8669-7c97bb659b55-config-data-custom\") pod \"heat-api-6fd9c587dd-m8gj4\" (UID: \"0c5050eb-31e7-479f-8669-7c97bb659b55\") " pod="openstack/heat-api-6fd9c587dd-m8gj4" Oct 08 14:37:29 crc kubenswrapper[4735]: I1008 14:37:29.098067 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/9f1aca0c-3f7f-4f0b-937f-56ba721f7f13-config-data\") pod \"heat-cfnapi-68df5d85c7-wngwm\" (UID: \"9f1aca0c-3f7f-4f0b-937f-56ba721f7f13\") " pod="openstack/heat-cfnapi-68df5d85c7-wngwm" Oct 08 14:37:29 crc kubenswrapper[4735]: I1008 14:37:29.101003 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f1aca0c-3f7f-4f0b-937f-56ba721f7f13-combined-ca-bundle\") pod \"heat-cfnapi-68df5d85c7-wngwm\" (UID: \"9f1aca0c-3f7f-4f0b-937f-56ba721f7f13\") " pod="openstack/heat-cfnapi-68df5d85c7-wngwm" Oct 08 14:37:29 crc kubenswrapper[4735]: I1008 14:37:29.101594 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0c5050eb-31e7-479f-8669-7c97bb659b55-config-data-custom\") pod \"heat-api-6fd9c587dd-m8gj4\" (UID: \"0c5050eb-31e7-479f-8669-7c97bb659b55\") " pod="openstack/heat-api-6fd9c587dd-m8gj4" Oct 08 14:37:29 crc kubenswrapper[4735]: I1008 14:37:29.103031 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c5050eb-31e7-479f-8669-7c97bb659b55-config-data\") pod \"heat-api-6fd9c587dd-m8gj4\" (UID: \"0c5050eb-31e7-479f-8669-7c97bb659b55\") " pod="openstack/heat-api-6fd9c587dd-m8gj4" Oct 08 14:37:29 crc kubenswrapper[4735]: I1008 14:37:29.103174 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c5050eb-31e7-479f-8669-7c97bb659b55-combined-ca-bundle\") pod \"heat-api-6fd9c587dd-m8gj4\" (UID: \"0c5050eb-31e7-479f-8669-7c97bb659b55\") " pod="openstack/heat-api-6fd9c587dd-m8gj4" Oct 08 14:37:29 crc kubenswrapper[4735]: I1008 14:37:29.104196 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9f1aca0c-3f7f-4f0b-937f-56ba721f7f13-config-data-custom\") pod 
\"heat-cfnapi-68df5d85c7-wngwm\" (UID: \"9f1aca0c-3f7f-4f0b-937f-56ba721f7f13\") " pod="openstack/heat-cfnapi-68df5d85c7-wngwm" Oct 08 14:37:29 crc kubenswrapper[4735]: I1008 14:37:29.113639 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfmn6\" (UniqueName: \"kubernetes.io/projected/0c5050eb-31e7-479f-8669-7c97bb659b55-kube-api-access-dfmn6\") pod \"heat-api-6fd9c587dd-m8gj4\" (UID: \"0c5050eb-31e7-479f-8669-7c97bb659b55\") " pod="openstack/heat-api-6fd9c587dd-m8gj4" Oct 08 14:37:29 crc kubenswrapper[4735]: I1008 14:37:29.122638 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkl45\" (UniqueName: \"kubernetes.io/projected/9f1aca0c-3f7f-4f0b-937f-56ba721f7f13-kube-api-access-hkl45\") pod \"heat-cfnapi-68df5d85c7-wngwm\" (UID: \"9f1aca0c-3f7f-4f0b-937f-56ba721f7f13\") " pod="openstack/heat-cfnapi-68df5d85c7-wngwm" Oct 08 14:37:29 crc kubenswrapper[4735]: I1008 14:37:29.146642 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-68df5d85c7-wngwm" Oct 08 14:37:29 crc kubenswrapper[4735]: I1008 14:37:29.213270 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-6fd9c587dd-m8gj4" Oct 08 14:37:30 crc kubenswrapper[4735]: I1008 14:37:29.660956 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-5764f7c644-f9hgd"] Oct 08 14:37:30 crc kubenswrapper[4735]: I1008 14:37:29.782346 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-68df5d85c7-wngwm"] Oct 08 14:37:30 crc kubenswrapper[4735]: I1008 14:37:29.887643 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-6fd9c587dd-m8gj4"] Oct 08 14:37:30 crc kubenswrapper[4735]: I1008 14:37:30.081987 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7b46fd549c-8kjz4" Oct 08 14:37:30 crc kubenswrapper[4735]: I1008 14:37:30.542541 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-5764f7c644-f9hgd" event={"ID":"fa238d2c-c63c-43cb-9e0c-6387abad2e1a","Type":"ContainerStarted","Data":"e67f54da991e0d1e4099b73c789071fa0e4d6c3688ed7f0cabf4497242bd23c2"} Oct 08 14:37:30 crc kubenswrapper[4735]: I1008 14:37:30.543273 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-5764f7c644-f9hgd" event={"ID":"fa238d2c-c63c-43cb-9e0c-6387abad2e1a","Type":"ContainerStarted","Data":"4923a7fbc90b2fc6cf3ccebdf6911b815fd480e6fedea1ce77b0fa09d72a7b8a"} Oct 08 14:37:30 crc kubenswrapper[4735]: I1008 14:37:30.543309 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-5764f7c644-f9hgd" Oct 08 14:37:30 crc kubenswrapper[4735]: I1008 14:37:30.546825 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-68df5d85c7-wngwm" event={"ID":"9f1aca0c-3f7f-4f0b-937f-56ba721f7f13","Type":"ContainerStarted","Data":"ed08b54e4a3f20550a4e68fb52460e290c9c1a098fbcd3a6a2c978ae42f360ca"} Oct 08 14:37:30 crc kubenswrapper[4735]: I1008 14:37:30.551164 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/heat-api-6fd9c587dd-m8gj4" event={"ID":"0c5050eb-31e7-479f-8669-7c97bb659b55","Type":"ContainerStarted","Data":"58aec564c4336d89c6269fd705a615525312c65adc37a5c0671468c3627b989d"} Oct 08 14:37:30 crc kubenswrapper[4735]: I1008 14:37:30.563467 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-5764f7c644-f9hgd" podStartSLOduration=2.563438256 podStartE2EDuration="2.563438256s" podCreationTimestamp="2025-10-08 14:37:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:37:30.557996036 +0000 UTC m=+6100.366300580" watchObservedRunningTime="2025-10-08 14:37:30.563438256 +0000 UTC m=+6100.371742800" Oct 08 14:37:32 crc kubenswrapper[4735]: I1008 14:37:32.037774 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-7b46fd549c-8kjz4" Oct 08 14:37:32 crc kubenswrapper[4735]: I1008 14:37:32.113182 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-85dff5fbdf-tjmzh"] Oct 08 14:37:32 crc kubenswrapper[4735]: I1008 14:37:32.113440 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-85dff5fbdf-tjmzh" podUID="0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7" containerName="horizon-log" containerID="cri-o://0994015ce49c050452ad175f68c7d3cbc0ae3b9f7474645c82e3b42c402cb60e" gracePeriod=30 Oct 08 14:37:32 crc kubenswrapper[4735]: I1008 14:37:32.113607 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-85dff5fbdf-tjmzh" podUID="0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7" containerName="horizon" containerID="cri-o://ffe9a1b9bbaea760fdf0f412d097502671e78f51a2c446953cb757782335add7" gracePeriod=30 Oct 08 14:37:33 crc kubenswrapper[4735]: I1008 14:37:33.578931 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-68df5d85c7-wngwm" 
event={"ID":"9f1aca0c-3f7f-4f0b-937f-56ba721f7f13","Type":"ContainerStarted","Data":"af98eeeb730918b10bbb5cc1cc5711b8e5c9e9a4a111e40b15acc2b43434f287"} Oct 08 14:37:33 crc kubenswrapper[4735]: I1008 14:37:33.579433 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-68df5d85c7-wngwm" Oct 08 14:37:33 crc kubenswrapper[4735]: I1008 14:37:33.580696 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6fd9c587dd-m8gj4" event={"ID":"0c5050eb-31e7-479f-8669-7c97bb659b55","Type":"ContainerStarted","Data":"fc1dd577e2677bbd7f33c6d7f7fb8be27b48bfdb1130a32346b0e268597b558e"} Oct 08 14:37:33 crc kubenswrapper[4735]: I1008 14:37:33.580853 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-6fd9c587dd-m8gj4" Oct 08 14:37:33 crc kubenswrapper[4735]: I1008 14:37:33.601930 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-68df5d85c7-wngwm" podStartSLOduration=3.114012908 podStartE2EDuration="5.601908316s" podCreationTimestamp="2025-10-08 14:37:28 +0000 UTC" firstStartedPulling="2025-10-08 14:37:29.787347374 +0000 UTC m=+6099.595651908" lastFinishedPulling="2025-10-08 14:37:32.275242772 +0000 UTC m=+6102.083547316" observedRunningTime="2025-10-08 14:37:33.596968249 +0000 UTC m=+6103.405272793" watchObservedRunningTime="2025-10-08 14:37:33.601908316 +0000 UTC m=+6103.410212860" Oct 08 14:37:33 crc kubenswrapper[4735]: I1008 14:37:33.622205 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-6fd9c587dd-m8gj4" podStartSLOduration=3.240537855 podStartE2EDuration="5.622171854s" podCreationTimestamp="2025-10-08 14:37:28 +0000 UTC" firstStartedPulling="2025-10-08 14:37:29.896522577 +0000 UTC m=+6099.704827121" lastFinishedPulling="2025-10-08 14:37:32.278156576 +0000 UTC m=+6102.086461120" observedRunningTime="2025-10-08 14:37:33.613340378 +0000 UTC m=+6103.421644942" 
watchObservedRunningTime="2025-10-08 14:37:33.622171854 +0000 UTC m=+6103.430476398" Oct 08 14:37:35 crc kubenswrapper[4735]: I1008 14:37:35.601144 4735 generic.go:334] "Generic (PLEG): container finished" podID="0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7" containerID="ffe9a1b9bbaea760fdf0f412d097502671e78f51a2c446953cb757782335add7" exitCode=0 Oct 08 14:37:35 crc kubenswrapper[4735]: I1008 14:37:35.601210 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-85dff5fbdf-tjmzh" event={"ID":"0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7","Type":"ContainerDied","Data":"ffe9a1b9bbaea760fdf0f412d097502671e78f51a2c446953cb757782335add7"} Oct 08 14:37:35 crc kubenswrapper[4735]: I1008 14:37:35.715753 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-85dff5fbdf-tjmzh" podUID="0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.110:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.110:8080: connect: connection refused" Oct 08 14:37:39 crc kubenswrapper[4735]: I1008 14:37:39.086804 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-5764f7c644-f9hgd" Oct 08 14:37:40 crc kubenswrapper[4735]: I1008 14:37:40.603098 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-68df5d85c7-wngwm" Oct 08 14:37:40 crc kubenswrapper[4735]: I1008 14:37:40.832938 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-6fd9c587dd-m8gj4" Oct 08 14:37:40 crc kubenswrapper[4735]: I1008 14:37:40.909799 4735 scope.go:117] "RemoveContainer" containerID="0ef7ce1bb2f88774957446f907a55957b94dad288444fb6f22993a47ba31f8b0" Oct 08 14:37:40 crc kubenswrapper[4735]: E1008 14:37:40.911143 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:37:42 crc kubenswrapper[4735]: I1008 14:37:42.045531 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-d4zkz"] Oct 08 14:37:42 crc kubenswrapper[4735]: I1008 14:37:42.054332 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-d4zkz"] Oct 08 14:37:42 crc kubenswrapper[4735]: I1008 14:37:42.905714 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b1a6ad9-ce4b-43f9-ae6e-f58bbde050bc" path="/var/lib/kubelet/pods/1b1a6ad9-ce4b-43f9-ae6e-f58bbde050bc/volumes" Oct 08 14:37:45 crc kubenswrapper[4735]: I1008 14:37:45.715858 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-85dff5fbdf-tjmzh" podUID="0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.110:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.110:8080: connect: connection refused" Oct 08 14:37:50 crc kubenswrapper[4735]: I1008 14:37:50.475635 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2lr2"] Oct 08 14:37:50 crc kubenswrapper[4735]: I1008 14:37:50.478205 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2lr2" Oct 08 14:37:50 crc kubenswrapper[4735]: I1008 14:37:50.481344 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 08 14:37:50 crc kubenswrapper[4735]: I1008 14:37:50.484367 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2lr2"] Oct 08 14:37:50 crc kubenswrapper[4735]: I1008 14:37:50.640815 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3ad5f24f-e316-420c-a602-3272e3d8e0d1-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2lr2\" (UID: \"3ad5f24f-e316-420c-a602-3272e3d8e0d1\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2lr2" Oct 08 14:37:50 crc kubenswrapper[4735]: I1008 14:37:50.640956 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3ad5f24f-e316-420c-a602-3272e3d8e0d1-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2lr2\" (UID: \"3ad5f24f-e316-420c-a602-3272e3d8e0d1\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2lr2" Oct 08 14:37:50 crc kubenswrapper[4735]: I1008 14:37:50.641054 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmj49\" (UniqueName: \"kubernetes.io/projected/3ad5f24f-e316-420c-a602-3272e3d8e0d1-kube-api-access-wmj49\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2lr2\" (UID: \"3ad5f24f-e316-420c-a602-3272e3d8e0d1\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2lr2" Oct 08 14:37:50 crc kubenswrapper[4735]: 
I1008 14:37:50.742503 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3ad5f24f-e316-420c-a602-3272e3d8e0d1-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2lr2\" (UID: \"3ad5f24f-e316-420c-a602-3272e3d8e0d1\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2lr2" Oct 08 14:37:50 crc kubenswrapper[4735]: I1008 14:37:50.742676 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmj49\" (UniqueName: \"kubernetes.io/projected/3ad5f24f-e316-420c-a602-3272e3d8e0d1-kube-api-access-wmj49\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2lr2\" (UID: \"3ad5f24f-e316-420c-a602-3272e3d8e0d1\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2lr2" Oct 08 14:37:50 crc kubenswrapper[4735]: I1008 14:37:50.742728 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3ad5f24f-e316-420c-a602-3272e3d8e0d1-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2lr2\" (UID: \"3ad5f24f-e316-420c-a602-3272e3d8e0d1\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2lr2" Oct 08 14:37:50 crc kubenswrapper[4735]: I1008 14:37:50.743098 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3ad5f24f-e316-420c-a602-3272e3d8e0d1-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2lr2\" (UID: \"3ad5f24f-e316-420c-a602-3272e3d8e0d1\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2lr2" Oct 08 14:37:50 crc kubenswrapper[4735]: I1008 14:37:50.744756 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/3ad5f24f-e316-420c-a602-3272e3d8e0d1-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2lr2\" (UID: \"3ad5f24f-e316-420c-a602-3272e3d8e0d1\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2lr2" Oct 08 14:37:50 crc kubenswrapper[4735]: I1008 14:37:50.764846 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmj49\" (UniqueName: \"kubernetes.io/projected/3ad5f24f-e316-420c-a602-3272e3d8e0d1-kube-api-access-wmj49\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2lr2\" (UID: \"3ad5f24f-e316-420c-a602-3272e3d8e0d1\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2lr2" Oct 08 14:37:50 crc kubenswrapper[4735]: I1008 14:37:50.802390 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2lr2" Oct 08 14:37:51 crc kubenswrapper[4735]: I1008 14:37:51.265870 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2lr2"] Oct 08 14:37:51 crc kubenswrapper[4735]: W1008 14:37:51.269613 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ad5f24f_e316_420c_a602_3272e3d8e0d1.slice/crio-980f2f811cfcd98805e23517a493b566fb2de7ba16eaa1e2e72c615d86e0b6c4 WatchSource:0}: Error finding container 980f2f811cfcd98805e23517a493b566fb2de7ba16eaa1e2e72c615d86e0b6c4: Status 404 returned error can't find the container with id 980f2f811cfcd98805e23517a493b566fb2de7ba16eaa1e2e72c615d86e0b6c4 Oct 08 14:37:51 crc kubenswrapper[4735]: I1008 14:37:51.770052 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2lr2" 
event={"ID":"3ad5f24f-e316-420c-a602-3272e3d8e0d1","Type":"ContainerStarted","Data":"f0aae0e9ab9fd97578b53e70149cdea5ad9782fb4ff48a00d7f010b82d19f16a"} Oct 08 14:37:51 crc kubenswrapper[4735]: I1008 14:37:51.770123 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2lr2" event={"ID":"3ad5f24f-e316-420c-a602-3272e3d8e0d1","Type":"ContainerStarted","Data":"980f2f811cfcd98805e23517a493b566fb2de7ba16eaa1e2e72c615d86e0b6c4"} Oct 08 14:37:52 crc kubenswrapper[4735]: I1008 14:37:52.026046 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-cbfd-account-create-4mjzl"] Oct 08 14:37:52 crc kubenswrapper[4735]: I1008 14:37:52.034766 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-cbfd-account-create-4mjzl"] Oct 08 14:37:52 crc kubenswrapper[4735]: I1008 14:37:52.909333 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f1748c0-046e-4af1-b62d-1115b49da20c" path="/var/lib/kubelet/pods/5f1748c0-046e-4af1-b62d-1115b49da20c/volumes" Oct 08 14:37:53 crc kubenswrapper[4735]: I1008 14:37:53.795392 4735 generic.go:334] "Generic (PLEG): container finished" podID="3ad5f24f-e316-420c-a602-3272e3d8e0d1" containerID="f0aae0e9ab9fd97578b53e70149cdea5ad9782fb4ff48a00d7f010b82d19f16a" exitCode=0 Oct 08 14:37:53 crc kubenswrapper[4735]: I1008 14:37:53.795493 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2lr2" event={"ID":"3ad5f24f-e316-420c-a602-3272e3d8e0d1","Type":"ContainerDied","Data":"f0aae0e9ab9fd97578b53e70149cdea5ad9782fb4ff48a00d7f010b82d19f16a"} Oct 08 14:37:53 crc kubenswrapper[4735]: I1008 14:37:53.895062 4735 scope.go:117] "RemoveContainer" containerID="0ef7ce1bb2f88774957446f907a55957b94dad288444fb6f22993a47ba31f8b0" Oct 08 14:37:53 crc kubenswrapper[4735]: E1008 14:37:53.895834 4735 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:37:55 crc kubenswrapper[4735]: I1008 14:37:55.720040 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-85dff5fbdf-tjmzh" podUID="0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.110:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.110:8080: connect: connection refused" Oct 08 14:37:55 crc kubenswrapper[4735]: I1008 14:37:55.720868 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-85dff5fbdf-tjmzh" Oct 08 14:37:56 crc kubenswrapper[4735]: I1008 14:37:56.826173 4735 generic.go:334] "Generic (PLEG): container finished" podID="3ad5f24f-e316-420c-a602-3272e3d8e0d1" containerID="725aa8f2e6b5b866f2e2f289f4a542f9efcee580f6eb4193c07e473ff0a07e45" exitCode=0 Oct 08 14:37:56 crc kubenswrapper[4735]: I1008 14:37:56.826229 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2lr2" event={"ID":"3ad5f24f-e316-420c-a602-3272e3d8e0d1","Type":"ContainerDied","Data":"725aa8f2e6b5b866f2e2f289f4a542f9efcee580f6eb4193c07e473ff0a07e45"} Oct 08 14:37:57 crc kubenswrapper[4735]: I1008 14:37:57.839298 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2lr2" event={"ID":"3ad5f24f-e316-420c-a602-3272e3d8e0d1","Type":"ContainerStarted","Data":"e245e61c4efd6be69a0f60dffb4d549a832ff4aa7306e5394dbf3ea673972dc4"} Oct 08 14:37:57 crc kubenswrapper[4735]: I1008 
14:37:57.866157 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2lr2" podStartSLOduration=5.707506879 podStartE2EDuration="7.866140652s" podCreationTimestamp="2025-10-08 14:37:50 +0000 UTC" firstStartedPulling="2025-10-08 14:37:53.799009212 +0000 UTC m=+6123.607313756" lastFinishedPulling="2025-10-08 14:37:55.957642985 +0000 UTC m=+6125.765947529" observedRunningTime="2025-10-08 14:37:57.860260432 +0000 UTC m=+6127.668564986" watchObservedRunningTime="2025-10-08 14:37:57.866140652 +0000 UTC m=+6127.674445196" Oct 08 14:37:58 crc kubenswrapper[4735]: I1008 14:37:58.035173 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-9zftk"] Oct 08 14:37:58 crc kubenswrapper[4735]: I1008 14:37:58.046303 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-9zftk"] Oct 08 14:37:58 crc kubenswrapper[4735]: I1008 14:37:58.850622 4735 generic.go:334] "Generic (PLEG): container finished" podID="3ad5f24f-e316-420c-a602-3272e3d8e0d1" containerID="e245e61c4efd6be69a0f60dffb4d549a832ff4aa7306e5394dbf3ea673972dc4" exitCode=0 Oct 08 14:37:58 crc kubenswrapper[4735]: I1008 14:37:58.850663 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2lr2" event={"ID":"3ad5f24f-e316-420c-a602-3272e3d8e0d1","Type":"ContainerDied","Data":"e245e61c4efd6be69a0f60dffb4d549a832ff4aa7306e5394dbf3ea673972dc4"} Oct 08 14:37:58 crc kubenswrapper[4735]: I1008 14:37:58.905070 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38" path="/var/lib/kubelet/pods/c8045cd0-8d98-4fb0-aedd-3bc65b1d7b38/volumes" Oct 08 14:38:00 crc kubenswrapper[4735]: I1008 14:38:00.262344 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2lr2" Oct 08 14:38:00 crc kubenswrapper[4735]: I1008 14:38:00.373801 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3ad5f24f-e316-420c-a602-3272e3d8e0d1-bundle\") pod \"3ad5f24f-e316-420c-a602-3272e3d8e0d1\" (UID: \"3ad5f24f-e316-420c-a602-3272e3d8e0d1\") " Oct 08 14:38:00 crc kubenswrapper[4735]: I1008 14:38:00.374779 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmj49\" (UniqueName: \"kubernetes.io/projected/3ad5f24f-e316-420c-a602-3272e3d8e0d1-kube-api-access-wmj49\") pod \"3ad5f24f-e316-420c-a602-3272e3d8e0d1\" (UID: \"3ad5f24f-e316-420c-a602-3272e3d8e0d1\") " Oct 08 14:38:00 crc kubenswrapper[4735]: I1008 14:38:00.374968 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3ad5f24f-e316-420c-a602-3272e3d8e0d1-util\") pod \"3ad5f24f-e316-420c-a602-3272e3d8e0d1\" (UID: \"3ad5f24f-e316-420c-a602-3272e3d8e0d1\") " Oct 08 14:38:00 crc kubenswrapper[4735]: I1008 14:38:00.376566 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ad5f24f-e316-420c-a602-3272e3d8e0d1-bundle" (OuterVolumeSpecName: "bundle") pod "3ad5f24f-e316-420c-a602-3272e3d8e0d1" (UID: "3ad5f24f-e316-420c-a602-3272e3d8e0d1"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 14:38:00 crc kubenswrapper[4735]: I1008 14:38:00.387961 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ad5f24f-e316-420c-a602-3272e3d8e0d1-util" (OuterVolumeSpecName: "util") pod "3ad5f24f-e316-420c-a602-3272e3d8e0d1" (UID: "3ad5f24f-e316-420c-a602-3272e3d8e0d1"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 14:38:00 crc kubenswrapper[4735]: I1008 14:38:00.392315 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ad5f24f-e316-420c-a602-3272e3d8e0d1-kube-api-access-wmj49" (OuterVolumeSpecName: "kube-api-access-wmj49") pod "3ad5f24f-e316-420c-a602-3272e3d8e0d1" (UID: "3ad5f24f-e316-420c-a602-3272e3d8e0d1"). InnerVolumeSpecName "kube-api-access-wmj49". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:38:00 crc kubenswrapper[4735]: I1008 14:38:00.477983 4735 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3ad5f24f-e316-420c-a602-3272e3d8e0d1-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 14:38:00 crc kubenswrapper[4735]: I1008 14:38:00.478042 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmj49\" (UniqueName: \"kubernetes.io/projected/3ad5f24f-e316-420c-a602-3272e3d8e0d1-kube-api-access-wmj49\") on node \"crc\" DevicePath \"\"" Oct 08 14:38:00 crc kubenswrapper[4735]: I1008 14:38:00.478054 4735 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3ad5f24f-e316-420c-a602-3272e3d8e0d1-util\") on node \"crc\" DevicePath \"\"" Oct 08 14:38:00 crc kubenswrapper[4735]: I1008 14:38:00.873862 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2lr2" event={"ID":"3ad5f24f-e316-420c-a602-3272e3d8e0d1","Type":"ContainerDied","Data":"980f2f811cfcd98805e23517a493b566fb2de7ba16eaa1e2e72c615d86e0b6c4"} Oct 08 14:38:00 crc kubenswrapper[4735]: I1008 14:38:00.873916 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="980f2f811cfcd98805e23517a493b566fb2de7ba16eaa1e2e72c615d86e0b6c4" Oct 08 14:38:00 crc kubenswrapper[4735]: I1008 14:38:00.873969 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2lr2" Oct 08 14:38:02 crc kubenswrapper[4735]: I1008 14:38:02.613982 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-85dff5fbdf-tjmzh" Oct 08 14:38:02 crc kubenswrapper[4735]: I1008 14:38:02.727319 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7-horizon-secret-key\") pod \"0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7\" (UID: \"0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7\") " Oct 08 14:38:02 crc kubenswrapper[4735]: I1008 14:38:02.727380 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8986\" (UniqueName: \"kubernetes.io/projected/0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7-kube-api-access-h8986\") pod \"0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7\" (UID: \"0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7\") " Oct 08 14:38:02 crc kubenswrapper[4735]: I1008 14:38:02.727426 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7-scripts\") pod \"0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7\" (UID: \"0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7\") " Oct 08 14:38:02 crc kubenswrapper[4735]: I1008 14:38:02.728362 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7-config-data\") pod \"0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7\" (UID: \"0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7\") " Oct 08 14:38:02 crc kubenswrapper[4735]: I1008 14:38:02.728444 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7-logs\") pod 
\"0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7\" (UID: \"0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7\") " Oct 08 14:38:02 crc kubenswrapper[4735]: I1008 14:38:02.728936 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7-logs" (OuterVolumeSpecName: "logs") pod "0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7" (UID: "0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 14:38:02 crc kubenswrapper[4735]: I1008 14:38:02.729578 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7-logs\") on node \"crc\" DevicePath \"\"" Oct 08 14:38:02 crc kubenswrapper[4735]: I1008 14:38:02.732654 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7" (UID: "0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:38:02 crc kubenswrapper[4735]: I1008 14:38:02.733025 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7-kube-api-access-h8986" (OuterVolumeSpecName: "kube-api-access-h8986") pod "0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7" (UID: "0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7"). InnerVolumeSpecName "kube-api-access-h8986". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:38:02 crc kubenswrapper[4735]: I1008 14:38:02.754214 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7-scripts" (OuterVolumeSpecName: "scripts") pod "0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7" (UID: "0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:38:02 crc kubenswrapper[4735]: I1008 14:38:02.760766 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7-config-data" (OuterVolumeSpecName: "config-data") pod "0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7" (UID: "0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:38:02 crc kubenswrapper[4735]: I1008 14:38:02.830944 4735 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 08 14:38:02 crc kubenswrapper[4735]: I1008 14:38:02.831164 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8986\" (UniqueName: \"kubernetes.io/projected/0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7-kube-api-access-h8986\") on node \"crc\" DevicePath \"\"" Oct 08 14:38:02 crc kubenswrapper[4735]: I1008 14:38:02.831221 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 14:38:02 crc kubenswrapper[4735]: I1008 14:38:02.831301 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 14:38:02 crc kubenswrapper[4735]: I1008 14:38:02.892429 4735 generic.go:334] "Generic (PLEG): container finished" podID="0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7" containerID="0994015ce49c050452ad175f68c7d3cbc0ae3b9f7474645c82e3b42c402cb60e" exitCode=137 Oct 08 14:38:02 crc kubenswrapper[4735]: I1008 14:38:02.892507 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/horizon-85dff5fbdf-tjmzh" event={"ID":"0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7","Type":"ContainerDied","Data":"0994015ce49c050452ad175f68c7d3cbc0ae3b9f7474645c82e3b42c402cb60e"} Oct 08 14:38:02 crc kubenswrapper[4735]: I1008 14:38:02.892538 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-85dff5fbdf-tjmzh" Oct 08 14:38:02 crc kubenswrapper[4735]: I1008 14:38:02.892557 4735 scope.go:117] "RemoveContainer" containerID="ffe9a1b9bbaea760fdf0f412d097502671e78f51a2c446953cb757782335add7" Oct 08 14:38:02 crc kubenswrapper[4735]: I1008 14:38:02.892543 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-85dff5fbdf-tjmzh" event={"ID":"0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7","Type":"ContainerDied","Data":"b56abd8ef005541dbbd9133437482f04f3480782ad9dd137dc728e052c1e4a0f"} Oct 08 14:38:02 crc kubenswrapper[4735]: I1008 14:38:02.936448 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-85dff5fbdf-tjmzh"] Oct 08 14:38:02 crc kubenswrapper[4735]: I1008 14:38:02.946962 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-85dff5fbdf-tjmzh"] Oct 08 14:38:03 crc kubenswrapper[4735]: I1008 14:38:03.071073 4735 scope.go:117] "RemoveContainer" containerID="0994015ce49c050452ad175f68c7d3cbc0ae3b9f7474645c82e3b42c402cb60e" Oct 08 14:38:03 crc kubenswrapper[4735]: I1008 14:38:03.215351 4735 scope.go:117] "RemoveContainer" containerID="ffe9a1b9bbaea760fdf0f412d097502671e78f51a2c446953cb757782335add7" Oct 08 14:38:03 crc kubenswrapper[4735]: E1008 14:38:03.215788 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffe9a1b9bbaea760fdf0f412d097502671e78f51a2c446953cb757782335add7\": container with ID starting with ffe9a1b9bbaea760fdf0f412d097502671e78f51a2c446953cb757782335add7 not found: ID does not exist" 
containerID="ffe9a1b9bbaea760fdf0f412d097502671e78f51a2c446953cb757782335add7" Oct 08 14:38:03 crc kubenswrapper[4735]: I1008 14:38:03.215841 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffe9a1b9bbaea760fdf0f412d097502671e78f51a2c446953cb757782335add7"} err="failed to get container status \"ffe9a1b9bbaea760fdf0f412d097502671e78f51a2c446953cb757782335add7\": rpc error: code = NotFound desc = could not find container \"ffe9a1b9bbaea760fdf0f412d097502671e78f51a2c446953cb757782335add7\": container with ID starting with ffe9a1b9bbaea760fdf0f412d097502671e78f51a2c446953cb757782335add7 not found: ID does not exist" Oct 08 14:38:03 crc kubenswrapper[4735]: I1008 14:38:03.215873 4735 scope.go:117] "RemoveContainer" containerID="0994015ce49c050452ad175f68c7d3cbc0ae3b9f7474645c82e3b42c402cb60e" Oct 08 14:38:03 crc kubenswrapper[4735]: E1008 14:38:03.216351 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0994015ce49c050452ad175f68c7d3cbc0ae3b9f7474645c82e3b42c402cb60e\": container with ID starting with 0994015ce49c050452ad175f68c7d3cbc0ae3b9f7474645c82e3b42c402cb60e not found: ID does not exist" containerID="0994015ce49c050452ad175f68c7d3cbc0ae3b9f7474645c82e3b42c402cb60e" Oct 08 14:38:03 crc kubenswrapper[4735]: I1008 14:38:03.216384 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0994015ce49c050452ad175f68c7d3cbc0ae3b9f7474645c82e3b42c402cb60e"} err="failed to get container status \"0994015ce49c050452ad175f68c7d3cbc0ae3b9f7474645c82e3b42c402cb60e\": rpc error: code = NotFound desc = could not find container \"0994015ce49c050452ad175f68c7d3cbc0ae3b9f7474645c82e3b42c402cb60e\": container with ID starting with 0994015ce49c050452ad175f68c7d3cbc0ae3b9f7474645c82e3b42c402cb60e not found: ID does not exist" Oct 08 14:38:04 crc kubenswrapper[4735]: I1008 14:38:04.907306 4735 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7" path="/var/lib/kubelet/pods/0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7/volumes" Oct 08 14:38:07 crc kubenswrapper[4735]: I1008 14:38:07.896439 4735 scope.go:117] "RemoveContainer" containerID="0ef7ce1bb2f88774957446f907a55957b94dad288444fb6f22993a47ba31f8b0" Oct 08 14:38:07 crc kubenswrapper[4735]: E1008 14:38:07.897856 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:38:10 crc kubenswrapper[4735]: I1008 14:38:10.260824 4735 scope.go:117] "RemoveContainer" containerID="3c59c060a29f8d4240073cea3a259b5e477630b30ba6cdea9f0579b47ff48773" Oct 08 14:38:10 crc kubenswrapper[4735]: I1008 14:38:10.311823 4735 scope.go:117] "RemoveContainer" containerID="5e2bd3dd470fb0dbb4ca0ae7c84b4661bfab78a6a47eccb8dfbef58bad201925" Oct 08 14:38:10 crc kubenswrapper[4735]: I1008 14:38:10.374408 4735 scope.go:117] "RemoveContainer" containerID="41894952995a3b3c0a0eca17cf2bc077920971ed47b354f2041ec9b9da12a012" Oct 08 14:38:10 crc kubenswrapper[4735]: I1008 14:38:10.466911 4735 scope.go:117] "RemoveContainer" containerID="110bc7f6f7bfaf2cbc5fb42562e7fbc8d62992be329fd5e9129a36add1cb5d97" Oct 08 14:38:12 crc kubenswrapper[4735]: I1008 14:38:12.628023 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-6sklc"] Oct 08 14:38:12 crc kubenswrapper[4735]: E1008 14:38:12.632957 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ad5f24f-e316-420c-a602-3272e3d8e0d1" containerName="util" Oct 08 14:38:12 crc kubenswrapper[4735]: I1008 
14:38:12.633186 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ad5f24f-e316-420c-a602-3272e3d8e0d1" containerName="util" Oct 08 14:38:12 crc kubenswrapper[4735]: E1008 14:38:12.633210 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ad5f24f-e316-420c-a602-3272e3d8e0d1" containerName="pull" Oct 08 14:38:12 crc kubenswrapper[4735]: I1008 14:38:12.633219 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ad5f24f-e316-420c-a602-3272e3d8e0d1" containerName="pull" Oct 08 14:38:12 crc kubenswrapper[4735]: E1008 14:38:12.633232 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7" containerName="horizon-log" Oct 08 14:38:12 crc kubenswrapper[4735]: I1008 14:38:12.633240 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7" containerName="horizon-log" Oct 08 14:38:12 crc kubenswrapper[4735]: E1008 14:38:12.633254 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ad5f24f-e316-420c-a602-3272e3d8e0d1" containerName="extract" Oct 08 14:38:12 crc kubenswrapper[4735]: I1008 14:38:12.633261 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ad5f24f-e316-420c-a602-3272e3d8e0d1" containerName="extract" Oct 08 14:38:12 crc kubenswrapper[4735]: E1008 14:38:12.633282 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7" containerName="horizon" Oct 08 14:38:12 crc kubenswrapper[4735]: I1008 14:38:12.633290 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7" containerName="horizon" Oct 08 14:38:12 crc kubenswrapper[4735]: I1008 14:38:12.633558 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ad5f24f-e316-420c-a602-3272e3d8e0d1" containerName="extract" Oct 08 14:38:12 crc kubenswrapper[4735]: I1008 14:38:12.633577 4735 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7" containerName="horizon" Oct 08 14:38:12 crc kubenswrapper[4735]: I1008 14:38:12.633606 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fbe512e-e83e-4cb7-b38b-c50d6f32c8f7" containerName="horizon-log" Oct 08 14:38:12 crc kubenswrapper[4735]: I1008 14:38:12.634466 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-6sklc" Oct 08 14:38:12 crc kubenswrapper[4735]: I1008 14:38:12.637059 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Oct 08 14:38:12 crc kubenswrapper[4735]: I1008 14:38:12.637265 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-w66gl" Oct 08 14:38:12 crc kubenswrapper[4735]: I1008 14:38:12.637746 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Oct 08 14:38:12 crc kubenswrapper[4735]: I1008 14:38:12.648215 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-6sklc"] Oct 08 14:38:12 crc kubenswrapper[4735]: I1008 14:38:12.751089 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5cb54566d4-7djt5"] Oct 08 14:38:12 crc kubenswrapper[4735]: I1008 14:38:12.751584 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96pn8\" (UniqueName: \"kubernetes.io/projected/df4c8416-1970-48fb-a03f-61fc1addc8f7-kube-api-access-96pn8\") pod \"obo-prometheus-operator-7c8cf85677-6sklc\" (UID: \"df4c8416-1970-48fb-a03f-61fc1addc8f7\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-6sklc" Oct 08 14:38:12 crc kubenswrapper[4735]: I1008 14:38:12.753184 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5cb54566d4-7djt5" Oct 08 14:38:12 crc kubenswrapper[4735]: I1008 14:38:12.756514 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-4k99s" Oct 08 14:38:12 crc kubenswrapper[4735]: I1008 14:38:12.756767 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Oct 08 14:38:12 crc kubenswrapper[4735]: I1008 14:38:12.773662 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5cb54566d4-7djt5"] Oct 08 14:38:12 crc kubenswrapper[4735]: I1008 14:38:12.789853 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5cb54566d4-jbjnj"] Oct 08 14:38:12 crc kubenswrapper[4735]: I1008 14:38:12.791663 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5cb54566d4-jbjnj" Oct 08 14:38:12 crc kubenswrapper[4735]: I1008 14:38:12.807773 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5cb54566d4-jbjnj"] Oct 08 14:38:12 crc kubenswrapper[4735]: I1008 14:38:12.853365 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96pn8\" (UniqueName: \"kubernetes.io/projected/df4c8416-1970-48fb-a03f-61fc1addc8f7-kube-api-access-96pn8\") pod \"obo-prometheus-operator-7c8cf85677-6sklc\" (UID: \"df4c8416-1970-48fb-a03f-61fc1addc8f7\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-6sklc" Oct 08 14:38:12 crc kubenswrapper[4735]: I1008 14:38:12.877345 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96pn8\" (UniqueName: \"kubernetes.io/projected/df4c8416-1970-48fb-a03f-61fc1addc8f7-kube-api-access-96pn8\") pod \"obo-prometheus-operator-7c8cf85677-6sklc\" (UID: \"df4c8416-1970-48fb-a03f-61fc1addc8f7\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-6sklc" Oct 08 14:38:12 crc kubenswrapper[4735]: I1008 14:38:12.954879 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/41a64cf2-ae40-4e4b-a8d3-6e18f9207a16-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5cb54566d4-jbjnj\" (UID: \"41a64cf2-ae40-4e4b-a8d3-6e18f9207a16\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5cb54566d4-jbjnj" Oct 08 14:38:12 crc kubenswrapper[4735]: I1008 14:38:12.955244 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/daa71edb-4564-4793-bfc0-8499e44e2b37-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5cb54566d4-7djt5\" (UID: 
\"daa71edb-4564-4793-bfc0-8499e44e2b37\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5cb54566d4-7djt5" Oct 08 14:38:12 crc kubenswrapper[4735]: I1008 14:38:12.956186 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/41a64cf2-ae40-4e4b-a8d3-6e18f9207a16-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5cb54566d4-jbjnj\" (UID: \"41a64cf2-ae40-4e4b-a8d3-6e18f9207a16\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5cb54566d4-jbjnj" Oct 08 14:38:12 crc kubenswrapper[4735]: I1008 14:38:12.956893 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/daa71edb-4564-4793-bfc0-8499e44e2b37-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5cb54566d4-7djt5\" (UID: \"daa71edb-4564-4793-bfc0-8499e44e2b37\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5cb54566d4-7djt5" Oct 08 14:38:12 crc kubenswrapper[4735]: I1008 14:38:12.958199 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-68pfv"] Oct 08 14:38:12 crc kubenswrapper[4735]: I1008 14:38:12.959584 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-68pfv" Oct 08 14:38:12 crc kubenswrapper[4735]: I1008 14:38:12.962954 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-6sklc" Oct 08 14:38:12 crc kubenswrapper[4735]: I1008 14:38:12.966259 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-27bt6" Oct 08 14:38:12 crc kubenswrapper[4735]: I1008 14:38:12.968030 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Oct 08 14:38:12 crc kubenswrapper[4735]: I1008 14:38:12.981717 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-68pfv"] Oct 08 14:38:13 crc kubenswrapper[4735]: I1008 14:38:13.058554 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/41a64cf2-ae40-4e4b-a8d3-6e18f9207a16-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5cb54566d4-jbjnj\" (UID: \"41a64cf2-ae40-4e4b-a8d3-6e18f9207a16\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5cb54566d4-jbjnj" Oct 08 14:38:13 crc kubenswrapper[4735]: I1008 14:38:13.058909 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/daa71edb-4564-4793-bfc0-8499e44e2b37-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5cb54566d4-7djt5\" (UID: \"daa71edb-4564-4793-bfc0-8499e44e2b37\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5cb54566d4-7djt5" Oct 08 14:38:13 crc kubenswrapper[4735]: I1008 14:38:13.059020 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/41a64cf2-ae40-4e4b-a8d3-6e18f9207a16-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5cb54566d4-jbjnj\" (UID: \"41a64cf2-ae40-4e4b-a8d3-6e18f9207a16\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-5cb54566d4-jbjnj" Oct 08 14:38:13 crc kubenswrapper[4735]: I1008 14:38:13.059105 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/daa71edb-4564-4793-bfc0-8499e44e2b37-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5cb54566d4-7djt5\" (UID: \"daa71edb-4564-4793-bfc0-8499e44e2b37\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5cb54566d4-7djt5" Oct 08 14:38:13 crc kubenswrapper[4735]: I1008 14:38:13.062563 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/daa71edb-4564-4793-bfc0-8499e44e2b37-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5cb54566d4-7djt5\" (UID: \"daa71edb-4564-4793-bfc0-8499e44e2b37\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5cb54566d4-7djt5" Oct 08 14:38:13 crc kubenswrapper[4735]: I1008 14:38:13.062622 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/41a64cf2-ae40-4e4b-a8d3-6e18f9207a16-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5cb54566d4-jbjnj\" (UID: \"41a64cf2-ae40-4e4b-a8d3-6e18f9207a16\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5cb54566d4-jbjnj" Oct 08 14:38:13 crc kubenswrapper[4735]: I1008 14:38:13.062883 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/41a64cf2-ae40-4e4b-a8d3-6e18f9207a16-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5cb54566d4-jbjnj\" (UID: \"41a64cf2-ae40-4e4b-a8d3-6e18f9207a16\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5cb54566d4-jbjnj" Oct 08 14:38:13 crc kubenswrapper[4735]: I1008 14:38:13.065637 4735 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/daa71edb-4564-4793-bfc0-8499e44e2b37-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5cb54566d4-7djt5\" (UID: \"daa71edb-4564-4793-bfc0-8499e44e2b37\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5cb54566d4-7djt5" Oct 08 14:38:13 crc kubenswrapper[4735]: I1008 14:38:13.075601 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5cb54566d4-7djt5" Oct 08 14:38:13 crc kubenswrapper[4735]: I1008 14:38:13.128781 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5cb54566d4-jbjnj" Oct 08 14:38:13 crc kubenswrapper[4735]: I1008 14:38:13.161519 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/ec3f9923-3121-4608-8145-ba70fad25a5e-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-68pfv\" (UID: \"ec3f9923-3121-4608-8145-ba70fad25a5e\") " pod="openshift-operators/observability-operator-cc5f78dfc-68pfv" Oct 08 14:38:13 crc kubenswrapper[4735]: I1008 14:38:13.161606 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnkkt\" (UniqueName: \"kubernetes.io/projected/ec3f9923-3121-4608-8145-ba70fad25a5e-kube-api-access-rnkkt\") pod \"observability-operator-cc5f78dfc-68pfv\" (UID: \"ec3f9923-3121-4608-8145-ba70fad25a5e\") " pod="openshift-operators/observability-operator-cc5f78dfc-68pfv" Oct 08 14:38:13 crc kubenswrapper[4735]: I1008 14:38:13.239555 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-kc8w9"] Oct 08 14:38:13 crc kubenswrapper[4735]: I1008 14:38:13.241220 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-kc8w9" Oct 08 14:38:13 crc kubenswrapper[4735]: I1008 14:38:13.252855 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-8mvl4" Oct 08 14:38:13 crc kubenswrapper[4735]: I1008 14:38:13.254125 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-kc8w9"] Oct 08 14:38:13 crc kubenswrapper[4735]: I1008 14:38:13.269630 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/ec3f9923-3121-4608-8145-ba70fad25a5e-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-68pfv\" (UID: \"ec3f9923-3121-4608-8145-ba70fad25a5e\") " pod="openshift-operators/observability-operator-cc5f78dfc-68pfv" Oct 08 14:38:13 crc kubenswrapper[4735]: I1008 14:38:13.269995 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnkkt\" (UniqueName: \"kubernetes.io/projected/ec3f9923-3121-4608-8145-ba70fad25a5e-kube-api-access-rnkkt\") pod \"observability-operator-cc5f78dfc-68pfv\" (UID: \"ec3f9923-3121-4608-8145-ba70fad25a5e\") " pod="openshift-operators/observability-operator-cc5f78dfc-68pfv" Oct 08 14:38:13 crc kubenswrapper[4735]: I1008 14:38:13.270035 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/eeb8e4fe-df75-46aa-9f6a-6760a53df37b-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-kc8w9\" (UID: \"eeb8e4fe-df75-46aa-9f6a-6760a53df37b\") " pod="openshift-operators/perses-operator-54bc95c9fb-kc8w9" Oct 08 14:38:13 crc kubenswrapper[4735]: I1008 14:38:13.270111 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xb6zj\" (UniqueName: 
\"kubernetes.io/projected/eeb8e4fe-df75-46aa-9f6a-6760a53df37b-kube-api-access-xb6zj\") pod \"perses-operator-54bc95c9fb-kc8w9\" (UID: \"eeb8e4fe-df75-46aa-9f6a-6760a53df37b\") " pod="openshift-operators/perses-operator-54bc95c9fb-kc8w9" Oct 08 14:38:13 crc kubenswrapper[4735]: I1008 14:38:13.285568 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/ec3f9923-3121-4608-8145-ba70fad25a5e-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-68pfv\" (UID: \"ec3f9923-3121-4608-8145-ba70fad25a5e\") " pod="openshift-operators/observability-operator-cc5f78dfc-68pfv" Oct 08 14:38:13 crc kubenswrapper[4735]: I1008 14:38:13.310466 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnkkt\" (UniqueName: \"kubernetes.io/projected/ec3f9923-3121-4608-8145-ba70fad25a5e-kube-api-access-rnkkt\") pod \"observability-operator-cc5f78dfc-68pfv\" (UID: \"ec3f9923-3121-4608-8145-ba70fad25a5e\") " pod="openshift-operators/observability-operator-cc5f78dfc-68pfv" Oct 08 14:38:13 crc kubenswrapper[4735]: I1008 14:38:13.328554 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-68pfv" Oct 08 14:38:13 crc kubenswrapper[4735]: I1008 14:38:13.379457 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/eeb8e4fe-df75-46aa-9f6a-6760a53df37b-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-kc8w9\" (UID: \"eeb8e4fe-df75-46aa-9f6a-6760a53df37b\") " pod="openshift-operators/perses-operator-54bc95c9fb-kc8w9" Oct 08 14:38:13 crc kubenswrapper[4735]: I1008 14:38:13.379614 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xb6zj\" (UniqueName: \"kubernetes.io/projected/eeb8e4fe-df75-46aa-9f6a-6760a53df37b-kube-api-access-xb6zj\") pod \"perses-operator-54bc95c9fb-kc8w9\" (UID: \"eeb8e4fe-df75-46aa-9f6a-6760a53df37b\") " pod="openshift-operators/perses-operator-54bc95c9fb-kc8w9" Oct 08 14:38:13 crc kubenswrapper[4735]: I1008 14:38:13.380387 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/eeb8e4fe-df75-46aa-9f6a-6760a53df37b-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-kc8w9\" (UID: \"eeb8e4fe-df75-46aa-9f6a-6760a53df37b\") " pod="openshift-operators/perses-operator-54bc95c9fb-kc8w9" Oct 08 14:38:13 crc kubenswrapper[4735]: I1008 14:38:13.394929 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xb6zj\" (UniqueName: \"kubernetes.io/projected/eeb8e4fe-df75-46aa-9f6a-6760a53df37b-kube-api-access-xb6zj\") pod \"perses-operator-54bc95c9fb-kc8w9\" (UID: \"eeb8e4fe-df75-46aa-9f6a-6760a53df37b\") " pod="openshift-operators/perses-operator-54bc95c9fb-kc8w9" Oct 08 14:38:13 crc kubenswrapper[4735]: I1008 14:38:13.613034 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-kc8w9" Oct 08 14:38:13 crc kubenswrapper[4735]: I1008 14:38:13.709334 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-6sklc"] Oct 08 14:38:13 crc kubenswrapper[4735]: W1008 14:38:13.768861 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf4c8416_1970_48fb_a03f_61fc1addc8f7.slice/crio-57f321245d80e3e61cfa17e2f07611d62662e74c52b22151c0b8cf70366c95a4 WatchSource:0}: Error finding container 57f321245d80e3e61cfa17e2f07611d62662e74c52b22151c0b8cf70366c95a4: Status 404 returned error can't find the container with id 57f321245d80e3e61cfa17e2f07611d62662e74c52b22151c0b8cf70366c95a4 Oct 08 14:38:13 crc kubenswrapper[4735]: I1008 14:38:13.901451 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5cb54566d4-7djt5"] Oct 08 14:38:14 crc kubenswrapper[4735]: I1008 14:38:14.039522 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5cb54566d4-7djt5" event={"ID":"daa71edb-4564-4793-bfc0-8499e44e2b37","Type":"ContainerStarted","Data":"8d6526832173b1d130f5f3fb4a8b1d4cb5fbf6914ec3430feaa7cd5fa45baa9e"} Oct 08 14:38:14 crc kubenswrapper[4735]: I1008 14:38:14.040707 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-6sklc" event={"ID":"df4c8416-1970-48fb-a03f-61fc1addc8f7","Type":"ContainerStarted","Data":"57f321245d80e3e61cfa17e2f07611d62662e74c52b22151c0b8cf70366c95a4"} Oct 08 14:38:14 crc kubenswrapper[4735]: I1008 14:38:14.127748 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-68pfv"] Oct 08 14:38:14 crc kubenswrapper[4735]: I1008 14:38:14.181646 4735 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5cb54566d4-jbjnj"] Oct 08 14:38:14 crc kubenswrapper[4735]: I1008 14:38:14.235667 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-kc8w9"] Oct 08 14:38:14 crc kubenswrapper[4735]: W1008 14:38:14.243176 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeeb8e4fe_df75_46aa_9f6a_6760a53df37b.slice/crio-1d39ab70c9fead7257a81c72945bf8677bfc1c3ef042237d4d8a70cdc93dd146 WatchSource:0}: Error finding container 1d39ab70c9fead7257a81c72945bf8677bfc1c3ef042237d4d8a70cdc93dd146: Status 404 returned error can't find the container with id 1d39ab70c9fead7257a81c72945bf8677bfc1c3ef042237d4d8a70cdc93dd146 Oct 08 14:38:15 crc kubenswrapper[4735]: I1008 14:38:15.066031 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-54bc95c9fb-kc8w9" event={"ID":"eeb8e4fe-df75-46aa-9f6a-6760a53df37b","Type":"ContainerStarted","Data":"1d39ab70c9fead7257a81c72945bf8677bfc1c3ef042237d4d8a70cdc93dd146"} Oct 08 14:38:15 crc kubenswrapper[4735]: I1008 14:38:15.069980 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5cb54566d4-jbjnj" event={"ID":"41a64cf2-ae40-4e4b-a8d3-6e18f9207a16","Type":"ContainerStarted","Data":"e00a0e04f4c4e9bbafbaba56eb931901bd22e8c94b32e7809b770d203d918528"} Oct 08 14:38:15 crc kubenswrapper[4735]: I1008 14:38:15.071412 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-68pfv" event={"ID":"ec3f9923-3121-4608-8145-ba70fad25a5e","Type":"ContainerStarted","Data":"b49fa94936f1769e9418efaba56e1f2ccfc6ca07ce209ae08f96f3c1d254828b"} Oct 08 14:38:18 crc kubenswrapper[4735]: I1008 14:38:18.895084 4735 scope.go:117] "RemoveContainer" 
containerID="0ef7ce1bb2f88774957446f907a55957b94dad288444fb6f22993a47ba31f8b0" Oct 08 14:38:18 crc kubenswrapper[4735]: E1008 14:38:18.896007 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:38:25 crc kubenswrapper[4735]: I1008 14:38:25.239268 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-68pfv" event={"ID":"ec3f9923-3121-4608-8145-ba70fad25a5e","Type":"ContainerStarted","Data":"86a2a056a6463c067095e3411a86da301593fffb3deaaf09f4cb2c5fcfbabfc7"} Oct 08 14:38:25 crc kubenswrapper[4735]: I1008 14:38:25.240016 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-cc5f78dfc-68pfv" Oct 08 14:38:25 crc kubenswrapper[4735]: I1008 14:38:25.241584 4735 patch_prober.go:28] interesting pod/observability-operator-cc5f78dfc-68pfv container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.1.124:8081/healthz\": dial tcp 10.217.1.124:8081: connect: connection refused" start-of-body= Oct 08 14:38:25 crc kubenswrapper[4735]: I1008 14:38:25.241626 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-cc5f78dfc-68pfv" podUID="ec3f9923-3121-4608-8145-ba70fad25a5e" containerName="operator" probeResult="failure" output="Get \"http://10.217.1.124:8081/healthz\": dial tcp 10.217.1.124:8081: connect: connection refused" Oct 08 14:38:25 crc kubenswrapper[4735]: I1008 14:38:25.242566 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operators/perses-operator-54bc95c9fb-kc8w9" event={"ID":"eeb8e4fe-df75-46aa-9f6a-6760a53df37b","Type":"ContainerStarted","Data":"47caea1a374f591db4eb28bc31c9c18d3b8cede54b4243aff487be892c8a9fe0"} Oct 08 14:38:25 crc kubenswrapper[4735]: I1008 14:38:25.243765 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-54bc95c9fb-kc8w9" Oct 08 14:38:25 crc kubenswrapper[4735]: I1008 14:38:25.250904 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5cb54566d4-jbjnj" event={"ID":"41a64cf2-ae40-4e4b-a8d3-6e18f9207a16","Type":"ContainerStarted","Data":"3194096fd11322ab073392791d4f992478c275eb3a94bfaec1f468247a196539"} Oct 08 14:38:25 crc kubenswrapper[4735]: I1008 14:38:25.263945 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-cc5f78dfc-68pfv" podStartSLOduration=2.663796445 podStartE2EDuration="13.263929352s" podCreationTimestamp="2025-10-08 14:38:12 +0000 UTC" firstStartedPulling="2025-10-08 14:38:14.100535777 +0000 UTC m=+6143.908840321" lastFinishedPulling="2025-10-08 14:38:24.700668684 +0000 UTC m=+6154.508973228" observedRunningTime="2025-10-08 14:38:25.261962862 +0000 UTC m=+6155.070267406" watchObservedRunningTime="2025-10-08 14:38:25.263929352 +0000 UTC m=+6155.072233896" Oct 08 14:38:25 crc kubenswrapper[4735]: I1008 14:38:25.291930 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5cb54566d4-jbjnj" podStartSLOduration=2.990403239 podStartE2EDuration="13.291907778s" podCreationTimestamp="2025-10-08 14:38:12 +0000 UTC" firstStartedPulling="2025-10-08 14:38:14.21365818 +0000 UTC m=+6144.021962724" lastFinishedPulling="2025-10-08 14:38:24.515162719 +0000 UTC m=+6154.323467263" observedRunningTime="2025-10-08 14:38:25.280817414 +0000 UTC m=+6155.089121978" 
watchObservedRunningTime="2025-10-08 14:38:25.291907778 +0000 UTC m=+6155.100212322" Oct 08 14:38:25 crc kubenswrapper[4735]: I1008 14:38:25.305828 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5cb54566d4-7djt5" podStartSLOduration=2.704228208 podStartE2EDuration="13.305807543s" podCreationTimestamp="2025-10-08 14:38:12 +0000 UTC" firstStartedPulling="2025-10-08 14:38:13.913669687 +0000 UTC m=+6143.721974231" lastFinishedPulling="2025-10-08 14:38:24.515249012 +0000 UTC m=+6154.323553566" observedRunningTime="2025-10-08 14:38:25.303982927 +0000 UTC m=+6155.112287471" watchObservedRunningTime="2025-10-08 14:38:25.305807543 +0000 UTC m=+6155.114112107" Oct 08 14:38:25 crc kubenswrapper[4735]: I1008 14:38:25.334225 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-54bc95c9fb-kc8w9" podStartSLOduration=2.054630412 podStartE2EDuration="12.33420885s" podCreationTimestamp="2025-10-08 14:38:13 +0000 UTC" firstStartedPulling="2025-10-08 14:38:14.247694051 +0000 UTC m=+6144.055998595" lastFinishedPulling="2025-10-08 14:38:24.527272489 +0000 UTC m=+6154.335577033" observedRunningTime="2025-10-08 14:38:25.331990243 +0000 UTC m=+6155.140294807" watchObservedRunningTime="2025-10-08 14:38:25.33420885 +0000 UTC m=+6155.142513394" Oct 08 14:38:26 crc kubenswrapper[4735]: I1008 14:38:26.261931 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-6sklc" event={"ID":"df4c8416-1970-48fb-a03f-61fc1addc8f7","Type":"ContainerStarted","Data":"e2df14932499baded4f614136c86b24cd026c2087c51211ee1562b6c524a7350"} Oct 08 14:38:26 crc kubenswrapper[4735]: I1008 14:38:26.264071 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5cb54566d4-7djt5" 
event={"ID":"daa71edb-4564-4793-bfc0-8499e44e2b37","Type":"ContainerStarted","Data":"fb95e50027901eb61b72ec61a76f1ac8125745b271db51d9b59f2fdd4ab79fd8"} Oct 08 14:38:26 crc kubenswrapper[4735]: I1008 14:38:26.268088 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-cc5f78dfc-68pfv" Oct 08 14:38:26 crc kubenswrapper[4735]: I1008 14:38:26.365322 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-6sklc" podStartSLOduration=3.593325961 podStartE2EDuration="14.34559065s" podCreationTimestamp="2025-10-08 14:38:12 +0000 UTC" firstStartedPulling="2025-10-08 14:38:13.779535196 +0000 UTC m=+6143.587839740" lastFinishedPulling="2025-10-08 14:38:24.531799885 +0000 UTC m=+6154.340104429" observedRunningTime="2025-10-08 14:38:26.335600704 +0000 UTC m=+6156.143905248" watchObservedRunningTime="2025-10-08 14:38:26.34559065 +0000 UTC m=+6156.153895194" Oct 08 14:38:30 crc kubenswrapper[4735]: I1008 14:38:30.895288 4735 scope.go:117] "RemoveContainer" containerID="0ef7ce1bb2f88774957446f907a55957b94dad288444fb6f22993a47ba31f8b0" Oct 08 14:38:30 crc kubenswrapper[4735]: E1008 14:38:30.896002 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:38:33 crc kubenswrapper[4735]: I1008 14:38:33.617003 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-54bc95c9fb-kc8w9" Oct 08 14:38:36 crc kubenswrapper[4735]: I1008 14:38:36.672221 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/openstackclient"] Oct 08 14:38:36 crc kubenswrapper[4735]: I1008 14:38:36.672810 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="c5d16f46-7748-41b5-9dfc-dd0e983a562c" containerName="openstackclient" containerID="cri-o://cddff410c705487c416391d53f0cee7883e70b0b097eaee2a3f6c07b3205d901" gracePeriod=2 Oct 08 14:38:36 crc kubenswrapper[4735]: I1008 14:38:36.695240 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Oct 08 14:38:36 crc kubenswrapper[4735]: I1008 14:38:36.717884 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 08 14:38:36 crc kubenswrapper[4735]: E1008 14:38:36.718610 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5d16f46-7748-41b5-9dfc-dd0e983a562c" containerName="openstackclient" Oct 08 14:38:36 crc kubenswrapper[4735]: I1008 14:38:36.718642 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5d16f46-7748-41b5-9dfc-dd0e983a562c" containerName="openstackclient" Oct 08 14:38:36 crc kubenswrapper[4735]: I1008 14:38:36.718983 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5d16f46-7748-41b5-9dfc-dd0e983a562c" containerName="openstackclient" Oct 08 14:38:36 crc kubenswrapper[4735]: I1008 14:38:36.720312 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 08 14:38:36 crc kubenswrapper[4735]: I1008 14:38:36.732497 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 08 14:38:36 crc kubenswrapper[4735]: I1008 14:38:36.739888 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-tvb7c" Oct 08 14:38:36 crc kubenswrapper[4735]: I1008 14:38:36.766623 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 08 14:38:36 crc kubenswrapper[4735]: I1008 14:38:36.768436 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 08 14:38:36 crc kubenswrapper[4735]: I1008 14:38:36.783120 4735 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="c5d16f46-7748-41b5-9dfc-dd0e983a562c" podUID="b55ff56c-fcf2-4ab2-83c9-3e60aa75d8a0" Oct 08 14:38:36 crc kubenswrapper[4735]: I1008 14:38:36.789548 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 08 14:38:36 crc kubenswrapper[4735]: I1008 14:38:36.860684 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpf6j\" (UniqueName: \"kubernetes.io/projected/b55ff56c-fcf2-4ab2-83c9-3e60aa75d8a0-kube-api-access-dpf6j\") pod \"openstackclient\" (UID: \"b55ff56c-fcf2-4ab2-83c9-3e60aa75d8a0\") " pod="openstack/openstackclient" Oct 08 14:38:36 crc kubenswrapper[4735]: I1008 14:38:36.860998 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b55ff56c-fcf2-4ab2-83c9-3e60aa75d8a0-openstack-config-secret\") pod \"openstackclient\" (UID: \"b55ff56c-fcf2-4ab2-83c9-3e60aa75d8a0\") " pod="openstack/openstackclient" Oct 08 14:38:36 crc kubenswrapper[4735]: I1008 
14:38:36.861097 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b55ff56c-fcf2-4ab2-83c9-3e60aa75d8a0-openstack-config\") pod \"openstackclient\" (UID: \"b55ff56c-fcf2-4ab2-83c9-3e60aa75d8a0\") " pod="openstack/openstackclient" Oct 08 14:38:36 crc kubenswrapper[4735]: I1008 14:38:36.861184 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6w2pv\" (UniqueName: \"kubernetes.io/projected/65c37583-c56d-460a-a126-d01c64155288-kube-api-access-6w2pv\") pod \"kube-state-metrics-0\" (UID: \"65c37583-c56d-460a-a126-d01c64155288\") " pod="openstack/kube-state-metrics-0" Oct 08 14:38:36 crc kubenswrapper[4735]: I1008 14:38:36.968080 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6w2pv\" (UniqueName: \"kubernetes.io/projected/65c37583-c56d-460a-a126-d01c64155288-kube-api-access-6w2pv\") pod \"kube-state-metrics-0\" (UID: \"65c37583-c56d-460a-a126-d01c64155288\") " pod="openstack/kube-state-metrics-0" Oct 08 14:38:36 crc kubenswrapper[4735]: I1008 14:38:36.968351 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpf6j\" (UniqueName: \"kubernetes.io/projected/b55ff56c-fcf2-4ab2-83c9-3e60aa75d8a0-kube-api-access-dpf6j\") pod \"openstackclient\" (UID: \"b55ff56c-fcf2-4ab2-83c9-3e60aa75d8a0\") " pod="openstack/openstackclient" Oct 08 14:38:36 crc kubenswrapper[4735]: I1008 14:38:36.968501 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b55ff56c-fcf2-4ab2-83c9-3e60aa75d8a0-openstack-config-secret\") pod \"openstackclient\" (UID: \"b55ff56c-fcf2-4ab2-83c9-3e60aa75d8a0\") " pod="openstack/openstackclient" Oct 08 14:38:36 crc kubenswrapper[4735]: I1008 14:38:36.968556 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b55ff56c-fcf2-4ab2-83c9-3e60aa75d8a0-openstack-config\") pod \"openstackclient\" (UID: \"b55ff56c-fcf2-4ab2-83c9-3e60aa75d8a0\") " pod="openstack/openstackclient" Oct 08 14:38:36 crc kubenswrapper[4735]: I1008 14:38:36.969437 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b55ff56c-fcf2-4ab2-83c9-3e60aa75d8a0-openstack-config\") pod \"openstackclient\" (UID: \"b55ff56c-fcf2-4ab2-83c9-3e60aa75d8a0\") " pod="openstack/openstackclient" Oct 08 14:38:37 crc kubenswrapper[4735]: I1008 14:38:37.009782 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpf6j\" (UniqueName: \"kubernetes.io/projected/b55ff56c-fcf2-4ab2-83c9-3e60aa75d8a0-kube-api-access-dpf6j\") pod \"openstackclient\" (UID: \"b55ff56c-fcf2-4ab2-83c9-3e60aa75d8a0\") " pod="openstack/openstackclient" Oct 08 14:38:37 crc kubenswrapper[4735]: I1008 14:38:37.010391 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6w2pv\" (UniqueName: \"kubernetes.io/projected/65c37583-c56d-460a-a126-d01c64155288-kube-api-access-6w2pv\") pod \"kube-state-metrics-0\" (UID: \"65c37583-c56d-460a-a126-d01c64155288\") " pod="openstack/kube-state-metrics-0" Oct 08 14:38:37 crc kubenswrapper[4735]: I1008 14:38:37.013043 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b55ff56c-fcf2-4ab2-83c9-3e60aa75d8a0-openstack-config-secret\") pod \"openstackclient\" (UID: \"b55ff56c-fcf2-4ab2-83c9-3e60aa75d8a0\") " pod="openstack/openstackclient" Oct 08 14:38:37 crc kubenswrapper[4735]: I1008 14:38:37.091998 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 08 14:38:37 crc kubenswrapper[4735]: I1008 14:38:37.122919 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 08 14:38:37 crc kubenswrapper[4735]: I1008 14:38:37.567359 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Oct 08 14:38:37 crc kubenswrapper[4735]: I1008 14:38:37.579211 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Oct 08 14:38:37 crc kubenswrapper[4735]: I1008 14:38:37.581055 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-5t7g2" Oct 08 14:38:37 crc kubenswrapper[4735]: I1008 14:38:37.581245 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Oct 08 14:38:37 crc kubenswrapper[4735]: I1008 14:38:37.584679 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Oct 08 14:38:37 crc kubenswrapper[4735]: I1008 14:38:37.584818 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Oct 08 14:38:37 crc kubenswrapper[4735]: I1008 14:38:37.696970 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/91970db8-91a6-4c32-ab99-6c72d08118bb-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"91970db8-91a6-4c32-ab99-6c72d08118bb\") " pod="openstack/alertmanager-metric-storage-0" Oct 08 14:38:37 crc kubenswrapper[4735]: I1008 14:38:37.697020 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/91970db8-91a6-4c32-ab99-6c72d08118bb-config-volume\") pod 
\"alertmanager-metric-storage-0\" (UID: \"91970db8-91a6-4c32-ab99-6c72d08118bb\") " pod="openstack/alertmanager-metric-storage-0" Oct 08 14:38:37 crc kubenswrapper[4735]: I1008 14:38:37.697073 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlr44\" (UniqueName: \"kubernetes.io/projected/91970db8-91a6-4c32-ab99-6c72d08118bb-kube-api-access-wlr44\") pod \"alertmanager-metric-storage-0\" (UID: \"91970db8-91a6-4c32-ab99-6c72d08118bb\") " pod="openstack/alertmanager-metric-storage-0" Oct 08 14:38:37 crc kubenswrapper[4735]: I1008 14:38:37.697128 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/91970db8-91a6-4c32-ab99-6c72d08118bb-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"91970db8-91a6-4c32-ab99-6c72d08118bb\") " pod="openstack/alertmanager-metric-storage-0" Oct 08 14:38:37 crc kubenswrapper[4735]: I1008 14:38:37.697189 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/91970db8-91a6-4c32-ab99-6c72d08118bb-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"91970db8-91a6-4c32-ab99-6c72d08118bb\") " pod="openstack/alertmanager-metric-storage-0" Oct 08 14:38:37 crc kubenswrapper[4735]: I1008 14:38:37.697285 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/91970db8-91a6-4c32-ab99-6c72d08118bb-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"91970db8-91a6-4c32-ab99-6c72d08118bb\") " pod="openstack/alertmanager-metric-storage-0" Oct 08 14:38:37 crc kubenswrapper[4735]: I1008 14:38:37.798828 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlr44\" (UniqueName: 
\"kubernetes.io/projected/91970db8-91a6-4c32-ab99-6c72d08118bb-kube-api-access-wlr44\") pod \"alertmanager-metric-storage-0\" (UID: \"91970db8-91a6-4c32-ab99-6c72d08118bb\") " pod="openstack/alertmanager-metric-storage-0" Oct 08 14:38:37 crc kubenswrapper[4735]: I1008 14:38:37.798917 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/91970db8-91a6-4c32-ab99-6c72d08118bb-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"91970db8-91a6-4c32-ab99-6c72d08118bb\") " pod="openstack/alertmanager-metric-storage-0" Oct 08 14:38:37 crc kubenswrapper[4735]: I1008 14:38:37.799002 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/91970db8-91a6-4c32-ab99-6c72d08118bb-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"91970db8-91a6-4c32-ab99-6c72d08118bb\") " pod="openstack/alertmanager-metric-storage-0" Oct 08 14:38:37 crc kubenswrapper[4735]: I1008 14:38:37.799122 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/91970db8-91a6-4c32-ab99-6c72d08118bb-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"91970db8-91a6-4c32-ab99-6c72d08118bb\") " pod="openstack/alertmanager-metric-storage-0" Oct 08 14:38:37 crc kubenswrapper[4735]: I1008 14:38:37.799180 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/91970db8-91a6-4c32-ab99-6c72d08118bb-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"91970db8-91a6-4c32-ab99-6c72d08118bb\") " pod="openstack/alertmanager-metric-storage-0" Oct 08 14:38:37 crc kubenswrapper[4735]: I1008 14:38:37.799207 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/secret/91970db8-91a6-4c32-ab99-6c72d08118bb-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"91970db8-91a6-4c32-ab99-6c72d08118bb\") " pod="openstack/alertmanager-metric-storage-0" Oct 08 14:38:37 crc kubenswrapper[4735]: I1008 14:38:37.806074 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/91970db8-91a6-4c32-ab99-6c72d08118bb-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"91970db8-91a6-4c32-ab99-6c72d08118bb\") " pod="openstack/alertmanager-metric-storage-0" Oct 08 14:38:37 crc kubenswrapper[4735]: I1008 14:38:37.816211 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/91970db8-91a6-4c32-ab99-6c72d08118bb-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"91970db8-91a6-4c32-ab99-6c72d08118bb\") " pod="openstack/alertmanager-metric-storage-0" Oct 08 14:38:37 crc kubenswrapper[4735]: I1008 14:38:37.816210 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/91970db8-91a6-4c32-ab99-6c72d08118bb-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"91970db8-91a6-4c32-ab99-6c72d08118bb\") " pod="openstack/alertmanager-metric-storage-0" Oct 08 14:38:37 crc kubenswrapper[4735]: I1008 14:38:37.816971 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/91970db8-91a6-4c32-ab99-6c72d08118bb-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"91970db8-91a6-4c32-ab99-6c72d08118bb\") " pod="openstack/alertmanager-metric-storage-0" Oct 08 14:38:37 crc kubenswrapper[4735]: I1008 14:38:37.853271 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlr44\" (UniqueName: 
\"kubernetes.io/projected/91970db8-91a6-4c32-ab99-6c72d08118bb-kube-api-access-wlr44\") pod \"alertmanager-metric-storage-0\" (UID: \"91970db8-91a6-4c32-ab99-6c72d08118bb\") " pod="openstack/alertmanager-metric-storage-0" Oct 08 14:38:37 crc kubenswrapper[4735]: I1008 14:38:37.853458 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/91970db8-91a6-4c32-ab99-6c72d08118bb-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"91970db8-91a6-4c32-ab99-6c72d08118bb\") " pod="openstack/alertmanager-metric-storage-0" Oct 08 14:38:38 crc kubenswrapper[4735]: I1008 14:38:38.032977 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Oct 08 14:38:38 crc kubenswrapper[4735]: I1008 14:38:38.138016 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 08 14:38:38 crc kubenswrapper[4735]: I1008 14:38:38.143736 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 08 14:38:38 crc kubenswrapper[4735]: I1008 14:38:38.171124 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Oct 08 14:38:38 crc kubenswrapper[4735]: I1008 14:38:38.171157 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Oct 08 14:38:38 crc kubenswrapper[4735]: I1008 14:38:38.171318 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Oct 08 14:38:38 crc kubenswrapper[4735]: I1008 14:38:38.171391 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Oct 08 14:38:38 crc kubenswrapper[4735]: I1008 14:38:38.171529 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Oct 08 14:38:38 crc kubenswrapper[4735]: I1008 14:38:38.171573 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-ccddj" Oct 08 14:38:38 crc kubenswrapper[4735]: I1008 14:38:38.241640 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 08 14:38:38 crc kubenswrapper[4735]: I1008 14:38:38.269586 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Oct 08 14:38:38 crc kubenswrapper[4735]: I1008 14:38:38.329136 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-e770fc4b-de30-49ae-9a43-116c28ca28ff\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e770fc4b-de30-49ae-9a43-116c28ca28ff\") pod \"prometheus-metric-storage-0\" (UID: \"14478da8-1fcc-4ebe-96d2-3b3e6b9c9034\") " pod="openstack/prometheus-metric-storage-0" Oct 08 14:38:38 crc kubenswrapper[4735]: 
I1008 14:38:38.329329 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhpwp\" (UniqueName: \"kubernetes.io/projected/14478da8-1fcc-4ebe-96d2-3b3e6b9c9034-kube-api-access-dhpwp\") pod \"prometheus-metric-storage-0\" (UID: \"14478da8-1fcc-4ebe-96d2-3b3e6b9c9034\") " pod="openstack/prometheus-metric-storage-0" Oct 08 14:38:38 crc kubenswrapper[4735]: I1008 14:38:38.329388 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/14478da8-1fcc-4ebe-96d2-3b3e6b9c9034-config\") pod \"prometheus-metric-storage-0\" (UID: \"14478da8-1fcc-4ebe-96d2-3b3e6b9c9034\") " pod="openstack/prometheus-metric-storage-0" Oct 08 14:38:38 crc kubenswrapper[4735]: I1008 14:38:38.329562 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/14478da8-1fcc-4ebe-96d2-3b3e6b9c9034-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"14478da8-1fcc-4ebe-96d2-3b3e6b9c9034\") " pod="openstack/prometheus-metric-storage-0" Oct 08 14:38:38 crc kubenswrapper[4735]: I1008 14:38:38.329822 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/14478da8-1fcc-4ebe-96d2-3b3e6b9c9034-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"14478da8-1fcc-4ebe-96d2-3b3e6b9c9034\") " pod="openstack/prometheus-metric-storage-0" Oct 08 14:38:38 crc kubenswrapper[4735]: I1008 14:38:38.329863 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/14478da8-1fcc-4ebe-96d2-3b3e6b9c9034-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"14478da8-1fcc-4ebe-96d2-3b3e6b9c9034\") " 
pod="openstack/prometheus-metric-storage-0" Oct 08 14:38:38 crc kubenswrapper[4735]: I1008 14:38:38.329891 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/14478da8-1fcc-4ebe-96d2-3b3e6b9c9034-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"14478da8-1fcc-4ebe-96d2-3b3e6b9c9034\") " pod="openstack/prometheus-metric-storage-0" Oct 08 14:38:38 crc kubenswrapper[4735]: I1008 14:38:38.331196 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/14478da8-1fcc-4ebe-96d2-3b3e6b9c9034-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"14478da8-1fcc-4ebe-96d2-3b3e6b9c9034\") " pod="openstack/prometheus-metric-storage-0" Oct 08 14:38:38 crc kubenswrapper[4735]: I1008 14:38:38.379130 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 08 14:38:38 crc kubenswrapper[4735]: I1008 14:38:38.432975 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhpwp\" (UniqueName: \"kubernetes.io/projected/14478da8-1fcc-4ebe-96d2-3b3e6b9c9034-kube-api-access-dhpwp\") pod \"prometheus-metric-storage-0\" (UID: \"14478da8-1fcc-4ebe-96d2-3b3e6b9c9034\") " pod="openstack/prometheus-metric-storage-0" Oct 08 14:38:38 crc kubenswrapper[4735]: I1008 14:38:38.433032 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/14478da8-1fcc-4ebe-96d2-3b3e6b9c9034-config\") pod \"prometheus-metric-storage-0\" (UID: \"14478da8-1fcc-4ebe-96d2-3b3e6b9c9034\") " pod="openstack/prometheus-metric-storage-0" Oct 08 14:38:38 crc kubenswrapper[4735]: I1008 14:38:38.433096 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/14478da8-1fcc-4ebe-96d2-3b3e6b9c9034-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"14478da8-1fcc-4ebe-96d2-3b3e6b9c9034\") " pod="openstack/prometheus-metric-storage-0" Oct 08 14:38:38 crc kubenswrapper[4735]: I1008 14:38:38.433201 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/14478da8-1fcc-4ebe-96d2-3b3e6b9c9034-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"14478da8-1fcc-4ebe-96d2-3b3e6b9c9034\") " pod="openstack/prometheus-metric-storage-0" Oct 08 14:38:38 crc kubenswrapper[4735]: I1008 14:38:38.433225 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/14478da8-1fcc-4ebe-96d2-3b3e6b9c9034-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"14478da8-1fcc-4ebe-96d2-3b3e6b9c9034\") " pod="openstack/prometheus-metric-storage-0" Oct 08 14:38:38 crc kubenswrapper[4735]: I1008 14:38:38.433249 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/14478da8-1fcc-4ebe-96d2-3b3e6b9c9034-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"14478da8-1fcc-4ebe-96d2-3b3e6b9c9034\") " pod="openstack/prometheus-metric-storage-0" Oct 08 14:38:38 crc kubenswrapper[4735]: I1008 14:38:38.433309 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/14478da8-1fcc-4ebe-96d2-3b3e6b9c9034-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"14478da8-1fcc-4ebe-96d2-3b3e6b9c9034\") " pod="openstack/prometheus-metric-storage-0" Oct 08 14:38:38 crc kubenswrapper[4735]: I1008 14:38:38.433338 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-e770fc4b-de30-49ae-9a43-116c28ca28ff\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e770fc4b-de30-49ae-9a43-116c28ca28ff\") pod \"prometheus-metric-storage-0\" (UID: \"14478da8-1fcc-4ebe-96d2-3b3e6b9c9034\") " pod="openstack/prometheus-metric-storage-0" Oct 08 14:38:38 crc kubenswrapper[4735]: I1008 14:38:38.434386 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/14478da8-1fcc-4ebe-96d2-3b3e6b9c9034-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"14478da8-1fcc-4ebe-96d2-3b3e6b9c9034\") " pod="openstack/prometheus-metric-storage-0" Oct 08 14:38:38 crc kubenswrapper[4735]: I1008 14:38:38.442797 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/14478da8-1fcc-4ebe-96d2-3b3e6b9c9034-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"14478da8-1fcc-4ebe-96d2-3b3e6b9c9034\") " pod="openstack/prometheus-metric-storage-0" Oct 08 14:38:38 crc kubenswrapper[4735]: I1008 14:38:38.443595 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/14478da8-1fcc-4ebe-96d2-3b3e6b9c9034-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"14478da8-1fcc-4ebe-96d2-3b3e6b9c9034\") " pod="openstack/prometheus-metric-storage-0" Oct 08 14:38:38 crc kubenswrapper[4735]: I1008 14:38:38.444039 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/14478da8-1fcc-4ebe-96d2-3b3e6b9c9034-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"14478da8-1fcc-4ebe-96d2-3b3e6b9c9034\") " pod="openstack/prometheus-metric-storage-0" Oct 08 14:38:38 crc kubenswrapper[4735]: I1008 14:38:38.457888 4735 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice 
STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 08 14:38:38 crc kubenswrapper[4735]: I1008 14:38:38.457932 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-e770fc4b-de30-49ae-9a43-116c28ca28ff\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e770fc4b-de30-49ae-9a43-116c28ca28ff\") pod \"prometheus-metric-storage-0\" (UID: \"14478da8-1fcc-4ebe-96d2-3b3e6b9c9034\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5c21a6b4663a397c9db820c745e050c855e259dcb62441499e5ac223b1165c8d/globalmount\"" pod="openstack/prometheus-metric-storage-0" Oct 08 14:38:38 crc kubenswrapper[4735]: I1008 14:38:38.458223 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/14478da8-1fcc-4ebe-96d2-3b3e6b9c9034-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"14478da8-1fcc-4ebe-96d2-3b3e6b9c9034\") " pod="openstack/prometheus-metric-storage-0" Oct 08 14:38:38 crc kubenswrapper[4735]: I1008 14:38:38.476210 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhpwp\" (UniqueName: \"kubernetes.io/projected/14478da8-1fcc-4ebe-96d2-3b3e6b9c9034-kube-api-access-dhpwp\") pod \"prometheus-metric-storage-0\" (UID: \"14478da8-1fcc-4ebe-96d2-3b3e6b9c9034\") " pod="openstack/prometheus-metric-storage-0" Oct 08 14:38:38 crc kubenswrapper[4735]: I1008 14:38:38.488207 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/14478da8-1fcc-4ebe-96d2-3b3e6b9c9034-config\") pod \"prometheus-metric-storage-0\" (UID: \"14478da8-1fcc-4ebe-96d2-3b3e6b9c9034\") " pod="openstack/prometheus-metric-storage-0" Oct 08 14:38:38 crc kubenswrapper[4735]: I1008 14:38:38.517625 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"65c37583-c56d-460a-a126-d01c64155288","Type":"ContainerStarted","Data":"f2ef64d04aaaaab997bf85c0a34693773020798efe57c2f852445118107b36e3"} Oct 08 14:38:38 crc kubenswrapper[4735]: I1008 14:38:38.626364 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-e770fc4b-de30-49ae-9a43-116c28ca28ff\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e770fc4b-de30-49ae-9a43-116c28ca28ff\") pod \"prometheus-metric-storage-0\" (UID: \"14478da8-1fcc-4ebe-96d2-3b3e6b9c9034\") " pod="openstack/prometheus-metric-storage-0" Oct 08 14:38:38 crc kubenswrapper[4735]: I1008 14:38:38.642455 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 08 14:38:38 crc kubenswrapper[4735]: I1008 14:38:38.734400 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 08 14:38:38 crc kubenswrapper[4735]: I1008 14:38:38.968526 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Oct 08 14:38:39 crc kubenswrapper[4735]: I1008 14:38:39.170722 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 08 14:38:39 crc kubenswrapper[4735]: I1008 14:38:39.251835 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c5d16f46-7748-41b5-9dfc-dd0e983a562c-openstack-config-secret\") pod \"c5d16f46-7748-41b5-9dfc-dd0e983a562c\" (UID: \"c5d16f46-7748-41b5-9dfc-dd0e983a562c\") " Oct 08 14:38:39 crc kubenswrapper[4735]: I1008 14:38:39.252262 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfg5m\" (UniqueName: \"kubernetes.io/projected/c5d16f46-7748-41b5-9dfc-dd0e983a562c-kube-api-access-kfg5m\") pod \"c5d16f46-7748-41b5-9dfc-dd0e983a562c\" (UID: \"c5d16f46-7748-41b5-9dfc-dd0e983a562c\") " Oct 08 14:38:39 crc kubenswrapper[4735]: I1008 14:38:39.252366 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c5d16f46-7748-41b5-9dfc-dd0e983a562c-openstack-config\") pod \"c5d16f46-7748-41b5-9dfc-dd0e983a562c\" (UID: \"c5d16f46-7748-41b5-9dfc-dd0e983a562c\") " Oct 08 14:38:39 crc kubenswrapper[4735]: I1008 14:38:39.258650 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5d16f46-7748-41b5-9dfc-dd0e983a562c-kube-api-access-kfg5m" (OuterVolumeSpecName: "kube-api-access-kfg5m") pod "c5d16f46-7748-41b5-9dfc-dd0e983a562c" (UID: "c5d16f46-7748-41b5-9dfc-dd0e983a562c"). InnerVolumeSpecName "kube-api-access-kfg5m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:38:39 crc kubenswrapper[4735]: I1008 14:38:39.307722 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5d16f46-7748-41b5-9dfc-dd0e983a562c-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "c5d16f46-7748-41b5-9dfc-dd0e983a562c" (UID: "c5d16f46-7748-41b5-9dfc-dd0e983a562c"). 
InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:38:39 crc kubenswrapper[4735]: I1008 14:38:39.307916 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 08 14:38:39 crc kubenswrapper[4735]: I1008 14:38:39.349681 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5d16f46-7748-41b5-9dfc-dd0e983a562c-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "c5d16f46-7748-41b5-9dfc-dd0e983a562c" (UID: "c5d16f46-7748-41b5-9dfc-dd0e983a562c"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:38:39 crc kubenswrapper[4735]: I1008 14:38:39.359084 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfg5m\" (UniqueName: \"kubernetes.io/projected/c5d16f46-7748-41b5-9dfc-dd0e983a562c-kube-api-access-kfg5m\") on node \"crc\" DevicePath \"\"" Oct 08 14:38:39 crc kubenswrapper[4735]: I1008 14:38:39.359727 4735 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c5d16f46-7748-41b5-9dfc-dd0e983a562c-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 08 14:38:39 crc kubenswrapper[4735]: I1008 14:38:39.359812 4735 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c5d16f46-7748-41b5-9dfc-dd0e983a562c-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 08 14:38:39 crc kubenswrapper[4735]: I1008 14:38:39.528003 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"14478da8-1fcc-4ebe-96d2-3b3e6b9c9034","Type":"ContainerStarted","Data":"9aec3e8e7c5919bf05e583fdaa2daf33740ebbc1df8102145ea4bf3fd8e07a0f"} Oct 08 14:38:39 crc kubenswrapper[4735]: I1008 14:38:39.530722 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/alertmanager-metric-storage-0" event={"ID":"91970db8-91a6-4c32-ab99-6c72d08118bb","Type":"ContainerStarted","Data":"702d5a7a7d5cc781deee7f1f16fd03792b3195c7a145e1446d1053b2b8b7fcd9"} Oct 08 14:38:39 crc kubenswrapper[4735]: I1008 14:38:39.532644 4735 generic.go:334] "Generic (PLEG): container finished" podID="c5d16f46-7748-41b5-9dfc-dd0e983a562c" containerID="cddff410c705487c416391d53f0cee7883e70b0b097eaee2a3f6c07b3205d901" exitCode=137 Oct 08 14:38:39 crc kubenswrapper[4735]: I1008 14:38:39.532722 4735 scope.go:117] "RemoveContainer" containerID="cddff410c705487c416391d53f0cee7883e70b0b097eaee2a3f6c07b3205d901" Oct 08 14:38:39 crc kubenswrapper[4735]: I1008 14:38:39.532833 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 08 14:38:39 crc kubenswrapper[4735]: I1008 14:38:39.535339 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"b55ff56c-fcf2-4ab2-83c9-3e60aa75d8a0","Type":"ContainerStarted","Data":"7c0808f1d9a0bf98387349923f4a61c269ece3de182106a0d47b8bc0f5123fde"} Oct 08 14:38:39 crc kubenswrapper[4735]: I1008 14:38:39.535377 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"b55ff56c-fcf2-4ab2-83c9-3e60aa75d8a0","Type":"ContainerStarted","Data":"f091ad354e65a0c42487778c45074d091d527c8d3e05c3b606b75fb93cbb4c15"} Oct 08 14:38:39 crc kubenswrapper[4735]: I1008 14:38:39.560972 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.5609495300000003 podStartE2EDuration="3.56094953s" podCreationTimestamp="2025-10-08 14:38:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:38:39.559553344 +0000 UTC m=+6169.367857888" watchObservedRunningTime="2025-10-08 14:38:39.56094953 +0000 UTC m=+6169.369254084" Oct 08 14:38:39 crc 
kubenswrapper[4735]: I1008 14:38:39.576411 4735 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="c5d16f46-7748-41b5-9dfc-dd0e983a562c" podUID="b55ff56c-fcf2-4ab2-83c9-3e60aa75d8a0" Oct 08 14:38:39 crc kubenswrapper[4735]: I1008 14:38:39.579168 4735 scope.go:117] "RemoveContainer" containerID="cddff410c705487c416391d53f0cee7883e70b0b097eaee2a3f6c07b3205d901" Oct 08 14:38:39 crc kubenswrapper[4735]: E1008 14:38:39.579759 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cddff410c705487c416391d53f0cee7883e70b0b097eaee2a3f6c07b3205d901\": container with ID starting with cddff410c705487c416391d53f0cee7883e70b0b097eaee2a3f6c07b3205d901 not found: ID does not exist" containerID="cddff410c705487c416391d53f0cee7883e70b0b097eaee2a3f6c07b3205d901" Oct 08 14:38:39 crc kubenswrapper[4735]: I1008 14:38:39.579799 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cddff410c705487c416391d53f0cee7883e70b0b097eaee2a3f6c07b3205d901"} err="failed to get container status \"cddff410c705487c416391d53f0cee7883e70b0b097eaee2a3f6c07b3205d901\": rpc error: code = NotFound desc = could not find container \"cddff410c705487c416391d53f0cee7883e70b0b097eaee2a3f6c07b3205d901\": container with ID starting with cddff410c705487c416391d53f0cee7883e70b0b097eaee2a3f6c07b3205d901 not found: ID does not exist" Oct 08 14:38:40 crc kubenswrapper[4735]: I1008 14:38:40.558990 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"65c37583-c56d-460a-a126-d01c64155288","Type":"ContainerStarted","Data":"77126138264e4509de999c85f538aaa8347dff32e1dd880219b6b223582ddcbd"} Oct 08 14:38:40 crc kubenswrapper[4735]: I1008 14:38:40.559345 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 08 14:38:40 crc 
kubenswrapper[4735]: I1008 14:38:40.583105 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=3.332936148 podStartE2EDuration="4.583078575s" podCreationTimestamp="2025-10-08 14:38:36 +0000 UTC" firstStartedPulling="2025-10-08 14:38:38.36740921 +0000 UTC m=+6168.175713754" lastFinishedPulling="2025-10-08 14:38:39.617551637 +0000 UTC m=+6169.425856181" observedRunningTime="2025-10-08 14:38:40.574000062 +0000 UTC m=+6170.382304616" watchObservedRunningTime="2025-10-08 14:38:40.583078575 +0000 UTC m=+6170.391383119" Oct 08 14:38:40 crc kubenswrapper[4735]: I1008 14:38:40.939524 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5d16f46-7748-41b5-9dfc-dd0e983a562c" path="/var/lib/kubelet/pods/c5d16f46-7748-41b5-9dfc-dd0e983a562c/volumes" Oct 08 14:38:43 crc kubenswrapper[4735]: I1008 14:38:43.894738 4735 scope.go:117] "RemoveContainer" containerID="0ef7ce1bb2f88774957446f907a55957b94dad288444fb6f22993a47ba31f8b0" Oct 08 14:38:43 crc kubenswrapper[4735]: E1008 14:38:43.895389 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:38:45 crc kubenswrapper[4735]: I1008 14:38:45.608524 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"91970db8-91a6-4c32-ab99-6c72d08118bb","Type":"ContainerStarted","Data":"abdb8263984226d58c3d7ed49c724192cdd1900d88c2635179d1c149e7df894e"} Oct 08 14:38:45 crc kubenswrapper[4735]: I1008 14:38:45.611254 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"14478da8-1fcc-4ebe-96d2-3b3e6b9c9034","Type":"ContainerStarted","Data":"002b18cb00705acf02afc352e4174eebb2b0b00477195ad50cbd841bfe77ba5f"} Oct 08 14:38:47 crc kubenswrapper[4735]: I1008 14:38:47.095761 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 08 14:38:52 crc kubenswrapper[4735]: I1008 14:38:52.708938 4735 generic.go:334] "Generic (PLEG): container finished" podID="91970db8-91a6-4c32-ab99-6c72d08118bb" containerID="abdb8263984226d58c3d7ed49c724192cdd1900d88c2635179d1c149e7df894e" exitCode=0 Oct 08 14:38:52 crc kubenswrapper[4735]: I1008 14:38:52.709020 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"91970db8-91a6-4c32-ab99-6c72d08118bb","Type":"ContainerDied","Data":"abdb8263984226d58c3d7ed49c724192cdd1900d88c2635179d1c149e7df894e"} Oct 08 14:38:52 crc kubenswrapper[4735]: I1008 14:38:52.711183 4735 generic.go:334] "Generic (PLEG): container finished" podID="14478da8-1fcc-4ebe-96d2-3b3e6b9c9034" containerID="002b18cb00705acf02afc352e4174eebb2b0b00477195ad50cbd841bfe77ba5f" exitCode=0 Oct 08 14:38:52 crc kubenswrapper[4735]: I1008 14:38:52.711218 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"14478da8-1fcc-4ebe-96d2-3b3e6b9c9034","Type":"ContainerDied","Data":"002b18cb00705acf02afc352e4174eebb2b0b00477195ad50cbd841bfe77ba5f"} Oct 08 14:38:55 crc kubenswrapper[4735]: I1008 14:38:55.895975 4735 scope.go:117] "RemoveContainer" containerID="0ef7ce1bb2f88774957446f907a55957b94dad288444fb6f22993a47ba31f8b0" Oct 08 14:38:55 crc kubenswrapper[4735]: E1008 14:38:55.896775 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:38:57 crc kubenswrapper[4735]: I1008 14:38:57.048923 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-2hrm4"] Oct 08 14:38:57 crc kubenswrapper[4735]: I1008 14:38:57.057897 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-2hrm4"] Oct 08 14:38:57 crc kubenswrapper[4735]: I1008 14:38:57.066641 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-mh2k9"] Oct 08 14:38:57 crc kubenswrapper[4735]: I1008 14:38:57.078572 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-f6z9w"] Oct 08 14:38:57 crc kubenswrapper[4735]: I1008 14:38:57.088618 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-mh2k9"] Oct 08 14:38:57 crc kubenswrapper[4735]: I1008 14:38:57.102177 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-f6z9w"] Oct 08 14:38:58 crc kubenswrapper[4735]: I1008 14:38:58.922887 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42c4c8f5-4b2e-438e-8b89-ec5eb3b9d01a" path="/var/lib/kubelet/pods/42c4c8f5-4b2e-438e-8b89-ec5eb3b9d01a/volumes" Oct 08 14:38:58 crc kubenswrapper[4735]: I1008 14:38:58.924328 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42e95717-223a-491f-b378-be990176c7b0" path="/var/lib/kubelet/pods/42e95717-223a-491f-b378-be990176c7b0/volumes" Oct 08 14:38:58 crc kubenswrapper[4735]: I1008 14:38:58.925085 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64f6de72-94b0-4fc8-b036-c53df917c0f8" path="/var/lib/kubelet/pods/64f6de72-94b0-4fc8-b036-c53df917c0f8/volumes" Oct 08 14:38:59 crc kubenswrapper[4735]: I1008 
14:38:59.810873 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"91970db8-91a6-4c32-ab99-6c72d08118bb","Type":"ContainerStarted","Data":"a1df0ff7d9fb53ce1213b3e81a18d6a255350e1c72065dc17078c90ae47bd54a"} Oct 08 14:39:00 crc kubenswrapper[4735]: I1008 14:39:00.824676 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"14478da8-1fcc-4ebe-96d2-3b3e6b9c9034","Type":"ContainerStarted","Data":"28cd6619e6d3128f01ce97778b5b9a9c0d2a72933ac1b21824b1bef816e44c1a"} Oct 08 14:39:04 crc kubenswrapper[4735]: I1008 14:39:04.864387 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"14478da8-1fcc-4ebe-96d2-3b3e6b9c9034","Type":"ContainerStarted","Data":"987acbf3910755b276dd79593fbc0098f7f04c7671eb67f4470f666ff4465b9a"} Oct 08 14:39:04 crc kubenswrapper[4735]: I1008 14:39:04.866772 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"91970db8-91a6-4c32-ab99-6c72d08118bb","Type":"ContainerStarted","Data":"d9e7e4b28cc33a84e30f801865e84ee868da648f01e98c6888658035ab28eed0"} Oct 08 14:39:04 crc kubenswrapper[4735]: I1008 14:39:04.867761 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0" Oct 08 14:39:04 crc kubenswrapper[4735]: I1008 14:39:04.870513 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0" Oct 08 14:39:04 crc kubenswrapper[4735]: I1008 14:39:04.900118 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=7.370951904 podStartE2EDuration="27.900095991s" podCreationTimestamp="2025-10-08 14:38:37 +0000 UTC" firstStartedPulling="2025-10-08 14:38:38.959051514 +0000 UTC m=+6168.767356058" lastFinishedPulling="2025-10-08 
14:38:59.488195591 +0000 UTC m=+6189.296500145" observedRunningTime="2025-10-08 14:39:04.890988278 +0000 UTC m=+6194.699292822" watchObservedRunningTime="2025-10-08 14:39:04.900095991 +0000 UTC m=+6194.708400535" Oct 08 14:39:07 crc kubenswrapper[4735]: I1008 14:39:07.039368 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-0b6c-account-create-wrs4w"] Oct 08 14:39:07 crc kubenswrapper[4735]: I1008 14:39:07.051150 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-f8ac-account-create-x5vzj"] Oct 08 14:39:07 crc kubenswrapper[4735]: I1008 14:39:07.059610 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-4102-account-create-pb929"] Oct 08 14:39:07 crc kubenswrapper[4735]: I1008 14:39:07.073017 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-f8ac-account-create-x5vzj"] Oct 08 14:39:07 crc kubenswrapper[4735]: I1008 14:39:07.083908 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-0b6c-account-create-wrs4w"] Oct 08 14:39:07 crc kubenswrapper[4735]: I1008 14:39:07.093329 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-4102-account-create-pb929"] Oct 08 14:39:08 crc kubenswrapper[4735]: I1008 14:39:08.907186 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02e5378c-61f6-43b9-a69b-0a2004e44291" path="/var/lib/kubelet/pods/02e5378c-61f6-43b9-a69b-0a2004e44291/volumes" Oct 08 14:39:08 crc kubenswrapper[4735]: I1008 14:39:08.908784 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b895276-2adc-4f16-8a1f-04ed071342e7" path="/var/lib/kubelet/pods/5b895276-2adc-4f16-8a1f-04ed071342e7/volumes" Oct 08 14:39:08 crc kubenswrapper[4735]: I1008 14:39:08.909402 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8728623f-2ab6-4156-b915-a376fe953042" path="/var/lib/kubelet/pods/8728623f-2ab6-4156-b915-a376fe953042/volumes" Oct 08 
14:39:08 crc kubenswrapper[4735]: I1008 14:39:08.912068 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"14478da8-1fcc-4ebe-96d2-3b3e6b9c9034","Type":"ContainerStarted","Data":"cf8ec2ebee3dbd007f77ad9bc49083e95ef152a466fe151097a5884d35b58928"} Oct 08 14:39:08 crc kubenswrapper[4735]: I1008 14:39:08.933891 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=3.441506773 podStartE2EDuration="31.9338718s" podCreationTimestamp="2025-10-08 14:38:37 +0000 UTC" firstStartedPulling="2025-10-08 14:38:39.317542163 +0000 UTC m=+6169.125846707" lastFinishedPulling="2025-10-08 14:39:07.8099072 +0000 UTC m=+6197.618211734" observedRunningTime="2025-10-08 14:39:08.931002036 +0000 UTC m=+6198.739306590" watchObservedRunningTime="2025-10-08 14:39:08.9338718 +0000 UTC m=+6198.742176344" Oct 08 14:39:10 crc kubenswrapper[4735]: I1008 14:39:10.842529 4735 scope.go:117] "RemoveContainer" containerID="b8b583e4f64b8ee9bcd8e88460dffe6d69471197ec74c19a97363d7f12516e33" Oct 08 14:39:10 crc kubenswrapper[4735]: I1008 14:39:10.883403 4735 scope.go:117] "RemoveContainer" containerID="e277bbac2b51a462bdc21d57f77af0327826246317ad90fbd88ca1b0254596fe" Oct 08 14:39:10 crc kubenswrapper[4735]: I1008 14:39:10.902113 4735 scope.go:117] "RemoveContainer" containerID="0ef7ce1bb2f88774957446f907a55957b94dad288444fb6f22993a47ba31f8b0" Oct 08 14:39:10 crc kubenswrapper[4735]: E1008 14:39:10.902491 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:39:10 crc kubenswrapper[4735]: 
I1008 14:39:10.943291 4735 scope.go:117] "RemoveContainer" containerID="39f5a32d43f9b17ff945b6cfb6210993702fa6aae857594cef2a83b0ab8c376b" Oct 08 14:39:10 crc kubenswrapper[4735]: I1008 14:39:10.986534 4735 scope.go:117] "RemoveContainer" containerID="7f610a6ca52fde1a70cf0ee9f627c6049dc699a4631062a89e62c8e869bb3362" Oct 08 14:39:11 crc kubenswrapper[4735]: I1008 14:39:11.026103 4735 scope.go:117] "RemoveContainer" containerID="fc36b142d01cc7ac85be46df09c036d26a2c51fa3edb11b78ba9db596eedb5b1" Oct 08 14:39:11 crc kubenswrapper[4735]: I1008 14:39:11.075323 4735 scope.go:117] "RemoveContainer" containerID="98eb2a7a37aefaa11b22cfcb4dac9d1fe9e33a121b719a1c66c890a7e68d3884" Oct 08 14:39:13 crc kubenswrapper[4735]: I1008 14:39:13.642740 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Oct 08 14:39:15 crc kubenswrapper[4735]: I1008 14:39:15.734598 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 08 14:39:15 crc kubenswrapper[4735]: I1008 14:39:15.740976 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 14:39:15 crc kubenswrapper[4735]: I1008 14:39:15.753435 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 08 14:39:15 crc kubenswrapper[4735]: I1008 14:39:15.757331 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 08 14:39:15 crc kubenswrapper[4735]: I1008 14:39:15.792967 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 14:39:15 crc kubenswrapper[4735]: I1008 14:39:15.873015 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fe43c7c-e920-40b6-8625-704c0ac19dbd-scripts\") pod \"ceilometer-0\" (UID: \"7fe43c7c-e920-40b6-8625-704c0ac19dbd\") " pod="openstack/ceilometer-0" Oct 08 14:39:15 crc kubenswrapper[4735]: I1008 14:39:15.873083 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7fe43c7c-e920-40b6-8625-704c0ac19dbd-log-httpd\") pod \"ceilometer-0\" (UID: \"7fe43c7c-e920-40b6-8625-704c0ac19dbd\") " pod="openstack/ceilometer-0" Oct 08 14:39:15 crc kubenswrapper[4735]: I1008 14:39:15.873107 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjj99\" (UniqueName: \"kubernetes.io/projected/7fe43c7c-e920-40b6-8625-704c0ac19dbd-kube-api-access-fjj99\") pod \"ceilometer-0\" (UID: \"7fe43c7c-e920-40b6-8625-704c0ac19dbd\") " pod="openstack/ceilometer-0" Oct 08 14:39:15 crc kubenswrapper[4735]: I1008 14:39:15.873167 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7fe43c7c-e920-40b6-8625-704c0ac19dbd-run-httpd\") pod \"ceilometer-0\" (UID: \"7fe43c7c-e920-40b6-8625-704c0ac19dbd\") " 
pod="openstack/ceilometer-0" Oct 08 14:39:15 crc kubenswrapper[4735]: I1008 14:39:15.873201 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fe43c7c-e920-40b6-8625-704c0ac19dbd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7fe43c7c-e920-40b6-8625-704c0ac19dbd\") " pod="openstack/ceilometer-0" Oct 08 14:39:15 crc kubenswrapper[4735]: I1008 14:39:15.873256 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7fe43c7c-e920-40b6-8625-704c0ac19dbd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7fe43c7c-e920-40b6-8625-704c0ac19dbd\") " pod="openstack/ceilometer-0" Oct 08 14:39:15 crc kubenswrapper[4735]: I1008 14:39:15.873352 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fe43c7c-e920-40b6-8625-704c0ac19dbd-config-data\") pod \"ceilometer-0\" (UID: \"7fe43c7c-e920-40b6-8625-704c0ac19dbd\") " pod="openstack/ceilometer-0" Oct 08 14:39:15 crc kubenswrapper[4735]: I1008 14:39:15.974753 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fe43c7c-e920-40b6-8625-704c0ac19dbd-scripts\") pod \"ceilometer-0\" (UID: \"7fe43c7c-e920-40b6-8625-704c0ac19dbd\") " pod="openstack/ceilometer-0" Oct 08 14:39:15 crc kubenswrapper[4735]: I1008 14:39:15.974849 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7fe43c7c-e920-40b6-8625-704c0ac19dbd-log-httpd\") pod \"ceilometer-0\" (UID: \"7fe43c7c-e920-40b6-8625-704c0ac19dbd\") " pod="openstack/ceilometer-0" Oct 08 14:39:15 crc kubenswrapper[4735]: I1008 14:39:15.974871 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-fjj99\" (UniqueName: \"kubernetes.io/projected/7fe43c7c-e920-40b6-8625-704c0ac19dbd-kube-api-access-fjj99\") pod \"ceilometer-0\" (UID: \"7fe43c7c-e920-40b6-8625-704c0ac19dbd\") " pod="openstack/ceilometer-0" Oct 08 14:39:15 crc kubenswrapper[4735]: I1008 14:39:15.974964 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7fe43c7c-e920-40b6-8625-704c0ac19dbd-run-httpd\") pod \"ceilometer-0\" (UID: \"7fe43c7c-e920-40b6-8625-704c0ac19dbd\") " pod="openstack/ceilometer-0" Oct 08 14:39:15 crc kubenswrapper[4735]: I1008 14:39:15.975019 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fe43c7c-e920-40b6-8625-704c0ac19dbd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7fe43c7c-e920-40b6-8625-704c0ac19dbd\") " pod="openstack/ceilometer-0" Oct 08 14:39:15 crc kubenswrapper[4735]: I1008 14:39:15.975079 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7fe43c7c-e920-40b6-8625-704c0ac19dbd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7fe43c7c-e920-40b6-8625-704c0ac19dbd\") " pod="openstack/ceilometer-0" Oct 08 14:39:15 crc kubenswrapper[4735]: I1008 14:39:15.975211 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fe43c7c-e920-40b6-8625-704c0ac19dbd-config-data\") pod \"ceilometer-0\" (UID: \"7fe43c7c-e920-40b6-8625-704c0ac19dbd\") " pod="openstack/ceilometer-0" Oct 08 14:39:15 crc kubenswrapper[4735]: I1008 14:39:15.975980 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7fe43c7c-e920-40b6-8625-704c0ac19dbd-run-httpd\") pod \"ceilometer-0\" (UID: \"7fe43c7c-e920-40b6-8625-704c0ac19dbd\") " pod="openstack/ceilometer-0" Oct 08 14:39:15 
crc kubenswrapper[4735]: I1008 14:39:15.976546 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7fe43c7c-e920-40b6-8625-704c0ac19dbd-log-httpd\") pod \"ceilometer-0\" (UID: \"7fe43c7c-e920-40b6-8625-704c0ac19dbd\") " pod="openstack/ceilometer-0" Oct 08 14:39:15 crc kubenswrapper[4735]: I1008 14:39:15.987646 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fe43c7c-e920-40b6-8625-704c0ac19dbd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7fe43c7c-e920-40b6-8625-704c0ac19dbd\") " pod="openstack/ceilometer-0" Oct 08 14:39:15 crc kubenswrapper[4735]: I1008 14:39:15.992046 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7fe43c7c-e920-40b6-8625-704c0ac19dbd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7fe43c7c-e920-40b6-8625-704c0ac19dbd\") " pod="openstack/ceilometer-0" Oct 08 14:39:15 crc kubenswrapper[4735]: I1008 14:39:15.992392 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fe43c7c-e920-40b6-8625-704c0ac19dbd-scripts\") pod \"ceilometer-0\" (UID: \"7fe43c7c-e920-40b6-8625-704c0ac19dbd\") " pod="openstack/ceilometer-0" Oct 08 14:39:15 crc kubenswrapper[4735]: I1008 14:39:15.995709 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fe43c7c-e920-40b6-8625-704c0ac19dbd-config-data\") pod \"ceilometer-0\" (UID: \"7fe43c7c-e920-40b6-8625-704c0ac19dbd\") " pod="openstack/ceilometer-0" Oct 08 14:39:16 crc kubenswrapper[4735]: I1008 14:39:15.998411 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjj99\" (UniqueName: \"kubernetes.io/projected/7fe43c7c-e920-40b6-8625-704c0ac19dbd-kube-api-access-fjj99\") pod \"ceilometer-0\" (UID: 
\"7fe43c7c-e920-40b6-8625-704c0ac19dbd\") " pod="openstack/ceilometer-0" Oct 08 14:39:16 crc kubenswrapper[4735]: I1008 14:39:16.089805 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 14:39:16 crc kubenswrapper[4735]: W1008 14:39:16.891825 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7fe43c7c_e920_40b6_8625_704c0ac19dbd.slice/crio-57fae84403d6c73b42113ca535874923abc8341e3c9b61838a281348911b8d6f WatchSource:0}: Error finding container 57fae84403d6c73b42113ca535874923abc8341e3c9b61838a281348911b8d6f: Status 404 returned error can't find the container with id 57fae84403d6c73b42113ca535874923abc8341e3c9b61838a281348911b8d6f Oct 08 14:39:16 crc kubenswrapper[4735]: I1008 14:39:16.896564 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 08 14:39:16 crc kubenswrapper[4735]: I1008 14:39:16.906372 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 14:39:17 crc kubenswrapper[4735]: I1008 14:39:17.002183 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7fe43c7c-e920-40b6-8625-704c0ac19dbd","Type":"ContainerStarted","Data":"57fae84403d6c73b42113ca535874923abc8341e3c9b61838a281348911b8d6f"} Oct 08 14:39:18 crc kubenswrapper[4735]: I1008 14:39:18.058878 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-p8gzl"] Oct 08 14:39:18 crc kubenswrapper[4735]: I1008 14:39:18.074220 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-p8gzl"] Oct 08 14:39:18 crc kubenswrapper[4735]: I1008 14:39:18.905974 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc2f1bfd-72c0-4de9-a29e-a8258afbbf89" path="/var/lib/kubelet/pods/bc2f1bfd-72c0-4de9-a29e-a8258afbbf89/volumes" Oct 08 14:39:20 
crc kubenswrapper[4735]: I1008 14:39:20.032376 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7fe43c7c-e920-40b6-8625-704c0ac19dbd","Type":"ContainerStarted","Data":"f4f7ed581c1aa81dd35aaff2b7de8465fd629ed3773af06e3087ecf12076affe"} Oct 08 14:39:21 crc kubenswrapper[4735]: I1008 14:39:21.049551 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7fe43c7c-e920-40b6-8625-704c0ac19dbd","Type":"ContainerStarted","Data":"677073164fcd9b51e84e80d50248ad694f31e7db29215167d30b67d8d2beb0ca"} Oct 08 14:39:23 crc kubenswrapper[4735]: I1008 14:39:23.071762 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7fe43c7c-e920-40b6-8625-704c0ac19dbd","Type":"ContainerStarted","Data":"1820afc7047213c7c306d11787e50a571487312bea2e1ad7a5a99fa70d824c1f"} Oct 08 14:39:23 crc kubenswrapper[4735]: I1008 14:39:23.643118 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Oct 08 14:39:23 crc kubenswrapper[4735]: I1008 14:39:23.647517 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Oct 08 14:39:24 crc kubenswrapper[4735]: I1008 14:39:24.083583 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Oct 08 14:39:24 crc kubenswrapper[4735]: I1008 14:39:24.894567 4735 scope.go:117] "RemoveContainer" containerID="0ef7ce1bb2f88774957446f907a55957b94dad288444fb6f22993a47ba31f8b0" Oct 08 14:39:24 crc kubenswrapper[4735]: E1008 14:39:24.895558 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:39:25 crc kubenswrapper[4735]: I1008 14:39:25.094671 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7fe43c7c-e920-40b6-8625-704c0ac19dbd","Type":"ContainerStarted","Data":"d70448abd2cc52b87301a20ab7c209315b8aee6ce1dbf3df6d9b23a7dd3361a5"} Oct 08 14:39:25 crc kubenswrapper[4735]: I1008 14:39:25.095310 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 08 14:39:25 crc kubenswrapper[4735]: I1008 14:39:25.117150 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.510057327 podStartE2EDuration="10.117134426s" podCreationTimestamp="2025-10-08 14:39:15 +0000 UTC" firstStartedPulling="2025-10-08 14:39:16.896287747 +0000 UTC m=+6206.704592291" lastFinishedPulling="2025-10-08 14:39:24.503364846 +0000 UTC m=+6214.311669390" observedRunningTime="2025-10-08 14:39:25.115249298 +0000 UTC m=+6214.923553842" watchObservedRunningTime="2025-10-08 14:39:25.117134426 +0000 UTC m=+6214.925438970" Oct 08 14:39:32 crc kubenswrapper[4735]: I1008 14:39:32.570810 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-vphsj"] Oct 08 14:39:32 crc kubenswrapper[4735]: I1008 14:39:32.573208 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-vphsj" Oct 08 14:39:32 crc kubenswrapper[4735]: I1008 14:39:32.581340 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-vphsj"] Oct 08 14:39:32 crc kubenswrapper[4735]: I1008 14:39:32.657972 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbdkk\" (UniqueName: \"kubernetes.io/projected/b44fabde-cb8a-49d6-baaf-5b77929fa58b-kube-api-access-tbdkk\") pod \"aodh-db-create-vphsj\" (UID: \"b44fabde-cb8a-49d6-baaf-5b77929fa58b\") " pod="openstack/aodh-db-create-vphsj" Oct 08 14:39:32 crc kubenswrapper[4735]: I1008 14:39:32.761021 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbdkk\" (UniqueName: \"kubernetes.io/projected/b44fabde-cb8a-49d6-baaf-5b77929fa58b-kube-api-access-tbdkk\") pod \"aodh-db-create-vphsj\" (UID: \"b44fabde-cb8a-49d6-baaf-5b77929fa58b\") " pod="openstack/aodh-db-create-vphsj" Oct 08 14:39:32 crc kubenswrapper[4735]: I1008 14:39:32.789917 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbdkk\" (UniqueName: \"kubernetes.io/projected/b44fabde-cb8a-49d6-baaf-5b77929fa58b-kube-api-access-tbdkk\") pod \"aodh-db-create-vphsj\" (UID: \"b44fabde-cb8a-49d6-baaf-5b77929fa58b\") " pod="openstack/aodh-db-create-vphsj" Oct 08 14:39:32 crc kubenswrapper[4735]: I1008 14:39:32.916271 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-vphsj" Oct 08 14:39:33 crc kubenswrapper[4735]: I1008 14:39:33.401834 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-vphsj"] Oct 08 14:39:33 crc kubenswrapper[4735]: W1008 14:39:33.410537 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb44fabde_cb8a_49d6_baaf_5b77929fa58b.slice/crio-7c8057cf2d94bdc51d3f7e81d0a768f7f33cf9ab7393eb6b7f267c1e583d0f71 WatchSource:0}: Error finding container 7c8057cf2d94bdc51d3f7e81d0a768f7f33cf9ab7393eb6b7f267c1e583d0f71: Status 404 returned error can't find the container with id 7c8057cf2d94bdc51d3f7e81d0a768f7f33cf9ab7393eb6b7f267c1e583d0f71 Oct 08 14:39:34 crc kubenswrapper[4735]: I1008 14:39:34.206991 4735 generic.go:334] "Generic (PLEG): container finished" podID="b44fabde-cb8a-49d6-baaf-5b77929fa58b" containerID="6a5a1704ef569beb35a5e69d76a08ebade4be51449a01a7142a7d19568140b05" exitCode=0 Oct 08 14:39:34 crc kubenswrapper[4735]: I1008 14:39:34.207124 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-vphsj" event={"ID":"b44fabde-cb8a-49d6-baaf-5b77929fa58b","Type":"ContainerDied","Data":"6a5a1704ef569beb35a5e69d76a08ebade4be51449a01a7142a7d19568140b05"} Oct 08 14:39:34 crc kubenswrapper[4735]: I1008 14:39:34.207387 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-vphsj" event={"ID":"b44fabde-cb8a-49d6-baaf-5b77929fa58b","Type":"ContainerStarted","Data":"7c8057cf2d94bdc51d3f7e81d0a768f7f33cf9ab7393eb6b7f267c1e583d0f71"} Oct 08 14:39:35 crc kubenswrapper[4735]: I1008 14:39:35.700826 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-vphsj" Oct 08 14:39:35 crc kubenswrapper[4735]: I1008 14:39:35.726520 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbdkk\" (UniqueName: \"kubernetes.io/projected/b44fabde-cb8a-49d6-baaf-5b77929fa58b-kube-api-access-tbdkk\") pod \"b44fabde-cb8a-49d6-baaf-5b77929fa58b\" (UID: \"b44fabde-cb8a-49d6-baaf-5b77929fa58b\") " Oct 08 14:39:35 crc kubenswrapper[4735]: I1008 14:39:35.736351 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b44fabde-cb8a-49d6-baaf-5b77929fa58b-kube-api-access-tbdkk" (OuterVolumeSpecName: "kube-api-access-tbdkk") pod "b44fabde-cb8a-49d6-baaf-5b77929fa58b" (UID: "b44fabde-cb8a-49d6-baaf-5b77929fa58b"). InnerVolumeSpecName "kube-api-access-tbdkk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:39:35 crc kubenswrapper[4735]: I1008 14:39:35.829301 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tbdkk\" (UniqueName: \"kubernetes.io/projected/b44fabde-cb8a-49d6-baaf-5b77929fa58b-kube-api-access-tbdkk\") on node \"crc\" DevicePath \"\"" Oct 08 14:39:35 crc kubenswrapper[4735]: I1008 14:39:35.846755 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-t6kc9"] Oct 08 14:39:35 crc kubenswrapper[4735]: E1008 14:39:35.847312 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b44fabde-cb8a-49d6-baaf-5b77929fa58b" containerName="mariadb-database-create" Oct 08 14:39:35 crc kubenswrapper[4735]: I1008 14:39:35.847341 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="b44fabde-cb8a-49d6-baaf-5b77929fa58b" containerName="mariadb-database-create" Oct 08 14:39:35 crc kubenswrapper[4735]: I1008 14:39:35.847628 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="b44fabde-cb8a-49d6-baaf-5b77929fa58b" containerName="mariadb-database-create" Oct 08 14:39:35 crc 
kubenswrapper[4735]: I1008 14:39:35.849395 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t6kc9" Oct 08 14:39:35 crc kubenswrapper[4735]: I1008 14:39:35.849953 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t6kc9"] Oct 08 14:39:35 crc kubenswrapper[4735]: I1008 14:39:35.931703 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/651004c9-c2fe-4e99-b990-3a6e758fbf2e-catalog-content\") pod \"redhat-marketplace-t6kc9\" (UID: \"651004c9-c2fe-4e99-b990-3a6e758fbf2e\") " pod="openshift-marketplace/redhat-marketplace-t6kc9" Oct 08 14:39:35 crc kubenswrapper[4735]: I1008 14:39:35.931958 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/651004c9-c2fe-4e99-b990-3a6e758fbf2e-utilities\") pod \"redhat-marketplace-t6kc9\" (UID: \"651004c9-c2fe-4e99-b990-3a6e758fbf2e\") " pod="openshift-marketplace/redhat-marketplace-t6kc9" Oct 08 14:39:35 crc kubenswrapper[4735]: I1008 14:39:35.932051 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2shj\" (UniqueName: \"kubernetes.io/projected/651004c9-c2fe-4e99-b990-3a6e758fbf2e-kube-api-access-s2shj\") pod \"redhat-marketplace-t6kc9\" (UID: \"651004c9-c2fe-4e99-b990-3a6e758fbf2e\") " pod="openshift-marketplace/redhat-marketplace-t6kc9" Oct 08 14:39:36 crc kubenswrapper[4735]: I1008 14:39:36.037149 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/651004c9-c2fe-4e99-b990-3a6e758fbf2e-catalog-content\") pod \"redhat-marketplace-t6kc9\" (UID: \"651004c9-c2fe-4e99-b990-3a6e758fbf2e\") " pod="openshift-marketplace/redhat-marketplace-t6kc9" Oct 08 14:39:36 
crc kubenswrapper[4735]: I1008 14:39:36.037754 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/651004c9-c2fe-4e99-b990-3a6e758fbf2e-utilities\") pod \"redhat-marketplace-t6kc9\" (UID: \"651004c9-c2fe-4e99-b990-3a6e758fbf2e\") " pod="openshift-marketplace/redhat-marketplace-t6kc9" Oct 08 14:39:36 crc kubenswrapper[4735]: I1008 14:39:36.037860 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2shj\" (UniqueName: \"kubernetes.io/projected/651004c9-c2fe-4e99-b990-3a6e758fbf2e-kube-api-access-s2shj\") pod \"redhat-marketplace-t6kc9\" (UID: \"651004c9-c2fe-4e99-b990-3a6e758fbf2e\") " pod="openshift-marketplace/redhat-marketplace-t6kc9" Oct 08 14:39:36 crc kubenswrapper[4735]: I1008 14:39:36.038171 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/651004c9-c2fe-4e99-b990-3a6e758fbf2e-catalog-content\") pod \"redhat-marketplace-t6kc9\" (UID: \"651004c9-c2fe-4e99-b990-3a6e758fbf2e\") " pod="openshift-marketplace/redhat-marketplace-t6kc9" Oct 08 14:39:36 crc kubenswrapper[4735]: I1008 14:39:36.038569 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/651004c9-c2fe-4e99-b990-3a6e758fbf2e-utilities\") pod \"redhat-marketplace-t6kc9\" (UID: \"651004c9-c2fe-4e99-b990-3a6e758fbf2e\") " pod="openshift-marketplace/redhat-marketplace-t6kc9" Oct 08 14:39:36 crc kubenswrapper[4735]: I1008 14:39:36.058741 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2shj\" (UniqueName: \"kubernetes.io/projected/651004c9-c2fe-4e99-b990-3a6e758fbf2e-kube-api-access-s2shj\") pod \"redhat-marketplace-t6kc9\" (UID: \"651004c9-c2fe-4e99-b990-3a6e758fbf2e\") " pod="openshift-marketplace/redhat-marketplace-t6kc9" Oct 08 14:39:36 crc kubenswrapper[4735]: I1008 14:39:36.174604 
4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t6kc9" Oct 08 14:39:36 crc kubenswrapper[4735]: I1008 14:39:36.235146 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-vphsj" event={"ID":"b44fabde-cb8a-49d6-baaf-5b77929fa58b","Type":"ContainerDied","Data":"7c8057cf2d94bdc51d3f7e81d0a768f7f33cf9ab7393eb6b7f267c1e583d0f71"} Oct 08 14:39:36 crc kubenswrapper[4735]: I1008 14:39:36.235196 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c8057cf2d94bdc51d3f7e81d0a768f7f33cf9ab7393eb6b7f267c1e583d0f71" Oct 08 14:39:36 crc kubenswrapper[4735]: I1008 14:39:36.235268 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-vphsj" Oct 08 14:39:36 crc kubenswrapper[4735]: I1008 14:39:36.763602 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t6kc9"] Oct 08 14:39:37 crc kubenswrapper[4735]: I1008 14:39:37.035028 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-hvjqf"] Oct 08 14:39:37 crc kubenswrapper[4735]: I1008 14:39:37.045597 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-hvjqf"] Oct 08 14:39:37 crc kubenswrapper[4735]: I1008 14:39:37.246314 4735 generic.go:334] "Generic (PLEG): container finished" podID="651004c9-c2fe-4e99-b990-3a6e758fbf2e" containerID="095517a17b48f8c97570959ed2b2fa8fdf66ad8848ed142184c521a5d2b3dd44" exitCode=0 Oct 08 14:39:37 crc kubenswrapper[4735]: I1008 14:39:37.246610 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t6kc9" event={"ID":"651004c9-c2fe-4e99-b990-3a6e758fbf2e","Type":"ContainerDied","Data":"095517a17b48f8c97570959ed2b2fa8fdf66ad8848ed142184c521a5d2b3dd44"} Oct 08 14:39:37 crc kubenswrapper[4735]: I1008 14:39:37.246634 4735 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t6kc9" event={"ID":"651004c9-c2fe-4e99-b990-3a6e758fbf2e","Type":"ContainerStarted","Data":"707946bd9f55035f59ed5eb3d9236a83054d977f500c1c8e1b5160abef8a316e"} Oct 08 14:39:38 crc kubenswrapper[4735]: I1008 14:39:38.028561 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-j9m8t"] Oct 08 14:39:38 crc kubenswrapper[4735]: I1008 14:39:38.037136 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-j9m8t"] Oct 08 14:39:38 crc kubenswrapper[4735]: I1008 14:39:38.907575 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cd80f17-b1da-4eaa-a962-01cb94a5abe3" path="/var/lib/kubelet/pods/4cd80f17-b1da-4eaa-a962-01cb94a5abe3/volumes" Oct 08 14:39:38 crc kubenswrapper[4735]: I1008 14:39:38.909818 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6025ac38-ff1c-464a-84a4-75dfabd4bb94" path="/var/lib/kubelet/pods/6025ac38-ff1c-464a-84a4-75dfabd4bb94/volumes" Oct 08 14:39:39 crc kubenswrapper[4735]: I1008 14:39:39.266522 4735 generic.go:334] "Generic (PLEG): container finished" podID="651004c9-c2fe-4e99-b990-3a6e758fbf2e" containerID="154982b3ac679d1ad40996b29422e72136f638cfa7fecbe5552d849400454651" exitCode=0 Oct 08 14:39:39 crc kubenswrapper[4735]: I1008 14:39:39.266568 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t6kc9" event={"ID":"651004c9-c2fe-4e99-b990-3a6e758fbf2e","Type":"ContainerDied","Data":"154982b3ac679d1ad40996b29422e72136f638cfa7fecbe5552d849400454651"} Oct 08 14:39:39 crc kubenswrapper[4735]: I1008 14:39:39.894440 4735 scope.go:117] "RemoveContainer" containerID="0ef7ce1bb2f88774957446f907a55957b94dad288444fb6f22993a47ba31f8b0" Oct 08 14:39:39 crc kubenswrapper[4735]: E1008 14:39:39.894818 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:39:41 crc kubenswrapper[4735]: I1008 14:39:41.285224 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t6kc9" event={"ID":"651004c9-c2fe-4e99-b990-3a6e758fbf2e","Type":"ContainerStarted","Data":"b602a47a1c1142d17ce0c66dfa4537e979ac8191ef2db79133b813429185eba3"} Oct 08 14:39:41 crc kubenswrapper[4735]: I1008 14:39:41.311663 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-t6kc9" podStartSLOduration=3.4752594390000002 podStartE2EDuration="6.311603319s" podCreationTimestamp="2025-10-08 14:39:35 +0000 UTC" firstStartedPulling="2025-10-08 14:39:37.249155947 +0000 UTC m=+6227.057460491" lastFinishedPulling="2025-10-08 14:39:40.085499827 +0000 UTC m=+6229.893804371" observedRunningTime="2025-10-08 14:39:41.305215596 +0000 UTC m=+6231.113520150" watchObservedRunningTime="2025-10-08 14:39:41.311603319 +0000 UTC m=+6231.119907863" Oct 08 14:39:42 crc kubenswrapper[4735]: I1008 14:39:42.618170 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-a871-account-create-cvwbz"] Oct 08 14:39:42 crc kubenswrapper[4735]: I1008 14:39:42.620128 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-a871-account-create-cvwbz" Oct 08 14:39:42 crc kubenswrapper[4735]: I1008 14:39:42.626677 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Oct 08 14:39:42 crc kubenswrapper[4735]: I1008 14:39:42.629187 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-a871-account-create-cvwbz"] Oct 08 14:39:42 crc kubenswrapper[4735]: I1008 14:39:42.777160 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtfhl\" (UniqueName: \"kubernetes.io/projected/8a010e60-5c2d-494b-9f68-e98573cff37d-kube-api-access-wtfhl\") pod \"aodh-a871-account-create-cvwbz\" (UID: \"8a010e60-5c2d-494b-9f68-e98573cff37d\") " pod="openstack/aodh-a871-account-create-cvwbz" Oct 08 14:39:42 crc kubenswrapper[4735]: I1008 14:39:42.879278 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtfhl\" (UniqueName: \"kubernetes.io/projected/8a010e60-5c2d-494b-9f68-e98573cff37d-kube-api-access-wtfhl\") pod \"aodh-a871-account-create-cvwbz\" (UID: \"8a010e60-5c2d-494b-9f68-e98573cff37d\") " pod="openstack/aodh-a871-account-create-cvwbz" Oct 08 14:39:42 crc kubenswrapper[4735]: I1008 14:39:42.909936 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtfhl\" (UniqueName: \"kubernetes.io/projected/8a010e60-5c2d-494b-9f68-e98573cff37d-kube-api-access-wtfhl\") pod \"aodh-a871-account-create-cvwbz\" (UID: \"8a010e60-5c2d-494b-9f68-e98573cff37d\") " pod="openstack/aodh-a871-account-create-cvwbz" Oct 08 14:39:42 crc kubenswrapper[4735]: I1008 14:39:42.941722 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-a871-account-create-cvwbz" Oct 08 14:39:43 crc kubenswrapper[4735]: I1008 14:39:43.428277 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-a871-account-create-cvwbz"] Oct 08 14:39:43 crc kubenswrapper[4735]: W1008 14:39:43.433541 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a010e60_5c2d_494b_9f68_e98573cff37d.slice/crio-7a4d5a08bc124976215e0ef0a453ef11a10378b8692be5026d7340ae400d356d WatchSource:0}: Error finding container 7a4d5a08bc124976215e0ef0a453ef11a10378b8692be5026d7340ae400d356d: Status 404 returned error can't find the container with id 7a4d5a08bc124976215e0ef0a453ef11a10378b8692be5026d7340ae400d356d Oct 08 14:39:44 crc kubenswrapper[4735]: I1008 14:39:44.325195 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-a871-account-create-cvwbz" event={"ID":"8a010e60-5c2d-494b-9f68-e98573cff37d","Type":"ContainerDied","Data":"23d32286f38099704d0c517e48fb79fcb6ba4cd95ef97a4812f892c94b81d750"} Oct 08 14:39:44 crc kubenswrapper[4735]: I1008 14:39:44.324744 4735 generic.go:334] "Generic (PLEG): container finished" podID="8a010e60-5c2d-494b-9f68-e98573cff37d" containerID="23d32286f38099704d0c517e48fb79fcb6ba4cd95ef97a4812f892c94b81d750" exitCode=0 Oct 08 14:39:44 crc kubenswrapper[4735]: I1008 14:39:44.327806 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-a871-account-create-cvwbz" event={"ID":"8a010e60-5c2d-494b-9f68-e98573cff37d","Type":"ContainerStarted","Data":"7a4d5a08bc124976215e0ef0a453ef11a10378b8692be5026d7340ae400d356d"} Oct 08 14:39:45 crc kubenswrapper[4735]: I1008 14:39:45.769337 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-a871-account-create-cvwbz" Oct 08 14:39:45 crc kubenswrapper[4735]: I1008 14:39:45.939435 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtfhl\" (UniqueName: \"kubernetes.io/projected/8a010e60-5c2d-494b-9f68-e98573cff37d-kube-api-access-wtfhl\") pod \"8a010e60-5c2d-494b-9f68-e98573cff37d\" (UID: \"8a010e60-5c2d-494b-9f68-e98573cff37d\") " Oct 08 14:39:45 crc kubenswrapper[4735]: I1008 14:39:45.946074 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a010e60-5c2d-494b-9f68-e98573cff37d-kube-api-access-wtfhl" (OuterVolumeSpecName: "kube-api-access-wtfhl") pod "8a010e60-5c2d-494b-9f68-e98573cff37d" (UID: "8a010e60-5c2d-494b-9f68-e98573cff37d"). InnerVolumeSpecName "kube-api-access-wtfhl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:39:46 crc kubenswrapper[4735]: I1008 14:39:46.041955 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wtfhl\" (UniqueName: \"kubernetes.io/projected/8a010e60-5c2d-494b-9f68-e98573cff37d-kube-api-access-wtfhl\") on node \"crc\" DevicePath \"\"" Oct 08 14:39:46 crc kubenswrapper[4735]: I1008 14:39:46.103700 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 08 14:39:46 crc kubenswrapper[4735]: I1008 14:39:46.175457 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-t6kc9" Oct 08 14:39:46 crc kubenswrapper[4735]: I1008 14:39:46.175538 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-t6kc9" Oct 08 14:39:46 crc kubenswrapper[4735]: I1008 14:39:46.228604 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-t6kc9" Oct 08 14:39:46 crc kubenswrapper[4735]: I1008 14:39:46.345669 4735 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-a871-account-create-cvwbz" Oct 08 14:39:46 crc kubenswrapper[4735]: I1008 14:39:46.345716 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-a871-account-create-cvwbz" event={"ID":"8a010e60-5c2d-494b-9f68-e98573cff37d","Type":"ContainerDied","Data":"7a4d5a08bc124976215e0ef0a453ef11a10378b8692be5026d7340ae400d356d"} Oct 08 14:39:46 crc kubenswrapper[4735]: I1008 14:39:46.345742 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a4d5a08bc124976215e0ef0a453ef11a10378b8692be5026d7340ae400d356d" Oct 08 14:39:46 crc kubenswrapper[4735]: I1008 14:39:46.392110 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-t6kc9" Oct 08 14:39:46 crc kubenswrapper[4735]: I1008 14:39:46.470310 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-t6kc9"] Oct 08 14:39:47 crc kubenswrapper[4735]: I1008 14:39:47.954427 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-bl2lv"] Oct 08 14:39:47 crc kubenswrapper[4735]: E1008 14:39:47.955447 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a010e60-5c2d-494b-9f68-e98573cff37d" containerName="mariadb-account-create" Oct 08 14:39:47 crc kubenswrapper[4735]: I1008 14:39:47.955464 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a010e60-5c2d-494b-9f68-e98573cff37d" containerName="mariadb-account-create" Oct 08 14:39:47 crc kubenswrapper[4735]: I1008 14:39:47.955751 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a010e60-5c2d-494b-9f68-e98573cff37d" containerName="mariadb-account-create" Oct 08 14:39:47 crc kubenswrapper[4735]: I1008 14:39:47.956754 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-bl2lv" Oct 08 14:39:47 crc kubenswrapper[4735]: I1008 14:39:47.960419 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-dq9mb" Oct 08 14:39:47 crc kubenswrapper[4735]: I1008 14:39:47.960690 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Oct 08 14:39:47 crc kubenswrapper[4735]: I1008 14:39:47.960986 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Oct 08 14:39:47 crc kubenswrapper[4735]: I1008 14:39:47.967145 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-bl2lv"] Oct 08 14:39:48 crc kubenswrapper[4735]: I1008 14:39:48.086134 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4399e6dc-523a-4a68-9b4a-aca82d1d861f-combined-ca-bundle\") pod \"aodh-db-sync-bl2lv\" (UID: \"4399e6dc-523a-4a68-9b4a-aca82d1d861f\") " pod="openstack/aodh-db-sync-bl2lv" Oct 08 14:39:48 crc kubenswrapper[4735]: I1008 14:39:48.086208 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4399e6dc-523a-4a68-9b4a-aca82d1d861f-config-data\") pod \"aodh-db-sync-bl2lv\" (UID: \"4399e6dc-523a-4a68-9b4a-aca82d1d861f\") " pod="openstack/aodh-db-sync-bl2lv" Oct 08 14:39:48 crc kubenswrapper[4735]: I1008 14:39:48.086446 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbc46\" (UniqueName: \"kubernetes.io/projected/4399e6dc-523a-4a68-9b4a-aca82d1d861f-kube-api-access-hbc46\") pod \"aodh-db-sync-bl2lv\" (UID: \"4399e6dc-523a-4a68-9b4a-aca82d1d861f\") " pod="openstack/aodh-db-sync-bl2lv" Oct 08 14:39:48 crc kubenswrapper[4735]: I1008 14:39:48.086487 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4399e6dc-523a-4a68-9b4a-aca82d1d861f-scripts\") pod \"aodh-db-sync-bl2lv\" (UID: \"4399e6dc-523a-4a68-9b4a-aca82d1d861f\") " pod="openstack/aodh-db-sync-bl2lv" Oct 08 14:39:48 crc kubenswrapper[4735]: I1008 14:39:48.187998 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4399e6dc-523a-4a68-9b4a-aca82d1d861f-scripts\") pod \"aodh-db-sync-bl2lv\" (UID: \"4399e6dc-523a-4a68-9b4a-aca82d1d861f\") " pod="openstack/aodh-db-sync-bl2lv" Oct 08 14:39:48 crc kubenswrapper[4735]: I1008 14:39:48.188051 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbc46\" (UniqueName: \"kubernetes.io/projected/4399e6dc-523a-4a68-9b4a-aca82d1d861f-kube-api-access-hbc46\") pod \"aodh-db-sync-bl2lv\" (UID: \"4399e6dc-523a-4a68-9b4a-aca82d1d861f\") " pod="openstack/aodh-db-sync-bl2lv" Oct 08 14:39:48 crc kubenswrapper[4735]: I1008 14:39:48.188160 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4399e6dc-523a-4a68-9b4a-aca82d1d861f-combined-ca-bundle\") pod \"aodh-db-sync-bl2lv\" (UID: \"4399e6dc-523a-4a68-9b4a-aca82d1d861f\") " pod="openstack/aodh-db-sync-bl2lv" Oct 08 14:39:48 crc kubenswrapper[4735]: I1008 14:39:48.188209 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4399e6dc-523a-4a68-9b4a-aca82d1d861f-config-data\") pod \"aodh-db-sync-bl2lv\" (UID: \"4399e6dc-523a-4a68-9b4a-aca82d1d861f\") " pod="openstack/aodh-db-sync-bl2lv" Oct 08 14:39:48 crc kubenswrapper[4735]: I1008 14:39:48.193627 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4399e6dc-523a-4a68-9b4a-aca82d1d861f-combined-ca-bundle\") 
pod \"aodh-db-sync-bl2lv\" (UID: \"4399e6dc-523a-4a68-9b4a-aca82d1d861f\") " pod="openstack/aodh-db-sync-bl2lv" Oct 08 14:39:48 crc kubenswrapper[4735]: I1008 14:39:48.193760 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4399e6dc-523a-4a68-9b4a-aca82d1d861f-config-data\") pod \"aodh-db-sync-bl2lv\" (UID: \"4399e6dc-523a-4a68-9b4a-aca82d1d861f\") " pod="openstack/aodh-db-sync-bl2lv" Oct 08 14:39:48 crc kubenswrapper[4735]: I1008 14:39:48.193799 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4399e6dc-523a-4a68-9b4a-aca82d1d861f-scripts\") pod \"aodh-db-sync-bl2lv\" (UID: \"4399e6dc-523a-4a68-9b4a-aca82d1d861f\") " pod="openstack/aodh-db-sync-bl2lv" Oct 08 14:39:48 crc kubenswrapper[4735]: I1008 14:39:48.214058 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbc46\" (UniqueName: \"kubernetes.io/projected/4399e6dc-523a-4a68-9b4a-aca82d1d861f-kube-api-access-hbc46\") pod \"aodh-db-sync-bl2lv\" (UID: \"4399e6dc-523a-4a68-9b4a-aca82d1d861f\") " pod="openstack/aodh-db-sync-bl2lv" Oct 08 14:39:48 crc kubenswrapper[4735]: I1008 14:39:48.288667 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-bl2lv" Oct 08 14:39:48 crc kubenswrapper[4735]: I1008 14:39:48.377338 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-t6kc9" podUID="651004c9-c2fe-4e99-b990-3a6e758fbf2e" containerName="registry-server" containerID="cri-o://b602a47a1c1142d17ce0c66dfa4537e979ac8191ef2db79133b813429185eba3" gracePeriod=2 Oct 08 14:39:48 crc kubenswrapper[4735]: I1008 14:39:48.984931 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-bl2lv"] Oct 08 14:39:49 crc kubenswrapper[4735]: I1008 14:39:49.114613 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t6kc9" Oct 08 14:39:49 crc kubenswrapper[4735]: I1008 14:39:49.219177 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/651004c9-c2fe-4e99-b990-3a6e758fbf2e-catalog-content\") pod \"651004c9-c2fe-4e99-b990-3a6e758fbf2e\" (UID: \"651004c9-c2fe-4e99-b990-3a6e758fbf2e\") " Oct 08 14:39:49 crc kubenswrapper[4735]: I1008 14:39:49.219337 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/651004c9-c2fe-4e99-b990-3a6e758fbf2e-utilities\") pod \"651004c9-c2fe-4e99-b990-3a6e758fbf2e\" (UID: \"651004c9-c2fe-4e99-b990-3a6e758fbf2e\") " Oct 08 14:39:49 crc kubenswrapper[4735]: I1008 14:39:49.219587 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2shj\" (UniqueName: \"kubernetes.io/projected/651004c9-c2fe-4e99-b990-3a6e758fbf2e-kube-api-access-s2shj\") pod \"651004c9-c2fe-4e99-b990-3a6e758fbf2e\" (UID: \"651004c9-c2fe-4e99-b990-3a6e758fbf2e\") " Oct 08 14:39:49 crc kubenswrapper[4735]: I1008 14:39:49.221283 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/651004c9-c2fe-4e99-b990-3a6e758fbf2e-utilities" (OuterVolumeSpecName: "utilities") pod "651004c9-c2fe-4e99-b990-3a6e758fbf2e" (UID: "651004c9-c2fe-4e99-b990-3a6e758fbf2e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 14:39:49 crc kubenswrapper[4735]: I1008 14:39:49.229628 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/651004c9-c2fe-4e99-b990-3a6e758fbf2e-kube-api-access-s2shj" (OuterVolumeSpecName: "kube-api-access-s2shj") pod "651004c9-c2fe-4e99-b990-3a6e758fbf2e" (UID: "651004c9-c2fe-4e99-b990-3a6e758fbf2e"). InnerVolumeSpecName "kube-api-access-s2shj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:39:49 crc kubenswrapper[4735]: I1008 14:39:49.246769 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/651004c9-c2fe-4e99-b990-3a6e758fbf2e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "651004c9-c2fe-4e99-b990-3a6e758fbf2e" (UID: "651004c9-c2fe-4e99-b990-3a6e758fbf2e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 14:39:49 crc kubenswrapper[4735]: I1008 14:39:49.322539 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2shj\" (UniqueName: \"kubernetes.io/projected/651004c9-c2fe-4e99-b990-3a6e758fbf2e-kube-api-access-s2shj\") on node \"crc\" DevicePath \"\"" Oct 08 14:39:49 crc kubenswrapper[4735]: I1008 14:39:49.322592 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/651004c9-c2fe-4e99-b990-3a6e758fbf2e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 14:39:49 crc kubenswrapper[4735]: I1008 14:39:49.323322 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/651004c9-c2fe-4e99-b990-3a6e758fbf2e-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 14:39:49 crc kubenswrapper[4735]: I1008 14:39:49.388386 4735 generic.go:334] "Generic (PLEG): container finished" podID="651004c9-c2fe-4e99-b990-3a6e758fbf2e" containerID="b602a47a1c1142d17ce0c66dfa4537e979ac8191ef2db79133b813429185eba3" exitCode=0 Oct 08 14:39:49 crc kubenswrapper[4735]: I1008 14:39:49.388452 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t6kc9" Oct 08 14:39:49 crc kubenswrapper[4735]: I1008 14:39:49.388456 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t6kc9" event={"ID":"651004c9-c2fe-4e99-b990-3a6e758fbf2e","Type":"ContainerDied","Data":"b602a47a1c1142d17ce0c66dfa4537e979ac8191ef2db79133b813429185eba3"} Oct 08 14:39:49 crc kubenswrapper[4735]: I1008 14:39:49.388510 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t6kc9" event={"ID":"651004c9-c2fe-4e99-b990-3a6e758fbf2e","Type":"ContainerDied","Data":"707946bd9f55035f59ed5eb3d9236a83054d977f500c1c8e1b5160abef8a316e"} Oct 08 14:39:49 crc kubenswrapper[4735]: I1008 14:39:49.388527 4735 scope.go:117] "RemoveContainer" containerID="b602a47a1c1142d17ce0c66dfa4537e979ac8191ef2db79133b813429185eba3" Oct 08 14:39:49 crc kubenswrapper[4735]: I1008 14:39:49.393283 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-bl2lv" event={"ID":"4399e6dc-523a-4a68-9b4a-aca82d1d861f","Type":"ContainerStarted","Data":"5ae431eedc72a26e056571fc9cbca2ce2aabccb8e25e821fe61dfdb18bca9404"} Oct 08 14:39:49 crc kubenswrapper[4735]: I1008 14:39:49.435471 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-t6kc9"] Oct 08 14:39:49 crc kubenswrapper[4735]: I1008 14:39:49.435742 4735 scope.go:117] "RemoveContainer" containerID="154982b3ac679d1ad40996b29422e72136f638cfa7fecbe5552d849400454651" Oct 08 14:39:49 crc kubenswrapper[4735]: I1008 14:39:49.443438 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-t6kc9"] Oct 08 14:39:49 crc kubenswrapper[4735]: I1008 14:39:49.469238 4735 scope.go:117] "RemoveContainer" containerID="095517a17b48f8c97570959ed2b2fa8fdf66ad8848ed142184c521a5d2b3dd44" Oct 08 14:39:49 crc kubenswrapper[4735]: I1008 14:39:49.511456 4735 scope.go:117] 
"RemoveContainer" containerID="b602a47a1c1142d17ce0c66dfa4537e979ac8191ef2db79133b813429185eba3" Oct 08 14:39:49 crc kubenswrapper[4735]: E1008 14:39:49.512051 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b602a47a1c1142d17ce0c66dfa4537e979ac8191ef2db79133b813429185eba3\": container with ID starting with b602a47a1c1142d17ce0c66dfa4537e979ac8191ef2db79133b813429185eba3 not found: ID does not exist" containerID="b602a47a1c1142d17ce0c66dfa4537e979ac8191ef2db79133b813429185eba3" Oct 08 14:39:49 crc kubenswrapper[4735]: I1008 14:39:49.512100 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b602a47a1c1142d17ce0c66dfa4537e979ac8191ef2db79133b813429185eba3"} err="failed to get container status \"b602a47a1c1142d17ce0c66dfa4537e979ac8191ef2db79133b813429185eba3\": rpc error: code = NotFound desc = could not find container \"b602a47a1c1142d17ce0c66dfa4537e979ac8191ef2db79133b813429185eba3\": container with ID starting with b602a47a1c1142d17ce0c66dfa4537e979ac8191ef2db79133b813429185eba3 not found: ID does not exist" Oct 08 14:39:49 crc kubenswrapper[4735]: I1008 14:39:49.512131 4735 scope.go:117] "RemoveContainer" containerID="154982b3ac679d1ad40996b29422e72136f638cfa7fecbe5552d849400454651" Oct 08 14:39:49 crc kubenswrapper[4735]: E1008 14:39:49.512532 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"154982b3ac679d1ad40996b29422e72136f638cfa7fecbe5552d849400454651\": container with ID starting with 154982b3ac679d1ad40996b29422e72136f638cfa7fecbe5552d849400454651 not found: ID does not exist" containerID="154982b3ac679d1ad40996b29422e72136f638cfa7fecbe5552d849400454651" Oct 08 14:39:49 crc kubenswrapper[4735]: I1008 14:39:49.512565 4735 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"154982b3ac679d1ad40996b29422e72136f638cfa7fecbe5552d849400454651"} err="failed to get container status \"154982b3ac679d1ad40996b29422e72136f638cfa7fecbe5552d849400454651\": rpc error: code = NotFound desc = could not find container \"154982b3ac679d1ad40996b29422e72136f638cfa7fecbe5552d849400454651\": container with ID starting with 154982b3ac679d1ad40996b29422e72136f638cfa7fecbe5552d849400454651 not found: ID does not exist" Oct 08 14:39:49 crc kubenswrapper[4735]: I1008 14:39:49.512587 4735 scope.go:117] "RemoveContainer" containerID="095517a17b48f8c97570959ed2b2fa8fdf66ad8848ed142184c521a5d2b3dd44" Oct 08 14:39:49 crc kubenswrapper[4735]: E1008 14:39:49.512994 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"095517a17b48f8c97570959ed2b2fa8fdf66ad8848ed142184c521a5d2b3dd44\": container with ID starting with 095517a17b48f8c97570959ed2b2fa8fdf66ad8848ed142184c521a5d2b3dd44 not found: ID does not exist" containerID="095517a17b48f8c97570959ed2b2fa8fdf66ad8848ed142184c521a5d2b3dd44" Oct 08 14:39:49 crc kubenswrapper[4735]: I1008 14:39:49.513027 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"095517a17b48f8c97570959ed2b2fa8fdf66ad8848ed142184c521a5d2b3dd44"} err="failed to get container status \"095517a17b48f8c97570959ed2b2fa8fdf66ad8848ed142184c521a5d2b3dd44\": rpc error: code = NotFound desc = could not find container \"095517a17b48f8c97570959ed2b2fa8fdf66ad8848ed142184c521a5d2b3dd44\": container with ID starting with 095517a17b48f8c97570959ed2b2fa8fdf66ad8848ed142184c521a5d2b3dd44 not found: ID does not exist" Oct 08 14:39:50 crc kubenswrapper[4735]: I1008 14:39:50.903688 4735 scope.go:117] "RemoveContainer" containerID="0ef7ce1bb2f88774957446f907a55957b94dad288444fb6f22993a47ba31f8b0" Oct 08 14:39:50 crc kubenswrapper[4735]: E1008 14:39:50.904058 4735 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:39:50 crc kubenswrapper[4735]: I1008 14:39:50.913851 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="651004c9-c2fe-4e99-b990-3a6e758fbf2e" path="/var/lib/kubelet/pods/651004c9-c2fe-4e99-b990-3a6e758fbf2e/volumes" Oct 08 14:39:52 crc kubenswrapper[4735]: I1008 14:39:52.056667 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-pn8vf"] Oct 08 14:39:52 crc kubenswrapper[4735]: I1008 14:39:52.065949 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-pn8vf"] Oct 08 14:39:52 crc kubenswrapper[4735]: I1008 14:39:52.905657 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5112d677-fdd4-4ea0-8b93-fa7871c95519" path="/var/lib/kubelet/pods/5112d677-fdd4-4ea0-8b93-fa7871c95519/volumes" Oct 08 14:39:56 crc kubenswrapper[4735]: I1008 14:39:56.469632 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-bl2lv" event={"ID":"4399e6dc-523a-4a68-9b4a-aca82d1d861f","Type":"ContainerStarted","Data":"d93cd0769eb93f9da7731dfefb9f8478b3527e196d42e5b8f373014c82a24abf"} Oct 08 14:39:56 crc kubenswrapper[4735]: I1008 14:39:56.497635 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-bl2lv" podStartSLOduration=2.968689495 podStartE2EDuration="9.497606536s" podCreationTimestamp="2025-10-08 14:39:47 +0000 UTC" firstStartedPulling="2025-10-08 14:39:48.995068281 +0000 UTC m=+6238.803372825" lastFinishedPulling="2025-10-08 14:39:55.523985322 +0000 UTC m=+6245.332289866" observedRunningTime="2025-10-08 
14:39:56.486555313 +0000 UTC m=+6246.294859857" watchObservedRunningTime="2025-10-08 14:39:56.497606536 +0000 UTC m=+6246.305911080" Oct 08 14:39:58 crc kubenswrapper[4735]: I1008 14:39:58.493417 4735 generic.go:334] "Generic (PLEG): container finished" podID="4399e6dc-523a-4a68-9b4a-aca82d1d861f" containerID="d93cd0769eb93f9da7731dfefb9f8478b3527e196d42e5b8f373014c82a24abf" exitCode=0 Oct 08 14:39:58 crc kubenswrapper[4735]: I1008 14:39:58.493506 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-bl2lv" event={"ID":"4399e6dc-523a-4a68-9b4a-aca82d1d861f","Type":"ContainerDied","Data":"d93cd0769eb93f9da7731dfefb9f8478b3527e196d42e5b8f373014c82a24abf"} Oct 08 14:39:59 crc kubenswrapper[4735]: I1008 14:39:59.918382 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-bl2lv" Oct 08 14:39:59 crc kubenswrapper[4735]: I1008 14:39:59.967087 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4399e6dc-523a-4a68-9b4a-aca82d1d861f-config-data\") pod \"4399e6dc-523a-4a68-9b4a-aca82d1d861f\" (UID: \"4399e6dc-523a-4a68-9b4a-aca82d1d861f\") " Oct 08 14:39:59 crc kubenswrapper[4735]: I1008 14:39:59.967134 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbc46\" (UniqueName: \"kubernetes.io/projected/4399e6dc-523a-4a68-9b4a-aca82d1d861f-kube-api-access-hbc46\") pod \"4399e6dc-523a-4a68-9b4a-aca82d1d861f\" (UID: \"4399e6dc-523a-4a68-9b4a-aca82d1d861f\") " Oct 08 14:39:59 crc kubenswrapper[4735]: I1008 14:39:59.967157 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4399e6dc-523a-4a68-9b4a-aca82d1d861f-combined-ca-bundle\") pod \"4399e6dc-523a-4a68-9b4a-aca82d1d861f\" (UID: \"4399e6dc-523a-4a68-9b4a-aca82d1d861f\") " Oct 08 14:39:59 crc kubenswrapper[4735]: I1008 
14:39:59.967183 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4399e6dc-523a-4a68-9b4a-aca82d1d861f-scripts\") pod \"4399e6dc-523a-4a68-9b4a-aca82d1d861f\" (UID: \"4399e6dc-523a-4a68-9b4a-aca82d1d861f\") " Oct 08 14:39:59 crc kubenswrapper[4735]: I1008 14:39:59.974714 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4399e6dc-523a-4a68-9b4a-aca82d1d861f-scripts" (OuterVolumeSpecName: "scripts") pod "4399e6dc-523a-4a68-9b4a-aca82d1d861f" (UID: "4399e6dc-523a-4a68-9b4a-aca82d1d861f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:39:59 crc kubenswrapper[4735]: I1008 14:39:59.986144 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4399e6dc-523a-4a68-9b4a-aca82d1d861f-kube-api-access-hbc46" (OuterVolumeSpecName: "kube-api-access-hbc46") pod "4399e6dc-523a-4a68-9b4a-aca82d1d861f" (UID: "4399e6dc-523a-4a68-9b4a-aca82d1d861f"). InnerVolumeSpecName "kube-api-access-hbc46". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:39:59 crc kubenswrapper[4735]: I1008 14:39:59.999393 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4399e6dc-523a-4a68-9b4a-aca82d1d861f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4399e6dc-523a-4a68-9b4a-aca82d1d861f" (UID: "4399e6dc-523a-4a68-9b4a-aca82d1d861f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:40:00 crc kubenswrapper[4735]: I1008 14:40:00.003723 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4399e6dc-523a-4a68-9b4a-aca82d1d861f-config-data" (OuterVolumeSpecName: "config-data") pod "4399e6dc-523a-4a68-9b4a-aca82d1d861f" (UID: "4399e6dc-523a-4a68-9b4a-aca82d1d861f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:40:00 crc kubenswrapper[4735]: I1008 14:40:00.071015 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4399e6dc-523a-4a68-9b4a-aca82d1d861f-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 14:40:00 crc kubenswrapper[4735]: I1008 14:40:00.071058 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbc46\" (UniqueName: \"kubernetes.io/projected/4399e6dc-523a-4a68-9b4a-aca82d1d861f-kube-api-access-hbc46\") on node \"crc\" DevicePath \"\"" Oct 08 14:40:00 crc kubenswrapper[4735]: I1008 14:40:00.071073 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4399e6dc-523a-4a68-9b4a-aca82d1d861f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 14:40:00 crc kubenswrapper[4735]: I1008 14:40:00.071084 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4399e6dc-523a-4a68-9b4a-aca82d1d861f-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 14:40:00 crc kubenswrapper[4735]: I1008 14:40:00.522874 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-bl2lv" event={"ID":"4399e6dc-523a-4a68-9b4a-aca82d1d861f","Type":"ContainerDied","Data":"5ae431eedc72a26e056571fc9cbca2ce2aabccb8e25e821fe61dfdb18bca9404"} Oct 08 14:40:00 crc kubenswrapper[4735]: I1008 14:40:00.522947 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ae431eedc72a26e056571fc9cbca2ce2aabccb8e25e821fe61dfdb18bca9404" Oct 08 14:40:00 crc kubenswrapper[4735]: I1008 14:40:00.523119 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-bl2lv" Oct 08 14:40:02 crc kubenswrapper[4735]: I1008 14:40:02.585516 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Oct 08 14:40:02 crc kubenswrapper[4735]: E1008 14:40:02.586396 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="651004c9-c2fe-4e99-b990-3a6e758fbf2e" containerName="extract-utilities" Oct 08 14:40:02 crc kubenswrapper[4735]: I1008 14:40:02.586415 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="651004c9-c2fe-4e99-b990-3a6e758fbf2e" containerName="extract-utilities" Oct 08 14:40:02 crc kubenswrapper[4735]: E1008 14:40:02.586427 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4399e6dc-523a-4a68-9b4a-aca82d1d861f" containerName="aodh-db-sync" Oct 08 14:40:02 crc kubenswrapper[4735]: I1008 14:40:02.586435 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="4399e6dc-523a-4a68-9b4a-aca82d1d861f" containerName="aodh-db-sync" Oct 08 14:40:02 crc kubenswrapper[4735]: E1008 14:40:02.586470 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="651004c9-c2fe-4e99-b990-3a6e758fbf2e" containerName="extract-content" Oct 08 14:40:02 crc kubenswrapper[4735]: I1008 14:40:02.586495 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="651004c9-c2fe-4e99-b990-3a6e758fbf2e" containerName="extract-content" Oct 08 14:40:02 crc kubenswrapper[4735]: E1008 14:40:02.586508 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="651004c9-c2fe-4e99-b990-3a6e758fbf2e" containerName="registry-server" Oct 08 14:40:02 crc kubenswrapper[4735]: I1008 14:40:02.586515 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="651004c9-c2fe-4e99-b990-3a6e758fbf2e" containerName="registry-server" Oct 08 14:40:02 crc kubenswrapper[4735]: I1008 14:40:02.586748 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="651004c9-c2fe-4e99-b990-3a6e758fbf2e" containerName="registry-server" Oct 08 
14:40:02 crc kubenswrapper[4735]: I1008 14:40:02.586770 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="4399e6dc-523a-4a68-9b4a-aca82d1d861f" containerName="aodh-db-sync" Oct 08 14:40:02 crc kubenswrapper[4735]: I1008 14:40:02.594699 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Oct 08 14:40:02 crc kubenswrapper[4735]: I1008 14:40:02.600378 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-dq9mb" Oct 08 14:40:02 crc kubenswrapper[4735]: I1008 14:40:02.600634 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Oct 08 14:40:02 crc kubenswrapper[4735]: I1008 14:40:02.600724 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Oct 08 14:40:02 crc kubenswrapper[4735]: I1008 14:40:02.612426 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Oct 08 14:40:02 crc kubenswrapper[4735]: I1008 14:40:02.628027 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fc9ad16-4a44-4b09-8622-50ab55a4430c-config-data\") pod \"aodh-0\" (UID: \"5fc9ad16-4a44-4b09-8622-50ab55a4430c\") " pod="openstack/aodh-0" Oct 08 14:40:02 crc kubenswrapper[4735]: I1008 14:40:02.628117 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8d8w\" (UniqueName: \"kubernetes.io/projected/5fc9ad16-4a44-4b09-8622-50ab55a4430c-kube-api-access-x8d8w\") pod \"aodh-0\" (UID: \"5fc9ad16-4a44-4b09-8622-50ab55a4430c\") " pod="openstack/aodh-0" Oct 08 14:40:02 crc kubenswrapper[4735]: I1008 14:40:02.628234 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5fc9ad16-4a44-4b09-8622-50ab55a4430c-scripts\") pod 
\"aodh-0\" (UID: \"5fc9ad16-4a44-4b09-8622-50ab55a4430c\") " pod="openstack/aodh-0" Oct 08 14:40:02 crc kubenswrapper[4735]: I1008 14:40:02.628378 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fc9ad16-4a44-4b09-8622-50ab55a4430c-combined-ca-bundle\") pod \"aodh-0\" (UID: \"5fc9ad16-4a44-4b09-8622-50ab55a4430c\") " pod="openstack/aodh-0" Oct 08 14:40:02 crc kubenswrapper[4735]: I1008 14:40:02.730536 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8d8w\" (UniqueName: \"kubernetes.io/projected/5fc9ad16-4a44-4b09-8622-50ab55a4430c-kube-api-access-x8d8w\") pod \"aodh-0\" (UID: \"5fc9ad16-4a44-4b09-8622-50ab55a4430c\") " pod="openstack/aodh-0" Oct 08 14:40:02 crc kubenswrapper[4735]: I1008 14:40:02.730682 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5fc9ad16-4a44-4b09-8622-50ab55a4430c-scripts\") pod \"aodh-0\" (UID: \"5fc9ad16-4a44-4b09-8622-50ab55a4430c\") " pod="openstack/aodh-0" Oct 08 14:40:02 crc kubenswrapper[4735]: I1008 14:40:02.730821 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fc9ad16-4a44-4b09-8622-50ab55a4430c-combined-ca-bundle\") pod \"aodh-0\" (UID: \"5fc9ad16-4a44-4b09-8622-50ab55a4430c\") " pod="openstack/aodh-0" Oct 08 14:40:02 crc kubenswrapper[4735]: I1008 14:40:02.730903 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fc9ad16-4a44-4b09-8622-50ab55a4430c-config-data\") pod \"aodh-0\" (UID: \"5fc9ad16-4a44-4b09-8622-50ab55a4430c\") " pod="openstack/aodh-0" Oct 08 14:40:02 crc kubenswrapper[4735]: I1008 14:40:02.739876 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5fc9ad16-4a44-4b09-8622-50ab55a4430c-combined-ca-bundle\") pod \"aodh-0\" (UID: \"5fc9ad16-4a44-4b09-8622-50ab55a4430c\") " pod="openstack/aodh-0" Oct 08 14:40:02 crc kubenswrapper[4735]: I1008 14:40:02.740308 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fc9ad16-4a44-4b09-8622-50ab55a4430c-config-data\") pod \"aodh-0\" (UID: \"5fc9ad16-4a44-4b09-8622-50ab55a4430c\") " pod="openstack/aodh-0" Oct 08 14:40:02 crc kubenswrapper[4735]: I1008 14:40:02.743312 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5fc9ad16-4a44-4b09-8622-50ab55a4430c-scripts\") pod \"aodh-0\" (UID: \"5fc9ad16-4a44-4b09-8622-50ab55a4430c\") " pod="openstack/aodh-0" Oct 08 14:40:02 crc kubenswrapper[4735]: I1008 14:40:02.754661 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8d8w\" (UniqueName: \"kubernetes.io/projected/5fc9ad16-4a44-4b09-8622-50ab55a4430c-kube-api-access-x8d8w\") pod \"aodh-0\" (UID: \"5fc9ad16-4a44-4b09-8622-50ab55a4430c\") " pod="openstack/aodh-0" Oct 08 14:40:02 crc kubenswrapper[4735]: I1008 14:40:02.894935 4735 scope.go:117] "RemoveContainer" containerID="0ef7ce1bb2f88774957446f907a55957b94dad288444fb6f22993a47ba31f8b0" Oct 08 14:40:02 crc kubenswrapper[4735]: E1008 14:40:02.895231 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:40:02 crc kubenswrapper[4735]: I1008 14:40:02.921866 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Oct 08 14:40:03 crc kubenswrapper[4735]: I1008 14:40:03.462952 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Oct 08 14:40:03 crc kubenswrapper[4735]: I1008 14:40:03.561259 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"5fc9ad16-4a44-4b09-8622-50ab55a4430c","Type":"ContainerStarted","Data":"281cc19dbad6b08c524d1ad47411f273c4655303577dcff2e332a5ae11aa7247"} Oct 08 14:40:04 crc kubenswrapper[4735]: I1008 14:40:04.570855 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"5fc9ad16-4a44-4b09-8622-50ab55a4430c","Type":"ContainerStarted","Data":"4bb94a58b1166025b03ec6d76558b5112bab0bb0548e8f37870e1e6e66c3fd04"} Oct 08 14:40:05 crc kubenswrapper[4735]: I1008 14:40:05.195811 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 14:40:05 crc kubenswrapper[4735]: I1008 14:40:05.196090 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7fe43c7c-e920-40b6-8625-704c0ac19dbd" containerName="ceilometer-central-agent" containerID="cri-o://f4f7ed581c1aa81dd35aaff2b7de8465fd629ed3773af06e3087ecf12076affe" gracePeriod=30 Oct 08 14:40:05 crc kubenswrapper[4735]: I1008 14:40:05.196204 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7fe43c7c-e920-40b6-8625-704c0ac19dbd" containerName="proxy-httpd" containerID="cri-o://d70448abd2cc52b87301a20ab7c209315b8aee6ce1dbf3df6d9b23a7dd3361a5" gracePeriod=30 Oct 08 14:40:05 crc kubenswrapper[4735]: I1008 14:40:05.196240 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7fe43c7c-e920-40b6-8625-704c0ac19dbd" containerName="sg-core" containerID="cri-o://1820afc7047213c7c306d11787e50a571487312bea2e1ad7a5a99fa70d824c1f" gracePeriod=30 Oct 08 14:40:05 crc 
kubenswrapper[4735]: I1008 14:40:05.196274 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7fe43c7c-e920-40b6-8625-704c0ac19dbd" containerName="ceilometer-notification-agent" containerID="cri-o://677073164fcd9b51e84e80d50248ad694f31e7db29215167d30b67d8d2beb0ca" gracePeriod=30 Oct 08 14:40:05 crc kubenswrapper[4735]: I1008 14:40:05.588061 4735 generic.go:334] "Generic (PLEG): container finished" podID="7fe43c7c-e920-40b6-8625-704c0ac19dbd" containerID="d70448abd2cc52b87301a20ab7c209315b8aee6ce1dbf3df6d9b23a7dd3361a5" exitCode=0 Oct 08 14:40:05 crc kubenswrapper[4735]: I1008 14:40:05.588409 4735 generic.go:334] "Generic (PLEG): container finished" podID="7fe43c7c-e920-40b6-8625-704c0ac19dbd" containerID="1820afc7047213c7c306d11787e50a571487312bea2e1ad7a5a99fa70d824c1f" exitCode=2 Oct 08 14:40:05 crc kubenswrapper[4735]: I1008 14:40:05.588156 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7fe43c7c-e920-40b6-8625-704c0ac19dbd","Type":"ContainerDied","Data":"d70448abd2cc52b87301a20ab7c209315b8aee6ce1dbf3df6d9b23a7dd3361a5"} Oct 08 14:40:05 crc kubenswrapper[4735]: I1008 14:40:05.588458 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7fe43c7c-e920-40b6-8625-704c0ac19dbd","Type":"ContainerDied","Data":"1820afc7047213c7c306d11787e50a571487312bea2e1ad7a5a99fa70d824c1f"} Oct 08 14:40:06 crc kubenswrapper[4735]: I1008 14:40:06.603113 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"5fc9ad16-4a44-4b09-8622-50ab55a4430c","Type":"ContainerStarted","Data":"c92ed73bf7ea7a9f98634254e782d24b0ba993b0c9733b3cf0e2309bb0541322"} Oct 08 14:40:06 crc kubenswrapper[4735]: I1008 14:40:06.609116 4735 generic.go:334] "Generic (PLEG): container finished" podID="7fe43c7c-e920-40b6-8625-704c0ac19dbd" containerID="f4f7ed581c1aa81dd35aaff2b7de8465fd629ed3773af06e3087ecf12076affe" exitCode=0 Oct 
08 14:40:06 crc kubenswrapper[4735]: I1008 14:40:06.609167 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7fe43c7c-e920-40b6-8625-704c0ac19dbd","Type":"ContainerDied","Data":"f4f7ed581c1aa81dd35aaff2b7de8465fd629ed3773af06e3087ecf12076affe"} Oct 08 14:40:08 crc kubenswrapper[4735]: I1008 14:40:08.631659 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"5fc9ad16-4a44-4b09-8622-50ab55a4430c","Type":"ContainerStarted","Data":"6feb636378dc6adf7da5c88e4386eccc4eae67bf9957cc242386956e529b28e6"} Oct 08 14:40:09 crc kubenswrapper[4735]: I1008 14:40:09.645550 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"5fc9ad16-4a44-4b09-8622-50ab55a4430c","Type":"ContainerStarted","Data":"66932588c4385eeef793cc8b78b77cac5769e6f4b8919d0afa9c68647f219612"} Oct 08 14:40:09 crc kubenswrapper[4735]: I1008 14:40:09.687849 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.026279219 podStartE2EDuration="7.687823624s" podCreationTimestamp="2025-10-08 14:40:02 +0000 UTC" firstStartedPulling="2025-10-08 14:40:03.474036693 +0000 UTC m=+6253.282341237" lastFinishedPulling="2025-10-08 14:40:09.135581098 +0000 UTC m=+6258.943885642" observedRunningTime="2025-10-08 14:40:09.669597048 +0000 UTC m=+6259.477901592" watchObservedRunningTime="2025-10-08 14:40:09.687823624 +0000 UTC m=+6259.496128168" Oct 08 14:40:10 crc kubenswrapper[4735]: I1008 14:40:10.657143 4735 generic.go:334] "Generic (PLEG): container finished" podID="7fe43c7c-e920-40b6-8625-704c0ac19dbd" containerID="677073164fcd9b51e84e80d50248ad694f31e7db29215167d30b67d8d2beb0ca" exitCode=0 Oct 08 14:40:10 crc kubenswrapper[4735]: I1008 14:40:10.657520 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 14:40:10 crc kubenswrapper[4735]: I1008 14:40:10.657338 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7fe43c7c-e920-40b6-8625-704c0ac19dbd","Type":"ContainerDied","Data":"677073164fcd9b51e84e80d50248ad694f31e7db29215167d30b67d8d2beb0ca"} Oct 08 14:40:10 crc kubenswrapper[4735]: I1008 14:40:10.657602 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7fe43c7c-e920-40b6-8625-704c0ac19dbd","Type":"ContainerDied","Data":"57fae84403d6c73b42113ca535874923abc8341e3c9b61838a281348911b8d6f"} Oct 08 14:40:10 crc kubenswrapper[4735]: I1008 14:40:10.657626 4735 scope.go:117] "RemoveContainer" containerID="d70448abd2cc52b87301a20ab7c209315b8aee6ce1dbf3df6d9b23a7dd3361a5" Oct 08 14:40:10 crc kubenswrapper[4735]: I1008 14:40:10.702591 4735 scope.go:117] "RemoveContainer" containerID="1820afc7047213c7c306d11787e50a571487312bea2e1ad7a5a99fa70d824c1f" Oct 08 14:40:10 crc kubenswrapper[4735]: I1008 14:40:10.724697 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fe43c7c-e920-40b6-8625-704c0ac19dbd-config-data\") pod \"7fe43c7c-e920-40b6-8625-704c0ac19dbd\" (UID: \"7fe43c7c-e920-40b6-8625-704c0ac19dbd\") " Oct 08 14:40:10 crc kubenswrapper[4735]: I1008 14:40:10.724853 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fe43c7c-e920-40b6-8625-704c0ac19dbd-combined-ca-bundle\") pod \"7fe43c7c-e920-40b6-8625-704c0ac19dbd\" (UID: \"7fe43c7c-e920-40b6-8625-704c0ac19dbd\") " Oct 08 14:40:10 crc kubenswrapper[4735]: I1008 14:40:10.724948 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7fe43c7c-e920-40b6-8625-704c0ac19dbd-log-httpd\") pod 
\"7fe43c7c-e920-40b6-8625-704c0ac19dbd\" (UID: \"7fe43c7c-e920-40b6-8625-704c0ac19dbd\") " Oct 08 14:40:10 crc kubenswrapper[4735]: I1008 14:40:10.725038 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjj99\" (UniqueName: \"kubernetes.io/projected/7fe43c7c-e920-40b6-8625-704c0ac19dbd-kube-api-access-fjj99\") pod \"7fe43c7c-e920-40b6-8625-704c0ac19dbd\" (UID: \"7fe43c7c-e920-40b6-8625-704c0ac19dbd\") " Oct 08 14:40:10 crc kubenswrapper[4735]: I1008 14:40:10.725157 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fe43c7c-e920-40b6-8625-704c0ac19dbd-scripts\") pod \"7fe43c7c-e920-40b6-8625-704c0ac19dbd\" (UID: \"7fe43c7c-e920-40b6-8625-704c0ac19dbd\") " Oct 08 14:40:10 crc kubenswrapper[4735]: I1008 14:40:10.725196 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7fe43c7c-e920-40b6-8625-704c0ac19dbd-sg-core-conf-yaml\") pod \"7fe43c7c-e920-40b6-8625-704c0ac19dbd\" (UID: \"7fe43c7c-e920-40b6-8625-704c0ac19dbd\") " Oct 08 14:40:10 crc kubenswrapper[4735]: I1008 14:40:10.725236 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7fe43c7c-e920-40b6-8625-704c0ac19dbd-run-httpd\") pod \"7fe43c7c-e920-40b6-8625-704c0ac19dbd\" (UID: \"7fe43c7c-e920-40b6-8625-704c0ac19dbd\") " Oct 08 14:40:10 crc kubenswrapper[4735]: I1008 14:40:10.727165 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fe43c7c-e920-40b6-8625-704c0ac19dbd-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7fe43c7c-e920-40b6-8625-704c0ac19dbd" (UID: "7fe43c7c-e920-40b6-8625-704c0ac19dbd"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 14:40:10 crc kubenswrapper[4735]: I1008 14:40:10.729935 4735 scope.go:117] "RemoveContainer" containerID="677073164fcd9b51e84e80d50248ad694f31e7db29215167d30b67d8d2beb0ca" Oct 08 14:40:10 crc kubenswrapper[4735]: I1008 14:40:10.733681 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fe43c7c-e920-40b6-8625-704c0ac19dbd-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7fe43c7c-e920-40b6-8625-704c0ac19dbd" (UID: "7fe43c7c-e920-40b6-8625-704c0ac19dbd"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 14:40:10 crc kubenswrapper[4735]: I1008 14:40:10.733914 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fe43c7c-e920-40b6-8625-704c0ac19dbd-scripts" (OuterVolumeSpecName: "scripts") pod "7fe43c7c-e920-40b6-8625-704c0ac19dbd" (UID: "7fe43c7c-e920-40b6-8625-704c0ac19dbd"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:40:10 crc kubenswrapper[4735]: I1008 14:40:10.737434 4735 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7fe43c7c-e920-40b6-8625-704c0ac19dbd-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 14:40:10 crc kubenswrapper[4735]: I1008 14:40:10.737627 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fe43c7c-e920-40b6-8625-704c0ac19dbd-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 14:40:10 crc kubenswrapper[4735]: I1008 14:40:10.737655 4735 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7fe43c7c-e920-40b6-8625-704c0ac19dbd-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 14:40:10 crc kubenswrapper[4735]: I1008 14:40:10.743020 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fe43c7c-e920-40b6-8625-704c0ac19dbd-kube-api-access-fjj99" (OuterVolumeSpecName: "kube-api-access-fjj99") pod "7fe43c7c-e920-40b6-8625-704c0ac19dbd" (UID: "7fe43c7c-e920-40b6-8625-704c0ac19dbd"). InnerVolumeSpecName "kube-api-access-fjj99". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:40:10 crc kubenswrapper[4735]: I1008 14:40:10.827622 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fe43c7c-e920-40b6-8625-704c0ac19dbd-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7fe43c7c-e920-40b6-8625-704c0ac19dbd" (UID: "7fe43c7c-e920-40b6-8625-704c0ac19dbd"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:40:10 crc kubenswrapper[4735]: I1008 14:40:10.841330 4735 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7fe43c7c-e920-40b6-8625-704c0ac19dbd-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 08 14:40:10 crc kubenswrapper[4735]: I1008 14:40:10.841398 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjj99\" (UniqueName: \"kubernetes.io/projected/7fe43c7c-e920-40b6-8625-704c0ac19dbd-kube-api-access-fjj99\") on node \"crc\" DevicePath \"\"" Oct 08 14:40:10 crc kubenswrapper[4735]: I1008 14:40:10.843064 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fe43c7c-e920-40b6-8625-704c0ac19dbd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7fe43c7c-e920-40b6-8625-704c0ac19dbd" (UID: "7fe43c7c-e920-40b6-8625-704c0ac19dbd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:40:10 crc kubenswrapper[4735]: I1008 14:40:10.855963 4735 scope.go:117] "RemoveContainer" containerID="f4f7ed581c1aa81dd35aaff2b7de8465fd629ed3773af06e3087ecf12076affe" Oct 08 14:40:10 crc kubenswrapper[4735]: I1008 14:40:10.882901 4735 scope.go:117] "RemoveContainer" containerID="d70448abd2cc52b87301a20ab7c209315b8aee6ce1dbf3df6d9b23a7dd3361a5" Oct 08 14:40:10 crc kubenswrapper[4735]: E1008 14:40:10.885008 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d70448abd2cc52b87301a20ab7c209315b8aee6ce1dbf3df6d9b23a7dd3361a5\": container with ID starting with d70448abd2cc52b87301a20ab7c209315b8aee6ce1dbf3df6d9b23a7dd3361a5 not found: ID does not exist" containerID="d70448abd2cc52b87301a20ab7c209315b8aee6ce1dbf3df6d9b23a7dd3361a5" Oct 08 14:40:10 crc kubenswrapper[4735]: I1008 14:40:10.885049 4735 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"d70448abd2cc52b87301a20ab7c209315b8aee6ce1dbf3df6d9b23a7dd3361a5"} err="failed to get container status \"d70448abd2cc52b87301a20ab7c209315b8aee6ce1dbf3df6d9b23a7dd3361a5\": rpc error: code = NotFound desc = could not find container \"d70448abd2cc52b87301a20ab7c209315b8aee6ce1dbf3df6d9b23a7dd3361a5\": container with ID starting with d70448abd2cc52b87301a20ab7c209315b8aee6ce1dbf3df6d9b23a7dd3361a5 not found: ID does not exist" Oct 08 14:40:10 crc kubenswrapper[4735]: I1008 14:40:10.885080 4735 scope.go:117] "RemoveContainer" containerID="1820afc7047213c7c306d11787e50a571487312bea2e1ad7a5a99fa70d824c1f" Oct 08 14:40:10 crc kubenswrapper[4735]: E1008 14:40:10.885397 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1820afc7047213c7c306d11787e50a571487312bea2e1ad7a5a99fa70d824c1f\": container with ID starting with 1820afc7047213c7c306d11787e50a571487312bea2e1ad7a5a99fa70d824c1f not found: ID does not exist" containerID="1820afc7047213c7c306d11787e50a571487312bea2e1ad7a5a99fa70d824c1f" Oct 08 14:40:10 crc kubenswrapper[4735]: I1008 14:40:10.885445 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1820afc7047213c7c306d11787e50a571487312bea2e1ad7a5a99fa70d824c1f"} err="failed to get container status \"1820afc7047213c7c306d11787e50a571487312bea2e1ad7a5a99fa70d824c1f\": rpc error: code = NotFound desc = could not find container \"1820afc7047213c7c306d11787e50a571487312bea2e1ad7a5a99fa70d824c1f\": container with ID starting with 1820afc7047213c7c306d11787e50a571487312bea2e1ad7a5a99fa70d824c1f not found: ID does not exist" Oct 08 14:40:10 crc kubenswrapper[4735]: I1008 14:40:10.885471 4735 scope.go:117] "RemoveContainer" containerID="677073164fcd9b51e84e80d50248ad694f31e7db29215167d30b67d8d2beb0ca" Oct 08 14:40:10 crc kubenswrapper[4735]: E1008 14:40:10.886921 4735 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"677073164fcd9b51e84e80d50248ad694f31e7db29215167d30b67d8d2beb0ca\": container with ID starting with 677073164fcd9b51e84e80d50248ad694f31e7db29215167d30b67d8d2beb0ca not found: ID does not exist" containerID="677073164fcd9b51e84e80d50248ad694f31e7db29215167d30b67d8d2beb0ca" Oct 08 14:40:10 crc kubenswrapper[4735]: I1008 14:40:10.886943 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"677073164fcd9b51e84e80d50248ad694f31e7db29215167d30b67d8d2beb0ca"} err="failed to get container status \"677073164fcd9b51e84e80d50248ad694f31e7db29215167d30b67d8d2beb0ca\": rpc error: code = NotFound desc = could not find container \"677073164fcd9b51e84e80d50248ad694f31e7db29215167d30b67d8d2beb0ca\": container with ID starting with 677073164fcd9b51e84e80d50248ad694f31e7db29215167d30b67d8d2beb0ca not found: ID does not exist" Oct 08 14:40:10 crc kubenswrapper[4735]: I1008 14:40:10.886960 4735 scope.go:117] "RemoveContainer" containerID="f4f7ed581c1aa81dd35aaff2b7de8465fd629ed3773af06e3087ecf12076affe" Oct 08 14:40:10 crc kubenswrapper[4735]: E1008 14:40:10.887173 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4f7ed581c1aa81dd35aaff2b7de8465fd629ed3773af06e3087ecf12076affe\": container with ID starting with f4f7ed581c1aa81dd35aaff2b7de8465fd629ed3773af06e3087ecf12076affe not found: ID does not exist" containerID="f4f7ed581c1aa81dd35aaff2b7de8465fd629ed3773af06e3087ecf12076affe" Oct 08 14:40:10 crc kubenswrapper[4735]: I1008 14:40:10.887195 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4f7ed581c1aa81dd35aaff2b7de8465fd629ed3773af06e3087ecf12076affe"} err="failed to get container status \"f4f7ed581c1aa81dd35aaff2b7de8465fd629ed3773af06e3087ecf12076affe\": rpc error: code = NotFound desc = could not find container 
\"f4f7ed581c1aa81dd35aaff2b7de8465fd629ed3773af06e3087ecf12076affe\": container with ID starting with f4f7ed581c1aa81dd35aaff2b7de8465fd629ed3773af06e3087ecf12076affe not found: ID does not exist" Oct 08 14:40:10 crc kubenswrapper[4735]: I1008 14:40:10.903636 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fe43c7c-e920-40b6-8625-704c0ac19dbd-config-data" (OuterVolumeSpecName: "config-data") pod "7fe43c7c-e920-40b6-8625-704c0ac19dbd" (UID: "7fe43c7c-e920-40b6-8625-704c0ac19dbd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:40:10 crc kubenswrapper[4735]: I1008 14:40:10.942813 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fe43c7c-e920-40b6-8625-704c0ac19dbd-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 14:40:10 crc kubenswrapper[4735]: I1008 14:40:10.942848 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fe43c7c-e920-40b6-8625-704c0ac19dbd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 14:40:10 crc kubenswrapper[4735]: E1008 14:40:10.976353 4735 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7fe43c7c_e920_40b6_8625_704c0ac19dbd.slice\": RecentStats: unable to find data in memory cache]" Oct 08 14:40:11 crc kubenswrapper[4735]: I1008 14:40:11.330294 4735 scope.go:117] "RemoveContainer" containerID="55ef97d3905a3568ca8237e879eca8c1eca6f098eeefb397bdbe6c5cc8a29e56" Oct 08 14:40:11 crc kubenswrapper[4735]: I1008 14:40:11.377423 4735 scope.go:117] "RemoveContainer" containerID="7578f4de0aee4f8f1b7587793b16ae16e3d132fb2dda07dbfe782d0bef4bc7d3" Oct 08 14:40:11 crc kubenswrapper[4735]: I1008 14:40:11.434398 4735 scope.go:117] "RemoveContainer" 
containerID="f893b3a8762fb3a5354e9b9d2f77f696af7198c68561e8bc69d1aacafb8571e0" Oct 08 14:40:11 crc kubenswrapper[4735]: I1008 14:40:11.549964 4735 scope.go:117] "RemoveContainer" containerID="d6d14aa186181136687c60f46a9cce968c91550db2b7d4f2f3fc62a96587f37d" Oct 08 14:40:11 crc kubenswrapper[4735]: I1008 14:40:11.666160 4735 scope.go:117] "RemoveContainer" containerID="c2e7af3ee59425c649664e213c8bbc570af803ac6017912034e3330595461a1c" Oct 08 14:40:11 crc kubenswrapper[4735]: I1008 14:40:11.672547 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 14:40:11 crc kubenswrapper[4735]: I1008 14:40:11.703407 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 14:40:11 crc kubenswrapper[4735]: I1008 14:40:11.730339 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 08 14:40:11 crc kubenswrapper[4735]: I1008 14:40:11.742026 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 08 14:40:11 crc kubenswrapper[4735]: E1008 14:40:11.742543 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fe43c7c-e920-40b6-8625-704c0ac19dbd" containerName="sg-core" Oct 08 14:40:11 crc kubenswrapper[4735]: I1008 14:40:11.742564 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fe43c7c-e920-40b6-8625-704c0ac19dbd" containerName="sg-core" Oct 08 14:40:11 crc kubenswrapper[4735]: E1008 14:40:11.742580 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fe43c7c-e920-40b6-8625-704c0ac19dbd" containerName="ceilometer-notification-agent" Oct 08 14:40:11 crc kubenswrapper[4735]: I1008 14:40:11.742587 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fe43c7c-e920-40b6-8625-704c0ac19dbd" containerName="ceilometer-notification-agent" Oct 08 14:40:11 crc kubenswrapper[4735]: E1008 14:40:11.742612 4735 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7fe43c7c-e920-40b6-8625-704c0ac19dbd" containerName="ceilometer-central-agent" Oct 08 14:40:11 crc kubenswrapper[4735]: I1008 14:40:11.742619 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fe43c7c-e920-40b6-8625-704c0ac19dbd" containerName="ceilometer-central-agent" Oct 08 14:40:11 crc kubenswrapper[4735]: E1008 14:40:11.742635 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fe43c7c-e920-40b6-8625-704c0ac19dbd" containerName="proxy-httpd" Oct 08 14:40:11 crc kubenswrapper[4735]: I1008 14:40:11.742641 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fe43c7c-e920-40b6-8625-704c0ac19dbd" containerName="proxy-httpd" Oct 08 14:40:11 crc kubenswrapper[4735]: I1008 14:40:11.742836 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fe43c7c-e920-40b6-8625-704c0ac19dbd" containerName="sg-core" Oct 08 14:40:11 crc kubenswrapper[4735]: I1008 14:40:11.742854 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fe43c7c-e920-40b6-8625-704c0ac19dbd" containerName="proxy-httpd" Oct 08 14:40:11 crc kubenswrapper[4735]: I1008 14:40:11.742870 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fe43c7c-e920-40b6-8625-704c0ac19dbd" containerName="ceilometer-central-agent" Oct 08 14:40:11 crc kubenswrapper[4735]: I1008 14:40:11.742877 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fe43c7c-e920-40b6-8625-704c0ac19dbd" containerName="ceilometer-notification-agent" Oct 08 14:40:11 crc kubenswrapper[4735]: I1008 14:40:11.745381 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 14:40:11 crc kubenswrapper[4735]: I1008 14:40:11.748263 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 08 14:40:11 crc kubenswrapper[4735]: I1008 14:40:11.752012 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 08 14:40:11 crc kubenswrapper[4735]: I1008 14:40:11.755302 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 14:40:11 crc kubenswrapper[4735]: I1008 14:40:11.860751 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07c48feb-11be-4464-9082-b8358ffe86a2-run-httpd\") pod \"ceilometer-0\" (UID: \"07c48feb-11be-4464-9082-b8358ffe86a2\") " pod="openstack/ceilometer-0" Oct 08 14:40:11 crc kubenswrapper[4735]: I1008 14:40:11.860807 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07c48feb-11be-4464-9082-b8358ffe86a2-scripts\") pod \"ceilometer-0\" (UID: \"07c48feb-11be-4464-9082-b8358ffe86a2\") " pod="openstack/ceilometer-0" Oct 08 14:40:11 crc kubenswrapper[4735]: I1008 14:40:11.860851 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/07c48feb-11be-4464-9082-b8358ffe86a2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"07c48feb-11be-4464-9082-b8358ffe86a2\") " pod="openstack/ceilometer-0" Oct 08 14:40:11 crc kubenswrapper[4735]: I1008 14:40:11.860915 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07c48feb-11be-4464-9082-b8358ffe86a2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"07c48feb-11be-4464-9082-b8358ffe86a2\") " 
pod="openstack/ceilometer-0" Oct 08 14:40:11 crc kubenswrapper[4735]: I1008 14:40:11.861005 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snnf2\" (UniqueName: \"kubernetes.io/projected/07c48feb-11be-4464-9082-b8358ffe86a2-kube-api-access-snnf2\") pod \"ceilometer-0\" (UID: \"07c48feb-11be-4464-9082-b8358ffe86a2\") " pod="openstack/ceilometer-0" Oct 08 14:40:11 crc kubenswrapper[4735]: I1008 14:40:11.861045 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07c48feb-11be-4464-9082-b8358ffe86a2-config-data\") pod \"ceilometer-0\" (UID: \"07c48feb-11be-4464-9082-b8358ffe86a2\") " pod="openstack/ceilometer-0" Oct 08 14:40:11 crc kubenswrapper[4735]: I1008 14:40:11.861107 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07c48feb-11be-4464-9082-b8358ffe86a2-log-httpd\") pod \"ceilometer-0\" (UID: \"07c48feb-11be-4464-9082-b8358ffe86a2\") " pod="openstack/ceilometer-0" Oct 08 14:40:11 crc kubenswrapper[4735]: I1008 14:40:11.963635 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/07c48feb-11be-4464-9082-b8358ffe86a2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"07c48feb-11be-4464-9082-b8358ffe86a2\") " pod="openstack/ceilometer-0" Oct 08 14:40:11 crc kubenswrapper[4735]: I1008 14:40:11.963924 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07c48feb-11be-4464-9082-b8358ffe86a2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"07c48feb-11be-4464-9082-b8358ffe86a2\") " pod="openstack/ceilometer-0" Oct 08 14:40:11 crc kubenswrapper[4735]: I1008 14:40:11.964139 4735 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-snnf2\" (UniqueName: \"kubernetes.io/projected/07c48feb-11be-4464-9082-b8358ffe86a2-kube-api-access-snnf2\") pod \"ceilometer-0\" (UID: \"07c48feb-11be-4464-9082-b8358ffe86a2\") " pod="openstack/ceilometer-0" Oct 08 14:40:11 crc kubenswrapper[4735]: I1008 14:40:11.964321 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07c48feb-11be-4464-9082-b8358ffe86a2-config-data\") pod \"ceilometer-0\" (UID: \"07c48feb-11be-4464-9082-b8358ffe86a2\") " pod="openstack/ceilometer-0" Oct 08 14:40:11 crc kubenswrapper[4735]: I1008 14:40:11.964548 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07c48feb-11be-4464-9082-b8358ffe86a2-log-httpd\") pod \"ceilometer-0\" (UID: \"07c48feb-11be-4464-9082-b8358ffe86a2\") " pod="openstack/ceilometer-0" Oct 08 14:40:11 crc kubenswrapper[4735]: I1008 14:40:11.965028 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07c48feb-11be-4464-9082-b8358ffe86a2-run-httpd\") pod \"ceilometer-0\" (UID: \"07c48feb-11be-4464-9082-b8358ffe86a2\") " pod="openstack/ceilometer-0" Oct 08 14:40:11 crc kubenswrapper[4735]: I1008 14:40:11.965144 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07c48feb-11be-4464-9082-b8358ffe86a2-scripts\") pod \"ceilometer-0\" (UID: \"07c48feb-11be-4464-9082-b8358ffe86a2\") " pod="openstack/ceilometer-0" Oct 08 14:40:11 crc kubenswrapper[4735]: I1008 14:40:11.965542 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07c48feb-11be-4464-9082-b8358ffe86a2-log-httpd\") pod \"ceilometer-0\" (UID: \"07c48feb-11be-4464-9082-b8358ffe86a2\") " pod="openstack/ceilometer-0" Oct 08 14:40:11 crc 
kubenswrapper[4735]: I1008 14:40:11.965839 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07c48feb-11be-4464-9082-b8358ffe86a2-run-httpd\") pod \"ceilometer-0\" (UID: \"07c48feb-11be-4464-9082-b8358ffe86a2\") " pod="openstack/ceilometer-0" Oct 08 14:40:11 crc kubenswrapper[4735]: I1008 14:40:11.969197 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07c48feb-11be-4464-9082-b8358ffe86a2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"07c48feb-11be-4464-9082-b8358ffe86a2\") " pod="openstack/ceilometer-0" Oct 08 14:40:11 crc kubenswrapper[4735]: I1008 14:40:11.969806 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07c48feb-11be-4464-9082-b8358ffe86a2-scripts\") pod \"ceilometer-0\" (UID: \"07c48feb-11be-4464-9082-b8358ffe86a2\") " pod="openstack/ceilometer-0" Oct 08 14:40:11 crc kubenswrapper[4735]: I1008 14:40:11.971234 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07c48feb-11be-4464-9082-b8358ffe86a2-config-data\") pod \"ceilometer-0\" (UID: \"07c48feb-11be-4464-9082-b8358ffe86a2\") " pod="openstack/ceilometer-0" Oct 08 14:40:11 crc kubenswrapper[4735]: I1008 14:40:11.972622 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/07c48feb-11be-4464-9082-b8358ffe86a2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"07c48feb-11be-4464-9082-b8358ffe86a2\") " pod="openstack/ceilometer-0" Oct 08 14:40:11 crc kubenswrapper[4735]: I1008 14:40:11.986232 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snnf2\" (UniqueName: \"kubernetes.io/projected/07c48feb-11be-4464-9082-b8358ffe86a2-kube-api-access-snnf2\") pod \"ceilometer-0\" (UID: 
\"07c48feb-11be-4464-9082-b8358ffe86a2\") " pod="openstack/ceilometer-0" Oct 08 14:40:12 crc kubenswrapper[4735]: I1008 14:40:12.072120 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 14:40:12 crc kubenswrapper[4735]: I1008 14:40:12.723761 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 14:40:12 crc kubenswrapper[4735]: I1008 14:40:12.907076 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fe43c7c-e920-40b6-8625-704c0ac19dbd" path="/var/lib/kubelet/pods/7fe43c7c-e920-40b6-8625-704c0ac19dbd/volumes" Oct 08 14:40:13 crc kubenswrapper[4735]: I1008 14:40:13.693873 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"07c48feb-11be-4464-9082-b8358ffe86a2","Type":"ContainerStarted","Data":"1396a8349decae037cca9020394cda6d20fd37248e07e7b2f9995aaaeb6e2124"} Oct 08 14:40:13 crc kubenswrapper[4735]: I1008 14:40:13.694497 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"07c48feb-11be-4464-9082-b8358ffe86a2","Type":"ContainerStarted","Data":"f9f81589875c153269d393a7047acd254031e21337b8c460658f4cea1c5ee039"} Oct 08 14:40:13 crc kubenswrapper[4735]: I1008 14:40:13.894643 4735 scope.go:117] "RemoveContainer" containerID="0ef7ce1bb2f88774957446f907a55957b94dad288444fb6f22993a47ba31f8b0" Oct 08 14:40:13 crc kubenswrapper[4735]: E1008 14:40:13.894958 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:40:14 crc kubenswrapper[4735]: I1008 14:40:14.708584 4735 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"07c48feb-11be-4464-9082-b8358ffe86a2","Type":"ContainerStarted","Data":"ffa082810632475e36da4c1a417cea8600e3322cc675f9d54e63ce4cf7e2a320"} Oct 08 14:40:15 crc kubenswrapper[4735]: I1008 14:40:15.721550 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"07c48feb-11be-4464-9082-b8358ffe86a2","Type":"ContainerStarted","Data":"a0b83df9c69eb9d2b39ea303b2935a8381ff2949f30e1ef2c051fb0d1dcac46c"} Oct 08 14:40:16 crc kubenswrapper[4735]: I1008 14:40:16.335186 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-7c5xp"] Oct 08 14:40:16 crc kubenswrapper[4735]: I1008 14:40:16.337194 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-7c5xp" Oct 08 14:40:16 crc kubenswrapper[4735]: I1008 14:40:16.351954 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-7c5xp"] Oct 08 14:40:16 crc kubenswrapper[4735]: I1008 14:40:16.354799 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6t2nr\" (UniqueName: \"kubernetes.io/projected/e3750677-6a05-476a-8bc1-f5f0ff3cb574-kube-api-access-6t2nr\") pod \"manila-db-create-7c5xp\" (UID: \"e3750677-6a05-476a-8bc1-f5f0ff3cb574\") " pod="openstack/manila-db-create-7c5xp" Oct 08 14:40:16 crc kubenswrapper[4735]: I1008 14:40:16.458485 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6t2nr\" (UniqueName: \"kubernetes.io/projected/e3750677-6a05-476a-8bc1-f5f0ff3cb574-kube-api-access-6t2nr\") pod \"manila-db-create-7c5xp\" (UID: \"e3750677-6a05-476a-8bc1-f5f0ff3cb574\") " pod="openstack/manila-db-create-7c5xp" Oct 08 14:40:16 crc kubenswrapper[4735]: I1008 14:40:16.483472 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6t2nr\" (UniqueName: 
\"kubernetes.io/projected/e3750677-6a05-476a-8bc1-f5f0ff3cb574-kube-api-access-6t2nr\") pod \"manila-db-create-7c5xp\" (UID: \"e3750677-6a05-476a-8bc1-f5f0ff3cb574\") " pod="openstack/manila-db-create-7c5xp" Oct 08 14:40:16 crc kubenswrapper[4735]: I1008 14:40:16.600418 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-7c5xp" Oct 08 14:40:16 crc kubenswrapper[4735]: I1008 14:40:16.740671 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"07c48feb-11be-4464-9082-b8358ffe86a2","Type":"ContainerStarted","Data":"8e301d7027ea33447bb369464343aaad54ebe01e3763ee4bd73eeef5b69ad7ea"} Oct 08 14:40:16 crc kubenswrapper[4735]: I1008 14:40:16.740939 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 08 14:40:16 crc kubenswrapper[4735]: I1008 14:40:16.775137 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.174777436 podStartE2EDuration="5.775112538s" podCreationTimestamp="2025-10-08 14:40:11 +0000 UTC" firstStartedPulling="2025-10-08 14:40:12.730702547 +0000 UTC m=+6262.539007091" lastFinishedPulling="2025-10-08 14:40:16.331037649 +0000 UTC m=+6266.139342193" observedRunningTime="2025-10-08 14:40:16.767189485 +0000 UTC m=+6266.575494049" watchObservedRunningTime="2025-10-08 14:40:16.775112538 +0000 UTC m=+6266.583417082" Oct 08 14:40:17 crc kubenswrapper[4735]: I1008 14:40:17.158144 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-7c5xp"] Oct 08 14:40:17 crc kubenswrapper[4735]: W1008 14:40:17.161984 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3750677_6a05_476a_8bc1_f5f0ff3cb574.slice/crio-482c53c0445428e3ed495f71734a50d9aea24f69c0e5ad253ebaee42a601ffd7 WatchSource:0}: Error finding container 
482c53c0445428e3ed495f71734a50d9aea24f69c0e5ad253ebaee42a601ffd7: Status 404 returned error can't find the container with id 482c53c0445428e3ed495f71734a50d9aea24f69c0e5ad253ebaee42a601ffd7 Oct 08 14:40:17 crc kubenswrapper[4735]: I1008 14:40:17.757351 4735 generic.go:334] "Generic (PLEG): container finished" podID="e3750677-6a05-476a-8bc1-f5f0ff3cb574" containerID="ec73cb8e9a729d6ee83f9b366ab3c979db7291587c27a1e60caa46d81d77d796" exitCode=0 Oct 08 14:40:17 crc kubenswrapper[4735]: I1008 14:40:17.759310 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-7c5xp" event={"ID":"e3750677-6a05-476a-8bc1-f5f0ff3cb574","Type":"ContainerDied","Data":"ec73cb8e9a729d6ee83f9b366ab3c979db7291587c27a1e60caa46d81d77d796"} Oct 08 14:40:17 crc kubenswrapper[4735]: I1008 14:40:17.759342 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-7c5xp" event={"ID":"e3750677-6a05-476a-8bc1-f5f0ff3cb574","Type":"ContainerStarted","Data":"482c53c0445428e3ed495f71734a50d9aea24f69c0e5ad253ebaee42a601ffd7"} Oct 08 14:40:19 crc kubenswrapper[4735]: I1008 14:40:19.257814 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-7c5xp" Oct 08 14:40:19 crc kubenswrapper[4735]: I1008 14:40:19.425848 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6t2nr\" (UniqueName: \"kubernetes.io/projected/e3750677-6a05-476a-8bc1-f5f0ff3cb574-kube-api-access-6t2nr\") pod \"e3750677-6a05-476a-8bc1-f5f0ff3cb574\" (UID: \"e3750677-6a05-476a-8bc1-f5f0ff3cb574\") " Oct 08 14:40:19 crc kubenswrapper[4735]: I1008 14:40:19.432111 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3750677-6a05-476a-8bc1-f5f0ff3cb574-kube-api-access-6t2nr" (OuterVolumeSpecName: "kube-api-access-6t2nr") pod "e3750677-6a05-476a-8bc1-f5f0ff3cb574" (UID: "e3750677-6a05-476a-8bc1-f5f0ff3cb574"). 
InnerVolumeSpecName "kube-api-access-6t2nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:40:19 crc kubenswrapper[4735]: I1008 14:40:19.528220 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6t2nr\" (UniqueName: \"kubernetes.io/projected/e3750677-6a05-476a-8bc1-f5f0ff3cb574-kube-api-access-6t2nr\") on node \"crc\" DevicePath \"\"" Oct 08 14:40:19 crc kubenswrapper[4735]: I1008 14:40:19.777628 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-7c5xp" event={"ID":"e3750677-6a05-476a-8bc1-f5f0ff3cb574","Type":"ContainerDied","Data":"482c53c0445428e3ed495f71734a50d9aea24f69c0e5ad253ebaee42a601ffd7"} Oct 08 14:40:19 crc kubenswrapper[4735]: I1008 14:40:19.777683 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="482c53c0445428e3ed495f71734a50d9aea24f69c0e5ad253ebaee42a601ffd7" Oct 08 14:40:19 crc kubenswrapper[4735]: I1008 14:40:19.777755 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-7c5xp" Oct 08 14:40:24 crc kubenswrapper[4735]: I1008 14:40:24.894991 4735 scope.go:117] "RemoveContainer" containerID="0ef7ce1bb2f88774957446f907a55957b94dad288444fb6f22993a47ba31f8b0" Oct 08 14:40:24 crc kubenswrapper[4735]: E1008 14:40:24.895892 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:40:26 crc kubenswrapper[4735]: I1008 14:40:26.476680 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-7cbf-account-create-kmczx"] Oct 08 14:40:26 crc kubenswrapper[4735]: E1008 14:40:26.477673 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3750677-6a05-476a-8bc1-f5f0ff3cb574" containerName="mariadb-database-create" Oct 08 14:40:26 crc kubenswrapper[4735]: I1008 14:40:26.477694 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3750677-6a05-476a-8bc1-f5f0ff3cb574" containerName="mariadb-database-create" Oct 08 14:40:26 crc kubenswrapper[4735]: I1008 14:40:26.478012 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3750677-6a05-476a-8bc1-f5f0ff3cb574" containerName="mariadb-database-create" Oct 08 14:40:26 crc kubenswrapper[4735]: I1008 14:40:26.479223 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-7cbf-account-create-kmczx" Oct 08 14:40:26 crc kubenswrapper[4735]: I1008 14:40:26.481858 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Oct 08 14:40:26 crc kubenswrapper[4735]: I1008 14:40:26.506531 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-7cbf-account-create-kmczx"] Oct 08 14:40:26 crc kubenswrapper[4735]: I1008 14:40:26.571929 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjvq8\" (UniqueName: \"kubernetes.io/projected/707b9e23-4080-451d-818f-733ecfb23ec2-kube-api-access-rjvq8\") pod \"manila-7cbf-account-create-kmczx\" (UID: \"707b9e23-4080-451d-818f-733ecfb23ec2\") " pod="openstack/manila-7cbf-account-create-kmczx" Oct 08 14:40:26 crc kubenswrapper[4735]: I1008 14:40:26.673660 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjvq8\" (UniqueName: \"kubernetes.io/projected/707b9e23-4080-451d-818f-733ecfb23ec2-kube-api-access-rjvq8\") pod \"manila-7cbf-account-create-kmczx\" (UID: \"707b9e23-4080-451d-818f-733ecfb23ec2\") " pod="openstack/manila-7cbf-account-create-kmczx" Oct 08 14:40:26 crc kubenswrapper[4735]: I1008 14:40:26.694790 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjvq8\" (UniqueName: \"kubernetes.io/projected/707b9e23-4080-451d-818f-733ecfb23ec2-kube-api-access-rjvq8\") pod \"manila-7cbf-account-create-kmczx\" (UID: \"707b9e23-4080-451d-818f-733ecfb23ec2\") " pod="openstack/manila-7cbf-account-create-kmczx" Oct 08 14:40:26 crc kubenswrapper[4735]: I1008 14:40:26.802470 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-7cbf-account-create-kmczx" Oct 08 14:40:27 crc kubenswrapper[4735]: I1008 14:40:27.272241 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-7cbf-account-create-kmczx"] Oct 08 14:40:27 crc kubenswrapper[4735]: I1008 14:40:27.857960 4735 generic.go:334] "Generic (PLEG): container finished" podID="707b9e23-4080-451d-818f-733ecfb23ec2" containerID="aa7bb50eb227d04309efc533ca2a2ec2f81b7db2faa2bbc3c9530210fcebb929" exitCode=0 Oct 08 14:40:27 crc kubenswrapper[4735]: I1008 14:40:27.858027 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-7cbf-account-create-kmczx" event={"ID":"707b9e23-4080-451d-818f-733ecfb23ec2","Type":"ContainerDied","Data":"aa7bb50eb227d04309efc533ca2a2ec2f81b7db2faa2bbc3c9530210fcebb929"} Oct 08 14:40:27 crc kubenswrapper[4735]: I1008 14:40:27.858315 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-7cbf-account-create-kmczx" event={"ID":"707b9e23-4080-451d-818f-733ecfb23ec2","Type":"ContainerStarted","Data":"1bfa779996e146facf6f78cf11fa3e73ee9c65b0942ff13b37811379bb91a265"} Oct 08 14:40:29 crc kubenswrapper[4735]: I1008 14:40:29.278984 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-7cbf-account-create-kmczx" Oct 08 14:40:29 crc kubenswrapper[4735]: I1008 14:40:29.440243 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjvq8\" (UniqueName: \"kubernetes.io/projected/707b9e23-4080-451d-818f-733ecfb23ec2-kube-api-access-rjvq8\") pod \"707b9e23-4080-451d-818f-733ecfb23ec2\" (UID: \"707b9e23-4080-451d-818f-733ecfb23ec2\") " Oct 08 14:40:29 crc kubenswrapper[4735]: I1008 14:40:29.449812 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/707b9e23-4080-451d-818f-733ecfb23ec2-kube-api-access-rjvq8" (OuterVolumeSpecName: "kube-api-access-rjvq8") pod "707b9e23-4080-451d-818f-733ecfb23ec2" (UID: "707b9e23-4080-451d-818f-733ecfb23ec2"). InnerVolumeSpecName "kube-api-access-rjvq8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:40:29 crc kubenswrapper[4735]: I1008 14:40:29.543758 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjvq8\" (UniqueName: \"kubernetes.io/projected/707b9e23-4080-451d-818f-733ecfb23ec2-kube-api-access-rjvq8\") on node \"crc\" DevicePath \"\"" Oct 08 14:40:29 crc kubenswrapper[4735]: I1008 14:40:29.879605 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-7cbf-account-create-kmczx" event={"ID":"707b9e23-4080-451d-818f-733ecfb23ec2","Type":"ContainerDied","Data":"1bfa779996e146facf6f78cf11fa3e73ee9c65b0942ff13b37811379bb91a265"} Oct 08 14:40:29 crc kubenswrapper[4735]: I1008 14:40:29.879649 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1bfa779996e146facf6f78cf11fa3e73ee9c65b0942ff13b37811379bb91a265" Oct 08 14:40:29 crc kubenswrapper[4735]: I1008 14:40:29.879651 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-7cbf-account-create-kmczx" Oct 08 14:40:31 crc kubenswrapper[4735]: I1008 14:40:31.738046 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-br9l6"] Oct 08 14:40:31 crc kubenswrapper[4735]: E1008 14:40:31.738717 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="707b9e23-4080-451d-818f-733ecfb23ec2" containerName="mariadb-account-create" Oct 08 14:40:31 crc kubenswrapper[4735]: I1008 14:40:31.738732 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="707b9e23-4080-451d-818f-733ecfb23ec2" containerName="mariadb-account-create" Oct 08 14:40:31 crc kubenswrapper[4735]: I1008 14:40:31.738991 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="707b9e23-4080-451d-818f-733ecfb23ec2" containerName="mariadb-account-create" Oct 08 14:40:31 crc kubenswrapper[4735]: I1008 14:40:31.739663 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-br9l6" Oct 08 14:40:31 crc kubenswrapper[4735]: I1008 14:40:31.746303 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-j9sdz" Oct 08 14:40:31 crc kubenswrapper[4735]: I1008 14:40:31.747157 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Oct 08 14:40:31 crc kubenswrapper[4735]: I1008 14:40:31.757271 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-br9l6"] Oct 08 14:40:31 crc kubenswrapper[4735]: I1008 14:40:31.896446 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nk8rc\" (UniqueName: \"kubernetes.io/projected/c2bf29d2-6abe-4b3b-88d2-c2759e07c4d8-kube-api-access-nk8rc\") pod \"manila-db-sync-br9l6\" (UID: \"c2bf29d2-6abe-4b3b-88d2-c2759e07c4d8\") " pod="openstack/manila-db-sync-br9l6" Oct 08 14:40:31 crc kubenswrapper[4735]: I1008 14:40:31.896562 4735 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2bf29d2-6abe-4b3b-88d2-c2759e07c4d8-config-data\") pod \"manila-db-sync-br9l6\" (UID: \"c2bf29d2-6abe-4b3b-88d2-c2759e07c4d8\") " pod="openstack/manila-db-sync-br9l6" Oct 08 14:40:31 crc kubenswrapper[4735]: I1008 14:40:31.896697 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2bf29d2-6abe-4b3b-88d2-c2759e07c4d8-combined-ca-bundle\") pod \"manila-db-sync-br9l6\" (UID: \"c2bf29d2-6abe-4b3b-88d2-c2759e07c4d8\") " pod="openstack/manila-db-sync-br9l6" Oct 08 14:40:31 crc kubenswrapper[4735]: I1008 14:40:31.897067 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/c2bf29d2-6abe-4b3b-88d2-c2759e07c4d8-job-config-data\") pod \"manila-db-sync-br9l6\" (UID: \"c2bf29d2-6abe-4b3b-88d2-c2759e07c4d8\") " pod="openstack/manila-db-sync-br9l6" Oct 08 14:40:32 crc kubenswrapper[4735]: I1008 14:40:32.000773 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/c2bf29d2-6abe-4b3b-88d2-c2759e07c4d8-job-config-data\") pod \"manila-db-sync-br9l6\" (UID: \"c2bf29d2-6abe-4b3b-88d2-c2759e07c4d8\") " pod="openstack/manila-db-sync-br9l6" Oct 08 14:40:32 crc kubenswrapper[4735]: I1008 14:40:32.000960 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nk8rc\" (UniqueName: \"kubernetes.io/projected/c2bf29d2-6abe-4b3b-88d2-c2759e07c4d8-kube-api-access-nk8rc\") pod \"manila-db-sync-br9l6\" (UID: \"c2bf29d2-6abe-4b3b-88d2-c2759e07c4d8\") " pod="openstack/manila-db-sync-br9l6" Oct 08 14:40:32 crc kubenswrapper[4735]: I1008 14:40:32.001009 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2bf29d2-6abe-4b3b-88d2-c2759e07c4d8-config-data\") pod \"manila-db-sync-br9l6\" (UID: \"c2bf29d2-6abe-4b3b-88d2-c2759e07c4d8\") " pod="openstack/manila-db-sync-br9l6" Oct 08 14:40:32 crc kubenswrapper[4735]: I1008 14:40:32.001041 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2bf29d2-6abe-4b3b-88d2-c2759e07c4d8-combined-ca-bundle\") pod \"manila-db-sync-br9l6\" (UID: \"c2bf29d2-6abe-4b3b-88d2-c2759e07c4d8\") " pod="openstack/manila-db-sync-br9l6" Oct 08 14:40:32 crc kubenswrapper[4735]: I1008 14:40:32.009296 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/c2bf29d2-6abe-4b3b-88d2-c2759e07c4d8-job-config-data\") pod \"manila-db-sync-br9l6\" (UID: \"c2bf29d2-6abe-4b3b-88d2-c2759e07c4d8\") " pod="openstack/manila-db-sync-br9l6" Oct 08 14:40:32 crc kubenswrapper[4735]: I1008 14:40:32.023629 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2bf29d2-6abe-4b3b-88d2-c2759e07c4d8-config-data\") pod \"manila-db-sync-br9l6\" (UID: \"c2bf29d2-6abe-4b3b-88d2-c2759e07c4d8\") " pod="openstack/manila-db-sync-br9l6" Oct 08 14:40:32 crc kubenswrapper[4735]: I1008 14:40:32.024545 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2bf29d2-6abe-4b3b-88d2-c2759e07c4d8-combined-ca-bundle\") pod \"manila-db-sync-br9l6\" (UID: \"c2bf29d2-6abe-4b3b-88d2-c2759e07c4d8\") " pod="openstack/manila-db-sync-br9l6" Oct 08 14:40:32 crc kubenswrapper[4735]: I1008 14:40:32.033497 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nk8rc\" (UniqueName: \"kubernetes.io/projected/c2bf29d2-6abe-4b3b-88d2-c2759e07c4d8-kube-api-access-nk8rc\") pod \"manila-db-sync-br9l6\" (UID: 
\"c2bf29d2-6abe-4b3b-88d2-c2759e07c4d8\") " pod="openstack/manila-db-sync-br9l6" Oct 08 14:40:32 crc kubenswrapper[4735]: I1008 14:40:32.068383 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-br9l6" Oct 08 14:40:32 crc kubenswrapper[4735]: I1008 14:40:32.851405 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-br9l6"] Oct 08 14:40:32 crc kubenswrapper[4735]: I1008 14:40:32.905954 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-br9l6" event={"ID":"c2bf29d2-6abe-4b3b-88d2-c2759e07c4d8","Type":"ContainerStarted","Data":"9762dbe35b529b674380e1ed68fceb1f60f974b3e1fab823e1a95060b6ec1ec9"} Oct 08 14:40:34 crc kubenswrapper[4735]: I1008 14:40:34.713752 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-g7z8p"] Oct 08 14:40:34 crc kubenswrapper[4735]: I1008 14:40:34.718412 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g7z8p" Oct 08 14:40:34 crc kubenswrapper[4735]: I1008 14:40:34.737975 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g7z8p"] Oct 08 14:40:34 crc kubenswrapper[4735]: I1008 14:40:34.859235 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0905e0e-781a-4cd4-b0c8-06354fb4ef52-utilities\") pod \"redhat-operators-g7z8p\" (UID: \"a0905e0e-781a-4cd4-b0c8-06354fb4ef52\") " pod="openshift-marketplace/redhat-operators-g7z8p" Oct 08 14:40:34 crc kubenswrapper[4735]: I1008 14:40:34.859565 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0905e0e-781a-4cd4-b0c8-06354fb4ef52-catalog-content\") pod \"redhat-operators-g7z8p\" (UID: 
\"a0905e0e-781a-4cd4-b0c8-06354fb4ef52\") " pod="openshift-marketplace/redhat-operators-g7z8p" Oct 08 14:40:34 crc kubenswrapper[4735]: I1008 14:40:34.859586 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdwhk\" (UniqueName: \"kubernetes.io/projected/a0905e0e-781a-4cd4-b0c8-06354fb4ef52-kube-api-access-fdwhk\") pod \"redhat-operators-g7z8p\" (UID: \"a0905e0e-781a-4cd4-b0c8-06354fb4ef52\") " pod="openshift-marketplace/redhat-operators-g7z8p" Oct 08 14:40:34 crc kubenswrapper[4735]: I1008 14:40:34.962141 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0905e0e-781a-4cd4-b0c8-06354fb4ef52-utilities\") pod \"redhat-operators-g7z8p\" (UID: \"a0905e0e-781a-4cd4-b0c8-06354fb4ef52\") " pod="openshift-marketplace/redhat-operators-g7z8p" Oct 08 14:40:34 crc kubenswrapper[4735]: I1008 14:40:34.962203 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0905e0e-781a-4cd4-b0c8-06354fb4ef52-catalog-content\") pod \"redhat-operators-g7z8p\" (UID: \"a0905e0e-781a-4cd4-b0c8-06354fb4ef52\") " pod="openshift-marketplace/redhat-operators-g7z8p" Oct 08 14:40:34 crc kubenswrapper[4735]: I1008 14:40:34.962235 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdwhk\" (UniqueName: \"kubernetes.io/projected/a0905e0e-781a-4cd4-b0c8-06354fb4ef52-kube-api-access-fdwhk\") pod \"redhat-operators-g7z8p\" (UID: \"a0905e0e-781a-4cd4-b0c8-06354fb4ef52\") " pod="openshift-marketplace/redhat-operators-g7z8p" Oct 08 14:40:34 crc kubenswrapper[4735]: I1008 14:40:34.963009 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0905e0e-781a-4cd4-b0c8-06354fb4ef52-catalog-content\") pod \"redhat-operators-g7z8p\" (UID: 
\"a0905e0e-781a-4cd4-b0c8-06354fb4ef52\") " pod="openshift-marketplace/redhat-operators-g7z8p" Oct 08 14:40:34 crc kubenswrapper[4735]: I1008 14:40:34.963437 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0905e0e-781a-4cd4-b0c8-06354fb4ef52-utilities\") pod \"redhat-operators-g7z8p\" (UID: \"a0905e0e-781a-4cd4-b0c8-06354fb4ef52\") " pod="openshift-marketplace/redhat-operators-g7z8p" Oct 08 14:40:34 crc kubenswrapper[4735]: I1008 14:40:34.988623 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdwhk\" (UniqueName: \"kubernetes.io/projected/a0905e0e-781a-4cd4-b0c8-06354fb4ef52-kube-api-access-fdwhk\") pod \"redhat-operators-g7z8p\" (UID: \"a0905e0e-781a-4cd4-b0c8-06354fb4ef52\") " pod="openshift-marketplace/redhat-operators-g7z8p" Oct 08 14:40:35 crc kubenswrapper[4735]: I1008 14:40:35.042279 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g7z8p" Oct 08 14:40:35 crc kubenswrapper[4735]: I1008 14:40:35.895453 4735 scope.go:117] "RemoveContainer" containerID="0ef7ce1bb2f88774957446f907a55957b94dad288444fb6f22993a47ba31f8b0" Oct 08 14:40:35 crc kubenswrapper[4735]: E1008 14:40:35.896070 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:40:36 crc kubenswrapper[4735]: I1008 14:40:36.044419 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-ww4kk"] Oct 08 14:40:36 crc kubenswrapper[4735]: I1008 14:40:36.057282 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/cinder-db-create-ww4kk"] Oct 08 14:40:36 crc kubenswrapper[4735]: I1008 14:40:36.908421 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56bccda4-2090-4e33-9fc4-ff1f852c6fef" path="/var/lib/kubelet/pods/56bccda4-2090-4e33-9fc4-ff1f852c6fef/volumes" Oct 08 14:40:37 crc kubenswrapper[4735]: I1008 14:40:37.758576 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g7z8p"] Oct 08 14:40:37 crc kubenswrapper[4735]: W1008 14:40:37.762810 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0905e0e_781a_4cd4_b0c8_06354fb4ef52.slice/crio-466d28bfb3b3379670ad47af7fc773878cc8c5cdfcfadbb177bf29c98c44d0d2 WatchSource:0}: Error finding container 466d28bfb3b3379670ad47af7fc773878cc8c5cdfcfadbb177bf29c98c44d0d2: Status 404 returned error can't find the container with id 466d28bfb3b3379670ad47af7fc773878cc8c5cdfcfadbb177bf29c98c44d0d2 Oct 08 14:40:37 crc kubenswrapper[4735]: I1008 14:40:37.955835 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g7z8p" event={"ID":"a0905e0e-781a-4cd4-b0c8-06354fb4ef52","Type":"ContainerStarted","Data":"466d28bfb3b3379670ad47af7fc773878cc8c5cdfcfadbb177bf29c98c44d0d2"} Oct 08 14:40:38 crc kubenswrapper[4735]: I1008 14:40:38.965886 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-br9l6" event={"ID":"c2bf29d2-6abe-4b3b-88d2-c2759e07c4d8","Type":"ContainerStarted","Data":"6505f6c8de497edeb92e46f3deb531f3c388425dac5c431b084962c1bc6c3950"} Oct 08 14:40:38 crc kubenswrapper[4735]: I1008 14:40:38.969207 4735 generic.go:334] "Generic (PLEG): container finished" podID="a0905e0e-781a-4cd4-b0c8-06354fb4ef52" containerID="2a027bd214cc151d5daac0a30092d651a2b8de103000503af634db7f19ad393e" exitCode=0 Oct 08 14:40:38 crc kubenswrapper[4735]: I1008 14:40:38.969242 4735 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-g7z8p" event={"ID":"a0905e0e-781a-4cd4-b0c8-06354fb4ef52","Type":"ContainerDied","Data":"2a027bd214cc151d5daac0a30092d651a2b8de103000503af634db7f19ad393e"} Oct 08 14:40:38 crc kubenswrapper[4735]: I1008 14:40:38.990381 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-br9l6" podStartSLOduration=3.486213763 podStartE2EDuration="7.990363894s" podCreationTimestamp="2025-10-08 14:40:31 +0000 UTC" firstStartedPulling="2025-10-08 14:40:32.864439611 +0000 UTC m=+6282.672744155" lastFinishedPulling="2025-10-08 14:40:37.368589742 +0000 UTC m=+6287.176894286" observedRunningTime="2025-10-08 14:40:38.98473316 +0000 UTC m=+6288.793037714" watchObservedRunningTime="2025-10-08 14:40:38.990363894 +0000 UTC m=+6288.798668438" Oct 08 14:40:39 crc kubenswrapper[4735]: I1008 14:40:39.982991 4735 generic.go:334] "Generic (PLEG): container finished" podID="c2bf29d2-6abe-4b3b-88d2-c2759e07c4d8" containerID="6505f6c8de497edeb92e46f3deb531f3c388425dac5c431b084962c1bc6c3950" exitCode=0 Oct 08 14:40:39 crc kubenswrapper[4735]: I1008 14:40:39.983288 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-br9l6" event={"ID":"c2bf29d2-6abe-4b3b-88d2-c2759e07c4d8","Type":"ContainerDied","Data":"6505f6c8de497edeb92e46f3deb531f3c388425dac5c431b084962c1bc6c3950"} Oct 08 14:40:41 crc kubenswrapper[4735]: I1008 14:40:40.999764 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g7z8p" event={"ID":"a0905e0e-781a-4cd4-b0c8-06354fb4ef52","Type":"ContainerStarted","Data":"71a7d3332925d4079822a9ae0ab5ddd963d303346b71d6e4b474e5866a77c52c"} Oct 08 14:40:42 crc kubenswrapper[4735]: I1008 14:40:42.077205 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 08 14:40:42 crc kubenswrapper[4735]: I1008 14:40:42.861731 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-br9l6" Oct 08 14:40:42 crc kubenswrapper[4735]: I1008 14:40:42.956223 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2bf29d2-6abe-4b3b-88d2-c2759e07c4d8-config-data\") pod \"c2bf29d2-6abe-4b3b-88d2-c2759e07c4d8\" (UID: \"c2bf29d2-6abe-4b3b-88d2-c2759e07c4d8\") " Oct 08 14:40:42 crc kubenswrapper[4735]: I1008 14:40:42.956304 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nk8rc\" (UniqueName: \"kubernetes.io/projected/c2bf29d2-6abe-4b3b-88d2-c2759e07c4d8-kube-api-access-nk8rc\") pod \"c2bf29d2-6abe-4b3b-88d2-c2759e07c4d8\" (UID: \"c2bf29d2-6abe-4b3b-88d2-c2759e07c4d8\") " Oct 08 14:40:42 crc kubenswrapper[4735]: I1008 14:40:42.956384 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/c2bf29d2-6abe-4b3b-88d2-c2759e07c4d8-job-config-data\") pod \"c2bf29d2-6abe-4b3b-88d2-c2759e07c4d8\" (UID: \"c2bf29d2-6abe-4b3b-88d2-c2759e07c4d8\") " Oct 08 14:40:42 crc kubenswrapper[4735]: I1008 14:40:42.956414 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2bf29d2-6abe-4b3b-88d2-c2759e07c4d8-combined-ca-bundle\") pod \"c2bf29d2-6abe-4b3b-88d2-c2759e07c4d8\" (UID: \"c2bf29d2-6abe-4b3b-88d2-c2759e07c4d8\") " Oct 08 14:40:42 crc kubenswrapper[4735]: I1008 14:40:42.963170 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2bf29d2-6abe-4b3b-88d2-c2759e07c4d8-kube-api-access-nk8rc" (OuterVolumeSpecName: "kube-api-access-nk8rc") pod "c2bf29d2-6abe-4b3b-88d2-c2759e07c4d8" (UID: "c2bf29d2-6abe-4b3b-88d2-c2759e07c4d8"). InnerVolumeSpecName "kube-api-access-nk8rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:40:42 crc kubenswrapper[4735]: I1008 14:40:42.963537 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2bf29d2-6abe-4b3b-88d2-c2759e07c4d8-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "c2bf29d2-6abe-4b3b-88d2-c2759e07c4d8" (UID: "c2bf29d2-6abe-4b3b-88d2-c2759e07c4d8"). InnerVolumeSpecName "job-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:40:42 crc kubenswrapper[4735]: I1008 14:40:42.982167 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2bf29d2-6abe-4b3b-88d2-c2759e07c4d8-config-data" (OuterVolumeSpecName: "config-data") pod "c2bf29d2-6abe-4b3b-88d2-c2759e07c4d8" (UID: "c2bf29d2-6abe-4b3b-88d2-c2759e07c4d8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:40:42 crc kubenswrapper[4735]: I1008 14:40:42.988720 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2bf29d2-6abe-4b3b-88d2-c2759e07c4d8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c2bf29d2-6abe-4b3b-88d2-c2759e07c4d8" (UID: "c2bf29d2-6abe-4b3b-88d2-c2759e07c4d8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:40:43 crc kubenswrapper[4735]: I1008 14:40:43.022698 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-br9l6" event={"ID":"c2bf29d2-6abe-4b3b-88d2-c2759e07c4d8","Type":"ContainerDied","Data":"9762dbe35b529b674380e1ed68fceb1f60f974b3e1fab823e1a95060b6ec1ec9"} Oct 08 14:40:43 crc kubenswrapper[4735]: I1008 14:40:43.022739 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9762dbe35b529b674380e1ed68fceb1f60f974b3e1fab823e1a95060b6ec1ec9" Oct 08 14:40:43 crc kubenswrapper[4735]: I1008 14:40:43.022802 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-br9l6" Oct 08 14:40:43 crc kubenswrapper[4735]: I1008 14:40:43.060114 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2bf29d2-6abe-4b3b-88d2-c2759e07c4d8-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 14:40:43 crc kubenswrapper[4735]: I1008 14:40:43.060191 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nk8rc\" (UniqueName: \"kubernetes.io/projected/c2bf29d2-6abe-4b3b-88d2-c2759e07c4d8-kube-api-access-nk8rc\") on node \"crc\" DevicePath \"\"" Oct 08 14:40:43 crc kubenswrapper[4735]: I1008 14:40:43.060205 4735 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/c2bf29d2-6abe-4b3b-88d2-c2759e07c4d8-job-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 14:40:43 crc kubenswrapper[4735]: I1008 14:40:43.060217 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2bf29d2-6abe-4b3b-88d2-c2759e07c4d8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.241724 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Oct 08 
14:40:44 crc kubenswrapper[4735]: E1008 14:40:44.242469 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2bf29d2-6abe-4b3b-88d2-c2759e07c4d8" containerName="manila-db-sync" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.242498 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2bf29d2-6abe-4b3b-88d2-c2759e07c4d8" containerName="manila-db-sync" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.242768 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2bf29d2-6abe-4b3b-88d2-c2759e07c4d8" containerName="manila-db-sync" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.360931 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.361071 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.369346 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.369766 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.378977 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.382091 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-69b4fbdf9f-ff9q7"] Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.386364 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-j9sdz" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.398010 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69b4fbdf9f-ff9q7" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.436342 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.438764 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.456781 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.475374 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69b4fbdf9f-ff9q7"] Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.497198 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8fhn\" (UniqueName: \"kubernetes.io/projected/53d57537-1158-4e9f-bf5f-bfa4976bdd83-kube-api-access-w8fhn\") pod \"dnsmasq-dns-69b4fbdf9f-ff9q7\" (UID: \"53d57537-1158-4e9f-bf5f-bfa4976bdd83\") " pod="openstack/dnsmasq-dns-69b4fbdf9f-ff9q7" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.497245 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fdd3521-1c60-4f39-8274-132b679755c8-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"8fdd3521-1c60-4f39-8274-132b679755c8\") " pod="openstack/manila-share-share1-0" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.497263 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/53d57537-1158-4e9f-bf5f-bfa4976bdd83-ovsdbserver-sb\") pod \"dnsmasq-dns-69b4fbdf9f-ff9q7\" (UID: \"53d57537-1158-4e9f-bf5f-bfa4976bdd83\") " pod="openstack/dnsmasq-dns-69b4fbdf9f-ff9q7" Oct 08 14:40:44 crc 
kubenswrapper[4735]: I1008 14:40:44.497283 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hb6mj\" (UniqueName: \"kubernetes.io/projected/8fdd3521-1c60-4f39-8274-132b679755c8-kube-api-access-hb6mj\") pod \"manila-share-share1-0\" (UID: \"8fdd3521-1c60-4f39-8274-132b679755c8\") " pod="openstack/manila-share-share1-0" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.497314 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fdd3521-1c60-4f39-8274-132b679755c8-config-data\") pod \"manila-share-share1-0\" (UID: \"8fdd3521-1c60-4f39-8274-132b679755c8\") " pod="openstack/manila-share-share1-0" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.497358 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53d57537-1158-4e9f-bf5f-bfa4976bdd83-dns-svc\") pod \"dnsmasq-dns-69b4fbdf9f-ff9q7\" (UID: \"53d57537-1158-4e9f-bf5f-bfa4976bdd83\") " pod="openstack/dnsmasq-dns-69b4fbdf9f-ff9q7" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.497380 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53d57537-1158-4e9f-bf5f-bfa4976bdd83-ovsdbserver-nb\") pod \"dnsmasq-dns-69b4fbdf9f-ff9q7\" (UID: \"53d57537-1158-4e9f-bf5f-bfa4976bdd83\") " pod="openstack/dnsmasq-dns-69b4fbdf9f-ff9q7" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.497404 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53d57537-1158-4e9f-bf5f-bfa4976bdd83-config\") pod \"dnsmasq-dns-69b4fbdf9f-ff9q7\" (UID: \"53d57537-1158-4e9f-bf5f-bfa4976bdd83\") " pod="openstack/dnsmasq-dns-69b4fbdf9f-ff9q7" Oct 08 14:40:44 crc 
kubenswrapper[4735]: I1008 14:40:44.497433 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8fdd3521-1c60-4f39-8274-132b679755c8-ceph\") pod \"manila-share-share1-0\" (UID: \"8fdd3521-1c60-4f39-8274-132b679755c8\") " pod="openstack/manila-share-share1-0" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.497462 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/8fdd3521-1c60-4f39-8274-132b679755c8-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"8fdd3521-1c60-4f39-8274-132b679755c8\") " pod="openstack/manila-share-share1-0" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.497512 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8fdd3521-1c60-4f39-8274-132b679755c8-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"8fdd3521-1c60-4f39-8274-132b679755c8\") " pod="openstack/manila-share-share1-0" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.497546 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8fdd3521-1c60-4f39-8274-132b679755c8-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"8fdd3521-1c60-4f39-8274-132b679755c8\") " pod="openstack/manila-share-share1-0" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.497603 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fdd3521-1c60-4f39-8274-132b679755c8-scripts\") pod \"manila-share-share1-0\" (UID: \"8fdd3521-1c60-4f39-8274-132b679755c8\") " pod="openstack/manila-share-share1-0" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.497723 4735 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.600399 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fdd3521-1c60-4f39-8274-132b679755c8-scripts\") pod \"manila-share-share1-0\" (UID: \"8fdd3521-1c60-4f39-8274-132b679755c8\") " pod="openstack/manila-share-share1-0" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.601098 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8fhn\" (UniqueName: \"kubernetes.io/projected/53d57537-1158-4e9f-bf5f-bfa4976bdd83-kube-api-access-w8fhn\") pod \"dnsmasq-dns-69b4fbdf9f-ff9q7\" (UID: \"53d57537-1158-4e9f-bf5f-bfa4976bdd83\") " pod="openstack/dnsmasq-dns-69b4fbdf9f-ff9q7" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.601144 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fdd3521-1c60-4f39-8274-132b679755c8-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"8fdd3521-1c60-4f39-8274-132b679755c8\") " pod="openstack/manila-share-share1-0" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.601172 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/53d57537-1158-4e9f-bf5f-bfa4976bdd83-ovsdbserver-sb\") pod \"dnsmasq-dns-69b4fbdf9f-ff9q7\" (UID: \"53d57537-1158-4e9f-bf5f-bfa4976bdd83\") " pod="openstack/dnsmasq-dns-69b4fbdf9f-ff9q7" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.601211 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/036e2782-a3e8-4e6c-866a-adc689ba24dd-scripts\") pod \"manila-scheduler-0\" (UID: \"036e2782-a3e8-4e6c-866a-adc689ba24dd\") " pod="openstack/manila-scheduler-0" Oct 08 
14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.601245 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hb6mj\" (UniqueName: \"kubernetes.io/projected/8fdd3521-1c60-4f39-8274-132b679755c8-kube-api-access-hb6mj\") pod \"manila-share-share1-0\" (UID: \"8fdd3521-1c60-4f39-8274-132b679755c8\") " pod="openstack/manila-share-share1-0" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.603458 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/53d57537-1158-4e9f-bf5f-bfa4976bdd83-ovsdbserver-sb\") pod \"dnsmasq-dns-69b4fbdf9f-ff9q7\" (UID: \"53d57537-1158-4e9f-bf5f-bfa4976bdd83\") " pod="openstack/dnsmasq-dns-69b4fbdf9f-ff9q7" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.603644 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fdd3521-1c60-4f39-8274-132b679755c8-config-data\") pod \"manila-share-share1-0\" (UID: \"8fdd3521-1c60-4f39-8274-132b679755c8\") " pod="openstack/manila-share-share1-0" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.603740 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/036e2782-a3e8-4e6c-866a-adc689ba24dd-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"036e2782-a3e8-4e6c-866a-adc689ba24dd\") " pod="openstack/manila-scheduler-0" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.603810 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/036e2782-a3e8-4e6c-866a-adc689ba24dd-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"036e2782-a3e8-4e6c-866a-adc689ba24dd\") " pod="openstack/manila-scheduler-0" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.603906 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53d57537-1158-4e9f-bf5f-bfa4976bdd83-dns-svc\") pod \"dnsmasq-dns-69b4fbdf9f-ff9q7\" (UID: \"53d57537-1158-4e9f-bf5f-bfa4976bdd83\") " pod="openstack/dnsmasq-dns-69b4fbdf9f-ff9q7" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.604005 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53d57537-1158-4e9f-bf5f-bfa4976bdd83-ovsdbserver-nb\") pod \"dnsmasq-dns-69b4fbdf9f-ff9q7\" (UID: \"53d57537-1158-4e9f-bf5f-bfa4976bdd83\") " pod="openstack/dnsmasq-dns-69b4fbdf9f-ff9q7" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.604063 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53d57537-1158-4e9f-bf5f-bfa4976bdd83-config\") pod \"dnsmasq-dns-69b4fbdf9f-ff9q7\" (UID: \"53d57537-1158-4e9f-bf5f-bfa4976bdd83\") " pod="openstack/dnsmasq-dns-69b4fbdf9f-ff9q7" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.604138 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8fdd3521-1c60-4f39-8274-132b679755c8-ceph\") pod \"manila-share-share1-0\" (UID: \"8fdd3521-1c60-4f39-8274-132b679755c8\") " pod="openstack/manila-share-share1-0" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.604183 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/036e2782-a3e8-4e6c-866a-adc689ba24dd-config-data\") pod \"manila-scheduler-0\" (UID: \"036e2782-a3e8-4e6c-866a-adc689ba24dd\") " pod="openstack/manila-scheduler-0" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.604222 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: 
\"kubernetes.io/host-path/8fdd3521-1c60-4f39-8274-132b679755c8-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"8fdd3521-1c60-4f39-8274-132b679755c8\") " pod="openstack/manila-share-share1-0" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.604339 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8fdd3521-1c60-4f39-8274-132b679755c8-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"8fdd3521-1c60-4f39-8274-132b679755c8\") " pod="openstack/manila-share-share1-0" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.604437 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8fdd3521-1c60-4f39-8274-132b679755c8-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"8fdd3521-1c60-4f39-8274-132b679755c8\") " pod="openstack/manila-share-share1-0" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.604495 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/036e2782-a3e8-4e6c-866a-adc689ba24dd-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"036e2782-a3e8-4e6c-866a-adc689ba24dd\") " pod="openstack/manila-scheduler-0" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.604538 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbfnz\" (UniqueName: \"kubernetes.io/projected/036e2782-a3e8-4e6c-866a-adc689ba24dd-kube-api-access-nbfnz\") pod \"manila-scheduler-0\" (UID: \"036e2782-a3e8-4e6c-866a-adc689ba24dd\") " pod="openstack/manila-scheduler-0" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.609375 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53d57537-1158-4e9f-bf5f-bfa4976bdd83-dns-svc\") pod 
\"dnsmasq-dns-69b4fbdf9f-ff9q7\" (UID: \"53d57537-1158-4e9f-bf5f-bfa4976bdd83\") " pod="openstack/dnsmasq-dns-69b4fbdf9f-ff9q7" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.610056 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8fdd3521-1c60-4f39-8274-132b679755c8-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"8fdd3521-1c60-4f39-8274-132b679755c8\") " pod="openstack/manila-share-share1-0" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.610210 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/8fdd3521-1c60-4f39-8274-132b679755c8-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"8fdd3521-1c60-4f39-8274-132b679755c8\") " pod="openstack/manila-share-share1-0" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.611625 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53d57537-1158-4e9f-bf5f-bfa4976bdd83-config\") pod \"dnsmasq-dns-69b4fbdf9f-ff9q7\" (UID: \"53d57537-1158-4e9f-bf5f-bfa4976bdd83\") " pod="openstack/dnsmasq-dns-69b4fbdf9f-ff9q7" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.617191 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53d57537-1158-4e9f-bf5f-bfa4976bdd83-ovsdbserver-nb\") pod \"dnsmasq-dns-69b4fbdf9f-ff9q7\" (UID: \"53d57537-1158-4e9f-bf5f-bfa4976bdd83\") " pod="openstack/dnsmasq-dns-69b4fbdf9f-ff9q7" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.620922 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.623160 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.626598 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.629292 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fdd3521-1c60-4f39-8274-132b679755c8-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"8fdd3521-1c60-4f39-8274-132b679755c8\") " pod="openstack/manila-share-share1-0" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.629666 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fdd3521-1c60-4f39-8274-132b679755c8-scripts\") pod \"manila-share-share1-0\" (UID: \"8fdd3521-1c60-4f39-8274-132b679755c8\") " pod="openstack/manila-share-share1-0" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.629733 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8fdd3521-1c60-4f39-8274-132b679755c8-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"8fdd3521-1c60-4f39-8274-132b679755c8\") " pod="openstack/manila-share-share1-0" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.630951 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fdd3521-1c60-4f39-8274-132b679755c8-config-data\") pod \"manila-share-share1-0\" (UID: \"8fdd3521-1c60-4f39-8274-132b679755c8\") " pod="openstack/manila-share-share1-0" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.635873 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.637859 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hb6mj\" (UniqueName: 
\"kubernetes.io/projected/8fdd3521-1c60-4f39-8274-132b679755c8-kube-api-access-hb6mj\") pod \"manila-share-share1-0\" (UID: \"8fdd3521-1c60-4f39-8274-132b679755c8\") " pod="openstack/manila-share-share1-0" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.651647 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8fdd3521-1c60-4f39-8274-132b679755c8-ceph\") pod \"manila-share-share1-0\" (UID: \"8fdd3521-1c60-4f39-8274-132b679755c8\") " pod="openstack/manila-share-share1-0" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.653309 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8fhn\" (UniqueName: \"kubernetes.io/projected/53d57537-1158-4e9f-bf5f-bfa4976bdd83-kube-api-access-w8fhn\") pod \"dnsmasq-dns-69b4fbdf9f-ff9q7\" (UID: \"53d57537-1158-4e9f-bf5f-bfa4976bdd83\") " pod="openstack/dnsmasq-dns-69b4fbdf9f-ff9q7" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.706282 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/39e936cd-50d2-4d95-850d-5ca4808b22fa-config-data-custom\") pod \"manila-api-0\" (UID: \"39e936cd-50d2-4d95-850d-5ca4808b22fa\") " pod="openstack/manila-api-0" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.706335 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39e936cd-50d2-4d95-850d-5ca4808b22fa-logs\") pod \"manila-api-0\" (UID: \"39e936cd-50d2-4d95-850d-5ca4808b22fa\") " pod="openstack/manila-api-0" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.706392 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8v9z\" (UniqueName: \"kubernetes.io/projected/39e936cd-50d2-4d95-850d-5ca4808b22fa-kube-api-access-q8v9z\") pod \"manila-api-0\" 
(UID: \"39e936cd-50d2-4d95-850d-5ca4808b22fa\") " pod="openstack/manila-api-0" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.706425 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/036e2782-a3e8-4e6c-866a-adc689ba24dd-scripts\") pod \"manila-scheduler-0\" (UID: \"036e2782-a3e8-4e6c-866a-adc689ba24dd\") " pod="openstack/manila-scheduler-0" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.706453 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39e936cd-50d2-4d95-850d-5ca4808b22fa-config-data\") pod \"manila-api-0\" (UID: \"39e936cd-50d2-4d95-850d-5ca4808b22fa\") " pod="openstack/manila-api-0" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.706567 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/036e2782-a3e8-4e6c-866a-adc689ba24dd-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"036e2782-a3e8-4e6c-866a-adc689ba24dd\") " pod="openstack/manila-scheduler-0" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.706600 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/036e2782-a3e8-4e6c-866a-adc689ba24dd-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"036e2782-a3e8-4e6c-866a-adc689ba24dd\") " pod="openstack/manila-scheduler-0" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.706646 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39e936cd-50d2-4d95-850d-5ca4808b22fa-scripts\") pod \"manila-api-0\" (UID: \"39e936cd-50d2-4d95-850d-5ca4808b22fa\") " pod="openstack/manila-api-0" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.706705 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/036e2782-a3e8-4e6c-866a-adc689ba24dd-config-data\") pod \"manila-scheduler-0\" (UID: \"036e2782-a3e8-4e6c-866a-adc689ba24dd\") " pod="openstack/manila-scheduler-0" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.706731 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39e936cd-50d2-4d95-850d-5ca4808b22fa-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"39e936cd-50d2-4d95-850d-5ca4808b22fa\") " pod="openstack/manila-api-0" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.706780 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/036e2782-a3e8-4e6c-866a-adc689ba24dd-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"036e2782-a3e8-4e6c-866a-adc689ba24dd\") " pod="openstack/manila-scheduler-0" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.706805 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbfnz\" (UniqueName: \"kubernetes.io/projected/036e2782-a3e8-4e6c-866a-adc689ba24dd-kube-api-access-nbfnz\") pod \"manila-scheduler-0\" (UID: \"036e2782-a3e8-4e6c-866a-adc689ba24dd\") " pod="openstack/manila-scheduler-0" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.706847 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/39e936cd-50d2-4d95-850d-5ca4808b22fa-etc-machine-id\") pod \"manila-api-0\" (UID: \"39e936cd-50d2-4d95-850d-5ca4808b22fa\") " pod="openstack/manila-api-0" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.708603 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/036e2782-a3e8-4e6c-866a-adc689ba24dd-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"036e2782-a3e8-4e6c-866a-adc689ba24dd\") " pod="openstack/manila-scheduler-0" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.714768 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/036e2782-a3e8-4e6c-866a-adc689ba24dd-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"036e2782-a3e8-4e6c-866a-adc689ba24dd\") " pod="openstack/manila-scheduler-0" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.715078 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/036e2782-a3e8-4e6c-866a-adc689ba24dd-scripts\") pod \"manila-scheduler-0\" (UID: \"036e2782-a3e8-4e6c-866a-adc689ba24dd\") " pod="openstack/manila-scheduler-0" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.715520 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.716012 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/036e2782-a3e8-4e6c-866a-adc689ba24dd-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"036e2782-a3e8-4e6c-866a-adc689ba24dd\") " pod="openstack/manila-scheduler-0" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.724554 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/036e2782-a3e8-4e6c-866a-adc689ba24dd-config-data\") pod \"manila-scheduler-0\" (UID: \"036e2782-a3e8-4e6c-866a-adc689ba24dd\") " pod="openstack/manila-scheduler-0" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.734447 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbfnz\" (UniqueName: \"kubernetes.io/projected/036e2782-a3e8-4e6c-866a-adc689ba24dd-kube-api-access-nbfnz\") pod \"manila-scheduler-0\" (UID: \"036e2782-a3e8-4e6c-866a-adc689ba24dd\") " pod="openstack/manila-scheduler-0" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.770169 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69b4fbdf9f-ff9q7" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.796762 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.809001 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/39e936cd-50d2-4d95-850d-5ca4808b22fa-etc-machine-id\") pod \"manila-api-0\" (UID: \"39e936cd-50d2-4d95-850d-5ca4808b22fa\") " pod="openstack/manila-api-0" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.809118 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/39e936cd-50d2-4d95-850d-5ca4808b22fa-etc-machine-id\") pod \"manila-api-0\" (UID: \"39e936cd-50d2-4d95-850d-5ca4808b22fa\") " pod="openstack/manila-api-0" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.809121 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/39e936cd-50d2-4d95-850d-5ca4808b22fa-config-data-custom\") pod \"manila-api-0\" (UID: \"39e936cd-50d2-4d95-850d-5ca4808b22fa\") " pod="openstack/manila-api-0" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.809195 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39e936cd-50d2-4d95-850d-5ca4808b22fa-logs\") pod \"manila-api-0\" (UID: \"39e936cd-50d2-4d95-850d-5ca4808b22fa\") " pod="openstack/manila-api-0" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.809297 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8v9z\" (UniqueName: \"kubernetes.io/projected/39e936cd-50d2-4d95-850d-5ca4808b22fa-kube-api-access-q8v9z\") pod \"manila-api-0\" (UID: \"39e936cd-50d2-4d95-850d-5ca4808b22fa\") " pod="openstack/manila-api-0" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.809361 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/39e936cd-50d2-4d95-850d-5ca4808b22fa-config-data\") pod \"manila-api-0\" (UID: \"39e936cd-50d2-4d95-850d-5ca4808b22fa\") " pod="openstack/manila-api-0" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.809585 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39e936cd-50d2-4d95-850d-5ca4808b22fa-scripts\") pod \"manila-api-0\" (UID: \"39e936cd-50d2-4d95-850d-5ca4808b22fa\") " pod="openstack/manila-api-0" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.809687 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39e936cd-50d2-4d95-850d-5ca4808b22fa-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"39e936cd-50d2-4d95-850d-5ca4808b22fa\") " pod="openstack/manila-api-0" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.809869 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39e936cd-50d2-4d95-850d-5ca4808b22fa-logs\") pod \"manila-api-0\" (UID: \"39e936cd-50d2-4d95-850d-5ca4808b22fa\") " pod="openstack/manila-api-0" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.822006 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39e936cd-50d2-4d95-850d-5ca4808b22fa-config-data\") pod \"manila-api-0\" (UID: \"39e936cd-50d2-4d95-850d-5ca4808b22fa\") " pod="openstack/manila-api-0" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.827990 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39e936cd-50d2-4d95-850d-5ca4808b22fa-scripts\") pod \"manila-api-0\" (UID: \"39e936cd-50d2-4d95-850d-5ca4808b22fa\") " pod="openstack/manila-api-0" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.828907 4735 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/39e936cd-50d2-4d95-850d-5ca4808b22fa-config-data-custom\") pod \"manila-api-0\" (UID: \"39e936cd-50d2-4d95-850d-5ca4808b22fa\") " pod="openstack/manila-api-0" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.830402 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39e936cd-50d2-4d95-850d-5ca4808b22fa-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"39e936cd-50d2-4d95-850d-5ca4808b22fa\") " pod="openstack/manila-api-0" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.831427 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8v9z\" (UniqueName: \"kubernetes.io/projected/39e936cd-50d2-4d95-850d-5ca4808b22fa-kube-api-access-q8v9z\") pod \"manila-api-0\" (UID: \"39e936cd-50d2-4d95-850d-5ca4808b22fa\") " pod="openstack/manila-api-0" Oct 08 14:40:44 crc kubenswrapper[4735]: I1008 14:40:44.985134 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Oct 08 14:40:45 crc kubenswrapper[4735]: I1008 14:40:45.856095 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69b4fbdf9f-ff9q7"] Oct 08 14:40:45 crc kubenswrapper[4735]: I1008 14:40:45.916706 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Oct 08 14:40:45 crc kubenswrapper[4735]: I1008 14:40:45.968971 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Oct 08 14:40:46 crc kubenswrapper[4735]: I1008 14:40:46.061288 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Oct 08 14:40:46 crc kubenswrapper[4735]: I1008 14:40:46.064233 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"036e2782-a3e8-4e6c-866a-adc689ba24dd","Type":"ContainerStarted","Data":"34b8a5fc9e8ec4cdb181d029806784d16c98253a6facbef59497b10e2766f45d"} Oct 08 14:40:46 crc kubenswrapper[4735]: W1008 14:40:46.065445 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod39e936cd_50d2_4d95_850d_5ca4808b22fa.slice/crio-99be306c0080511b3e4a5a606466e00a4b0ef0ba5c41697e97d98d0dea2acdf0 WatchSource:0}: Error finding container 99be306c0080511b3e4a5a606466e00a4b0ef0ba5c41697e97d98d0dea2acdf0: Status 404 returned error can't find the container with id 99be306c0080511b3e4a5a606466e00a4b0ef0ba5c41697e97d98d0dea2acdf0 Oct 08 14:40:46 crc kubenswrapper[4735]: I1008 14:40:46.067599 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"8fdd3521-1c60-4f39-8274-132b679755c8","Type":"ContainerStarted","Data":"529c6ebad457cb259627583a53d9840648467b0cd035801d56677e5b0d39a7e5"} Oct 08 14:40:46 crc kubenswrapper[4735]: I1008 14:40:46.070302 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69b4fbdf9f-ff9q7" 
event={"ID":"53d57537-1158-4e9f-bf5f-bfa4976bdd83","Type":"ContainerStarted","Data":"6a2ccc081ac75f50d8baaf3f64884577d31354fe2a2f176a390e5d54220749ca"} Oct 08 14:40:47 crc kubenswrapper[4735]: I1008 14:40:47.041704 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-aa99-account-create-ssv2z"] Oct 08 14:40:47 crc kubenswrapper[4735]: I1008 14:40:47.053524 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-aa99-account-create-ssv2z"] Oct 08 14:40:47 crc kubenswrapper[4735]: I1008 14:40:47.081323 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"39e936cd-50d2-4d95-850d-5ca4808b22fa","Type":"ContainerStarted","Data":"4879375561db0d12b059cfd8a4d37e00470a4e5ea53a3e8e5b2520ff8ec252d7"} Oct 08 14:40:47 crc kubenswrapper[4735]: I1008 14:40:47.081371 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"39e936cd-50d2-4d95-850d-5ca4808b22fa","Type":"ContainerStarted","Data":"99be306c0080511b3e4a5a606466e00a4b0ef0ba5c41697e97d98d0dea2acdf0"} Oct 08 14:40:47 crc kubenswrapper[4735]: I1008 14:40:47.083275 4735 generic.go:334] "Generic (PLEG): container finished" podID="53d57537-1158-4e9f-bf5f-bfa4976bdd83" containerID="d8a3b7490dac0968cc7364fe9459eb8353ee5d8e62a8175b65694b8dd75c1226" exitCode=0 Oct 08 14:40:47 crc kubenswrapper[4735]: I1008 14:40:47.083326 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69b4fbdf9f-ff9q7" event={"ID":"53d57537-1158-4e9f-bf5f-bfa4976bdd83","Type":"ContainerDied","Data":"d8a3b7490dac0968cc7364fe9459eb8353ee5d8e62a8175b65694b8dd75c1226"} Oct 08 14:40:47 crc kubenswrapper[4735]: I1008 14:40:47.894661 4735 scope.go:117] "RemoveContainer" containerID="0ef7ce1bb2f88774957446f907a55957b94dad288444fb6f22993a47ba31f8b0" Oct 08 14:40:47 crc kubenswrapper[4735]: E1008 14:40:47.895262 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:40:48 crc kubenswrapper[4735]: I1008 14:40:48.126831 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69b4fbdf9f-ff9q7" event={"ID":"53d57537-1158-4e9f-bf5f-bfa4976bdd83","Type":"ContainerStarted","Data":"8d33c9dad9ce07363b3a0297a4a5005bccb86e5e975526dd737dfa2ef3900470"} Oct 08 14:40:48 crc kubenswrapper[4735]: I1008 14:40:48.127189 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-69b4fbdf9f-ff9q7" Oct 08 14:40:48 crc kubenswrapper[4735]: I1008 14:40:48.142019 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"39e936cd-50d2-4d95-850d-5ca4808b22fa","Type":"ContainerStarted","Data":"bebf958027c6f5336d22fa177a8f94c76c461e376148c93e074b791fd7b4fbb5"} Oct 08 14:40:48 crc kubenswrapper[4735]: I1008 14:40:48.143322 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Oct 08 14:40:48 crc kubenswrapper[4735]: I1008 14:40:48.194342 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-69b4fbdf9f-ff9q7" podStartSLOduration=4.19432022 podStartE2EDuration="4.19432022s" podCreationTimestamp="2025-10-08 14:40:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:40:48.180808964 +0000 UTC m=+6297.989113518" watchObservedRunningTime="2025-10-08 14:40:48.19432022 +0000 UTC m=+6298.002624764" Oct 08 14:40:48 crc kubenswrapper[4735]: I1008 14:40:48.226930 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/manila-api-0" podStartSLOduration=4.226909483 podStartE2EDuration="4.226909483s" podCreationTimestamp="2025-10-08 14:40:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:40:48.222832199 +0000 UTC m=+6298.031136803" watchObservedRunningTime="2025-10-08 14:40:48.226909483 +0000 UTC m=+6298.035214027" Oct 08 14:40:48 crc kubenswrapper[4735]: I1008 14:40:48.911527 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5576d690-ea6f-4fdc-aef8-936a637dfb11" path="/var/lib/kubelet/pods/5576d690-ea6f-4fdc-aef8-936a637dfb11/volumes" Oct 08 14:40:49 crc kubenswrapper[4735]: I1008 14:40:49.163169 4735 generic.go:334] "Generic (PLEG): container finished" podID="a0905e0e-781a-4cd4-b0c8-06354fb4ef52" containerID="71a7d3332925d4079822a9ae0ab5ddd963d303346b71d6e4b474e5866a77c52c" exitCode=0 Oct 08 14:40:49 crc kubenswrapper[4735]: I1008 14:40:49.163266 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g7z8p" event={"ID":"a0905e0e-781a-4cd4-b0c8-06354fb4ef52","Type":"ContainerDied","Data":"71a7d3332925d4079822a9ae0ab5ddd963d303346b71d6e4b474e5866a77c52c"} Oct 08 14:40:49 crc kubenswrapper[4735]: I1008 14:40:49.172108 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"036e2782-a3e8-4e6c-866a-adc689ba24dd","Type":"ContainerStarted","Data":"69ccb29e291a9bea03be010243a8cb4f516f8977328c5e9ad2805e4e9645b8ca"} Oct 08 14:40:50 crc kubenswrapper[4735]: I1008 14:40:50.186065 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g7z8p" event={"ID":"a0905e0e-781a-4cd4-b0c8-06354fb4ef52","Type":"ContainerStarted","Data":"5883150fcecc42f23ac5b6f3a77d4ebac9ac7ec711183b243e9cf5f0e4908af4"} Oct 08 14:40:50 crc kubenswrapper[4735]: I1008 14:40:50.188314 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/manila-scheduler-0" event={"ID":"036e2782-a3e8-4e6c-866a-adc689ba24dd","Type":"ContainerStarted","Data":"fa343253f037e026f53fbf786d1592194642e2824c87b8199e2187ba5d5ddd75"} Oct 08 14:40:50 crc kubenswrapper[4735]: I1008 14:40:50.223728 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-g7z8p" podStartSLOduration=5.422153208 podStartE2EDuration="16.223710078s" podCreationTimestamp="2025-10-08 14:40:34 +0000 UTC" firstStartedPulling="2025-10-08 14:40:38.970506476 +0000 UTC m=+6288.778811020" lastFinishedPulling="2025-10-08 14:40:49.772063336 +0000 UTC m=+6299.580367890" observedRunningTime="2025-10-08 14:40:50.206139329 +0000 UTC m=+6300.014443873" watchObservedRunningTime="2025-10-08 14:40:50.223710078 +0000 UTC m=+6300.032014632" Oct 08 14:40:50 crc kubenswrapper[4735]: I1008 14:40:50.232209 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=4.081996126 podStartE2EDuration="6.232191225s" podCreationTimestamp="2025-10-08 14:40:44 +0000 UTC" firstStartedPulling="2025-10-08 14:40:45.919140614 +0000 UTC m=+6295.727445158" lastFinishedPulling="2025-10-08 14:40:48.069335713 +0000 UTC m=+6297.877640257" observedRunningTime="2025-10-08 14:40:50.223121563 +0000 UTC m=+6300.031426107" watchObservedRunningTime="2025-10-08 14:40:50.232191225 +0000 UTC m=+6300.040495769" Oct 08 14:40:54 crc kubenswrapper[4735]: I1008 14:40:54.031194 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-g67m4"] Oct 08 14:40:54 crc kubenswrapper[4735]: I1008 14:40:54.044787 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-g67m4"] Oct 08 14:40:54 crc kubenswrapper[4735]: I1008 14:40:54.771874 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-69b4fbdf9f-ff9q7" Oct 08 14:40:54 crc kubenswrapper[4735]: I1008 14:40:54.798255 4735 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Oct 08 14:40:54 crc kubenswrapper[4735]: I1008 14:40:54.844617 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cfc4cdccc-plxwb"] Oct 08 14:40:54 crc kubenswrapper[4735]: I1008 14:40:54.844868 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6cfc4cdccc-plxwb" podUID="b01de70e-d419-4718-9626-b1b3936427ff" containerName="dnsmasq-dns" containerID="cri-o://676cc7221d5df50059f74df21ce11cc07f6e07a30a635e8ca4e138a35ecaf47b" gracePeriod=10 Oct 08 14:40:54 crc kubenswrapper[4735]: I1008 14:40:54.926950 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c01de476-2104-4081-a3b4-3bc6710d0d1a" path="/var/lib/kubelet/pods/c01de476-2104-4081-a3b4-3bc6710d0d1a/volumes" Oct 08 14:40:55 crc kubenswrapper[4735]: I1008 14:40:55.043009 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-g7z8p" Oct 08 14:40:55 crc kubenswrapper[4735]: I1008 14:40:55.044785 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-g7z8p" Oct 08 14:40:55 crc kubenswrapper[4735]: I1008 14:40:55.275606 4735 generic.go:334] "Generic (PLEG): container finished" podID="b01de70e-d419-4718-9626-b1b3936427ff" containerID="676cc7221d5df50059f74df21ce11cc07f6e07a30a635e8ca4e138a35ecaf47b" exitCode=0 Oct 08 14:40:55 crc kubenswrapper[4735]: I1008 14:40:55.275676 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cfc4cdccc-plxwb" event={"ID":"b01de70e-d419-4718-9626-b1b3936427ff","Type":"ContainerDied","Data":"676cc7221d5df50059f74df21ce11cc07f6e07a30a635e8ca4e138a35ecaf47b"} Oct 08 14:40:55 crc kubenswrapper[4735]: I1008 14:40:55.280760 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" 
event={"ID":"8fdd3521-1c60-4f39-8274-132b679755c8","Type":"ContainerStarted","Data":"93374e4db1d11191efc53b8180418b8b3cfeea1c838df9aa06b91c54b0ac50a5"} Oct 08 14:40:55 crc kubenswrapper[4735]: I1008 14:40:55.280857 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"8fdd3521-1c60-4f39-8274-132b679755c8","Type":"ContainerStarted","Data":"ee19edd68fb1a79955af5d139303df604919c3963f7b3a3a8db9696ebf83a567"} Oct 08 14:40:55 crc kubenswrapper[4735]: I1008 14:40:55.305859 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=3.079360041 podStartE2EDuration="11.305843613s" podCreationTimestamp="2025-10-08 14:40:44 +0000 UTC" firstStartedPulling="2025-10-08 14:40:45.970687462 +0000 UTC m=+6295.778992006" lastFinishedPulling="2025-10-08 14:40:54.197171034 +0000 UTC m=+6304.005475578" observedRunningTime="2025-10-08 14:40:55.300451855 +0000 UTC m=+6305.108756399" watchObservedRunningTime="2025-10-08 14:40:55.305843613 +0000 UTC m=+6305.114148157" Oct 08 14:40:55 crc kubenswrapper[4735]: I1008 14:40:55.500659 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6cfc4cdccc-plxwb" Oct 08 14:40:55 crc kubenswrapper[4735]: I1008 14:40:55.664928 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b01de70e-d419-4718-9626-b1b3936427ff-ovsdbserver-sb\") pod \"b01de70e-d419-4718-9626-b1b3936427ff\" (UID: \"b01de70e-d419-4718-9626-b1b3936427ff\") " Oct 08 14:40:55 crc kubenswrapper[4735]: I1008 14:40:55.665038 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b01de70e-d419-4718-9626-b1b3936427ff-config\") pod \"b01de70e-d419-4718-9626-b1b3936427ff\" (UID: \"b01de70e-d419-4718-9626-b1b3936427ff\") " Oct 08 14:40:55 crc kubenswrapper[4735]: I1008 14:40:55.665063 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b01de70e-d419-4718-9626-b1b3936427ff-dns-svc\") pod \"b01de70e-d419-4718-9626-b1b3936427ff\" (UID: \"b01de70e-d419-4718-9626-b1b3936427ff\") " Oct 08 14:40:55 crc kubenswrapper[4735]: I1008 14:40:55.665167 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdsrk\" (UniqueName: \"kubernetes.io/projected/b01de70e-d419-4718-9626-b1b3936427ff-kube-api-access-bdsrk\") pod \"b01de70e-d419-4718-9626-b1b3936427ff\" (UID: \"b01de70e-d419-4718-9626-b1b3936427ff\") " Oct 08 14:40:55 crc kubenswrapper[4735]: I1008 14:40:55.665298 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b01de70e-d419-4718-9626-b1b3936427ff-ovsdbserver-nb\") pod \"b01de70e-d419-4718-9626-b1b3936427ff\" (UID: \"b01de70e-d419-4718-9626-b1b3936427ff\") " Oct 08 14:40:55 crc kubenswrapper[4735]: I1008 14:40:55.674842 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/b01de70e-d419-4718-9626-b1b3936427ff-kube-api-access-bdsrk" (OuterVolumeSpecName: "kube-api-access-bdsrk") pod "b01de70e-d419-4718-9626-b1b3936427ff" (UID: "b01de70e-d419-4718-9626-b1b3936427ff"). InnerVolumeSpecName "kube-api-access-bdsrk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:40:55 crc kubenswrapper[4735]: I1008 14:40:55.724117 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b01de70e-d419-4718-9626-b1b3936427ff-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b01de70e-d419-4718-9626-b1b3936427ff" (UID: "b01de70e-d419-4718-9626-b1b3936427ff"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:40:55 crc kubenswrapper[4735]: I1008 14:40:55.739589 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b01de70e-d419-4718-9626-b1b3936427ff-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b01de70e-d419-4718-9626-b1b3936427ff" (UID: "b01de70e-d419-4718-9626-b1b3936427ff"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:40:55 crc kubenswrapper[4735]: I1008 14:40:55.757084 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b01de70e-d419-4718-9626-b1b3936427ff-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b01de70e-d419-4718-9626-b1b3936427ff" (UID: "b01de70e-d419-4718-9626-b1b3936427ff"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:40:55 crc kubenswrapper[4735]: I1008 14:40:55.769157 4735 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b01de70e-d419-4718-9626-b1b3936427ff-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 14:40:55 crc kubenswrapper[4735]: I1008 14:40:55.769197 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdsrk\" (UniqueName: \"kubernetes.io/projected/b01de70e-d419-4718-9626-b1b3936427ff-kube-api-access-bdsrk\") on node \"crc\" DevicePath \"\"" Oct 08 14:40:55 crc kubenswrapper[4735]: I1008 14:40:55.769212 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b01de70e-d419-4718-9626-b1b3936427ff-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 14:40:55 crc kubenswrapper[4735]: I1008 14:40:55.769225 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b01de70e-d419-4718-9626-b1b3936427ff-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 14:40:55 crc kubenswrapper[4735]: I1008 14:40:55.786016 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b01de70e-d419-4718-9626-b1b3936427ff-config" (OuterVolumeSpecName: "config") pod "b01de70e-d419-4718-9626-b1b3936427ff" (UID: "b01de70e-d419-4718-9626-b1b3936427ff"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:40:55 crc kubenswrapper[4735]: I1008 14:40:55.871535 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b01de70e-d419-4718-9626-b1b3936427ff-config\") on node \"crc\" DevicePath \"\"" Oct 08 14:40:56 crc kubenswrapper[4735]: I1008 14:40:56.105166 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-g7z8p" podUID="a0905e0e-781a-4cd4-b0c8-06354fb4ef52" containerName="registry-server" probeResult="failure" output=< Oct 08 14:40:56 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Oct 08 14:40:56 crc kubenswrapper[4735]: > Oct 08 14:40:56 crc kubenswrapper[4735]: I1008 14:40:56.291331 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cfc4cdccc-plxwb" event={"ID":"b01de70e-d419-4718-9626-b1b3936427ff","Type":"ContainerDied","Data":"ff18386b89e89e395c892a14945b94931e5d8c6aeeb5dad6b3173be669569b6e"} Oct 08 14:40:56 crc kubenswrapper[4735]: I1008 14:40:56.291416 4735 scope.go:117] "RemoveContainer" containerID="676cc7221d5df50059f74df21ce11cc07f6e07a30a635e8ca4e138a35ecaf47b" Oct 08 14:40:56 crc kubenswrapper[4735]: I1008 14:40:56.291415 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6cfc4cdccc-plxwb" Oct 08 14:40:56 crc kubenswrapper[4735]: I1008 14:40:56.329840 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cfc4cdccc-plxwb"] Oct 08 14:40:56 crc kubenswrapper[4735]: I1008 14:40:56.333156 4735 scope.go:117] "RemoveContainer" containerID="6a43159a1659f1fe9db82ae9abc184fa5c02bdc29a107d5958696fdcceb386a6" Oct 08 14:40:56 crc kubenswrapper[4735]: I1008 14:40:56.340701 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6cfc4cdccc-plxwb"] Oct 08 14:40:56 crc kubenswrapper[4735]: I1008 14:40:56.907976 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b01de70e-d419-4718-9626-b1b3936427ff" path="/var/lib/kubelet/pods/b01de70e-d419-4718-9626-b1b3936427ff/volumes" Oct 08 14:40:59 crc kubenswrapper[4735]: I1008 14:40:59.301544 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 14:40:59 crc kubenswrapper[4735]: I1008 14:40:59.302203 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="07c48feb-11be-4464-9082-b8358ffe86a2" containerName="ceilometer-central-agent" containerID="cri-o://1396a8349decae037cca9020394cda6d20fd37248e07e7b2f9995aaaeb6e2124" gracePeriod=30 Oct 08 14:40:59 crc kubenswrapper[4735]: I1008 14:40:59.302300 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="07c48feb-11be-4464-9082-b8358ffe86a2" containerName="proxy-httpd" containerID="cri-o://8e301d7027ea33447bb369464343aaad54ebe01e3763ee4bd73eeef5b69ad7ea" gracePeriod=30 Oct 08 14:40:59 crc kubenswrapper[4735]: I1008 14:40:59.302413 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="07c48feb-11be-4464-9082-b8358ffe86a2" containerName="sg-core" containerID="cri-o://a0b83df9c69eb9d2b39ea303b2935a8381ff2949f30e1ef2c051fb0d1dcac46c" 
gracePeriod=30 Oct 08 14:40:59 crc kubenswrapper[4735]: I1008 14:40:59.302516 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="07c48feb-11be-4464-9082-b8358ffe86a2" containerName="ceilometer-notification-agent" containerID="cri-o://ffa082810632475e36da4c1a417cea8600e3322cc675f9d54e63ce4cf7e2a320" gracePeriod=30 Oct 08 14:41:00 crc kubenswrapper[4735]: I1008 14:41:00.339463 4735 generic.go:334] "Generic (PLEG): container finished" podID="07c48feb-11be-4464-9082-b8358ffe86a2" containerID="8e301d7027ea33447bb369464343aaad54ebe01e3763ee4bd73eeef5b69ad7ea" exitCode=0 Oct 08 14:41:00 crc kubenswrapper[4735]: I1008 14:41:00.340110 4735 generic.go:334] "Generic (PLEG): container finished" podID="07c48feb-11be-4464-9082-b8358ffe86a2" containerID="a0b83df9c69eb9d2b39ea303b2935a8381ff2949f30e1ef2c051fb0d1dcac46c" exitCode=2 Oct 08 14:41:00 crc kubenswrapper[4735]: I1008 14:41:00.340127 4735 generic.go:334] "Generic (PLEG): container finished" podID="07c48feb-11be-4464-9082-b8358ffe86a2" containerID="1396a8349decae037cca9020394cda6d20fd37248e07e7b2f9995aaaeb6e2124" exitCode=0 Oct 08 14:41:00 crc kubenswrapper[4735]: I1008 14:41:00.339694 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"07c48feb-11be-4464-9082-b8358ffe86a2","Type":"ContainerDied","Data":"8e301d7027ea33447bb369464343aaad54ebe01e3763ee4bd73eeef5b69ad7ea"} Oct 08 14:41:00 crc kubenswrapper[4735]: I1008 14:41:00.340171 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"07c48feb-11be-4464-9082-b8358ffe86a2","Type":"ContainerDied","Data":"a0b83df9c69eb9d2b39ea303b2935a8381ff2949f30e1ef2c051fb0d1dcac46c"} Oct 08 14:41:00 crc kubenswrapper[4735]: I1008 14:41:00.340191 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"07c48feb-11be-4464-9082-b8358ffe86a2","Type":"ContainerDied","Data":"1396a8349decae037cca9020394cda6d20fd37248e07e7b2f9995aaaeb6e2124"} Oct 08 14:41:00 crc kubenswrapper[4735]: I1008 14:41:00.367869 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6cfc4cdccc-plxwb" podUID="b01de70e-d419-4718-9626-b1b3936427ff" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.78:5353: i/o timeout" Oct 08 14:41:00 crc kubenswrapper[4735]: I1008 14:41:00.902822 4735 scope.go:117] "RemoveContainer" containerID="0ef7ce1bb2f88774957446f907a55957b94dad288444fb6f22993a47ba31f8b0" Oct 08 14:41:01 crc kubenswrapper[4735]: I1008 14:41:01.354659 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" event={"ID":"2dfd0474-f64a-4d3b-9990-df182d079b10","Type":"ContainerStarted","Data":"fb4a9e663866841ddb83c52bf2f4a989212e0d3306474a6bb87200f186570090"} Oct 08 14:41:02 crc kubenswrapper[4735]: I1008 14:41:02.372679 4735 generic.go:334] "Generic (PLEG): container finished" podID="07c48feb-11be-4464-9082-b8358ffe86a2" containerID="ffa082810632475e36da4c1a417cea8600e3322cc675f9d54e63ce4cf7e2a320" exitCode=0 Oct 08 14:41:02 crc kubenswrapper[4735]: I1008 14:41:02.373012 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"07c48feb-11be-4464-9082-b8358ffe86a2","Type":"ContainerDied","Data":"ffa082810632475e36da4c1a417cea8600e3322cc675f9d54e63ce4cf7e2a320"} Oct 08 14:41:02 crc kubenswrapper[4735]: E1008 14:41:02.448554 4735 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod07c48feb_11be_4464_9082_b8358ffe86a2.slice/crio-conmon-ffa082810632475e36da4c1a417cea8600e3322cc675f9d54e63ce4cf7e2a320.scope\": RecentStats: unable to find data in memory cache]" Oct 08 14:41:03 crc kubenswrapper[4735]: I1008 
14:41:03.259330 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 14:41:03 crc kubenswrapper[4735]: I1008 14:41:03.393753 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"07c48feb-11be-4464-9082-b8358ffe86a2","Type":"ContainerDied","Data":"f9f81589875c153269d393a7047acd254031e21337b8c460658f4cea1c5ee039"} Oct 08 14:41:03 crc kubenswrapper[4735]: I1008 14:41:03.393804 4735 scope.go:117] "RemoveContainer" containerID="8e301d7027ea33447bb369464343aaad54ebe01e3763ee4bd73eeef5b69ad7ea" Oct 08 14:41:03 crc kubenswrapper[4735]: I1008 14:41:03.393945 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 14:41:03 crc kubenswrapper[4735]: I1008 14:41:03.423636 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07c48feb-11be-4464-9082-b8358ffe86a2-log-httpd\") pod \"07c48feb-11be-4464-9082-b8358ffe86a2\" (UID: \"07c48feb-11be-4464-9082-b8358ffe86a2\") " Oct 08 14:41:03 crc kubenswrapper[4735]: I1008 14:41:03.423727 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07c48feb-11be-4464-9082-b8358ffe86a2-scripts\") pod \"07c48feb-11be-4464-9082-b8358ffe86a2\" (UID: \"07c48feb-11be-4464-9082-b8358ffe86a2\") " Oct 08 14:41:03 crc kubenswrapper[4735]: I1008 14:41:03.423865 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07c48feb-11be-4464-9082-b8358ffe86a2-combined-ca-bundle\") pod \"07c48feb-11be-4464-9082-b8358ffe86a2\" (UID: \"07c48feb-11be-4464-9082-b8358ffe86a2\") " Oct 08 14:41:03 crc kubenswrapper[4735]: I1008 14:41:03.423919 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snnf2\" 
(UniqueName: \"kubernetes.io/projected/07c48feb-11be-4464-9082-b8358ffe86a2-kube-api-access-snnf2\") pod \"07c48feb-11be-4464-9082-b8358ffe86a2\" (UID: \"07c48feb-11be-4464-9082-b8358ffe86a2\") " Oct 08 14:41:03 crc kubenswrapper[4735]: I1008 14:41:03.423989 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07c48feb-11be-4464-9082-b8358ffe86a2-run-httpd\") pod \"07c48feb-11be-4464-9082-b8358ffe86a2\" (UID: \"07c48feb-11be-4464-9082-b8358ffe86a2\") " Oct 08 14:41:03 crc kubenswrapper[4735]: I1008 14:41:03.424026 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/07c48feb-11be-4464-9082-b8358ffe86a2-sg-core-conf-yaml\") pod \"07c48feb-11be-4464-9082-b8358ffe86a2\" (UID: \"07c48feb-11be-4464-9082-b8358ffe86a2\") " Oct 08 14:41:03 crc kubenswrapper[4735]: I1008 14:41:03.424095 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07c48feb-11be-4464-9082-b8358ffe86a2-config-data\") pod \"07c48feb-11be-4464-9082-b8358ffe86a2\" (UID: \"07c48feb-11be-4464-9082-b8358ffe86a2\") " Oct 08 14:41:03 crc kubenswrapper[4735]: I1008 14:41:03.425228 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07c48feb-11be-4464-9082-b8358ffe86a2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "07c48feb-11be-4464-9082-b8358ffe86a2" (UID: "07c48feb-11be-4464-9082-b8358ffe86a2"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 14:41:03 crc kubenswrapper[4735]: I1008 14:41:03.425729 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07c48feb-11be-4464-9082-b8358ffe86a2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "07c48feb-11be-4464-9082-b8358ffe86a2" (UID: "07c48feb-11be-4464-9082-b8358ffe86a2"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 14:41:03 crc kubenswrapper[4735]: I1008 14:41:03.430793 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07c48feb-11be-4464-9082-b8358ffe86a2-kube-api-access-snnf2" (OuterVolumeSpecName: "kube-api-access-snnf2") pod "07c48feb-11be-4464-9082-b8358ffe86a2" (UID: "07c48feb-11be-4464-9082-b8358ffe86a2"). InnerVolumeSpecName "kube-api-access-snnf2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:41:03 crc kubenswrapper[4735]: I1008 14:41:03.431351 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07c48feb-11be-4464-9082-b8358ffe86a2-scripts" (OuterVolumeSpecName: "scripts") pod "07c48feb-11be-4464-9082-b8358ffe86a2" (UID: "07c48feb-11be-4464-9082-b8358ffe86a2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:41:03 crc kubenswrapper[4735]: I1008 14:41:03.474396 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07c48feb-11be-4464-9082-b8358ffe86a2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "07c48feb-11be-4464-9082-b8358ffe86a2" (UID: "07c48feb-11be-4464-9082-b8358ffe86a2"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:41:03 crc kubenswrapper[4735]: I1008 14:41:03.506181 4735 scope.go:117] "RemoveContainer" containerID="a0b83df9c69eb9d2b39ea303b2935a8381ff2949f30e1ef2c051fb0d1dcac46c" Oct 08 14:41:03 crc kubenswrapper[4735]: I1008 14:41:03.526557 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07c48feb-11be-4464-9082-b8358ffe86a2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "07c48feb-11be-4464-9082-b8358ffe86a2" (UID: "07c48feb-11be-4464-9082-b8358ffe86a2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:41:03 crc kubenswrapper[4735]: I1008 14:41:03.527165 4735 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07c48feb-11be-4464-9082-b8358ffe86a2-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 14:41:03 crc kubenswrapper[4735]: I1008 14:41:03.527195 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07c48feb-11be-4464-9082-b8358ffe86a2-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 14:41:03 crc kubenswrapper[4735]: I1008 14:41:03.527204 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07c48feb-11be-4464-9082-b8358ffe86a2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 14:41:03 crc kubenswrapper[4735]: I1008 14:41:03.527217 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snnf2\" (UniqueName: \"kubernetes.io/projected/07c48feb-11be-4464-9082-b8358ffe86a2-kube-api-access-snnf2\") on node \"crc\" DevicePath \"\"" Oct 08 14:41:03 crc kubenswrapper[4735]: I1008 14:41:03.527225 4735 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07c48feb-11be-4464-9082-b8358ffe86a2-run-httpd\") on node \"crc\" DevicePath 
\"\"" Oct 08 14:41:03 crc kubenswrapper[4735]: I1008 14:41:03.527233 4735 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/07c48feb-11be-4464-9082-b8358ffe86a2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 08 14:41:03 crc kubenswrapper[4735]: I1008 14:41:03.563684 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07c48feb-11be-4464-9082-b8358ffe86a2-config-data" (OuterVolumeSpecName: "config-data") pod "07c48feb-11be-4464-9082-b8358ffe86a2" (UID: "07c48feb-11be-4464-9082-b8358ffe86a2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:41:03 crc kubenswrapper[4735]: I1008 14:41:03.629240 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07c48feb-11be-4464-9082-b8358ffe86a2-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 14:41:03 crc kubenswrapper[4735]: I1008 14:41:03.739144 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 14:41:03 crc kubenswrapper[4735]: I1008 14:41:03.750000 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 08 14:41:03 crc kubenswrapper[4735]: I1008 14:41:03.760194 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 08 14:41:03 crc kubenswrapper[4735]: E1008 14:41:03.760813 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b01de70e-d419-4718-9626-b1b3936427ff" containerName="dnsmasq-dns" Oct 08 14:41:03 crc kubenswrapper[4735]: I1008 14:41:03.760841 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="b01de70e-d419-4718-9626-b1b3936427ff" containerName="dnsmasq-dns" Oct 08 14:41:03 crc kubenswrapper[4735]: E1008 14:41:03.760879 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07c48feb-11be-4464-9082-b8358ffe86a2" containerName="proxy-httpd" Oct 
08 14:41:03 crc kubenswrapper[4735]: I1008 14:41:03.760891 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="07c48feb-11be-4464-9082-b8358ffe86a2" containerName="proxy-httpd" Oct 08 14:41:03 crc kubenswrapper[4735]: E1008 14:41:03.760915 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07c48feb-11be-4464-9082-b8358ffe86a2" containerName="sg-core" Oct 08 14:41:03 crc kubenswrapper[4735]: I1008 14:41:03.760923 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="07c48feb-11be-4464-9082-b8358ffe86a2" containerName="sg-core" Oct 08 14:41:03 crc kubenswrapper[4735]: E1008 14:41:03.760943 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b01de70e-d419-4718-9626-b1b3936427ff" containerName="init" Oct 08 14:41:03 crc kubenswrapper[4735]: I1008 14:41:03.760952 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="b01de70e-d419-4718-9626-b1b3936427ff" containerName="init" Oct 08 14:41:03 crc kubenswrapper[4735]: E1008 14:41:03.760987 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07c48feb-11be-4464-9082-b8358ffe86a2" containerName="ceilometer-notification-agent" Oct 08 14:41:03 crc kubenswrapper[4735]: I1008 14:41:03.760996 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="07c48feb-11be-4464-9082-b8358ffe86a2" containerName="ceilometer-notification-agent" Oct 08 14:41:03 crc kubenswrapper[4735]: E1008 14:41:03.761010 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07c48feb-11be-4464-9082-b8358ffe86a2" containerName="ceilometer-central-agent" Oct 08 14:41:03 crc kubenswrapper[4735]: I1008 14:41:03.761020 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="07c48feb-11be-4464-9082-b8358ffe86a2" containerName="ceilometer-central-agent" Oct 08 14:41:03 crc kubenswrapper[4735]: I1008 14:41:03.761270 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="07c48feb-11be-4464-9082-b8358ffe86a2" containerName="ceilometer-central-agent" Oct 08 
14:41:03 crc kubenswrapper[4735]: I1008 14:41:03.761293 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="07c48feb-11be-4464-9082-b8358ffe86a2" containerName="ceilometer-notification-agent" Oct 08 14:41:03 crc kubenswrapper[4735]: I1008 14:41:03.761307 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="b01de70e-d419-4718-9626-b1b3936427ff" containerName="dnsmasq-dns" Oct 08 14:41:03 crc kubenswrapper[4735]: I1008 14:41:03.761335 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="07c48feb-11be-4464-9082-b8358ffe86a2" containerName="proxy-httpd" Oct 08 14:41:03 crc kubenswrapper[4735]: I1008 14:41:03.761354 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="07c48feb-11be-4464-9082-b8358ffe86a2" containerName="sg-core" Oct 08 14:41:03 crc kubenswrapper[4735]: I1008 14:41:03.794240 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 14:41:03 crc kubenswrapper[4735]: I1008 14:41:03.797323 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 08 14:41:03 crc kubenswrapper[4735]: I1008 14:41:03.799532 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 08 14:41:03 crc kubenswrapper[4735]: I1008 14:41:03.803660 4735 scope.go:117] "RemoveContainer" containerID="ffa082810632475e36da4c1a417cea8600e3322cc675f9d54e63ce4cf7e2a320" Oct 08 14:41:03 crc kubenswrapper[4735]: I1008 14:41:03.823930 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 14:41:03 crc kubenswrapper[4735]: I1008 14:41:03.885762 4735 scope.go:117] "RemoveContainer" containerID="1396a8349decae037cca9020394cda6d20fd37248e07e7b2f9995aaaeb6e2124" Oct 08 14:41:03 crc kubenswrapper[4735]: I1008 14:41:03.934400 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/410f0f14-7078-4621-b8b0-6c507c16fe39-config-data\") pod \"ceilometer-0\" (UID: \"410f0f14-7078-4621-b8b0-6c507c16fe39\") " pod="openstack/ceilometer-0" Oct 08 14:41:03 crc kubenswrapper[4735]: I1008 14:41:03.934564 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/410f0f14-7078-4621-b8b0-6c507c16fe39-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"410f0f14-7078-4621-b8b0-6c507c16fe39\") " pod="openstack/ceilometer-0" Oct 08 14:41:03 crc kubenswrapper[4735]: I1008 14:41:03.934631 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/410f0f14-7078-4621-b8b0-6c507c16fe39-scripts\") pod \"ceilometer-0\" (UID: \"410f0f14-7078-4621-b8b0-6c507c16fe39\") " pod="openstack/ceilometer-0" Oct 08 14:41:03 crc kubenswrapper[4735]: I1008 14:41:03.934757 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2b29\" (UniqueName: \"kubernetes.io/projected/410f0f14-7078-4621-b8b0-6c507c16fe39-kube-api-access-v2b29\") pod \"ceilometer-0\" (UID: \"410f0f14-7078-4621-b8b0-6c507c16fe39\") " pod="openstack/ceilometer-0" Oct 08 14:41:03 crc kubenswrapper[4735]: I1008 14:41:03.935002 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/410f0f14-7078-4621-b8b0-6c507c16fe39-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"410f0f14-7078-4621-b8b0-6c507c16fe39\") " pod="openstack/ceilometer-0" Oct 08 14:41:03 crc kubenswrapper[4735]: I1008 14:41:03.935252 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/410f0f14-7078-4621-b8b0-6c507c16fe39-log-httpd\") pod \"ceilometer-0\" (UID: 
\"410f0f14-7078-4621-b8b0-6c507c16fe39\") " pod="openstack/ceilometer-0" Oct 08 14:41:03 crc kubenswrapper[4735]: I1008 14:41:03.935398 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/410f0f14-7078-4621-b8b0-6c507c16fe39-run-httpd\") pod \"ceilometer-0\" (UID: \"410f0f14-7078-4621-b8b0-6c507c16fe39\") " pod="openstack/ceilometer-0" Oct 08 14:41:04 crc kubenswrapper[4735]: I1008 14:41:04.037890 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2b29\" (UniqueName: \"kubernetes.io/projected/410f0f14-7078-4621-b8b0-6c507c16fe39-kube-api-access-v2b29\") pod \"ceilometer-0\" (UID: \"410f0f14-7078-4621-b8b0-6c507c16fe39\") " pod="openstack/ceilometer-0" Oct 08 14:41:04 crc kubenswrapper[4735]: I1008 14:41:04.038950 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/410f0f14-7078-4621-b8b0-6c507c16fe39-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"410f0f14-7078-4621-b8b0-6c507c16fe39\") " pod="openstack/ceilometer-0" Oct 08 14:41:04 crc kubenswrapper[4735]: I1008 14:41:04.039173 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/410f0f14-7078-4621-b8b0-6c507c16fe39-log-httpd\") pod \"ceilometer-0\" (UID: \"410f0f14-7078-4621-b8b0-6c507c16fe39\") " pod="openstack/ceilometer-0" Oct 08 14:41:04 crc kubenswrapper[4735]: I1008 14:41:04.039305 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/410f0f14-7078-4621-b8b0-6c507c16fe39-run-httpd\") pod \"ceilometer-0\" (UID: \"410f0f14-7078-4621-b8b0-6c507c16fe39\") " pod="openstack/ceilometer-0" Oct 08 14:41:04 crc kubenswrapper[4735]: I1008 14:41:04.039622 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/410f0f14-7078-4621-b8b0-6c507c16fe39-log-httpd\") pod \"ceilometer-0\" (UID: \"410f0f14-7078-4621-b8b0-6c507c16fe39\") " pod="openstack/ceilometer-0" Oct 08 14:41:04 crc kubenswrapper[4735]: I1008 14:41:04.039728 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/410f0f14-7078-4621-b8b0-6c507c16fe39-run-httpd\") pod \"ceilometer-0\" (UID: \"410f0f14-7078-4621-b8b0-6c507c16fe39\") " pod="openstack/ceilometer-0" Oct 08 14:41:04 crc kubenswrapper[4735]: I1008 14:41:04.040430 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/410f0f14-7078-4621-b8b0-6c507c16fe39-config-data\") pod \"ceilometer-0\" (UID: \"410f0f14-7078-4621-b8b0-6c507c16fe39\") " pod="openstack/ceilometer-0" Oct 08 14:41:04 crc kubenswrapper[4735]: I1008 14:41:04.040451 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/410f0f14-7078-4621-b8b0-6c507c16fe39-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"410f0f14-7078-4621-b8b0-6c507c16fe39\") " pod="openstack/ceilometer-0" Oct 08 14:41:04 crc kubenswrapper[4735]: I1008 14:41:04.040914 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/410f0f14-7078-4621-b8b0-6c507c16fe39-scripts\") pod \"ceilometer-0\" (UID: \"410f0f14-7078-4621-b8b0-6c507c16fe39\") " pod="openstack/ceilometer-0" Oct 08 14:41:04 crc kubenswrapper[4735]: I1008 14:41:04.044350 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/410f0f14-7078-4621-b8b0-6c507c16fe39-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"410f0f14-7078-4621-b8b0-6c507c16fe39\") " pod="openstack/ceilometer-0" Oct 08 14:41:04 crc kubenswrapper[4735]: I1008 
14:41:04.044812 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/410f0f14-7078-4621-b8b0-6c507c16fe39-config-data\") pod \"ceilometer-0\" (UID: \"410f0f14-7078-4621-b8b0-6c507c16fe39\") " pod="openstack/ceilometer-0" Oct 08 14:41:04 crc kubenswrapper[4735]: I1008 14:41:04.045844 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/410f0f14-7078-4621-b8b0-6c507c16fe39-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"410f0f14-7078-4621-b8b0-6c507c16fe39\") " pod="openstack/ceilometer-0" Oct 08 14:41:04 crc kubenswrapper[4735]: I1008 14:41:04.046237 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/410f0f14-7078-4621-b8b0-6c507c16fe39-scripts\") pod \"ceilometer-0\" (UID: \"410f0f14-7078-4621-b8b0-6c507c16fe39\") " pod="openstack/ceilometer-0" Oct 08 14:41:04 crc kubenswrapper[4735]: I1008 14:41:04.054913 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2b29\" (UniqueName: \"kubernetes.io/projected/410f0f14-7078-4621-b8b0-6c507c16fe39-kube-api-access-v2b29\") pod \"ceilometer-0\" (UID: \"410f0f14-7078-4621-b8b0-6c507c16fe39\") " pod="openstack/ceilometer-0" Oct 08 14:41:04 crc kubenswrapper[4735]: I1008 14:41:04.132265 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 14:41:04 crc kubenswrapper[4735]: I1008 14:41:04.716535 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Oct 08 14:41:04 crc kubenswrapper[4735]: I1008 14:41:04.777411 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 14:41:04 crc kubenswrapper[4735]: I1008 14:41:04.908924 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07c48feb-11be-4464-9082-b8358ffe86a2" path="/var/lib/kubelet/pods/07c48feb-11be-4464-9082-b8358ffe86a2/volumes" Oct 08 14:41:04 crc kubenswrapper[4735]: I1008 14:41:04.915182 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hrn4b"] Oct 08 14:41:04 crc kubenswrapper[4735]: I1008 14:41:04.918845 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hrn4b" Oct 08 14:41:04 crc kubenswrapper[4735]: I1008 14:41:04.936049 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hrn4b"] Oct 08 14:41:05 crc kubenswrapper[4735]: W1008 14:41:05.004660 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod410f0f14_7078_4621_b8b0_6c507c16fe39.slice/crio-af7dc6f65a7d01a05758c2b972f32b1aa1a38f2a37d4ffc5f799046cbfe3413d WatchSource:0}: Error finding container af7dc6f65a7d01a05758c2b972f32b1aa1a38f2a37d4ffc5f799046cbfe3413d: Status 404 returned error can't find the container with id af7dc6f65a7d01a05758c2b972f32b1aa1a38f2a37d4ffc5f799046cbfe3413d Oct 08 14:41:05 crc kubenswrapper[4735]: I1008 14:41:05.062516 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thnzq\" (UniqueName: \"kubernetes.io/projected/b2e797a4-a21f-4d7c-8e18-0a31f598d010-kube-api-access-thnzq\") pod 
\"certified-operators-hrn4b\" (UID: \"b2e797a4-a21f-4d7c-8e18-0a31f598d010\") " pod="openshift-marketplace/certified-operators-hrn4b" Oct 08 14:41:05 crc kubenswrapper[4735]: I1008 14:41:05.062714 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2e797a4-a21f-4d7c-8e18-0a31f598d010-catalog-content\") pod \"certified-operators-hrn4b\" (UID: \"b2e797a4-a21f-4d7c-8e18-0a31f598d010\") " pod="openshift-marketplace/certified-operators-hrn4b" Oct 08 14:41:05 crc kubenswrapper[4735]: I1008 14:41:05.063417 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2e797a4-a21f-4d7c-8e18-0a31f598d010-utilities\") pod \"certified-operators-hrn4b\" (UID: \"b2e797a4-a21f-4d7c-8e18-0a31f598d010\") " pod="openshift-marketplace/certified-operators-hrn4b" Oct 08 14:41:05 crc kubenswrapper[4735]: I1008 14:41:05.165440 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2e797a4-a21f-4d7c-8e18-0a31f598d010-utilities\") pod \"certified-operators-hrn4b\" (UID: \"b2e797a4-a21f-4d7c-8e18-0a31f598d010\") " pod="openshift-marketplace/certified-operators-hrn4b" Oct 08 14:41:05 crc kubenswrapper[4735]: I1008 14:41:05.165660 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thnzq\" (UniqueName: \"kubernetes.io/projected/b2e797a4-a21f-4d7c-8e18-0a31f598d010-kube-api-access-thnzq\") pod \"certified-operators-hrn4b\" (UID: \"b2e797a4-a21f-4d7c-8e18-0a31f598d010\") " pod="openshift-marketplace/certified-operators-hrn4b" Oct 08 14:41:05 crc kubenswrapper[4735]: I1008 14:41:05.165760 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/b2e797a4-a21f-4d7c-8e18-0a31f598d010-catalog-content\") pod \"certified-operators-hrn4b\" (UID: \"b2e797a4-a21f-4d7c-8e18-0a31f598d010\") " pod="openshift-marketplace/certified-operators-hrn4b" Oct 08 14:41:05 crc kubenswrapper[4735]: I1008 14:41:05.165926 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2e797a4-a21f-4d7c-8e18-0a31f598d010-utilities\") pod \"certified-operators-hrn4b\" (UID: \"b2e797a4-a21f-4d7c-8e18-0a31f598d010\") " pod="openshift-marketplace/certified-operators-hrn4b" Oct 08 14:41:05 crc kubenswrapper[4735]: I1008 14:41:05.166202 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2e797a4-a21f-4d7c-8e18-0a31f598d010-catalog-content\") pod \"certified-operators-hrn4b\" (UID: \"b2e797a4-a21f-4d7c-8e18-0a31f598d010\") " pod="openshift-marketplace/certified-operators-hrn4b" Oct 08 14:41:05 crc kubenswrapper[4735]: I1008 14:41:05.193671 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thnzq\" (UniqueName: \"kubernetes.io/projected/b2e797a4-a21f-4d7c-8e18-0a31f598d010-kube-api-access-thnzq\") pod \"certified-operators-hrn4b\" (UID: \"b2e797a4-a21f-4d7c-8e18-0a31f598d010\") " pod="openshift-marketplace/certified-operators-hrn4b" Oct 08 14:41:05 crc kubenswrapper[4735]: I1008 14:41:05.248495 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hrn4b" Oct 08 14:41:05 crc kubenswrapper[4735]: I1008 14:41:05.435427 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"410f0f14-7078-4621-b8b0-6c507c16fe39","Type":"ContainerStarted","Data":"af7dc6f65a7d01a05758c2b972f32b1aa1a38f2a37d4ffc5f799046cbfe3413d"} Oct 08 14:41:05 crc kubenswrapper[4735]: I1008 14:41:05.887026 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hrn4b"] Oct 08 14:41:05 crc kubenswrapper[4735]: W1008 14:41:05.887534 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2e797a4_a21f_4d7c_8e18_0a31f598d010.slice/crio-06b09c30d3094b98912bc2a7e960b740c2f7c8a0dc8a0ef362f24f2dde88c08e WatchSource:0}: Error finding container 06b09c30d3094b98912bc2a7e960b740c2f7c8a0dc8a0ef362f24f2dde88c08e: Status 404 returned error can't find the container with id 06b09c30d3094b98912bc2a7e960b740c2f7c8a0dc8a0ef362f24f2dde88c08e Oct 08 14:41:06 crc kubenswrapper[4735]: I1008 14:41:06.109317 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-g7z8p" podUID="a0905e0e-781a-4cd4-b0c8-06354fb4ef52" containerName="registry-server" probeResult="failure" output=< Oct 08 14:41:06 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Oct 08 14:41:06 crc kubenswrapper[4735]: > Oct 08 14:41:06 crc kubenswrapper[4735]: I1008 14:41:06.446709 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hrn4b" event={"ID":"b2e797a4-a21f-4d7c-8e18-0a31f598d010","Type":"ContainerStarted","Data":"479b2ae3c1f3c91b79f49271a92c429ac195503866be5aab8d70f78347381f19"} Oct 08 14:41:06 crc kubenswrapper[4735]: I1008 14:41:06.447049 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-hrn4b" event={"ID":"b2e797a4-a21f-4d7c-8e18-0a31f598d010","Type":"ContainerStarted","Data":"06b09c30d3094b98912bc2a7e960b740c2f7c8a0dc8a0ef362f24f2dde88c08e"} Oct 08 14:41:07 crc kubenswrapper[4735]: I1008 14:41:07.471036 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"410f0f14-7078-4621-b8b0-6c507c16fe39","Type":"ContainerStarted","Data":"8854ff726c7c86100c2d68f4cbff245aa3f085d34a519e877ce2cb0221f67920"} Oct 08 14:41:07 crc kubenswrapper[4735]: I1008 14:41:07.486203 4735 generic.go:334] "Generic (PLEG): container finished" podID="b2e797a4-a21f-4d7c-8e18-0a31f598d010" containerID="479b2ae3c1f3c91b79f49271a92c429ac195503866be5aab8d70f78347381f19" exitCode=0 Oct 08 14:41:07 crc kubenswrapper[4735]: I1008 14:41:07.486594 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hrn4b" event={"ID":"b2e797a4-a21f-4d7c-8e18-0a31f598d010","Type":"ContainerDied","Data":"479b2ae3c1f3c91b79f49271a92c429ac195503866be5aab8d70f78347381f19"} Oct 08 14:41:07 crc kubenswrapper[4735]: I1008 14:41:07.510276 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Oct 08 14:41:07 crc kubenswrapper[4735]: I1008 14:41:07.637855 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Oct 08 14:41:08 crc kubenswrapper[4735]: I1008 14:41:08.020399 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0" Oct 08 14:41:08 crc kubenswrapper[4735]: I1008 14:41:08.516234 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"410f0f14-7078-4621-b8b0-6c507c16fe39","Type":"ContainerStarted","Data":"d36f334d17b2f478c95aa8740203bf7aa24dafa6ebe67cb22c12a8bcb12b79ac"} Oct 08 14:41:10 crc kubenswrapper[4735]: I1008 14:41:10.538538 4735 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/ceilometer-0" event={"ID":"410f0f14-7078-4621-b8b0-6c507c16fe39","Type":"ContainerStarted","Data":"84e272f8d36b944c761d423a041e28ebc4ba62c944e4882a95b79068c9291f90"} Oct 08 14:41:10 crc kubenswrapper[4735]: I1008 14:41:10.546045 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hrn4b" event={"ID":"b2e797a4-a21f-4d7c-8e18-0a31f598d010","Type":"ContainerStarted","Data":"7cada8a1ba9223e482c94cd84f5940a94eafc9a0fb4bfdd77717f857e56d07a8"} Oct 08 14:41:12 crc kubenswrapper[4735]: I1008 14:41:12.096645 4735 scope.go:117] "RemoveContainer" containerID="486c1fcf4c55ca53b748a7ee39a449db3fc7865cbd5460f7007ac6d5048850bb" Oct 08 14:41:12 crc kubenswrapper[4735]: I1008 14:41:12.472310 4735 scope.go:117] "RemoveContainer" containerID="4483a05a5ed63a95ca13598dcd4f9c3259ff569835b4aec1abaff98c9d82a6f2" Oct 08 14:41:12 crc kubenswrapper[4735]: I1008 14:41:12.639582 4735 scope.go:117] "RemoveContainer" containerID="cf054f7ce53eabb98051e30c2b9837257155604b3a03de291a0866fea38c3cfa" Oct 08 14:41:14 crc kubenswrapper[4735]: I1008 14:41:14.595225 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"410f0f14-7078-4621-b8b0-6c507c16fe39","Type":"ContainerStarted","Data":"6381b0f7c0db830e6e743ca055195ec954aa17fc766ecd303aa19e18100bf8dc"} Oct 08 14:41:14 crc kubenswrapper[4735]: I1008 14:41:14.597425 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 08 14:41:14 crc kubenswrapper[4735]: I1008 14:41:14.653084 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.4445646070000002 podStartE2EDuration="11.653065009s" podCreationTimestamp="2025-10-08 14:41:03 +0000 UTC" firstStartedPulling="2025-10-08 14:41:05.007369435 +0000 UTC m=+6314.815673979" lastFinishedPulling="2025-10-08 14:41:13.215869837 +0000 UTC m=+6323.024174381" 
observedRunningTime="2025-10-08 14:41:14.626192891 +0000 UTC m=+6324.434497465" watchObservedRunningTime="2025-10-08 14:41:14.653065009 +0000 UTC m=+6324.461369553" Oct 08 14:41:15 crc kubenswrapper[4735]: I1008 14:41:15.607025 4735 generic.go:334] "Generic (PLEG): container finished" podID="b2e797a4-a21f-4d7c-8e18-0a31f598d010" containerID="7cada8a1ba9223e482c94cd84f5940a94eafc9a0fb4bfdd77717f857e56d07a8" exitCode=0 Oct 08 14:41:15 crc kubenswrapper[4735]: I1008 14:41:15.607111 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hrn4b" event={"ID":"b2e797a4-a21f-4d7c-8e18-0a31f598d010","Type":"ContainerDied","Data":"7cada8a1ba9223e482c94cd84f5940a94eafc9a0fb4bfdd77717f857e56d07a8"} Oct 08 14:41:16 crc kubenswrapper[4735]: I1008 14:41:16.116774 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-g7z8p" podUID="a0905e0e-781a-4cd4-b0c8-06354fb4ef52" containerName="registry-server" probeResult="failure" output=< Oct 08 14:41:16 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Oct 08 14:41:16 crc kubenswrapper[4735]: > Oct 08 14:41:17 crc kubenswrapper[4735]: I1008 14:41:17.632385 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hrn4b" event={"ID":"b2e797a4-a21f-4d7c-8e18-0a31f598d010","Type":"ContainerStarted","Data":"3da8363a6fa947f25aa1c4df873e449a649ad51ad2a7850e54ff64944158faa7"} Oct 08 14:41:17 crc kubenswrapper[4735]: I1008 14:41:17.656451 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hrn4b" podStartSLOduration=4.767120226 podStartE2EDuration="13.65643025s" podCreationTimestamp="2025-10-08 14:41:04 +0000 UTC" firstStartedPulling="2025-10-08 14:41:07.520238562 +0000 UTC m=+6317.328543106" lastFinishedPulling="2025-10-08 14:41:16.409548586 +0000 UTC m=+6326.217853130" observedRunningTime="2025-10-08 
14:41:17.650724994 +0000 UTC m=+6327.459029548" watchObservedRunningTime="2025-10-08 14:41:17.65643025 +0000 UTC m=+6327.464734794" Oct 08 14:41:25 crc kubenswrapper[4735]: I1008 14:41:25.106339 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-g7z8p" Oct 08 14:41:25 crc kubenswrapper[4735]: I1008 14:41:25.167293 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-g7z8p" Oct 08 14:41:25 crc kubenswrapper[4735]: I1008 14:41:25.249232 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hrn4b" Oct 08 14:41:25 crc kubenswrapper[4735]: I1008 14:41:25.249284 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hrn4b" Oct 08 14:41:25 crc kubenswrapper[4735]: I1008 14:41:25.348626 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-g7z8p"] Oct 08 14:41:26 crc kubenswrapper[4735]: I1008 14:41:26.298423 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-hrn4b" podUID="b2e797a4-a21f-4d7c-8e18-0a31f598d010" containerName="registry-server" probeResult="failure" output=< Oct 08 14:41:26 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Oct 08 14:41:26 crc kubenswrapper[4735]: > Oct 08 14:41:26 crc kubenswrapper[4735]: I1008 14:41:26.739375 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-g7z8p" podUID="a0905e0e-781a-4cd4-b0c8-06354fb4ef52" containerName="registry-server" containerID="cri-o://5883150fcecc42f23ac5b6f3a77d4ebac9ac7ec711183b243e9cf5f0e4908af4" gracePeriod=2 Oct 08 14:41:27 crc kubenswrapper[4735]: I1008 14:41:27.573214 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-g7z8p" Oct 08 14:41:27 crc kubenswrapper[4735]: I1008 14:41:27.695513 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0905e0e-781a-4cd4-b0c8-06354fb4ef52-utilities\") pod \"a0905e0e-781a-4cd4-b0c8-06354fb4ef52\" (UID: \"a0905e0e-781a-4cd4-b0c8-06354fb4ef52\") " Oct 08 14:41:27 crc kubenswrapper[4735]: I1008 14:41:27.695605 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0905e0e-781a-4cd4-b0c8-06354fb4ef52-catalog-content\") pod \"a0905e0e-781a-4cd4-b0c8-06354fb4ef52\" (UID: \"a0905e0e-781a-4cd4-b0c8-06354fb4ef52\") " Oct 08 14:41:27 crc kubenswrapper[4735]: I1008 14:41:27.695796 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdwhk\" (UniqueName: \"kubernetes.io/projected/a0905e0e-781a-4cd4-b0c8-06354fb4ef52-kube-api-access-fdwhk\") pod \"a0905e0e-781a-4cd4-b0c8-06354fb4ef52\" (UID: \"a0905e0e-781a-4cd4-b0c8-06354fb4ef52\") " Oct 08 14:41:27 crc kubenswrapper[4735]: I1008 14:41:27.696507 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0905e0e-781a-4cd4-b0c8-06354fb4ef52-utilities" (OuterVolumeSpecName: "utilities") pod "a0905e0e-781a-4cd4-b0c8-06354fb4ef52" (UID: "a0905e0e-781a-4cd4-b0c8-06354fb4ef52"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 14:41:27 crc kubenswrapper[4735]: I1008 14:41:27.717974 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0905e0e-781a-4cd4-b0c8-06354fb4ef52-kube-api-access-fdwhk" (OuterVolumeSpecName: "kube-api-access-fdwhk") pod "a0905e0e-781a-4cd4-b0c8-06354fb4ef52" (UID: "a0905e0e-781a-4cd4-b0c8-06354fb4ef52"). InnerVolumeSpecName "kube-api-access-fdwhk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:41:27 crc kubenswrapper[4735]: I1008 14:41:27.750855 4735 generic.go:334] "Generic (PLEG): container finished" podID="a0905e0e-781a-4cd4-b0c8-06354fb4ef52" containerID="5883150fcecc42f23ac5b6f3a77d4ebac9ac7ec711183b243e9cf5f0e4908af4" exitCode=0 Oct 08 14:41:27 crc kubenswrapper[4735]: I1008 14:41:27.750911 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g7z8p" event={"ID":"a0905e0e-781a-4cd4-b0c8-06354fb4ef52","Type":"ContainerDied","Data":"5883150fcecc42f23ac5b6f3a77d4ebac9ac7ec711183b243e9cf5f0e4908af4"} Oct 08 14:41:27 crc kubenswrapper[4735]: I1008 14:41:27.751043 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g7z8p" event={"ID":"a0905e0e-781a-4cd4-b0c8-06354fb4ef52","Type":"ContainerDied","Data":"466d28bfb3b3379670ad47af7fc773878cc8c5cdfcfadbb177bf29c98c44d0d2"} Oct 08 14:41:27 crc kubenswrapper[4735]: I1008 14:41:27.751071 4735 scope.go:117] "RemoveContainer" containerID="5883150fcecc42f23ac5b6f3a77d4ebac9ac7ec711183b243e9cf5f0e4908af4" Oct 08 14:41:27 crc kubenswrapper[4735]: I1008 14:41:27.751213 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g7z8p" Oct 08 14:41:27 crc kubenswrapper[4735]: I1008 14:41:27.778077 4735 scope.go:117] "RemoveContainer" containerID="71a7d3332925d4079822a9ae0ab5ddd963d303346b71d6e4b474e5866a77c52c" Oct 08 14:41:27 crc kubenswrapper[4735]: I1008 14:41:27.779687 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0905e0e-781a-4cd4-b0c8-06354fb4ef52-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a0905e0e-781a-4cd4-b0c8-06354fb4ef52" (UID: "a0905e0e-781a-4cd4-b0c8-06354fb4ef52"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 14:41:27 crc kubenswrapper[4735]: I1008 14:41:27.799675 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdwhk\" (UniqueName: \"kubernetes.io/projected/a0905e0e-781a-4cd4-b0c8-06354fb4ef52-kube-api-access-fdwhk\") on node \"crc\" DevicePath \"\"" Oct 08 14:41:27 crc kubenswrapper[4735]: I1008 14:41:27.799721 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0905e0e-781a-4cd4-b0c8-06354fb4ef52-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 14:41:27 crc kubenswrapper[4735]: I1008 14:41:27.799736 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0905e0e-781a-4cd4-b0c8-06354fb4ef52-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 14:41:27 crc kubenswrapper[4735]: I1008 14:41:27.801491 4735 scope.go:117] "RemoveContainer" containerID="2a027bd214cc151d5daac0a30092d651a2b8de103000503af634db7f19ad393e" Oct 08 14:41:27 crc kubenswrapper[4735]: I1008 14:41:27.842009 4735 scope.go:117] "RemoveContainer" containerID="5883150fcecc42f23ac5b6f3a77d4ebac9ac7ec711183b243e9cf5f0e4908af4" Oct 08 14:41:27 crc kubenswrapper[4735]: E1008 14:41:27.842454 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5883150fcecc42f23ac5b6f3a77d4ebac9ac7ec711183b243e9cf5f0e4908af4\": container with ID starting with 5883150fcecc42f23ac5b6f3a77d4ebac9ac7ec711183b243e9cf5f0e4908af4 not found: ID does not exist" containerID="5883150fcecc42f23ac5b6f3a77d4ebac9ac7ec711183b243e9cf5f0e4908af4" Oct 08 14:41:27 crc kubenswrapper[4735]: I1008 14:41:27.842616 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5883150fcecc42f23ac5b6f3a77d4ebac9ac7ec711183b243e9cf5f0e4908af4"} err="failed to get container status 
\"5883150fcecc42f23ac5b6f3a77d4ebac9ac7ec711183b243e9cf5f0e4908af4\": rpc error: code = NotFound desc = could not find container \"5883150fcecc42f23ac5b6f3a77d4ebac9ac7ec711183b243e9cf5f0e4908af4\": container with ID starting with 5883150fcecc42f23ac5b6f3a77d4ebac9ac7ec711183b243e9cf5f0e4908af4 not found: ID does not exist" Oct 08 14:41:27 crc kubenswrapper[4735]: I1008 14:41:27.842643 4735 scope.go:117] "RemoveContainer" containerID="71a7d3332925d4079822a9ae0ab5ddd963d303346b71d6e4b474e5866a77c52c" Oct 08 14:41:27 crc kubenswrapper[4735]: E1008 14:41:27.843011 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71a7d3332925d4079822a9ae0ab5ddd963d303346b71d6e4b474e5866a77c52c\": container with ID starting with 71a7d3332925d4079822a9ae0ab5ddd963d303346b71d6e4b474e5866a77c52c not found: ID does not exist" containerID="71a7d3332925d4079822a9ae0ab5ddd963d303346b71d6e4b474e5866a77c52c" Oct 08 14:41:27 crc kubenswrapper[4735]: I1008 14:41:27.843037 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71a7d3332925d4079822a9ae0ab5ddd963d303346b71d6e4b474e5866a77c52c"} err="failed to get container status \"71a7d3332925d4079822a9ae0ab5ddd963d303346b71d6e4b474e5866a77c52c\": rpc error: code = NotFound desc = could not find container \"71a7d3332925d4079822a9ae0ab5ddd963d303346b71d6e4b474e5866a77c52c\": container with ID starting with 71a7d3332925d4079822a9ae0ab5ddd963d303346b71d6e4b474e5866a77c52c not found: ID does not exist" Oct 08 14:41:27 crc kubenswrapper[4735]: I1008 14:41:27.843051 4735 scope.go:117] "RemoveContainer" containerID="2a027bd214cc151d5daac0a30092d651a2b8de103000503af634db7f19ad393e" Oct 08 14:41:27 crc kubenswrapper[4735]: E1008 14:41:27.843332 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"2a027bd214cc151d5daac0a30092d651a2b8de103000503af634db7f19ad393e\": container with ID starting with 2a027bd214cc151d5daac0a30092d651a2b8de103000503af634db7f19ad393e not found: ID does not exist" containerID="2a027bd214cc151d5daac0a30092d651a2b8de103000503af634db7f19ad393e" Oct 08 14:41:27 crc kubenswrapper[4735]: I1008 14:41:27.843355 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a027bd214cc151d5daac0a30092d651a2b8de103000503af634db7f19ad393e"} err="failed to get container status \"2a027bd214cc151d5daac0a30092d651a2b8de103000503af634db7f19ad393e\": rpc error: code = NotFound desc = could not find container \"2a027bd214cc151d5daac0a30092d651a2b8de103000503af634db7f19ad393e\": container with ID starting with 2a027bd214cc151d5daac0a30092d651a2b8de103000503af634db7f19ad393e not found: ID does not exist" Oct 08 14:41:28 crc kubenswrapper[4735]: I1008 14:41:28.097782 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-g7z8p"] Oct 08 14:41:28 crc kubenswrapper[4735]: I1008 14:41:28.109186 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-g7z8p"] Oct 08 14:41:28 crc kubenswrapper[4735]: I1008 14:41:28.906150 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0905e0e-781a-4cd4-b0c8-06354fb4ef52" path="/var/lib/kubelet/pods/a0905e0e-781a-4cd4-b0c8-06354fb4ef52/volumes" Oct 08 14:41:34 crc kubenswrapper[4735]: I1008 14:41:34.140167 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 08 14:41:35 crc kubenswrapper[4735]: I1008 14:41:35.300992 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hrn4b" Oct 08 14:41:35 crc kubenswrapper[4735]: I1008 14:41:35.353206 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/certified-operators-hrn4b" Oct 08 14:41:36 crc kubenswrapper[4735]: I1008 14:41:36.118737 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hrn4b"] Oct 08 14:41:36 crc kubenswrapper[4735]: I1008 14:41:36.848612 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hrn4b" podUID="b2e797a4-a21f-4d7c-8e18-0a31f598d010" containerName="registry-server" containerID="cri-o://3da8363a6fa947f25aa1c4df873e449a649ad51ad2a7850e54ff64944158faa7" gracePeriod=2 Oct 08 14:41:37 crc kubenswrapper[4735]: I1008 14:41:37.334399 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hrn4b" Oct 08 14:41:37 crc kubenswrapper[4735]: I1008 14:41:37.494221 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2e797a4-a21f-4d7c-8e18-0a31f598d010-utilities\") pod \"b2e797a4-a21f-4d7c-8e18-0a31f598d010\" (UID: \"b2e797a4-a21f-4d7c-8e18-0a31f598d010\") " Oct 08 14:41:37 crc kubenswrapper[4735]: I1008 14:41:37.494689 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2e797a4-a21f-4d7c-8e18-0a31f598d010-catalog-content\") pod \"b2e797a4-a21f-4d7c-8e18-0a31f598d010\" (UID: \"b2e797a4-a21f-4d7c-8e18-0a31f598d010\") " Oct 08 14:41:37 crc kubenswrapper[4735]: I1008 14:41:37.494720 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thnzq\" (UniqueName: \"kubernetes.io/projected/b2e797a4-a21f-4d7c-8e18-0a31f598d010-kube-api-access-thnzq\") pod \"b2e797a4-a21f-4d7c-8e18-0a31f598d010\" (UID: \"b2e797a4-a21f-4d7c-8e18-0a31f598d010\") " Oct 08 14:41:37 crc kubenswrapper[4735]: I1008 14:41:37.495241 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/b2e797a4-a21f-4d7c-8e18-0a31f598d010-utilities" (OuterVolumeSpecName: "utilities") pod "b2e797a4-a21f-4d7c-8e18-0a31f598d010" (UID: "b2e797a4-a21f-4d7c-8e18-0a31f598d010"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 14:41:37 crc kubenswrapper[4735]: I1008 14:41:37.499541 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2e797a4-a21f-4d7c-8e18-0a31f598d010-kube-api-access-thnzq" (OuterVolumeSpecName: "kube-api-access-thnzq") pod "b2e797a4-a21f-4d7c-8e18-0a31f598d010" (UID: "b2e797a4-a21f-4d7c-8e18-0a31f598d010"). InnerVolumeSpecName "kube-api-access-thnzq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:41:37 crc kubenswrapper[4735]: I1008 14:41:37.541845 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2e797a4-a21f-4d7c-8e18-0a31f598d010-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b2e797a4-a21f-4d7c-8e18-0a31f598d010" (UID: "b2e797a4-a21f-4d7c-8e18-0a31f598d010"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 14:41:37 crc kubenswrapper[4735]: I1008 14:41:37.597129 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2e797a4-a21f-4d7c-8e18-0a31f598d010-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 14:41:37 crc kubenswrapper[4735]: I1008 14:41:37.597167 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2e797a4-a21f-4d7c-8e18-0a31f598d010-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 14:41:37 crc kubenswrapper[4735]: I1008 14:41:37.597179 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thnzq\" (UniqueName: \"kubernetes.io/projected/b2e797a4-a21f-4d7c-8e18-0a31f598d010-kube-api-access-thnzq\") on node \"crc\" DevicePath \"\"" Oct 08 14:41:37 crc kubenswrapper[4735]: I1008 14:41:37.861412 4735 generic.go:334] "Generic (PLEG): container finished" podID="b2e797a4-a21f-4d7c-8e18-0a31f598d010" containerID="3da8363a6fa947f25aa1c4df873e449a649ad51ad2a7850e54ff64944158faa7" exitCode=0 Oct 08 14:41:37 crc kubenswrapper[4735]: I1008 14:41:37.861459 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hrn4b" event={"ID":"b2e797a4-a21f-4d7c-8e18-0a31f598d010","Type":"ContainerDied","Data":"3da8363a6fa947f25aa1c4df873e449a649ad51ad2a7850e54ff64944158faa7"} Oct 08 14:41:37 crc kubenswrapper[4735]: I1008 14:41:37.861501 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hrn4b" event={"ID":"b2e797a4-a21f-4d7c-8e18-0a31f598d010","Type":"ContainerDied","Data":"06b09c30d3094b98912bc2a7e960b740c2f7c8a0dc8a0ef362f24f2dde88c08e"} Oct 08 14:41:37 crc kubenswrapper[4735]: I1008 14:41:37.861522 4735 scope.go:117] "RemoveContainer" containerID="3da8363a6fa947f25aa1c4df873e449a649ad51ad2a7850e54ff64944158faa7" Oct 08 14:41:37 crc kubenswrapper[4735]: I1008 
14:41:37.861724 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hrn4b" Oct 08 14:41:37 crc kubenswrapper[4735]: I1008 14:41:37.892035 4735 scope.go:117] "RemoveContainer" containerID="7cada8a1ba9223e482c94cd84f5940a94eafc9a0fb4bfdd77717f857e56d07a8" Oct 08 14:41:37 crc kubenswrapper[4735]: I1008 14:41:37.892732 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hrn4b"] Oct 08 14:41:37 crc kubenswrapper[4735]: I1008 14:41:37.901837 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hrn4b"] Oct 08 14:41:37 crc kubenswrapper[4735]: I1008 14:41:37.914719 4735 scope.go:117] "RemoveContainer" containerID="479b2ae3c1f3c91b79f49271a92c429ac195503866be5aab8d70f78347381f19" Oct 08 14:41:37 crc kubenswrapper[4735]: I1008 14:41:37.956994 4735 scope.go:117] "RemoveContainer" containerID="3da8363a6fa947f25aa1c4df873e449a649ad51ad2a7850e54ff64944158faa7" Oct 08 14:41:37 crc kubenswrapper[4735]: E1008 14:41:37.957543 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3da8363a6fa947f25aa1c4df873e449a649ad51ad2a7850e54ff64944158faa7\": container with ID starting with 3da8363a6fa947f25aa1c4df873e449a649ad51ad2a7850e54ff64944158faa7 not found: ID does not exist" containerID="3da8363a6fa947f25aa1c4df873e449a649ad51ad2a7850e54ff64944158faa7" Oct 08 14:41:37 crc kubenswrapper[4735]: I1008 14:41:37.957603 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3da8363a6fa947f25aa1c4df873e449a649ad51ad2a7850e54ff64944158faa7"} err="failed to get container status \"3da8363a6fa947f25aa1c4df873e449a649ad51ad2a7850e54ff64944158faa7\": rpc error: code = NotFound desc = could not find container \"3da8363a6fa947f25aa1c4df873e449a649ad51ad2a7850e54ff64944158faa7\": container with ID starting with 
3da8363a6fa947f25aa1c4df873e449a649ad51ad2a7850e54ff64944158faa7 not found: ID does not exist" Oct 08 14:41:37 crc kubenswrapper[4735]: I1008 14:41:37.957635 4735 scope.go:117] "RemoveContainer" containerID="7cada8a1ba9223e482c94cd84f5940a94eafc9a0fb4bfdd77717f857e56d07a8" Oct 08 14:41:37 crc kubenswrapper[4735]: E1008 14:41:37.958195 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cada8a1ba9223e482c94cd84f5940a94eafc9a0fb4bfdd77717f857e56d07a8\": container with ID starting with 7cada8a1ba9223e482c94cd84f5940a94eafc9a0fb4bfdd77717f857e56d07a8 not found: ID does not exist" containerID="7cada8a1ba9223e482c94cd84f5940a94eafc9a0fb4bfdd77717f857e56d07a8" Oct 08 14:41:37 crc kubenswrapper[4735]: I1008 14:41:37.958227 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cada8a1ba9223e482c94cd84f5940a94eafc9a0fb4bfdd77717f857e56d07a8"} err="failed to get container status \"7cada8a1ba9223e482c94cd84f5940a94eafc9a0fb4bfdd77717f857e56d07a8\": rpc error: code = NotFound desc = could not find container \"7cada8a1ba9223e482c94cd84f5940a94eafc9a0fb4bfdd77717f857e56d07a8\": container with ID starting with 7cada8a1ba9223e482c94cd84f5940a94eafc9a0fb4bfdd77717f857e56d07a8 not found: ID does not exist" Oct 08 14:41:37 crc kubenswrapper[4735]: I1008 14:41:37.958253 4735 scope.go:117] "RemoveContainer" containerID="479b2ae3c1f3c91b79f49271a92c429ac195503866be5aab8d70f78347381f19" Oct 08 14:41:37 crc kubenswrapper[4735]: E1008 14:41:37.958490 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"479b2ae3c1f3c91b79f49271a92c429ac195503866be5aab8d70f78347381f19\": container with ID starting with 479b2ae3c1f3c91b79f49271a92c429ac195503866be5aab8d70f78347381f19 not found: ID does not exist" containerID="479b2ae3c1f3c91b79f49271a92c429ac195503866be5aab8d70f78347381f19" Oct 08 14:41:37 crc 
kubenswrapper[4735]: I1008 14:41:37.958514 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"479b2ae3c1f3c91b79f49271a92c429ac195503866be5aab8d70f78347381f19"} err="failed to get container status \"479b2ae3c1f3c91b79f49271a92c429ac195503866be5aab8d70f78347381f19\": rpc error: code = NotFound desc = could not find container \"479b2ae3c1f3c91b79f49271a92c429ac195503866be5aab8d70f78347381f19\": container with ID starting with 479b2ae3c1f3c91b79f49271a92c429ac195503866be5aab8d70f78347381f19 not found: ID does not exist" Oct 08 14:41:38 crc kubenswrapper[4735]: I1008 14:41:38.912041 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2e797a4-a21f-4d7c-8e18-0a31f598d010" path="/var/lib/kubelet/pods/b2e797a4-a21f-4d7c-8e18-0a31f598d010/volumes" Oct 08 14:41:59 crc kubenswrapper[4735]: I1008 14:41:59.314822 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd999549-2vdhk"] Oct 08 14:41:59 crc kubenswrapper[4735]: E1008 14:41:59.315839 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2e797a4-a21f-4d7c-8e18-0a31f598d010" containerName="extract-content" Oct 08 14:41:59 crc kubenswrapper[4735]: I1008 14:41:59.315926 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2e797a4-a21f-4d7c-8e18-0a31f598d010" containerName="extract-content" Oct 08 14:41:59 crc kubenswrapper[4735]: E1008 14:41:59.315943 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2e797a4-a21f-4d7c-8e18-0a31f598d010" containerName="registry-server" Oct 08 14:41:59 crc kubenswrapper[4735]: I1008 14:41:59.315949 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2e797a4-a21f-4d7c-8e18-0a31f598d010" containerName="registry-server" Oct 08 14:41:59 crc kubenswrapper[4735]: E1008 14:41:59.315961 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0905e0e-781a-4cd4-b0c8-06354fb4ef52" containerName="extract-utilities" Oct 08 
14:41:59 crc kubenswrapper[4735]: I1008 14:41:59.315967 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0905e0e-781a-4cd4-b0c8-06354fb4ef52" containerName="extract-utilities" Oct 08 14:41:59 crc kubenswrapper[4735]: E1008 14:41:59.315992 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2e797a4-a21f-4d7c-8e18-0a31f598d010" containerName="extract-utilities" Oct 08 14:41:59 crc kubenswrapper[4735]: I1008 14:41:59.315997 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2e797a4-a21f-4d7c-8e18-0a31f598d010" containerName="extract-utilities" Oct 08 14:41:59 crc kubenswrapper[4735]: E1008 14:41:59.316019 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0905e0e-781a-4cd4-b0c8-06354fb4ef52" containerName="extract-content" Oct 08 14:41:59 crc kubenswrapper[4735]: I1008 14:41:59.316024 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0905e0e-781a-4cd4-b0c8-06354fb4ef52" containerName="extract-content" Oct 08 14:41:59 crc kubenswrapper[4735]: E1008 14:41:59.316040 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0905e0e-781a-4cd4-b0c8-06354fb4ef52" containerName="registry-server" Oct 08 14:41:59 crc kubenswrapper[4735]: I1008 14:41:59.316045 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0905e0e-781a-4cd4-b0c8-06354fb4ef52" containerName="registry-server" Oct 08 14:41:59 crc kubenswrapper[4735]: I1008 14:41:59.316275 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2e797a4-a21f-4d7c-8e18-0a31f598d010" containerName="registry-server" Oct 08 14:41:59 crc kubenswrapper[4735]: I1008 14:41:59.316309 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0905e0e-781a-4cd4-b0c8-06354fb4ef52" containerName="registry-server" Oct 08 14:41:59 crc kubenswrapper[4735]: I1008 14:41:59.317863 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd999549-2vdhk" Oct 08 14:41:59 crc kubenswrapper[4735]: I1008 14:41:59.321199 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1" Oct 08 14:41:59 crc kubenswrapper[4735]: I1008 14:41:59.336697 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd999549-2vdhk"] Oct 08 14:41:59 crc kubenswrapper[4735]: I1008 14:41:59.388225 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/315103cc-5644-48ae-9409-9109d4c3d092-dns-svc\") pod \"dnsmasq-dns-78dd999549-2vdhk\" (UID: \"315103cc-5644-48ae-9409-9109d4c3d092\") " pod="openstack/dnsmasq-dns-78dd999549-2vdhk" Oct 08 14:41:59 crc kubenswrapper[4735]: I1008 14:41:59.388743 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/315103cc-5644-48ae-9409-9109d4c3d092-ovsdbserver-sb\") pod \"dnsmasq-dns-78dd999549-2vdhk\" (UID: \"315103cc-5644-48ae-9409-9109d4c3d092\") " pod="openstack/dnsmasq-dns-78dd999549-2vdhk" Oct 08 14:41:59 crc kubenswrapper[4735]: I1008 14:41:59.388955 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/315103cc-5644-48ae-9409-9109d4c3d092-ovsdbserver-nb\") pod \"dnsmasq-dns-78dd999549-2vdhk\" (UID: \"315103cc-5644-48ae-9409-9109d4c3d092\") " pod="openstack/dnsmasq-dns-78dd999549-2vdhk" Oct 08 14:41:59 crc kubenswrapper[4735]: I1008 14:41:59.389339 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cz26d\" (UniqueName: \"kubernetes.io/projected/315103cc-5644-48ae-9409-9109d4c3d092-kube-api-access-cz26d\") pod \"dnsmasq-dns-78dd999549-2vdhk\" (UID: \"315103cc-5644-48ae-9409-9109d4c3d092\") " 
pod="openstack/dnsmasq-dns-78dd999549-2vdhk" Oct 08 14:41:59 crc kubenswrapper[4735]: I1008 14:41:59.389510 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/315103cc-5644-48ae-9409-9109d4c3d092-openstack-cell1\") pod \"dnsmasq-dns-78dd999549-2vdhk\" (UID: \"315103cc-5644-48ae-9409-9109d4c3d092\") " pod="openstack/dnsmasq-dns-78dd999549-2vdhk" Oct 08 14:41:59 crc kubenswrapper[4735]: I1008 14:41:59.389635 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/315103cc-5644-48ae-9409-9109d4c3d092-config\") pod \"dnsmasq-dns-78dd999549-2vdhk\" (UID: \"315103cc-5644-48ae-9409-9109d4c3d092\") " pod="openstack/dnsmasq-dns-78dd999549-2vdhk" Oct 08 14:41:59 crc kubenswrapper[4735]: I1008 14:41:59.492125 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/315103cc-5644-48ae-9409-9109d4c3d092-dns-svc\") pod \"dnsmasq-dns-78dd999549-2vdhk\" (UID: \"315103cc-5644-48ae-9409-9109d4c3d092\") " pod="openstack/dnsmasq-dns-78dd999549-2vdhk" Oct 08 14:41:59 crc kubenswrapper[4735]: I1008 14:41:59.492221 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/315103cc-5644-48ae-9409-9109d4c3d092-ovsdbserver-sb\") pod \"dnsmasq-dns-78dd999549-2vdhk\" (UID: \"315103cc-5644-48ae-9409-9109d4c3d092\") " pod="openstack/dnsmasq-dns-78dd999549-2vdhk" Oct 08 14:41:59 crc kubenswrapper[4735]: I1008 14:41:59.492257 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/315103cc-5644-48ae-9409-9109d4c3d092-ovsdbserver-nb\") pod \"dnsmasq-dns-78dd999549-2vdhk\" (UID: \"315103cc-5644-48ae-9409-9109d4c3d092\") " pod="openstack/dnsmasq-dns-78dd999549-2vdhk" 
Oct 08 14:41:59 crc kubenswrapper[4735]: I1008 14:41:59.492285 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cz26d\" (UniqueName: \"kubernetes.io/projected/315103cc-5644-48ae-9409-9109d4c3d092-kube-api-access-cz26d\") pod \"dnsmasq-dns-78dd999549-2vdhk\" (UID: \"315103cc-5644-48ae-9409-9109d4c3d092\") " pod="openstack/dnsmasq-dns-78dd999549-2vdhk" Oct 08 14:41:59 crc kubenswrapper[4735]: I1008 14:41:59.492311 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/315103cc-5644-48ae-9409-9109d4c3d092-openstack-cell1\") pod \"dnsmasq-dns-78dd999549-2vdhk\" (UID: \"315103cc-5644-48ae-9409-9109d4c3d092\") " pod="openstack/dnsmasq-dns-78dd999549-2vdhk" Oct 08 14:41:59 crc kubenswrapper[4735]: I1008 14:41:59.492332 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/315103cc-5644-48ae-9409-9109d4c3d092-config\") pod \"dnsmasq-dns-78dd999549-2vdhk\" (UID: \"315103cc-5644-48ae-9409-9109d4c3d092\") " pod="openstack/dnsmasq-dns-78dd999549-2vdhk" Oct 08 14:41:59 crc kubenswrapper[4735]: I1008 14:41:59.493356 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/315103cc-5644-48ae-9409-9109d4c3d092-config\") pod \"dnsmasq-dns-78dd999549-2vdhk\" (UID: \"315103cc-5644-48ae-9409-9109d4c3d092\") " pod="openstack/dnsmasq-dns-78dd999549-2vdhk" Oct 08 14:41:59 crc kubenswrapper[4735]: I1008 14:41:59.494202 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/315103cc-5644-48ae-9409-9109d4c3d092-dns-svc\") pod \"dnsmasq-dns-78dd999549-2vdhk\" (UID: \"315103cc-5644-48ae-9409-9109d4c3d092\") " pod="openstack/dnsmasq-dns-78dd999549-2vdhk" Oct 08 14:41:59 crc kubenswrapper[4735]: I1008 14:41:59.494856 4735 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/315103cc-5644-48ae-9409-9109d4c3d092-ovsdbserver-nb\") pod \"dnsmasq-dns-78dd999549-2vdhk\" (UID: \"315103cc-5644-48ae-9409-9109d4c3d092\") " pod="openstack/dnsmasq-dns-78dd999549-2vdhk" Oct 08 14:41:59 crc kubenswrapper[4735]: I1008 14:41:59.495248 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/315103cc-5644-48ae-9409-9109d4c3d092-openstack-cell1\") pod \"dnsmasq-dns-78dd999549-2vdhk\" (UID: \"315103cc-5644-48ae-9409-9109d4c3d092\") " pod="openstack/dnsmasq-dns-78dd999549-2vdhk" Oct 08 14:41:59 crc kubenswrapper[4735]: I1008 14:41:59.495666 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/315103cc-5644-48ae-9409-9109d4c3d092-ovsdbserver-sb\") pod \"dnsmasq-dns-78dd999549-2vdhk\" (UID: \"315103cc-5644-48ae-9409-9109d4c3d092\") " pod="openstack/dnsmasq-dns-78dd999549-2vdhk" Oct 08 14:41:59 crc kubenswrapper[4735]: I1008 14:41:59.515108 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cz26d\" (UniqueName: \"kubernetes.io/projected/315103cc-5644-48ae-9409-9109d4c3d092-kube-api-access-cz26d\") pod \"dnsmasq-dns-78dd999549-2vdhk\" (UID: \"315103cc-5644-48ae-9409-9109d4c3d092\") " pod="openstack/dnsmasq-dns-78dd999549-2vdhk" Oct 08 14:41:59 crc kubenswrapper[4735]: I1008 14:41:59.641459 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd999549-2vdhk" Oct 08 14:42:00 crc kubenswrapper[4735]: I1008 14:42:00.188689 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd999549-2vdhk"] Oct 08 14:42:01 crc kubenswrapper[4735]: I1008 14:42:01.099913 4735 generic.go:334] "Generic (PLEG): container finished" podID="315103cc-5644-48ae-9409-9109d4c3d092" containerID="a464c5435aa556c8b5cb409bd40168bbc1dcc1e2b99d6713b75c4eb6261ba9ec" exitCode=0 Oct 08 14:42:01 crc kubenswrapper[4735]: I1008 14:42:01.100008 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd999549-2vdhk" event={"ID":"315103cc-5644-48ae-9409-9109d4c3d092","Type":"ContainerDied","Data":"a464c5435aa556c8b5cb409bd40168bbc1dcc1e2b99d6713b75c4eb6261ba9ec"} Oct 08 14:42:01 crc kubenswrapper[4735]: I1008 14:42:01.104589 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd999549-2vdhk" event={"ID":"315103cc-5644-48ae-9409-9109d4c3d092","Type":"ContainerStarted","Data":"5ecdaa83e46529e2cae16ff6cb5c835c94983c26fed9369a764cd92591bd788a"} Oct 08 14:42:02 crc kubenswrapper[4735]: I1008 14:42:02.116354 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd999549-2vdhk" event={"ID":"315103cc-5644-48ae-9409-9109d4c3d092","Type":"ContainerStarted","Data":"bf947012121cd178b1a4dcc2cbbfb5d1fc3c60ad9eef9e09276e525452a3afe0"} Oct 08 14:42:02 crc kubenswrapper[4735]: I1008 14:42:02.116696 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-78dd999549-2vdhk" Oct 08 14:42:02 crc kubenswrapper[4735]: I1008 14:42:02.142298 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-78dd999549-2vdhk" podStartSLOduration=3.142271937 podStartE2EDuration="3.142271937s" podCreationTimestamp="2025-10-08 14:41:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:42:02.133707248 +0000 UTC m=+6371.942011812" watchObservedRunningTime="2025-10-08 14:42:02.142271937 +0000 UTC m=+6371.950576491" Oct 08 14:42:09 crc kubenswrapper[4735]: I1008 14:42:09.642642 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-78dd999549-2vdhk" Oct 08 14:42:09 crc kubenswrapper[4735]: I1008 14:42:09.700553 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69b4fbdf9f-ff9q7"] Oct 08 14:42:09 crc kubenswrapper[4735]: I1008 14:42:09.700821 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-69b4fbdf9f-ff9q7" podUID="53d57537-1158-4e9f-bf5f-bfa4976bdd83" containerName="dnsmasq-dns" containerID="cri-o://8d33c9dad9ce07363b3a0297a4a5005bccb86e5e975526dd737dfa2ef3900470" gracePeriod=10 Oct 08 14:42:09 crc kubenswrapper[4735]: I1008 14:42:09.771337 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-69b4fbdf9f-ff9q7" podUID="53d57537-1158-4e9f-bf5f-bfa4976bdd83" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.142:5353: connect: connection refused" Oct 08 14:42:09 crc kubenswrapper[4735]: I1008 14:42:09.856322 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d6b45bb5c-54t6j"] Oct 08 14:42:09 crc kubenswrapper[4735]: I1008 14:42:09.859908 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d6b45bb5c-54t6j" Oct 08 14:42:09 crc kubenswrapper[4735]: I1008 14:42:09.874242 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d6b45bb5c-54t6j"] Oct 08 14:42:10 crc kubenswrapper[4735]: I1008 14:42:10.021410 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/725b3e41-8e45-4284-b107-c0394209df22-config\") pod \"dnsmasq-dns-d6b45bb5c-54t6j\" (UID: \"725b3e41-8e45-4284-b107-c0394209df22\") " pod="openstack/dnsmasq-dns-d6b45bb5c-54t6j" Oct 08 14:42:10 crc kubenswrapper[4735]: I1008 14:42:10.021554 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/725b3e41-8e45-4284-b107-c0394209df22-dns-svc\") pod \"dnsmasq-dns-d6b45bb5c-54t6j\" (UID: \"725b3e41-8e45-4284-b107-c0394209df22\") " pod="openstack/dnsmasq-dns-d6b45bb5c-54t6j" Oct 08 14:42:10 crc kubenswrapper[4735]: I1008 14:42:10.024384 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6d8th\" (UniqueName: \"kubernetes.io/projected/725b3e41-8e45-4284-b107-c0394209df22-kube-api-access-6d8th\") pod \"dnsmasq-dns-d6b45bb5c-54t6j\" (UID: \"725b3e41-8e45-4284-b107-c0394209df22\") " pod="openstack/dnsmasq-dns-d6b45bb5c-54t6j" Oct 08 14:42:10 crc kubenswrapper[4735]: I1008 14:42:10.025267 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/725b3e41-8e45-4284-b107-c0394209df22-ovsdbserver-nb\") pod \"dnsmasq-dns-d6b45bb5c-54t6j\" (UID: \"725b3e41-8e45-4284-b107-c0394209df22\") " pod="openstack/dnsmasq-dns-d6b45bb5c-54t6j" Oct 08 14:42:10 crc kubenswrapper[4735]: I1008 14:42:10.025391 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/725b3e41-8e45-4284-b107-c0394209df22-openstack-cell1\") pod \"dnsmasq-dns-d6b45bb5c-54t6j\" (UID: \"725b3e41-8e45-4284-b107-c0394209df22\") " pod="openstack/dnsmasq-dns-d6b45bb5c-54t6j" Oct 08 14:42:10 crc kubenswrapper[4735]: I1008 14:42:10.026514 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/725b3e41-8e45-4284-b107-c0394209df22-ovsdbserver-sb\") pod \"dnsmasq-dns-d6b45bb5c-54t6j\" (UID: \"725b3e41-8e45-4284-b107-c0394209df22\") " pod="openstack/dnsmasq-dns-d6b45bb5c-54t6j" Oct 08 14:42:10 crc kubenswrapper[4735]: I1008 14:42:10.129385 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/725b3e41-8e45-4284-b107-c0394209df22-openstack-cell1\") pod \"dnsmasq-dns-d6b45bb5c-54t6j\" (UID: \"725b3e41-8e45-4284-b107-c0394209df22\") " pod="openstack/dnsmasq-dns-d6b45bb5c-54t6j" Oct 08 14:42:10 crc kubenswrapper[4735]: I1008 14:42:10.129930 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/725b3e41-8e45-4284-b107-c0394209df22-ovsdbserver-sb\") pod \"dnsmasq-dns-d6b45bb5c-54t6j\" (UID: \"725b3e41-8e45-4284-b107-c0394209df22\") " pod="openstack/dnsmasq-dns-d6b45bb5c-54t6j" Oct 08 14:42:10 crc kubenswrapper[4735]: I1008 14:42:10.130039 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/725b3e41-8e45-4284-b107-c0394209df22-config\") pod \"dnsmasq-dns-d6b45bb5c-54t6j\" (UID: \"725b3e41-8e45-4284-b107-c0394209df22\") " pod="openstack/dnsmasq-dns-d6b45bb5c-54t6j" Oct 08 14:42:10 crc kubenswrapper[4735]: I1008 14:42:10.130104 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/725b3e41-8e45-4284-b107-c0394209df22-dns-svc\") pod \"dnsmasq-dns-d6b45bb5c-54t6j\" (UID: \"725b3e41-8e45-4284-b107-c0394209df22\") " pod="openstack/dnsmasq-dns-d6b45bb5c-54t6j" Oct 08 14:42:10 crc kubenswrapper[4735]: I1008 14:42:10.130132 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6d8th\" (UniqueName: \"kubernetes.io/projected/725b3e41-8e45-4284-b107-c0394209df22-kube-api-access-6d8th\") pod \"dnsmasq-dns-d6b45bb5c-54t6j\" (UID: \"725b3e41-8e45-4284-b107-c0394209df22\") " pod="openstack/dnsmasq-dns-d6b45bb5c-54t6j" Oct 08 14:42:10 crc kubenswrapper[4735]: I1008 14:42:10.130326 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/725b3e41-8e45-4284-b107-c0394209df22-ovsdbserver-nb\") pod \"dnsmasq-dns-d6b45bb5c-54t6j\" (UID: \"725b3e41-8e45-4284-b107-c0394209df22\") " pod="openstack/dnsmasq-dns-d6b45bb5c-54t6j" Oct 08 14:42:10 crc kubenswrapper[4735]: I1008 14:42:10.131278 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/725b3e41-8e45-4284-b107-c0394209df22-ovsdbserver-nb\") pod \"dnsmasq-dns-d6b45bb5c-54t6j\" (UID: \"725b3e41-8e45-4284-b107-c0394209df22\") " pod="openstack/dnsmasq-dns-d6b45bb5c-54t6j" Oct 08 14:42:10 crc kubenswrapper[4735]: I1008 14:42:10.132018 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/725b3e41-8e45-4284-b107-c0394209df22-dns-svc\") pod \"dnsmasq-dns-d6b45bb5c-54t6j\" (UID: \"725b3e41-8e45-4284-b107-c0394209df22\") " pod="openstack/dnsmasq-dns-d6b45bb5c-54t6j" Oct 08 14:42:10 crc kubenswrapper[4735]: I1008 14:42:10.132307 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/725b3e41-8e45-4284-b107-c0394209df22-config\") pod 
\"dnsmasq-dns-d6b45bb5c-54t6j\" (UID: \"725b3e41-8e45-4284-b107-c0394209df22\") " pod="openstack/dnsmasq-dns-d6b45bb5c-54t6j" Oct 08 14:42:10 crc kubenswrapper[4735]: I1008 14:42:10.132327 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/725b3e41-8e45-4284-b107-c0394209df22-ovsdbserver-sb\") pod \"dnsmasq-dns-d6b45bb5c-54t6j\" (UID: \"725b3e41-8e45-4284-b107-c0394209df22\") " pod="openstack/dnsmasq-dns-d6b45bb5c-54t6j" Oct 08 14:42:10 crc kubenswrapper[4735]: I1008 14:42:10.132498 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/725b3e41-8e45-4284-b107-c0394209df22-openstack-cell1\") pod \"dnsmasq-dns-d6b45bb5c-54t6j\" (UID: \"725b3e41-8e45-4284-b107-c0394209df22\") " pod="openstack/dnsmasq-dns-d6b45bb5c-54t6j" Oct 08 14:42:10 crc kubenswrapper[4735]: I1008 14:42:10.159465 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6d8th\" (UniqueName: \"kubernetes.io/projected/725b3e41-8e45-4284-b107-c0394209df22-kube-api-access-6d8th\") pod \"dnsmasq-dns-d6b45bb5c-54t6j\" (UID: \"725b3e41-8e45-4284-b107-c0394209df22\") " pod="openstack/dnsmasq-dns-d6b45bb5c-54t6j" Oct 08 14:42:10 crc kubenswrapper[4735]: I1008 14:42:10.196012 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d6b45bb5c-54t6j" Oct 08 14:42:10 crc kubenswrapper[4735]: I1008 14:42:10.203744 4735 generic.go:334] "Generic (PLEG): container finished" podID="53d57537-1158-4e9f-bf5f-bfa4976bdd83" containerID="8d33c9dad9ce07363b3a0297a4a5005bccb86e5e975526dd737dfa2ef3900470" exitCode=0 Oct 08 14:42:10 crc kubenswrapper[4735]: I1008 14:42:10.203803 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69b4fbdf9f-ff9q7" event={"ID":"53d57537-1158-4e9f-bf5f-bfa4976bdd83","Type":"ContainerDied","Data":"8d33c9dad9ce07363b3a0297a4a5005bccb86e5e975526dd737dfa2ef3900470"} Oct 08 14:42:10 crc kubenswrapper[4735]: I1008 14:42:10.359259 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69b4fbdf9f-ff9q7" Oct 08 14:42:10 crc kubenswrapper[4735]: I1008 14:42:10.436124 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53d57537-1158-4e9f-bf5f-bfa4976bdd83-dns-svc\") pod \"53d57537-1158-4e9f-bf5f-bfa4976bdd83\" (UID: \"53d57537-1158-4e9f-bf5f-bfa4976bdd83\") " Oct 08 14:42:10 crc kubenswrapper[4735]: I1008 14:42:10.436243 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/53d57537-1158-4e9f-bf5f-bfa4976bdd83-ovsdbserver-sb\") pod \"53d57537-1158-4e9f-bf5f-bfa4976bdd83\" (UID: \"53d57537-1158-4e9f-bf5f-bfa4976bdd83\") " Oct 08 14:42:10 crc kubenswrapper[4735]: I1008 14:42:10.436286 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53d57537-1158-4e9f-bf5f-bfa4976bdd83-config\") pod \"53d57537-1158-4e9f-bf5f-bfa4976bdd83\" (UID: \"53d57537-1158-4e9f-bf5f-bfa4976bdd83\") " Oct 08 14:42:10 crc kubenswrapper[4735]: I1008 14:42:10.436334 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-w8fhn\" (UniqueName: \"kubernetes.io/projected/53d57537-1158-4e9f-bf5f-bfa4976bdd83-kube-api-access-w8fhn\") pod \"53d57537-1158-4e9f-bf5f-bfa4976bdd83\" (UID: \"53d57537-1158-4e9f-bf5f-bfa4976bdd83\") " Oct 08 14:42:10 crc kubenswrapper[4735]: I1008 14:42:10.436441 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53d57537-1158-4e9f-bf5f-bfa4976bdd83-ovsdbserver-nb\") pod \"53d57537-1158-4e9f-bf5f-bfa4976bdd83\" (UID: \"53d57537-1158-4e9f-bf5f-bfa4976bdd83\") " Oct 08 14:42:10 crc kubenswrapper[4735]: I1008 14:42:10.445241 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53d57537-1158-4e9f-bf5f-bfa4976bdd83-kube-api-access-w8fhn" (OuterVolumeSpecName: "kube-api-access-w8fhn") pod "53d57537-1158-4e9f-bf5f-bfa4976bdd83" (UID: "53d57537-1158-4e9f-bf5f-bfa4976bdd83"). InnerVolumeSpecName "kube-api-access-w8fhn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:42:10 crc kubenswrapper[4735]: I1008 14:42:10.511017 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53d57537-1158-4e9f-bf5f-bfa4976bdd83-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "53d57537-1158-4e9f-bf5f-bfa4976bdd83" (UID: "53d57537-1158-4e9f-bf5f-bfa4976bdd83"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:42:10 crc kubenswrapper[4735]: I1008 14:42:10.512983 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53d57537-1158-4e9f-bf5f-bfa4976bdd83-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "53d57537-1158-4e9f-bf5f-bfa4976bdd83" (UID: "53d57537-1158-4e9f-bf5f-bfa4976bdd83"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:42:10 crc kubenswrapper[4735]: I1008 14:42:10.514716 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53d57537-1158-4e9f-bf5f-bfa4976bdd83-config" (OuterVolumeSpecName: "config") pod "53d57537-1158-4e9f-bf5f-bfa4976bdd83" (UID: "53d57537-1158-4e9f-bf5f-bfa4976bdd83"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:42:10 crc kubenswrapper[4735]: I1008 14:42:10.522537 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53d57537-1158-4e9f-bf5f-bfa4976bdd83-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "53d57537-1158-4e9f-bf5f-bfa4976bdd83" (UID: "53d57537-1158-4e9f-bf5f-bfa4976bdd83"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:42:10 crc kubenswrapper[4735]: I1008 14:42:10.538352 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8fhn\" (UniqueName: \"kubernetes.io/projected/53d57537-1158-4e9f-bf5f-bfa4976bdd83-kube-api-access-w8fhn\") on node \"crc\" DevicePath \"\"" Oct 08 14:42:10 crc kubenswrapper[4735]: I1008 14:42:10.538391 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53d57537-1158-4e9f-bf5f-bfa4976bdd83-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 14:42:10 crc kubenswrapper[4735]: I1008 14:42:10.538401 4735 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53d57537-1158-4e9f-bf5f-bfa4976bdd83-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 14:42:10 crc kubenswrapper[4735]: I1008 14:42:10.538410 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/53d57537-1158-4e9f-bf5f-bfa4976bdd83-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 
14:42:10 crc kubenswrapper[4735]: I1008 14:42:10.538420 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53d57537-1158-4e9f-bf5f-bfa4976bdd83-config\") on node \"crc\" DevicePath \"\"" Oct 08 14:42:10 crc kubenswrapper[4735]: I1008 14:42:10.734855 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d6b45bb5c-54t6j"] Oct 08 14:42:11 crc kubenswrapper[4735]: I1008 14:42:11.215780 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69b4fbdf9f-ff9q7" event={"ID":"53d57537-1158-4e9f-bf5f-bfa4976bdd83","Type":"ContainerDied","Data":"6a2ccc081ac75f50d8baaf3f64884577d31354fe2a2f176a390e5d54220749ca"} Oct 08 14:42:11 crc kubenswrapper[4735]: I1008 14:42:11.216085 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69b4fbdf9f-ff9q7" Oct 08 14:42:11 crc kubenswrapper[4735]: I1008 14:42:11.216114 4735 scope.go:117] "RemoveContainer" containerID="8d33c9dad9ce07363b3a0297a4a5005bccb86e5e975526dd737dfa2ef3900470" Oct 08 14:42:11 crc kubenswrapper[4735]: I1008 14:42:11.221376 4735 generic.go:334] "Generic (PLEG): container finished" podID="725b3e41-8e45-4284-b107-c0394209df22" containerID="4bbe9f6aaa7b47f7dfc068d20f8094f96bdabacd23623065a8227541d5ad72f7" exitCode=0 Oct 08 14:42:11 crc kubenswrapper[4735]: I1008 14:42:11.221431 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d6b45bb5c-54t6j" event={"ID":"725b3e41-8e45-4284-b107-c0394209df22","Type":"ContainerDied","Data":"4bbe9f6aaa7b47f7dfc068d20f8094f96bdabacd23623065a8227541d5ad72f7"} Oct 08 14:42:11 crc kubenswrapper[4735]: I1008 14:42:11.221464 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d6b45bb5c-54t6j" event={"ID":"725b3e41-8e45-4284-b107-c0394209df22","Type":"ContainerStarted","Data":"9d6f27756dda6df20fc980c4e5225f2e1431fef8b1ca4b3416bb4378bb94bd81"} Oct 08 14:42:11 crc 
kubenswrapper[4735]: I1008 14:42:11.248438 4735 scope.go:117] "RemoveContainer" containerID="d8a3b7490dac0968cc7364fe9459eb8353ee5d8e62a8175b65694b8dd75c1226" Oct 08 14:42:11 crc kubenswrapper[4735]: I1008 14:42:11.254507 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69b4fbdf9f-ff9q7"] Oct 08 14:42:11 crc kubenswrapper[4735]: I1008 14:42:11.271154 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-69b4fbdf9f-ff9q7"] Oct 08 14:42:12 crc kubenswrapper[4735]: I1008 14:42:12.234528 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d6b45bb5c-54t6j" event={"ID":"725b3e41-8e45-4284-b107-c0394209df22","Type":"ContainerStarted","Data":"fd1a797d8a64e070ba733f600805b06a07f931b5ea8247de8a4128d32349a62e"} Oct 08 14:42:12 crc kubenswrapper[4735]: I1008 14:42:12.235067 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-d6b45bb5c-54t6j" Oct 08 14:42:12 crc kubenswrapper[4735]: I1008 14:42:12.253841 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-d6b45bb5c-54t6j" podStartSLOduration=3.253819757 podStartE2EDuration="3.253819757s" podCreationTimestamp="2025-10-08 14:42:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 14:42:12.252560935 +0000 UTC m=+6382.060865479" watchObservedRunningTime="2025-10-08 14:42:12.253819757 +0000 UTC m=+6382.062124301" Oct 08 14:42:12 crc kubenswrapper[4735]: I1008 14:42:12.907201 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53d57537-1158-4e9f-bf5f-bfa4976bdd83" path="/var/lib/kubelet/pods/53d57537-1158-4e9f-bf5f-bfa4976bdd83/volumes" Oct 08 14:42:16 crc kubenswrapper[4735]: I1008 14:42:16.225101 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cq4pln"] Oct 08 
14:42:16 crc kubenswrapper[4735]: E1008 14:42:16.226038 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53d57537-1158-4e9f-bf5f-bfa4976bdd83" containerName="init" Oct 08 14:42:16 crc kubenswrapper[4735]: I1008 14:42:16.226055 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="53d57537-1158-4e9f-bf5f-bfa4976bdd83" containerName="init" Oct 08 14:42:16 crc kubenswrapper[4735]: E1008 14:42:16.226063 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53d57537-1158-4e9f-bf5f-bfa4976bdd83" containerName="dnsmasq-dns" Oct 08 14:42:16 crc kubenswrapper[4735]: I1008 14:42:16.226070 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="53d57537-1158-4e9f-bf5f-bfa4976bdd83" containerName="dnsmasq-dns" Oct 08 14:42:16 crc kubenswrapper[4735]: I1008 14:42:16.226370 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="53d57537-1158-4e9f-bf5f-bfa4976bdd83" containerName="dnsmasq-dns" Oct 08 14:42:16 crc kubenswrapper[4735]: I1008 14:42:16.227391 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cq4pln" Oct 08 14:42:16 crc kubenswrapper[4735]: I1008 14:42:16.229583 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 08 14:42:16 crc kubenswrapper[4735]: I1008 14:42:16.230027 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 08 14:42:16 crc kubenswrapper[4735]: I1008 14:42:16.230293 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-xccq5" Oct 08 14:42:16 crc kubenswrapper[4735]: I1008 14:42:16.230839 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 08 14:42:16 crc kubenswrapper[4735]: I1008 14:42:16.237048 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cq4pln"] Oct 08 14:42:16 crc kubenswrapper[4735]: I1008 14:42:16.375855 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d1e38574-c345-4fb1-b368-0499eeea2f3f-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cq4pln\" (UID: \"d1e38574-c345-4fb1-b368-0499eeea2f3f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cq4pln" Oct 08 14:42:16 crc kubenswrapper[4735]: I1008 14:42:16.375995 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1e38574-c345-4fb1-b368-0499eeea2f3f-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cq4pln\" (UID: \"d1e38574-c345-4fb1-b368-0499eeea2f3f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cq4pln" Oct 08 
14:42:16 crc kubenswrapper[4735]: I1008 14:42:16.376099 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d1e38574-c345-4fb1-b368-0499eeea2f3f-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cq4pln\" (UID: \"d1e38574-c345-4fb1-b368-0499eeea2f3f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cq4pln" Oct 08 14:42:16 crc kubenswrapper[4735]: I1008 14:42:16.376130 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvrxw\" (UniqueName: \"kubernetes.io/projected/d1e38574-c345-4fb1-b368-0499eeea2f3f-kube-api-access-rvrxw\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cq4pln\" (UID: \"d1e38574-c345-4fb1-b368-0499eeea2f3f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cq4pln" Oct 08 14:42:16 crc kubenswrapper[4735]: I1008 14:42:16.376188 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d1e38574-c345-4fb1-b368-0499eeea2f3f-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cq4pln\" (UID: \"d1e38574-c345-4fb1-b368-0499eeea2f3f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cq4pln" Oct 08 14:42:16 crc kubenswrapper[4735]: I1008 14:42:16.478443 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d1e38574-c345-4fb1-b368-0499eeea2f3f-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cq4pln\" (UID: \"d1e38574-c345-4fb1-b368-0499eeea2f3f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cq4pln" Oct 08 14:42:16 crc kubenswrapper[4735]: I1008 14:42:16.478825 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-rvrxw\" (UniqueName: \"kubernetes.io/projected/d1e38574-c345-4fb1-b368-0499eeea2f3f-kube-api-access-rvrxw\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cq4pln\" (UID: \"d1e38574-c345-4fb1-b368-0499eeea2f3f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cq4pln" Oct 08 14:42:16 crc kubenswrapper[4735]: I1008 14:42:16.479009 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d1e38574-c345-4fb1-b368-0499eeea2f3f-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cq4pln\" (UID: \"d1e38574-c345-4fb1-b368-0499eeea2f3f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cq4pln" Oct 08 14:42:16 crc kubenswrapper[4735]: I1008 14:42:16.479182 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d1e38574-c345-4fb1-b368-0499eeea2f3f-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cq4pln\" (UID: \"d1e38574-c345-4fb1-b368-0499eeea2f3f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cq4pln" Oct 08 14:42:16 crc kubenswrapper[4735]: I1008 14:42:16.479502 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1e38574-c345-4fb1-b368-0499eeea2f3f-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cq4pln\" (UID: \"d1e38574-c345-4fb1-b368-0499eeea2f3f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cq4pln" Oct 08 14:42:16 crc kubenswrapper[4735]: I1008 14:42:16.485821 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d1e38574-c345-4fb1-b368-0499eeea2f3f-ceph\") pod 
\"pre-adoption-validation-openstack-pre-adoption-openstack-cq4pln\" (UID: \"d1e38574-c345-4fb1-b368-0499eeea2f3f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cq4pln" Oct 08 14:42:16 crc kubenswrapper[4735]: I1008 14:42:16.487012 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d1e38574-c345-4fb1-b368-0499eeea2f3f-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cq4pln\" (UID: \"d1e38574-c345-4fb1-b368-0499eeea2f3f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cq4pln" Oct 08 14:42:16 crc kubenswrapper[4735]: I1008 14:42:16.487341 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1e38574-c345-4fb1-b368-0499eeea2f3f-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cq4pln\" (UID: \"d1e38574-c345-4fb1-b368-0499eeea2f3f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cq4pln" Oct 08 14:42:16 crc kubenswrapper[4735]: I1008 14:42:16.488774 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d1e38574-c345-4fb1-b368-0499eeea2f3f-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cq4pln\" (UID: \"d1e38574-c345-4fb1-b368-0499eeea2f3f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cq4pln" Oct 08 14:42:16 crc kubenswrapper[4735]: I1008 14:42:16.498867 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvrxw\" (UniqueName: \"kubernetes.io/projected/d1e38574-c345-4fb1-b368-0499eeea2f3f-kube-api-access-rvrxw\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cq4pln\" (UID: \"d1e38574-c345-4fb1-b368-0499eeea2f3f\") " 
pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cq4pln" Oct 08 14:42:16 crc kubenswrapper[4735]: I1008 14:42:16.546927 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cq4pln" Oct 08 14:42:17 crc kubenswrapper[4735]: I1008 14:42:17.087837 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cq4pln"] Oct 08 14:42:17 crc kubenswrapper[4735]: W1008 14:42:17.089506 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1e38574_c345_4fb1_b368_0499eeea2f3f.slice/crio-5e900617972e620e491d92f70e1c9d0ec965b4e55800e1d8f0d17a9d50994d1f WatchSource:0}: Error finding container 5e900617972e620e491d92f70e1c9d0ec965b4e55800e1d8f0d17a9d50994d1f: Status 404 returned error can't find the container with id 5e900617972e620e491d92f70e1c9d0ec965b4e55800e1d8f0d17a9d50994d1f Oct 08 14:42:17 crc kubenswrapper[4735]: I1008 14:42:17.284381 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cq4pln" event={"ID":"d1e38574-c345-4fb1-b368-0499eeea2f3f","Type":"ContainerStarted","Data":"5e900617972e620e491d92f70e1c9d0ec965b4e55800e1d8f0d17a9d50994d1f"} Oct 08 14:42:20 crc kubenswrapper[4735]: I1008 14:42:20.198042 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-d6b45bb5c-54t6j" Oct 08 14:42:20 crc kubenswrapper[4735]: I1008 14:42:20.254172 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd999549-2vdhk"] Oct 08 14:42:20 crc kubenswrapper[4735]: I1008 14:42:20.254662 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-78dd999549-2vdhk" podUID="315103cc-5644-48ae-9409-9109d4c3d092" containerName="dnsmasq-dns" 
containerID="cri-o://bf947012121cd178b1a4dcc2cbbfb5d1fc3c60ad9eef9e09276e525452a3afe0" gracePeriod=10 Oct 08 14:42:21 crc kubenswrapper[4735]: I1008 14:42:21.339050 4735 generic.go:334] "Generic (PLEG): container finished" podID="315103cc-5644-48ae-9409-9109d4c3d092" containerID="bf947012121cd178b1a4dcc2cbbfb5d1fc3c60ad9eef9e09276e525452a3afe0" exitCode=0 Oct 08 14:42:21 crc kubenswrapper[4735]: I1008 14:42:21.339098 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd999549-2vdhk" event={"ID":"315103cc-5644-48ae-9409-9109d4c3d092","Type":"ContainerDied","Data":"bf947012121cd178b1a4dcc2cbbfb5d1fc3c60ad9eef9e09276e525452a3afe0"} Oct 08 14:42:24 crc kubenswrapper[4735]: I1008 14:42:24.642589 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-78dd999549-2vdhk" podUID="315103cc-5644-48ae-9409-9109d4c3d092" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.147:5353: connect: connection refused" Oct 08 14:42:26 crc kubenswrapper[4735]: I1008 14:42:26.208005 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd999549-2vdhk" Oct 08 14:42:26 crc kubenswrapper[4735]: I1008 14:42:26.312355 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/315103cc-5644-48ae-9409-9109d4c3d092-config\") pod \"315103cc-5644-48ae-9409-9109d4c3d092\" (UID: \"315103cc-5644-48ae-9409-9109d4c3d092\") " Oct 08 14:42:26 crc kubenswrapper[4735]: I1008 14:42:26.312587 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cz26d\" (UniqueName: \"kubernetes.io/projected/315103cc-5644-48ae-9409-9109d4c3d092-kube-api-access-cz26d\") pod \"315103cc-5644-48ae-9409-9109d4c3d092\" (UID: \"315103cc-5644-48ae-9409-9109d4c3d092\") " Oct 08 14:42:26 crc kubenswrapper[4735]: I1008 14:42:26.312660 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/315103cc-5644-48ae-9409-9109d4c3d092-openstack-cell1\") pod \"315103cc-5644-48ae-9409-9109d4c3d092\" (UID: \"315103cc-5644-48ae-9409-9109d4c3d092\") " Oct 08 14:42:26 crc kubenswrapper[4735]: I1008 14:42:26.312752 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/315103cc-5644-48ae-9409-9109d4c3d092-ovsdbserver-sb\") pod \"315103cc-5644-48ae-9409-9109d4c3d092\" (UID: \"315103cc-5644-48ae-9409-9109d4c3d092\") " Oct 08 14:42:26 crc kubenswrapper[4735]: I1008 14:42:26.312864 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/315103cc-5644-48ae-9409-9109d4c3d092-dns-svc\") pod \"315103cc-5644-48ae-9409-9109d4c3d092\" (UID: \"315103cc-5644-48ae-9409-9109d4c3d092\") " Oct 08 14:42:26 crc kubenswrapper[4735]: I1008 14:42:26.312994 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/315103cc-5644-48ae-9409-9109d4c3d092-ovsdbserver-nb\") pod \"315103cc-5644-48ae-9409-9109d4c3d092\" (UID: \"315103cc-5644-48ae-9409-9109d4c3d092\") " Oct 08 14:42:26 crc kubenswrapper[4735]: I1008 14:42:26.317036 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/315103cc-5644-48ae-9409-9109d4c3d092-kube-api-access-cz26d" (OuterVolumeSpecName: "kube-api-access-cz26d") pod "315103cc-5644-48ae-9409-9109d4c3d092" (UID: "315103cc-5644-48ae-9409-9109d4c3d092"). InnerVolumeSpecName "kube-api-access-cz26d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:42:26 crc kubenswrapper[4735]: I1008 14:42:26.372345 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/315103cc-5644-48ae-9409-9109d4c3d092-config" (OuterVolumeSpecName: "config") pod "315103cc-5644-48ae-9409-9109d4c3d092" (UID: "315103cc-5644-48ae-9409-9109d4c3d092"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:42:26 crc kubenswrapper[4735]: I1008 14:42:26.373646 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/315103cc-5644-48ae-9409-9109d4c3d092-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "315103cc-5644-48ae-9409-9109d4c3d092" (UID: "315103cc-5644-48ae-9409-9109d4c3d092"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:42:26 crc kubenswrapper[4735]: I1008 14:42:26.378315 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/315103cc-5644-48ae-9409-9109d4c3d092-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "315103cc-5644-48ae-9409-9109d4c3d092" (UID: "315103cc-5644-48ae-9409-9109d4c3d092"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:42:26 crc kubenswrapper[4735]: I1008 14:42:26.379971 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/315103cc-5644-48ae-9409-9109d4c3d092-openstack-cell1" (OuterVolumeSpecName: "openstack-cell1") pod "315103cc-5644-48ae-9409-9109d4c3d092" (UID: "315103cc-5644-48ae-9409-9109d4c3d092"). InnerVolumeSpecName "openstack-cell1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:42:26 crc kubenswrapper[4735]: I1008 14:42:26.387818 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/315103cc-5644-48ae-9409-9109d4c3d092-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "315103cc-5644-48ae-9409-9109d4c3d092" (UID: "315103cc-5644-48ae-9409-9109d4c3d092"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:42:26 crc kubenswrapper[4735]: I1008 14:42:26.395776 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd999549-2vdhk" event={"ID":"315103cc-5644-48ae-9409-9109d4c3d092","Type":"ContainerDied","Data":"5ecdaa83e46529e2cae16ff6cb5c835c94983c26fed9369a764cd92591bd788a"} Oct 08 14:42:26 crc kubenswrapper[4735]: I1008 14:42:26.396218 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd999549-2vdhk" Oct 08 14:42:26 crc kubenswrapper[4735]: I1008 14:42:26.396366 4735 scope.go:117] "RemoveContainer" containerID="bf947012121cd178b1a4dcc2cbbfb5d1fc3c60ad9eef9e09276e525452a3afe0" Oct 08 14:42:26 crc kubenswrapper[4735]: I1008 14:42:26.399720 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cq4pln" event={"ID":"d1e38574-c345-4fb1-b368-0499eeea2f3f","Type":"ContainerStarted","Data":"c62cfc83e083a9065eded53bfaa1d7fd4d6231d2137c7bbbf738de94ad8e3638"} Oct 08 14:42:26 crc kubenswrapper[4735]: I1008 14:42:26.416618 4735 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/315103cc-5644-48ae-9409-9109d4c3d092-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 14:42:26 crc kubenswrapper[4735]: I1008 14:42:26.416650 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/315103cc-5644-48ae-9409-9109d4c3d092-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 14:42:26 crc kubenswrapper[4735]: I1008 14:42:26.416663 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/315103cc-5644-48ae-9409-9109d4c3d092-config\") on node \"crc\" DevicePath \"\"" Oct 08 14:42:26 crc kubenswrapper[4735]: I1008 14:42:26.416674 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cz26d\" (UniqueName: \"kubernetes.io/projected/315103cc-5644-48ae-9409-9109d4c3d092-kube-api-access-cz26d\") on node \"crc\" DevicePath \"\"" Oct 08 14:42:26 crc kubenswrapper[4735]: I1008 14:42:26.416686 4735 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/315103cc-5644-48ae-9409-9109d4c3d092-openstack-cell1\") on node \"crc\" DevicePath \"\"" Oct 08 14:42:26 crc kubenswrapper[4735]: I1008 14:42:26.416697 4735 
reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/315103cc-5644-48ae-9409-9109d4c3d092-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 14:42:26 crc kubenswrapper[4735]: I1008 14:42:26.442503 4735 scope.go:117] "RemoveContainer" containerID="a464c5435aa556c8b5cb409bd40168bbc1dcc1e2b99d6713b75c4eb6261ba9ec" Oct 08 14:42:26 crc kubenswrapper[4735]: I1008 14:42:26.443802 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cq4pln" podStartSLOduration=1.632787574 podStartE2EDuration="10.443784218s" podCreationTimestamp="2025-10-08 14:42:16 +0000 UTC" firstStartedPulling="2025-10-08 14:42:17.092274119 +0000 UTC m=+6386.900578663" lastFinishedPulling="2025-10-08 14:42:25.903270763 +0000 UTC m=+6395.711575307" observedRunningTime="2025-10-08 14:42:26.41612722 +0000 UTC m=+6396.224431764" watchObservedRunningTime="2025-10-08 14:42:26.443784218 +0000 UTC m=+6396.252088762" Oct 08 14:42:26 crc kubenswrapper[4735]: I1008 14:42:26.467249 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd999549-2vdhk"] Oct 08 14:42:26 crc kubenswrapper[4735]: I1008 14:42:26.478135 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd999549-2vdhk"] Oct 08 14:42:26 crc kubenswrapper[4735]: I1008 14:42:26.907834 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="315103cc-5644-48ae-9409-9109d4c3d092" path="/var/lib/kubelet/pods/315103cc-5644-48ae-9409-9109d4c3d092/volumes" Oct 08 14:42:39 crc kubenswrapper[4735]: I1008 14:42:39.531693 4735 generic.go:334] "Generic (PLEG): container finished" podID="d1e38574-c345-4fb1-b368-0499eeea2f3f" containerID="c62cfc83e083a9065eded53bfaa1d7fd4d6231d2137c7bbbf738de94ad8e3638" exitCode=0 Oct 08 14:42:39 crc kubenswrapper[4735]: I1008 14:42:39.531812 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cq4pln" event={"ID":"d1e38574-c345-4fb1-b368-0499eeea2f3f","Type":"ContainerDied","Data":"c62cfc83e083a9065eded53bfaa1d7fd4d6231d2137c7bbbf738de94ad8e3638"} Oct 08 14:42:41 crc kubenswrapper[4735]: I1008 14:42:41.063292 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cq4pln" Oct 08 14:42:41 crc kubenswrapper[4735]: I1008 14:42:41.135211 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1e38574-c345-4fb1-b368-0499eeea2f3f-pre-adoption-validation-combined-ca-bundle\") pod \"d1e38574-c345-4fb1-b368-0499eeea2f3f\" (UID: \"d1e38574-c345-4fb1-b368-0499eeea2f3f\") " Oct 08 14:42:41 crc kubenswrapper[4735]: I1008 14:42:41.135557 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvrxw\" (UniqueName: \"kubernetes.io/projected/d1e38574-c345-4fb1-b368-0499eeea2f3f-kube-api-access-rvrxw\") pod \"d1e38574-c345-4fb1-b368-0499eeea2f3f\" (UID: \"d1e38574-c345-4fb1-b368-0499eeea2f3f\") " Oct 08 14:42:41 crc kubenswrapper[4735]: I1008 14:42:41.135704 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d1e38574-c345-4fb1-b368-0499eeea2f3f-ssh-key\") pod \"d1e38574-c345-4fb1-b368-0499eeea2f3f\" (UID: \"d1e38574-c345-4fb1-b368-0499eeea2f3f\") " Oct 08 14:42:41 crc kubenswrapper[4735]: I1008 14:42:41.135827 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d1e38574-c345-4fb1-b368-0499eeea2f3f-inventory\") pod \"d1e38574-c345-4fb1-b368-0499eeea2f3f\" (UID: \"d1e38574-c345-4fb1-b368-0499eeea2f3f\") " Oct 08 14:42:41 crc kubenswrapper[4735]: I1008 14:42:41.135929 4735 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d1e38574-c345-4fb1-b368-0499eeea2f3f-ceph\") pod \"d1e38574-c345-4fb1-b368-0499eeea2f3f\" (UID: \"d1e38574-c345-4fb1-b368-0499eeea2f3f\") " Oct 08 14:42:41 crc kubenswrapper[4735]: I1008 14:42:41.145687 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1e38574-c345-4fb1-b368-0499eeea2f3f-pre-adoption-validation-combined-ca-bundle" (OuterVolumeSpecName: "pre-adoption-validation-combined-ca-bundle") pod "d1e38574-c345-4fb1-b368-0499eeea2f3f" (UID: "d1e38574-c345-4fb1-b368-0499eeea2f3f"). InnerVolumeSpecName "pre-adoption-validation-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:42:41 crc kubenswrapper[4735]: I1008 14:42:41.146370 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1e38574-c345-4fb1-b368-0499eeea2f3f-kube-api-access-rvrxw" (OuterVolumeSpecName: "kube-api-access-rvrxw") pod "d1e38574-c345-4fb1-b368-0499eeea2f3f" (UID: "d1e38574-c345-4fb1-b368-0499eeea2f3f"). InnerVolumeSpecName "kube-api-access-rvrxw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:42:41 crc kubenswrapper[4735]: I1008 14:42:41.146940 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1e38574-c345-4fb1-b368-0499eeea2f3f-ceph" (OuterVolumeSpecName: "ceph") pod "d1e38574-c345-4fb1-b368-0499eeea2f3f" (UID: "d1e38574-c345-4fb1-b368-0499eeea2f3f"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:42:41 crc kubenswrapper[4735]: I1008 14:42:41.205643 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1e38574-c345-4fb1-b368-0499eeea2f3f-inventory" (OuterVolumeSpecName: "inventory") pod "d1e38574-c345-4fb1-b368-0499eeea2f3f" (UID: "d1e38574-c345-4fb1-b368-0499eeea2f3f"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:42:41 crc kubenswrapper[4735]: I1008 14:42:41.217738 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1e38574-c345-4fb1-b368-0499eeea2f3f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d1e38574-c345-4fb1-b368-0499eeea2f3f" (UID: "d1e38574-c345-4fb1-b368-0499eeea2f3f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:42:41 crc kubenswrapper[4735]: I1008 14:42:41.239399 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvrxw\" (UniqueName: \"kubernetes.io/projected/d1e38574-c345-4fb1-b368-0499eeea2f3f-kube-api-access-rvrxw\") on node \"crc\" DevicePath \"\"" Oct 08 14:42:41 crc kubenswrapper[4735]: I1008 14:42:41.239457 4735 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d1e38574-c345-4fb1-b368-0499eeea2f3f-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 08 14:42:41 crc kubenswrapper[4735]: I1008 14:42:41.239467 4735 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d1e38574-c345-4fb1-b368-0499eeea2f3f-inventory\") on node \"crc\" DevicePath \"\"" Oct 08 14:42:41 crc kubenswrapper[4735]: I1008 14:42:41.239494 4735 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d1e38574-c345-4fb1-b368-0499eeea2f3f-ceph\") on node \"crc\" DevicePath \"\"" Oct 08 14:42:41 crc kubenswrapper[4735]: I1008 14:42:41.239504 4735 reconciler_common.go:293] "Volume detached for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1e38574-c345-4fb1-b368-0499eeea2f3f-pre-adoption-validation-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 14:42:41 crc kubenswrapper[4735]: I1008 14:42:41.554236 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cq4pln" event={"ID":"d1e38574-c345-4fb1-b368-0499eeea2f3f","Type":"ContainerDied","Data":"5e900617972e620e491d92f70e1c9d0ec965b4e55800e1d8f0d17a9d50994d1f"} Oct 08 14:42:41 crc kubenswrapper[4735]: I1008 14:42:41.554283 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e900617972e620e491d92f70e1c9d0ec965b4e55800e1d8f0d17a9d50994d1f" Oct 08 14:42:41 crc kubenswrapper[4735]: I1008 14:42:41.554346 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cq4pln" Oct 08 14:42:43 crc kubenswrapper[4735]: I1008 14:42:43.564409 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-srlr4"] Oct 08 14:42:43 crc kubenswrapper[4735]: E1008 14:42:43.565530 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="315103cc-5644-48ae-9409-9109d4c3d092" containerName="dnsmasq-dns" Oct 08 14:42:43 crc kubenswrapper[4735]: I1008 14:42:43.565549 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="315103cc-5644-48ae-9409-9109d4c3d092" containerName="dnsmasq-dns" Oct 08 14:42:43 crc kubenswrapper[4735]: E1008 14:42:43.565570 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="315103cc-5644-48ae-9409-9109d4c3d092" containerName="init" Oct 08 14:42:43 crc kubenswrapper[4735]: I1008 14:42:43.565578 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="315103cc-5644-48ae-9409-9109d4c3d092" containerName="init" Oct 08 14:42:43 crc kubenswrapper[4735]: E1008 14:42:43.565596 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1e38574-c345-4fb1-b368-0499eeea2f3f" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Oct 08 14:42:43 crc kubenswrapper[4735]: I1008 14:42:43.565605 4735 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d1e38574-c345-4fb1-b368-0499eeea2f3f" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Oct 08 14:42:43 crc kubenswrapper[4735]: I1008 14:42:43.565878 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1e38574-c345-4fb1-b368-0499eeea2f3f" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Oct 08 14:42:43 crc kubenswrapper[4735]: I1008 14:42:43.565905 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="315103cc-5644-48ae-9409-9109d4c3d092" containerName="dnsmasq-dns" Oct 08 14:42:43 crc kubenswrapper[4735]: I1008 14:42:43.566798 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-srlr4" Oct 08 14:42:43 crc kubenswrapper[4735]: I1008 14:42:43.570331 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-xccq5" Oct 08 14:42:43 crc kubenswrapper[4735]: I1008 14:42:43.570637 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 08 14:42:43 crc kubenswrapper[4735]: I1008 14:42:43.570766 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 08 14:42:43 crc kubenswrapper[4735]: I1008 14:42:43.571827 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 08 14:42:43 crc kubenswrapper[4735]: I1008 14:42:43.593108 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7124bfbc-c048-4d7a-be43-049e5bd6b16b-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-srlr4\" (UID: \"7124bfbc-c048-4d7a-be43-049e5bd6b16b\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-srlr4" Oct 08 14:42:43 crc kubenswrapper[4735]: I1008 14:42:43.593161 4735 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7124bfbc-c048-4d7a-be43-049e5bd6b16b-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-srlr4\" (UID: \"7124bfbc-c048-4d7a-be43-049e5bd6b16b\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-srlr4" Oct 08 14:42:43 crc kubenswrapper[4735]: I1008 14:42:43.593231 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7124bfbc-c048-4d7a-be43-049e5bd6b16b-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-srlr4\" (UID: \"7124bfbc-c048-4d7a-be43-049e5bd6b16b\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-srlr4" Oct 08 14:42:43 crc kubenswrapper[4735]: I1008 14:42:43.593334 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7124bfbc-c048-4d7a-be43-049e5bd6b16b-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-srlr4\" (UID: \"7124bfbc-c048-4d7a-be43-049e5bd6b16b\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-srlr4" Oct 08 14:42:43 crc kubenswrapper[4735]: I1008 14:42:43.593386 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gx9wr\" (UniqueName: \"kubernetes.io/projected/7124bfbc-c048-4d7a-be43-049e5bd6b16b-kube-api-access-gx9wr\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-srlr4\" (UID: \"7124bfbc-c048-4d7a-be43-049e5bd6b16b\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-srlr4" Oct 08 14:42:43 crc kubenswrapper[4735]: I1008 14:42:43.594239 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-srlr4"] Oct 08 14:42:43 crc 
kubenswrapper[4735]: I1008 14:42:43.695205 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7124bfbc-c048-4d7a-be43-049e5bd6b16b-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-srlr4\" (UID: \"7124bfbc-c048-4d7a-be43-049e5bd6b16b\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-srlr4" Oct 08 14:42:43 crc kubenswrapper[4735]: I1008 14:42:43.695273 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gx9wr\" (UniqueName: \"kubernetes.io/projected/7124bfbc-c048-4d7a-be43-049e5bd6b16b-kube-api-access-gx9wr\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-srlr4\" (UID: \"7124bfbc-c048-4d7a-be43-049e5bd6b16b\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-srlr4" Oct 08 14:42:43 crc kubenswrapper[4735]: I1008 14:42:43.695349 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7124bfbc-c048-4d7a-be43-049e5bd6b16b-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-srlr4\" (UID: \"7124bfbc-c048-4d7a-be43-049e5bd6b16b\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-srlr4" Oct 08 14:42:43 crc kubenswrapper[4735]: I1008 14:42:43.695377 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7124bfbc-c048-4d7a-be43-049e5bd6b16b-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-srlr4\" (UID: \"7124bfbc-c048-4d7a-be43-049e5bd6b16b\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-srlr4" Oct 08 14:42:43 crc kubenswrapper[4735]: I1008 14:42:43.695429 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7124bfbc-c048-4d7a-be43-049e5bd6b16b-ssh-key\") pod 
\"tripleo-cleanup-tripleo-cleanup-openstack-cell1-srlr4\" (UID: \"7124bfbc-c048-4d7a-be43-049e5bd6b16b\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-srlr4" Oct 08 14:42:43 crc kubenswrapper[4735]: I1008 14:42:43.702080 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7124bfbc-c048-4d7a-be43-049e5bd6b16b-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-srlr4\" (UID: \"7124bfbc-c048-4d7a-be43-049e5bd6b16b\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-srlr4" Oct 08 14:42:43 crc kubenswrapper[4735]: I1008 14:42:43.702160 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7124bfbc-c048-4d7a-be43-049e5bd6b16b-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-srlr4\" (UID: \"7124bfbc-c048-4d7a-be43-049e5bd6b16b\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-srlr4" Oct 08 14:42:43 crc kubenswrapper[4735]: I1008 14:42:43.702434 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7124bfbc-c048-4d7a-be43-049e5bd6b16b-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-srlr4\" (UID: \"7124bfbc-c048-4d7a-be43-049e5bd6b16b\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-srlr4" Oct 08 14:42:43 crc kubenswrapper[4735]: I1008 14:42:43.706204 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7124bfbc-c048-4d7a-be43-049e5bd6b16b-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-srlr4\" (UID: \"7124bfbc-c048-4d7a-be43-049e5bd6b16b\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-srlr4" Oct 08 14:42:43 crc kubenswrapper[4735]: I1008 14:42:43.715282 4735 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-gx9wr\" (UniqueName: \"kubernetes.io/projected/7124bfbc-c048-4d7a-be43-049e5bd6b16b-kube-api-access-gx9wr\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-srlr4\" (UID: \"7124bfbc-c048-4d7a-be43-049e5bd6b16b\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-srlr4" Oct 08 14:42:43 crc kubenswrapper[4735]: I1008 14:42:43.911523 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-srlr4" Oct 08 14:42:44 crc kubenswrapper[4735]: I1008 14:42:44.614895 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-srlr4"] Oct 08 14:42:45 crc kubenswrapper[4735]: I1008 14:42:45.602707 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-srlr4" event={"ID":"7124bfbc-c048-4d7a-be43-049e5bd6b16b","Type":"ContainerStarted","Data":"3a1ae13f580ae4046936b1b217900305ece5165e14f93b674d88f3dd752650b4"} Oct 08 14:42:46 crc kubenswrapper[4735]: I1008 14:42:46.613214 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-srlr4" event={"ID":"7124bfbc-c048-4d7a-be43-049e5bd6b16b","Type":"ContainerStarted","Data":"c79493f3916cc781a2ef7b1cad35c23861d0fed9d4ca658e2040029bb468657e"} Oct 08 14:42:46 crc kubenswrapper[4735]: I1008 14:42:46.637455 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-srlr4" podStartSLOduration=3.137720303 podStartE2EDuration="3.637429765s" podCreationTimestamp="2025-10-08 14:42:43 +0000 UTC" firstStartedPulling="2025-10-08 14:42:44.617907827 +0000 UTC m=+6414.426212371" lastFinishedPulling="2025-10-08 14:42:45.117617289 +0000 UTC m=+6414.925921833" observedRunningTime="2025-10-08 14:42:46.628006664 +0000 UTC m=+6416.436311228" 
watchObservedRunningTime="2025-10-08 14:42:46.637429765 +0000 UTC m=+6416.445734309" Oct 08 14:43:13 crc kubenswrapper[4735]: I1008 14:43:13.445394 4735 scope.go:117] "RemoveContainer" containerID="b3b1ed0ec94eb2dddd47a1086884ee8d7a3fd7f7a15bd620067b0524eecbb276" Oct 08 14:43:13 crc kubenswrapper[4735]: I1008 14:43:13.474636 4735 scope.go:117] "RemoveContainer" containerID="ea3c674dd07a7a490244b1ba45ce7c391f1a68dab45cda4090844c5a69df20df" Oct 08 14:43:24 crc kubenswrapper[4735]: I1008 14:43:24.878535 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 14:43:24 crc kubenswrapper[4735]: I1008 14:43:24.879092 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 14:43:39 crc kubenswrapper[4735]: I1008 14:43:39.050139 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-create-p4mlz"] Oct 08 14:43:39 crc kubenswrapper[4735]: I1008 14:43:39.059956 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-create-p4mlz"] Oct 08 14:43:40 crc kubenswrapper[4735]: I1008 14:43:40.908459 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b98b6a4-048b-4af1-9661-3a99bc46599e" path="/var/lib/kubelet/pods/0b98b6a4-048b-4af1-9661-3a99bc46599e/volumes" Oct 08 14:43:52 crc kubenswrapper[4735]: I1008 14:43:52.049172 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-5fa0-account-create-m4dtb"] Oct 08 14:43:52 crc kubenswrapper[4735]: I1008 14:43:52.055812 4735 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-5fa0-account-create-m4dtb"] Oct 08 14:43:52 crc kubenswrapper[4735]: I1008 14:43:52.908989 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b61c0f13-2922-426f-a4e7-f88721437e91" path="/var/lib/kubelet/pods/b61c0f13-2922-426f-a4e7-f88721437e91/volumes" Oct 08 14:43:54 crc kubenswrapper[4735]: I1008 14:43:54.879165 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 14:43:54 crc kubenswrapper[4735]: I1008 14:43:54.880655 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 14:43:58 crc kubenswrapper[4735]: I1008 14:43:58.030868 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-persistence-db-create-g28rx"] Oct 08 14:43:58 crc kubenswrapper[4735]: I1008 14:43:58.042289 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-persistence-db-create-g28rx"] Oct 08 14:43:58 crc kubenswrapper[4735]: I1008 14:43:58.909284 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea6310b4-41f4-4bbe-b12f-cad91f7e549a" path="/var/lib/kubelet/pods/ea6310b4-41f4-4bbe-b12f-cad91f7e549a/volumes" Oct 08 14:44:09 crc kubenswrapper[4735]: I1008 14:44:09.039560 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-ac36-account-create-6fs96"] Oct 08 14:44:09 crc kubenswrapper[4735]: I1008 14:44:09.051096 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/octavia-ac36-account-create-6fs96"] Oct 08 14:44:10 crc kubenswrapper[4735]: I1008 14:44:10.908039 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48c567dd-7884-4d1b-aa0b-6f1c14704612" path="/var/lib/kubelet/pods/48c567dd-7884-4d1b-aa0b-6f1c14704612/volumes" Oct 08 14:44:13 crc kubenswrapper[4735]: I1008 14:44:13.773931 4735 scope.go:117] "RemoveContainer" containerID="31582749a417ae53c1c895566ac150b241900bbdc069db02fa3fc4230d497a72" Oct 08 14:44:13 crc kubenswrapper[4735]: I1008 14:44:13.824018 4735 scope.go:117] "RemoveContainer" containerID="86e5922a0d217fc9d4d9d64965ed5bc8690c63ce590f8bfd81745372811af5a0" Oct 08 14:44:13 crc kubenswrapper[4735]: I1008 14:44:13.851988 4735 scope.go:117] "RemoveContainer" containerID="ed1432aa81a6b69b15bd099f9048535fcef2c81fd832c21ce73435b398b00d2e" Oct 08 14:44:13 crc kubenswrapper[4735]: I1008 14:44:13.903980 4735 scope.go:117] "RemoveContainer" containerID="3dcc0a917e17d897f3e17f6a7be8577f9e3f2be9dd1fb73345313bff9ac5117b" Oct 08 14:44:24 crc kubenswrapper[4735]: I1008 14:44:24.878224 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 14:44:24 crc kubenswrapper[4735]: I1008 14:44:24.878820 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 14:44:24 crc kubenswrapper[4735]: I1008 14:44:24.878879 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" Oct 08 14:44:24 crc 
kubenswrapper[4735]: I1008 14:44:24.879762 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fb4a9e663866841ddb83c52bf2f4a989212e0d3306474a6bb87200f186570090"} pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 14:44:24 crc kubenswrapper[4735]: I1008 14:44:24.879821 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" containerID="cri-o://fb4a9e663866841ddb83c52bf2f4a989212e0d3306474a6bb87200f186570090" gracePeriod=600 Oct 08 14:44:25 crc kubenswrapper[4735]: I1008 14:44:25.603439 4735 generic.go:334] "Generic (PLEG): container finished" podID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerID="fb4a9e663866841ddb83c52bf2f4a989212e0d3306474a6bb87200f186570090" exitCode=0 Oct 08 14:44:25 crc kubenswrapper[4735]: I1008 14:44:25.603521 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" event={"ID":"2dfd0474-f64a-4d3b-9990-df182d079b10","Type":"ContainerDied","Data":"fb4a9e663866841ddb83c52bf2f4a989212e0d3306474a6bb87200f186570090"} Oct 08 14:44:25 crc kubenswrapper[4735]: I1008 14:44:25.603885 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" event={"ID":"2dfd0474-f64a-4d3b-9990-df182d079b10","Type":"ContainerStarted","Data":"11ae741bfa9577c6a72bc007997b80ddaf8323f0c8648ac8d66f8abb80919d56"} Oct 08 14:44:25 crc kubenswrapper[4735]: I1008 14:44:25.603930 4735 scope.go:117] "RemoveContainer" containerID="0ef7ce1bb2f88774957446f907a55957b94dad288444fb6f22993a47ba31f8b0" Oct 08 14:45:00 crc kubenswrapper[4735]: I1008 14:45:00.174751 4735 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332245-vllf5"] Oct 08 14:45:00 crc kubenswrapper[4735]: I1008 14:45:00.177267 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332245-vllf5" Oct 08 14:45:00 crc kubenswrapper[4735]: I1008 14:45:00.181833 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 08 14:45:00 crc kubenswrapper[4735]: I1008 14:45:00.182255 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 08 14:45:00 crc kubenswrapper[4735]: I1008 14:45:00.186192 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332245-vllf5"] Oct 08 14:45:00 crc kubenswrapper[4735]: I1008 14:45:00.278807 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/af5760c6-c82f-4d64-9889-207c093a6466-config-volume\") pod \"collect-profiles-29332245-vllf5\" (UID: \"af5760c6-c82f-4d64-9889-207c093a6466\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332245-vllf5" Oct 08 14:45:00 crc kubenswrapper[4735]: I1008 14:45:00.278869 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/af5760c6-c82f-4d64-9889-207c093a6466-secret-volume\") pod \"collect-profiles-29332245-vllf5\" (UID: \"af5760c6-c82f-4d64-9889-207c093a6466\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332245-vllf5" Oct 08 14:45:00 crc kubenswrapper[4735]: I1008 14:45:00.278958 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mg6cw\" (UniqueName: 
\"kubernetes.io/projected/af5760c6-c82f-4d64-9889-207c093a6466-kube-api-access-mg6cw\") pod \"collect-profiles-29332245-vllf5\" (UID: \"af5760c6-c82f-4d64-9889-207c093a6466\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332245-vllf5" Oct 08 14:45:00 crc kubenswrapper[4735]: I1008 14:45:00.381725 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/af5760c6-c82f-4d64-9889-207c093a6466-config-volume\") pod \"collect-profiles-29332245-vllf5\" (UID: \"af5760c6-c82f-4d64-9889-207c093a6466\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332245-vllf5" Oct 08 14:45:00 crc kubenswrapper[4735]: I1008 14:45:00.381804 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/af5760c6-c82f-4d64-9889-207c093a6466-secret-volume\") pod \"collect-profiles-29332245-vllf5\" (UID: \"af5760c6-c82f-4d64-9889-207c093a6466\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332245-vllf5" Oct 08 14:45:00 crc kubenswrapper[4735]: I1008 14:45:00.381900 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mg6cw\" (UniqueName: \"kubernetes.io/projected/af5760c6-c82f-4d64-9889-207c093a6466-kube-api-access-mg6cw\") pod \"collect-profiles-29332245-vllf5\" (UID: \"af5760c6-c82f-4d64-9889-207c093a6466\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332245-vllf5" Oct 08 14:45:00 crc kubenswrapper[4735]: I1008 14:45:00.382586 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/af5760c6-c82f-4d64-9889-207c093a6466-config-volume\") pod \"collect-profiles-29332245-vllf5\" (UID: \"af5760c6-c82f-4d64-9889-207c093a6466\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332245-vllf5" Oct 08 14:45:00 crc kubenswrapper[4735]: I1008 
14:45:00.387987 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/af5760c6-c82f-4d64-9889-207c093a6466-secret-volume\") pod \"collect-profiles-29332245-vllf5\" (UID: \"af5760c6-c82f-4d64-9889-207c093a6466\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332245-vllf5" Oct 08 14:45:00 crc kubenswrapper[4735]: I1008 14:45:00.398849 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mg6cw\" (UniqueName: \"kubernetes.io/projected/af5760c6-c82f-4d64-9889-207c093a6466-kube-api-access-mg6cw\") pod \"collect-profiles-29332245-vllf5\" (UID: \"af5760c6-c82f-4d64-9889-207c093a6466\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332245-vllf5" Oct 08 14:45:00 crc kubenswrapper[4735]: I1008 14:45:00.509730 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332245-vllf5" Oct 08 14:45:00 crc kubenswrapper[4735]: I1008 14:45:00.979297 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332245-vllf5"] Oct 08 14:45:01 crc kubenswrapper[4735]: I1008 14:45:01.005387 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332245-vllf5" event={"ID":"af5760c6-c82f-4d64-9889-207c093a6466","Type":"ContainerStarted","Data":"2f5768a9773dcd0e0c3fc8e07e8892220a71157091e613018369f821e1f5ca11"} Oct 08 14:45:02 crc kubenswrapper[4735]: I1008 14:45:02.017345 4735 generic.go:334] "Generic (PLEG): container finished" podID="af5760c6-c82f-4d64-9889-207c093a6466" containerID="3c306de21ab0536a3d3430fdbdca837cff550a1f5e3a66e1c9a6ab59b78ce73c" exitCode=0 Oct 08 14:45:02 crc kubenswrapper[4735]: I1008 14:45:02.017430 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332245-vllf5" 
event={"ID":"af5760c6-c82f-4d64-9889-207c093a6466","Type":"ContainerDied","Data":"3c306de21ab0536a3d3430fdbdca837cff550a1f5e3a66e1c9a6ab59b78ce73c"} Oct 08 14:45:03 crc kubenswrapper[4735]: I1008 14:45:03.378158 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332245-vllf5" Oct 08 14:45:03 crc kubenswrapper[4735]: I1008 14:45:03.556867 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/af5760c6-c82f-4d64-9889-207c093a6466-config-volume\") pod \"af5760c6-c82f-4d64-9889-207c093a6466\" (UID: \"af5760c6-c82f-4d64-9889-207c093a6466\") " Oct 08 14:45:03 crc kubenswrapper[4735]: I1008 14:45:03.557321 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg6cw\" (UniqueName: \"kubernetes.io/projected/af5760c6-c82f-4d64-9889-207c093a6466-kube-api-access-mg6cw\") pod \"af5760c6-c82f-4d64-9889-207c093a6466\" (UID: \"af5760c6-c82f-4d64-9889-207c093a6466\") " Oct 08 14:45:03 crc kubenswrapper[4735]: I1008 14:45:03.557425 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/af5760c6-c82f-4d64-9889-207c093a6466-secret-volume\") pod \"af5760c6-c82f-4d64-9889-207c093a6466\" (UID: \"af5760c6-c82f-4d64-9889-207c093a6466\") " Oct 08 14:45:03 crc kubenswrapper[4735]: I1008 14:45:03.558313 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af5760c6-c82f-4d64-9889-207c093a6466-config-volume" (OuterVolumeSpecName: "config-volume") pod "af5760c6-c82f-4d64-9889-207c093a6466" (UID: "af5760c6-c82f-4d64-9889-207c093a6466"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 14:45:03 crc kubenswrapper[4735]: I1008 14:45:03.558961 4735 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/af5760c6-c82f-4d64-9889-207c093a6466-config-volume\") on node \"crc\" DevicePath \"\"" Oct 08 14:45:03 crc kubenswrapper[4735]: I1008 14:45:03.564012 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af5760c6-c82f-4d64-9889-207c093a6466-kube-api-access-mg6cw" (OuterVolumeSpecName: "kube-api-access-mg6cw") pod "af5760c6-c82f-4d64-9889-207c093a6466" (UID: "af5760c6-c82f-4d64-9889-207c093a6466"). InnerVolumeSpecName "kube-api-access-mg6cw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:45:03 crc kubenswrapper[4735]: I1008 14:45:03.567815 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af5760c6-c82f-4d64-9889-207c093a6466-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "af5760c6-c82f-4d64-9889-207c093a6466" (UID: "af5760c6-c82f-4d64-9889-207c093a6466"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:45:03 crc kubenswrapper[4735]: I1008 14:45:03.662810 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg6cw\" (UniqueName: \"kubernetes.io/projected/af5760c6-c82f-4d64-9889-207c093a6466-kube-api-access-mg6cw\") on node \"crc\" DevicePath \"\"" Oct 08 14:45:03 crc kubenswrapper[4735]: I1008 14:45:03.662849 4735 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/af5760c6-c82f-4d64-9889-207c093a6466-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 08 14:45:04 crc kubenswrapper[4735]: I1008 14:45:04.038425 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332245-vllf5" event={"ID":"af5760c6-c82f-4d64-9889-207c093a6466","Type":"ContainerDied","Data":"2f5768a9773dcd0e0c3fc8e07e8892220a71157091e613018369f821e1f5ca11"} Oct 08 14:45:04 crc kubenswrapper[4735]: I1008 14:45:04.038469 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f5768a9773dcd0e0c3fc8e07e8892220a71157091e613018369f821e1f5ca11" Oct 08 14:45:04 crc kubenswrapper[4735]: I1008 14:45:04.038541 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332245-vllf5" Oct 08 14:45:04 crc kubenswrapper[4735]: I1008 14:45:04.452081 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332200-s77gq"] Oct 08 14:45:04 crc kubenswrapper[4735]: I1008 14:45:04.462864 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332200-s77gq"] Oct 08 14:45:04 crc kubenswrapper[4735]: I1008 14:45:04.911779 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36157948-f34a-4697-91c1-09f731003616" path="/var/lib/kubelet/pods/36157948-f34a-4697-91c1-09f731003616/volumes" Oct 08 14:45:07 crc kubenswrapper[4735]: I1008 14:45:07.051650 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-sync-kdqg9"] Oct 08 14:45:07 crc kubenswrapper[4735]: I1008 14:45:07.059150 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-sync-kdqg9"] Oct 08 14:45:08 crc kubenswrapper[4735]: I1008 14:45:08.910427 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45f809f1-c832-4c74-babb-397dce1a02e1" path="/var/lib/kubelet/pods/45f809f1-c832-4c74-babb-397dce1a02e1/volumes" Oct 08 14:45:14 crc kubenswrapper[4735]: I1008 14:45:14.040102 4735 scope.go:117] "RemoveContainer" containerID="463c9555e2173c60b80f50c599b3f4f1ef26920c900a0893e0472e7225a9f48d" Oct 08 14:45:14 crc kubenswrapper[4735]: I1008 14:45:14.068594 4735 scope.go:117] "RemoveContainer" containerID="6e01e96bcd030e2576793cec38d354ec3a2f7c8daa126ac0c997692a5be0f14c" Oct 08 14:45:14 crc kubenswrapper[4735]: I1008 14:45:14.128242 4735 scope.go:117] "RemoveContainer" containerID="bb283f4a3d845ed4841fe92b53719f217593684919e9e9447fcf8527a515607e" Oct 08 14:45:19 crc kubenswrapper[4735]: I1008 14:45:19.516998 4735 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-9fwp8"] Oct 08 14:45:19 crc kubenswrapper[4735]: E1008 14:45:19.517925 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af5760c6-c82f-4d64-9889-207c093a6466" containerName="collect-profiles" Oct 08 14:45:19 crc kubenswrapper[4735]: I1008 14:45:19.517938 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="af5760c6-c82f-4d64-9889-207c093a6466" containerName="collect-profiles" Oct 08 14:45:19 crc kubenswrapper[4735]: I1008 14:45:19.518164 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="af5760c6-c82f-4d64-9889-207c093a6466" containerName="collect-profiles" Oct 08 14:45:19 crc kubenswrapper[4735]: I1008 14:45:19.524166 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9fwp8" Oct 08 14:45:19 crc kubenswrapper[4735]: I1008 14:45:19.527124 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9fwp8"] Oct 08 14:45:19 crc kubenswrapper[4735]: I1008 14:45:19.717238 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c52995da-420c-42bc-8889-01f40148da0c-catalog-content\") pod \"community-operators-9fwp8\" (UID: \"c52995da-420c-42bc-8889-01f40148da0c\") " pod="openshift-marketplace/community-operators-9fwp8" Oct 08 14:45:19 crc kubenswrapper[4735]: I1008 14:45:19.717309 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c52995da-420c-42bc-8889-01f40148da0c-utilities\") pod \"community-operators-9fwp8\" (UID: \"c52995da-420c-42bc-8889-01f40148da0c\") " pod="openshift-marketplace/community-operators-9fwp8" Oct 08 14:45:19 crc kubenswrapper[4735]: I1008 14:45:19.717375 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-rd5n8\" (UniqueName: \"kubernetes.io/projected/c52995da-420c-42bc-8889-01f40148da0c-kube-api-access-rd5n8\") pod \"community-operators-9fwp8\" (UID: \"c52995da-420c-42bc-8889-01f40148da0c\") " pod="openshift-marketplace/community-operators-9fwp8" Oct 08 14:45:19 crc kubenswrapper[4735]: I1008 14:45:19.819160 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c52995da-420c-42bc-8889-01f40148da0c-catalog-content\") pod \"community-operators-9fwp8\" (UID: \"c52995da-420c-42bc-8889-01f40148da0c\") " pod="openshift-marketplace/community-operators-9fwp8" Oct 08 14:45:19 crc kubenswrapper[4735]: I1008 14:45:19.819263 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c52995da-420c-42bc-8889-01f40148da0c-utilities\") pod \"community-operators-9fwp8\" (UID: \"c52995da-420c-42bc-8889-01f40148da0c\") " pod="openshift-marketplace/community-operators-9fwp8" Oct 08 14:45:19 crc kubenswrapper[4735]: I1008 14:45:19.819310 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rd5n8\" (UniqueName: \"kubernetes.io/projected/c52995da-420c-42bc-8889-01f40148da0c-kube-api-access-rd5n8\") pod \"community-operators-9fwp8\" (UID: \"c52995da-420c-42bc-8889-01f40148da0c\") " pod="openshift-marketplace/community-operators-9fwp8" Oct 08 14:45:19 crc kubenswrapper[4735]: I1008 14:45:19.819890 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c52995da-420c-42bc-8889-01f40148da0c-utilities\") pod \"community-operators-9fwp8\" (UID: \"c52995da-420c-42bc-8889-01f40148da0c\") " pod="openshift-marketplace/community-operators-9fwp8" Oct 08 14:45:19 crc kubenswrapper[4735]: I1008 14:45:19.819913 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/c52995da-420c-42bc-8889-01f40148da0c-catalog-content\") pod \"community-operators-9fwp8\" (UID: \"c52995da-420c-42bc-8889-01f40148da0c\") " pod="openshift-marketplace/community-operators-9fwp8" Oct 08 14:45:19 crc kubenswrapper[4735]: I1008 14:45:19.867682 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rd5n8\" (UniqueName: \"kubernetes.io/projected/c52995da-420c-42bc-8889-01f40148da0c-kube-api-access-rd5n8\") pod \"community-operators-9fwp8\" (UID: \"c52995da-420c-42bc-8889-01f40148da0c\") " pod="openshift-marketplace/community-operators-9fwp8" Oct 08 14:45:20 crc kubenswrapper[4735]: I1008 14:45:20.150046 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9fwp8" Oct 08 14:45:20 crc kubenswrapper[4735]: I1008 14:45:20.631716 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9fwp8"] Oct 08 14:45:21 crc kubenswrapper[4735]: I1008 14:45:21.213742 4735 generic.go:334] "Generic (PLEG): container finished" podID="c52995da-420c-42bc-8889-01f40148da0c" containerID="f1a6f1c67495d1ec2a772266cb42f8f0b55b84df2b0ccb0d8274b66b5929624b" exitCode=0 Oct 08 14:45:21 crc kubenswrapper[4735]: I1008 14:45:21.213827 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9fwp8" event={"ID":"c52995da-420c-42bc-8889-01f40148da0c","Type":"ContainerDied","Data":"f1a6f1c67495d1ec2a772266cb42f8f0b55b84df2b0ccb0d8274b66b5929624b"} Oct 08 14:45:21 crc kubenswrapper[4735]: I1008 14:45:21.214445 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9fwp8" event={"ID":"c52995da-420c-42bc-8889-01f40148da0c","Type":"ContainerStarted","Data":"e076c6a46a868a407c730088b78fe9a83caabc1773fed4a2f1f27e62049a30fb"} Oct 08 14:45:21 crc kubenswrapper[4735]: I1008 14:45:21.216268 4735 provider.go:102] Refreshing 
cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 08 14:45:23 crc kubenswrapper[4735]: I1008 14:45:23.232190 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9fwp8" event={"ID":"c52995da-420c-42bc-8889-01f40148da0c","Type":"ContainerStarted","Data":"2df702d8d1366f7a7ee1f8530b036cdaa459299efafbe036f0ec03a6586669a8"} Oct 08 14:45:24 crc kubenswrapper[4735]: I1008 14:45:24.242871 4735 generic.go:334] "Generic (PLEG): container finished" podID="c52995da-420c-42bc-8889-01f40148da0c" containerID="2df702d8d1366f7a7ee1f8530b036cdaa459299efafbe036f0ec03a6586669a8" exitCode=0 Oct 08 14:45:24 crc kubenswrapper[4735]: I1008 14:45:24.242937 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9fwp8" event={"ID":"c52995da-420c-42bc-8889-01f40148da0c","Type":"ContainerDied","Data":"2df702d8d1366f7a7ee1f8530b036cdaa459299efafbe036f0ec03a6586669a8"} Oct 08 14:45:25 crc kubenswrapper[4735]: I1008 14:45:25.255740 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9fwp8" event={"ID":"c52995da-420c-42bc-8889-01f40148da0c","Type":"ContainerStarted","Data":"3c3f2d3385739444bb83322f70a2f661eaa59ae68916b86140b70c2e4cf80edd"} Oct 08 14:45:25 crc kubenswrapper[4735]: I1008 14:45:25.281308 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9fwp8" podStartSLOduration=2.789181108 podStartE2EDuration="6.281286452s" podCreationTimestamp="2025-10-08 14:45:19 +0000 UTC" firstStartedPulling="2025-10-08 14:45:21.215793253 +0000 UTC m=+6571.024097807" lastFinishedPulling="2025-10-08 14:45:24.707898607 +0000 UTC m=+6574.516203151" observedRunningTime="2025-10-08 14:45:25.27220854 +0000 UTC m=+6575.080513094" watchObservedRunningTime="2025-10-08 14:45:25.281286452 +0000 UTC m=+6575.089590996" Oct 08 14:45:30 crc kubenswrapper[4735]: I1008 14:45:30.151900 4735 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9fwp8" Oct 08 14:45:30 crc kubenswrapper[4735]: I1008 14:45:30.152548 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9fwp8" Oct 08 14:45:30 crc kubenswrapper[4735]: I1008 14:45:30.201082 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9fwp8" Oct 08 14:45:30 crc kubenswrapper[4735]: I1008 14:45:30.353190 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9fwp8" Oct 08 14:45:30 crc kubenswrapper[4735]: I1008 14:45:30.437746 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9fwp8"] Oct 08 14:45:32 crc kubenswrapper[4735]: I1008 14:45:32.324198 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9fwp8" podUID="c52995da-420c-42bc-8889-01f40148da0c" containerName="registry-server" containerID="cri-o://3c3f2d3385739444bb83322f70a2f661eaa59ae68916b86140b70c2e4cf80edd" gracePeriod=2 Oct 08 14:45:32 crc kubenswrapper[4735]: I1008 14:45:32.869460 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9fwp8" Oct 08 14:45:33 crc kubenswrapper[4735]: I1008 14:45:33.007316 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c52995da-420c-42bc-8889-01f40148da0c-catalog-content\") pod \"c52995da-420c-42bc-8889-01f40148da0c\" (UID: \"c52995da-420c-42bc-8889-01f40148da0c\") " Oct 08 14:45:33 crc kubenswrapper[4735]: I1008 14:45:33.007530 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c52995da-420c-42bc-8889-01f40148da0c-utilities\") pod \"c52995da-420c-42bc-8889-01f40148da0c\" (UID: \"c52995da-420c-42bc-8889-01f40148da0c\") " Oct 08 14:45:33 crc kubenswrapper[4735]: I1008 14:45:33.007580 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rd5n8\" (UniqueName: \"kubernetes.io/projected/c52995da-420c-42bc-8889-01f40148da0c-kube-api-access-rd5n8\") pod \"c52995da-420c-42bc-8889-01f40148da0c\" (UID: \"c52995da-420c-42bc-8889-01f40148da0c\") " Oct 08 14:45:33 crc kubenswrapper[4735]: I1008 14:45:33.008890 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c52995da-420c-42bc-8889-01f40148da0c-utilities" (OuterVolumeSpecName: "utilities") pod "c52995da-420c-42bc-8889-01f40148da0c" (UID: "c52995da-420c-42bc-8889-01f40148da0c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 14:45:33 crc kubenswrapper[4735]: I1008 14:45:33.009816 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c52995da-420c-42bc-8889-01f40148da0c-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 14:45:33 crc kubenswrapper[4735]: I1008 14:45:33.028043 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c52995da-420c-42bc-8889-01f40148da0c-kube-api-access-rd5n8" (OuterVolumeSpecName: "kube-api-access-rd5n8") pod "c52995da-420c-42bc-8889-01f40148da0c" (UID: "c52995da-420c-42bc-8889-01f40148da0c"). InnerVolumeSpecName "kube-api-access-rd5n8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:45:33 crc kubenswrapper[4735]: I1008 14:45:33.064625 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c52995da-420c-42bc-8889-01f40148da0c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c52995da-420c-42bc-8889-01f40148da0c" (UID: "c52995da-420c-42bc-8889-01f40148da0c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 14:45:33 crc kubenswrapper[4735]: I1008 14:45:33.111350 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rd5n8\" (UniqueName: \"kubernetes.io/projected/c52995da-420c-42bc-8889-01f40148da0c-kube-api-access-rd5n8\") on node \"crc\" DevicePath \"\"" Oct 08 14:45:33 crc kubenswrapper[4735]: I1008 14:45:33.111647 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c52995da-420c-42bc-8889-01f40148da0c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 14:45:33 crc kubenswrapper[4735]: I1008 14:45:33.335813 4735 generic.go:334] "Generic (PLEG): container finished" podID="c52995da-420c-42bc-8889-01f40148da0c" containerID="3c3f2d3385739444bb83322f70a2f661eaa59ae68916b86140b70c2e4cf80edd" exitCode=0 Oct 08 14:45:33 crc kubenswrapper[4735]: I1008 14:45:33.335854 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9fwp8" event={"ID":"c52995da-420c-42bc-8889-01f40148da0c","Type":"ContainerDied","Data":"3c3f2d3385739444bb83322f70a2f661eaa59ae68916b86140b70c2e4cf80edd"} Oct 08 14:45:33 crc kubenswrapper[4735]: I1008 14:45:33.335869 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9fwp8" Oct 08 14:45:33 crc kubenswrapper[4735]: I1008 14:45:33.335883 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9fwp8" event={"ID":"c52995da-420c-42bc-8889-01f40148da0c","Type":"ContainerDied","Data":"e076c6a46a868a407c730088b78fe9a83caabc1773fed4a2f1f27e62049a30fb"} Oct 08 14:45:33 crc kubenswrapper[4735]: I1008 14:45:33.335901 4735 scope.go:117] "RemoveContainer" containerID="3c3f2d3385739444bb83322f70a2f661eaa59ae68916b86140b70c2e4cf80edd" Oct 08 14:45:33 crc kubenswrapper[4735]: I1008 14:45:33.363962 4735 scope.go:117] "RemoveContainer" containerID="2df702d8d1366f7a7ee1f8530b036cdaa459299efafbe036f0ec03a6586669a8" Oct 08 14:45:33 crc kubenswrapper[4735]: I1008 14:45:33.374159 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9fwp8"] Oct 08 14:45:33 crc kubenswrapper[4735]: I1008 14:45:33.385069 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9fwp8"] Oct 08 14:45:33 crc kubenswrapper[4735]: I1008 14:45:33.407071 4735 scope.go:117] "RemoveContainer" containerID="f1a6f1c67495d1ec2a772266cb42f8f0b55b84df2b0ccb0d8274b66b5929624b" Oct 08 14:45:33 crc kubenswrapper[4735]: I1008 14:45:33.439161 4735 scope.go:117] "RemoveContainer" containerID="3c3f2d3385739444bb83322f70a2f661eaa59ae68916b86140b70c2e4cf80edd" Oct 08 14:45:33 crc kubenswrapper[4735]: E1008 14:45:33.439888 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c3f2d3385739444bb83322f70a2f661eaa59ae68916b86140b70c2e4cf80edd\": container with ID starting with 3c3f2d3385739444bb83322f70a2f661eaa59ae68916b86140b70c2e4cf80edd not found: ID does not exist" containerID="3c3f2d3385739444bb83322f70a2f661eaa59ae68916b86140b70c2e4cf80edd" Oct 08 14:45:33 crc kubenswrapper[4735]: I1008 14:45:33.439949 4735 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c3f2d3385739444bb83322f70a2f661eaa59ae68916b86140b70c2e4cf80edd"} err="failed to get container status \"3c3f2d3385739444bb83322f70a2f661eaa59ae68916b86140b70c2e4cf80edd\": rpc error: code = NotFound desc = could not find container \"3c3f2d3385739444bb83322f70a2f661eaa59ae68916b86140b70c2e4cf80edd\": container with ID starting with 3c3f2d3385739444bb83322f70a2f661eaa59ae68916b86140b70c2e4cf80edd not found: ID does not exist" Oct 08 14:45:33 crc kubenswrapper[4735]: I1008 14:45:33.439989 4735 scope.go:117] "RemoveContainer" containerID="2df702d8d1366f7a7ee1f8530b036cdaa459299efafbe036f0ec03a6586669a8" Oct 08 14:45:33 crc kubenswrapper[4735]: E1008 14:45:33.440550 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2df702d8d1366f7a7ee1f8530b036cdaa459299efafbe036f0ec03a6586669a8\": container with ID starting with 2df702d8d1366f7a7ee1f8530b036cdaa459299efafbe036f0ec03a6586669a8 not found: ID does not exist" containerID="2df702d8d1366f7a7ee1f8530b036cdaa459299efafbe036f0ec03a6586669a8" Oct 08 14:45:33 crc kubenswrapper[4735]: I1008 14:45:33.440595 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2df702d8d1366f7a7ee1f8530b036cdaa459299efafbe036f0ec03a6586669a8"} err="failed to get container status \"2df702d8d1366f7a7ee1f8530b036cdaa459299efafbe036f0ec03a6586669a8\": rpc error: code = NotFound desc = could not find container \"2df702d8d1366f7a7ee1f8530b036cdaa459299efafbe036f0ec03a6586669a8\": container with ID starting with 2df702d8d1366f7a7ee1f8530b036cdaa459299efafbe036f0ec03a6586669a8 not found: ID does not exist" Oct 08 14:45:33 crc kubenswrapper[4735]: I1008 14:45:33.440628 4735 scope.go:117] "RemoveContainer" containerID="f1a6f1c67495d1ec2a772266cb42f8f0b55b84df2b0ccb0d8274b66b5929624b" Oct 08 14:45:33 crc kubenswrapper[4735]: E1008 
14:45:33.440954 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1a6f1c67495d1ec2a772266cb42f8f0b55b84df2b0ccb0d8274b66b5929624b\": container with ID starting with f1a6f1c67495d1ec2a772266cb42f8f0b55b84df2b0ccb0d8274b66b5929624b not found: ID does not exist" containerID="f1a6f1c67495d1ec2a772266cb42f8f0b55b84df2b0ccb0d8274b66b5929624b" Oct 08 14:45:33 crc kubenswrapper[4735]: I1008 14:45:33.440989 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1a6f1c67495d1ec2a772266cb42f8f0b55b84df2b0ccb0d8274b66b5929624b"} err="failed to get container status \"f1a6f1c67495d1ec2a772266cb42f8f0b55b84df2b0ccb0d8274b66b5929624b\": rpc error: code = NotFound desc = could not find container \"f1a6f1c67495d1ec2a772266cb42f8f0b55b84df2b0ccb0d8274b66b5929624b\": container with ID starting with f1a6f1c67495d1ec2a772266cb42f8f0b55b84df2b0ccb0d8274b66b5929624b not found: ID does not exist" Oct 08 14:45:34 crc kubenswrapper[4735]: I1008 14:45:34.909371 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c52995da-420c-42bc-8889-01f40148da0c" path="/var/lib/kubelet/pods/c52995da-420c-42bc-8889-01f40148da0c/volumes" Oct 08 14:46:54 crc kubenswrapper[4735]: I1008 14:46:54.878983 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 14:46:54 crc kubenswrapper[4735]: I1008 14:46:54.879612 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Oct 08 14:47:03 crc kubenswrapper[4735]: I1008 14:47:03.044453 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-pz79f"] Oct 08 14:47:03 crc kubenswrapper[4735]: I1008 14:47:03.057581 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-pz79f"] Oct 08 14:47:04 crc kubenswrapper[4735]: I1008 14:47:04.909770 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6a1cc9c-ece5-4ac2-b809-31be10f2828e" path="/var/lib/kubelet/pods/f6a1cc9c-ece5-4ac2-b809-31be10f2828e/volumes" Oct 08 14:47:13 crc kubenswrapper[4735]: I1008 14:47:13.039162 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-621e-account-create-8qpk6"] Oct 08 14:47:13 crc kubenswrapper[4735]: I1008 14:47:13.048600 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-621e-account-create-8qpk6"] Oct 08 14:47:14 crc kubenswrapper[4735]: I1008 14:47:14.287235 4735 scope.go:117] "RemoveContainer" containerID="857e21b3a89ae6fda3cbf59889d3327a6d657b92b0a11539350cbc79efe82fdf" Oct 08 14:47:14 crc kubenswrapper[4735]: I1008 14:47:14.908511 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01938786-d5c0-40ba-857f-edcb7ae6477e" path="/var/lib/kubelet/pods/01938786-d5c0-40ba-857f-edcb7ae6477e/volumes" Oct 08 14:47:24 crc kubenswrapper[4735]: I1008 14:47:24.879229 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 14:47:24 crc kubenswrapper[4735]: I1008 14:47:24.879766 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 14:47:27 crc kubenswrapper[4735]: I1008 14:47:27.030624 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-ffzjx"] Oct 08 14:47:27 crc kubenswrapper[4735]: I1008 14:47:27.043791 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-ffzjx"] Oct 08 14:47:28 crc kubenswrapper[4735]: I1008 14:47:28.907879 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ab0b01a-9c11-478e-825f-c2d2fc24b9e8" path="/var/lib/kubelet/pods/9ab0b01a-9c11-478e-825f-c2d2fc24b9e8/volumes" Oct 08 14:47:54 crc kubenswrapper[4735]: I1008 14:47:54.878521 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 14:47:54 crc kubenswrapper[4735]: I1008 14:47:54.879078 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 14:47:54 crc kubenswrapper[4735]: I1008 14:47:54.879135 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" Oct 08 14:47:54 crc kubenswrapper[4735]: I1008 14:47:54.879969 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"11ae741bfa9577c6a72bc007997b80ddaf8323f0c8648ac8d66f8abb80919d56"} pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" containerMessage="Container machine-config-daemon failed 
liveness probe, will be restarted" Oct 08 14:47:54 crc kubenswrapper[4735]: I1008 14:47:54.880021 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" containerID="cri-o://11ae741bfa9577c6a72bc007997b80ddaf8323f0c8648ac8d66f8abb80919d56" gracePeriod=600 Oct 08 14:47:55 crc kubenswrapper[4735]: E1008 14:47:55.004846 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:47:55 crc kubenswrapper[4735]: I1008 14:47:55.769727 4735 generic.go:334] "Generic (PLEG): container finished" podID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerID="11ae741bfa9577c6a72bc007997b80ddaf8323f0c8648ac8d66f8abb80919d56" exitCode=0 Oct 08 14:47:55 crc kubenswrapper[4735]: I1008 14:47:55.769807 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" event={"ID":"2dfd0474-f64a-4d3b-9990-df182d079b10","Type":"ContainerDied","Data":"11ae741bfa9577c6a72bc007997b80ddaf8323f0c8648ac8d66f8abb80919d56"} Oct 08 14:47:55 crc kubenswrapper[4735]: I1008 14:47:55.770073 4735 scope.go:117] "RemoveContainer" containerID="fb4a9e663866841ddb83c52bf2f4a989212e0d3306474a6bb87200f186570090" Oct 08 14:47:55 crc kubenswrapper[4735]: I1008 14:47:55.770764 4735 scope.go:117] "RemoveContainer" containerID="11ae741bfa9577c6a72bc007997b80ddaf8323f0c8648ac8d66f8abb80919d56" Oct 08 14:47:55 crc kubenswrapper[4735]: E1008 14:47:55.771161 4735 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:48:06 crc kubenswrapper[4735]: I1008 14:48:06.894757 4735 scope.go:117] "RemoveContainer" containerID="11ae741bfa9577c6a72bc007997b80ddaf8323f0c8648ac8d66f8abb80919d56" Oct 08 14:48:06 crc kubenswrapper[4735]: E1008 14:48:06.895519 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:48:14 crc kubenswrapper[4735]: I1008 14:48:14.342093 4735 scope.go:117] "RemoveContainer" containerID="726a5619111e4b895e246ae0b2a93c8d3e781779c317a92f79b7d83812170e16" Oct 08 14:48:14 crc kubenswrapper[4735]: I1008 14:48:14.411999 4735 scope.go:117] "RemoveContainer" containerID="fa3ea6651eae869a0462a13e01adee05d3815d3d5e2ac8c21692941c916b27e1" Oct 08 14:48:17 crc kubenswrapper[4735]: I1008 14:48:17.895093 4735 scope.go:117] "RemoveContainer" containerID="11ae741bfa9577c6a72bc007997b80ddaf8323f0c8648ac8d66f8abb80919d56" Oct 08 14:48:17 crc kubenswrapper[4735]: E1008 14:48:17.896023 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:48:28 crc kubenswrapper[4735]: I1008 14:48:28.895063 4735 scope.go:117] "RemoveContainer" containerID="11ae741bfa9577c6a72bc007997b80ddaf8323f0c8648ac8d66f8abb80919d56" Oct 08 14:48:28 crc kubenswrapper[4735]: E1008 14:48:28.896079 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:48:41 crc kubenswrapper[4735]: I1008 14:48:41.894774 4735 scope.go:117] "RemoveContainer" containerID="11ae741bfa9577c6a72bc007997b80ddaf8323f0c8648ac8d66f8abb80919d56" Oct 08 14:48:41 crc kubenswrapper[4735]: E1008 14:48:41.895431 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:48:56 crc kubenswrapper[4735]: I1008 14:48:56.895719 4735 scope.go:117] "RemoveContainer" containerID="11ae741bfa9577c6a72bc007997b80ddaf8323f0c8648ac8d66f8abb80919d56" Oct 08 14:48:56 crc kubenswrapper[4735]: E1008 14:48:56.896565 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:49:11 crc kubenswrapper[4735]: I1008 14:49:11.895148 4735 scope.go:117] "RemoveContainer" containerID="11ae741bfa9577c6a72bc007997b80ddaf8323f0c8648ac8d66f8abb80919d56" Oct 08 14:49:11 crc kubenswrapper[4735]: E1008 14:49:11.896058 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:49:22 crc kubenswrapper[4735]: I1008 14:49:22.899359 4735 scope.go:117] "RemoveContainer" containerID="11ae741bfa9577c6a72bc007997b80ddaf8323f0c8648ac8d66f8abb80919d56" Oct 08 14:49:22 crc kubenswrapper[4735]: E1008 14:49:22.900072 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:49:33 crc kubenswrapper[4735]: I1008 14:49:33.894808 4735 scope.go:117] "RemoveContainer" containerID="11ae741bfa9577c6a72bc007997b80ddaf8323f0c8648ac8d66f8abb80919d56" Oct 08 14:49:33 crc kubenswrapper[4735]: E1008 14:49:33.895579 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:49:36 crc kubenswrapper[4735]: I1008 14:49:36.037408 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-vphsj"] Oct 08 14:49:36 crc kubenswrapper[4735]: I1008 14:49:36.048445 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-vphsj"] Oct 08 14:49:36 crc kubenswrapper[4735]: I1008 14:49:36.909505 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b44fabde-cb8a-49d6-baaf-5b77929fa58b" path="/var/lib/kubelet/pods/b44fabde-cb8a-49d6-baaf-5b77929fa58b/volumes" Oct 08 14:49:46 crc kubenswrapper[4735]: I1008 14:49:46.073402 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-a871-account-create-cvwbz"] Oct 08 14:49:46 crc kubenswrapper[4735]: I1008 14:49:46.088245 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-a871-account-create-cvwbz"] Oct 08 14:49:46 crc kubenswrapper[4735]: I1008 14:49:46.895253 4735 scope.go:117] "RemoveContainer" containerID="11ae741bfa9577c6a72bc007997b80ddaf8323f0c8648ac8d66f8abb80919d56" Oct 08 14:49:46 crc kubenswrapper[4735]: E1008 14:49:46.895908 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:49:46 crc kubenswrapper[4735]: I1008 14:49:46.908659 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a010e60-5c2d-494b-9f68-e98573cff37d" 
path="/var/lib/kubelet/pods/8a010e60-5c2d-494b-9f68-e98573cff37d/volumes" Oct 08 14:49:57 crc kubenswrapper[4735]: I1008 14:49:57.895151 4735 scope.go:117] "RemoveContainer" containerID="11ae741bfa9577c6a72bc007997b80ddaf8323f0c8648ac8d66f8abb80919d56" Oct 08 14:49:57 crc kubenswrapper[4735]: E1008 14:49:57.896286 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:50:00 crc kubenswrapper[4735]: I1008 14:50:00.033563 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-bl2lv"] Oct 08 14:50:00 crc kubenswrapper[4735]: I1008 14:50:00.042792 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-bl2lv"] Oct 08 14:50:00 crc kubenswrapper[4735]: I1008 14:50:00.910004 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4399e6dc-523a-4a68-9b4a-aca82d1d861f" path="/var/lib/kubelet/pods/4399e6dc-523a-4a68-9b4a-aca82d1d861f/volumes" Oct 08 14:50:08 crc kubenswrapper[4735]: I1008 14:50:08.895293 4735 scope.go:117] "RemoveContainer" containerID="11ae741bfa9577c6a72bc007997b80ddaf8323f0c8648ac8d66f8abb80919d56" Oct 08 14:50:08 crc kubenswrapper[4735]: E1008 14:50:08.896269 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:50:14 crc 
kubenswrapper[4735]: I1008 14:50:14.538666 4735 scope.go:117] "RemoveContainer" containerID="6a5a1704ef569beb35a5e69d76a08ebade4be51449a01a7142a7d19568140b05" Oct 08 14:50:14 crc kubenswrapper[4735]: I1008 14:50:14.563655 4735 scope.go:117] "RemoveContainer" containerID="d93cd0769eb93f9da7731dfefb9f8478b3527e196d42e5b8f373014c82a24abf" Oct 08 14:50:14 crc kubenswrapper[4735]: I1008 14:50:14.617947 4735 scope.go:117] "RemoveContainer" containerID="23d32286f38099704d0c517e48fb79fcb6ba4cd95ef97a4812f892c94b81d750" Oct 08 14:50:20 crc kubenswrapper[4735]: I1008 14:50:20.040379 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-7c5xp"] Oct 08 14:50:20 crc kubenswrapper[4735]: I1008 14:50:20.048854 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-7c5xp"] Oct 08 14:50:20 crc kubenswrapper[4735]: I1008 14:50:20.907625 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3750677-6a05-476a-8bc1-f5f0ff3cb574" path="/var/lib/kubelet/pods/e3750677-6a05-476a-8bc1-f5f0ff3cb574/volumes" Oct 08 14:50:23 crc kubenswrapper[4735]: I1008 14:50:23.894630 4735 scope.go:117] "RemoveContainer" containerID="11ae741bfa9577c6a72bc007997b80ddaf8323f0c8648ac8d66f8abb80919d56" Oct 08 14:50:23 crc kubenswrapper[4735]: E1008 14:50:23.895983 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:50:30 crc kubenswrapper[4735]: I1008 14:50:30.032182 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-7cbf-account-create-kmczx"] Oct 08 14:50:30 crc kubenswrapper[4735]: I1008 14:50:30.053986 4735 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-7cbf-account-create-kmczx"] Oct 08 14:50:30 crc kubenswrapper[4735]: I1008 14:50:30.909115 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="707b9e23-4080-451d-818f-733ecfb23ec2" path="/var/lib/kubelet/pods/707b9e23-4080-451d-818f-733ecfb23ec2/volumes" Oct 08 14:50:36 crc kubenswrapper[4735]: I1008 14:50:36.895170 4735 scope.go:117] "RemoveContainer" containerID="11ae741bfa9577c6a72bc007997b80ddaf8323f0c8648ac8d66f8abb80919d56" Oct 08 14:50:36 crc kubenswrapper[4735]: E1008 14:50:36.896012 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:50:43 crc kubenswrapper[4735]: I1008 14:50:43.046817 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-br9l6"] Oct 08 14:50:43 crc kubenswrapper[4735]: I1008 14:50:43.063593 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-br9l6"] Oct 08 14:50:44 crc kubenswrapper[4735]: I1008 14:50:44.909199 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2bf29d2-6abe-4b3b-88d2-c2759e07c4d8" path="/var/lib/kubelet/pods/c2bf29d2-6abe-4b3b-88d2-c2759e07c4d8/volumes" Oct 08 14:50:48 crc kubenswrapper[4735]: I1008 14:50:48.895911 4735 scope.go:117] "RemoveContainer" containerID="11ae741bfa9577c6a72bc007997b80ddaf8323f0c8648ac8d66f8abb80919d56" Oct 08 14:50:48 crc kubenswrapper[4735]: E1008 14:50:48.896810 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:51:02 crc kubenswrapper[4735]: I1008 14:51:02.896078 4735 scope.go:117] "RemoveContainer" containerID="11ae741bfa9577c6a72bc007997b80ddaf8323f0c8648ac8d66f8abb80919d56" Oct 08 14:51:02 crc kubenswrapper[4735]: E1008 14:51:02.898011 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:51:13 crc kubenswrapper[4735]: I1008 14:51:13.895622 4735 scope.go:117] "RemoveContainer" containerID="11ae741bfa9577c6a72bc007997b80ddaf8323f0c8648ac8d66f8abb80919d56" Oct 08 14:51:13 crc kubenswrapper[4735]: E1008 14:51:13.896390 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:51:14 crc kubenswrapper[4735]: I1008 14:51:14.727767 4735 scope.go:117] "RemoveContainer" containerID="ec73cb8e9a729d6ee83f9b366ab3c979db7291587c27a1e60caa46d81d77d796" Oct 08 14:51:14 crc kubenswrapper[4735]: I1008 14:51:14.760624 4735 scope.go:117] "RemoveContainer" containerID="aa7bb50eb227d04309efc533ca2a2ec2f81b7db2faa2bbc3c9530210fcebb929" Oct 08 14:51:14 crc 
kubenswrapper[4735]: I1008 14:51:14.807402 4735 scope.go:117] "RemoveContainer" containerID="6505f6c8de497edeb92e46f3deb531f3c388425dac5c431b084962c1bc6c3950" Oct 08 14:51:15 crc kubenswrapper[4735]: I1008 14:51:15.217925 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-l9gn6"] Oct 08 14:51:15 crc kubenswrapper[4735]: E1008 14:51:15.219757 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c52995da-420c-42bc-8889-01f40148da0c" containerName="extract-utilities" Oct 08 14:51:15 crc kubenswrapper[4735]: I1008 14:51:15.219789 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="c52995da-420c-42bc-8889-01f40148da0c" containerName="extract-utilities" Oct 08 14:51:15 crc kubenswrapper[4735]: E1008 14:51:15.219835 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c52995da-420c-42bc-8889-01f40148da0c" containerName="extract-content" Oct 08 14:51:15 crc kubenswrapper[4735]: I1008 14:51:15.219846 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="c52995da-420c-42bc-8889-01f40148da0c" containerName="extract-content" Oct 08 14:51:15 crc kubenswrapper[4735]: E1008 14:51:15.219864 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c52995da-420c-42bc-8889-01f40148da0c" containerName="registry-server" Oct 08 14:51:15 crc kubenswrapper[4735]: I1008 14:51:15.219872 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="c52995da-420c-42bc-8889-01f40148da0c" containerName="registry-server" Oct 08 14:51:15 crc kubenswrapper[4735]: I1008 14:51:15.220634 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="c52995da-420c-42bc-8889-01f40148da0c" containerName="registry-server" Oct 08 14:51:15 crc kubenswrapper[4735]: I1008 14:51:15.225423 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-l9gn6" Oct 08 14:51:15 crc kubenswrapper[4735]: I1008 14:51:15.260400 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l9gn6"] Oct 08 14:51:15 crc kubenswrapper[4735]: I1008 14:51:15.307347 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/743d9187-1a75-4cad-8993-700240941d9c-utilities\") pod \"redhat-operators-l9gn6\" (UID: \"743d9187-1a75-4cad-8993-700240941d9c\") " pod="openshift-marketplace/redhat-operators-l9gn6" Oct 08 14:51:15 crc kubenswrapper[4735]: I1008 14:51:15.307578 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk59h\" (UniqueName: \"kubernetes.io/projected/743d9187-1a75-4cad-8993-700240941d9c-kube-api-access-rk59h\") pod \"redhat-operators-l9gn6\" (UID: \"743d9187-1a75-4cad-8993-700240941d9c\") " pod="openshift-marketplace/redhat-operators-l9gn6" Oct 08 14:51:15 crc kubenswrapper[4735]: I1008 14:51:15.307616 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/743d9187-1a75-4cad-8993-700240941d9c-catalog-content\") pod \"redhat-operators-l9gn6\" (UID: \"743d9187-1a75-4cad-8993-700240941d9c\") " pod="openshift-marketplace/redhat-operators-l9gn6" Oct 08 14:51:15 crc kubenswrapper[4735]: I1008 14:51:15.409870 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rk59h\" (UniqueName: \"kubernetes.io/projected/743d9187-1a75-4cad-8993-700240941d9c-kube-api-access-rk59h\") pod \"redhat-operators-l9gn6\" (UID: \"743d9187-1a75-4cad-8993-700240941d9c\") " pod="openshift-marketplace/redhat-operators-l9gn6" Oct 08 14:51:15 crc kubenswrapper[4735]: I1008 14:51:15.409929 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/743d9187-1a75-4cad-8993-700240941d9c-catalog-content\") pod \"redhat-operators-l9gn6\" (UID: \"743d9187-1a75-4cad-8993-700240941d9c\") " pod="openshift-marketplace/redhat-operators-l9gn6" Oct 08 14:51:15 crc kubenswrapper[4735]: I1008 14:51:15.410052 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/743d9187-1a75-4cad-8993-700240941d9c-utilities\") pod \"redhat-operators-l9gn6\" (UID: \"743d9187-1a75-4cad-8993-700240941d9c\") " pod="openshift-marketplace/redhat-operators-l9gn6" Oct 08 14:51:15 crc kubenswrapper[4735]: I1008 14:51:15.410504 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/743d9187-1a75-4cad-8993-700240941d9c-catalog-content\") pod \"redhat-operators-l9gn6\" (UID: \"743d9187-1a75-4cad-8993-700240941d9c\") " pod="openshift-marketplace/redhat-operators-l9gn6" Oct 08 14:51:15 crc kubenswrapper[4735]: I1008 14:51:15.410532 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/743d9187-1a75-4cad-8993-700240941d9c-utilities\") pod \"redhat-operators-l9gn6\" (UID: \"743d9187-1a75-4cad-8993-700240941d9c\") " pod="openshift-marketplace/redhat-operators-l9gn6" Oct 08 14:51:15 crc kubenswrapper[4735]: I1008 14:51:15.447496 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk59h\" (UniqueName: \"kubernetes.io/projected/743d9187-1a75-4cad-8993-700240941d9c-kube-api-access-rk59h\") pod \"redhat-operators-l9gn6\" (UID: \"743d9187-1a75-4cad-8993-700240941d9c\") " pod="openshift-marketplace/redhat-operators-l9gn6" Oct 08 14:51:15 crc kubenswrapper[4735]: I1008 14:51:15.562428 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-l9gn6" Oct 08 14:51:16 crc kubenswrapper[4735]: I1008 14:51:16.061058 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l9gn6"] Oct 08 14:51:16 crc kubenswrapper[4735]: I1008 14:51:16.716534 4735 generic.go:334] "Generic (PLEG): container finished" podID="743d9187-1a75-4cad-8993-700240941d9c" containerID="9293e0da3110c735a2dffbf3c9fe442419ce1ccc729720c5b20a0599d08593dc" exitCode=0 Oct 08 14:51:16 crc kubenswrapper[4735]: I1008 14:51:16.716617 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l9gn6" event={"ID":"743d9187-1a75-4cad-8993-700240941d9c","Type":"ContainerDied","Data":"9293e0da3110c735a2dffbf3c9fe442419ce1ccc729720c5b20a0599d08593dc"} Oct 08 14:51:16 crc kubenswrapper[4735]: I1008 14:51:16.716857 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l9gn6" event={"ID":"743d9187-1a75-4cad-8993-700240941d9c","Type":"ContainerStarted","Data":"e665195cf3d624fb7654eea4c82b9d2b8ea9255b3d3d816dfccb8e3c3b47071d"} Oct 08 14:51:16 crc kubenswrapper[4735]: I1008 14:51:16.719310 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 08 14:51:18 crc kubenswrapper[4735]: I1008 14:51:18.742342 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l9gn6" event={"ID":"743d9187-1a75-4cad-8993-700240941d9c","Type":"ContainerStarted","Data":"93e2bedfa66fd5e06248fdf0a54fb85cf87688cbea436a3e117662ca1b43bfc0"} Oct 08 14:51:21 crc kubenswrapper[4735]: I1008 14:51:21.776954 4735 generic.go:334] "Generic (PLEG): container finished" podID="743d9187-1a75-4cad-8993-700240941d9c" containerID="93e2bedfa66fd5e06248fdf0a54fb85cf87688cbea436a3e117662ca1b43bfc0" exitCode=0 Oct 08 14:51:21 crc kubenswrapper[4735]: I1008 14:51:21.777018 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-l9gn6" event={"ID":"743d9187-1a75-4cad-8993-700240941d9c","Type":"ContainerDied","Data":"93e2bedfa66fd5e06248fdf0a54fb85cf87688cbea436a3e117662ca1b43bfc0"} Oct 08 14:51:22 crc kubenswrapper[4735]: I1008 14:51:22.787092 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l9gn6" event={"ID":"743d9187-1a75-4cad-8993-700240941d9c","Type":"ContainerStarted","Data":"534d7e3c07fd99058cb6c055d03f368e9f7d8da34042f0faa95e5b718788b8ff"} Oct 08 14:51:22 crc kubenswrapper[4735]: I1008 14:51:22.807236 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-l9gn6" podStartSLOduration=2.201758952 podStartE2EDuration="7.807041249s" podCreationTimestamp="2025-10-08 14:51:15 +0000 UTC" firstStartedPulling="2025-10-08 14:51:16.718994164 +0000 UTC m=+6926.527298708" lastFinishedPulling="2025-10-08 14:51:22.324276461 +0000 UTC m=+6932.132581005" observedRunningTime="2025-10-08 14:51:22.804006492 +0000 UTC m=+6932.612311036" watchObservedRunningTime="2025-10-08 14:51:22.807041249 +0000 UTC m=+6932.615345793" Oct 08 14:51:24 crc kubenswrapper[4735]: I1008 14:51:24.895292 4735 scope.go:117] "RemoveContainer" containerID="11ae741bfa9577c6a72bc007997b80ddaf8323f0c8648ac8d66f8abb80919d56" Oct 08 14:51:24 crc kubenswrapper[4735]: E1008 14:51:24.895675 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:51:25 crc kubenswrapper[4735]: I1008 14:51:25.563135 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-l9gn6" Oct 08 14:51:25 crc kubenswrapper[4735]: I1008 14:51:25.563444 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-l9gn6" Oct 08 14:51:26 crc kubenswrapper[4735]: I1008 14:51:26.314126 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-k7nx9"] Oct 08 14:51:26 crc kubenswrapper[4735]: I1008 14:51:26.319134 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k7nx9" Oct 08 14:51:26 crc kubenswrapper[4735]: I1008 14:51:26.323776 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k7nx9"] Oct 08 14:51:26 crc kubenswrapper[4735]: I1008 14:51:26.447386 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fz6hf\" (UniqueName: \"kubernetes.io/projected/2a101e9b-760a-418b-9d59-67620afe8277-kube-api-access-fz6hf\") pod \"redhat-marketplace-k7nx9\" (UID: \"2a101e9b-760a-418b-9d59-67620afe8277\") " pod="openshift-marketplace/redhat-marketplace-k7nx9" Oct 08 14:51:26 crc kubenswrapper[4735]: I1008 14:51:26.447442 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a101e9b-760a-418b-9d59-67620afe8277-catalog-content\") pod \"redhat-marketplace-k7nx9\" (UID: \"2a101e9b-760a-418b-9d59-67620afe8277\") " pod="openshift-marketplace/redhat-marketplace-k7nx9" Oct 08 14:51:26 crc kubenswrapper[4735]: I1008 14:51:26.447544 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a101e9b-760a-418b-9d59-67620afe8277-utilities\") pod \"redhat-marketplace-k7nx9\" (UID: \"2a101e9b-760a-418b-9d59-67620afe8277\") " 
pod="openshift-marketplace/redhat-marketplace-k7nx9" Oct 08 14:51:26 crc kubenswrapper[4735]: I1008 14:51:26.550023 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fz6hf\" (UniqueName: \"kubernetes.io/projected/2a101e9b-760a-418b-9d59-67620afe8277-kube-api-access-fz6hf\") pod \"redhat-marketplace-k7nx9\" (UID: \"2a101e9b-760a-418b-9d59-67620afe8277\") " pod="openshift-marketplace/redhat-marketplace-k7nx9" Oct 08 14:51:26 crc kubenswrapper[4735]: I1008 14:51:26.550098 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a101e9b-760a-418b-9d59-67620afe8277-catalog-content\") pod \"redhat-marketplace-k7nx9\" (UID: \"2a101e9b-760a-418b-9d59-67620afe8277\") " pod="openshift-marketplace/redhat-marketplace-k7nx9" Oct 08 14:51:26 crc kubenswrapper[4735]: I1008 14:51:26.550152 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a101e9b-760a-418b-9d59-67620afe8277-utilities\") pod \"redhat-marketplace-k7nx9\" (UID: \"2a101e9b-760a-418b-9d59-67620afe8277\") " pod="openshift-marketplace/redhat-marketplace-k7nx9" Oct 08 14:51:26 crc kubenswrapper[4735]: I1008 14:51:26.550710 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a101e9b-760a-418b-9d59-67620afe8277-catalog-content\") pod \"redhat-marketplace-k7nx9\" (UID: \"2a101e9b-760a-418b-9d59-67620afe8277\") " pod="openshift-marketplace/redhat-marketplace-k7nx9" Oct 08 14:51:26 crc kubenswrapper[4735]: I1008 14:51:26.550789 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a101e9b-760a-418b-9d59-67620afe8277-utilities\") pod \"redhat-marketplace-k7nx9\" (UID: \"2a101e9b-760a-418b-9d59-67620afe8277\") " pod="openshift-marketplace/redhat-marketplace-k7nx9" 
Oct 08 14:51:26 crc kubenswrapper[4735]: I1008 14:51:26.575524 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fz6hf\" (UniqueName: \"kubernetes.io/projected/2a101e9b-760a-418b-9d59-67620afe8277-kube-api-access-fz6hf\") pod \"redhat-marketplace-k7nx9\" (UID: \"2a101e9b-760a-418b-9d59-67620afe8277\") " pod="openshift-marketplace/redhat-marketplace-k7nx9" Oct 08 14:51:26 crc kubenswrapper[4735]: I1008 14:51:26.619388 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-l9gn6" podUID="743d9187-1a75-4cad-8993-700240941d9c" containerName="registry-server" probeResult="failure" output=< Oct 08 14:51:26 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Oct 08 14:51:26 crc kubenswrapper[4735]: > Oct 08 14:51:26 crc kubenswrapper[4735]: I1008 14:51:26.642279 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k7nx9" Oct 08 14:51:27 crc kubenswrapper[4735]: I1008 14:51:27.131113 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k7nx9"] Oct 08 14:51:27 crc kubenswrapper[4735]: W1008 14:51:27.145193 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a101e9b_760a_418b_9d59_67620afe8277.slice/crio-7a7705fd0c0d135b74a6a99df25acf167eaff03a23a0ae294f08383b4f36900e WatchSource:0}: Error finding container 7a7705fd0c0d135b74a6a99df25acf167eaff03a23a0ae294f08383b4f36900e: Status 404 returned error can't find the container with id 7a7705fd0c0d135b74a6a99df25acf167eaff03a23a0ae294f08383b4f36900e Oct 08 14:51:27 crc kubenswrapper[4735]: I1008 14:51:27.835262 4735 generic.go:334] "Generic (PLEG): container finished" podID="2a101e9b-760a-418b-9d59-67620afe8277" containerID="0aa2d7900ea1f489d05c6586edf6c6d3f846fcbcab572682934fc5e25bbfea7f" exitCode=0 Oct 08 14:51:27 crc 
kubenswrapper[4735]: I1008 14:51:27.835468 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k7nx9" event={"ID":"2a101e9b-760a-418b-9d59-67620afe8277","Type":"ContainerDied","Data":"0aa2d7900ea1f489d05c6586edf6c6d3f846fcbcab572682934fc5e25bbfea7f"} Oct 08 14:51:27 crc kubenswrapper[4735]: I1008 14:51:27.836747 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k7nx9" event={"ID":"2a101e9b-760a-418b-9d59-67620afe8277","Type":"ContainerStarted","Data":"7a7705fd0c0d135b74a6a99df25acf167eaff03a23a0ae294f08383b4f36900e"} Oct 08 14:51:28 crc kubenswrapper[4735]: I1008 14:51:28.706563 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-m7s8z"] Oct 08 14:51:28 crc kubenswrapper[4735]: I1008 14:51:28.709382 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m7s8z" Oct 08 14:51:28 crc kubenswrapper[4735]: I1008 14:51:28.723154 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m7s8z"] Oct 08 14:51:28 crc kubenswrapper[4735]: I1008 14:51:28.803260 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0efbb608-fa01-40ae-bb73-ef63e89de811-utilities\") pod \"certified-operators-m7s8z\" (UID: \"0efbb608-fa01-40ae-bb73-ef63e89de811\") " pod="openshift-marketplace/certified-operators-m7s8z" Oct 08 14:51:28 crc kubenswrapper[4735]: I1008 14:51:28.803700 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s6sj\" (UniqueName: \"kubernetes.io/projected/0efbb608-fa01-40ae-bb73-ef63e89de811-kube-api-access-5s6sj\") pod \"certified-operators-m7s8z\" (UID: \"0efbb608-fa01-40ae-bb73-ef63e89de811\") " pod="openshift-marketplace/certified-operators-m7s8z" Oct 08 
14:51:28 crc kubenswrapper[4735]: I1008 14:51:28.803918 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0efbb608-fa01-40ae-bb73-ef63e89de811-catalog-content\") pod \"certified-operators-m7s8z\" (UID: \"0efbb608-fa01-40ae-bb73-ef63e89de811\") " pod="openshift-marketplace/certified-operators-m7s8z" Oct 08 14:51:28 crc kubenswrapper[4735]: I1008 14:51:28.905567 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0efbb608-fa01-40ae-bb73-ef63e89de811-catalog-content\") pod \"certified-operators-m7s8z\" (UID: \"0efbb608-fa01-40ae-bb73-ef63e89de811\") " pod="openshift-marketplace/certified-operators-m7s8z" Oct 08 14:51:28 crc kubenswrapper[4735]: I1008 14:51:28.905677 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0efbb608-fa01-40ae-bb73-ef63e89de811-utilities\") pod \"certified-operators-m7s8z\" (UID: \"0efbb608-fa01-40ae-bb73-ef63e89de811\") " pod="openshift-marketplace/certified-operators-m7s8z" Oct 08 14:51:28 crc kubenswrapper[4735]: I1008 14:51:28.905829 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5s6sj\" (UniqueName: \"kubernetes.io/projected/0efbb608-fa01-40ae-bb73-ef63e89de811-kube-api-access-5s6sj\") pod \"certified-operators-m7s8z\" (UID: \"0efbb608-fa01-40ae-bb73-ef63e89de811\") " pod="openshift-marketplace/certified-operators-m7s8z" Oct 08 14:51:28 crc kubenswrapper[4735]: I1008 14:51:28.906835 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0efbb608-fa01-40ae-bb73-ef63e89de811-catalog-content\") pod \"certified-operators-m7s8z\" (UID: \"0efbb608-fa01-40ae-bb73-ef63e89de811\") " pod="openshift-marketplace/certified-operators-m7s8z" Oct 08 
14:51:28 crc kubenswrapper[4735]: I1008 14:51:28.906890 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0efbb608-fa01-40ae-bb73-ef63e89de811-utilities\") pod \"certified-operators-m7s8z\" (UID: \"0efbb608-fa01-40ae-bb73-ef63e89de811\") " pod="openshift-marketplace/certified-operators-m7s8z" Oct 08 14:51:28 crc kubenswrapper[4735]: I1008 14:51:28.931020 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s6sj\" (UniqueName: \"kubernetes.io/projected/0efbb608-fa01-40ae-bb73-ef63e89de811-kube-api-access-5s6sj\") pod \"certified-operators-m7s8z\" (UID: \"0efbb608-fa01-40ae-bb73-ef63e89de811\") " pod="openshift-marketplace/certified-operators-m7s8z" Oct 08 14:51:29 crc kubenswrapper[4735]: I1008 14:51:29.028685 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m7s8z" Oct 08 14:51:29 crc kubenswrapper[4735]: I1008 14:51:29.610210 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m7s8z"] Oct 08 14:51:29 crc kubenswrapper[4735]: I1008 14:51:29.859793 4735 generic.go:334] "Generic (PLEG): container finished" podID="0efbb608-fa01-40ae-bb73-ef63e89de811" containerID="e33335bc4fb800593db95dd21b4ee8c87f14045da14f1c987cc3354aff8dcc5f" exitCode=0 Oct 08 14:51:29 crc kubenswrapper[4735]: I1008 14:51:29.859840 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m7s8z" event={"ID":"0efbb608-fa01-40ae-bb73-ef63e89de811","Type":"ContainerDied","Data":"e33335bc4fb800593db95dd21b4ee8c87f14045da14f1c987cc3354aff8dcc5f"} Oct 08 14:51:29 crc kubenswrapper[4735]: I1008 14:51:29.859897 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m7s8z" 
event={"ID":"0efbb608-fa01-40ae-bb73-ef63e89de811","Type":"ContainerStarted","Data":"734dcac9b478225d24e6d6414a5d94a1e313146648eac80e0881a8c31096414f"} Oct 08 14:51:29 crc kubenswrapper[4735]: I1008 14:51:29.868014 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k7nx9" event={"ID":"2a101e9b-760a-418b-9d59-67620afe8277","Type":"ContainerStarted","Data":"e307f50d30709a6147382eb8ffcb4f3de819d31a59e476c98e7b59c81374433f"} Oct 08 14:51:29 crc kubenswrapper[4735]: E1008 14:51:29.960027 4735 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a101e9b_760a_418b_9d59_67620afe8277.slice/crio-conmon-e307f50d30709a6147382eb8ffcb4f3de819d31a59e476c98e7b59c81374433f.scope\": RecentStats: unable to find data in memory cache]" Oct 08 14:51:30 crc kubenswrapper[4735]: I1008 14:51:30.878785 4735 generic.go:334] "Generic (PLEG): container finished" podID="2a101e9b-760a-418b-9d59-67620afe8277" containerID="e307f50d30709a6147382eb8ffcb4f3de819d31a59e476c98e7b59c81374433f" exitCode=0 Oct 08 14:51:30 crc kubenswrapper[4735]: I1008 14:51:30.878839 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k7nx9" event={"ID":"2a101e9b-760a-418b-9d59-67620afe8277","Type":"ContainerDied","Data":"e307f50d30709a6147382eb8ffcb4f3de819d31a59e476c98e7b59c81374433f"} Oct 08 14:51:31 crc kubenswrapper[4735]: I1008 14:51:31.890201 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m7s8z" event={"ID":"0efbb608-fa01-40ae-bb73-ef63e89de811","Type":"ContainerStarted","Data":"f427d29cd9182d65ffa0368ceaae38b714ff989733be9a3d5615953ff2a7dcf3"} Oct 08 14:51:31 crc kubenswrapper[4735]: I1008 14:51:31.893104 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k7nx9" 
event={"ID":"2a101e9b-760a-418b-9d59-67620afe8277","Type":"ContainerStarted","Data":"80993f1f9df983a4a8422e243a1e8f0636ff5a1cd872fd27e62b02e68078ab5c"} Oct 08 14:51:31 crc kubenswrapper[4735]: I1008 14:51:31.938949 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-k7nx9" podStartSLOduration=2.308843227 podStartE2EDuration="5.93892332s" podCreationTimestamp="2025-10-08 14:51:26 +0000 UTC" firstStartedPulling="2025-10-08 14:51:27.837695166 +0000 UTC m=+6937.645999710" lastFinishedPulling="2025-10-08 14:51:31.467775259 +0000 UTC m=+6941.276079803" observedRunningTime="2025-10-08 14:51:31.930969387 +0000 UTC m=+6941.739273951" watchObservedRunningTime="2025-10-08 14:51:31.93892332 +0000 UTC m=+6941.747228074" Oct 08 14:51:32 crc kubenswrapper[4735]: I1008 14:51:32.908113 4735 generic.go:334] "Generic (PLEG): container finished" podID="0efbb608-fa01-40ae-bb73-ef63e89de811" containerID="f427d29cd9182d65ffa0368ceaae38b714ff989733be9a3d5615953ff2a7dcf3" exitCode=0 Oct 08 14:51:32 crc kubenswrapper[4735]: I1008 14:51:32.908501 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m7s8z" event={"ID":"0efbb608-fa01-40ae-bb73-ef63e89de811","Type":"ContainerDied","Data":"f427d29cd9182d65ffa0368ceaae38b714ff989733be9a3d5615953ff2a7dcf3"} Oct 08 14:51:33 crc kubenswrapper[4735]: I1008 14:51:33.927598 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m7s8z" event={"ID":"0efbb608-fa01-40ae-bb73-ef63e89de811","Type":"ContainerStarted","Data":"0076d250c8877a65d60212a2f3da59e9b243ad190817a81bf4ba0dbd23224bfc"} Oct 08 14:51:36 crc kubenswrapper[4735]: I1008 14:51:36.611859 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-l9gn6" podUID="743d9187-1a75-4cad-8993-700240941d9c" containerName="registry-server" probeResult="failure" output=< Oct 08 14:51:36 crc 
kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Oct 08 14:51:36 crc kubenswrapper[4735]: > Oct 08 14:51:36 crc kubenswrapper[4735]: I1008 14:51:36.642932 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-k7nx9" Oct 08 14:51:36 crc kubenswrapper[4735]: I1008 14:51:36.643180 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-k7nx9" Oct 08 14:51:36 crc kubenswrapper[4735]: I1008 14:51:36.696584 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-k7nx9" Oct 08 14:51:36 crc kubenswrapper[4735]: I1008 14:51:36.726313 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-m7s8z" podStartSLOduration=5.242293458 podStartE2EDuration="8.726275794s" podCreationTimestamp="2025-10-08 14:51:28 +0000 UTC" firstStartedPulling="2025-10-08 14:51:29.861696508 +0000 UTC m=+6939.670001052" lastFinishedPulling="2025-10-08 14:51:33.345678844 +0000 UTC m=+6943.153983388" observedRunningTime="2025-10-08 14:51:33.954473725 +0000 UTC m=+6943.762778289" watchObservedRunningTime="2025-10-08 14:51:36.726275794 +0000 UTC m=+6946.534580338" Oct 08 14:51:37 crc kubenswrapper[4735]: I1008 14:51:37.014098 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-k7nx9" Oct 08 14:51:39 crc kubenswrapper[4735]: I1008 14:51:39.028947 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-m7s8z" Oct 08 14:51:39 crc kubenswrapper[4735]: I1008 14:51:39.029538 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-m7s8z" Oct 08 14:51:39 crc kubenswrapper[4735]: I1008 14:51:39.896432 4735 scope.go:117] "RemoveContainer" 
containerID="11ae741bfa9577c6a72bc007997b80ddaf8323f0c8648ac8d66f8abb80919d56" Oct 08 14:51:39 crc kubenswrapper[4735]: E1008 14:51:39.896908 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:51:39 crc kubenswrapper[4735]: I1008 14:51:39.905652 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k7nx9"] Oct 08 14:51:39 crc kubenswrapper[4735]: I1008 14:51:39.986419 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-k7nx9" podUID="2a101e9b-760a-418b-9d59-67620afe8277" containerName="registry-server" containerID="cri-o://80993f1f9df983a4a8422e243a1e8f0636ff5a1cd872fd27e62b02e68078ab5c" gracePeriod=2 Oct 08 14:51:40 crc kubenswrapper[4735]: I1008 14:51:40.076850 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-m7s8z" podUID="0efbb608-fa01-40ae-bb73-ef63e89de811" containerName="registry-server" probeResult="failure" output=< Oct 08 14:51:40 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Oct 08 14:51:40 crc kubenswrapper[4735]: > Oct 08 14:51:40 crc kubenswrapper[4735]: E1008 14:51:40.274350 4735 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a101e9b_760a_418b_9d59_67620afe8277.slice/crio-conmon-80993f1f9df983a4a8422e243a1e8f0636ff5a1cd872fd27e62b02e68078ab5c.scope\": RecentStats: unable to find data in memory cache]" Oct 08 14:51:40 crc kubenswrapper[4735]: I1008 
14:51:40.546541 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k7nx9" Oct 08 14:51:40 crc kubenswrapper[4735]: I1008 14:51:40.566434 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a101e9b-760a-418b-9d59-67620afe8277-utilities\") pod \"2a101e9b-760a-418b-9d59-67620afe8277\" (UID: \"2a101e9b-760a-418b-9d59-67620afe8277\") " Oct 08 14:51:40 crc kubenswrapper[4735]: I1008 14:51:40.566576 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fz6hf\" (UniqueName: \"kubernetes.io/projected/2a101e9b-760a-418b-9d59-67620afe8277-kube-api-access-fz6hf\") pod \"2a101e9b-760a-418b-9d59-67620afe8277\" (UID: \"2a101e9b-760a-418b-9d59-67620afe8277\") " Oct 08 14:51:40 crc kubenswrapper[4735]: I1008 14:51:40.566685 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a101e9b-760a-418b-9d59-67620afe8277-catalog-content\") pod \"2a101e9b-760a-418b-9d59-67620afe8277\" (UID: \"2a101e9b-760a-418b-9d59-67620afe8277\") " Oct 08 14:51:40 crc kubenswrapper[4735]: I1008 14:51:40.572397 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a101e9b-760a-418b-9d59-67620afe8277-utilities" (OuterVolumeSpecName: "utilities") pod "2a101e9b-760a-418b-9d59-67620afe8277" (UID: "2a101e9b-760a-418b-9d59-67620afe8277"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 14:51:40 crc kubenswrapper[4735]: I1008 14:51:40.576720 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a101e9b-760a-418b-9d59-67620afe8277-kube-api-access-fz6hf" (OuterVolumeSpecName: "kube-api-access-fz6hf") pod "2a101e9b-760a-418b-9d59-67620afe8277" (UID: "2a101e9b-760a-418b-9d59-67620afe8277"). InnerVolumeSpecName "kube-api-access-fz6hf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:51:40 crc kubenswrapper[4735]: I1008 14:51:40.580031 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a101e9b-760a-418b-9d59-67620afe8277-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2a101e9b-760a-418b-9d59-67620afe8277" (UID: "2a101e9b-760a-418b-9d59-67620afe8277"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 14:51:40 crc kubenswrapper[4735]: I1008 14:51:40.669775 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a101e9b-760a-418b-9d59-67620afe8277-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 14:51:40 crc kubenswrapper[4735]: I1008 14:51:40.669815 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fz6hf\" (UniqueName: \"kubernetes.io/projected/2a101e9b-760a-418b-9d59-67620afe8277-kube-api-access-fz6hf\") on node \"crc\" DevicePath \"\"" Oct 08 14:51:40 crc kubenswrapper[4735]: I1008 14:51:40.669828 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a101e9b-760a-418b-9d59-67620afe8277-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 14:51:40 crc kubenswrapper[4735]: I1008 14:51:40.996598 4735 generic.go:334] "Generic (PLEG): container finished" podID="2a101e9b-760a-418b-9d59-67620afe8277" 
containerID="80993f1f9df983a4a8422e243a1e8f0636ff5a1cd872fd27e62b02e68078ab5c" exitCode=0 Oct 08 14:51:40 crc kubenswrapper[4735]: I1008 14:51:40.996659 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k7nx9" event={"ID":"2a101e9b-760a-418b-9d59-67620afe8277","Type":"ContainerDied","Data":"80993f1f9df983a4a8422e243a1e8f0636ff5a1cd872fd27e62b02e68078ab5c"} Oct 08 14:51:40 crc kubenswrapper[4735]: I1008 14:51:40.996672 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k7nx9" Oct 08 14:51:40 crc kubenswrapper[4735]: I1008 14:51:40.996717 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k7nx9" event={"ID":"2a101e9b-760a-418b-9d59-67620afe8277","Type":"ContainerDied","Data":"7a7705fd0c0d135b74a6a99df25acf167eaff03a23a0ae294f08383b4f36900e"} Oct 08 14:51:40 crc kubenswrapper[4735]: I1008 14:51:40.996745 4735 scope.go:117] "RemoveContainer" containerID="80993f1f9df983a4a8422e243a1e8f0636ff5a1cd872fd27e62b02e68078ab5c" Oct 08 14:51:41 crc kubenswrapper[4735]: I1008 14:51:41.019267 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k7nx9"] Oct 08 14:51:41 crc kubenswrapper[4735]: I1008 14:51:41.026819 4735 scope.go:117] "RemoveContainer" containerID="e307f50d30709a6147382eb8ffcb4f3de819d31a59e476c98e7b59c81374433f" Oct 08 14:51:41 crc kubenswrapper[4735]: I1008 14:51:41.034580 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-k7nx9"] Oct 08 14:51:41 crc kubenswrapper[4735]: I1008 14:51:41.054879 4735 scope.go:117] "RemoveContainer" containerID="0aa2d7900ea1f489d05c6586edf6c6d3f846fcbcab572682934fc5e25bbfea7f" Oct 08 14:51:41 crc kubenswrapper[4735]: I1008 14:51:41.100768 4735 scope.go:117] "RemoveContainer" containerID="80993f1f9df983a4a8422e243a1e8f0636ff5a1cd872fd27e62b02e68078ab5c" Oct 08 
14:51:41 crc kubenswrapper[4735]: E1008 14:51:41.101311 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80993f1f9df983a4a8422e243a1e8f0636ff5a1cd872fd27e62b02e68078ab5c\": container with ID starting with 80993f1f9df983a4a8422e243a1e8f0636ff5a1cd872fd27e62b02e68078ab5c not found: ID does not exist" containerID="80993f1f9df983a4a8422e243a1e8f0636ff5a1cd872fd27e62b02e68078ab5c" Oct 08 14:51:41 crc kubenswrapper[4735]: I1008 14:51:41.101346 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80993f1f9df983a4a8422e243a1e8f0636ff5a1cd872fd27e62b02e68078ab5c"} err="failed to get container status \"80993f1f9df983a4a8422e243a1e8f0636ff5a1cd872fd27e62b02e68078ab5c\": rpc error: code = NotFound desc = could not find container \"80993f1f9df983a4a8422e243a1e8f0636ff5a1cd872fd27e62b02e68078ab5c\": container with ID starting with 80993f1f9df983a4a8422e243a1e8f0636ff5a1cd872fd27e62b02e68078ab5c not found: ID does not exist" Oct 08 14:51:41 crc kubenswrapper[4735]: I1008 14:51:41.101369 4735 scope.go:117] "RemoveContainer" containerID="e307f50d30709a6147382eb8ffcb4f3de819d31a59e476c98e7b59c81374433f" Oct 08 14:51:41 crc kubenswrapper[4735]: E1008 14:51:41.101781 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e307f50d30709a6147382eb8ffcb4f3de819d31a59e476c98e7b59c81374433f\": container with ID starting with e307f50d30709a6147382eb8ffcb4f3de819d31a59e476c98e7b59c81374433f not found: ID does not exist" containerID="e307f50d30709a6147382eb8ffcb4f3de819d31a59e476c98e7b59c81374433f" Oct 08 14:51:41 crc kubenswrapper[4735]: I1008 14:51:41.101833 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e307f50d30709a6147382eb8ffcb4f3de819d31a59e476c98e7b59c81374433f"} err="failed to get container status 
\"e307f50d30709a6147382eb8ffcb4f3de819d31a59e476c98e7b59c81374433f\": rpc error: code = NotFound desc = could not find container \"e307f50d30709a6147382eb8ffcb4f3de819d31a59e476c98e7b59c81374433f\": container with ID starting with e307f50d30709a6147382eb8ffcb4f3de819d31a59e476c98e7b59c81374433f not found: ID does not exist" Oct 08 14:51:41 crc kubenswrapper[4735]: I1008 14:51:41.101869 4735 scope.go:117] "RemoveContainer" containerID="0aa2d7900ea1f489d05c6586edf6c6d3f846fcbcab572682934fc5e25bbfea7f" Oct 08 14:51:41 crc kubenswrapper[4735]: E1008 14:51:41.102212 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0aa2d7900ea1f489d05c6586edf6c6d3f846fcbcab572682934fc5e25bbfea7f\": container with ID starting with 0aa2d7900ea1f489d05c6586edf6c6d3f846fcbcab572682934fc5e25bbfea7f not found: ID does not exist" containerID="0aa2d7900ea1f489d05c6586edf6c6d3f846fcbcab572682934fc5e25bbfea7f" Oct 08 14:51:41 crc kubenswrapper[4735]: I1008 14:51:41.102249 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0aa2d7900ea1f489d05c6586edf6c6d3f846fcbcab572682934fc5e25bbfea7f"} err="failed to get container status \"0aa2d7900ea1f489d05c6586edf6c6d3f846fcbcab572682934fc5e25bbfea7f\": rpc error: code = NotFound desc = could not find container \"0aa2d7900ea1f489d05c6586edf6c6d3f846fcbcab572682934fc5e25bbfea7f\": container with ID starting with 0aa2d7900ea1f489d05c6586edf6c6d3f846fcbcab572682934fc5e25bbfea7f not found: ID does not exist" Oct 08 14:51:42 crc kubenswrapper[4735]: I1008 14:51:42.911223 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a101e9b-760a-418b-9d59-67620afe8277" path="/var/lib/kubelet/pods/2a101e9b-760a-418b-9d59-67620afe8277/volumes" Oct 08 14:51:45 crc kubenswrapper[4735]: I1008 14:51:45.617934 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-l9gn6" Oct 08 14:51:45 crc kubenswrapper[4735]: I1008 14:51:45.672173 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-l9gn6" Oct 08 14:51:46 crc kubenswrapper[4735]: I1008 14:51:46.409623 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l9gn6"] Oct 08 14:51:47 crc kubenswrapper[4735]: I1008 14:51:47.066756 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-l9gn6" podUID="743d9187-1a75-4cad-8993-700240941d9c" containerName="registry-server" containerID="cri-o://534d7e3c07fd99058cb6c055d03f368e9f7d8da34042f0faa95e5b718788b8ff" gracePeriod=2 Oct 08 14:51:47 crc kubenswrapper[4735]: I1008 14:51:47.591143 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l9gn6" Oct 08 14:51:47 crc kubenswrapper[4735]: I1008 14:51:47.615992 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rk59h\" (UniqueName: \"kubernetes.io/projected/743d9187-1a75-4cad-8993-700240941d9c-kube-api-access-rk59h\") pod \"743d9187-1a75-4cad-8993-700240941d9c\" (UID: \"743d9187-1a75-4cad-8993-700240941d9c\") " Oct 08 14:51:47 crc kubenswrapper[4735]: I1008 14:51:47.616320 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/743d9187-1a75-4cad-8993-700240941d9c-catalog-content\") pod \"743d9187-1a75-4cad-8993-700240941d9c\" (UID: \"743d9187-1a75-4cad-8993-700240941d9c\") " Oct 08 14:51:47 crc kubenswrapper[4735]: I1008 14:51:47.616907 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/743d9187-1a75-4cad-8993-700240941d9c-utilities\") pod \"743d9187-1a75-4cad-8993-700240941d9c\" (UID: 
\"743d9187-1a75-4cad-8993-700240941d9c\") " Oct 08 14:51:47 crc kubenswrapper[4735]: I1008 14:51:47.617381 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/743d9187-1a75-4cad-8993-700240941d9c-utilities" (OuterVolumeSpecName: "utilities") pod "743d9187-1a75-4cad-8993-700240941d9c" (UID: "743d9187-1a75-4cad-8993-700240941d9c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 14:51:47 crc kubenswrapper[4735]: I1008 14:51:47.618380 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/743d9187-1a75-4cad-8993-700240941d9c-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 14:51:47 crc kubenswrapper[4735]: I1008 14:51:47.627325 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/743d9187-1a75-4cad-8993-700240941d9c-kube-api-access-rk59h" (OuterVolumeSpecName: "kube-api-access-rk59h") pod "743d9187-1a75-4cad-8993-700240941d9c" (UID: "743d9187-1a75-4cad-8993-700240941d9c"). InnerVolumeSpecName "kube-api-access-rk59h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:51:47 crc kubenswrapper[4735]: I1008 14:51:47.717417 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/743d9187-1a75-4cad-8993-700240941d9c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "743d9187-1a75-4cad-8993-700240941d9c" (UID: "743d9187-1a75-4cad-8993-700240941d9c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 14:51:47 crc kubenswrapper[4735]: I1008 14:51:47.720258 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rk59h\" (UniqueName: \"kubernetes.io/projected/743d9187-1a75-4cad-8993-700240941d9c-kube-api-access-rk59h\") on node \"crc\" DevicePath \"\"" Oct 08 14:51:47 crc kubenswrapper[4735]: I1008 14:51:47.720301 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/743d9187-1a75-4cad-8993-700240941d9c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 14:51:48 crc kubenswrapper[4735]: I1008 14:51:48.077956 4735 generic.go:334] "Generic (PLEG): container finished" podID="743d9187-1a75-4cad-8993-700240941d9c" containerID="534d7e3c07fd99058cb6c055d03f368e9f7d8da34042f0faa95e5b718788b8ff" exitCode=0 Oct 08 14:51:48 crc kubenswrapper[4735]: I1008 14:51:48.078006 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l9gn6" event={"ID":"743d9187-1a75-4cad-8993-700240941d9c","Type":"ContainerDied","Data":"534d7e3c07fd99058cb6c055d03f368e9f7d8da34042f0faa95e5b718788b8ff"} Oct 08 14:51:48 crc kubenswrapper[4735]: I1008 14:51:48.078022 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-l9gn6" Oct 08 14:51:48 crc kubenswrapper[4735]: I1008 14:51:48.078039 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l9gn6" event={"ID":"743d9187-1a75-4cad-8993-700240941d9c","Type":"ContainerDied","Data":"e665195cf3d624fb7654eea4c82b9d2b8ea9255b3d3d816dfccb8e3c3b47071d"} Oct 08 14:51:48 crc kubenswrapper[4735]: I1008 14:51:48.078060 4735 scope.go:117] "RemoveContainer" containerID="534d7e3c07fd99058cb6c055d03f368e9f7d8da34042f0faa95e5b718788b8ff" Oct 08 14:51:48 crc kubenswrapper[4735]: I1008 14:51:48.103133 4735 scope.go:117] "RemoveContainer" containerID="93e2bedfa66fd5e06248fdf0a54fb85cf87688cbea436a3e117662ca1b43bfc0" Oct 08 14:51:48 crc kubenswrapper[4735]: I1008 14:51:48.112570 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l9gn6"] Oct 08 14:51:48 crc kubenswrapper[4735]: I1008 14:51:48.122298 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-l9gn6"] Oct 08 14:51:48 crc kubenswrapper[4735]: I1008 14:51:48.142568 4735 scope.go:117] "RemoveContainer" containerID="9293e0da3110c735a2dffbf3c9fe442419ce1ccc729720c5b20a0599d08593dc" Oct 08 14:51:48 crc kubenswrapper[4735]: I1008 14:51:48.168090 4735 scope.go:117] "RemoveContainer" containerID="534d7e3c07fd99058cb6c055d03f368e9f7d8da34042f0faa95e5b718788b8ff" Oct 08 14:51:48 crc kubenswrapper[4735]: E1008 14:51:48.168439 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"534d7e3c07fd99058cb6c055d03f368e9f7d8da34042f0faa95e5b718788b8ff\": container with ID starting with 534d7e3c07fd99058cb6c055d03f368e9f7d8da34042f0faa95e5b718788b8ff not found: ID does not exist" containerID="534d7e3c07fd99058cb6c055d03f368e9f7d8da34042f0faa95e5b718788b8ff" Oct 08 14:51:48 crc kubenswrapper[4735]: I1008 14:51:48.168498 4735 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"534d7e3c07fd99058cb6c055d03f368e9f7d8da34042f0faa95e5b718788b8ff"} err="failed to get container status \"534d7e3c07fd99058cb6c055d03f368e9f7d8da34042f0faa95e5b718788b8ff\": rpc error: code = NotFound desc = could not find container \"534d7e3c07fd99058cb6c055d03f368e9f7d8da34042f0faa95e5b718788b8ff\": container with ID starting with 534d7e3c07fd99058cb6c055d03f368e9f7d8da34042f0faa95e5b718788b8ff not found: ID does not exist" Oct 08 14:51:48 crc kubenswrapper[4735]: I1008 14:51:48.168526 4735 scope.go:117] "RemoveContainer" containerID="93e2bedfa66fd5e06248fdf0a54fb85cf87688cbea436a3e117662ca1b43bfc0" Oct 08 14:51:48 crc kubenswrapper[4735]: E1008 14:51:48.168815 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93e2bedfa66fd5e06248fdf0a54fb85cf87688cbea436a3e117662ca1b43bfc0\": container with ID starting with 93e2bedfa66fd5e06248fdf0a54fb85cf87688cbea436a3e117662ca1b43bfc0 not found: ID does not exist" containerID="93e2bedfa66fd5e06248fdf0a54fb85cf87688cbea436a3e117662ca1b43bfc0" Oct 08 14:51:48 crc kubenswrapper[4735]: I1008 14:51:48.168852 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93e2bedfa66fd5e06248fdf0a54fb85cf87688cbea436a3e117662ca1b43bfc0"} err="failed to get container status \"93e2bedfa66fd5e06248fdf0a54fb85cf87688cbea436a3e117662ca1b43bfc0\": rpc error: code = NotFound desc = could not find container \"93e2bedfa66fd5e06248fdf0a54fb85cf87688cbea436a3e117662ca1b43bfc0\": container with ID starting with 93e2bedfa66fd5e06248fdf0a54fb85cf87688cbea436a3e117662ca1b43bfc0 not found: ID does not exist" Oct 08 14:51:48 crc kubenswrapper[4735]: I1008 14:51:48.168876 4735 scope.go:117] "RemoveContainer" containerID="9293e0da3110c735a2dffbf3c9fe442419ce1ccc729720c5b20a0599d08593dc" Oct 08 14:51:48 crc kubenswrapper[4735]: E1008 
14:51:48.169067 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9293e0da3110c735a2dffbf3c9fe442419ce1ccc729720c5b20a0599d08593dc\": container with ID starting with 9293e0da3110c735a2dffbf3c9fe442419ce1ccc729720c5b20a0599d08593dc not found: ID does not exist" containerID="9293e0da3110c735a2dffbf3c9fe442419ce1ccc729720c5b20a0599d08593dc" Oct 08 14:51:48 crc kubenswrapper[4735]: I1008 14:51:48.169105 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9293e0da3110c735a2dffbf3c9fe442419ce1ccc729720c5b20a0599d08593dc"} err="failed to get container status \"9293e0da3110c735a2dffbf3c9fe442419ce1ccc729720c5b20a0599d08593dc\": rpc error: code = NotFound desc = could not find container \"9293e0da3110c735a2dffbf3c9fe442419ce1ccc729720c5b20a0599d08593dc\": container with ID starting with 9293e0da3110c735a2dffbf3c9fe442419ce1ccc729720c5b20a0599d08593dc not found: ID does not exist" Oct 08 14:51:48 crc kubenswrapper[4735]: I1008 14:51:48.908799 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="743d9187-1a75-4cad-8993-700240941d9c" path="/var/lib/kubelet/pods/743d9187-1a75-4cad-8993-700240941d9c/volumes" Oct 08 14:51:49 crc kubenswrapper[4735]: I1008 14:51:49.076920 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-m7s8z" Oct 08 14:51:49 crc kubenswrapper[4735]: I1008 14:51:49.133376 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-m7s8z" Oct 08 14:51:50 crc kubenswrapper[4735]: I1008 14:51:50.807896 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-m7s8z"] Oct 08 14:51:50 crc kubenswrapper[4735]: I1008 14:51:50.902966 4735 scope.go:117] "RemoveContainer" containerID="11ae741bfa9577c6a72bc007997b80ddaf8323f0c8648ac8d66f8abb80919d56" Oct 
08 14:51:50 crc kubenswrapper[4735]: E1008 14:51:50.903684 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:51:51 crc kubenswrapper[4735]: I1008 14:51:51.106710 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-m7s8z" podUID="0efbb608-fa01-40ae-bb73-ef63e89de811" containerName="registry-server" containerID="cri-o://0076d250c8877a65d60212a2f3da59e9b243ad190817a81bf4ba0dbd23224bfc" gracePeriod=2 Oct 08 14:51:51 crc kubenswrapper[4735]: I1008 14:51:51.628031 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m7s8z" Oct 08 14:51:51 crc kubenswrapper[4735]: I1008 14:51:51.721325 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0efbb608-fa01-40ae-bb73-ef63e89de811-catalog-content\") pod \"0efbb608-fa01-40ae-bb73-ef63e89de811\" (UID: \"0efbb608-fa01-40ae-bb73-ef63e89de811\") " Oct 08 14:51:51 crc kubenswrapper[4735]: I1008 14:51:51.721466 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0efbb608-fa01-40ae-bb73-ef63e89de811-utilities\") pod \"0efbb608-fa01-40ae-bb73-ef63e89de811\" (UID: \"0efbb608-fa01-40ae-bb73-ef63e89de811\") " Oct 08 14:51:51 crc kubenswrapper[4735]: I1008 14:51:51.721605 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5s6sj\" (UniqueName: 
\"kubernetes.io/projected/0efbb608-fa01-40ae-bb73-ef63e89de811-kube-api-access-5s6sj\") pod \"0efbb608-fa01-40ae-bb73-ef63e89de811\" (UID: \"0efbb608-fa01-40ae-bb73-ef63e89de811\") " Oct 08 14:51:51 crc kubenswrapper[4735]: I1008 14:51:51.722932 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0efbb608-fa01-40ae-bb73-ef63e89de811-utilities" (OuterVolumeSpecName: "utilities") pod "0efbb608-fa01-40ae-bb73-ef63e89de811" (UID: "0efbb608-fa01-40ae-bb73-ef63e89de811"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 14:51:51 crc kubenswrapper[4735]: I1008 14:51:51.730466 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0efbb608-fa01-40ae-bb73-ef63e89de811-kube-api-access-5s6sj" (OuterVolumeSpecName: "kube-api-access-5s6sj") pod "0efbb608-fa01-40ae-bb73-ef63e89de811" (UID: "0efbb608-fa01-40ae-bb73-ef63e89de811"). InnerVolumeSpecName "kube-api-access-5s6sj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:51:51 crc kubenswrapper[4735]: I1008 14:51:51.777956 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0efbb608-fa01-40ae-bb73-ef63e89de811-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0efbb608-fa01-40ae-bb73-ef63e89de811" (UID: "0efbb608-fa01-40ae-bb73-ef63e89de811"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 14:51:51 crc kubenswrapper[4735]: I1008 14:51:51.826773 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0efbb608-fa01-40ae-bb73-ef63e89de811-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 14:51:51 crc kubenswrapper[4735]: I1008 14:51:51.826824 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5s6sj\" (UniqueName: \"kubernetes.io/projected/0efbb608-fa01-40ae-bb73-ef63e89de811-kube-api-access-5s6sj\") on node \"crc\" DevicePath \"\"" Oct 08 14:51:51 crc kubenswrapper[4735]: I1008 14:51:51.826837 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0efbb608-fa01-40ae-bb73-ef63e89de811-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 14:51:52 crc kubenswrapper[4735]: I1008 14:51:52.116387 4735 generic.go:334] "Generic (PLEG): container finished" podID="0efbb608-fa01-40ae-bb73-ef63e89de811" containerID="0076d250c8877a65d60212a2f3da59e9b243ad190817a81bf4ba0dbd23224bfc" exitCode=0 Oct 08 14:51:52 crc kubenswrapper[4735]: I1008 14:51:52.116455 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m7s8z" event={"ID":"0efbb608-fa01-40ae-bb73-ef63e89de811","Type":"ContainerDied","Data":"0076d250c8877a65d60212a2f3da59e9b243ad190817a81bf4ba0dbd23224bfc"} Oct 08 14:51:52 crc kubenswrapper[4735]: I1008 14:51:52.116518 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m7s8z" event={"ID":"0efbb608-fa01-40ae-bb73-ef63e89de811","Type":"ContainerDied","Data":"734dcac9b478225d24e6d6414a5d94a1e313146648eac80e0881a8c31096414f"} Oct 08 14:51:52 crc kubenswrapper[4735]: I1008 14:51:52.116591 4735 scope.go:117] "RemoveContainer" containerID="0076d250c8877a65d60212a2f3da59e9b243ad190817a81bf4ba0dbd23224bfc" Oct 08 14:51:52 crc kubenswrapper[4735]: I1008 
14:51:52.116633 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m7s8z" Oct 08 14:51:52 crc kubenswrapper[4735]: I1008 14:51:52.147588 4735 scope.go:117] "RemoveContainer" containerID="f427d29cd9182d65ffa0368ceaae38b714ff989733be9a3d5615953ff2a7dcf3" Oct 08 14:51:52 crc kubenswrapper[4735]: I1008 14:51:52.154895 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-m7s8z"] Oct 08 14:51:52 crc kubenswrapper[4735]: I1008 14:51:52.162851 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-m7s8z"] Oct 08 14:51:52 crc kubenswrapper[4735]: I1008 14:51:52.172198 4735 scope.go:117] "RemoveContainer" containerID="e33335bc4fb800593db95dd21b4ee8c87f14045da14f1c987cc3354aff8dcc5f" Oct 08 14:51:52 crc kubenswrapper[4735]: I1008 14:51:52.217542 4735 scope.go:117] "RemoveContainer" containerID="0076d250c8877a65d60212a2f3da59e9b243ad190817a81bf4ba0dbd23224bfc" Oct 08 14:51:52 crc kubenswrapper[4735]: E1008 14:51:52.217985 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0076d250c8877a65d60212a2f3da59e9b243ad190817a81bf4ba0dbd23224bfc\": container with ID starting with 0076d250c8877a65d60212a2f3da59e9b243ad190817a81bf4ba0dbd23224bfc not found: ID does not exist" containerID="0076d250c8877a65d60212a2f3da59e9b243ad190817a81bf4ba0dbd23224bfc" Oct 08 14:51:52 crc kubenswrapper[4735]: I1008 14:51:52.218048 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0076d250c8877a65d60212a2f3da59e9b243ad190817a81bf4ba0dbd23224bfc"} err="failed to get container status \"0076d250c8877a65d60212a2f3da59e9b243ad190817a81bf4ba0dbd23224bfc\": rpc error: code = NotFound desc = could not find container \"0076d250c8877a65d60212a2f3da59e9b243ad190817a81bf4ba0dbd23224bfc\": container with ID starting with 
0076d250c8877a65d60212a2f3da59e9b243ad190817a81bf4ba0dbd23224bfc not found: ID does not exist" Oct 08 14:51:52 crc kubenswrapper[4735]: I1008 14:51:52.218093 4735 scope.go:117] "RemoveContainer" containerID="f427d29cd9182d65ffa0368ceaae38b714ff989733be9a3d5615953ff2a7dcf3" Oct 08 14:51:52 crc kubenswrapper[4735]: E1008 14:51:52.218528 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f427d29cd9182d65ffa0368ceaae38b714ff989733be9a3d5615953ff2a7dcf3\": container with ID starting with f427d29cd9182d65ffa0368ceaae38b714ff989733be9a3d5615953ff2a7dcf3 not found: ID does not exist" containerID="f427d29cd9182d65ffa0368ceaae38b714ff989733be9a3d5615953ff2a7dcf3" Oct 08 14:51:52 crc kubenswrapper[4735]: I1008 14:51:52.218580 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f427d29cd9182d65ffa0368ceaae38b714ff989733be9a3d5615953ff2a7dcf3"} err="failed to get container status \"f427d29cd9182d65ffa0368ceaae38b714ff989733be9a3d5615953ff2a7dcf3\": rpc error: code = NotFound desc = could not find container \"f427d29cd9182d65ffa0368ceaae38b714ff989733be9a3d5615953ff2a7dcf3\": container with ID starting with f427d29cd9182d65ffa0368ceaae38b714ff989733be9a3d5615953ff2a7dcf3 not found: ID does not exist" Oct 08 14:51:52 crc kubenswrapper[4735]: I1008 14:51:52.218602 4735 scope.go:117] "RemoveContainer" containerID="e33335bc4fb800593db95dd21b4ee8c87f14045da14f1c987cc3354aff8dcc5f" Oct 08 14:51:52 crc kubenswrapper[4735]: E1008 14:51:52.218932 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e33335bc4fb800593db95dd21b4ee8c87f14045da14f1c987cc3354aff8dcc5f\": container with ID starting with e33335bc4fb800593db95dd21b4ee8c87f14045da14f1c987cc3354aff8dcc5f not found: ID does not exist" containerID="e33335bc4fb800593db95dd21b4ee8c87f14045da14f1c987cc3354aff8dcc5f" Oct 08 14:51:52 crc 
kubenswrapper[4735]: I1008 14:51:52.218985 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e33335bc4fb800593db95dd21b4ee8c87f14045da14f1c987cc3354aff8dcc5f"} err="failed to get container status \"e33335bc4fb800593db95dd21b4ee8c87f14045da14f1c987cc3354aff8dcc5f\": rpc error: code = NotFound desc = could not find container \"e33335bc4fb800593db95dd21b4ee8c87f14045da14f1c987cc3354aff8dcc5f\": container with ID starting with e33335bc4fb800593db95dd21b4ee8c87f14045da14f1c987cc3354aff8dcc5f not found: ID does not exist" Oct 08 14:51:52 crc kubenswrapper[4735]: I1008 14:51:52.906514 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0efbb608-fa01-40ae-bb73-ef63e89de811" path="/var/lib/kubelet/pods/0efbb608-fa01-40ae-bb73-ef63e89de811/volumes" Oct 08 14:52:03 crc kubenswrapper[4735]: I1008 14:52:03.896131 4735 scope.go:117] "RemoveContainer" containerID="11ae741bfa9577c6a72bc007997b80ddaf8323f0c8648ac8d66f8abb80919d56" Oct 08 14:52:03 crc kubenswrapper[4735]: E1008 14:52:03.897001 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:52:16 crc kubenswrapper[4735]: I1008 14:52:16.895298 4735 scope.go:117] "RemoveContainer" containerID="11ae741bfa9577c6a72bc007997b80ddaf8323f0c8648ac8d66f8abb80919d56" Oct 08 14:52:16 crc kubenswrapper[4735]: E1008 14:52:16.896124 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:52:29 crc kubenswrapper[4735]: I1008 14:52:29.894962 4735 scope.go:117] "RemoveContainer" containerID="11ae741bfa9577c6a72bc007997b80ddaf8323f0c8648ac8d66f8abb80919d56" Oct 08 14:52:29 crc kubenswrapper[4735]: E1008 14:52:29.895795 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:52:38 crc kubenswrapper[4735]: I1008 14:52:38.551876 4735 generic.go:334] "Generic (PLEG): container finished" podID="7124bfbc-c048-4d7a-be43-049e5bd6b16b" containerID="c79493f3916cc781a2ef7b1cad35c23861d0fed9d4ca658e2040029bb468657e" exitCode=0 Oct 08 14:52:38 crc kubenswrapper[4735]: I1008 14:52:38.551949 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-srlr4" event={"ID":"7124bfbc-c048-4d7a-be43-049e5bd6b16b","Type":"ContainerDied","Data":"c79493f3916cc781a2ef7b1cad35c23861d0fed9d4ca658e2040029bb468657e"} Oct 08 14:52:40 crc kubenswrapper[4735]: I1008 14:52:40.001054 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-srlr4" Oct 08 14:52:40 crc kubenswrapper[4735]: I1008 14:52:40.067941 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7124bfbc-c048-4d7a-be43-049e5bd6b16b-inventory\") pod \"7124bfbc-c048-4d7a-be43-049e5bd6b16b\" (UID: \"7124bfbc-c048-4d7a-be43-049e5bd6b16b\") " Oct 08 14:52:40 crc kubenswrapper[4735]: I1008 14:52:40.068139 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7124bfbc-c048-4d7a-be43-049e5bd6b16b-tripleo-cleanup-combined-ca-bundle\") pod \"7124bfbc-c048-4d7a-be43-049e5bd6b16b\" (UID: \"7124bfbc-c048-4d7a-be43-049e5bd6b16b\") " Oct 08 14:52:40 crc kubenswrapper[4735]: I1008 14:52:40.068193 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gx9wr\" (UniqueName: \"kubernetes.io/projected/7124bfbc-c048-4d7a-be43-049e5bd6b16b-kube-api-access-gx9wr\") pod \"7124bfbc-c048-4d7a-be43-049e5bd6b16b\" (UID: \"7124bfbc-c048-4d7a-be43-049e5bd6b16b\") " Oct 08 14:52:40 crc kubenswrapper[4735]: I1008 14:52:40.068230 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7124bfbc-c048-4d7a-be43-049e5bd6b16b-ceph\") pod \"7124bfbc-c048-4d7a-be43-049e5bd6b16b\" (UID: \"7124bfbc-c048-4d7a-be43-049e5bd6b16b\") " Oct 08 14:52:40 crc kubenswrapper[4735]: I1008 14:52:40.068259 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7124bfbc-c048-4d7a-be43-049e5bd6b16b-ssh-key\") pod \"7124bfbc-c048-4d7a-be43-049e5bd6b16b\" (UID: \"7124bfbc-c048-4d7a-be43-049e5bd6b16b\") " Oct 08 14:52:40 crc kubenswrapper[4735]: I1008 14:52:40.074042 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/projected/7124bfbc-c048-4d7a-be43-049e5bd6b16b-kube-api-access-gx9wr" (OuterVolumeSpecName: "kube-api-access-gx9wr") pod "7124bfbc-c048-4d7a-be43-049e5bd6b16b" (UID: "7124bfbc-c048-4d7a-be43-049e5bd6b16b"). InnerVolumeSpecName "kube-api-access-gx9wr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:52:40 crc kubenswrapper[4735]: I1008 14:52:40.074152 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7124bfbc-c048-4d7a-be43-049e5bd6b16b-ceph" (OuterVolumeSpecName: "ceph") pod "7124bfbc-c048-4d7a-be43-049e5bd6b16b" (UID: "7124bfbc-c048-4d7a-be43-049e5bd6b16b"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:52:40 crc kubenswrapper[4735]: I1008 14:52:40.075322 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7124bfbc-c048-4d7a-be43-049e5bd6b16b-tripleo-cleanup-combined-ca-bundle" (OuterVolumeSpecName: "tripleo-cleanup-combined-ca-bundle") pod "7124bfbc-c048-4d7a-be43-049e5bd6b16b" (UID: "7124bfbc-c048-4d7a-be43-049e5bd6b16b"). InnerVolumeSpecName "tripleo-cleanup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:52:40 crc kubenswrapper[4735]: I1008 14:52:40.098151 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7124bfbc-c048-4d7a-be43-049e5bd6b16b-inventory" (OuterVolumeSpecName: "inventory") pod "7124bfbc-c048-4d7a-be43-049e5bd6b16b" (UID: "7124bfbc-c048-4d7a-be43-049e5bd6b16b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:52:40 crc kubenswrapper[4735]: I1008 14:52:40.100187 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7124bfbc-c048-4d7a-be43-049e5bd6b16b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7124bfbc-c048-4d7a-be43-049e5bd6b16b" (UID: "7124bfbc-c048-4d7a-be43-049e5bd6b16b"). 
InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:52:40 crc kubenswrapper[4735]: I1008 14:52:40.171027 4735 reconciler_common.go:293] "Volume detached for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7124bfbc-c048-4d7a-be43-049e5bd6b16b-tripleo-cleanup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 14:52:40 crc kubenswrapper[4735]: I1008 14:52:40.171072 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gx9wr\" (UniqueName: \"kubernetes.io/projected/7124bfbc-c048-4d7a-be43-049e5bd6b16b-kube-api-access-gx9wr\") on node \"crc\" DevicePath \"\"" Oct 08 14:52:40 crc kubenswrapper[4735]: I1008 14:52:40.171083 4735 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7124bfbc-c048-4d7a-be43-049e5bd6b16b-ceph\") on node \"crc\" DevicePath \"\"" Oct 08 14:52:40 crc kubenswrapper[4735]: I1008 14:52:40.171092 4735 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7124bfbc-c048-4d7a-be43-049e5bd6b16b-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 08 14:52:40 crc kubenswrapper[4735]: I1008 14:52:40.171100 4735 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7124bfbc-c048-4d7a-be43-049e5bd6b16b-inventory\") on node \"crc\" DevicePath \"\"" Oct 08 14:52:40 crc kubenswrapper[4735]: I1008 14:52:40.572574 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-srlr4" event={"ID":"7124bfbc-c048-4d7a-be43-049e5bd6b16b","Type":"ContainerDied","Data":"3a1ae13f580ae4046936b1b217900305ece5165e14f93b674d88f3dd752650b4"} Oct 08 14:52:40 crc kubenswrapper[4735]: I1008 14:52:40.572908 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a1ae13f580ae4046936b1b217900305ece5165e14f93b674d88f3dd752650b4" Oct 08 14:52:40 crc 
kubenswrapper[4735]: I1008 14:52:40.572654 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-srlr4" Oct 08 14:52:42 crc kubenswrapper[4735]: I1008 14:52:42.772539 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-gwbg4"] Oct 08 14:52:42 crc kubenswrapper[4735]: E1008 14:52:42.773226 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0efbb608-fa01-40ae-bb73-ef63e89de811" containerName="extract-content" Oct 08 14:52:42 crc kubenswrapper[4735]: I1008 14:52:42.773239 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="0efbb608-fa01-40ae-bb73-ef63e89de811" containerName="extract-content" Oct 08 14:52:42 crc kubenswrapper[4735]: E1008 14:52:42.773258 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a101e9b-760a-418b-9d59-67620afe8277" containerName="extract-content" Oct 08 14:52:42 crc kubenswrapper[4735]: I1008 14:52:42.773264 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a101e9b-760a-418b-9d59-67620afe8277" containerName="extract-content" Oct 08 14:52:42 crc kubenswrapper[4735]: E1008 14:52:42.773278 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0efbb608-fa01-40ae-bb73-ef63e89de811" containerName="registry-server" Oct 08 14:52:42 crc kubenswrapper[4735]: I1008 14:52:42.773283 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="0efbb608-fa01-40ae-bb73-ef63e89de811" containerName="registry-server" Oct 08 14:52:42 crc kubenswrapper[4735]: E1008 14:52:42.773298 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a101e9b-760a-418b-9d59-67620afe8277" containerName="registry-server" Oct 08 14:52:42 crc kubenswrapper[4735]: I1008 14:52:42.773304 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a101e9b-760a-418b-9d59-67620afe8277" containerName="registry-server" Oct 08 14:52:42 crc kubenswrapper[4735]: 
E1008 14:52:42.773315 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="743d9187-1a75-4cad-8993-700240941d9c" containerName="extract-utilities" Oct 08 14:52:42 crc kubenswrapper[4735]: I1008 14:52:42.773322 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="743d9187-1a75-4cad-8993-700240941d9c" containerName="extract-utilities" Oct 08 14:52:42 crc kubenswrapper[4735]: E1008 14:52:42.773328 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="743d9187-1a75-4cad-8993-700240941d9c" containerName="extract-content" Oct 08 14:52:42 crc kubenswrapper[4735]: I1008 14:52:42.773334 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="743d9187-1a75-4cad-8993-700240941d9c" containerName="extract-content" Oct 08 14:52:42 crc kubenswrapper[4735]: E1008 14:52:42.773344 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0efbb608-fa01-40ae-bb73-ef63e89de811" containerName="extract-utilities" Oct 08 14:52:42 crc kubenswrapper[4735]: I1008 14:52:42.773353 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="0efbb608-fa01-40ae-bb73-ef63e89de811" containerName="extract-utilities" Oct 08 14:52:42 crc kubenswrapper[4735]: E1008 14:52:42.773372 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="743d9187-1a75-4cad-8993-700240941d9c" containerName="registry-server" Oct 08 14:52:42 crc kubenswrapper[4735]: I1008 14:52:42.773380 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="743d9187-1a75-4cad-8993-700240941d9c" containerName="registry-server" Oct 08 14:52:42 crc kubenswrapper[4735]: E1008 14:52:42.773395 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7124bfbc-c048-4d7a-be43-049e5bd6b16b" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Oct 08 14:52:42 crc kubenswrapper[4735]: I1008 14:52:42.773402 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="7124bfbc-c048-4d7a-be43-049e5bd6b16b" 
containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Oct 08 14:52:42 crc kubenswrapper[4735]: E1008 14:52:42.773422 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a101e9b-760a-418b-9d59-67620afe8277" containerName="extract-utilities" Oct 08 14:52:42 crc kubenswrapper[4735]: I1008 14:52:42.773428 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a101e9b-760a-418b-9d59-67620afe8277" containerName="extract-utilities" Oct 08 14:52:42 crc kubenswrapper[4735]: I1008 14:52:42.773650 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="743d9187-1a75-4cad-8993-700240941d9c" containerName="registry-server" Oct 08 14:52:42 crc kubenswrapper[4735]: I1008 14:52:42.773668 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a101e9b-760a-418b-9d59-67620afe8277" containerName="registry-server" Oct 08 14:52:42 crc kubenswrapper[4735]: I1008 14:52:42.773680 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="7124bfbc-c048-4d7a-be43-049e5bd6b16b" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Oct 08 14:52:42 crc kubenswrapper[4735]: I1008 14:52:42.773696 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="0efbb608-fa01-40ae-bb73-ef63e89de811" containerName="registry-server" Oct 08 14:52:42 crc kubenswrapper[4735]: I1008 14:52:42.774504 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-gwbg4" Oct 08 14:52:42 crc kubenswrapper[4735]: I1008 14:52:42.778924 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-xccq5" Oct 08 14:52:42 crc kubenswrapper[4735]: I1008 14:52:42.779138 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 08 14:52:42 crc kubenswrapper[4735]: I1008 14:52:42.779323 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 08 14:52:42 crc kubenswrapper[4735]: I1008 14:52:42.789599 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 08 14:52:42 crc kubenswrapper[4735]: I1008 14:52:42.792724 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-gwbg4"] Oct 08 14:52:42 crc kubenswrapper[4735]: I1008 14:52:42.929666 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5765d83b-c2a5-4107-9a4c-7dc7a499ebcf-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-gwbg4\" (UID: \"5765d83b-c2a5-4107-9a4c-7dc7a499ebcf\") " pod="openstack/bootstrap-openstack-openstack-cell1-gwbg4" Oct 08 14:52:42 crc kubenswrapper[4735]: I1008 14:52:42.929772 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5765d83b-c2a5-4107-9a4c-7dc7a499ebcf-ceph\") pod \"bootstrap-openstack-openstack-cell1-gwbg4\" (UID: \"5765d83b-c2a5-4107-9a4c-7dc7a499ebcf\") " pod="openstack/bootstrap-openstack-openstack-cell1-gwbg4" Oct 08 14:52:42 crc kubenswrapper[4735]: I1008 14:52:42.929812 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/5765d83b-c2a5-4107-9a4c-7dc7a499ebcf-inventory\") pod \"bootstrap-openstack-openstack-cell1-gwbg4\" (UID: \"5765d83b-c2a5-4107-9a4c-7dc7a499ebcf\") " pod="openstack/bootstrap-openstack-openstack-cell1-gwbg4" Oct 08 14:52:42 crc kubenswrapper[4735]: I1008 14:52:42.929898 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gf7bk\" (UniqueName: \"kubernetes.io/projected/5765d83b-c2a5-4107-9a4c-7dc7a499ebcf-kube-api-access-gf7bk\") pod \"bootstrap-openstack-openstack-cell1-gwbg4\" (UID: \"5765d83b-c2a5-4107-9a4c-7dc7a499ebcf\") " pod="openstack/bootstrap-openstack-openstack-cell1-gwbg4" Oct 08 14:52:42 crc kubenswrapper[4735]: I1008 14:52:42.929985 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5765d83b-c2a5-4107-9a4c-7dc7a499ebcf-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-gwbg4\" (UID: \"5765d83b-c2a5-4107-9a4c-7dc7a499ebcf\") " pod="openstack/bootstrap-openstack-openstack-cell1-gwbg4" Oct 08 14:52:43 crc kubenswrapper[4735]: I1008 14:52:43.032148 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5765d83b-c2a5-4107-9a4c-7dc7a499ebcf-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-gwbg4\" (UID: \"5765d83b-c2a5-4107-9a4c-7dc7a499ebcf\") " pod="openstack/bootstrap-openstack-openstack-cell1-gwbg4" Oct 08 14:52:43 crc kubenswrapper[4735]: I1008 14:52:43.032258 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5765d83b-c2a5-4107-9a4c-7dc7a499ebcf-ceph\") pod \"bootstrap-openstack-openstack-cell1-gwbg4\" (UID: \"5765d83b-c2a5-4107-9a4c-7dc7a499ebcf\") " pod="openstack/bootstrap-openstack-openstack-cell1-gwbg4" Oct 08 14:52:43 crc kubenswrapper[4735]: 
I1008 14:52:43.032281 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5765d83b-c2a5-4107-9a4c-7dc7a499ebcf-inventory\") pod \"bootstrap-openstack-openstack-cell1-gwbg4\" (UID: \"5765d83b-c2a5-4107-9a4c-7dc7a499ebcf\") " pod="openstack/bootstrap-openstack-openstack-cell1-gwbg4" Oct 08 14:52:43 crc kubenswrapper[4735]: I1008 14:52:43.032372 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gf7bk\" (UniqueName: \"kubernetes.io/projected/5765d83b-c2a5-4107-9a4c-7dc7a499ebcf-kube-api-access-gf7bk\") pod \"bootstrap-openstack-openstack-cell1-gwbg4\" (UID: \"5765d83b-c2a5-4107-9a4c-7dc7a499ebcf\") " pod="openstack/bootstrap-openstack-openstack-cell1-gwbg4" Oct 08 14:52:43 crc kubenswrapper[4735]: I1008 14:52:43.033867 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5765d83b-c2a5-4107-9a4c-7dc7a499ebcf-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-gwbg4\" (UID: \"5765d83b-c2a5-4107-9a4c-7dc7a499ebcf\") " pod="openstack/bootstrap-openstack-openstack-cell1-gwbg4" Oct 08 14:52:43 crc kubenswrapper[4735]: I1008 14:52:43.039283 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5765d83b-c2a5-4107-9a4c-7dc7a499ebcf-ceph\") pod \"bootstrap-openstack-openstack-cell1-gwbg4\" (UID: \"5765d83b-c2a5-4107-9a4c-7dc7a499ebcf\") " pod="openstack/bootstrap-openstack-openstack-cell1-gwbg4" Oct 08 14:52:43 crc kubenswrapper[4735]: I1008 14:52:43.040235 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5765d83b-c2a5-4107-9a4c-7dc7a499ebcf-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-gwbg4\" (UID: \"5765d83b-c2a5-4107-9a4c-7dc7a499ebcf\") " 
pod="openstack/bootstrap-openstack-openstack-cell1-gwbg4" Oct 08 14:52:43 crc kubenswrapper[4735]: I1008 14:52:43.040715 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5765d83b-c2a5-4107-9a4c-7dc7a499ebcf-inventory\") pod \"bootstrap-openstack-openstack-cell1-gwbg4\" (UID: \"5765d83b-c2a5-4107-9a4c-7dc7a499ebcf\") " pod="openstack/bootstrap-openstack-openstack-cell1-gwbg4" Oct 08 14:52:43 crc kubenswrapper[4735]: I1008 14:52:43.053513 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gf7bk\" (UniqueName: \"kubernetes.io/projected/5765d83b-c2a5-4107-9a4c-7dc7a499ebcf-kube-api-access-gf7bk\") pod \"bootstrap-openstack-openstack-cell1-gwbg4\" (UID: \"5765d83b-c2a5-4107-9a4c-7dc7a499ebcf\") " pod="openstack/bootstrap-openstack-openstack-cell1-gwbg4" Oct 08 14:52:43 crc kubenswrapper[4735]: I1008 14:52:43.058885 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5765d83b-c2a5-4107-9a4c-7dc7a499ebcf-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-gwbg4\" (UID: \"5765d83b-c2a5-4107-9a4c-7dc7a499ebcf\") " pod="openstack/bootstrap-openstack-openstack-cell1-gwbg4" Oct 08 14:52:43 crc kubenswrapper[4735]: I1008 14:52:43.104321 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-gwbg4" Oct 08 14:52:43 crc kubenswrapper[4735]: W1008 14:52:43.653880 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5765d83b_c2a5_4107_9a4c_7dc7a499ebcf.slice/crio-831f45cfd22572a2c7c3b02bb3c2fb555889fb6bd0a4f43ddcfe13f8c640b725 WatchSource:0}: Error finding container 831f45cfd22572a2c7c3b02bb3c2fb555889fb6bd0a4f43ddcfe13f8c640b725: Status 404 returned error can't find the container with id 831f45cfd22572a2c7c3b02bb3c2fb555889fb6bd0a4f43ddcfe13f8c640b725 Oct 08 14:52:43 crc kubenswrapper[4735]: I1008 14:52:43.654196 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-gwbg4"] Oct 08 14:52:43 crc kubenswrapper[4735]: I1008 14:52:43.894727 4735 scope.go:117] "RemoveContainer" containerID="11ae741bfa9577c6a72bc007997b80ddaf8323f0c8648ac8d66f8abb80919d56" Oct 08 14:52:43 crc kubenswrapper[4735]: E1008 14:52:43.895153 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:52:44 crc kubenswrapper[4735]: I1008 14:52:44.616761 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-gwbg4" event={"ID":"5765d83b-c2a5-4107-9a4c-7dc7a499ebcf","Type":"ContainerStarted","Data":"831f45cfd22572a2c7c3b02bb3c2fb555889fb6bd0a4f43ddcfe13f8c640b725"} Oct 08 14:52:45 crc kubenswrapper[4735]: I1008 14:52:45.628416 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-gwbg4" 
event={"ID":"5765d83b-c2a5-4107-9a4c-7dc7a499ebcf","Type":"ContainerStarted","Data":"38a4ceb9e6ea86f9561e856743badf7c430156ba2ea52100844f7d7a25ae7c77"} Oct 08 14:52:45 crc kubenswrapper[4735]: I1008 14:52:45.654850 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-openstack-openstack-cell1-gwbg4" podStartSLOduration=2.917148506 podStartE2EDuration="3.654829825s" podCreationTimestamp="2025-10-08 14:52:42 +0000 UTC" firstStartedPulling="2025-10-08 14:52:43.656611344 +0000 UTC m=+7013.464915888" lastFinishedPulling="2025-10-08 14:52:44.394292663 +0000 UTC m=+7014.202597207" observedRunningTime="2025-10-08 14:52:45.643735412 +0000 UTC m=+7015.452039976" watchObservedRunningTime="2025-10-08 14:52:45.654829825 +0000 UTC m=+7015.463134369" Oct 08 14:52:54 crc kubenswrapper[4735]: I1008 14:52:54.895768 4735 scope.go:117] "RemoveContainer" containerID="11ae741bfa9577c6a72bc007997b80ddaf8323f0c8648ac8d66f8abb80919d56" Oct 08 14:52:54 crc kubenswrapper[4735]: E1008 14:52:54.896938 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:53:07 crc kubenswrapper[4735]: I1008 14:53:07.895290 4735 scope.go:117] "RemoveContainer" containerID="11ae741bfa9577c6a72bc007997b80ddaf8323f0c8648ac8d66f8abb80919d56" Oct 08 14:53:08 crc kubenswrapper[4735]: I1008 14:53:08.837623 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" event={"ID":"2dfd0474-f64a-4d3b-9990-df182d079b10","Type":"ContainerStarted","Data":"40722c88e71ae1a6e983f1965064c72387eefa1ff0d32a7e87886f8c9640de17"} Oct 08 14:54:05 crc 
kubenswrapper[4735]: I1008 14:54:05.402934 4735 generic.go:334] "Generic (PLEG): container finished" podID="5765d83b-c2a5-4107-9a4c-7dc7a499ebcf" containerID="38a4ceb9e6ea86f9561e856743badf7c430156ba2ea52100844f7d7a25ae7c77" exitCode=2 Oct 08 14:54:05 crc kubenswrapper[4735]: I1008 14:54:05.403021 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-gwbg4" event={"ID":"5765d83b-c2a5-4107-9a4c-7dc7a499ebcf","Type":"ContainerDied","Data":"38a4ceb9e6ea86f9561e856743badf7c430156ba2ea52100844f7d7a25ae7c77"} Oct 08 14:54:06 crc kubenswrapper[4735]: I1008 14:54:06.855946 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-gwbg4" Oct 08 14:54:06 crc kubenswrapper[4735]: I1008 14:54:06.902656 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf7bk\" (UniqueName: \"kubernetes.io/projected/5765d83b-c2a5-4107-9a4c-7dc7a499ebcf-kube-api-access-gf7bk\") pod \"5765d83b-c2a5-4107-9a4c-7dc7a499ebcf\" (UID: \"5765d83b-c2a5-4107-9a4c-7dc7a499ebcf\") " Oct 08 14:54:06 crc kubenswrapper[4735]: I1008 14:54:06.902773 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5765d83b-c2a5-4107-9a4c-7dc7a499ebcf-bootstrap-combined-ca-bundle\") pod \"5765d83b-c2a5-4107-9a4c-7dc7a499ebcf\" (UID: \"5765d83b-c2a5-4107-9a4c-7dc7a499ebcf\") " Oct 08 14:54:06 crc kubenswrapper[4735]: I1008 14:54:06.902825 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5765d83b-c2a5-4107-9a4c-7dc7a499ebcf-inventory\") pod \"5765d83b-c2a5-4107-9a4c-7dc7a499ebcf\" (UID: \"5765d83b-c2a5-4107-9a4c-7dc7a499ebcf\") " Oct 08 14:54:06 crc kubenswrapper[4735]: I1008 14:54:06.902921 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ceph\" (UniqueName: \"kubernetes.io/secret/5765d83b-c2a5-4107-9a4c-7dc7a499ebcf-ceph\") pod \"5765d83b-c2a5-4107-9a4c-7dc7a499ebcf\" (UID: \"5765d83b-c2a5-4107-9a4c-7dc7a499ebcf\") " Oct 08 14:54:06 crc kubenswrapper[4735]: I1008 14:54:06.902978 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5765d83b-c2a5-4107-9a4c-7dc7a499ebcf-ssh-key\") pod \"5765d83b-c2a5-4107-9a4c-7dc7a499ebcf\" (UID: \"5765d83b-c2a5-4107-9a4c-7dc7a499ebcf\") " Oct 08 14:54:06 crc kubenswrapper[4735]: I1008 14:54:06.909071 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5765d83b-c2a5-4107-9a4c-7dc7a499ebcf-kube-api-access-gf7bk" (OuterVolumeSpecName: "kube-api-access-gf7bk") pod "5765d83b-c2a5-4107-9a4c-7dc7a499ebcf" (UID: "5765d83b-c2a5-4107-9a4c-7dc7a499ebcf"). InnerVolumeSpecName "kube-api-access-gf7bk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:54:06 crc kubenswrapper[4735]: I1008 14:54:06.909811 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5765d83b-c2a5-4107-9a4c-7dc7a499ebcf-ceph" (OuterVolumeSpecName: "ceph") pod "5765d83b-c2a5-4107-9a4c-7dc7a499ebcf" (UID: "5765d83b-c2a5-4107-9a4c-7dc7a499ebcf"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:54:06 crc kubenswrapper[4735]: I1008 14:54:06.911738 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5765d83b-c2a5-4107-9a4c-7dc7a499ebcf-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "5765d83b-c2a5-4107-9a4c-7dc7a499ebcf" (UID: "5765d83b-c2a5-4107-9a4c-7dc7a499ebcf"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:54:06 crc kubenswrapper[4735]: I1008 14:54:06.937157 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5765d83b-c2a5-4107-9a4c-7dc7a499ebcf-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5765d83b-c2a5-4107-9a4c-7dc7a499ebcf" (UID: "5765d83b-c2a5-4107-9a4c-7dc7a499ebcf"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:54:06 crc kubenswrapper[4735]: I1008 14:54:06.937426 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5765d83b-c2a5-4107-9a4c-7dc7a499ebcf-inventory" (OuterVolumeSpecName: "inventory") pod "5765d83b-c2a5-4107-9a4c-7dc7a499ebcf" (UID: "5765d83b-c2a5-4107-9a4c-7dc7a499ebcf"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:54:07 crc kubenswrapper[4735]: I1008 14:54:07.005956 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf7bk\" (UniqueName: \"kubernetes.io/projected/5765d83b-c2a5-4107-9a4c-7dc7a499ebcf-kube-api-access-gf7bk\") on node \"crc\" DevicePath \"\"" Oct 08 14:54:07 crc kubenswrapper[4735]: I1008 14:54:07.006191 4735 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5765d83b-c2a5-4107-9a4c-7dc7a499ebcf-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 14:54:07 crc kubenswrapper[4735]: I1008 14:54:07.006206 4735 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5765d83b-c2a5-4107-9a4c-7dc7a499ebcf-inventory\") on node \"crc\" DevicePath \"\"" Oct 08 14:54:07 crc kubenswrapper[4735]: I1008 14:54:07.006219 4735 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5765d83b-c2a5-4107-9a4c-7dc7a499ebcf-ceph\") on node \"crc\" DevicePath \"\"" Oct 08 14:54:07 crc 
kubenswrapper[4735]: I1008 14:54:07.006231 4735 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5765d83b-c2a5-4107-9a4c-7dc7a499ebcf-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 08 14:54:07 crc kubenswrapper[4735]: I1008 14:54:07.424133 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-gwbg4" event={"ID":"5765d83b-c2a5-4107-9a4c-7dc7a499ebcf","Type":"ContainerDied","Data":"831f45cfd22572a2c7c3b02bb3c2fb555889fb6bd0a4f43ddcfe13f8c640b725"} Oct 08 14:54:07 crc kubenswrapper[4735]: I1008 14:54:07.424473 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="831f45cfd22572a2c7c3b02bb3c2fb555889fb6bd0a4f43ddcfe13f8c640b725" Oct 08 14:54:07 crc kubenswrapper[4735]: I1008 14:54:07.424260 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-gwbg4" Oct 08 14:54:14 crc kubenswrapper[4735]: I1008 14:54:14.027284 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-2lw8z"] Oct 08 14:54:14 crc kubenswrapper[4735]: E1008 14:54:14.028450 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5765d83b-c2a5-4107-9a4c-7dc7a499ebcf" containerName="bootstrap-openstack-openstack-cell1" Oct 08 14:54:14 crc kubenswrapper[4735]: I1008 14:54:14.028469 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5765d83b-c2a5-4107-9a4c-7dc7a499ebcf" containerName="bootstrap-openstack-openstack-cell1" Oct 08 14:54:14 crc kubenswrapper[4735]: I1008 14:54:14.028760 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="5765d83b-c2a5-4107-9a4c-7dc7a499ebcf" containerName="bootstrap-openstack-openstack-cell1" Oct 08 14:54:14 crc kubenswrapper[4735]: I1008 14:54:14.029630 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-2lw8z" Oct 08 14:54:14 crc kubenswrapper[4735]: I1008 14:54:14.032914 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-xccq5" Oct 08 14:54:14 crc kubenswrapper[4735]: I1008 14:54:14.033116 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 08 14:54:14 crc kubenswrapper[4735]: I1008 14:54:14.033273 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 08 14:54:14 crc kubenswrapper[4735]: I1008 14:54:14.033671 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 08 14:54:14 crc kubenswrapper[4735]: I1008 14:54:14.039153 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-2lw8z"] Oct 08 14:54:14 crc kubenswrapper[4735]: I1008 14:54:14.161590 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bc9f6ea6-8afb-4f2d-a276-9e70b84e9259-ceph\") pod \"bootstrap-openstack-openstack-cell1-2lw8z\" (UID: \"bc9f6ea6-8afb-4f2d-a276-9e70b84e9259\") " pod="openstack/bootstrap-openstack-openstack-cell1-2lw8z" Oct 08 14:54:14 crc kubenswrapper[4735]: I1008 14:54:14.161697 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bc9f6ea6-8afb-4f2d-a276-9e70b84e9259-inventory\") pod \"bootstrap-openstack-openstack-cell1-2lw8z\" (UID: \"bc9f6ea6-8afb-4f2d-a276-9e70b84e9259\") " pod="openstack/bootstrap-openstack-openstack-cell1-2lw8z" Oct 08 14:54:14 crc kubenswrapper[4735]: I1008 14:54:14.161738 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/bc9f6ea6-8afb-4f2d-a276-9e70b84e9259-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-2lw8z\" (UID: \"bc9f6ea6-8afb-4f2d-a276-9e70b84e9259\") " pod="openstack/bootstrap-openstack-openstack-cell1-2lw8z"
Oct 08 14:54:14 crc kubenswrapper[4735]: I1008 14:54:14.161817 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5q9p8\" (UniqueName: \"kubernetes.io/projected/bc9f6ea6-8afb-4f2d-a276-9e70b84e9259-kube-api-access-5q9p8\") pod \"bootstrap-openstack-openstack-cell1-2lw8z\" (UID: \"bc9f6ea6-8afb-4f2d-a276-9e70b84e9259\") " pod="openstack/bootstrap-openstack-openstack-cell1-2lw8z"
Oct 08 14:54:14 crc kubenswrapper[4735]: I1008 14:54:14.161834 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc9f6ea6-8afb-4f2d-a276-9e70b84e9259-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-2lw8z\" (UID: \"bc9f6ea6-8afb-4f2d-a276-9e70b84e9259\") " pod="openstack/bootstrap-openstack-openstack-cell1-2lw8z"
Oct 08 14:54:14 crc kubenswrapper[4735]: I1008 14:54:14.263183 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bc9f6ea6-8afb-4f2d-a276-9e70b84e9259-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-2lw8z\" (UID: \"bc9f6ea6-8afb-4f2d-a276-9e70b84e9259\") " pod="openstack/bootstrap-openstack-openstack-cell1-2lw8z"
Oct 08 14:54:14 crc kubenswrapper[4735]: I1008 14:54:14.263279 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5q9p8\" (UniqueName: \"kubernetes.io/projected/bc9f6ea6-8afb-4f2d-a276-9e70b84e9259-kube-api-access-5q9p8\") pod \"bootstrap-openstack-openstack-cell1-2lw8z\" (UID: \"bc9f6ea6-8afb-4f2d-a276-9e70b84e9259\") " pod="openstack/bootstrap-openstack-openstack-cell1-2lw8z"
Oct 08 14:54:14 crc kubenswrapper[4735]: I1008 14:54:14.263300 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc9f6ea6-8afb-4f2d-a276-9e70b84e9259-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-2lw8z\" (UID: \"bc9f6ea6-8afb-4f2d-a276-9e70b84e9259\") " pod="openstack/bootstrap-openstack-openstack-cell1-2lw8z"
Oct 08 14:54:14 crc kubenswrapper[4735]: I1008 14:54:14.263378 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bc9f6ea6-8afb-4f2d-a276-9e70b84e9259-ceph\") pod \"bootstrap-openstack-openstack-cell1-2lw8z\" (UID: \"bc9f6ea6-8afb-4f2d-a276-9e70b84e9259\") " pod="openstack/bootstrap-openstack-openstack-cell1-2lw8z"
Oct 08 14:54:14 crc kubenswrapper[4735]: I1008 14:54:14.263447 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bc9f6ea6-8afb-4f2d-a276-9e70b84e9259-inventory\") pod \"bootstrap-openstack-openstack-cell1-2lw8z\" (UID: \"bc9f6ea6-8afb-4f2d-a276-9e70b84e9259\") " pod="openstack/bootstrap-openstack-openstack-cell1-2lw8z"
Oct 08 14:54:14 crc kubenswrapper[4735]: I1008 14:54:14.270796 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc9f6ea6-8afb-4f2d-a276-9e70b84e9259-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-2lw8z\" (UID: \"bc9f6ea6-8afb-4f2d-a276-9e70b84e9259\") " pod="openstack/bootstrap-openstack-openstack-cell1-2lw8z"
Oct 08 14:54:14 crc kubenswrapper[4735]: I1008 14:54:14.271975 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bc9f6ea6-8afb-4f2d-a276-9e70b84e9259-ceph\") pod \"bootstrap-openstack-openstack-cell1-2lw8z\" (UID: \"bc9f6ea6-8afb-4f2d-a276-9e70b84e9259\") " pod="openstack/bootstrap-openstack-openstack-cell1-2lw8z"
Oct 08 14:54:14 crc kubenswrapper[4735]: I1008 14:54:14.272127 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bc9f6ea6-8afb-4f2d-a276-9e70b84e9259-inventory\") pod \"bootstrap-openstack-openstack-cell1-2lw8z\" (UID: \"bc9f6ea6-8afb-4f2d-a276-9e70b84e9259\") " pod="openstack/bootstrap-openstack-openstack-cell1-2lw8z"
Oct 08 14:54:14 crc kubenswrapper[4735]: I1008 14:54:14.273054 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bc9f6ea6-8afb-4f2d-a276-9e70b84e9259-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-2lw8z\" (UID: \"bc9f6ea6-8afb-4f2d-a276-9e70b84e9259\") " pod="openstack/bootstrap-openstack-openstack-cell1-2lw8z"
Oct 08 14:54:14 crc kubenswrapper[4735]: I1008 14:54:14.280366 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5q9p8\" (UniqueName: \"kubernetes.io/projected/bc9f6ea6-8afb-4f2d-a276-9e70b84e9259-kube-api-access-5q9p8\") pod \"bootstrap-openstack-openstack-cell1-2lw8z\" (UID: \"bc9f6ea6-8afb-4f2d-a276-9e70b84e9259\") " pod="openstack/bootstrap-openstack-openstack-cell1-2lw8z"
Oct 08 14:54:14 crc kubenswrapper[4735]: I1008 14:54:14.355681 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-2lw8z"
Oct 08 14:54:14 crc kubenswrapper[4735]: I1008 14:54:14.913296 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-2lw8z"]
Oct 08 14:54:15 crc kubenswrapper[4735]: I1008 14:54:15.520334 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-2lw8z" event={"ID":"bc9f6ea6-8afb-4f2d-a276-9e70b84e9259","Type":"ContainerStarted","Data":"dc91d7ff4f380e00bd4e57dad6c796c943cc6328cd0b30a0af96f2c7af2518a1"}
Oct 08 14:54:17 crc kubenswrapper[4735]: I1008 14:54:17.540346 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-2lw8z" event={"ID":"bc9f6ea6-8afb-4f2d-a276-9e70b84e9259","Type":"ContainerStarted","Data":"03e4ffe4a1d720934bffd4c814f71e61d88d71bd161931a598f129962e833717"}
Oct 08 14:54:17 crc kubenswrapper[4735]: I1008 14:54:17.571022 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-openstack-openstack-cell1-2lw8z" podStartSLOduration=2.221318523 podStartE2EDuration="3.570997908s" podCreationTimestamp="2025-10-08 14:54:14 +0000 UTC" firstStartedPulling="2025-10-08 14:54:14.91011554 +0000 UTC m=+7104.718420084" lastFinishedPulling="2025-10-08 14:54:16.259794925 +0000 UTC m=+7106.068099469" observedRunningTime="2025-10-08 14:54:17.560943954 +0000 UTC m=+7107.369248508" watchObservedRunningTime="2025-10-08 14:54:17.570997908 +0000 UTC m=+7107.379302452"
Oct 08 14:54:48 crc kubenswrapper[4735]: I1008 14:54:48.857988 4735 generic.go:334] "Generic (PLEG): container finished" podID="bc9f6ea6-8afb-4f2d-a276-9e70b84e9259" containerID="03e4ffe4a1d720934bffd4c814f71e61d88d71bd161931a598f129962e833717" exitCode=2
Oct 08 14:54:48 crc kubenswrapper[4735]: I1008 14:54:48.858085 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-2lw8z" event={"ID":"bc9f6ea6-8afb-4f2d-a276-9e70b84e9259","Type":"ContainerDied","Data":"03e4ffe4a1d720934bffd4c814f71e61d88d71bd161931a598f129962e833717"}
Oct 08 14:54:50 crc kubenswrapper[4735]: I1008 14:54:50.318397 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-2lw8z"
Oct 08 14:54:50 crc kubenswrapper[4735]: I1008 14:54:50.506569 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bc9f6ea6-8afb-4f2d-a276-9e70b84e9259-inventory\") pod \"bc9f6ea6-8afb-4f2d-a276-9e70b84e9259\" (UID: \"bc9f6ea6-8afb-4f2d-a276-9e70b84e9259\") "
Oct 08 14:54:50 crc kubenswrapper[4735]: I1008 14:54:50.507503 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bc9f6ea6-8afb-4f2d-a276-9e70b84e9259-ssh-key\") pod \"bc9f6ea6-8afb-4f2d-a276-9e70b84e9259\" (UID: \"bc9f6ea6-8afb-4f2d-a276-9e70b84e9259\") "
Oct 08 14:54:50 crc kubenswrapper[4735]: I1008 14:54:50.507567 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc9f6ea6-8afb-4f2d-a276-9e70b84e9259-bootstrap-combined-ca-bundle\") pod \"bc9f6ea6-8afb-4f2d-a276-9e70b84e9259\" (UID: \"bc9f6ea6-8afb-4f2d-a276-9e70b84e9259\") "
Oct 08 14:54:50 crc kubenswrapper[4735]: I1008 14:54:50.507677 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bc9f6ea6-8afb-4f2d-a276-9e70b84e9259-ceph\") pod \"bc9f6ea6-8afb-4f2d-a276-9e70b84e9259\" (UID: \"bc9f6ea6-8afb-4f2d-a276-9e70b84e9259\") "
Oct 08 14:54:50 crc kubenswrapper[4735]: I1008 14:54:50.507744 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5q9p8\" (UniqueName: \"kubernetes.io/projected/bc9f6ea6-8afb-4f2d-a276-9e70b84e9259-kube-api-access-5q9p8\") pod \"bc9f6ea6-8afb-4f2d-a276-9e70b84e9259\" (UID: \"bc9f6ea6-8afb-4f2d-a276-9e70b84e9259\") "
Oct 08 14:54:50 crc kubenswrapper[4735]: I1008 14:54:50.512668 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc9f6ea6-8afb-4f2d-a276-9e70b84e9259-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "bc9f6ea6-8afb-4f2d-a276-9e70b84e9259" (UID: "bc9f6ea6-8afb-4f2d-a276-9e70b84e9259"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 14:54:50 crc kubenswrapper[4735]: I1008 14:54:50.512987 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc9f6ea6-8afb-4f2d-a276-9e70b84e9259-kube-api-access-5q9p8" (OuterVolumeSpecName: "kube-api-access-5q9p8") pod "bc9f6ea6-8afb-4f2d-a276-9e70b84e9259" (UID: "bc9f6ea6-8afb-4f2d-a276-9e70b84e9259"). InnerVolumeSpecName "kube-api-access-5q9p8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 14:54:50 crc kubenswrapper[4735]: I1008 14:54:50.524512 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc9f6ea6-8afb-4f2d-a276-9e70b84e9259-ceph" (OuterVolumeSpecName: "ceph") pod "bc9f6ea6-8afb-4f2d-a276-9e70b84e9259" (UID: "bc9f6ea6-8afb-4f2d-a276-9e70b84e9259"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 14:54:50 crc kubenswrapper[4735]: I1008 14:54:50.538095 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc9f6ea6-8afb-4f2d-a276-9e70b84e9259-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "bc9f6ea6-8afb-4f2d-a276-9e70b84e9259" (UID: "bc9f6ea6-8afb-4f2d-a276-9e70b84e9259"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 14:54:50 crc kubenswrapper[4735]: I1008 14:54:50.538544 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc9f6ea6-8afb-4f2d-a276-9e70b84e9259-inventory" (OuterVolumeSpecName: "inventory") pod "bc9f6ea6-8afb-4f2d-a276-9e70b84e9259" (UID: "bc9f6ea6-8afb-4f2d-a276-9e70b84e9259"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 14:54:50 crc kubenswrapper[4735]: I1008 14:54:50.610577 4735 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bc9f6ea6-8afb-4f2d-a276-9e70b84e9259-inventory\") on node \"crc\" DevicePath \"\""
Oct 08 14:54:50 crc kubenswrapper[4735]: I1008 14:54:50.610618 4735 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bc9f6ea6-8afb-4f2d-a276-9e70b84e9259-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 08 14:54:50 crc kubenswrapper[4735]: I1008 14:54:50.610627 4735 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc9f6ea6-8afb-4f2d-a276-9e70b84e9259-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 08 14:54:50 crc kubenswrapper[4735]: I1008 14:54:50.610640 4735 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bc9f6ea6-8afb-4f2d-a276-9e70b84e9259-ceph\") on node \"crc\" DevicePath \"\""
Oct 08 14:54:50 crc kubenswrapper[4735]: I1008 14:54:50.610650 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5q9p8\" (UniqueName: \"kubernetes.io/projected/bc9f6ea6-8afb-4f2d-a276-9e70b84e9259-kube-api-access-5q9p8\") on node \"crc\" DevicePath \"\""
Oct 08 14:54:50 crc kubenswrapper[4735]: I1008 14:54:50.879264 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-2lw8z" event={"ID":"bc9f6ea6-8afb-4f2d-a276-9e70b84e9259","Type":"ContainerDied","Data":"dc91d7ff4f380e00bd4e57dad6c796c943cc6328cd0b30a0af96f2c7af2518a1"}
Oct 08 14:54:50 crc kubenswrapper[4735]: I1008 14:54:50.879328 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc91d7ff4f380e00bd4e57dad6c796c943cc6328cd0b30a0af96f2c7af2518a1"
Oct 08 14:54:50 crc kubenswrapper[4735]: I1008 14:54:50.880123 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-2lw8z"
Oct 08 14:55:07 crc kubenswrapper[4735]: I1008 14:55:07.031260 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-lls9f"]
Oct 08 14:55:07 crc kubenswrapper[4735]: E1008 14:55:07.032355 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc9f6ea6-8afb-4f2d-a276-9e70b84e9259" containerName="bootstrap-openstack-openstack-cell1"
Oct 08 14:55:07 crc kubenswrapper[4735]: I1008 14:55:07.032376 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc9f6ea6-8afb-4f2d-a276-9e70b84e9259" containerName="bootstrap-openstack-openstack-cell1"
Oct 08 14:55:07 crc kubenswrapper[4735]: I1008 14:55:07.032666 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc9f6ea6-8afb-4f2d-a276-9e70b84e9259" containerName="bootstrap-openstack-openstack-cell1"
Oct 08 14:55:07 crc kubenswrapper[4735]: I1008 14:55:07.033757 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-lls9f"
Oct 08 14:55:07 crc kubenswrapper[4735]: I1008 14:55:07.035494 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret"
Oct 08 14:55:07 crc kubenswrapper[4735]: I1008 14:55:07.036714 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 08 14:55:07 crc kubenswrapper[4735]: I1008 14:55:07.036817 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-xccq5"
Oct 08 14:55:07 crc kubenswrapper[4735]: I1008 14:55:07.037590 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Oct 08 14:55:07 crc kubenswrapper[4735]: I1008 14:55:07.041779 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-lls9f"]
Oct 08 14:55:07 crc kubenswrapper[4735]: I1008 14:55:07.174201 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dd7d1e76-551e-4283-a224-c75b24e79656-ceph\") pod \"bootstrap-openstack-openstack-cell1-lls9f\" (UID: \"dd7d1e76-551e-4283-a224-c75b24e79656\") " pod="openstack/bootstrap-openstack-openstack-cell1-lls9f"
Oct 08 14:55:07 crc kubenswrapper[4735]: I1008 14:55:07.174393 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd7d1e76-551e-4283-a224-c75b24e79656-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-lls9f\" (UID: \"dd7d1e76-551e-4283-a224-c75b24e79656\") " pod="openstack/bootstrap-openstack-openstack-cell1-lls9f"
Oct 08 14:55:07 crc kubenswrapper[4735]: I1008 14:55:07.174703 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dd7d1e76-551e-4283-a224-c75b24e79656-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-lls9f\" (UID: \"dd7d1e76-551e-4283-a224-c75b24e79656\") " pod="openstack/bootstrap-openstack-openstack-cell1-lls9f"
Oct 08 14:55:07 crc kubenswrapper[4735]: I1008 14:55:07.175027 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6224\" (UniqueName: \"kubernetes.io/projected/dd7d1e76-551e-4283-a224-c75b24e79656-kube-api-access-r6224\") pod \"bootstrap-openstack-openstack-cell1-lls9f\" (UID: \"dd7d1e76-551e-4283-a224-c75b24e79656\") " pod="openstack/bootstrap-openstack-openstack-cell1-lls9f"
Oct 08 14:55:07 crc kubenswrapper[4735]: I1008 14:55:07.175157 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd7d1e76-551e-4283-a224-c75b24e79656-inventory\") pod \"bootstrap-openstack-openstack-cell1-lls9f\" (UID: \"dd7d1e76-551e-4283-a224-c75b24e79656\") " pod="openstack/bootstrap-openstack-openstack-cell1-lls9f"
Oct 08 14:55:07 crc kubenswrapper[4735]: I1008 14:55:07.278711 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6224\" (UniqueName: \"kubernetes.io/projected/dd7d1e76-551e-4283-a224-c75b24e79656-kube-api-access-r6224\") pod \"bootstrap-openstack-openstack-cell1-lls9f\" (UID: \"dd7d1e76-551e-4283-a224-c75b24e79656\") " pod="openstack/bootstrap-openstack-openstack-cell1-lls9f"
Oct 08 14:55:07 crc kubenswrapper[4735]: I1008 14:55:07.279227 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd7d1e76-551e-4283-a224-c75b24e79656-inventory\") pod \"bootstrap-openstack-openstack-cell1-lls9f\" (UID: \"dd7d1e76-551e-4283-a224-c75b24e79656\") " pod="openstack/bootstrap-openstack-openstack-cell1-lls9f"
Oct 08 14:55:07 crc kubenswrapper[4735]: I1008 14:55:07.279373 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dd7d1e76-551e-4283-a224-c75b24e79656-ceph\") pod \"bootstrap-openstack-openstack-cell1-lls9f\" (UID: \"dd7d1e76-551e-4283-a224-c75b24e79656\") " pod="openstack/bootstrap-openstack-openstack-cell1-lls9f"
Oct 08 14:55:07 crc kubenswrapper[4735]: I1008 14:55:07.279438 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd7d1e76-551e-4283-a224-c75b24e79656-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-lls9f\" (UID: \"dd7d1e76-551e-4283-a224-c75b24e79656\") " pod="openstack/bootstrap-openstack-openstack-cell1-lls9f"
Oct 08 14:55:07 crc kubenswrapper[4735]: I1008 14:55:07.279578 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dd7d1e76-551e-4283-a224-c75b24e79656-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-lls9f\" (UID: \"dd7d1e76-551e-4283-a224-c75b24e79656\") " pod="openstack/bootstrap-openstack-openstack-cell1-lls9f"
Oct 08 14:55:07 crc kubenswrapper[4735]: I1008 14:55:07.285368 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd7d1e76-551e-4283-a224-c75b24e79656-inventory\") pod \"bootstrap-openstack-openstack-cell1-lls9f\" (UID: \"dd7d1e76-551e-4283-a224-c75b24e79656\") " pod="openstack/bootstrap-openstack-openstack-cell1-lls9f"
Oct 08 14:55:07 crc kubenswrapper[4735]: I1008 14:55:07.285367 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dd7d1e76-551e-4283-a224-c75b24e79656-ceph\") pod \"bootstrap-openstack-openstack-cell1-lls9f\" (UID: \"dd7d1e76-551e-4283-a224-c75b24e79656\") " pod="openstack/bootstrap-openstack-openstack-cell1-lls9f"
Oct 08 14:55:07 crc kubenswrapper[4735]: I1008 14:55:07.285494 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd7d1e76-551e-4283-a224-c75b24e79656-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-lls9f\" (UID: \"dd7d1e76-551e-4283-a224-c75b24e79656\") " pod="openstack/bootstrap-openstack-openstack-cell1-lls9f"
Oct 08 14:55:07 crc kubenswrapper[4735]: I1008 14:55:07.285581 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dd7d1e76-551e-4283-a224-c75b24e79656-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-lls9f\" (UID: \"dd7d1e76-551e-4283-a224-c75b24e79656\") " pod="openstack/bootstrap-openstack-openstack-cell1-lls9f"
Oct 08 14:55:07 crc kubenswrapper[4735]: I1008 14:55:07.301265 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6224\" (UniqueName: \"kubernetes.io/projected/dd7d1e76-551e-4283-a224-c75b24e79656-kube-api-access-r6224\") pod \"bootstrap-openstack-openstack-cell1-lls9f\" (UID: \"dd7d1e76-551e-4283-a224-c75b24e79656\") " pod="openstack/bootstrap-openstack-openstack-cell1-lls9f"
Oct 08 14:55:07 crc kubenswrapper[4735]: I1008 14:55:07.355995 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-lls9f"
Oct 08 14:55:07 crc kubenswrapper[4735]: I1008 14:55:07.915684 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-lls9f"]
Oct 08 14:55:08 crc kubenswrapper[4735]: I1008 14:55:08.073912 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-lls9f" event={"ID":"dd7d1e76-551e-4283-a224-c75b24e79656","Type":"ContainerStarted","Data":"6471da001894e7d658128e2ebe9c14028f131162e2f4cee9d05d04ea2b25f61c"}
Oct 08 14:55:09 crc kubenswrapper[4735]: I1008 14:55:09.085943 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-lls9f" event={"ID":"dd7d1e76-551e-4283-a224-c75b24e79656","Type":"ContainerStarted","Data":"e041756d632874dd7b3cf448086d1b477d91950f8d0e527f5c385377f40076f4"}
Oct 08 14:55:09 crc kubenswrapper[4735]: I1008 14:55:09.106740 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-openstack-openstack-cell1-lls9f" podStartSLOduration=1.5840527629999999 podStartE2EDuration="2.106720398s" podCreationTimestamp="2025-10-08 14:55:07 +0000 UTC" firstStartedPulling="2025-10-08 14:55:07.925458262 +0000 UTC m=+7157.733762806" lastFinishedPulling="2025-10-08 14:55:08.448125907 +0000 UTC m=+7158.256430441" observedRunningTime="2025-10-08 14:55:09.102674046 +0000 UTC m=+7158.910978590" watchObservedRunningTime="2025-10-08 14:55:09.106720398 +0000 UTC m=+7158.915024942"
Oct 08 14:55:20 crc kubenswrapper[4735]: I1008 14:55:20.964864 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qmltg"]
Oct 08 14:55:20 crc kubenswrapper[4735]: I1008 14:55:20.967931 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qmltg"
Oct 08 14:55:20 crc kubenswrapper[4735]: I1008 14:55:20.978606 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qmltg"]
Oct 08 14:55:21 crc kubenswrapper[4735]: I1008 14:55:21.088626 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/649d2348-6578-455f-9771-db94942f4a73-utilities\") pod \"community-operators-qmltg\" (UID: \"649d2348-6578-455f-9771-db94942f4a73\") " pod="openshift-marketplace/community-operators-qmltg"
Oct 08 14:55:21 crc kubenswrapper[4735]: I1008 14:55:21.088866 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/649d2348-6578-455f-9771-db94942f4a73-catalog-content\") pod \"community-operators-qmltg\" (UID: \"649d2348-6578-455f-9771-db94942f4a73\") " pod="openshift-marketplace/community-operators-qmltg"
Oct 08 14:55:21 crc kubenswrapper[4735]: I1008 14:55:21.088978 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wf4s2\" (UniqueName: \"kubernetes.io/projected/649d2348-6578-455f-9771-db94942f4a73-kube-api-access-wf4s2\") pod \"community-operators-qmltg\" (UID: \"649d2348-6578-455f-9771-db94942f4a73\") " pod="openshift-marketplace/community-operators-qmltg"
Oct 08 14:55:21 crc kubenswrapper[4735]: I1008 14:55:21.191474 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/649d2348-6578-455f-9771-db94942f4a73-utilities\") pod \"community-operators-qmltg\" (UID: \"649d2348-6578-455f-9771-db94942f4a73\") " pod="openshift-marketplace/community-operators-qmltg"
Oct 08 14:55:21 crc kubenswrapper[4735]: I1008 14:55:21.191616 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/649d2348-6578-455f-9771-db94942f4a73-catalog-content\") pod \"community-operators-qmltg\" (UID: \"649d2348-6578-455f-9771-db94942f4a73\") " pod="openshift-marketplace/community-operators-qmltg"
Oct 08 14:55:21 crc kubenswrapper[4735]: I1008 14:55:21.191678 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wf4s2\" (UniqueName: \"kubernetes.io/projected/649d2348-6578-455f-9771-db94942f4a73-kube-api-access-wf4s2\") pod \"community-operators-qmltg\" (UID: \"649d2348-6578-455f-9771-db94942f4a73\") " pod="openshift-marketplace/community-operators-qmltg"
Oct 08 14:55:21 crc kubenswrapper[4735]: I1008 14:55:21.192528 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/649d2348-6578-455f-9771-db94942f4a73-utilities\") pod \"community-operators-qmltg\" (UID: \"649d2348-6578-455f-9771-db94942f4a73\") " pod="openshift-marketplace/community-operators-qmltg"
Oct 08 14:55:21 crc kubenswrapper[4735]: I1008 14:55:21.192765 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/649d2348-6578-455f-9771-db94942f4a73-catalog-content\") pod \"community-operators-qmltg\" (UID: \"649d2348-6578-455f-9771-db94942f4a73\") " pod="openshift-marketplace/community-operators-qmltg"
Oct 08 14:55:21 crc kubenswrapper[4735]: I1008 14:55:21.239047 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wf4s2\" (UniqueName: \"kubernetes.io/projected/649d2348-6578-455f-9771-db94942f4a73-kube-api-access-wf4s2\") pod \"community-operators-qmltg\" (UID: \"649d2348-6578-455f-9771-db94942f4a73\") " pod="openshift-marketplace/community-operators-qmltg"
Oct 08 14:55:21 crc kubenswrapper[4735]: I1008 14:55:21.299028 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qmltg"
Oct 08 14:55:21 crc kubenswrapper[4735]: W1008 14:55:21.885628 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod649d2348_6578_455f_9771_db94942f4a73.slice/crio-42cb1076f253b7d0618141c1082aedba865e954597d8702a8db30a4505e5577b WatchSource:0}: Error finding container 42cb1076f253b7d0618141c1082aedba865e954597d8702a8db30a4505e5577b: Status 404 returned error can't find the container with id 42cb1076f253b7d0618141c1082aedba865e954597d8702a8db30a4505e5577b
Oct 08 14:55:21 crc kubenswrapper[4735]: I1008 14:55:21.890451 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qmltg"]
Oct 08 14:55:22 crc kubenswrapper[4735]: I1008 14:55:22.234912 4735 generic.go:334] "Generic (PLEG): container finished" podID="649d2348-6578-455f-9771-db94942f4a73" containerID="e97dc5526073ca4bf29546bf941719ea1db05c184fec73458e1343344271e6d6" exitCode=0
Oct 08 14:55:22 crc kubenswrapper[4735]: I1008 14:55:22.234967 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qmltg" event={"ID":"649d2348-6578-455f-9771-db94942f4a73","Type":"ContainerDied","Data":"e97dc5526073ca4bf29546bf941719ea1db05c184fec73458e1343344271e6d6"}
Oct 08 14:55:22 crc kubenswrapper[4735]: I1008 14:55:22.235765 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qmltg" event={"ID":"649d2348-6578-455f-9771-db94942f4a73","Type":"ContainerStarted","Data":"42cb1076f253b7d0618141c1082aedba865e954597d8702a8db30a4505e5577b"}
Oct 08 14:55:24 crc kubenswrapper[4735]: I1008 14:55:24.254735 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qmltg" event={"ID":"649d2348-6578-455f-9771-db94942f4a73","Type":"ContainerStarted","Data":"34a04186e6ac796ca0fd1ddcbdefd0ebf468e390ac18bc414e869864aeed41e2"}
Oct 08 14:55:24 crc kubenswrapper[4735]: I1008 14:55:24.879099 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 08 14:55:24 crc kubenswrapper[4735]: I1008 14:55:24.879514 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 08 14:55:26 crc kubenswrapper[4735]: I1008 14:55:26.280048 4735 generic.go:334] "Generic (PLEG): container finished" podID="649d2348-6578-455f-9771-db94942f4a73" containerID="34a04186e6ac796ca0fd1ddcbdefd0ebf468e390ac18bc414e869864aeed41e2" exitCode=0
Oct 08 14:55:26 crc kubenswrapper[4735]: I1008 14:55:26.280172 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qmltg" event={"ID":"649d2348-6578-455f-9771-db94942f4a73","Type":"ContainerDied","Data":"34a04186e6ac796ca0fd1ddcbdefd0ebf468e390ac18bc414e869864aeed41e2"}
Oct 08 14:55:27 crc kubenswrapper[4735]: I1008 14:55:27.304002 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qmltg" event={"ID":"649d2348-6578-455f-9771-db94942f4a73","Type":"ContainerStarted","Data":"8e250263ceda24b62abc89be42aae182c1e579893acfa401fad5be4457625e9f"}
Oct 08 14:55:27 crc kubenswrapper[4735]: I1008 14:55:27.326232 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qmltg" podStartSLOduration=2.653476395 podStartE2EDuration="7.32621254s" podCreationTimestamp="2025-10-08 14:55:20 +0000 UTC" firstStartedPulling="2025-10-08 14:55:22.236630486 +0000 UTC m=+7172.044935030" lastFinishedPulling="2025-10-08 14:55:26.909366631 +0000 UTC m=+7176.717671175" observedRunningTime="2025-10-08 14:55:27.325250886 +0000 UTC m=+7177.133555450" watchObservedRunningTime="2025-10-08 14:55:27.32621254 +0000 UTC m=+7177.134517084"
Oct 08 14:55:31 crc kubenswrapper[4735]: I1008 14:55:31.299689 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qmltg"
Oct 08 14:55:31 crc kubenswrapper[4735]: I1008 14:55:31.300351 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qmltg"
Oct 08 14:55:31 crc kubenswrapper[4735]: I1008 14:55:31.343873 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qmltg"
Oct 08 14:55:37 crc kubenswrapper[4735]: I1008 14:55:37.400551 4735 generic.go:334] "Generic (PLEG): container finished" podID="dd7d1e76-551e-4283-a224-c75b24e79656" containerID="e041756d632874dd7b3cf448086d1b477d91950f8d0e527f5c385377f40076f4" exitCode=2
Oct 08 14:55:37 crc kubenswrapper[4735]: I1008 14:55:37.400634 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-lls9f" event={"ID":"dd7d1e76-551e-4283-a224-c75b24e79656","Type":"ContainerDied","Data":"e041756d632874dd7b3cf448086d1b477d91950f8d0e527f5c385377f40076f4"}
Oct 08 14:55:38 crc kubenswrapper[4735]: I1008 14:55:38.880004 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-lls9f"
Oct 08 14:55:38 crc kubenswrapper[4735]: I1008 14:55:38.937006 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd7d1e76-551e-4283-a224-c75b24e79656-bootstrap-combined-ca-bundle\") pod \"dd7d1e76-551e-4283-a224-c75b24e79656\" (UID: \"dd7d1e76-551e-4283-a224-c75b24e79656\") "
Oct 08 14:55:38 crc kubenswrapper[4735]: I1008 14:55:38.937056 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dd7d1e76-551e-4283-a224-c75b24e79656-ssh-key\") pod \"dd7d1e76-551e-4283-a224-c75b24e79656\" (UID: \"dd7d1e76-551e-4283-a224-c75b24e79656\") "
Oct 08 14:55:38 crc kubenswrapper[4735]: I1008 14:55:38.937087 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6224\" (UniqueName: \"kubernetes.io/projected/dd7d1e76-551e-4283-a224-c75b24e79656-kube-api-access-r6224\") pod \"dd7d1e76-551e-4283-a224-c75b24e79656\" (UID: \"dd7d1e76-551e-4283-a224-c75b24e79656\") "
Oct 08 14:55:38 crc kubenswrapper[4735]: I1008 14:55:38.937151 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd7d1e76-551e-4283-a224-c75b24e79656-inventory\") pod \"dd7d1e76-551e-4283-a224-c75b24e79656\" (UID: \"dd7d1e76-551e-4283-a224-c75b24e79656\") "
Oct 08 14:55:38 crc kubenswrapper[4735]: I1008 14:55:38.937318 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dd7d1e76-551e-4283-a224-c75b24e79656-ceph\") pod \"dd7d1e76-551e-4283-a224-c75b24e79656\" (UID: \"dd7d1e76-551e-4283-a224-c75b24e79656\") "
Oct 08 14:55:38 crc kubenswrapper[4735]: I1008 14:55:38.942923 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd7d1e76-551e-4283-a224-c75b24e79656-kube-api-access-r6224" (OuterVolumeSpecName: "kube-api-access-r6224") pod "dd7d1e76-551e-4283-a224-c75b24e79656" (UID: "dd7d1e76-551e-4283-a224-c75b24e79656"). InnerVolumeSpecName "kube-api-access-r6224". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 14:55:38 crc kubenswrapper[4735]: I1008 14:55:38.948868 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd7d1e76-551e-4283-a224-c75b24e79656-ceph" (OuterVolumeSpecName: "ceph") pod "dd7d1e76-551e-4283-a224-c75b24e79656" (UID: "dd7d1e76-551e-4283-a224-c75b24e79656"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 14:55:38 crc kubenswrapper[4735]: I1008 14:55:38.967929 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd7d1e76-551e-4283-a224-c75b24e79656-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "dd7d1e76-551e-4283-a224-c75b24e79656" (UID: "dd7d1e76-551e-4283-a224-c75b24e79656"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 14:55:38 crc kubenswrapper[4735]: I1008 14:55:38.973038 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd7d1e76-551e-4283-a224-c75b24e79656-inventory" (OuterVolumeSpecName: "inventory") pod "dd7d1e76-551e-4283-a224-c75b24e79656" (UID: "dd7d1e76-551e-4283-a224-c75b24e79656"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 14:55:38 crc kubenswrapper[4735]: I1008 14:55:38.973582 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd7d1e76-551e-4283-a224-c75b24e79656-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "dd7d1e76-551e-4283-a224-c75b24e79656" (UID: "dd7d1e76-551e-4283-a224-c75b24e79656"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 14:55:39 crc kubenswrapper[4735]: I1008 14:55:39.038944 4735 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dd7d1e76-551e-4283-a224-c75b24e79656-ceph\") on node \"crc\" DevicePath \"\""
Oct 08 14:55:39 crc kubenswrapper[4735]: I1008 14:55:39.038988 4735 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd7d1e76-551e-4283-a224-c75b24e79656-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 08 14:55:39 crc kubenswrapper[4735]: I1008 14:55:39.039003 4735 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dd7d1e76-551e-4283-a224-c75b24e79656-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 08 14:55:39 crc kubenswrapper[4735]: I1008 14:55:39.039016 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6224\" (UniqueName: \"kubernetes.io/projected/dd7d1e76-551e-4283-a224-c75b24e79656-kube-api-access-r6224\") on node \"crc\" DevicePath \"\""
Oct 08 14:55:39 crc kubenswrapper[4735]: I1008 14:55:39.039027 4735 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd7d1e76-551e-4283-a224-c75b24e79656-inventory\") on node \"crc\" DevicePath \"\""
Oct 08 14:55:39 crc kubenswrapper[4735]: I1008 14:55:39.420317 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-lls9f" event={"ID":"dd7d1e76-551e-4283-a224-c75b24e79656","Type":"ContainerDied","Data":"6471da001894e7d658128e2ebe9c14028f131162e2f4cee9d05d04ea2b25f61c"}
Oct 08 14:55:39 crc kubenswrapper[4735]: I1008 14:55:39.420704 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6471da001894e7d658128e2ebe9c14028f131162e2f4cee9d05d04ea2b25f61c"
Oct 08 14:55:39 crc kubenswrapper[4735]: I1008 14:55:39.420781 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-lls9f"
Oct 08 14:55:41 crc kubenswrapper[4735]: I1008 14:55:41.345919 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qmltg"
Oct 08 14:55:41 crc kubenswrapper[4735]: I1008 14:55:41.393704 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qmltg"]
Oct 08 14:55:41 crc kubenswrapper[4735]: I1008 14:55:41.436100 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qmltg" podUID="649d2348-6578-455f-9771-db94942f4a73" containerName="registry-server" containerID="cri-o://8e250263ceda24b62abc89be42aae182c1e579893acfa401fad5be4457625e9f" gracePeriod=2
Oct 08 14:55:41 crc kubenswrapper[4735]: I1008 14:55:41.968224 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qmltg"
Oct 08 14:55:42 crc kubenswrapper[4735]: I1008 14:55:42.105623 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wf4s2\" (UniqueName: \"kubernetes.io/projected/649d2348-6578-455f-9771-db94942f4a73-kube-api-access-wf4s2\") pod \"649d2348-6578-455f-9771-db94942f4a73\" (UID: \"649d2348-6578-455f-9771-db94942f4a73\") "
Oct 08 14:55:42 crc kubenswrapper[4735]: I1008 14:55:42.105761 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/649d2348-6578-455f-9771-db94942f4a73-catalog-content\") pod \"649d2348-6578-455f-9771-db94942f4a73\" (UID: \"649d2348-6578-455f-9771-db94942f4a73\") "
Oct 08 14:55:42 crc kubenswrapper[4735]: I1008 14:55:42.105840 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName:
\"kubernetes.io/empty-dir/649d2348-6578-455f-9771-db94942f4a73-utilities\") pod \"649d2348-6578-455f-9771-db94942f4a73\" (UID: \"649d2348-6578-455f-9771-db94942f4a73\") " Oct 08 14:55:42 crc kubenswrapper[4735]: I1008 14:55:42.107287 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/649d2348-6578-455f-9771-db94942f4a73-utilities" (OuterVolumeSpecName: "utilities") pod "649d2348-6578-455f-9771-db94942f4a73" (UID: "649d2348-6578-455f-9771-db94942f4a73"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 14:55:42 crc kubenswrapper[4735]: I1008 14:55:42.116426 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/649d2348-6578-455f-9771-db94942f4a73-kube-api-access-wf4s2" (OuterVolumeSpecName: "kube-api-access-wf4s2") pod "649d2348-6578-455f-9771-db94942f4a73" (UID: "649d2348-6578-455f-9771-db94942f4a73"). InnerVolumeSpecName "kube-api-access-wf4s2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:55:42 crc kubenswrapper[4735]: I1008 14:55:42.161034 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/649d2348-6578-455f-9771-db94942f4a73-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "649d2348-6578-455f-9771-db94942f4a73" (UID: "649d2348-6578-455f-9771-db94942f4a73"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 14:55:42 crc kubenswrapper[4735]: I1008 14:55:42.208563 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wf4s2\" (UniqueName: \"kubernetes.io/projected/649d2348-6578-455f-9771-db94942f4a73-kube-api-access-wf4s2\") on node \"crc\" DevicePath \"\"" Oct 08 14:55:42 crc kubenswrapper[4735]: I1008 14:55:42.208594 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/649d2348-6578-455f-9771-db94942f4a73-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 14:55:42 crc kubenswrapper[4735]: I1008 14:55:42.208609 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/649d2348-6578-455f-9771-db94942f4a73-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 14:55:42 crc kubenswrapper[4735]: I1008 14:55:42.448478 4735 generic.go:334] "Generic (PLEG): container finished" podID="649d2348-6578-455f-9771-db94942f4a73" containerID="8e250263ceda24b62abc89be42aae182c1e579893acfa401fad5be4457625e9f" exitCode=0 Oct 08 14:55:42 crc kubenswrapper[4735]: I1008 14:55:42.448563 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qmltg" Oct 08 14:55:42 crc kubenswrapper[4735]: I1008 14:55:42.448595 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qmltg" event={"ID":"649d2348-6578-455f-9771-db94942f4a73","Type":"ContainerDied","Data":"8e250263ceda24b62abc89be42aae182c1e579893acfa401fad5be4457625e9f"} Oct 08 14:55:42 crc kubenswrapper[4735]: I1008 14:55:42.449425 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qmltg" event={"ID":"649d2348-6578-455f-9771-db94942f4a73","Type":"ContainerDied","Data":"42cb1076f253b7d0618141c1082aedba865e954597d8702a8db30a4505e5577b"} Oct 08 14:55:42 crc kubenswrapper[4735]: I1008 14:55:42.449444 4735 scope.go:117] "RemoveContainer" containerID="8e250263ceda24b62abc89be42aae182c1e579893acfa401fad5be4457625e9f" Oct 08 14:55:42 crc kubenswrapper[4735]: I1008 14:55:42.487186 4735 scope.go:117] "RemoveContainer" containerID="34a04186e6ac796ca0fd1ddcbdefd0ebf468e390ac18bc414e869864aeed41e2" Oct 08 14:55:42 crc kubenswrapper[4735]: I1008 14:55:42.490700 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qmltg"] Oct 08 14:55:42 crc kubenswrapper[4735]: I1008 14:55:42.508209 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qmltg"] Oct 08 14:55:42 crc kubenswrapper[4735]: I1008 14:55:42.528971 4735 scope.go:117] "RemoveContainer" containerID="e97dc5526073ca4bf29546bf941719ea1db05c184fec73458e1343344271e6d6" Oct 08 14:55:42 crc kubenswrapper[4735]: I1008 14:55:42.576268 4735 scope.go:117] "RemoveContainer" containerID="8e250263ceda24b62abc89be42aae182c1e579893acfa401fad5be4457625e9f" Oct 08 14:55:42 crc kubenswrapper[4735]: E1008 14:55:42.578018 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"8e250263ceda24b62abc89be42aae182c1e579893acfa401fad5be4457625e9f\": container with ID starting with 8e250263ceda24b62abc89be42aae182c1e579893acfa401fad5be4457625e9f not found: ID does not exist" containerID="8e250263ceda24b62abc89be42aae182c1e579893acfa401fad5be4457625e9f" Oct 08 14:55:42 crc kubenswrapper[4735]: I1008 14:55:42.578066 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e250263ceda24b62abc89be42aae182c1e579893acfa401fad5be4457625e9f"} err="failed to get container status \"8e250263ceda24b62abc89be42aae182c1e579893acfa401fad5be4457625e9f\": rpc error: code = NotFound desc = could not find container \"8e250263ceda24b62abc89be42aae182c1e579893acfa401fad5be4457625e9f\": container with ID starting with 8e250263ceda24b62abc89be42aae182c1e579893acfa401fad5be4457625e9f not found: ID does not exist" Oct 08 14:55:42 crc kubenswrapper[4735]: I1008 14:55:42.578091 4735 scope.go:117] "RemoveContainer" containerID="34a04186e6ac796ca0fd1ddcbdefd0ebf468e390ac18bc414e869864aeed41e2" Oct 08 14:55:42 crc kubenswrapper[4735]: E1008 14:55:42.578522 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34a04186e6ac796ca0fd1ddcbdefd0ebf468e390ac18bc414e869864aeed41e2\": container with ID starting with 34a04186e6ac796ca0fd1ddcbdefd0ebf468e390ac18bc414e869864aeed41e2 not found: ID does not exist" containerID="34a04186e6ac796ca0fd1ddcbdefd0ebf468e390ac18bc414e869864aeed41e2" Oct 08 14:55:42 crc kubenswrapper[4735]: I1008 14:55:42.578572 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34a04186e6ac796ca0fd1ddcbdefd0ebf468e390ac18bc414e869864aeed41e2"} err="failed to get container status \"34a04186e6ac796ca0fd1ddcbdefd0ebf468e390ac18bc414e869864aeed41e2\": rpc error: code = NotFound desc = could not find container \"34a04186e6ac796ca0fd1ddcbdefd0ebf468e390ac18bc414e869864aeed41e2\": container with ID 
starting with 34a04186e6ac796ca0fd1ddcbdefd0ebf468e390ac18bc414e869864aeed41e2 not found: ID does not exist" Oct 08 14:55:42 crc kubenswrapper[4735]: I1008 14:55:42.578599 4735 scope.go:117] "RemoveContainer" containerID="e97dc5526073ca4bf29546bf941719ea1db05c184fec73458e1343344271e6d6" Oct 08 14:55:42 crc kubenswrapper[4735]: E1008 14:55:42.579029 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e97dc5526073ca4bf29546bf941719ea1db05c184fec73458e1343344271e6d6\": container with ID starting with e97dc5526073ca4bf29546bf941719ea1db05c184fec73458e1343344271e6d6 not found: ID does not exist" containerID="e97dc5526073ca4bf29546bf941719ea1db05c184fec73458e1343344271e6d6" Oct 08 14:55:42 crc kubenswrapper[4735]: I1008 14:55:42.579056 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e97dc5526073ca4bf29546bf941719ea1db05c184fec73458e1343344271e6d6"} err="failed to get container status \"e97dc5526073ca4bf29546bf941719ea1db05c184fec73458e1343344271e6d6\": rpc error: code = NotFound desc = could not find container \"e97dc5526073ca4bf29546bf941719ea1db05c184fec73458e1343344271e6d6\": container with ID starting with e97dc5526073ca4bf29546bf941719ea1db05c184fec73458e1343344271e6d6 not found: ID does not exist" Oct 08 14:55:42 crc kubenswrapper[4735]: I1008 14:55:42.906820 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="649d2348-6578-455f-9771-db94942f4a73" path="/var/lib/kubelet/pods/649d2348-6578-455f-9771-db94942f4a73/volumes" Oct 08 14:55:54 crc kubenswrapper[4735]: I1008 14:55:54.878114 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 14:55:54 crc kubenswrapper[4735]: I1008 
14:55:54.878686 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 14:56:17 crc kubenswrapper[4735]: I1008 14:56:17.036301 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-bdhf5"] Oct 08 14:56:17 crc kubenswrapper[4735]: E1008 14:56:17.037364 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="649d2348-6578-455f-9771-db94942f4a73" containerName="extract-content" Oct 08 14:56:17 crc kubenswrapper[4735]: I1008 14:56:17.037379 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="649d2348-6578-455f-9771-db94942f4a73" containerName="extract-content" Oct 08 14:56:17 crc kubenswrapper[4735]: E1008 14:56:17.037399 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="649d2348-6578-455f-9771-db94942f4a73" containerName="extract-utilities" Oct 08 14:56:17 crc kubenswrapper[4735]: I1008 14:56:17.037405 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="649d2348-6578-455f-9771-db94942f4a73" containerName="extract-utilities" Oct 08 14:56:17 crc kubenswrapper[4735]: E1008 14:56:17.037424 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd7d1e76-551e-4283-a224-c75b24e79656" containerName="bootstrap-openstack-openstack-cell1" Oct 08 14:56:17 crc kubenswrapper[4735]: I1008 14:56:17.037433 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd7d1e76-551e-4283-a224-c75b24e79656" containerName="bootstrap-openstack-openstack-cell1" Oct 08 14:56:17 crc kubenswrapper[4735]: E1008 14:56:17.037444 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="649d2348-6578-455f-9771-db94942f4a73" containerName="registry-server" Oct 08 14:56:17 crc 
kubenswrapper[4735]: I1008 14:56:17.037450 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="649d2348-6578-455f-9771-db94942f4a73" containerName="registry-server" Oct 08 14:56:17 crc kubenswrapper[4735]: I1008 14:56:17.037672 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd7d1e76-551e-4283-a224-c75b24e79656" containerName="bootstrap-openstack-openstack-cell1" Oct 08 14:56:17 crc kubenswrapper[4735]: I1008 14:56:17.037692 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="649d2348-6578-455f-9771-db94942f4a73" containerName="registry-server" Oct 08 14:56:17 crc kubenswrapper[4735]: I1008 14:56:17.038588 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-bdhf5" Oct 08 14:56:17 crc kubenswrapper[4735]: I1008 14:56:17.041839 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 08 14:56:17 crc kubenswrapper[4735]: I1008 14:56:17.042025 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 08 14:56:17 crc kubenswrapper[4735]: I1008 14:56:17.042242 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-xccq5" Oct 08 14:56:17 crc kubenswrapper[4735]: I1008 14:56:17.046540 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-bdhf5"] Oct 08 14:56:17 crc kubenswrapper[4735]: I1008 14:56:17.047004 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 08 14:56:17 crc kubenswrapper[4735]: I1008 14:56:17.157496 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzb72\" (UniqueName: \"kubernetes.io/projected/16c02f1e-e6e3-446d-a10f-4459c8a1bc9c-kube-api-access-nzb72\") pod 
\"bootstrap-openstack-openstack-cell1-bdhf5\" (UID: \"16c02f1e-e6e3-446d-a10f-4459c8a1bc9c\") " pod="openstack/bootstrap-openstack-openstack-cell1-bdhf5" Oct 08 14:56:17 crc kubenswrapper[4735]: I1008 14:56:17.157585 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16c02f1e-e6e3-446d-a10f-4459c8a1bc9c-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-bdhf5\" (UID: \"16c02f1e-e6e3-446d-a10f-4459c8a1bc9c\") " pod="openstack/bootstrap-openstack-openstack-cell1-bdhf5" Oct 08 14:56:17 crc kubenswrapper[4735]: I1008 14:56:17.157759 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/16c02f1e-e6e3-446d-a10f-4459c8a1bc9c-ceph\") pod \"bootstrap-openstack-openstack-cell1-bdhf5\" (UID: \"16c02f1e-e6e3-446d-a10f-4459c8a1bc9c\") " pod="openstack/bootstrap-openstack-openstack-cell1-bdhf5" Oct 08 14:56:17 crc kubenswrapper[4735]: I1008 14:56:17.158061 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/16c02f1e-e6e3-446d-a10f-4459c8a1bc9c-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-bdhf5\" (UID: \"16c02f1e-e6e3-446d-a10f-4459c8a1bc9c\") " pod="openstack/bootstrap-openstack-openstack-cell1-bdhf5" Oct 08 14:56:17 crc kubenswrapper[4735]: I1008 14:56:17.158162 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/16c02f1e-e6e3-446d-a10f-4459c8a1bc9c-inventory\") pod \"bootstrap-openstack-openstack-cell1-bdhf5\" (UID: \"16c02f1e-e6e3-446d-a10f-4459c8a1bc9c\") " pod="openstack/bootstrap-openstack-openstack-cell1-bdhf5" Oct 08 14:56:17 crc kubenswrapper[4735]: I1008 14:56:17.260895 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ceph\" (UniqueName: \"kubernetes.io/secret/16c02f1e-e6e3-446d-a10f-4459c8a1bc9c-ceph\") pod \"bootstrap-openstack-openstack-cell1-bdhf5\" (UID: \"16c02f1e-e6e3-446d-a10f-4459c8a1bc9c\") " pod="openstack/bootstrap-openstack-openstack-cell1-bdhf5" Oct 08 14:56:17 crc kubenswrapper[4735]: I1008 14:56:17.261086 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/16c02f1e-e6e3-446d-a10f-4459c8a1bc9c-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-bdhf5\" (UID: \"16c02f1e-e6e3-446d-a10f-4459c8a1bc9c\") " pod="openstack/bootstrap-openstack-openstack-cell1-bdhf5" Oct 08 14:56:17 crc kubenswrapper[4735]: I1008 14:56:17.261139 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/16c02f1e-e6e3-446d-a10f-4459c8a1bc9c-inventory\") pod \"bootstrap-openstack-openstack-cell1-bdhf5\" (UID: \"16c02f1e-e6e3-446d-a10f-4459c8a1bc9c\") " pod="openstack/bootstrap-openstack-openstack-cell1-bdhf5" Oct 08 14:56:17 crc kubenswrapper[4735]: I1008 14:56:17.261262 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzb72\" (UniqueName: \"kubernetes.io/projected/16c02f1e-e6e3-446d-a10f-4459c8a1bc9c-kube-api-access-nzb72\") pod \"bootstrap-openstack-openstack-cell1-bdhf5\" (UID: \"16c02f1e-e6e3-446d-a10f-4459c8a1bc9c\") " pod="openstack/bootstrap-openstack-openstack-cell1-bdhf5" Oct 08 14:56:17 crc kubenswrapper[4735]: I1008 14:56:17.261313 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16c02f1e-e6e3-446d-a10f-4459c8a1bc9c-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-bdhf5\" (UID: \"16c02f1e-e6e3-446d-a10f-4459c8a1bc9c\") " pod="openstack/bootstrap-openstack-openstack-cell1-bdhf5" Oct 08 14:56:17 crc kubenswrapper[4735]: I1008 14:56:17.270161 4735 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/16c02f1e-e6e3-446d-a10f-4459c8a1bc9c-inventory\") pod \"bootstrap-openstack-openstack-cell1-bdhf5\" (UID: \"16c02f1e-e6e3-446d-a10f-4459c8a1bc9c\") " pod="openstack/bootstrap-openstack-openstack-cell1-bdhf5" Oct 08 14:56:17 crc kubenswrapper[4735]: I1008 14:56:17.271089 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/16c02f1e-e6e3-446d-a10f-4459c8a1bc9c-ceph\") pod \"bootstrap-openstack-openstack-cell1-bdhf5\" (UID: \"16c02f1e-e6e3-446d-a10f-4459c8a1bc9c\") " pod="openstack/bootstrap-openstack-openstack-cell1-bdhf5" Oct 08 14:56:17 crc kubenswrapper[4735]: I1008 14:56:17.271907 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16c02f1e-e6e3-446d-a10f-4459c8a1bc9c-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-bdhf5\" (UID: \"16c02f1e-e6e3-446d-a10f-4459c8a1bc9c\") " pod="openstack/bootstrap-openstack-openstack-cell1-bdhf5" Oct 08 14:56:17 crc kubenswrapper[4735]: I1008 14:56:17.275005 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/16c02f1e-e6e3-446d-a10f-4459c8a1bc9c-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-bdhf5\" (UID: \"16c02f1e-e6e3-446d-a10f-4459c8a1bc9c\") " pod="openstack/bootstrap-openstack-openstack-cell1-bdhf5" Oct 08 14:56:17 crc kubenswrapper[4735]: I1008 14:56:17.283347 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzb72\" (UniqueName: \"kubernetes.io/projected/16c02f1e-e6e3-446d-a10f-4459c8a1bc9c-kube-api-access-nzb72\") pod \"bootstrap-openstack-openstack-cell1-bdhf5\" (UID: \"16c02f1e-e6e3-446d-a10f-4459c8a1bc9c\") " pod="openstack/bootstrap-openstack-openstack-cell1-bdhf5" Oct 08 14:56:17 crc kubenswrapper[4735]: I1008 
14:56:17.375798 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-bdhf5" Oct 08 14:56:17 crc kubenswrapper[4735]: I1008 14:56:17.971587 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-bdhf5"] Oct 08 14:56:17 crc kubenswrapper[4735]: I1008 14:56:17.979445 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 08 14:56:18 crc kubenswrapper[4735]: I1008 14:56:18.811081 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-bdhf5" event={"ID":"16c02f1e-e6e3-446d-a10f-4459c8a1bc9c","Type":"ContainerStarted","Data":"df8b84821b2463f135188150757e9bb97a3022316124d56a7ed16e21ac09b895"} Oct 08 14:56:18 crc kubenswrapper[4735]: I1008 14:56:18.811439 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-bdhf5" event={"ID":"16c02f1e-e6e3-446d-a10f-4459c8a1bc9c","Type":"ContainerStarted","Data":"c91faf2701df0c03281e0af7cbe26df1c17055098106f911a3a0527b73600c60"} Oct 08 14:56:18 crc kubenswrapper[4735]: I1008 14:56:18.834916 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-openstack-openstack-cell1-bdhf5" podStartSLOduration=1.324672279 podStartE2EDuration="1.834895179s" podCreationTimestamp="2025-10-08 14:56:17 +0000 UTC" firstStartedPulling="2025-10-08 14:56:17.979213284 +0000 UTC m=+7227.787517828" lastFinishedPulling="2025-10-08 14:56:18.489436184 +0000 UTC m=+7228.297740728" observedRunningTime="2025-10-08 14:56:18.833106884 +0000 UTC m=+7228.641411488" watchObservedRunningTime="2025-10-08 14:56:18.834895179 +0000 UTC m=+7228.643199723" Oct 08 14:56:24 crc kubenswrapper[4735]: I1008 14:56:24.878619 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 14:56:24 crc kubenswrapper[4735]: I1008 14:56:24.879105 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 14:56:24 crc kubenswrapper[4735]: I1008 14:56:24.879150 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" Oct 08 14:56:24 crc kubenswrapper[4735]: I1008 14:56:24.880352 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"40722c88e71ae1a6e983f1965064c72387eefa1ff0d32a7e87886f8c9640de17"} pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 14:56:24 crc kubenswrapper[4735]: I1008 14:56:24.880423 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" containerID="cri-o://40722c88e71ae1a6e983f1965064c72387eefa1ff0d32a7e87886f8c9640de17" gracePeriod=600 Oct 08 14:56:25 crc kubenswrapper[4735]: I1008 14:56:25.904046 4735 generic.go:334] "Generic (PLEG): container finished" podID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerID="40722c88e71ae1a6e983f1965064c72387eefa1ff0d32a7e87886f8c9640de17" exitCode=0 Oct 08 14:56:25 crc kubenswrapper[4735]: I1008 14:56:25.904126 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" event={"ID":"2dfd0474-f64a-4d3b-9990-df182d079b10","Type":"ContainerDied","Data":"40722c88e71ae1a6e983f1965064c72387eefa1ff0d32a7e87886f8c9640de17"} Oct 08 14:56:25 crc kubenswrapper[4735]: I1008 14:56:25.904600 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" event={"ID":"2dfd0474-f64a-4d3b-9990-df182d079b10","Type":"ContainerStarted","Data":"584da6e7d0cb04e94bd7ca6ddf379929301c09a4661e8cd1fc0a739f5f2554cb"} Oct 08 14:56:25 crc kubenswrapper[4735]: I1008 14:56:25.904621 4735 scope.go:117] "RemoveContainer" containerID="11ae741bfa9577c6a72bc007997b80ddaf8323f0c8648ac8d66f8abb80919d56" Oct 08 14:56:48 crc kubenswrapper[4735]: I1008 14:56:48.113037 4735 generic.go:334] "Generic (PLEG): container finished" podID="16c02f1e-e6e3-446d-a10f-4459c8a1bc9c" containerID="df8b84821b2463f135188150757e9bb97a3022316124d56a7ed16e21ac09b895" exitCode=2 Oct 08 14:56:48 crc kubenswrapper[4735]: I1008 14:56:48.113584 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-bdhf5" event={"ID":"16c02f1e-e6e3-446d-a10f-4459c8a1bc9c","Type":"ContainerDied","Data":"df8b84821b2463f135188150757e9bb97a3022316124d56a7ed16e21ac09b895"} Oct 08 14:56:49 crc kubenswrapper[4735]: I1008 14:56:49.605598 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-bdhf5" Oct 08 14:56:49 crc kubenswrapper[4735]: I1008 14:56:49.701397 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/16c02f1e-e6e3-446d-a10f-4459c8a1bc9c-inventory\") pod \"16c02f1e-e6e3-446d-a10f-4459c8a1bc9c\" (UID: \"16c02f1e-e6e3-446d-a10f-4459c8a1bc9c\") " Oct 08 14:56:49 crc kubenswrapper[4735]: I1008 14:56:49.701652 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16c02f1e-e6e3-446d-a10f-4459c8a1bc9c-bootstrap-combined-ca-bundle\") pod \"16c02f1e-e6e3-446d-a10f-4459c8a1bc9c\" (UID: \"16c02f1e-e6e3-446d-a10f-4459c8a1bc9c\") " Oct 08 14:56:49 crc kubenswrapper[4735]: I1008 14:56:49.701687 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzb72\" (UniqueName: \"kubernetes.io/projected/16c02f1e-e6e3-446d-a10f-4459c8a1bc9c-kube-api-access-nzb72\") pod \"16c02f1e-e6e3-446d-a10f-4459c8a1bc9c\" (UID: \"16c02f1e-e6e3-446d-a10f-4459c8a1bc9c\") " Oct 08 14:56:49 crc kubenswrapper[4735]: I1008 14:56:49.701765 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/16c02f1e-e6e3-446d-a10f-4459c8a1bc9c-ssh-key\") pod \"16c02f1e-e6e3-446d-a10f-4459c8a1bc9c\" (UID: \"16c02f1e-e6e3-446d-a10f-4459c8a1bc9c\") " Oct 08 14:56:49 crc kubenswrapper[4735]: I1008 14:56:49.701818 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/16c02f1e-e6e3-446d-a10f-4459c8a1bc9c-ceph\") pod \"16c02f1e-e6e3-446d-a10f-4459c8a1bc9c\" (UID: \"16c02f1e-e6e3-446d-a10f-4459c8a1bc9c\") " Oct 08 14:56:49 crc kubenswrapper[4735]: I1008 14:56:49.708505 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/16c02f1e-e6e3-446d-a10f-4459c8a1bc9c-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "16c02f1e-e6e3-446d-a10f-4459c8a1bc9c" (UID: "16c02f1e-e6e3-446d-a10f-4459c8a1bc9c"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:56:49 crc kubenswrapper[4735]: I1008 14:56:49.708605 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16c02f1e-e6e3-446d-a10f-4459c8a1bc9c-ceph" (OuterVolumeSpecName: "ceph") pod "16c02f1e-e6e3-446d-a10f-4459c8a1bc9c" (UID: "16c02f1e-e6e3-446d-a10f-4459c8a1bc9c"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:56:49 crc kubenswrapper[4735]: I1008 14:56:49.709707 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16c02f1e-e6e3-446d-a10f-4459c8a1bc9c-kube-api-access-nzb72" (OuterVolumeSpecName: "kube-api-access-nzb72") pod "16c02f1e-e6e3-446d-a10f-4459c8a1bc9c" (UID: "16c02f1e-e6e3-446d-a10f-4459c8a1bc9c"). InnerVolumeSpecName "kube-api-access-nzb72". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 14:56:49 crc kubenswrapper[4735]: I1008 14:56:49.735556 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16c02f1e-e6e3-446d-a10f-4459c8a1bc9c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "16c02f1e-e6e3-446d-a10f-4459c8a1bc9c" (UID: "16c02f1e-e6e3-446d-a10f-4459c8a1bc9c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:56:49 crc kubenswrapper[4735]: I1008 14:56:49.736557 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16c02f1e-e6e3-446d-a10f-4459c8a1bc9c-inventory" (OuterVolumeSpecName: "inventory") pod "16c02f1e-e6e3-446d-a10f-4459c8a1bc9c" (UID: "16c02f1e-e6e3-446d-a10f-4459c8a1bc9c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 14:56:49 crc kubenswrapper[4735]: I1008 14:56:49.804220 4735 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/16c02f1e-e6e3-446d-a10f-4459c8a1bc9c-inventory\") on node \"crc\" DevicePath \"\"" Oct 08 14:56:49 crc kubenswrapper[4735]: I1008 14:56:49.804256 4735 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16c02f1e-e6e3-446d-a10f-4459c8a1bc9c-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 14:56:49 crc kubenswrapper[4735]: I1008 14:56:49.804275 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzb72\" (UniqueName: \"kubernetes.io/projected/16c02f1e-e6e3-446d-a10f-4459c8a1bc9c-kube-api-access-nzb72\") on node \"crc\" DevicePath \"\"" Oct 08 14:56:49 crc kubenswrapper[4735]: I1008 14:56:49.804288 4735 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/16c02f1e-e6e3-446d-a10f-4459c8a1bc9c-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 08 14:56:49 crc kubenswrapper[4735]: I1008 14:56:49.804297 4735 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/16c02f1e-e6e3-446d-a10f-4459c8a1bc9c-ceph\") on node \"crc\" DevicePath \"\"" Oct 08 14:56:50 crc kubenswrapper[4735]: I1008 14:56:50.133635 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-bdhf5" event={"ID":"16c02f1e-e6e3-446d-a10f-4459c8a1bc9c","Type":"ContainerDied","Data":"c91faf2701df0c03281e0af7cbe26df1c17055098106f911a3a0527b73600c60"} Oct 08 14:56:50 crc kubenswrapper[4735]: I1008 14:56:50.133694 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c91faf2701df0c03281e0af7cbe26df1c17055098106f911a3a0527b73600c60" Oct 08 14:56:50 crc kubenswrapper[4735]: I1008 14:56:50.133701 4735 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-bdhf5" Oct 08 14:58:25 crc kubenswrapper[4735]: I1008 14:58:25.214953 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9xzdj/must-gather-wstqn"] Oct 08 14:58:25 crc kubenswrapper[4735]: E1008 14:58:25.216113 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16c02f1e-e6e3-446d-a10f-4459c8a1bc9c" containerName="bootstrap-openstack-openstack-cell1" Oct 08 14:58:25 crc kubenswrapper[4735]: I1008 14:58:25.216132 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="16c02f1e-e6e3-446d-a10f-4459c8a1bc9c" containerName="bootstrap-openstack-openstack-cell1" Oct 08 14:58:25 crc kubenswrapper[4735]: I1008 14:58:25.216395 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="16c02f1e-e6e3-446d-a10f-4459c8a1bc9c" containerName="bootstrap-openstack-openstack-cell1" Oct 08 14:58:25 crc kubenswrapper[4735]: I1008 14:58:25.217923 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9xzdj/must-gather-wstqn" Oct 08 14:58:25 crc kubenswrapper[4735]: I1008 14:58:25.220723 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-9xzdj"/"default-dockercfg-q9shd" Oct 08 14:58:25 crc kubenswrapper[4735]: I1008 14:58:25.220945 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-9xzdj"/"kube-root-ca.crt" Oct 08 14:58:25 crc kubenswrapper[4735]: I1008 14:58:25.221097 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-9xzdj"/"openshift-service-ca.crt" Oct 08 14:58:25 crc kubenswrapper[4735]: I1008 14:58:25.234570 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-9xzdj/must-gather-wstqn"] Oct 08 14:58:25 crc kubenswrapper[4735]: I1008 14:58:25.359739 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4nbh\" (UniqueName: \"kubernetes.io/projected/e0b71b21-6e14-41d1-97c1-3dc43e232014-kube-api-access-g4nbh\") pod \"must-gather-wstqn\" (UID: \"e0b71b21-6e14-41d1-97c1-3dc43e232014\") " pod="openshift-must-gather-9xzdj/must-gather-wstqn" Oct 08 14:58:25 crc kubenswrapper[4735]: I1008 14:58:25.359793 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e0b71b21-6e14-41d1-97c1-3dc43e232014-must-gather-output\") pod \"must-gather-wstqn\" (UID: \"e0b71b21-6e14-41d1-97c1-3dc43e232014\") " pod="openshift-must-gather-9xzdj/must-gather-wstqn" Oct 08 14:58:25 crc kubenswrapper[4735]: I1008 14:58:25.462289 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4nbh\" (UniqueName: \"kubernetes.io/projected/e0b71b21-6e14-41d1-97c1-3dc43e232014-kube-api-access-g4nbh\") pod \"must-gather-wstqn\" (UID: \"e0b71b21-6e14-41d1-97c1-3dc43e232014\") " 
pod="openshift-must-gather-9xzdj/must-gather-wstqn" Oct 08 14:58:25 crc kubenswrapper[4735]: I1008 14:58:25.462343 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e0b71b21-6e14-41d1-97c1-3dc43e232014-must-gather-output\") pod \"must-gather-wstqn\" (UID: \"e0b71b21-6e14-41d1-97c1-3dc43e232014\") " pod="openshift-must-gather-9xzdj/must-gather-wstqn" Oct 08 14:58:25 crc kubenswrapper[4735]: I1008 14:58:25.462872 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e0b71b21-6e14-41d1-97c1-3dc43e232014-must-gather-output\") pod \"must-gather-wstqn\" (UID: \"e0b71b21-6e14-41d1-97c1-3dc43e232014\") " pod="openshift-must-gather-9xzdj/must-gather-wstqn" Oct 08 14:58:25 crc kubenswrapper[4735]: I1008 14:58:25.484157 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4nbh\" (UniqueName: \"kubernetes.io/projected/e0b71b21-6e14-41d1-97c1-3dc43e232014-kube-api-access-g4nbh\") pod \"must-gather-wstqn\" (UID: \"e0b71b21-6e14-41d1-97c1-3dc43e232014\") " pod="openshift-must-gather-9xzdj/must-gather-wstqn" Oct 08 14:58:25 crc kubenswrapper[4735]: I1008 14:58:25.543396 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9xzdj/must-gather-wstqn" Oct 08 14:58:26 crc kubenswrapper[4735]: I1008 14:58:26.211945 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-9xzdj/must-gather-wstqn"] Oct 08 14:58:27 crc kubenswrapper[4735]: I1008 14:58:27.073652 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9xzdj/must-gather-wstqn" event={"ID":"e0b71b21-6e14-41d1-97c1-3dc43e232014","Type":"ContainerStarted","Data":"cfb8177e7767a7c5525aba905afa15fdabd8772f9913d7be1b7f8683c7724c2b"} Oct 08 14:58:32 crc kubenswrapper[4735]: I1008 14:58:32.134662 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9xzdj/must-gather-wstqn" event={"ID":"e0b71b21-6e14-41d1-97c1-3dc43e232014","Type":"ContainerStarted","Data":"ead185b1fdb8aefde2f94903677c7b436300060dc4e0e0df2519c50a22ff0c6b"} Oct 08 14:58:32 crc kubenswrapper[4735]: I1008 14:58:32.135203 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9xzdj/must-gather-wstqn" event={"ID":"e0b71b21-6e14-41d1-97c1-3dc43e232014","Type":"ContainerStarted","Data":"dd06f1c3e37921aebd80e206c6c1a9b582eebc54a08bc3eb3c329a943b964569"} Oct 08 14:58:32 crc kubenswrapper[4735]: I1008 14:58:32.157734 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-9xzdj/must-gather-wstqn" podStartSLOduration=2.119960134 podStartE2EDuration="7.157719439s" podCreationTimestamp="2025-10-08 14:58:25 +0000 UTC" firstStartedPulling="2025-10-08 14:58:26.215328131 +0000 UTC m=+7356.023632675" lastFinishedPulling="2025-10-08 14:58:31.253087426 +0000 UTC m=+7361.061391980" observedRunningTime="2025-10-08 14:58:32.155956525 +0000 UTC m=+7361.964261069" watchObservedRunningTime="2025-10-08 14:58:32.157719439 +0000 UTC m=+7361.966023983" Oct 08 14:58:36 crc kubenswrapper[4735]: I1008 14:58:36.666640 4735 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-9xzdj/crc-debug-7vvwt"] Oct 08 14:58:36 crc kubenswrapper[4735]: I1008 14:58:36.696346 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9xzdj/crc-debug-7vvwt" Oct 08 14:58:36 crc kubenswrapper[4735]: I1008 14:58:36.836851 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpz8q\" (UniqueName: \"kubernetes.io/projected/47fbf1b2-81fc-4660-8103-af767c2894a7-kube-api-access-hpz8q\") pod \"crc-debug-7vvwt\" (UID: \"47fbf1b2-81fc-4660-8103-af767c2894a7\") " pod="openshift-must-gather-9xzdj/crc-debug-7vvwt" Oct 08 14:58:36 crc kubenswrapper[4735]: I1008 14:58:36.837546 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/47fbf1b2-81fc-4660-8103-af767c2894a7-host\") pod \"crc-debug-7vvwt\" (UID: \"47fbf1b2-81fc-4660-8103-af767c2894a7\") " pod="openshift-must-gather-9xzdj/crc-debug-7vvwt" Oct 08 14:58:36 crc kubenswrapper[4735]: I1008 14:58:36.940141 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpz8q\" (UniqueName: \"kubernetes.io/projected/47fbf1b2-81fc-4660-8103-af767c2894a7-kube-api-access-hpz8q\") pod \"crc-debug-7vvwt\" (UID: \"47fbf1b2-81fc-4660-8103-af767c2894a7\") " pod="openshift-must-gather-9xzdj/crc-debug-7vvwt" Oct 08 14:58:36 crc kubenswrapper[4735]: I1008 14:58:36.940245 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/47fbf1b2-81fc-4660-8103-af767c2894a7-host\") pod \"crc-debug-7vvwt\" (UID: \"47fbf1b2-81fc-4660-8103-af767c2894a7\") " pod="openshift-must-gather-9xzdj/crc-debug-7vvwt" Oct 08 14:58:36 crc kubenswrapper[4735]: I1008 14:58:36.940413 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/47fbf1b2-81fc-4660-8103-af767c2894a7-host\") pod \"crc-debug-7vvwt\" (UID: \"47fbf1b2-81fc-4660-8103-af767c2894a7\") " pod="openshift-must-gather-9xzdj/crc-debug-7vvwt" Oct 08 14:58:36 crc kubenswrapper[4735]: I1008 14:58:36.959516 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpz8q\" (UniqueName: \"kubernetes.io/projected/47fbf1b2-81fc-4660-8103-af767c2894a7-kube-api-access-hpz8q\") pod \"crc-debug-7vvwt\" (UID: \"47fbf1b2-81fc-4660-8103-af767c2894a7\") " pod="openshift-must-gather-9xzdj/crc-debug-7vvwt" Oct 08 14:58:37 crc kubenswrapper[4735]: I1008 14:58:37.031562 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9xzdj/crc-debug-7vvwt" Oct 08 14:58:37 crc kubenswrapper[4735]: I1008 14:58:37.187998 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9xzdj/crc-debug-7vvwt" event={"ID":"47fbf1b2-81fc-4660-8103-af767c2894a7","Type":"ContainerStarted","Data":"b69787d73b3bbcccc2d8ebf44c88fdeded6b019d34d95053632db4cdbba59551"} Oct 08 14:58:50 crc kubenswrapper[4735]: I1008 14:58:50.316941 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9xzdj/crc-debug-7vvwt" event={"ID":"47fbf1b2-81fc-4660-8103-af767c2894a7","Type":"ContainerStarted","Data":"6b9fc99cd73cb707937434d01a89384145f8fdf14abd62ea8efe79b5853fbf62"} Oct 08 14:58:50 crc kubenswrapper[4735]: I1008 14:58:50.339781 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-9xzdj/crc-debug-7vvwt" podStartSLOduration=1.866298434 podStartE2EDuration="14.339759354s" podCreationTimestamp="2025-10-08 14:58:36 +0000 UTC" firstStartedPulling="2025-10-08 14:58:37.073366716 +0000 UTC m=+7366.881671260" lastFinishedPulling="2025-10-08 14:58:49.546827636 +0000 UTC m=+7379.355132180" observedRunningTime="2025-10-08 14:58:50.335158288 +0000 UTC m=+7380.143462832" watchObservedRunningTime="2025-10-08 
14:58:50.339759354 +0000 UTC m=+7380.148063898" Oct 08 14:58:54 crc kubenswrapper[4735]: I1008 14:58:54.878409 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 14:58:54 crc kubenswrapper[4735]: I1008 14:58:54.879052 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 14:59:24 crc kubenswrapper[4735]: I1008 14:59:24.878829 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 14:59:24 crc kubenswrapper[4735]: I1008 14:59:24.879378 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 14:59:54 crc kubenswrapper[4735]: I1008 14:59:54.878940 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 14:59:54 crc kubenswrapper[4735]: I1008 14:59:54.879583 4735 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 14:59:54 crc kubenswrapper[4735]: I1008 14:59:54.879631 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" Oct 08 14:59:54 crc kubenswrapper[4735]: I1008 14:59:54.880445 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"584da6e7d0cb04e94bd7ca6ddf379929301c09a4661e8cd1fc0a739f5f2554cb"} pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 14:59:54 crc kubenswrapper[4735]: I1008 14:59:54.880532 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" containerID="cri-o://584da6e7d0cb04e94bd7ca6ddf379929301c09a4661e8cd1fc0a739f5f2554cb" gracePeriod=600 Oct 08 14:59:55 crc kubenswrapper[4735]: E1008 14:59:55.066233 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:59:55 crc kubenswrapper[4735]: I1008 14:59:55.977926 4735 generic.go:334] "Generic (PLEG): container finished" podID="2dfd0474-f64a-4d3b-9990-df182d079b10" 
containerID="584da6e7d0cb04e94bd7ca6ddf379929301c09a4661e8cd1fc0a739f5f2554cb" exitCode=0 Oct 08 14:59:55 crc kubenswrapper[4735]: I1008 14:59:55.978002 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" event={"ID":"2dfd0474-f64a-4d3b-9990-df182d079b10","Type":"ContainerDied","Data":"584da6e7d0cb04e94bd7ca6ddf379929301c09a4661e8cd1fc0a739f5f2554cb"} Oct 08 14:59:55 crc kubenswrapper[4735]: I1008 14:59:55.978253 4735 scope.go:117] "RemoveContainer" containerID="40722c88e71ae1a6e983f1965064c72387eefa1ff0d32a7e87886f8c9640de17" Oct 08 14:59:55 crc kubenswrapper[4735]: I1008 14:59:55.978998 4735 scope.go:117] "RemoveContainer" containerID="584da6e7d0cb04e94bd7ca6ddf379929301c09a4661e8cd1fc0a739f5f2554cb" Oct 08 14:59:55 crc kubenswrapper[4735]: E1008 14:59:55.979357 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 14:59:57 crc kubenswrapper[4735]: I1008 14:59:57.329838 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_91970db8-91a6-4c32-ab99-6c72d08118bb/init-config-reloader/0.log" Oct 08 14:59:57 crc kubenswrapper[4735]: I1008 14:59:57.548175 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_91970db8-91a6-4c32-ab99-6c72d08118bb/alertmanager/0.log" Oct 08 14:59:57 crc kubenswrapper[4735]: I1008 14:59:57.632616 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_91970db8-91a6-4c32-ab99-6c72d08118bb/init-config-reloader/0.log" Oct 08 14:59:57 crc 
kubenswrapper[4735]: I1008 14:59:57.632870 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_91970db8-91a6-4c32-ab99-6c72d08118bb/config-reloader/0.log" Oct 08 14:59:57 crc kubenswrapper[4735]: I1008 14:59:57.831527 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_5fc9ad16-4a44-4b09-8622-50ab55a4430c/aodh-api/0.log" Oct 08 14:59:57 crc kubenswrapper[4735]: I1008 14:59:57.867994 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_5fc9ad16-4a44-4b09-8622-50ab55a4430c/aodh-evaluator/0.log" Oct 08 14:59:57 crc kubenswrapper[4735]: I1008 14:59:57.990073 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_5fc9ad16-4a44-4b09-8622-50ab55a4430c/aodh-notifier/0.log" Oct 08 14:59:58 crc kubenswrapper[4735]: I1008 14:59:58.049356 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_5fc9ad16-4a44-4b09-8622-50ab55a4430c/aodh-listener/0.log" Oct 08 14:59:58 crc kubenswrapper[4735]: I1008 14:59:58.195099 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-79b49b68c6-7bkwh_8e685548-ea9a-40b5-b68a-812e88a9170d/barbican-api/0.log" Oct 08 14:59:58 crc kubenswrapper[4735]: I1008 14:59:58.261028 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-79b49b68c6-7bkwh_8e685548-ea9a-40b5-b68a-812e88a9170d/barbican-api-log/0.log" Oct 08 14:59:58 crc kubenswrapper[4735]: I1008 14:59:58.418540 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7d64898b78-7f66b_a6bcda23-f2e7-4534-a215-9f535010166b/barbican-keystone-listener/0.log" Oct 08 14:59:58 crc kubenswrapper[4735]: I1008 14:59:58.467572 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7d64898b78-7f66b_a6bcda23-f2e7-4534-a215-9f535010166b/barbican-keystone-listener-log/0.log" Oct 08 14:59:58 crc 
kubenswrapper[4735]: I1008 14:59:58.665131 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7cdc76b747-5kslq_059d447b-ce74-4cb3-9ed9-ddb4f50d983c/barbican-worker/0.log" Oct 08 14:59:58 crc kubenswrapper[4735]: I1008 14:59:58.682255 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7cdc76b747-5kslq_059d447b-ce74-4cb3-9ed9-ddb4f50d983c/barbican-worker-log/0.log" Oct 08 14:59:58 crc kubenswrapper[4735]: I1008 14:59:58.867818 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-openstack-openstack-cell1-2lw8z_bc9f6ea6-8afb-4f2d-a276-9e70b84e9259/bootstrap-openstack-openstack-cell1/0.log" Oct 08 14:59:59 crc kubenswrapper[4735]: I1008 14:59:59.088236 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-openstack-openstack-cell1-bdhf5_16c02f1e-e6e3-446d-a10f-4459c8a1bc9c/bootstrap-openstack-openstack-cell1/0.log" Oct 08 14:59:59 crc kubenswrapper[4735]: I1008 14:59:59.192587 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-openstack-openstack-cell1-gwbg4_5765d83b-c2a5-4107-9a4c-7dc7a499ebcf/bootstrap-openstack-openstack-cell1/0.log" Oct 08 14:59:59 crc kubenswrapper[4735]: I1008 14:59:59.450020 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-openstack-openstack-cell1-lls9f_dd7d1e76-551e-4283-a224-c75b24e79656/bootstrap-openstack-openstack-cell1/0.log" Oct 08 14:59:59 crc kubenswrapper[4735]: I1008 14:59:59.714000 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_410f0f14-7078-4621-b8b0-6c507c16fe39/ceilometer-notification-agent/0.log" Oct 08 14:59:59 crc kubenswrapper[4735]: I1008 14:59:59.726439 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_410f0f14-7078-4621-b8b0-6c507c16fe39/ceilometer-central-agent/0.log" Oct 08 14:59:59 crc kubenswrapper[4735]: I1008 14:59:59.767610 4735 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_410f0f14-7078-4621-b8b0-6c507c16fe39/proxy-httpd/0.log" Oct 08 14:59:59 crc kubenswrapper[4735]: I1008 14:59:59.971189 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_410f0f14-7078-4621-b8b0-6c507c16fe39/sg-core/0.log" Oct 08 15:00:00 crc kubenswrapper[4735]: I1008 15:00:00.074074 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_cab1c165-0d71-4e95-8e2a-ee9bbd5b650c/cinder-api/0.log" Oct 08 15:00:00 crc kubenswrapper[4735]: I1008 15:00:00.147326 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332260-l9hld"] Oct 08 15:00:00 crc kubenswrapper[4735]: I1008 15:00:00.149468 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332260-l9hld" Oct 08 15:00:00 crc kubenswrapper[4735]: I1008 15:00:00.154378 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 08 15:00:00 crc kubenswrapper[4735]: I1008 15:00:00.154586 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 08 15:00:00 crc kubenswrapper[4735]: I1008 15:00:00.168931 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332260-l9hld"] Oct 08 15:00:00 crc kubenswrapper[4735]: I1008 15:00:00.255435 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5e660034-6529-42c4-971a-5851f66c7905-secret-volume\") pod \"collect-profiles-29332260-l9hld\" (UID: \"5e660034-6529-42c4-971a-5851f66c7905\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332260-l9hld" Oct 08 15:00:00 
crc kubenswrapper[4735]: I1008 15:00:00.255928 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krclr\" (UniqueName: \"kubernetes.io/projected/5e660034-6529-42c4-971a-5851f66c7905-kube-api-access-krclr\") pod \"collect-profiles-29332260-l9hld\" (UID: \"5e660034-6529-42c4-971a-5851f66c7905\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332260-l9hld" Oct 08 15:00:00 crc kubenswrapper[4735]: I1008 15:00:00.256151 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5e660034-6529-42c4-971a-5851f66c7905-config-volume\") pod \"collect-profiles-29332260-l9hld\" (UID: \"5e660034-6529-42c4-971a-5851f66c7905\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332260-l9hld" Oct 08 15:00:00 crc kubenswrapper[4735]: I1008 15:00:00.257235 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_cab1c165-0d71-4e95-8e2a-ee9bbd5b650c/cinder-api-log/0.log" Oct 08 15:00:00 crc kubenswrapper[4735]: I1008 15:00:00.358313 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5e660034-6529-42c4-971a-5851f66c7905-secret-volume\") pod \"collect-profiles-29332260-l9hld\" (UID: \"5e660034-6529-42c4-971a-5851f66c7905\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332260-l9hld" Oct 08 15:00:00 crc kubenswrapper[4735]: I1008 15:00:00.358453 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krclr\" (UniqueName: \"kubernetes.io/projected/5e660034-6529-42c4-971a-5851f66c7905-kube-api-access-krclr\") pod \"collect-profiles-29332260-l9hld\" (UID: \"5e660034-6529-42c4-971a-5851f66c7905\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332260-l9hld" Oct 08 15:00:00 crc kubenswrapper[4735]: I1008 
15:00:00.358550 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5e660034-6529-42c4-971a-5851f66c7905-config-volume\") pod \"collect-profiles-29332260-l9hld\" (UID: \"5e660034-6529-42c4-971a-5851f66c7905\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332260-l9hld" Oct 08 15:00:00 crc kubenswrapper[4735]: I1008 15:00:00.359595 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5e660034-6529-42c4-971a-5851f66c7905-config-volume\") pod \"collect-profiles-29332260-l9hld\" (UID: \"5e660034-6529-42c4-971a-5851f66c7905\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332260-l9hld" Oct 08 15:00:00 crc kubenswrapper[4735]: I1008 15:00:00.547076 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5e660034-6529-42c4-971a-5851f66c7905-secret-volume\") pod \"collect-profiles-29332260-l9hld\" (UID: \"5e660034-6529-42c4-971a-5851f66c7905\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332260-l9hld" Oct 08 15:00:00 crc kubenswrapper[4735]: I1008 15:00:00.547195 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krclr\" (UniqueName: \"kubernetes.io/projected/5e660034-6529-42c4-971a-5851f66c7905-kube-api-access-krclr\") pod \"collect-profiles-29332260-l9hld\" (UID: \"5e660034-6529-42c4-971a-5851f66c7905\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332260-l9hld" Oct 08 15:00:00 crc kubenswrapper[4735]: I1008 15:00:00.577672 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_8afe5a8e-fa52-4d98-a549-284ff784f779/cinder-backup/0.log" Oct 08 15:00:00 crc kubenswrapper[4735]: I1008 15:00:00.579131 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-backup-0_8afe5a8e-fa52-4d98-a549-284ff784f779/probe/0.log" Oct 08 15:00:00 crc kubenswrapper[4735]: I1008 15:00:00.786606 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_b3017100-96d4-4e28-ade2-3c0d3fe5462b/cinder-scheduler/0.log" Oct 08 15:00:00 crc kubenswrapper[4735]: I1008 15:00:00.798916 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332260-l9hld" Oct 08 15:00:00 crc kubenswrapper[4735]: I1008 15:00:00.846813 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_b3017100-96d4-4e28-ade2-3c0d3fe5462b/probe/0.log" Oct 08 15:00:01 crc kubenswrapper[4735]: I1008 15:00:01.095161 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_d87f7873-ab8a-4ac5-ab3f-f3613f8dec25/probe/0.log" Oct 08 15:00:01 crc kubenswrapper[4735]: I1008 15:00:01.140440 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_d87f7873-ab8a-4ac5-ab3f-f3613f8dec25/cinder-volume/0.log" Oct 08 15:00:01 crc kubenswrapper[4735]: I1008 15:00:01.164647 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-d6b45bb5c-54t6j_725b3e41-8e45-4284-b107-c0394209df22/init/0.log" Oct 08 15:00:01 crc kubenswrapper[4735]: I1008 15:00:01.455674 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_c742b20d-b837-4af4-9bd0-61fe5b2799f0/glance-httpd/0.log" Oct 08 15:00:01 crc kubenswrapper[4735]: I1008 15:00:01.466275 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-d6b45bb5c-54t6j_725b3e41-8e45-4284-b107-c0394209df22/init/0.log" Oct 08 15:00:01 crc kubenswrapper[4735]: I1008 15:00:01.480885 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-d6b45bb5c-54t6j_725b3e41-8e45-4284-b107-c0394209df22/dnsmasq-dns/0.log" Oct 08 15:00:01 crc kubenswrapper[4735]: I1008 15:00:01.602225 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332260-l9hld"] Oct 08 15:00:01 crc kubenswrapper[4735]: I1008 15:00:01.707729 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_c742b20d-b837-4af4-9bd0-61fe5b2799f0/glance-log/0.log" Oct 08 15:00:01 crc kubenswrapper[4735]: I1008 15:00:01.792596 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_fd641231-7640-4c7b-8e6f-dfa24dc31650/glance-log/0.log" Oct 08 15:00:01 crc kubenswrapper[4735]: I1008 15:00:01.808326 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_fd641231-7640-4c7b-8e6f-dfa24dc31650/glance-httpd/0.log" Oct 08 15:00:02 crc kubenswrapper[4735]: I1008 15:00:02.017933 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-6fd9c587dd-m8gj4_0c5050eb-31e7-479f-8669-7c97bb659b55/heat-api/0.log" Oct 08 15:00:02 crc kubenswrapper[4735]: I1008 15:00:02.073157 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332260-l9hld" event={"ID":"5e660034-6529-42c4-971a-5851f66c7905","Type":"ContainerStarted","Data":"af5d0a81bae3a339e7e2b3b2c17c1348c4ab6202c6c6fb0554e3d7a168cfc755"} Oct 08 15:00:02 crc kubenswrapper[4735]: I1008 15:00:02.073275 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332260-l9hld" event={"ID":"5e660034-6529-42c4-971a-5851f66c7905","Type":"ContainerStarted","Data":"81f4c5872c16f11d6259839f372d100a9bb113df3b9bb588e4bdc56d2326750f"} Oct 08 15:00:02 crc kubenswrapper[4735]: I1008 15:00:02.102227 4735 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29332260-l9hld" podStartSLOduration=2.102194858 podStartE2EDuration="2.102194858s" podCreationTimestamp="2025-10-08 15:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 15:00:02.093160719 +0000 UTC m=+7451.901465273" watchObservedRunningTime="2025-10-08 15:00:02.102194858 +0000 UTC m=+7451.910499402" Oct 08 15:00:02 crc kubenswrapper[4735]: I1008 15:00:02.184133 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-68df5d85c7-wngwm_9f1aca0c-3f7f-4f0b-937f-56ba721f7f13/heat-cfnapi/0.log" Oct 08 15:00:02 crc kubenswrapper[4735]: I1008 15:00:02.324663 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-5764f7c644-f9hgd_fa238d2c-c63c-43cb-9e0c-6387abad2e1a/heat-engine/0.log" Oct 08 15:00:02 crc kubenswrapper[4735]: I1008 15:00:02.525504 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7b46fd549c-8kjz4_f5f29de7-461a-4ece-b951-e8842bc684a9/horizon/0.log" Oct 08 15:00:02 crc kubenswrapper[4735]: I1008 15:00:02.603696 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7b46fd549c-8kjz4_f5f29de7-461a-4ece-b951-e8842bc684a9/horizon-log/0.log" Oct 08 15:00:02 crc kubenswrapper[4735]: I1008 15:00:02.887797 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-9c68cdbbf-77rxj_8e627d7b-3491-4f1e-8257-df71e34472c2/keystone-api/0.log" Oct 08 15:00:03 crc kubenswrapper[4735]: I1008 15:00:03.083677 4735 generic.go:334] "Generic (PLEG): container finished" podID="5e660034-6529-42c4-971a-5851f66c7905" containerID="af5d0a81bae3a339e7e2b3b2c17c1348c4ab6202c6c6fb0554e3d7a168cfc755" exitCode=0 Oct 08 15:00:03 crc kubenswrapper[4735]: I1008 15:00:03.083727 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29332260-l9hld" event={"ID":"5e660034-6529-42c4-971a-5851f66c7905","Type":"ContainerDied","Data":"af5d0a81bae3a339e7e2b3b2c17c1348c4ab6202c6c6fb0554e3d7a168cfc755"} Oct 08 15:00:03 crc kubenswrapper[4735]: I1008 15:00:03.273366 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_65c37583-c56d-460a-a126-d01c64155288/kube-state-metrics/0.log" Oct 08 15:00:03 crc kubenswrapper[4735]: I1008 15:00:03.483602 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_39e936cd-50d2-4d95-850d-5ca4808b22fa/manila-api/0.log" Oct 08 15:00:03 crc kubenswrapper[4735]: I1008 15:00:03.503769 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_39e936cd-50d2-4d95-850d-5ca4808b22fa/manila-api-log/0.log" Oct 08 15:00:03 crc kubenswrapper[4735]: I1008 15:00:03.586141 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_036e2782-a3e8-4e6c-866a-adc689ba24dd/manila-scheduler/0.log" Oct 08 15:00:03 crc kubenswrapper[4735]: I1008 15:00:03.731643 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_036e2782-a3e8-4e6c-866a-adc689ba24dd/probe/0.log" Oct 08 15:00:03 crc kubenswrapper[4735]: I1008 15:00:03.880570 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_8fdd3521-1c60-4f39-8274-132b679755c8/manila-share/0.log" Oct 08 15:00:03 crc kubenswrapper[4735]: I1008 15:00:03.999523 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_8fdd3521-1c60-4f39-8274-132b679755c8/probe/0.log" Oct 08 15:00:04 crc kubenswrapper[4735]: I1008 15:00:04.134688 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-copy-data_3c84d6ea-2c33-4074-b75c-eb824ba23e0e/adoption/0.log" Oct 08 15:00:04 crc kubenswrapper[4735]: I1008 15:00:04.595871 4735 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332260-l9hld" Oct 08 15:00:04 crc kubenswrapper[4735]: I1008 15:00:04.682800 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5e660034-6529-42c4-971a-5851f66c7905-config-volume\") pod \"5e660034-6529-42c4-971a-5851f66c7905\" (UID: \"5e660034-6529-42c4-971a-5851f66c7905\") " Oct 08 15:00:04 crc kubenswrapper[4735]: I1008 15:00:04.683390 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krclr\" (UniqueName: \"kubernetes.io/projected/5e660034-6529-42c4-971a-5851f66c7905-kube-api-access-krclr\") pod \"5e660034-6529-42c4-971a-5851f66c7905\" (UID: \"5e660034-6529-42c4-971a-5851f66c7905\") " Oct 08 15:00:04 crc kubenswrapper[4735]: I1008 15:00:04.683456 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5e660034-6529-42c4-971a-5851f66c7905-secret-volume\") pod \"5e660034-6529-42c4-971a-5851f66c7905\" (UID: \"5e660034-6529-42c4-971a-5851f66c7905\") " Oct 08 15:00:04 crc kubenswrapper[4735]: I1008 15:00:04.682870 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332215-vqvm6"] Oct 08 15:00:04 crc kubenswrapper[4735]: I1008 15:00:04.683769 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e660034-6529-42c4-971a-5851f66c7905-config-volume" (OuterVolumeSpecName: "config-volume") pod "5e660034-6529-42c4-971a-5851f66c7905" (UID: "5e660034-6529-42c4-971a-5851f66c7905"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 15:00:04 crc kubenswrapper[4735]: I1008 15:00:04.684230 4735 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5e660034-6529-42c4-971a-5851f66c7905-config-volume\") on node \"crc\" DevicePath \"\"" Oct 08 15:00:04 crc kubenswrapper[4735]: I1008 15:00:04.688095 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-58875664cc-zmhz6_b2e6f3a0-af7d-41a8-b515-df3df9059eb9/neutron-httpd/0.log" Oct 08 15:00:04 crc kubenswrapper[4735]: I1008 15:00:04.696688 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332215-vqvm6"] Oct 08 15:00:04 crc kubenswrapper[4735]: I1008 15:00:04.704459 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-58875664cc-zmhz6_b2e6f3a0-af7d-41a8-b515-df3df9059eb9/neutron-api/0.log" Oct 08 15:00:04 crc kubenswrapper[4735]: I1008 15:00:04.711523 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e660034-6529-42c4-971a-5851f66c7905-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5e660034-6529-42c4-971a-5851f66c7905" (UID: "5e660034-6529-42c4-971a-5851f66c7905"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 15:00:04 crc kubenswrapper[4735]: I1008 15:00:04.716682 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e660034-6529-42c4-971a-5851f66c7905-kube-api-access-krclr" (OuterVolumeSpecName: "kube-api-access-krclr") pod "5e660034-6529-42c4-971a-5851f66c7905" (UID: "5e660034-6529-42c4-971a-5851f66c7905"). InnerVolumeSpecName "kube-api-access-krclr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 15:00:04 crc kubenswrapper[4735]: I1008 15:00:04.785797 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krclr\" (UniqueName: \"kubernetes.io/projected/5e660034-6529-42c4-971a-5851f66c7905-kube-api-access-krclr\") on node \"crc\" DevicePath \"\"" Oct 08 15:00:04 crc kubenswrapper[4735]: I1008 15:00:04.785826 4735 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5e660034-6529-42c4-971a-5851f66c7905-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 08 15:00:04 crc kubenswrapper[4735]: I1008 15:00:04.924732 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e167fc-9d02-4424-9f15-e49edef96765" path="/var/lib/kubelet/pods/25e167fc-9d02-4424-9f15-e49edef96765/volumes" Oct 08 15:00:05 crc kubenswrapper[4735]: I1008 15:00:05.056949 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_2a498d75-0f33-45a0-845f-5e0b659dd821/nova-api-api/0.log" Oct 08 15:00:05 crc kubenswrapper[4735]: I1008 15:00:05.107246 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332260-l9hld" event={"ID":"5e660034-6529-42c4-971a-5851f66c7905","Type":"ContainerDied","Data":"81f4c5872c16f11d6259839f372d100a9bb113df3b9bb588e4bdc56d2326750f"} Oct 08 15:00:05 crc kubenswrapper[4735]: I1008 15:00:05.107284 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81f4c5872c16f11d6259839f372d100a9bb113df3b9bb588e4bdc56d2326750f" Oct 08 15:00:05 crc kubenswrapper[4735]: I1008 15:00:05.107344 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332260-l9hld" Oct 08 15:00:05 crc kubenswrapper[4735]: I1008 15:00:05.151978 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_2a498d75-0f33-45a0-845f-5e0b659dd821/nova-api-log/0.log" Oct 08 15:00:05 crc kubenswrapper[4735]: I1008 15:00:05.482800 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_a01d711d-4d74-4227-981c-1b44deaa2823/nova-cell0-conductor-conductor/0.log" Oct 08 15:00:05 crc kubenswrapper[4735]: I1008 15:00:05.545903 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_e2c6229f-75a0-4a80-bcb4-74c95f06e34b/memcached/0.log" Oct 08 15:00:05 crc kubenswrapper[4735]: I1008 15:00:05.620645 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_1824d43f-2acc-4235-a6fd-5a985f2ba4cc/nova-cell1-conductor-conductor/0.log" Oct 08 15:00:06 crc kubenswrapper[4735]: I1008 15:00:06.036704 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_799ff5fc-80ce-4ac8-8296-e133c2d2f1d7/nova-cell1-novncproxy-novncproxy/0.log" Oct 08 15:00:06 crc kubenswrapper[4735]: I1008 15:00:06.236632 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_cb09392d-77b7-4a05-99f7-dec7aa054361/nova-metadata-metadata/0.log" Oct 08 15:00:06 crc kubenswrapper[4735]: I1008 15:00:06.252378 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_cb09392d-77b7-4a05-99f7-dec7aa054361/nova-metadata-log/0.log" Oct 08 15:00:06 crc kubenswrapper[4735]: I1008 15:00:06.383562 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_17137a92-fc98-4065-b44d-ce076e9b69a6/nova-scheduler-scheduler/0.log" Oct 08 15:00:06 crc kubenswrapper[4735]: I1008 15:00:06.576160 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_octavia-api-854f6bb796-5qdwc_0667386c-3ac2-4d41-9ead-7c87d40e8b1b/init/0.log" Oct 08 15:00:06 crc kubenswrapper[4735]: I1008 15:00:06.840875 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-854f6bb796-5qdwc_0667386c-3ac2-4d41-9ead-7c87d40e8b1b/init/0.log" Oct 08 15:00:06 crc kubenswrapper[4735]: I1008 15:00:06.935079 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-854f6bb796-5qdwc_0667386c-3ac2-4d41-9ead-7c87d40e8b1b/octavia-api-provider-agent/0.log" Oct 08 15:00:06 crc kubenswrapper[4735]: I1008 15:00:06.966412 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-854f6bb796-5qdwc_0667386c-3ac2-4d41-9ead-7c87d40e8b1b/octavia-api/0.log" Oct 08 15:00:07 crc kubenswrapper[4735]: I1008 15:00:07.109032 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-wqg8h_009e57c9-9dec-4afb-b6b7-053082bfd574/init/0.log" Oct 08 15:00:07 crc kubenswrapper[4735]: I1008 15:00:07.344185 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-wqg8h_009e57c9-9dec-4afb-b6b7-053082bfd574/init/0.log" Oct 08 15:00:07 crc kubenswrapper[4735]: I1008 15:00:07.465735 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-wqg8h_009e57c9-9dec-4afb-b6b7-053082bfd574/octavia-healthmanager/0.log" Oct 08 15:00:07 crc kubenswrapper[4735]: I1008 15:00:07.487206 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-q84kl_e1d048d3-16c4-4e6f-8433-0680bff87bdc/init/0.log" Oct 08 15:00:07 crc kubenswrapper[4735]: I1008 15:00:07.881167 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-q84kl_e1d048d3-16c4-4e6f-8433-0680bff87bdc/init/0.log" Oct 08 15:00:07 crc kubenswrapper[4735]: I1008 15:00:07.968324 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_octavia-housekeeping-q84kl_e1d048d3-16c4-4e6f-8433-0680bff87bdc/octavia-housekeeping/0.log" Oct 08 15:00:07 crc kubenswrapper[4735]: I1008 15:00:07.978005 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-678599687f-6nfqr_5a052b4b-df0e-4abe-a6ca-fee0a0c9bbff/init/0.log" Oct 08 15:00:08 crc kubenswrapper[4735]: I1008 15:00:08.203624 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-678599687f-6nfqr_5a052b4b-df0e-4abe-a6ca-fee0a0c9bbff/octavia-amphora-httpd/0.log" Oct 08 15:00:08 crc kubenswrapper[4735]: I1008 15:00:08.263746 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-678599687f-6nfqr_5a052b4b-df0e-4abe-a6ca-fee0a0c9bbff/init/0.log" Oct 08 15:00:08 crc kubenswrapper[4735]: I1008 15:00:08.309470 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-zv4fl_6a639a26-f084-43c7-8a61-1f317337c3b1/init/0.log" Oct 08 15:00:08 crc kubenswrapper[4735]: I1008 15:00:08.629765 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-zv4fl_6a639a26-f084-43c7-8a61-1f317337c3b1/octavia-rsyslog/0.log" Oct 08 15:00:08 crc kubenswrapper[4735]: I1008 15:00:08.646892 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-zv4fl_6a639a26-f084-43c7-8a61-1f317337c3b1/init/0.log" Oct 08 15:00:08 crc kubenswrapper[4735]: I1008 15:00:08.671372 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-l7p4s_d768f095-c98f-4232-ac5d-2ecd4b479621/init/0.log" Oct 08 15:00:08 crc kubenswrapper[4735]: I1008 15:00:08.895644 4735 scope.go:117] "RemoveContainer" containerID="584da6e7d0cb04e94bd7ca6ddf379929301c09a4661e8cd1fc0a739f5f2554cb" Oct 08 15:00:08 crc kubenswrapper[4735]: E1008 15:00:08.896066 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 15:00:08 crc kubenswrapper[4735]: I1008 15:00:08.949302 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-l7p4s_d768f095-c98f-4232-ac5d-2ecd4b479621/init/0.log" Oct 08 15:00:09 crc kubenswrapper[4735]: I1008 15:00:09.030956 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-l7p4s_d768f095-c98f-4232-ac5d-2ecd4b479621/octavia-worker/0.log" Oct 08 15:00:09 crc kubenswrapper[4735]: I1008 15:00:09.047762 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_03482ef3-1f39-445a-b172-9fa5b97d4b9e/mysql-bootstrap/0.log" Oct 08 15:00:09 crc kubenswrapper[4735]: I1008 15:00:09.302840 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_03482ef3-1f39-445a-b172-9fa5b97d4b9e/galera/0.log" Oct 08 15:00:09 crc kubenswrapper[4735]: I1008 15:00:09.333892 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_79ca7617-d83c-4d45-9118-f8d7b71bc61a/mysql-bootstrap/0.log" Oct 08 15:00:09 crc kubenswrapper[4735]: I1008 15:00:09.362158 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_03482ef3-1f39-445a-b172-9fa5b97d4b9e/mysql-bootstrap/0.log" Oct 08 15:00:09 crc kubenswrapper[4735]: I1008 15:00:09.605321 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_79ca7617-d83c-4d45-9118-f8d7b71bc61a/mysql-bootstrap/0.log" Oct 08 15:00:09 crc kubenswrapper[4735]: I1008 15:00:09.644658 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstackclient_b55ff56c-fcf2-4ab2-83c9-3e60aa75d8a0/openstackclient/0.log" Oct 08 15:00:09 crc kubenswrapper[4735]: I1008 15:00:09.664284 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_79ca7617-d83c-4d45-9118-f8d7b71bc61a/galera/0.log" Oct 08 15:00:09 crc kubenswrapper[4735]: I1008 15:00:09.890050 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-8jxgq_e962c965-3794-4e43-9e2d-a34b94810817/ovn-controller/0.log" Oct 08 15:00:09 crc kubenswrapper[4735]: I1008 15:00:09.945346 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-d9b2b_242476e6-2f91-4031-aba8-9fe336a6f44a/openstack-network-exporter/0.log" Oct 08 15:00:10 crc kubenswrapper[4735]: I1008 15:00:10.096141 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-z897z_7479271a-126c-460b-8275-f194ac2fff0f/ovsdb-server-init/0.log" Oct 08 15:00:10 crc kubenswrapper[4735]: I1008 15:00:10.334161 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-z897z_7479271a-126c-460b-8275-f194ac2fff0f/ovs-vswitchd/0.log" Oct 08 15:00:10 crc kubenswrapper[4735]: I1008 15:00:10.398266 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-z897z_7479271a-126c-460b-8275-f194ac2fff0f/ovsdb-server-init/0.log" Oct 08 15:00:10 crc kubenswrapper[4735]: I1008 15:00:10.428656 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-z897z_7479271a-126c-460b-8275-f194ac2fff0f/ovsdb-server/0.log" Oct 08 15:00:10 crc kubenswrapper[4735]: I1008 15:00:10.572327 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-copy-data_370761ea-f80e-4565-ac02-3c652949e54d/adoption/0.log" Oct 08 15:00:10 crc kubenswrapper[4735]: I1008 15:00:10.640598 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-northd-0_a2a26a15-814e-4c27-9d44-6d96a2eab8aa/openstack-network-exporter/0.log" Oct 08 15:00:10 crc kubenswrapper[4735]: I1008 15:00:10.750767 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_a2a26a15-814e-4c27-9d44-6d96a2eab8aa/ovn-northd/0.log" Oct 08 15:00:10 crc kubenswrapper[4735]: I1008 15:00:10.846527 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_48cb53e0-bc62-4285-9aa3-c5e5958255bc/ovsdbserver-nb/0.log" Oct 08 15:00:10 crc kubenswrapper[4735]: I1008 15:00:10.948028 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_48cb53e0-bc62-4285-9aa3-c5e5958255bc/openstack-network-exporter/0.log" Oct 08 15:00:11 crc kubenswrapper[4735]: I1008 15:00:11.073248 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_f7f8e7e7-4056-498c-b7c4-cedd842f264f/openstack-network-exporter/0.log" Oct 08 15:00:11 crc kubenswrapper[4735]: I1008 15:00:11.079634 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_f7f8e7e7-4056-498c-b7c4-cedd842f264f/ovsdbserver-nb/0.log" Oct 08 15:00:11 crc kubenswrapper[4735]: I1008 15:00:11.282994 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_d1792695-0298-4cc1-868e-4a3c2e7ca957/openstack-network-exporter/0.log" Oct 08 15:00:11 crc kubenswrapper[4735]: I1008 15:00:11.304883 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_d1792695-0298-4cc1-868e-4a3c2e7ca957/ovsdbserver-nb/0.log" Oct 08 15:00:11 crc kubenswrapper[4735]: I1008 15:00:11.464433 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_9b3c1ba5-f809-4ccb-b29c-5bb76169819f/openstack-network-exporter/0.log" Oct 08 15:00:11 crc kubenswrapper[4735]: I1008 15:00:11.491358 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-sb-0_9b3c1ba5-f809-4ccb-b29c-5bb76169819f/ovsdbserver-sb/0.log" Oct 08 15:00:11 crc kubenswrapper[4735]: I1008 15:00:11.604190 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_3ebbd73a-21ad-4baa-8fec-68e4ae05591b/openstack-network-exporter/0.log" Oct 08 15:00:11 crc kubenswrapper[4735]: I1008 15:00:11.678581 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_3ebbd73a-21ad-4baa-8fec-68e4ae05591b/ovsdbserver-sb/0.log" Oct 08 15:00:11 crc kubenswrapper[4735]: I1008 15:00:11.825461 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_d2f2c46e-a1db-40c4-9dcc-6c0550c80cfe/openstack-network-exporter/0.log" Oct 08 15:00:11 crc kubenswrapper[4735]: I1008 15:00:11.832661 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_d2f2c46e-a1db-40c4-9dcc-6c0550c80cfe/ovsdbserver-sb/0.log" Oct 08 15:00:11 crc kubenswrapper[4735]: I1008 15:00:11.939258 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5cf4d6ff98-g56r5_6490668b-e4ec-4dc2-900d-5d161f49c526/placement-api/0.log" Oct 08 15:00:12 crc kubenswrapper[4735]: I1008 15:00:12.338326 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5cf4d6ff98-g56r5_6490668b-e4ec-4dc2-900d-5d161f49c526/placement-log/0.log" Oct 08 15:00:12 crc kubenswrapper[4735]: I1008 15:00:12.440268 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_pre-adoption-validation-openstack-pre-adoption-openstack-cq4pln_d1e38574-c345-4fb1-b368-0499eeea2f3f/pre-adoption-validation-openstack-pre-adoption-openstack-cell1/0.log" Oct 08 15:00:12 crc kubenswrapper[4735]: I1008 15:00:12.556752 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_14478da8-1fcc-4ebe-96d2-3b3e6b9c9034/init-config-reloader/0.log" Oct 08 15:00:12 crc 
kubenswrapper[4735]: I1008 15:00:12.752076 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_14478da8-1fcc-4ebe-96d2-3b3e6b9c9034/config-reloader/0.log" Oct 08 15:00:12 crc kubenswrapper[4735]: I1008 15:00:12.772855 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_14478da8-1fcc-4ebe-96d2-3b3e6b9c9034/init-config-reloader/0.log" Oct 08 15:00:12 crc kubenswrapper[4735]: I1008 15:00:12.816397 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_14478da8-1fcc-4ebe-96d2-3b3e6b9c9034/thanos-sidecar/0.log" Oct 08 15:00:12 crc kubenswrapper[4735]: I1008 15:00:12.823327 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_14478da8-1fcc-4ebe-96d2-3b3e6b9c9034/prometheus/0.log" Oct 08 15:00:12 crc kubenswrapper[4735]: I1008 15:00:12.996817 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_55e7bd11-2cfa-4bee-98b9-32db543ab744/setup-container/0.log" Oct 08 15:00:13 crc kubenswrapper[4735]: I1008 15:00:13.151418 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_55e7bd11-2cfa-4bee-98b9-32db543ab744/setup-container/0.log" Oct 08 15:00:13 crc kubenswrapper[4735]: I1008 15:00:13.196192 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_792b2295-d7e4-41e9-b0c4-c789d057d8f0/setup-container/0.log" Oct 08 15:00:13 crc kubenswrapper[4735]: I1008 15:00:13.223771 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_55e7bd11-2cfa-4bee-98b9-32db543ab744/rabbitmq/0.log" Oct 08 15:00:13 crc kubenswrapper[4735]: I1008 15:00:13.530529 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_792b2295-d7e4-41e9-b0c4-c789d057d8f0/setup-container/0.log" Oct 08 15:00:13 crc 
kubenswrapper[4735]: I1008 15:00:13.627963 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tripleo-cleanup-tripleo-cleanup-openstack-cell1-srlr4_7124bfbc-c048-4d7a-be43-049e5bd6b16b/tripleo-cleanup-tripleo-cleanup-openstack-cell1/0.log" Oct 08 15:00:13 crc kubenswrapper[4735]: I1008 15:00:13.738318 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_792b2295-d7e4-41e9-b0c4-c789d057d8f0/rabbitmq/0.log" Oct 08 15:00:15 crc kubenswrapper[4735]: I1008 15:00:15.144777 4735 scope.go:117] "RemoveContainer" containerID="a71dbeadb6891a8bf950a1384a01d172a4347b9ee91bec823dc65b92c5b13445" Oct 08 15:00:21 crc kubenswrapper[4735]: I1008 15:00:21.894599 4735 scope.go:117] "RemoveContainer" containerID="584da6e7d0cb04e94bd7ca6ddf379929301c09a4661e8cd1fc0a739f5f2554cb" Oct 08 15:00:21 crc kubenswrapper[4735]: E1008 15:00:21.895395 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 15:00:33 crc kubenswrapper[4735]: I1008 15:00:33.894949 4735 scope.go:117] "RemoveContainer" containerID="584da6e7d0cb04e94bd7ca6ddf379929301c09a4661e8cd1fc0a739f5f2554cb" Oct 08 15:00:33 crc kubenswrapper[4735]: E1008 15:00:33.895852 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 
08 15:00:46 crc kubenswrapper[4735]: I1008 15:00:46.895958 4735 scope.go:117] "RemoveContainer" containerID="584da6e7d0cb04e94bd7ca6ddf379929301c09a4661e8cd1fc0a739f5f2554cb" Oct 08 15:00:46 crc kubenswrapper[4735]: E1008 15:00:46.896783 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 15:00:59 crc kubenswrapper[4735]: I1008 15:00:59.895299 4735 scope.go:117] "RemoveContainer" containerID="584da6e7d0cb04e94bd7ca6ddf379929301c09a4661e8cd1fc0a739f5f2554cb" Oct 08 15:00:59 crc kubenswrapper[4735]: E1008 15:00:59.896077 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 15:01:00 crc kubenswrapper[4735]: I1008 15:01:00.152949 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29332261-mbzsg"] Oct 08 15:01:00 crc kubenswrapper[4735]: E1008 15:01:00.153621 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e660034-6529-42c4-971a-5851f66c7905" containerName="collect-profiles" Oct 08 15:01:00 crc kubenswrapper[4735]: I1008 15:01:00.153658 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e660034-6529-42c4-971a-5851f66c7905" containerName="collect-profiles" Oct 08 15:01:00 crc kubenswrapper[4735]: I1008 15:01:00.153951 4735 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="5e660034-6529-42c4-971a-5851f66c7905" containerName="collect-profiles" Oct 08 15:01:00 crc kubenswrapper[4735]: I1008 15:01:00.154963 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29332261-mbzsg" Oct 08 15:01:00 crc kubenswrapper[4735]: I1008 15:01:00.175331 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29332261-mbzsg"] Oct 08 15:01:00 crc kubenswrapper[4735]: I1008 15:01:00.308539 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cf497020-ab9f-4349-bfcf-703b6d7596f1-fernet-keys\") pod \"keystone-cron-29332261-mbzsg\" (UID: \"cf497020-ab9f-4349-bfcf-703b6d7596f1\") " pod="openstack/keystone-cron-29332261-mbzsg" Oct 08 15:01:00 crc kubenswrapper[4735]: I1008 15:01:00.308604 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf497020-ab9f-4349-bfcf-703b6d7596f1-combined-ca-bundle\") pod \"keystone-cron-29332261-mbzsg\" (UID: \"cf497020-ab9f-4349-bfcf-703b6d7596f1\") " pod="openstack/keystone-cron-29332261-mbzsg" Oct 08 15:01:00 crc kubenswrapper[4735]: I1008 15:01:00.308631 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cr4mc\" (UniqueName: \"kubernetes.io/projected/cf497020-ab9f-4349-bfcf-703b6d7596f1-kube-api-access-cr4mc\") pod \"keystone-cron-29332261-mbzsg\" (UID: \"cf497020-ab9f-4349-bfcf-703b6d7596f1\") " pod="openstack/keystone-cron-29332261-mbzsg" Oct 08 15:01:00 crc kubenswrapper[4735]: I1008 15:01:00.308663 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf497020-ab9f-4349-bfcf-703b6d7596f1-config-data\") pod \"keystone-cron-29332261-mbzsg\" 
(UID: \"cf497020-ab9f-4349-bfcf-703b6d7596f1\") " pod="openstack/keystone-cron-29332261-mbzsg" Oct 08 15:01:00 crc kubenswrapper[4735]: I1008 15:01:00.412182 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cf497020-ab9f-4349-bfcf-703b6d7596f1-fernet-keys\") pod \"keystone-cron-29332261-mbzsg\" (UID: \"cf497020-ab9f-4349-bfcf-703b6d7596f1\") " pod="openstack/keystone-cron-29332261-mbzsg" Oct 08 15:01:00 crc kubenswrapper[4735]: I1008 15:01:00.412274 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf497020-ab9f-4349-bfcf-703b6d7596f1-combined-ca-bundle\") pod \"keystone-cron-29332261-mbzsg\" (UID: \"cf497020-ab9f-4349-bfcf-703b6d7596f1\") " pod="openstack/keystone-cron-29332261-mbzsg" Oct 08 15:01:00 crc kubenswrapper[4735]: I1008 15:01:00.412305 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cr4mc\" (UniqueName: \"kubernetes.io/projected/cf497020-ab9f-4349-bfcf-703b6d7596f1-kube-api-access-cr4mc\") pod \"keystone-cron-29332261-mbzsg\" (UID: \"cf497020-ab9f-4349-bfcf-703b6d7596f1\") " pod="openstack/keystone-cron-29332261-mbzsg" Oct 08 15:01:00 crc kubenswrapper[4735]: I1008 15:01:00.412468 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf497020-ab9f-4349-bfcf-703b6d7596f1-config-data\") pod \"keystone-cron-29332261-mbzsg\" (UID: \"cf497020-ab9f-4349-bfcf-703b6d7596f1\") " pod="openstack/keystone-cron-29332261-mbzsg" Oct 08 15:01:00 crc kubenswrapper[4735]: I1008 15:01:00.419628 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cf497020-ab9f-4349-bfcf-703b6d7596f1-fernet-keys\") pod \"keystone-cron-29332261-mbzsg\" (UID: \"cf497020-ab9f-4349-bfcf-703b6d7596f1\") " 
pod="openstack/keystone-cron-29332261-mbzsg" Oct 08 15:01:00 crc kubenswrapper[4735]: I1008 15:01:00.419703 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf497020-ab9f-4349-bfcf-703b6d7596f1-combined-ca-bundle\") pod \"keystone-cron-29332261-mbzsg\" (UID: \"cf497020-ab9f-4349-bfcf-703b6d7596f1\") " pod="openstack/keystone-cron-29332261-mbzsg" Oct 08 15:01:00 crc kubenswrapper[4735]: I1008 15:01:00.419725 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf497020-ab9f-4349-bfcf-703b6d7596f1-config-data\") pod \"keystone-cron-29332261-mbzsg\" (UID: \"cf497020-ab9f-4349-bfcf-703b6d7596f1\") " pod="openstack/keystone-cron-29332261-mbzsg" Oct 08 15:01:00 crc kubenswrapper[4735]: I1008 15:01:00.436943 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cr4mc\" (UniqueName: \"kubernetes.io/projected/cf497020-ab9f-4349-bfcf-703b6d7596f1-kube-api-access-cr4mc\") pod \"keystone-cron-29332261-mbzsg\" (UID: \"cf497020-ab9f-4349-bfcf-703b6d7596f1\") " pod="openstack/keystone-cron-29332261-mbzsg" Oct 08 15:01:00 crc kubenswrapper[4735]: I1008 15:01:00.515998 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29332261-mbzsg" Oct 08 15:01:00 crc kubenswrapper[4735]: I1008 15:01:00.990686 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29332261-mbzsg"] Oct 08 15:01:01 crc kubenswrapper[4735]: I1008 15:01:01.714767 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29332261-mbzsg" event={"ID":"cf497020-ab9f-4349-bfcf-703b6d7596f1","Type":"ContainerStarted","Data":"c6bf902c9eecb28c0b7ddfd1e2a79d5602fa3ab83968cf8e4705d8f724254f56"} Oct 08 15:01:01 crc kubenswrapper[4735]: I1008 15:01:01.715118 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29332261-mbzsg" event={"ID":"cf497020-ab9f-4349-bfcf-703b6d7596f1","Type":"ContainerStarted","Data":"5f7505efa2260a96b31001d097907ea5b26a7af5de0e4369957690ddeb01644c"} Oct 08 15:01:01 crc kubenswrapper[4735]: I1008 15:01:01.717751 4735 generic.go:334] "Generic (PLEG): container finished" podID="47fbf1b2-81fc-4660-8103-af767c2894a7" containerID="6b9fc99cd73cb707937434d01a89384145f8fdf14abd62ea8efe79b5853fbf62" exitCode=0 Oct 08 15:01:01 crc kubenswrapper[4735]: I1008 15:01:01.717781 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9xzdj/crc-debug-7vvwt" event={"ID":"47fbf1b2-81fc-4660-8103-af767c2894a7","Type":"ContainerDied","Data":"6b9fc99cd73cb707937434d01a89384145f8fdf14abd62ea8efe79b5853fbf62"} Oct 08 15:01:01 crc kubenswrapper[4735]: I1008 15:01:01.737293 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29332261-mbzsg" podStartSLOduration=1.737272352 podStartE2EDuration="1.737272352s" podCreationTimestamp="2025-10-08 15:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 15:01:01.72966852 +0000 UTC m=+7511.537973084" watchObservedRunningTime="2025-10-08 15:01:01.737272352 +0000 UTC 
m=+7511.545576896" Oct 08 15:01:02 crc kubenswrapper[4735]: I1008 15:01:02.849898 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9xzdj/crc-debug-7vvwt" Oct 08 15:01:02 crc kubenswrapper[4735]: I1008 15:01:02.893655 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-9xzdj/crc-debug-7vvwt"] Oct 08 15:01:02 crc kubenswrapper[4735]: I1008 15:01:02.922510 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-9xzdj/crc-debug-7vvwt"] Oct 08 15:01:02 crc kubenswrapper[4735]: I1008 15:01:02.975163 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpz8q\" (UniqueName: \"kubernetes.io/projected/47fbf1b2-81fc-4660-8103-af767c2894a7-kube-api-access-hpz8q\") pod \"47fbf1b2-81fc-4660-8103-af767c2894a7\" (UID: \"47fbf1b2-81fc-4660-8103-af767c2894a7\") " Oct 08 15:01:02 crc kubenswrapper[4735]: I1008 15:01:02.975424 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/47fbf1b2-81fc-4660-8103-af767c2894a7-host\") pod \"47fbf1b2-81fc-4660-8103-af767c2894a7\" (UID: \"47fbf1b2-81fc-4660-8103-af767c2894a7\") " Oct 08 15:01:02 crc kubenswrapper[4735]: I1008 15:01:02.975633 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/47fbf1b2-81fc-4660-8103-af767c2894a7-host" (OuterVolumeSpecName: "host") pod "47fbf1b2-81fc-4660-8103-af767c2894a7" (UID: "47fbf1b2-81fc-4660-8103-af767c2894a7"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 15:01:02 crc kubenswrapper[4735]: I1008 15:01:02.975968 4735 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/47fbf1b2-81fc-4660-8103-af767c2894a7-host\") on node \"crc\" DevicePath \"\"" Oct 08 15:01:02 crc kubenswrapper[4735]: I1008 15:01:02.989896 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47fbf1b2-81fc-4660-8103-af767c2894a7-kube-api-access-hpz8q" (OuterVolumeSpecName: "kube-api-access-hpz8q") pod "47fbf1b2-81fc-4660-8103-af767c2894a7" (UID: "47fbf1b2-81fc-4660-8103-af767c2894a7"). InnerVolumeSpecName "kube-api-access-hpz8q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 15:01:03 crc kubenswrapper[4735]: I1008 15:01:03.079261 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpz8q\" (UniqueName: \"kubernetes.io/projected/47fbf1b2-81fc-4660-8103-af767c2894a7-kube-api-access-hpz8q\") on node \"crc\" DevicePath \"\"" Oct 08 15:01:03 crc kubenswrapper[4735]: I1008 15:01:03.740652 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b69787d73b3bbcccc2d8ebf44c88fdeded6b019d34d95053632db4cdbba59551" Oct 08 15:01:03 crc kubenswrapper[4735]: I1008 15:01:03.740963 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9xzdj/crc-debug-7vvwt" Oct 08 15:01:04 crc kubenswrapper[4735]: I1008 15:01:04.095048 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9xzdj/crc-debug-944m4"] Oct 08 15:01:04 crc kubenswrapper[4735]: E1008 15:01:04.096293 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47fbf1b2-81fc-4660-8103-af767c2894a7" containerName="container-00" Oct 08 15:01:04 crc kubenswrapper[4735]: I1008 15:01:04.096314 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="47fbf1b2-81fc-4660-8103-af767c2894a7" containerName="container-00" Oct 08 15:01:04 crc kubenswrapper[4735]: I1008 15:01:04.096653 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="47fbf1b2-81fc-4660-8103-af767c2894a7" containerName="container-00" Oct 08 15:01:04 crc kubenswrapper[4735]: I1008 15:01:04.097915 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9xzdj/crc-debug-944m4" Oct 08 15:01:04 crc kubenswrapper[4735]: I1008 15:01:04.201852 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a2b52a02-df12-4d50-9419-54097b929331-host\") pod \"crc-debug-944m4\" (UID: \"a2b52a02-df12-4d50-9419-54097b929331\") " pod="openshift-must-gather-9xzdj/crc-debug-944m4" Oct 08 15:01:04 crc kubenswrapper[4735]: I1008 15:01:04.201972 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t28m\" (UniqueName: \"kubernetes.io/projected/a2b52a02-df12-4d50-9419-54097b929331-kube-api-access-4t28m\") pod \"crc-debug-944m4\" (UID: \"a2b52a02-df12-4d50-9419-54097b929331\") " pod="openshift-must-gather-9xzdj/crc-debug-944m4" Oct 08 15:01:04 crc kubenswrapper[4735]: I1008 15:01:04.304880 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/a2b52a02-df12-4d50-9419-54097b929331-host\") pod \"crc-debug-944m4\" (UID: \"a2b52a02-df12-4d50-9419-54097b929331\") " pod="openshift-must-gather-9xzdj/crc-debug-944m4" Oct 08 15:01:04 crc kubenswrapper[4735]: I1008 15:01:04.304979 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4t28m\" (UniqueName: \"kubernetes.io/projected/a2b52a02-df12-4d50-9419-54097b929331-kube-api-access-4t28m\") pod \"crc-debug-944m4\" (UID: \"a2b52a02-df12-4d50-9419-54097b929331\") " pod="openshift-must-gather-9xzdj/crc-debug-944m4" Oct 08 15:01:04 crc kubenswrapper[4735]: I1008 15:01:04.305107 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a2b52a02-df12-4d50-9419-54097b929331-host\") pod \"crc-debug-944m4\" (UID: \"a2b52a02-df12-4d50-9419-54097b929331\") " pod="openshift-must-gather-9xzdj/crc-debug-944m4" Oct 08 15:01:04 crc kubenswrapper[4735]: I1008 15:01:04.326047 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t28m\" (UniqueName: \"kubernetes.io/projected/a2b52a02-df12-4d50-9419-54097b929331-kube-api-access-4t28m\") pod \"crc-debug-944m4\" (UID: \"a2b52a02-df12-4d50-9419-54097b929331\") " pod="openshift-must-gather-9xzdj/crc-debug-944m4" Oct 08 15:01:04 crc kubenswrapper[4735]: I1008 15:01:04.422866 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9xzdj/crc-debug-944m4" Oct 08 15:01:04 crc kubenswrapper[4735]: W1008 15:01:04.487707 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2b52a02_df12_4d50_9419_54097b929331.slice/crio-a84aac8f40aab048f11039c104c4a80b5a625acf3309eb175ff688dabd906c06 WatchSource:0}: Error finding container a84aac8f40aab048f11039c104c4a80b5a625acf3309eb175ff688dabd906c06: Status 404 returned error can't find the container with id a84aac8f40aab048f11039c104c4a80b5a625acf3309eb175ff688dabd906c06 Oct 08 15:01:04 crc kubenswrapper[4735]: I1008 15:01:04.755828 4735 generic.go:334] "Generic (PLEG): container finished" podID="cf497020-ab9f-4349-bfcf-703b6d7596f1" containerID="c6bf902c9eecb28c0b7ddfd1e2a79d5602fa3ab83968cf8e4705d8f724254f56" exitCode=0 Oct 08 15:01:04 crc kubenswrapper[4735]: I1008 15:01:04.755928 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29332261-mbzsg" event={"ID":"cf497020-ab9f-4349-bfcf-703b6d7596f1","Type":"ContainerDied","Data":"c6bf902c9eecb28c0b7ddfd1e2a79d5602fa3ab83968cf8e4705d8f724254f56"} Oct 08 15:01:04 crc kubenswrapper[4735]: I1008 15:01:04.759152 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9xzdj/crc-debug-944m4" event={"ID":"a2b52a02-df12-4d50-9419-54097b929331","Type":"ContainerStarted","Data":"a84aac8f40aab048f11039c104c4a80b5a625acf3309eb175ff688dabd906c06"} Oct 08 15:01:04 crc kubenswrapper[4735]: I1008 15:01:04.908362 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47fbf1b2-81fc-4660-8103-af767c2894a7" path="/var/lib/kubelet/pods/47fbf1b2-81fc-4660-8103-af767c2894a7/volumes" Oct 08 15:01:05 crc kubenswrapper[4735]: I1008 15:01:05.770575 4735 generic.go:334] "Generic (PLEG): container finished" podID="a2b52a02-df12-4d50-9419-54097b929331" containerID="f1082ff8f57254f5ae1ba2188440b3f549999a8b5f5958b3380e41ec748394eb" 
exitCode=0 Oct 08 15:01:05 crc kubenswrapper[4735]: I1008 15:01:05.770634 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9xzdj/crc-debug-944m4" event={"ID":"a2b52a02-df12-4d50-9419-54097b929331","Type":"ContainerDied","Data":"f1082ff8f57254f5ae1ba2188440b3f549999a8b5f5958b3380e41ec748394eb"} Oct 08 15:01:06 crc kubenswrapper[4735]: I1008 15:01:06.197557 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29332261-mbzsg" Oct 08 15:01:06 crc kubenswrapper[4735]: I1008 15:01:06.352080 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf497020-ab9f-4349-bfcf-703b6d7596f1-combined-ca-bundle\") pod \"cf497020-ab9f-4349-bfcf-703b6d7596f1\" (UID: \"cf497020-ab9f-4349-bfcf-703b6d7596f1\") " Oct 08 15:01:06 crc kubenswrapper[4735]: I1008 15:01:06.352150 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cr4mc\" (UniqueName: \"kubernetes.io/projected/cf497020-ab9f-4349-bfcf-703b6d7596f1-kube-api-access-cr4mc\") pod \"cf497020-ab9f-4349-bfcf-703b6d7596f1\" (UID: \"cf497020-ab9f-4349-bfcf-703b6d7596f1\") " Oct 08 15:01:06 crc kubenswrapper[4735]: I1008 15:01:06.352251 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cf497020-ab9f-4349-bfcf-703b6d7596f1-fernet-keys\") pod \"cf497020-ab9f-4349-bfcf-703b6d7596f1\" (UID: \"cf497020-ab9f-4349-bfcf-703b6d7596f1\") " Oct 08 15:01:06 crc kubenswrapper[4735]: I1008 15:01:06.352276 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf497020-ab9f-4349-bfcf-703b6d7596f1-config-data\") pod \"cf497020-ab9f-4349-bfcf-703b6d7596f1\" (UID: \"cf497020-ab9f-4349-bfcf-703b6d7596f1\") " Oct 08 15:01:06 crc kubenswrapper[4735]: I1008 15:01:06.368924 
4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf497020-ab9f-4349-bfcf-703b6d7596f1-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "cf497020-ab9f-4349-bfcf-703b6d7596f1" (UID: "cf497020-ab9f-4349-bfcf-703b6d7596f1"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 15:01:06 crc kubenswrapper[4735]: I1008 15:01:06.371301 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf497020-ab9f-4349-bfcf-703b6d7596f1-kube-api-access-cr4mc" (OuterVolumeSpecName: "kube-api-access-cr4mc") pod "cf497020-ab9f-4349-bfcf-703b6d7596f1" (UID: "cf497020-ab9f-4349-bfcf-703b6d7596f1"). InnerVolumeSpecName "kube-api-access-cr4mc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 15:01:06 crc kubenswrapper[4735]: I1008 15:01:06.396647 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf497020-ab9f-4349-bfcf-703b6d7596f1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cf497020-ab9f-4349-bfcf-703b6d7596f1" (UID: "cf497020-ab9f-4349-bfcf-703b6d7596f1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 15:01:06 crc kubenswrapper[4735]: I1008 15:01:06.434305 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf497020-ab9f-4349-bfcf-703b6d7596f1-config-data" (OuterVolumeSpecName: "config-data") pod "cf497020-ab9f-4349-bfcf-703b6d7596f1" (UID: "cf497020-ab9f-4349-bfcf-703b6d7596f1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 15:01:06 crc kubenswrapper[4735]: I1008 15:01:06.454649 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf497020-ab9f-4349-bfcf-703b6d7596f1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 15:01:06 crc kubenswrapper[4735]: I1008 15:01:06.454694 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cr4mc\" (UniqueName: \"kubernetes.io/projected/cf497020-ab9f-4349-bfcf-703b6d7596f1-kube-api-access-cr4mc\") on node \"crc\" DevicePath \"\"" Oct 08 15:01:06 crc kubenswrapper[4735]: I1008 15:01:06.454710 4735 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cf497020-ab9f-4349-bfcf-703b6d7596f1-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 08 15:01:06 crc kubenswrapper[4735]: I1008 15:01:06.454721 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf497020-ab9f-4349-bfcf-703b6d7596f1-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 15:01:06 crc kubenswrapper[4735]: I1008 15:01:06.808734 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29332261-mbzsg" Oct 08 15:01:06 crc kubenswrapper[4735]: I1008 15:01:06.808818 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29332261-mbzsg" event={"ID":"cf497020-ab9f-4349-bfcf-703b6d7596f1","Type":"ContainerDied","Data":"5f7505efa2260a96b31001d097907ea5b26a7af5de0e4369957690ddeb01644c"} Oct 08 15:01:06 crc kubenswrapper[4735]: I1008 15:01:06.808860 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f7505efa2260a96b31001d097907ea5b26a7af5de0e4369957690ddeb01644c" Oct 08 15:01:06 crc kubenswrapper[4735]: I1008 15:01:06.835857 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9xzdj/crc-debug-944m4" Oct 08 15:01:06 crc kubenswrapper[4735]: I1008 15:01:06.965607 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a2b52a02-df12-4d50-9419-54097b929331-host\") pod \"a2b52a02-df12-4d50-9419-54097b929331\" (UID: \"a2b52a02-df12-4d50-9419-54097b929331\") " Oct 08 15:01:06 crc kubenswrapper[4735]: I1008 15:01:06.966201 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a2b52a02-df12-4d50-9419-54097b929331-host" (OuterVolumeSpecName: "host") pod "a2b52a02-df12-4d50-9419-54097b929331" (UID: "a2b52a02-df12-4d50-9419-54097b929331"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 15:01:06 crc kubenswrapper[4735]: I1008 15:01:06.967224 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4t28m\" (UniqueName: \"kubernetes.io/projected/a2b52a02-df12-4d50-9419-54097b929331-kube-api-access-4t28m\") pod \"a2b52a02-df12-4d50-9419-54097b929331\" (UID: \"a2b52a02-df12-4d50-9419-54097b929331\") " Oct 08 15:01:06 crc kubenswrapper[4735]: I1008 15:01:06.968074 4735 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a2b52a02-df12-4d50-9419-54097b929331-host\") on node \"crc\" DevicePath \"\"" Oct 08 15:01:06 crc kubenswrapper[4735]: I1008 15:01:06.970530 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2b52a02-df12-4d50-9419-54097b929331-kube-api-access-4t28m" (OuterVolumeSpecName: "kube-api-access-4t28m") pod "a2b52a02-df12-4d50-9419-54097b929331" (UID: "a2b52a02-df12-4d50-9419-54097b929331"). InnerVolumeSpecName "kube-api-access-4t28m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 15:01:07 crc kubenswrapper[4735]: I1008 15:01:07.070277 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4t28m\" (UniqueName: \"kubernetes.io/projected/a2b52a02-df12-4d50-9419-54097b929331-kube-api-access-4t28m\") on node \"crc\" DevicePath \"\"" Oct 08 15:01:07 crc kubenswrapper[4735]: I1008 15:01:07.819252 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9xzdj/crc-debug-944m4" event={"ID":"a2b52a02-df12-4d50-9419-54097b929331","Type":"ContainerDied","Data":"a84aac8f40aab048f11039c104c4a80b5a625acf3309eb175ff688dabd906c06"} Oct 08 15:01:07 crc kubenswrapper[4735]: I1008 15:01:07.820126 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a84aac8f40aab048f11039c104c4a80b5a625acf3309eb175ff688dabd906c06" Oct 08 15:01:07 crc kubenswrapper[4735]: I1008 15:01:07.819326 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9xzdj/crc-debug-944m4" Oct 08 15:01:10 crc kubenswrapper[4735]: I1008 15:01:10.900946 4735 scope.go:117] "RemoveContainer" containerID="584da6e7d0cb04e94bd7ca6ddf379929301c09a4661e8cd1fc0a739f5f2554cb" Oct 08 15:01:10 crc kubenswrapper[4735]: E1008 15:01:10.903136 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 15:01:15 crc kubenswrapper[4735]: I1008 15:01:15.581974 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-9xzdj/crc-debug-944m4"] Oct 08 15:01:15 crc kubenswrapper[4735]: I1008 15:01:15.591112 4735 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openshift-must-gather-9xzdj/crc-debug-944m4"] Oct 08 15:01:16 crc kubenswrapper[4735]: I1008 15:01:16.750443 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9xzdj/crc-debug-p55x7"] Oct 08 15:01:16 crc kubenswrapper[4735]: E1008 15:01:16.751118 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf497020-ab9f-4349-bfcf-703b6d7596f1" containerName="keystone-cron" Oct 08 15:01:16 crc kubenswrapper[4735]: I1008 15:01:16.751130 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf497020-ab9f-4349-bfcf-703b6d7596f1" containerName="keystone-cron" Oct 08 15:01:16 crc kubenswrapper[4735]: E1008 15:01:16.751167 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2b52a02-df12-4d50-9419-54097b929331" containerName="container-00" Oct 08 15:01:16 crc kubenswrapper[4735]: I1008 15:01:16.751174 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2b52a02-df12-4d50-9419-54097b929331" containerName="container-00" Oct 08 15:01:16 crc kubenswrapper[4735]: I1008 15:01:16.751554 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2b52a02-df12-4d50-9419-54097b929331" containerName="container-00" Oct 08 15:01:16 crc kubenswrapper[4735]: I1008 15:01:16.751614 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf497020-ab9f-4349-bfcf-703b6d7596f1" containerName="keystone-cron" Oct 08 15:01:16 crc kubenswrapper[4735]: I1008 15:01:16.752531 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9xzdj/crc-debug-p55x7" Oct 08 15:01:16 crc kubenswrapper[4735]: I1008 15:01:16.866301 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-445tq\" (UniqueName: \"kubernetes.io/projected/7d59bced-5f10-4613-89f5-a7c8740b8055-kube-api-access-445tq\") pod \"crc-debug-p55x7\" (UID: \"7d59bced-5f10-4613-89f5-a7c8740b8055\") " pod="openshift-must-gather-9xzdj/crc-debug-p55x7" Oct 08 15:01:16 crc kubenswrapper[4735]: I1008 15:01:16.866581 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7d59bced-5f10-4613-89f5-a7c8740b8055-host\") pod \"crc-debug-p55x7\" (UID: \"7d59bced-5f10-4613-89f5-a7c8740b8055\") " pod="openshift-must-gather-9xzdj/crc-debug-p55x7" Oct 08 15:01:16 crc kubenswrapper[4735]: I1008 15:01:16.908008 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2b52a02-df12-4d50-9419-54097b929331" path="/var/lib/kubelet/pods/a2b52a02-df12-4d50-9419-54097b929331/volumes" Oct 08 15:01:16 crc kubenswrapper[4735]: I1008 15:01:16.969010 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-445tq\" (UniqueName: \"kubernetes.io/projected/7d59bced-5f10-4613-89f5-a7c8740b8055-kube-api-access-445tq\") pod \"crc-debug-p55x7\" (UID: \"7d59bced-5f10-4613-89f5-a7c8740b8055\") " pod="openshift-must-gather-9xzdj/crc-debug-p55x7" Oct 08 15:01:16 crc kubenswrapper[4735]: I1008 15:01:16.969236 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7d59bced-5f10-4613-89f5-a7c8740b8055-host\") pod \"crc-debug-p55x7\" (UID: \"7d59bced-5f10-4613-89f5-a7c8740b8055\") " pod="openshift-must-gather-9xzdj/crc-debug-p55x7" Oct 08 15:01:16 crc kubenswrapper[4735]: I1008 15:01:16.969443 4735 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7d59bced-5f10-4613-89f5-a7c8740b8055-host\") pod \"crc-debug-p55x7\" (UID: \"7d59bced-5f10-4613-89f5-a7c8740b8055\") " pod="openshift-must-gather-9xzdj/crc-debug-p55x7" Oct 08 15:01:16 crc kubenswrapper[4735]: I1008 15:01:16.988877 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-445tq\" (UniqueName: \"kubernetes.io/projected/7d59bced-5f10-4613-89f5-a7c8740b8055-kube-api-access-445tq\") pod \"crc-debug-p55x7\" (UID: \"7d59bced-5f10-4613-89f5-a7c8740b8055\") " pod="openshift-must-gather-9xzdj/crc-debug-p55x7" Oct 08 15:01:17 crc kubenswrapper[4735]: I1008 15:01:17.079506 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9xzdj/crc-debug-p55x7" Oct 08 15:01:17 crc kubenswrapper[4735]: I1008 15:01:17.931326 4735 generic.go:334] "Generic (PLEG): container finished" podID="7d59bced-5f10-4613-89f5-a7c8740b8055" containerID="af15eccf06db1d846dcb2adebf72fafdb10a1a74bf169320455899cb741cdbb4" exitCode=0 Oct 08 15:01:17 crc kubenswrapper[4735]: I1008 15:01:17.931443 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9xzdj/crc-debug-p55x7" event={"ID":"7d59bced-5f10-4613-89f5-a7c8740b8055","Type":"ContainerDied","Data":"af15eccf06db1d846dcb2adebf72fafdb10a1a74bf169320455899cb741cdbb4"} Oct 08 15:01:17 crc kubenswrapper[4735]: I1008 15:01:17.932603 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9xzdj/crc-debug-p55x7" event={"ID":"7d59bced-5f10-4613-89f5-a7c8740b8055","Type":"ContainerStarted","Data":"659d7abb25678c50e29fdc4c83cfb905436f0ef516a68b461b58191dd9faa767"} Oct 08 15:01:17 crc kubenswrapper[4735]: I1008 15:01:17.969829 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-9xzdj/crc-debug-p55x7"] Oct 08 15:01:17 crc kubenswrapper[4735]: I1008 15:01:17.987586 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-must-gather-9xzdj/crc-debug-p55x7"] Oct 08 15:01:19 crc kubenswrapper[4735]: I1008 15:01:19.083841 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9xzdj/crc-debug-p55x7" Oct 08 15:01:19 crc kubenswrapper[4735]: I1008 15:01:19.232539 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-445tq\" (UniqueName: \"kubernetes.io/projected/7d59bced-5f10-4613-89f5-a7c8740b8055-kube-api-access-445tq\") pod \"7d59bced-5f10-4613-89f5-a7c8740b8055\" (UID: \"7d59bced-5f10-4613-89f5-a7c8740b8055\") " Oct 08 15:01:19 crc kubenswrapper[4735]: I1008 15:01:19.232635 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7d59bced-5f10-4613-89f5-a7c8740b8055-host\") pod \"7d59bced-5f10-4613-89f5-a7c8740b8055\" (UID: \"7d59bced-5f10-4613-89f5-a7c8740b8055\") " Oct 08 15:01:19 crc kubenswrapper[4735]: I1008 15:01:19.233377 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7d59bced-5f10-4613-89f5-a7c8740b8055-host" (OuterVolumeSpecName: "host") pod "7d59bced-5f10-4613-89f5-a7c8740b8055" (UID: "7d59bced-5f10-4613-89f5-a7c8740b8055"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 15:01:19 crc kubenswrapper[4735]: I1008 15:01:19.248574 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d59bced-5f10-4613-89f5-a7c8740b8055-kube-api-access-445tq" (OuterVolumeSpecName: "kube-api-access-445tq") pod "7d59bced-5f10-4613-89f5-a7c8740b8055" (UID: "7d59bced-5f10-4613-89f5-a7c8740b8055"). InnerVolumeSpecName "kube-api-access-445tq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 15:01:19 crc kubenswrapper[4735]: I1008 15:01:19.335407 4735 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7d59bced-5f10-4613-89f5-a7c8740b8055-host\") on node \"crc\" DevicePath \"\"" Oct 08 15:01:19 crc kubenswrapper[4735]: I1008 15:01:19.335453 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-445tq\" (UniqueName: \"kubernetes.io/projected/7d59bced-5f10-4613-89f5-a7c8740b8055-kube-api-access-445tq\") on node \"crc\" DevicePath \"\"" Oct 08 15:01:19 crc kubenswrapper[4735]: I1008 15:01:19.949740 4735 scope.go:117] "RemoveContainer" containerID="af15eccf06db1d846dcb2adebf72fafdb10a1a74bf169320455899cb741cdbb4" Oct 08 15:01:19 crc kubenswrapper[4735]: I1008 15:01:19.949805 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9xzdj/crc-debug-p55x7" Oct 08 15:01:20 crc kubenswrapper[4735]: I1008 15:01:20.123920 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-658bdf4b74-wvpgj_4e11d5ed-b404-4449-88f2-428ee510aa3f/kube-rbac-proxy/0.log" Oct 08 15:01:20 crc kubenswrapper[4735]: I1008 15:01:20.254742 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-658bdf4b74-wvpgj_4e11d5ed-b404-4449-88f2-428ee510aa3f/manager/0.log" Oct 08 15:01:20 crc kubenswrapper[4735]: I1008 15:01:20.438097 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-7b7fb68549-9rbsk_b5933297-7fc6-41f5-9ac0-55e3ada0e45e/kube-rbac-proxy/0.log" Oct 08 15:01:20 crc kubenswrapper[4735]: I1008 15:01:20.497704 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-7b7fb68549-9rbsk_b5933297-7fc6-41f5-9ac0-55e3ada0e45e/manager/0.log" Oct 08 15:01:20 
crc kubenswrapper[4735]: I1008 15:01:20.565340 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-85d5d9dd78-bsvbv_0c811b08-74de-4fd3-a06b-5eb2021544a9/kube-rbac-proxy/0.log" Oct 08 15:01:20 crc kubenswrapper[4735]: I1008 15:01:20.629375 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-85d5d9dd78-bsvbv_0c811b08-74de-4fd3-a06b-5eb2021544a9/manager/0.log" Oct 08 15:01:20 crc kubenswrapper[4735]: I1008 15:01:20.732037 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_eee899de00471dca260c21f15d574ae705f89c0888bdad088fc990c0ecvjpjg_5ed0dc2a-17a6-41a5-9b0d-3694ac1240d9/util/0.log" Oct 08 15:01:20 crc kubenswrapper[4735]: I1008 15:01:20.906685 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d59bced-5f10-4613-89f5-a7c8740b8055" path="/var/lib/kubelet/pods/7d59bced-5f10-4613-89f5-a7c8740b8055/volumes" Oct 08 15:01:20 crc kubenswrapper[4735]: I1008 15:01:20.975146 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_eee899de00471dca260c21f15d574ae705f89c0888bdad088fc990c0ecvjpjg_5ed0dc2a-17a6-41a5-9b0d-3694ac1240d9/pull/0.log" Oct 08 15:01:20 crc kubenswrapper[4735]: I1008 15:01:20.975202 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_eee899de00471dca260c21f15d574ae705f89c0888bdad088fc990c0ecvjpjg_5ed0dc2a-17a6-41a5-9b0d-3694ac1240d9/pull/0.log" Oct 08 15:01:21 crc kubenswrapper[4735]: I1008 15:01:21.005954 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_eee899de00471dca260c21f15d574ae705f89c0888bdad088fc990c0ecvjpjg_5ed0dc2a-17a6-41a5-9b0d-3694ac1240d9/util/0.log" Oct 08 15:01:21 crc kubenswrapper[4735]: I1008 15:01:21.154001 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_eee899de00471dca260c21f15d574ae705f89c0888bdad088fc990c0ecvjpjg_5ed0dc2a-17a6-41a5-9b0d-3694ac1240d9/extract/0.log" Oct 08 15:01:21 crc kubenswrapper[4735]: I1008 15:01:21.173391 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_eee899de00471dca260c21f15d574ae705f89c0888bdad088fc990c0ecvjpjg_5ed0dc2a-17a6-41a5-9b0d-3694ac1240d9/util/0.log" Oct 08 15:01:21 crc kubenswrapper[4735]: I1008 15:01:21.178903 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_eee899de00471dca260c21f15d574ae705f89c0888bdad088fc990c0ecvjpjg_5ed0dc2a-17a6-41a5-9b0d-3694ac1240d9/pull/0.log" Oct 08 15:01:21 crc kubenswrapper[4735]: I1008 15:01:21.341252 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84b9b84486-rzjz4_0bdd70f2-2b72-4787-8434-1b3218cf5f68/kube-rbac-proxy/0.log" Oct 08 15:01:21 crc kubenswrapper[4735]: I1008 15:01:21.464883 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-858f76bbdd-98fgg_7899c8e4-5194-486e-b373-b27edda3e745/kube-rbac-proxy/0.log" Oct 08 15:01:21 crc kubenswrapper[4735]: I1008 15:01:21.489282 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84b9b84486-rzjz4_0bdd70f2-2b72-4787-8434-1b3218cf5f68/manager/0.log" Oct 08 15:01:21 crc kubenswrapper[4735]: I1008 15:01:21.588642 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-858f76bbdd-98fgg_7899c8e4-5194-486e-b373-b27edda3e745/manager/0.log" Oct 08 15:01:21 crc kubenswrapper[4735]: I1008 15:01:21.680744 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-7ffbcb7588-vbvt8_38e7b3e3-9c02-4737-b500-7a9c01d5e92a/manager/0.log" Oct 08 15:01:21 crc kubenswrapper[4735]: I1008 
15:01:21.698562 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-7ffbcb7588-vbvt8_38e7b3e3-9c02-4737-b500-7a9c01d5e92a/kube-rbac-proxy/0.log" Oct 08 15:01:22 crc kubenswrapper[4735]: I1008 15:01:22.133933 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-656bcbd775-xkw5z_1fc477a5-eda8-4dbd-87e8-83e4294edc0e/kube-rbac-proxy/0.log" Oct 08 15:01:22 crc kubenswrapper[4735]: I1008 15:01:22.335270 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-9c5c78d49-6jbp9_18146224-6500-499d-9fee-7d751700c432/kube-rbac-proxy/0.log" Oct 08 15:01:22 crc kubenswrapper[4735]: I1008 15:01:22.378902 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-9c5c78d49-6jbp9_18146224-6500-499d-9fee-7d751700c432/manager/0.log" Oct 08 15:01:22 crc kubenswrapper[4735]: I1008 15:01:22.418136 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-656bcbd775-xkw5z_1fc477a5-eda8-4dbd-87e8-83e4294edc0e/manager/0.log" Oct 08 15:01:22 crc kubenswrapper[4735]: I1008 15:01:22.582554 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-55b6b7c7b8-wm2bm_ea14d44e-0e35-42d3-8618-cc49e1888e79/kube-rbac-proxy/0.log" Oct 08 15:01:22 crc kubenswrapper[4735]: I1008 15:01:22.771499 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-55b6b7c7b8-wm2bm_ea14d44e-0e35-42d3-8618-cc49e1888e79/manager/0.log" Oct 08 15:01:22 crc kubenswrapper[4735]: I1008 15:01:22.867090 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5f67fbc655-r786m_a1edbfe6-4871-4748-a468-6c6fcd2db21a/kube-rbac-proxy/0.log" Oct 08 15:01:22 crc kubenswrapper[4735]: I1008 15:01:22.939833 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5f67fbc655-r786m_a1edbfe6-4871-4748-a468-6c6fcd2db21a/manager/0.log" Oct 08 15:01:23 crc kubenswrapper[4735]: I1008 15:01:23.028696 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-f9fb45f8f-522n9_9fc075dd-f4c6-4185-bed2-a27ff7e638e6/kube-rbac-proxy/0.log" Oct 08 15:01:23 crc kubenswrapper[4735]: I1008 15:01:23.164250 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-79d585cb66-c5274_9a5fdef5-1d5b-40d0-9675-da52d2a07d80/kube-rbac-proxy/0.log" Oct 08 15:01:23 crc kubenswrapper[4735]: I1008 15:01:23.166430 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-f9fb45f8f-522n9_9fc075dd-f4c6-4185-bed2-a27ff7e638e6/manager/0.log" Oct 08 15:01:23 crc kubenswrapper[4735]: I1008 15:01:23.331906 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-79d585cb66-c5274_9a5fdef5-1d5b-40d0-9675-da52d2a07d80/manager/0.log" Oct 08 15:01:23 crc kubenswrapper[4735]: I1008 15:01:23.392286 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5df598886f-v29rs_c5d09b12-6911-4d37-8210-d067b6220b73/kube-rbac-proxy/0.log" Oct 08 15:01:23 crc kubenswrapper[4735]: I1008 15:01:23.609075 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5df598886f-v29rs_c5d09b12-6911-4d37-8210-d067b6220b73/manager/0.log" Oct 08 15:01:23 crc kubenswrapper[4735]: I1008 15:01:23.673388 4735 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69fdcfc5f5-7k9wv_d7319151-d690-4595-9b86-72b7d5b812aa/kube-rbac-proxy/0.log" Oct 08 15:01:23 crc kubenswrapper[4735]: I1008 15:01:23.716583 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69fdcfc5f5-7k9wv_d7319151-d690-4595-9b86-72b7d5b812aa/manager/0.log" Oct 08 15:01:23 crc kubenswrapper[4735]: I1008 15:01:23.797212 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-747747dfccm774j_ad73d925-823e-437b-ba07-9125f43b98b7/kube-rbac-proxy/0.log" Oct 08 15:01:23 crc kubenswrapper[4735]: I1008 15:01:23.832687 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-747747dfccm774j_ad73d925-823e-437b-ba07-9125f43b98b7/manager/0.log" Oct 08 15:01:23 crc kubenswrapper[4735]: I1008 15:01:23.906805 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-8bc6b8f5b-2z94k_d2b6aa63-7896-4d74-aced-bd70743b04a6/kube-rbac-proxy/0.log" Oct 08 15:01:24 crc kubenswrapper[4735]: I1008 15:01:24.090792 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-55f65988b-4qvmm_e2cd19cd-0aba-4e79-b4a5-34da0d1f04ae/kube-rbac-proxy/0.log" Oct 08 15:01:24 crc kubenswrapper[4735]: I1008 15:01:24.314575 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-55f65988b-4qvmm_e2cd19cd-0aba-4e79-b4a5-34da0d1f04ae/operator/0.log" Oct 08 15:01:24 crc kubenswrapper[4735]: I1008 15:01:24.353917 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-79db49b9fb-gnl2m_46d5fbd9-81ff-4a1e-96c7-01df9c2bd91a/kube-rbac-proxy/0.log" Oct 08 15:01:24 crc kubenswrapper[4735]: I1008 15:01:24.557286 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-4pqs9_d18440b1-cd19-4186-ae4c-40d0ed7adf79/registry-server/0.log" Oct 08 15:01:24 crc kubenswrapper[4735]: I1008 15:01:24.666548 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-68b6c87b68-4vb9q_157b7f16-11b3-452f-8c03-287aeefb2631/kube-rbac-proxy/0.log" Oct 08 15:01:24 crc kubenswrapper[4735]: I1008 15:01:24.739276 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-79db49b9fb-gnl2m_46d5fbd9-81ff-4a1e-96c7-01df9c2bd91a/manager/0.log" Oct 08 15:01:24 crc kubenswrapper[4735]: I1008 15:01:24.847987 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-68b6c87b68-4vb9q_157b7f16-11b3-452f-8c03-287aeefb2631/manager/0.log" Oct 08 15:01:24 crc kubenswrapper[4735]: I1008 15:01:24.901793 4735 scope.go:117] "RemoveContainer" containerID="584da6e7d0cb04e94bd7ca6ddf379929301c09a4661e8cd1fc0a739f5f2554cb" Oct 08 15:01:24 crc kubenswrapper[4735]: E1008 15:01:24.902461 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 15:01:24 crc kubenswrapper[4735]: I1008 15:01:24.966084 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-pqk7r_a0d2585e-cf4a-448e-8739-1591837c0781/operator/0.log" Oct 08 15:01:25 crc kubenswrapper[4735]: I1008 15:01:25.114547 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-db6d7f97b-rmpsl_31d49be4-108d-4004-973f-0fd539dd1037/kube-rbac-proxy/0.log" Oct 08 15:01:25 crc kubenswrapper[4735]: I1008 15:01:25.160993 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-db6d7f97b-rmpsl_31d49be4-108d-4004-973f-0fd539dd1037/manager/0.log" Oct 08 15:01:25 crc kubenswrapper[4735]: I1008 15:01:25.262433 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76796d4c6b-dmw5m_11418f7b-2467-49f3-ad5d-1786f70c023b/kube-rbac-proxy/0.log" Oct 08 15:01:25 crc kubenswrapper[4735]: I1008 15:01:25.535598 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-56c698c775-zlnwj_8683ca58-b437-4d2d-aa45-13789ebc0cb4/manager/0.log" Oct 08 15:01:25 crc kubenswrapper[4735]: I1008 15:01:25.543351 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-56c698c775-zlnwj_8683ca58-b437-4d2d-aa45-13789ebc0cb4/kube-rbac-proxy/0.log" Oct 08 15:01:25 crc kubenswrapper[4735]: I1008 15:01:25.628855 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76796d4c6b-dmw5m_11418f7b-2467-49f3-ad5d-1786f70c023b/manager/0.log" Oct 08 15:01:25 crc kubenswrapper[4735]: I1008 15:01:25.802825 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-7794bc6bd-wfddw_7aa296d7-2605-498a-b618-5569accc5974/manager/0.log" Oct 08 15:01:25 crc kubenswrapper[4735]: I1008 15:01:25.866124 4735 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-7794bc6bd-wfddw_7aa296d7-2605-498a-b618-5569accc5974/kube-rbac-proxy/0.log" Oct 08 15:01:26 crc kubenswrapper[4735]: I1008 15:01:26.202715 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-8bc6b8f5b-2z94k_d2b6aa63-7896-4d74-aced-bd70743b04a6/manager/0.log" Oct 08 15:01:30 crc kubenswrapper[4735]: I1008 15:01:30.149487 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-h6hrq"] Oct 08 15:01:30 crc kubenswrapper[4735]: E1008 15:01:30.151809 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d59bced-5f10-4613-89f5-a7c8740b8055" containerName="container-00" Oct 08 15:01:30 crc kubenswrapper[4735]: I1008 15:01:30.151839 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d59bced-5f10-4613-89f5-a7c8740b8055" containerName="container-00" Oct 08 15:01:30 crc kubenswrapper[4735]: I1008 15:01:30.152092 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d59bced-5f10-4613-89f5-a7c8740b8055" containerName="container-00" Oct 08 15:01:30 crc kubenswrapper[4735]: I1008 15:01:30.156187 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-h6hrq" Oct 08 15:01:30 crc kubenswrapper[4735]: I1008 15:01:30.168003 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-h6hrq"] Oct 08 15:01:30 crc kubenswrapper[4735]: I1008 15:01:30.286919 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5rx6\" (UniqueName: \"kubernetes.io/projected/675e1df9-b10d-4d55-b196-25d3f305ec5c-kube-api-access-d5rx6\") pod \"redhat-operators-h6hrq\" (UID: \"675e1df9-b10d-4d55-b196-25d3f305ec5c\") " pod="openshift-marketplace/redhat-operators-h6hrq" Oct 08 15:01:30 crc kubenswrapper[4735]: I1008 15:01:30.287036 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/675e1df9-b10d-4d55-b196-25d3f305ec5c-catalog-content\") pod \"redhat-operators-h6hrq\" (UID: \"675e1df9-b10d-4d55-b196-25d3f305ec5c\") " pod="openshift-marketplace/redhat-operators-h6hrq" Oct 08 15:01:30 crc kubenswrapper[4735]: I1008 15:01:30.287287 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/675e1df9-b10d-4d55-b196-25d3f305ec5c-utilities\") pod \"redhat-operators-h6hrq\" (UID: \"675e1df9-b10d-4d55-b196-25d3f305ec5c\") " pod="openshift-marketplace/redhat-operators-h6hrq" Oct 08 15:01:30 crc kubenswrapper[4735]: I1008 15:01:30.389118 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5rx6\" (UniqueName: \"kubernetes.io/projected/675e1df9-b10d-4d55-b196-25d3f305ec5c-kube-api-access-d5rx6\") pod \"redhat-operators-h6hrq\" (UID: \"675e1df9-b10d-4d55-b196-25d3f305ec5c\") " pod="openshift-marketplace/redhat-operators-h6hrq" Oct 08 15:01:30 crc kubenswrapper[4735]: I1008 15:01:30.389543 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/675e1df9-b10d-4d55-b196-25d3f305ec5c-catalog-content\") pod \"redhat-operators-h6hrq\" (UID: \"675e1df9-b10d-4d55-b196-25d3f305ec5c\") " pod="openshift-marketplace/redhat-operators-h6hrq" Oct 08 15:01:30 crc kubenswrapper[4735]: I1008 15:01:30.390057 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/675e1df9-b10d-4d55-b196-25d3f305ec5c-catalog-content\") pod \"redhat-operators-h6hrq\" (UID: \"675e1df9-b10d-4d55-b196-25d3f305ec5c\") " pod="openshift-marketplace/redhat-operators-h6hrq" Oct 08 15:01:30 crc kubenswrapper[4735]: I1008 15:01:30.391003 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/675e1df9-b10d-4d55-b196-25d3f305ec5c-utilities\") pod \"redhat-operators-h6hrq\" (UID: \"675e1df9-b10d-4d55-b196-25d3f305ec5c\") " pod="openshift-marketplace/redhat-operators-h6hrq" Oct 08 15:01:30 crc kubenswrapper[4735]: I1008 15:01:30.391336 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/675e1df9-b10d-4d55-b196-25d3f305ec5c-utilities\") pod \"redhat-operators-h6hrq\" (UID: \"675e1df9-b10d-4d55-b196-25d3f305ec5c\") " pod="openshift-marketplace/redhat-operators-h6hrq" Oct 08 15:01:30 crc kubenswrapper[4735]: I1008 15:01:30.411055 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5rx6\" (UniqueName: \"kubernetes.io/projected/675e1df9-b10d-4d55-b196-25d3f305ec5c-kube-api-access-d5rx6\") pod \"redhat-operators-h6hrq\" (UID: \"675e1df9-b10d-4d55-b196-25d3f305ec5c\") " pod="openshift-marketplace/redhat-operators-h6hrq" Oct 08 15:01:30 crc kubenswrapper[4735]: I1008 15:01:30.486665 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-h6hrq" Oct 08 15:01:31 crc kubenswrapper[4735]: I1008 15:01:31.002727 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-h6hrq"] Oct 08 15:01:31 crc kubenswrapper[4735]: I1008 15:01:31.098379 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h6hrq" event={"ID":"675e1df9-b10d-4d55-b196-25d3f305ec5c","Type":"ContainerStarted","Data":"782ee7c8ffb6a1e03026eae468bb4ed494040b4baa03b3d134a7c18e708402cc"} Oct 08 15:01:32 crc kubenswrapper[4735]: I1008 15:01:32.113062 4735 generic.go:334] "Generic (PLEG): container finished" podID="675e1df9-b10d-4d55-b196-25d3f305ec5c" containerID="10a8fbd48de548c122706af11742e5d6ab72d505e6b29d5ed451a6c1e698def9" exitCode=0 Oct 08 15:01:32 crc kubenswrapper[4735]: I1008 15:01:32.113400 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h6hrq" event={"ID":"675e1df9-b10d-4d55-b196-25d3f305ec5c","Type":"ContainerDied","Data":"10a8fbd48de548c122706af11742e5d6ab72d505e6b29d5ed451a6c1e698def9"} Oct 08 15:01:32 crc kubenswrapper[4735]: I1008 15:01:32.131339 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 08 15:01:33 crc kubenswrapper[4735]: I1008 15:01:33.125036 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h6hrq" event={"ID":"675e1df9-b10d-4d55-b196-25d3f305ec5c","Type":"ContainerStarted","Data":"eb1e7b7b57b12462912eeb7968078c056e0bbc03c088aa63ee26905e4d23ede7"} Oct 08 15:01:36 crc kubenswrapper[4735]: I1008 15:01:36.160309 4735 generic.go:334] "Generic (PLEG): container finished" podID="675e1df9-b10d-4d55-b196-25d3f305ec5c" containerID="eb1e7b7b57b12462912eeb7968078c056e0bbc03c088aa63ee26905e4d23ede7" exitCode=0 Oct 08 15:01:36 crc kubenswrapper[4735]: I1008 15:01:36.160412 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-h6hrq" event={"ID":"675e1df9-b10d-4d55-b196-25d3f305ec5c","Type":"ContainerDied","Data":"eb1e7b7b57b12462912eeb7968078c056e0bbc03c088aa63ee26905e4d23ede7"} Oct 08 15:01:37 crc kubenswrapper[4735]: I1008 15:01:37.172446 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h6hrq" event={"ID":"675e1df9-b10d-4d55-b196-25d3f305ec5c","Type":"ContainerStarted","Data":"deb97359a5dc2caca308bf04c8c7bff46a404e737fc5905f184e911c8f2091de"} Oct 08 15:01:37 crc kubenswrapper[4735]: I1008 15:01:37.192102 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-h6hrq" podStartSLOduration=2.443405758 podStartE2EDuration="7.192083914s" podCreationTimestamp="2025-10-08 15:01:30 +0000 UTC" firstStartedPulling="2025-10-08 15:01:32.130869665 +0000 UTC m=+7541.939174219" lastFinishedPulling="2025-10-08 15:01:36.879547831 +0000 UTC m=+7546.687852375" observedRunningTime="2025-10-08 15:01:37.189380715 +0000 UTC m=+7546.997685279" watchObservedRunningTime="2025-10-08 15:01:37.192083914 +0000 UTC m=+7547.000388458" Oct 08 15:01:38 crc kubenswrapper[4735]: I1008 15:01:38.894760 4735 scope.go:117] "RemoveContainer" containerID="584da6e7d0cb04e94bd7ca6ddf379929301c09a4661e8cd1fc0a739f5f2554cb" Oct 08 15:01:38 crc kubenswrapper[4735]: E1008 15:01:38.895624 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 15:01:40 crc kubenswrapper[4735]: I1008 15:01:40.487374 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-h6hrq" Oct 08 15:01:40 crc kubenswrapper[4735]: I1008 15:01:40.487437 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-h6hrq" Oct 08 15:01:41 crc kubenswrapper[4735]: I1008 15:01:41.537674 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-h6hrq" podUID="675e1df9-b10d-4d55-b196-25d3f305ec5c" containerName="registry-server" probeResult="failure" output=< Oct 08 15:01:41 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Oct 08 15:01:41 crc kubenswrapper[4735]: > Oct 08 15:01:42 crc kubenswrapper[4735]: I1008 15:01:42.549658 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-zmvkr_bb722212-8f18-46b9-b5b5-484352c49bba/control-plane-machine-set-operator/0.log" Oct 08 15:01:42 crc kubenswrapper[4735]: I1008 15:01:42.722435 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-xkrqh_99871c37-c1b7-48c4-bdec-d01234433109/kube-rbac-proxy/0.log" Oct 08 15:01:42 crc kubenswrapper[4735]: I1008 15:01:42.765041 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-xkrqh_99871c37-c1b7-48c4-bdec-d01234433109/machine-api-operator/0.log" Oct 08 15:01:50 crc kubenswrapper[4735]: I1008 15:01:50.535191 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-h6hrq" Oct 08 15:01:50 crc kubenswrapper[4735]: I1008 15:01:50.599593 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-h6hrq" Oct 08 15:01:50 crc kubenswrapper[4735]: I1008 15:01:50.779994 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-h6hrq"] Oct 08 15:01:51 crc 
kubenswrapper[4735]: I1008 15:01:51.896100 4735 scope.go:117] "RemoveContainer" containerID="584da6e7d0cb04e94bd7ca6ddf379929301c09a4661e8cd1fc0a739f5f2554cb" Oct 08 15:01:51 crc kubenswrapper[4735]: E1008 15:01:51.896467 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 15:01:52 crc kubenswrapper[4735]: I1008 15:01:52.325820 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-h6hrq" podUID="675e1df9-b10d-4d55-b196-25d3f305ec5c" containerName="registry-server" containerID="cri-o://deb97359a5dc2caca308bf04c8c7bff46a404e737fc5905f184e911c8f2091de" gracePeriod=2 Oct 08 15:01:52 crc kubenswrapper[4735]: I1008 15:01:52.831983 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-h6hrq" Oct 08 15:01:52 crc kubenswrapper[4735]: I1008 15:01:52.880824 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5rx6\" (UniqueName: \"kubernetes.io/projected/675e1df9-b10d-4d55-b196-25d3f305ec5c-kube-api-access-d5rx6\") pod \"675e1df9-b10d-4d55-b196-25d3f305ec5c\" (UID: \"675e1df9-b10d-4d55-b196-25d3f305ec5c\") " Oct 08 15:01:52 crc kubenswrapper[4735]: I1008 15:01:52.880888 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/675e1df9-b10d-4d55-b196-25d3f305ec5c-utilities\") pod \"675e1df9-b10d-4d55-b196-25d3f305ec5c\" (UID: \"675e1df9-b10d-4d55-b196-25d3f305ec5c\") " Oct 08 15:01:52 crc kubenswrapper[4735]: I1008 15:01:52.880977 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/675e1df9-b10d-4d55-b196-25d3f305ec5c-catalog-content\") pod \"675e1df9-b10d-4d55-b196-25d3f305ec5c\" (UID: \"675e1df9-b10d-4d55-b196-25d3f305ec5c\") " Oct 08 15:01:52 crc kubenswrapper[4735]: I1008 15:01:52.882830 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/675e1df9-b10d-4d55-b196-25d3f305ec5c-utilities" (OuterVolumeSpecName: "utilities") pod "675e1df9-b10d-4d55-b196-25d3f305ec5c" (UID: "675e1df9-b10d-4d55-b196-25d3f305ec5c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 15:01:52 crc kubenswrapper[4735]: I1008 15:01:52.887756 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/675e1df9-b10d-4d55-b196-25d3f305ec5c-kube-api-access-d5rx6" (OuterVolumeSpecName: "kube-api-access-d5rx6") pod "675e1df9-b10d-4d55-b196-25d3f305ec5c" (UID: "675e1df9-b10d-4d55-b196-25d3f305ec5c"). InnerVolumeSpecName "kube-api-access-d5rx6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 15:01:52 crc kubenswrapper[4735]: I1008 15:01:52.978631 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/675e1df9-b10d-4d55-b196-25d3f305ec5c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "675e1df9-b10d-4d55-b196-25d3f305ec5c" (UID: "675e1df9-b10d-4d55-b196-25d3f305ec5c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 15:01:52 crc kubenswrapper[4735]: I1008 15:01:52.984649 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5rx6\" (UniqueName: \"kubernetes.io/projected/675e1df9-b10d-4d55-b196-25d3f305ec5c-kube-api-access-d5rx6\") on node \"crc\" DevicePath \"\"" Oct 08 15:01:52 crc kubenswrapper[4735]: I1008 15:01:52.984678 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/675e1df9-b10d-4d55-b196-25d3f305ec5c-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 15:01:52 crc kubenswrapper[4735]: I1008 15:01:52.984688 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/675e1df9-b10d-4d55-b196-25d3f305ec5c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 15:01:53 crc kubenswrapper[4735]: I1008 15:01:53.338241 4735 generic.go:334] "Generic (PLEG): container finished" podID="675e1df9-b10d-4d55-b196-25d3f305ec5c" containerID="deb97359a5dc2caca308bf04c8c7bff46a404e737fc5905f184e911c8f2091de" exitCode=0 Oct 08 15:01:53 crc kubenswrapper[4735]: I1008 15:01:53.338333 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-h6hrq" Oct 08 15:01:53 crc kubenswrapper[4735]: I1008 15:01:53.338330 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h6hrq" event={"ID":"675e1df9-b10d-4d55-b196-25d3f305ec5c","Type":"ContainerDied","Data":"deb97359a5dc2caca308bf04c8c7bff46a404e737fc5905f184e911c8f2091de"} Oct 08 15:01:53 crc kubenswrapper[4735]: I1008 15:01:53.339000 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h6hrq" event={"ID":"675e1df9-b10d-4d55-b196-25d3f305ec5c","Type":"ContainerDied","Data":"782ee7c8ffb6a1e03026eae468bb4ed494040b4baa03b3d134a7c18e708402cc"} Oct 08 15:01:53 crc kubenswrapper[4735]: I1008 15:01:53.339088 4735 scope.go:117] "RemoveContainer" containerID="deb97359a5dc2caca308bf04c8c7bff46a404e737fc5905f184e911c8f2091de" Oct 08 15:01:53 crc kubenswrapper[4735]: I1008 15:01:53.360340 4735 scope.go:117] "RemoveContainer" containerID="eb1e7b7b57b12462912eeb7968078c056e0bbc03c088aa63ee26905e4d23ede7" Oct 08 15:01:53 crc kubenswrapper[4735]: I1008 15:01:53.373507 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-h6hrq"] Oct 08 15:01:53 crc kubenswrapper[4735]: I1008 15:01:53.384795 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-h6hrq"] Oct 08 15:01:53 crc kubenswrapper[4735]: I1008 15:01:53.393709 4735 scope.go:117] "RemoveContainer" containerID="10a8fbd48de548c122706af11742e5d6ab72d505e6b29d5ed451a6c1e698def9" Oct 08 15:01:53 crc kubenswrapper[4735]: I1008 15:01:53.431179 4735 scope.go:117] "RemoveContainer" containerID="deb97359a5dc2caca308bf04c8c7bff46a404e737fc5905f184e911c8f2091de" Oct 08 15:01:53 crc kubenswrapper[4735]: E1008 15:01:53.431812 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"deb97359a5dc2caca308bf04c8c7bff46a404e737fc5905f184e911c8f2091de\": container with ID starting with deb97359a5dc2caca308bf04c8c7bff46a404e737fc5905f184e911c8f2091de not found: ID does not exist" containerID="deb97359a5dc2caca308bf04c8c7bff46a404e737fc5905f184e911c8f2091de" Oct 08 15:01:53 crc kubenswrapper[4735]: I1008 15:01:53.431856 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"deb97359a5dc2caca308bf04c8c7bff46a404e737fc5905f184e911c8f2091de"} err="failed to get container status \"deb97359a5dc2caca308bf04c8c7bff46a404e737fc5905f184e911c8f2091de\": rpc error: code = NotFound desc = could not find container \"deb97359a5dc2caca308bf04c8c7bff46a404e737fc5905f184e911c8f2091de\": container with ID starting with deb97359a5dc2caca308bf04c8c7bff46a404e737fc5905f184e911c8f2091de not found: ID does not exist" Oct 08 15:01:53 crc kubenswrapper[4735]: I1008 15:01:53.431885 4735 scope.go:117] "RemoveContainer" containerID="eb1e7b7b57b12462912eeb7968078c056e0bbc03c088aa63ee26905e4d23ede7" Oct 08 15:01:53 crc kubenswrapper[4735]: E1008 15:01:53.432960 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb1e7b7b57b12462912eeb7968078c056e0bbc03c088aa63ee26905e4d23ede7\": container with ID starting with eb1e7b7b57b12462912eeb7968078c056e0bbc03c088aa63ee26905e4d23ede7 not found: ID does not exist" containerID="eb1e7b7b57b12462912eeb7968078c056e0bbc03c088aa63ee26905e4d23ede7" Oct 08 15:01:53 crc kubenswrapper[4735]: I1008 15:01:53.433011 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb1e7b7b57b12462912eeb7968078c056e0bbc03c088aa63ee26905e4d23ede7"} err="failed to get container status \"eb1e7b7b57b12462912eeb7968078c056e0bbc03c088aa63ee26905e4d23ede7\": rpc error: code = NotFound desc = could not find container \"eb1e7b7b57b12462912eeb7968078c056e0bbc03c088aa63ee26905e4d23ede7\": container with ID 
starting with eb1e7b7b57b12462912eeb7968078c056e0bbc03c088aa63ee26905e4d23ede7 not found: ID does not exist" Oct 08 15:01:53 crc kubenswrapper[4735]: I1008 15:01:53.433043 4735 scope.go:117] "RemoveContainer" containerID="10a8fbd48de548c122706af11742e5d6ab72d505e6b29d5ed451a6c1e698def9" Oct 08 15:01:53 crc kubenswrapper[4735]: E1008 15:01:53.433654 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10a8fbd48de548c122706af11742e5d6ab72d505e6b29d5ed451a6c1e698def9\": container with ID starting with 10a8fbd48de548c122706af11742e5d6ab72d505e6b29d5ed451a6c1e698def9 not found: ID does not exist" containerID="10a8fbd48de548c122706af11742e5d6ab72d505e6b29d5ed451a6c1e698def9" Oct 08 15:01:53 crc kubenswrapper[4735]: I1008 15:01:53.433698 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10a8fbd48de548c122706af11742e5d6ab72d505e6b29d5ed451a6c1e698def9"} err="failed to get container status \"10a8fbd48de548c122706af11742e5d6ab72d505e6b29d5ed451a6c1e698def9\": rpc error: code = NotFound desc = could not find container \"10a8fbd48de548c122706af11742e5d6ab72d505e6b29d5ed451a6c1e698def9\": container with ID starting with 10a8fbd48de548c122706af11742e5d6ab72d505e6b29d5ed451a6c1e698def9 not found: ID does not exist" Oct 08 15:01:53 crc kubenswrapper[4735]: I1008 15:01:53.972024 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-7d4cc89fcb-fln6n_a70a475f-648d-4a84-baa5-b2c2f26c0567/cert-manager-controller/0.log" Oct 08 15:01:54 crc kubenswrapper[4735]: I1008 15:01:54.137880 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7d9f95dbf-4wpgw_367d355c-1c4f-465c-b88b-7de1cc0afd3e/cert-manager-cainjector/0.log" Oct 08 15:01:54 crc kubenswrapper[4735]: I1008 15:01:54.186799 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-webhook-d969966f-8k9ww_448cb414-b6c4-41dd-b4c8-231e28be6fcb/cert-manager-webhook/0.log" Oct 08 15:01:54 crc kubenswrapper[4735]: I1008 15:01:54.908797 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="675e1df9-b10d-4d55-b196-25d3f305ec5c" path="/var/lib/kubelet/pods/675e1df9-b10d-4d55-b196-25d3f305ec5c/volumes" Oct 08 15:02:03 crc kubenswrapper[4735]: I1008 15:02:03.894712 4735 scope.go:117] "RemoveContainer" containerID="584da6e7d0cb04e94bd7ca6ddf379929301c09a4661e8cd1fc0a739f5f2554cb" Oct 08 15:02:03 crc kubenswrapper[4735]: E1008 15:02:03.895644 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 15:02:05 crc kubenswrapper[4735]: I1008 15:02:05.135942 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-mz79v_98be5146-db56-4a21-9ffb-9875c6e07a0d/nmstate-console-plugin/0.log" Oct 08 15:02:05 crc kubenswrapper[4735]: I1008 15:02:05.325597 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-dbz7p_6cc8c3dc-00ba-45e7-9d57-6ad40a580c16/nmstate-handler/0.log" Oct 08 15:02:05 crc kubenswrapper[4735]: I1008 15:02:05.375262 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-cjhw7_f34fd5b9-a3a6-47e7-8070-857df4befa39/kube-rbac-proxy/0.log" Oct 08 15:02:05 crc kubenswrapper[4735]: I1008 15:02:05.397326 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-cjhw7_f34fd5b9-a3a6-47e7-8070-857df4befa39/nmstate-metrics/0.log" Oct 08 15:02:05 crc kubenswrapper[4735]: I1008 15:02:05.582051 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-xvhhz_0751005e-8ad1-4fc8-81f8-c0e7074fccd3/nmstate-operator/0.log" Oct 08 15:02:05 crc kubenswrapper[4735]: I1008 15:02:05.690379 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-r9lhp_aac9fb9c-bf97-4090-b212-ea396b0443c4/nmstate-webhook/0.log" Oct 08 15:02:14 crc kubenswrapper[4735]: I1008 15:02:14.907599 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6dlnr"] Oct 08 15:02:14 crc kubenswrapper[4735]: E1008 15:02:14.908667 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="675e1df9-b10d-4d55-b196-25d3f305ec5c" containerName="extract-utilities" Oct 08 15:02:14 crc kubenswrapper[4735]: I1008 15:02:14.908684 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="675e1df9-b10d-4d55-b196-25d3f305ec5c" containerName="extract-utilities" Oct 08 15:02:14 crc kubenswrapper[4735]: E1008 15:02:14.908709 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="675e1df9-b10d-4d55-b196-25d3f305ec5c" containerName="registry-server" Oct 08 15:02:14 crc kubenswrapper[4735]: I1008 15:02:14.908718 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="675e1df9-b10d-4d55-b196-25d3f305ec5c" containerName="registry-server" Oct 08 15:02:14 crc kubenswrapper[4735]: E1008 15:02:14.908728 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="675e1df9-b10d-4d55-b196-25d3f305ec5c" containerName="extract-content" Oct 08 15:02:14 crc kubenswrapper[4735]: I1008 15:02:14.908735 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="675e1df9-b10d-4d55-b196-25d3f305ec5c" containerName="extract-content" Oct 08 15:02:14 crc 
kubenswrapper[4735]: I1008 15:02:14.909036 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="675e1df9-b10d-4d55-b196-25d3f305ec5c" containerName="registry-server" Oct 08 15:02:14 crc kubenswrapper[4735]: I1008 15:02:14.910987 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6dlnr" Oct 08 15:02:14 crc kubenswrapper[4735]: I1008 15:02:14.935231 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6dlnr"] Oct 08 15:02:14 crc kubenswrapper[4735]: I1008 15:02:14.948878 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49051cf9-ccce-4f3a-8629-4cd13d975acc-catalog-content\") pod \"redhat-marketplace-6dlnr\" (UID: \"49051cf9-ccce-4f3a-8629-4cd13d975acc\") " pod="openshift-marketplace/redhat-marketplace-6dlnr" Oct 08 15:02:14 crc kubenswrapper[4735]: I1008 15:02:14.949316 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jvnj\" (UniqueName: \"kubernetes.io/projected/49051cf9-ccce-4f3a-8629-4cd13d975acc-kube-api-access-2jvnj\") pod \"redhat-marketplace-6dlnr\" (UID: \"49051cf9-ccce-4f3a-8629-4cd13d975acc\") " pod="openshift-marketplace/redhat-marketplace-6dlnr" Oct 08 15:02:14 crc kubenswrapper[4735]: I1008 15:02:14.949510 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49051cf9-ccce-4f3a-8629-4cd13d975acc-utilities\") pod \"redhat-marketplace-6dlnr\" (UID: \"49051cf9-ccce-4f3a-8629-4cd13d975acc\") " pod="openshift-marketplace/redhat-marketplace-6dlnr" Oct 08 15:02:15 crc kubenswrapper[4735]: I1008 15:02:15.051695 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jvnj\" (UniqueName: 
\"kubernetes.io/projected/49051cf9-ccce-4f3a-8629-4cd13d975acc-kube-api-access-2jvnj\") pod \"redhat-marketplace-6dlnr\" (UID: \"49051cf9-ccce-4f3a-8629-4cd13d975acc\") " pod="openshift-marketplace/redhat-marketplace-6dlnr" Oct 08 15:02:15 crc kubenswrapper[4735]: I1008 15:02:15.051807 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49051cf9-ccce-4f3a-8629-4cd13d975acc-utilities\") pod \"redhat-marketplace-6dlnr\" (UID: \"49051cf9-ccce-4f3a-8629-4cd13d975acc\") " pod="openshift-marketplace/redhat-marketplace-6dlnr" Oct 08 15:02:15 crc kubenswrapper[4735]: I1008 15:02:15.051914 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49051cf9-ccce-4f3a-8629-4cd13d975acc-catalog-content\") pod \"redhat-marketplace-6dlnr\" (UID: \"49051cf9-ccce-4f3a-8629-4cd13d975acc\") " pod="openshift-marketplace/redhat-marketplace-6dlnr" Oct 08 15:02:15 crc kubenswrapper[4735]: I1008 15:02:15.052348 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49051cf9-ccce-4f3a-8629-4cd13d975acc-utilities\") pod \"redhat-marketplace-6dlnr\" (UID: \"49051cf9-ccce-4f3a-8629-4cd13d975acc\") " pod="openshift-marketplace/redhat-marketplace-6dlnr" Oct 08 15:02:15 crc kubenswrapper[4735]: I1008 15:02:15.052375 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49051cf9-ccce-4f3a-8629-4cd13d975acc-catalog-content\") pod \"redhat-marketplace-6dlnr\" (UID: \"49051cf9-ccce-4f3a-8629-4cd13d975acc\") " pod="openshift-marketplace/redhat-marketplace-6dlnr" Oct 08 15:02:15 crc kubenswrapper[4735]: I1008 15:02:15.075466 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jvnj\" (UniqueName: 
\"kubernetes.io/projected/49051cf9-ccce-4f3a-8629-4cd13d975acc-kube-api-access-2jvnj\") pod \"redhat-marketplace-6dlnr\" (UID: \"49051cf9-ccce-4f3a-8629-4cd13d975acc\") " pod="openshift-marketplace/redhat-marketplace-6dlnr" Oct 08 15:02:15 crc kubenswrapper[4735]: I1008 15:02:15.237019 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6dlnr" Oct 08 15:02:15 crc kubenswrapper[4735]: I1008 15:02:15.780076 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6dlnr"] Oct 08 15:02:16 crc kubenswrapper[4735]: I1008 15:02:16.607810 4735 generic.go:334] "Generic (PLEG): container finished" podID="49051cf9-ccce-4f3a-8629-4cd13d975acc" containerID="336c4a91bae0b17a767409a5fc1ac5c8f0b1c2d0b559b81e400a3992888077fd" exitCode=0 Oct 08 15:02:16 crc kubenswrapper[4735]: I1008 15:02:16.607910 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6dlnr" event={"ID":"49051cf9-ccce-4f3a-8629-4cd13d975acc","Type":"ContainerDied","Data":"336c4a91bae0b17a767409a5fc1ac5c8f0b1c2d0b559b81e400a3992888077fd"} Oct 08 15:02:16 crc kubenswrapper[4735]: I1008 15:02:16.608112 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6dlnr" event={"ID":"49051cf9-ccce-4f3a-8629-4cd13d975acc","Type":"ContainerStarted","Data":"06a992d1dad1325a6b544906ac34c57f248bb476d4cf6b3a44f6615caf8b2f36"} Oct 08 15:02:17 crc kubenswrapper[4735]: I1008 15:02:17.620569 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6dlnr" event={"ID":"49051cf9-ccce-4f3a-8629-4cd13d975acc","Type":"ContainerStarted","Data":"1041f8c294902327d3a8308b419be0c93aa0e2c8e1f21529a6d9d347994e92b7"} Oct 08 15:02:18 crc kubenswrapper[4735]: I1008 15:02:18.634785 4735 generic.go:334] "Generic (PLEG): container finished" podID="49051cf9-ccce-4f3a-8629-4cd13d975acc" 
containerID="1041f8c294902327d3a8308b419be0c93aa0e2c8e1f21529a6d9d347994e92b7" exitCode=0 Oct 08 15:02:18 crc kubenswrapper[4735]: I1008 15:02:18.635077 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6dlnr" event={"ID":"49051cf9-ccce-4f3a-8629-4cd13d975acc","Type":"ContainerDied","Data":"1041f8c294902327d3a8308b419be0c93aa0e2c8e1f21529a6d9d347994e92b7"} Oct 08 15:02:18 crc kubenswrapper[4735]: I1008 15:02:18.895325 4735 scope.go:117] "RemoveContainer" containerID="584da6e7d0cb04e94bd7ca6ddf379929301c09a4661e8cd1fc0a739f5f2554cb" Oct 08 15:02:18 crc kubenswrapper[4735]: E1008 15:02:18.895732 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 15:02:19 crc kubenswrapper[4735]: I1008 15:02:19.648930 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6dlnr" event={"ID":"49051cf9-ccce-4f3a-8629-4cd13d975acc","Type":"ContainerStarted","Data":"1165c75da5499ee6f35cfd54ab577d776b6a281d70b03a741c360816b2105db6"} Oct 08 15:02:19 crc kubenswrapper[4735]: I1008 15:02:19.673432 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6dlnr" podStartSLOduration=3.196075465 podStartE2EDuration="5.673407732s" podCreationTimestamp="2025-10-08 15:02:14 +0000 UTC" firstStartedPulling="2025-10-08 15:02:16.609871644 +0000 UTC m=+7586.418176188" lastFinishedPulling="2025-10-08 15:02:19.087203911 +0000 UTC m=+7588.895508455" observedRunningTime="2025-10-08 15:02:19.667312158 +0000 UTC m=+7589.475616722" watchObservedRunningTime="2025-10-08 
15:02:19.673407732 +0000 UTC m=+7589.481712276" Oct 08 15:02:21 crc kubenswrapper[4735]: I1008 15:02:21.544627 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-6r8bv_54d730e4-2333-42b5-ac3f-fd36eaccf6c6/kube-rbac-proxy/0.log" Oct 08 15:02:21 crc kubenswrapper[4735]: I1008 15:02:21.805089 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h2xst_da1954e4-438d-4fb3-a435-1fbf29da5382/cp-frr-files/0.log" Oct 08 15:02:21 crc kubenswrapper[4735]: I1008 15:02:21.891388 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-6r8bv_54d730e4-2333-42b5-ac3f-fd36eaccf6c6/controller/0.log" Oct 08 15:02:22 crc kubenswrapper[4735]: I1008 15:02:22.029004 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h2xst_da1954e4-438d-4fb3-a435-1fbf29da5382/cp-reloader/0.log" Oct 08 15:02:22 crc kubenswrapper[4735]: I1008 15:02:22.044411 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h2xst_da1954e4-438d-4fb3-a435-1fbf29da5382/cp-frr-files/0.log" Oct 08 15:02:22 crc kubenswrapper[4735]: I1008 15:02:22.050033 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h2xst_da1954e4-438d-4fb3-a435-1fbf29da5382/cp-metrics/0.log" Oct 08 15:02:22 crc kubenswrapper[4735]: I1008 15:02:22.099642 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h2xst_da1954e4-438d-4fb3-a435-1fbf29da5382/cp-reloader/0.log" Oct 08 15:02:22 crc kubenswrapper[4735]: I1008 15:02:22.317396 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h2xst_da1954e4-438d-4fb3-a435-1fbf29da5382/cp-metrics/0.log" Oct 08 15:02:22 crc kubenswrapper[4735]: I1008 15:02:22.319172 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h2xst_da1954e4-438d-4fb3-a435-1fbf29da5382/cp-frr-files/0.log" Oct 08 
15:02:22 crc kubenswrapper[4735]: I1008 15:02:22.355554 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h2xst_da1954e4-438d-4fb3-a435-1fbf29da5382/cp-metrics/0.log" Oct 08 15:02:22 crc kubenswrapper[4735]: I1008 15:02:22.358429 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h2xst_da1954e4-438d-4fb3-a435-1fbf29da5382/cp-reloader/0.log" Oct 08 15:02:22 crc kubenswrapper[4735]: I1008 15:02:22.555881 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h2xst_da1954e4-438d-4fb3-a435-1fbf29da5382/cp-frr-files/0.log" Oct 08 15:02:22 crc kubenswrapper[4735]: I1008 15:02:22.567070 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h2xst_da1954e4-438d-4fb3-a435-1fbf29da5382/cp-reloader/0.log" Oct 08 15:02:22 crc kubenswrapper[4735]: I1008 15:02:22.598727 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h2xst_da1954e4-438d-4fb3-a435-1fbf29da5382/controller/0.log" Oct 08 15:02:22 crc kubenswrapper[4735]: I1008 15:02:22.605393 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h2xst_da1954e4-438d-4fb3-a435-1fbf29da5382/cp-metrics/0.log" Oct 08 15:02:22 crc kubenswrapper[4735]: I1008 15:02:22.738366 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h2xst_da1954e4-438d-4fb3-a435-1fbf29da5382/frr-metrics/0.log" Oct 08 15:02:22 crc kubenswrapper[4735]: I1008 15:02:22.792299 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h2xst_da1954e4-438d-4fb3-a435-1fbf29da5382/kube-rbac-proxy/0.log" Oct 08 15:02:22 crc kubenswrapper[4735]: I1008 15:02:22.844146 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h2xst_da1954e4-438d-4fb3-a435-1fbf29da5382/kube-rbac-proxy-frr/0.log" Oct 08 15:02:22 crc kubenswrapper[4735]: I1008 15:02:22.923118 4735 log.go:25] "Finished 
parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h2xst_da1954e4-438d-4fb3-a435-1fbf29da5382/reloader/0.log" Oct 08 15:02:23 crc kubenswrapper[4735]: I1008 15:02:23.218472 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-frhpw_89ab93e0-026b-4e98-a3d4-137f0d56accd/frr-k8s-webhook-server/0.log" Oct 08 15:02:23 crc kubenswrapper[4735]: I1008 15:02:23.304157 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-568b68fc89-tcglq_492d2d30-e201-405e-97aa-5457a763411d/manager/0.log" Oct 08 15:02:23 crc kubenswrapper[4735]: I1008 15:02:23.489992 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-85bdc78dfc-4j6pd_6646229c-d6d5-41ac-bcc2-22fcb3b2c737/webhook-server/0.log" Oct 08 15:02:23 crc kubenswrapper[4735]: I1008 15:02:23.637436 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-cf2dv_ca9e15c8-5fa1-4b84-a47b-4662865b29ff/kube-rbac-proxy/0.log" Oct 08 15:02:24 crc kubenswrapper[4735]: I1008 15:02:24.666290 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-cf2dv_ca9e15c8-5fa1-4b84-a47b-4662865b29ff/speaker/0.log" Oct 08 15:02:25 crc kubenswrapper[4735]: I1008 15:02:25.237130 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6dlnr" Oct 08 15:02:25 crc kubenswrapper[4735]: I1008 15:02:25.237438 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6dlnr" Oct 08 15:02:25 crc kubenswrapper[4735]: I1008 15:02:25.298438 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6dlnr" Oct 08 15:02:25 crc kubenswrapper[4735]: I1008 15:02:25.767401 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-6dlnr" Oct 08 15:02:25 crc kubenswrapper[4735]: I1008 15:02:25.842469 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6dlnr"] Oct 08 15:02:25 crc kubenswrapper[4735]: I1008 15:02:25.846983 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h2xst_da1954e4-438d-4fb3-a435-1fbf29da5382/frr/0.log" Oct 08 15:02:27 crc kubenswrapper[4735]: I1008 15:02:27.724018 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6dlnr" podUID="49051cf9-ccce-4f3a-8629-4cd13d975acc" containerName="registry-server" containerID="cri-o://1165c75da5499ee6f35cfd54ab577d776b6a281d70b03a741c360816b2105db6" gracePeriod=2 Oct 08 15:02:28 crc kubenswrapper[4735]: I1008 15:02:28.227899 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6dlnr" Oct 08 15:02:28 crc kubenswrapper[4735]: I1008 15:02:28.258580 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49051cf9-ccce-4f3a-8629-4cd13d975acc-utilities\") pod \"49051cf9-ccce-4f3a-8629-4cd13d975acc\" (UID: \"49051cf9-ccce-4f3a-8629-4cd13d975acc\") " Oct 08 15:02:28 crc kubenswrapper[4735]: I1008 15:02:28.258964 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jvnj\" (UniqueName: \"kubernetes.io/projected/49051cf9-ccce-4f3a-8629-4cd13d975acc-kube-api-access-2jvnj\") pod \"49051cf9-ccce-4f3a-8629-4cd13d975acc\" (UID: \"49051cf9-ccce-4f3a-8629-4cd13d975acc\") " Oct 08 15:02:28 crc kubenswrapper[4735]: I1008 15:02:28.259221 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49051cf9-ccce-4f3a-8629-4cd13d975acc-catalog-content\") pod 
\"49051cf9-ccce-4f3a-8629-4cd13d975acc\" (UID: \"49051cf9-ccce-4f3a-8629-4cd13d975acc\") " Oct 08 15:02:28 crc kubenswrapper[4735]: I1008 15:02:28.266401 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49051cf9-ccce-4f3a-8629-4cd13d975acc-kube-api-access-2jvnj" (OuterVolumeSpecName: "kube-api-access-2jvnj") pod "49051cf9-ccce-4f3a-8629-4cd13d975acc" (UID: "49051cf9-ccce-4f3a-8629-4cd13d975acc"). InnerVolumeSpecName "kube-api-access-2jvnj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 15:02:28 crc kubenswrapper[4735]: I1008 15:02:28.296237 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49051cf9-ccce-4f3a-8629-4cd13d975acc-utilities" (OuterVolumeSpecName: "utilities") pod "49051cf9-ccce-4f3a-8629-4cd13d975acc" (UID: "49051cf9-ccce-4f3a-8629-4cd13d975acc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 15:02:28 crc kubenswrapper[4735]: I1008 15:02:28.297901 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49051cf9-ccce-4f3a-8629-4cd13d975acc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "49051cf9-ccce-4f3a-8629-4cd13d975acc" (UID: "49051cf9-ccce-4f3a-8629-4cd13d975acc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 15:02:28 crc kubenswrapper[4735]: I1008 15:02:28.362302 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49051cf9-ccce-4f3a-8629-4cd13d975acc-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 15:02:28 crc kubenswrapper[4735]: I1008 15:02:28.362343 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jvnj\" (UniqueName: \"kubernetes.io/projected/49051cf9-ccce-4f3a-8629-4cd13d975acc-kube-api-access-2jvnj\") on node \"crc\" DevicePath \"\"" Oct 08 15:02:28 crc kubenswrapper[4735]: I1008 15:02:28.362355 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49051cf9-ccce-4f3a-8629-4cd13d975acc-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 15:02:28 crc kubenswrapper[4735]: I1008 15:02:28.736228 4735 generic.go:334] "Generic (PLEG): container finished" podID="49051cf9-ccce-4f3a-8629-4cd13d975acc" containerID="1165c75da5499ee6f35cfd54ab577d776b6a281d70b03a741c360816b2105db6" exitCode=0 Oct 08 15:02:28 crc kubenswrapper[4735]: I1008 15:02:28.736307 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6dlnr" Oct 08 15:02:28 crc kubenswrapper[4735]: I1008 15:02:28.736314 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6dlnr" event={"ID":"49051cf9-ccce-4f3a-8629-4cd13d975acc","Type":"ContainerDied","Data":"1165c75da5499ee6f35cfd54ab577d776b6a281d70b03a741c360816b2105db6"} Oct 08 15:02:28 crc kubenswrapper[4735]: I1008 15:02:28.736667 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6dlnr" event={"ID":"49051cf9-ccce-4f3a-8629-4cd13d975acc","Type":"ContainerDied","Data":"06a992d1dad1325a6b544906ac34c57f248bb476d4cf6b3a44f6615caf8b2f36"} Oct 08 15:02:28 crc kubenswrapper[4735]: I1008 15:02:28.736691 4735 scope.go:117] "RemoveContainer" containerID="1165c75da5499ee6f35cfd54ab577d776b6a281d70b03a741c360816b2105db6" Oct 08 15:02:28 crc kubenswrapper[4735]: I1008 15:02:28.768335 4735 scope.go:117] "RemoveContainer" containerID="1041f8c294902327d3a8308b419be0c93aa0e2c8e1f21529a6d9d347994e92b7" Oct 08 15:02:28 crc kubenswrapper[4735]: I1008 15:02:28.769353 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6dlnr"] Oct 08 15:02:28 crc kubenswrapper[4735]: I1008 15:02:28.784978 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6dlnr"] Oct 08 15:02:28 crc kubenswrapper[4735]: I1008 15:02:28.788623 4735 scope.go:117] "RemoveContainer" containerID="336c4a91bae0b17a767409a5fc1ac5c8f0b1c2d0b559b81e400a3992888077fd" Oct 08 15:02:28 crc kubenswrapper[4735]: I1008 15:02:28.840605 4735 scope.go:117] "RemoveContainer" containerID="1165c75da5499ee6f35cfd54ab577d776b6a281d70b03a741c360816b2105db6" Oct 08 15:02:28 crc kubenswrapper[4735]: E1008 15:02:28.841106 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"1165c75da5499ee6f35cfd54ab577d776b6a281d70b03a741c360816b2105db6\": container with ID starting with 1165c75da5499ee6f35cfd54ab577d776b6a281d70b03a741c360816b2105db6 not found: ID does not exist" containerID="1165c75da5499ee6f35cfd54ab577d776b6a281d70b03a741c360816b2105db6" Oct 08 15:02:28 crc kubenswrapper[4735]: I1008 15:02:28.841148 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1165c75da5499ee6f35cfd54ab577d776b6a281d70b03a741c360816b2105db6"} err="failed to get container status \"1165c75da5499ee6f35cfd54ab577d776b6a281d70b03a741c360816b2105db6\": rpc error: code = NotFound desc = could not find container \"1165c75da5499ee6f35cfd54ab577d776b6a281d70b03a741c360816b2105db6\": container with ID starting with 1165c75da5499ee6f35cfd54ab577d776b6a281d70b03a741c360816b2105db6 not found: ID does not exist" Oct 08 15:02:28 crc kubenswrapper[4735]: I1008 15:02:28.841176 4735 scope.go:117] "RemoveContainer" containerID="1041f8c294902327d3a8308b419be0c93aa0e2c8e1f21529a6d9d347994e92b7" Oct 08 15:02:28 crc kubenswrapper[4735]: E1008 15:02:28.841553 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1041f8c294902327d3a8308b419be0c93aa0e2c8e1f21529a6d9d347994e92b7\": container with ID starting with 1041f8c294902327d3a8308b419be0c93aa0e2c8e1f21529a6d9d347994e92b7 not found: ID does not exist" containerID="1041f8c294902327d3a8308b419be0c93aa0e2c8e1f21529a6d9d347994e92b7" Oct 08 15:02:28 crc kubenswrapper[4735]: I1008 15:02:28.841594 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1041f8c294902327d3a8308b419be0c93aa0e2c8e1f21529a6d9d347994e92b7"} err="failed to get container status \"1041f8c294902327d3a8308b419be0c93aa0e2c8e1f21529a6d9d347994e92b7\": rpc error: code = NotFound desc = could not find container \"1041f8c294902327d3a8308b419be0c93aa0e2c8e1f21529a6d9d347994e92b7\": container with ID 
starting with 1041f8c294902327d3a8308b419be0c93aa0e2c8e1f21529a6d9d347994e92b7 not found: ID does not exist" Oct 08 15:02:28 crc kubenswrapper[4735]: I1008 15:02:28.841622 4735 scope.go:117] "RemoveContainer" containerID="336c4a91bae0b17a767409a5fc1ac5c8f0b1c2d0b559b81e400a3992888077fd" Oct 08 15:02:28 crc kubenswrapper[4735]: E1008 15:02:28.841966 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"336c4a91bae0b17a767409a5fc1ac5c8f0b1c2d0b559b81e400a3992888077fd\": container with ID starting with 336c4a91bae0b17a767409a5fc1ac5c8f0b1c2d0b559b81e400a3992888077fd not found: ID does not exist" containerID="336c4a91bae0b17a767409a5fc1ac5c8f0b1c2d0b559b81e400a3992888077fd" Oct 08 15:02:28 crc kubenswrapper[4735]: I1008 15:02:28.841999 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"336c4a91bae0b17a767409a5fc1ac5c8f0b1c2d0b559b81e400a3992888077fd"} err="failed to get container status \"336c4a91bae0b17a767409a5fc1ac5c8f0b1c2d0b559b81e400a3992888077fd\": rpc error: code = NotFound desc = could not find container \"336c4a91bae0b17a767409a5fc1ac5c8f0b1c2d0b559b81e400a3992888077fd\": container with ID starting with 336c4a91bae0b17a767409a5fc1ac5c8f0b1c2d0b559b81e400a3992888077fd not found: ID does not exist" Oct 08 15:02:28 crc kubenswrapper[4735]: I1008 15:02:28.908341 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49051cf9-ccce-4f3a-8629-4cd13d975acc" path="/var/lib/kubelet/pods/49051cf9-ccce-4f3a-8629-4cd13d975acc/volumes" Oct 08 15:02:33 crc kubenswrapper[4735]: I1008 15:02:33.895018 4735 scope.go:117] "RemoveContainer" containerID="584da6e7d0cb04e94bd7ca6ddf379929301c09a4661e8cd1fc0a739f5f2554cb" Oct 08 15:02:33 crc kubenswrapper[4735]: E1008 15:02:33.895722 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 15:02:36 crc kubenswrapper[4735]: I1008 15:02:36.287744 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69zq75m_e3eae11a-5682-4826-9eca-ebc268c2114b/util/0.log" Oct 08 15:02:36 crc kubenswrapper[4735]: I1008 15:02:36.485812 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69zq75m_e3eae11a-5682-4826-9eca-ebc268c2114b/pull/0.log" Oct 08 15:02:36 crc kubenswrapper[4735]: I1008 15:02:36.503081 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69zq75m_e3eae11a-5682-4826-9eca-ebc268c2114b/util/0.log" Oct 08 15:02:36 crc kubenswrapper[4735]: I1008 15:02:36.527995 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69zq75m_e3eae11a-5682-4826-9eca-ebc268c2114b/pull/0.log" Oct 08 15:02:36 crc kubenswrapper[4735]: I1008 15:02:36.703433 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69zq75m_e3eae11a-5682-4826-9eca-ebc268c2114b/util/0.log" Oct 08 15:02:36 crc kubenswrapper[4735]: I1008 15:02:36.739751 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69zq75m_e3eae11a-5682-4826-9eca-ebc268c2114b/pull/0.log" Oct 08 15:02:36 crc kubenswrapper[4735]: I1008 15:02:36.776204 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69zq75m_e3eae11a-5682-4826-9eca-ebc268c2114b/extract/0.log" Oct 08 15:02:36 crc kubenswrapper[4735]: I1008 15:02:36.884562 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tf9cj_ea47ffd3-7063-48a5-82ca-0121e24a845e/util/0.log" Oct 08 15:02:37 crc kubenswrapper[4735]: I1008 15:02:37.080601 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tf9cj_ea47ffd3-7063-48a5-82ca-0121e24a845e/util/0.log" Oct 08 15:02:37 crc kubenswrapper[4735]: I1008 15:02:37.107095 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tf9cj_ea47ffd3-7063-48a5-82ca-0121e24a845e/pull/0.log" Oct 08 15:02:37 crc kubenswrapper[4735]: I1008 15:02:37.119761 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tf9cj_ea47ffd3-7063-48a5-82ca-0121e24a845e/pull/0.log" Oct 08 15:02:37 crc kubenswrapper[4735]: I1008 15:02:37.274742 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tf9cj_ea47ffd3-7063-48a5-82ca-0121e24a845e/extract/0.log" Oct 08 15:02:37 crc kubenswrapper[4735]: I1008 15:02:37.293611 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tf9cj_ea47ffd3-7063-48a5-82ca-0121e24a845e/pull/0.log" Oct 08 15:02:37 crc kubenswrapper[4735]: I1008 15:02:37.329446 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2tf9cj_ea47ffd3-7063-48a5-82ca-0121e24a845e/util/0.log" Oct 
08 15:02:37 crc kubenswrapper[4735]: I1008 15:02:37.468890 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2lr2_3ad5f24f-e316-420c-a602-3272e3d8e0d1/util/0.log" Oct 08 15:02:37 crc kubenswrapper[4735]: I1008 15:02:37.613289 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2lr2_3ad5f24f-e316-420c-a602-3272e3d8e0d1/util/0.log" Oct 08 15:02:37 crc kubenswrapper[4735]: I1008 15:02:37.643548 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2lr2_3ad5f24f-e316-420c-a602-3272e3d8e0d1/pull/0.log" Oct 08 15:02:37 crc kubenswrapper[4735]: I1008 15:02:37.668572 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2lr2_3ad5f24f-e316-420c-a602-3272e3d8e0d1/pull/0.log" Oct 08 15:02:37 crc kubenswrapper[4735]: I1008 15:02:37.826473 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2lr2_3ad5f24f-e316-420c-a602-3272e3d8e0d1/util/0.log" Oct 08 15:02:37 crc kubenswrapper[4735]: I1008 15:02:37.840514 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2lr2_3ad5f24f-e316-420c-a602-3272e3d8e0d1/pull/0.log" Oct 08 15:02:37 crc kubenswrapper[4735]: I1008 15:02:37.842903 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2lr2_3ad5f24f-e316-420c-a602-3272e3d8e0d1/extract/0.log" Oct 08 15:02:38 crc kubenswrapper[4735]: I1008 15:02:38.023747 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-wc7hn_497e6679-c95b-41a8-bd50-cb845a1add47/extract-utilities/0.log" Oct 08 15:02:38 crc kubenswrapper[4735]: I1008 15:02:38.211975 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wc7hn_497e6679-c95b-41a8-bd50-cb845a1add47/extract-content/0.log" Oct 08 15:02:38 crc kubenswrapper[4735]: I1008 15:02:38.241619 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wc7hn_497e6679-c95b-41a8-bd50-cb845a1add47/extract-utilities/0.log" Oct 08 15:02:38 crc kubenswrapper[4735]: I1008 15:02:38.268195 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wc7hn_497e6679-c95b-41a8-bd50-cb845a1add47/extract-content/0.log" Oct 08 15:02:38 crc kubenswrapper[4735]: I1008 15:02:38.424918 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wc7hn_497e6679-c95b-41a8-bd50-cb845a1add47/extract-utilities/0.log" Oct 08 15:02:38 crc kubenswrapper[4735]: I1008 15:02:38.478247 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wc7hn_497e6679-c95b-41a8-bd50-cb845a1add47/extract-content/0.log" Oct 08 15:02:38 crc kubenswrapper[4735]: I1008 15:02:38.640117 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rrx2j_fb750a1e-612a-4b33-b368-33c4a44c12e5/extract-utilities/0.log" Oct 08 15:02:38 crc kubenswrapper[4735]: I1008 15:02:38.848367 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rrx2j_fb750a1e-612a-4b33-b368-33c4a44c12e5/extract-content/0.log" Oct 08 15:02:38 crc kubenswrapper[4735]: I1008 15:02:38.916303 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-wc7hn_497e6679-c95b-41a8-bd50-cb845a1add47/registry-server/0.log" Oct 08 15:02:38 crc kubenswrapper[4735]: I1008 15:02:38.917796 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rrx2j_fb750a1e-612a-4b33-b368-33c4a44c12e5/extract-utilities/0.log" Oct 08 15:02:38 crc kubenswrapper[4735]: I1008 15:02:38.923321 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rrx2j_fb750a1e-612a-4b33-b368-33c4a44c12e5/extract-content/0.log" Oct 08 15:02:39 crc kubenswrapper[4735]: I1008 15:02:39.106025 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rrx2j_fb750a1e-612a-4b33-b368-33c4a44c12e5/extract-content/0.log" Oct 08 15:02:39 crc kubenswrapper[4735]: I1008 15:02:39.111287 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rrx2j_fb750a1e-612a-4b33-b368-33c4a44c12e5/extract-utilities/0.log" Oct 08 15:02:39 crc kubenswrapper[4735]: I1008 15:02:39.394015 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cz59kl_5084e332-dc3f-46c3-8e9b-9c2c7795611a/util/0.log" Oct 08 15:02:39 crc kubenswrapper[4735]: I1008 15:02:39.605107 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cz59kl_5084e332-dc3f-46c3-8e9b-9c2c7795611a/pull/0.log" Oct 08 15:02:39 crc kubenswrapper[4735]: I1008 15:02:39.633550 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cz59kl_5084e332-dc3f-46c3-8e9b-9c2c7795611a/util/0.log" Oct 08 15:02:39 crc kubenswrapper[4735]: I1008 15:02:39.639054 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cz59kl_5084e332-dc3f-46c3-8e9b-9c2c7795611a/pull/0.log" Oct 08 15:02:39 crc kubenswrapper[4735]: I1008 15:02:39.856893 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cz59kl_5084e332-dc3f-46c3-8e9b-9c2c7795611a/util/0.log" Oct 08 15:02:39 crc kubenswrapper[4735]: I1008 15:02:39.887354 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cz59kl_5084e332-dc3f-46c3-8e9b-9c2c7795611a/pull/0.log" Oct 08 15:02:39 crc kubenswrapper[4735]: I1008 15:02:39.973008 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cz59kl_5084e332-dc3f-46c3-8e9b-9c2c7795611a/extract/0.log" Oct 08 15:02:40 crc kubenswrapper[4735]: I1008 15:02:40.160514 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-cvnfn_7932dd2c-9c74-4c9b-ab33-728a0169a580/marketplace-operator/0.log" Oct 08 15:02:40 crc kubenswrapper[4735]: I1008 15:02:40.247139 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kpvmg_eaa99f83-97af-4aa0-b05a-a65a86a75cb6/extract-utilities/0.log" Oct 08 15:02:40 crc kubenswrapper[4735]: I1008 15:02:40.469924 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kpvmg_eaa99f83-97af-4aa0-b05a-a65a86a75cb6/extract-utilities/0.log" Oct 08 15:02:40 crc kubenswrapper[4735]: I1008 15:02:40.493872 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kpvmg_eaa99f83-97af-4aa0-b05a-a65a86a75cb6/extract-content/0.log" Oct 08 15:02:40 crc kubenswrapper[4735]: I1008 15:02:40.507271 4735 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kpvmg_eaa99f83-97af-4aa0-b05a-a65a86a75cb6/extract-content/0.log" Oct 08 15:02:40 crc kubenswrapper[4735]: I1008 15:02:40.536654 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rrx2j_fb750a1e-612a-4b33-b368-33c4a44c12e5/registry-server/0.log" Oct 08 15:02:40 crc kubenswrapper[4735]: I1008 15:02:40.638594 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kpvmg_eaa99f83-97af-4aa0-b05a-a65a86a75cb6/extract-utilities/0.log" Oct 08 15:02:40 crc kubenswrapper[4735]: I1008 15:02:40.660626 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kpvmg_eaa99f83-97af-4aa0-b05a-a65a86a75cb6/extract-content/0.log" Oct 08 15:02:40 crc kubenswrapper[4735]: I1008 15:02:40.724416 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4krmc_911cfef2-58be-47e0-bd73-c87f31296d38/extract-utilities/0.log" Oct 08 15:02:40 crc kubenswrapper[4735]: I1008 15:02:40.935973 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4krmc_911cfef2-58be-47e0-bd73-c87f31296d38/extract-content/0.log" Oct 08 15:02:40 crc kubenswrapper[4735]: I1008 15:02:40.982282 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4krmc_911cfef2-58be-47e0-bd73-c87f31296d38/extract-utilities/0.log" Oct 08 15:02:40 crc kubenswrapper[4735]: I1008 15:02:40.983583 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kpvmg_eaa99f83-97af-4aa0-b05a-a65a86a75cb6/registry-server/0.log" Oct 08 15:02:40 crc kubenswrapper[4735]: I1008 15:02:40.984367 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-4krmc_911cfef2-58be-47e0-bd73-c87f31296d38/extract-content/0.log" Oct 08 15:02:41 crc kubenswrapper[4735]: I1008 15:02:41.156758 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4krmc_911cfef2-58be-47e0-bd73-c87f31296d38/extract-content/0.log" Oct 08 15:02:41 crc kubenswrapper[4735]: I1008 15:02:41.196208 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4krmc_911cfef2-58be-47e0-bd73-c87f31296d38/extract-utilities/0.log" Oct 08 15:02:41 crc kubenswrapper[4735]: I1008 15:02:41.630817 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4krmc_911cfef2-58be-47e0-bd73-c87f31296d38/registry-server/0.log" Oct 08 15:02:42 crc kubenswrapper[4735]: I1008 15:02:42.458247 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dcfnp"] Oct 08 15:02:42 crc kubenswrapper[4735]: E1008 15:02:42.458965 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49051cf9-ccce-4f3a-8629-4cd13d975acc" containerName="registry-server" Oct 08 15:02:42 crc kubenswrapper[4735]: I1008 15:02:42.458987 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="49051cf9-ccce-4f3a-8629-4cd13d975acc" containerName="registry-server" Oct 08 15:02:42 crc kubenswrapper[4735]: E1008 15:02:42.459008 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49051cf9-ccce-4f3a-8629-4cd13d975acc" containerName="extract-utilities" Oct 08 15:02:42 crc kubenswrapper[4735]: I1008 15:02:42.459018 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="49051cf9-ccce-4f3a-8629-4cd13d975acc" containerName="extract-utilities" Oct 08 15:02:42 crc kubenswrapper[4735]: E1008 15:02:42.459073 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49051cf9-ccce-4f3a-8629-4cd13d975acc" containerName="extract-content" Oct 
08 15:02:42 crc kubenswrapper[4735]: I1008 15:02:42.459087 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="49051cf9-ccce-4f3a-8629-4cd13d975acc" containerName="extract-content" Oct 08 15:02:42 crc kubenswrapper[4735]: I1008 15:02:42.459378 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="49051cf9-ccce-4f3a-8629-4cd13d975acc" containerName="registry-server" Oct 08 15:02:42 crc kubenswrapper[4735]: I1008 15:02:42.461436 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dcfnp" Oct 08 15:02:42 crc kubenswrapper[4735]: I1008 15:02:42.468493 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dcfnp"] Oct 08 15:02:42 crc kubenswrapper[4735]: I1008 15:02:42.543852 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjh82\" (UniqueName: \"kubernetes.io/projected/f7fc2961-b983-4059-82ce-57957a9e9c3a-kube-api-access-bjh82\") pod \"certified-operators-dcfnp\" (UID: \"f7fc2961-b983-4059-82ce-57957a9e9c3a\") " pod="openshift-marketplace/certified-operators-dcfnp" Oct 08 15:02:42 crc kubenswrapper[4735]: I1008 15:02:42.544589 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7fc2961-b983-4059-82ce-57957a9e9c3a-utilities\") pod \"certified-operators-dcfnp\" (UID: \"f7fc2961-b983-4059-82ce-57957a9e9c3a\") " pod="openshift-marketplace/certified-operators-dcfnp" Oct 08 15:02:42 crc kubenswrapper[4735]: I1008 15:02:42.545044 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7fc2961-b983-4059-82ce-57957a9e9c3a-catalog-content\") pod \"certified-operators-dcfnp\" (UID: \"f7fc2961-b983-4059-82ce-57957a9e9c3a\") " 
pod="openshift-marketplace/certified-operators-dcfnp" Oct 08 15:02:42 crc kubenswrapper[4735]: I1008 15:02:42.647191 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7fc2961-b983-4059-82ce-57957a9e9c3a-catalog-content\") pod \"certified-operators-dcfnp\" (UID: \"f7fc2961-b983-4059-82ce-57957a9e9c3a\") " pod="openshift-marketplace/certified-operators-dcfnp" Oct 08 15:02:42 crc kubenswrapper[4735]: I1008 15:02:42.647318 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjh82\" (UniqueName: \"kubernetes.io/projected/f7fc2961-b983-4059-82ce-57957a9e9c3a-kube-api-access-bjh82\") pod \"certified-operators-dcfnp\" (UID: \"f7fc2961-b983-4059-82ce-57957a9e9c3a\") " pod="openshift-marketplace/certified-operators-dcfnp" Oct 08 15:02:42 crc kubenswrapper[4735]: I1008 15:02:42.647423 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7fc2961-b983-4059-82ce-57957a9e9c3a-utilities\") pod \"certified-operators-dcfnp\" (UID: \"f7fc2961-b983-4059-82ce-57957a9e9c3a\") " pod="openshift-marketplace/certified-operators-dcfnp" Oct 08 15:02:42 crc kubenswrapper[4735]: I1008 15:02:42.647986 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7fc2961-b983-4059-82ce-57957a9e9c3a-catalog-content\") pod \"certified-operators-dcfnp\" (UID: \"f7fc2961-b983-4059-82ce-57957a9e9c3a\") " pod="openshift-marketplace/certified-operators-dcfnp" Oct 08 15:02:42 crc kubenswrapper[4735]: I1008 15:02:42.648057 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7fc2961-b983-4059-82ce-57957a9e9c3a-utilities\") pod \"certified-operators-dcfnp\" (UID: \"f7fc2961-b983-4059-82ce-57957a9e9c3a\") " 
pod="openshift-marketplace/certified-operators-dcfnp" Oct 08 15:02:42 crc kubenswrapper[4735]: I1008 15:02:42.668266 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjh82\" (UniqueName: \"kubernetes.io/projected/f7fc2961-b983-4059-82ce-57957a9e9c3a-kube-api-access-bjh82\") pod \"certified-operators-dcfnp\" (UID: \"f7fc2961-b983-4059-82ce-57957a9e9c3a\") " pod="openshift-marketplace/certified-operators-dcfnp" Oct 08 15:02:42 crc kubenswrapper[4735]: I1008 15:02:42.797870 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dcfnp" Oct 08 15:02:43 crc kubenswrapper[4735]: I1008 15:02:43.340470 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dcfnp"] Oct 08 15:02:43 crc kubenswrapper[4735]: I1008 15:02:43.881385 4735 generic.go:334] "Generic (PLEG): container finished" podID="f7fc2961-b983-4059-82ce-57957a9e9c3a" containerID="04e0773f7510926836f11d56cc749f48de869e1d8287f11f1239adca1aa45ebe" exitCode=0 Oct 08 15:02:43 crc kubenswrapper[4735]: I1008 15:02:43.881514 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dcfnp" event={"ID":"f7fc2961-b983-4059-82ce-57957a9e9c3a","Type":"ContainerDied","Data":"04e0773f7510926836f11d56cc749f48de869e1d8287f11f1239adca1aa45ebe"} Oct 08 15:02:43 crc kubenswrapper[4735]: I1008 15:02:43.881721 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dcfnp" event={"ID":"f7fc2961-b983-4059-82ce-57957a9e9c3a","Type":"ContainerStarted","Data":"05a1f26aa6dd306eaceeec7aeb05054859f95c74919e8b478891d2af3137f23a"} Oct 08 15:02:44 crc kubenswrapper[4735]: I1008 15:02:44.907262 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dcfnp" 
event={"ID":"f7fc2961-b983-4059-82ce-57957a9e9c3a","Type":"ContainerStarted","Data":"66c0c435a2afcd9e48a7e636278d2a64abe19ceeb7f29ace0e66c12b111c7304"} Oct 08 15:02:46 crc kubenswrapper[4735]: I1008 15:02:46.914728 4735 generic.go:334] "Generic (PLEG): container finished" podID="f7fc2961-b983-4059-82ce-57957a9e9c3a" containerID="66c0c435a2afcd9e48a7e636278d2a64abe19ceeb7f29ace0e66c12b111c7304" exitCode=0 Oct 08 15:02:46 crc kubenswrapper[4735]: I1008 15:02:46.914817 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dcfnp" event={"ID":"f7fc2961-b983-4059-82ce-57957a9e9c3a","Type":"ContainerDied","Data":"66c0c435a2afcd9e48a7e636278d2a64abe19ceeb7f29ace0e66c12b111c7304"} Oct 08 15:02:47 crc kubenswrapper[4735]: I1008 15:02:47.894803 4735 scope.go:117] "RemoveContainer" containerID="584da6e7d0cb04e94bd7ca6ddf379929301c09a4661e8cd1fc0a739f5f2554cb" Oct 08 15:02:47 crc kubenswrapper[4735]: E1008 15:02:47.895873 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 15:02:47 crc kubenswrapper[4735]: I1008 15:02:47.927786 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dcfnp" event={"ID":"f7fc2961-b983-4059-82ce-57957a9e9c3a","Type":"ContainerStarted","Data":"a1e039573d0917a3f450a88c755de5819b9c2149b5a11abbf23d398330dfb56f"} Oct 08 15:02:47 crc kubenswrapper[4735]: I1008 15:02:47.949151 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dcfnp" podStartSLOduration=2.52063636 podStartE2EDuration="5.949132576s" 
podCreationTimestamp="2025-10-08 15:02:42 +0000 UTC" firstStartedPulling="2025-10-08 15:02:43.883132301 +0000 UTC m=+7613.691436845" lastFinishedPulling="2025-10-08 15:02:47.311628517 +0000 UTC m=+7617.119933061" observedRunningTime="2025-10-08 15:02:47.943260358 +0000 UTC m=+7617.751564922" watchObservedRunningTime="2025-10-08 15:02:47.949132576 +0000 UTC m=+7617.757437120" Oct 08 15:02:52 crc kubenswrapper[4735]: I1008 15:02:52.610688 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-7c8cf85677-6sklc_df4c8416-1970-48fb-a03f-61fc1addc8f7/prometheus-operator/0.log" Oct 08 15:02:52 crc kubenswrapper[4735]: I1008 15:02:52.772951 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-5cb54566d4-jbjnj_41a64cf2-ae40-4e4b-a8d3-6e18f9207a16/prometheus-operator-admission-webhook/0.log" Oct 08 15:02:52 crc kubenswrapper[4735]: I1008 15:02:52.797995 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dcfnp" Oct 08 15:02:52 crc kubenswrapper[4735]: I1008 15:02:52.798122 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dcfnp" Oct 08 15:02:52 crc kubenswrapper[4735]: I1008 15:02:52.806915 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-5cb54566d4-7djt5_daa71edb-4564-4793-bfc0-8499e44e2b37/prometheus-operator-admission-webhook/0.log" Oct 08 15:02:52 crc kubenswrapper[4735]: I1008 15:02:52.846653 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dcfnp" Oct 08 15:02:52 crc kubenswrapper[4735]: I1008 15:02:52.984934 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_observability-operator-cc5f78dfc-68pfv_ec3f9923-3121-4608-8145-ba70fad25a5e/operator/0.log" Oct 08 15:02:53 crc kubenswrapper[4735]: I1008 15:02:53.010291 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-54bc95c9fb-kc8w9_eeb8e4fe-df75-46aa-9f6a-6760a53df37b/perses-operator/0.log" Oct 08 15:02:53 crc kubenswrapper[4735]: I1008 15:02:53.040807 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dcfnp" Oct 08 15:02:53 crc kubenswrapper[4735]: I1008 15:02:53.089985 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dcfnp"] Oct 08 15:02:55 crc kubenswrapper[4735]: I1008 15:02:55.009703 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dcfnp" podUID="f7fc2961-b983-4059-82ce-57957a9e9c3a" containerName="registry-server" containerID="cri-o://a1e039573d0917a3f450a88c755de5819b9c2149b5a11abbf23d398330dfb56f" gracePeriod=2 Oct 08 15:02:55 crc kubenswrapper[4735]: I1008 15:02:55.542327 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dcfnp" Oct 08 15:02:55 crc kubenswrapper[4735]: I1008 15:02:55.726690 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7fc2961-b983-4059-82ce-57957a9e9c3a-utilities\") pod \"f7fc2961-b983-4059-82ce-57957a9e9c3a\" (UID: \"f7fc2961-b983-4059-82ce-57957a9e9c3a\") " Oct 08 15:02:55 crc kubenswrapper[4735]: I1008 15:02:55.727738 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7fc2961-b983-4059-82ce-57957a9e9c3a-utilities" (OuterVolumeSpecName: "utilities") pod "f7fc2961-b983-4059-82ce-57957a9e9c3a" (UID: "f7fc2961-b983-4059-82ce-57957a9e9c3a"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 15:02:55 crc kubenswrapper[4735]: I1008 15:02:55.728227 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7fc2961-b983-4059-82ce-57957a9e9c3a-catalog-content\") pod \"f7fc2961-b983-4059-82ce-57957a9e9c3a\" (UID: \"f7fc2961-b983-4059-82ce-57957a9e9c3a\") " Oct 08 15:02:55 crc kubenswrapper[4735]: I1008 15:02:55.729691 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjh82\" (UniqueName: \"kubernetes.io/projected/f7fc2961-b983-4059-82ce-57957a9e9c3a-kube-api-access-bjh82\") pod \"f7fc2961-b983-4059-82ce-57957a9e9c3a\" (UID: \"f7fc2961-b983-4059-82ce-57957a9e9c3a\") " Oct 08 15:02:55 crc kubenswrapper[4735]: I1008 15:02:55.730266 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7fc2961-b983-4059-82ce-57957a9e9c3a-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 15:02:55 crc kubenswrapper[4735]: I1008 15:02:55.735469 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7fc2961-b983-4059-82ce-57957a9e9c3a-kube-api-access-bjh82" (OuterVolumeSpecName: "kube-api-access-bjh82") pod "f7fc2961-b983-4059-82ce-57957a9e9c3a" (UID: "f7fc2961-b983-4059-82ce-57957a9e9c3a"). InnerVolumeSpecName "kube-api-access-bjh82". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 15:02:55 crc kubenswrapper[4735]: I1008 15:02:55.832134 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjh82\" (UniqueName: \"kubernetes.io/projected/f7fc2961-b983-4059-82ce-57957a9e9c3a-kube-api-access-bjh82\") on node \"crc\" DevicePath \"\"" Oct 08 15:02:55 crc kubenswrapper[4735]: I1008 15:02:55.891889 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7fc2961-b983-4059-82ce-57957a9e9c3a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f7fc2961-b983-4059-82ce-57957a9e9c3a" (UID: "f7fc2961-b983-4059-82ce-57957a9e9c3a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 15:02:55 crc kubenswrapper[4735]: I1008 15:02:55.934274 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7fc2961-b983-4059-82ce-57957a9e9c3a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 15:02:56 crc kubenswrapper[4735]: I1008 15:02:56.023310 4735 generic.go:334] "Generic (PLEG): container finished" podID="f7fc2961-b983-4059-82ce-57957a9e9c3a" containerID="a1e039573d0917a3f450a88c755de5819b9c2149b5a11abbf23d398330dfb56f" exitCode=0 Oct 08 15:02:56 crc kubenswrapper[4735]: I1008 15:02:56.023364 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dcfnp" Oct 08 15:02:56 crc kubenswrapper[4735]: I1008 15:02:56.023389 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dcfnp" event={"ID":"f7fc2961-b983-4059-82ce-57957a9e9c3a","Type":"ContainerDied","Data":"a1e039573d0917a3f450a88c755de5819b9c2149b5a11abbf23d398330dfb56f"} Oct 08 15:02:56 crc kubenswrapper[4735]: I1008 15:02:56.024715 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dcfnp" event={"ID":"f7fc2961-b983-4059-82ce-57957a9e9c3a","Type":"ContainerDied","Data":"05a1f26aa6dd306eaceeec7aeb05054859f95c74919e8b478891d2af3137f23a"} Oct 08 15:02:56 crc kubenswrapper[4735]: I1008 15:02:56.024737 4735 scope.go:117] "RemoveContainer" containerID="a1e039573d0917a3f450a88c755de5819b9c2149b5a11abbf23d398330dfb56f" Oct 08 15:02:56 crc kubenswrapper[4735]: I1008 15:02:56.056896 4735 scope.go:117] "RemoveContainer" containerID="66c0c435a2afcd9e48a7e636278d2a64abe19ceeb7f29ace0e66c12b111c7304" Oct 08 15:02:56 crc kubenswrapper[4735]: I1008 15:02:56.080490 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dcfnp"] Oct 08 15:02:56 crc kubenswrapper[4735]: I1008 15:02:56.091234 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dcfnp"] Oct 08 15:02:56 crc kubenswrapper[4735]: I1008 15:02:56.097795 4735 scope.go:117] "RemoveContainer" containerID="04e0773f7510926836f11d56cc749f48de869e1d8287f11f1239adca1aa45ebe" Oct 08 15:02:56 crc kubenswrapper[4735]: I1008 15:02:56.161704 4735 scope.go:117] "RemoveContainer" containerID="a1e039573d0917a3f450a88c755de5819b9c2149b5a11abbf23d398330dfb56f" Oct 08 15:02:56 crc kubenswrapper[4735]: E1008 15:02:56.166116 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"a1e039573d0917a3f450a88c755de5819b9c2149b5a11abbf23d398330dfb56f\": container with ID starting with a1e039573d0917a3f450a88c755de5819b9c2149b5a11abbf23d398330dfb56f not found: ID does not exist" containerID="a1e039573d0917a3f450a88c755de5819b9c2149b5a11abbf23d398330dfb56f" Oct 08 15:02:56 crc kubenswrapper[4735]: I1008 15:02:56.166189 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1e039573d0917a3f450a88c755de5819b9c2149b5a11abbf23d398330dfb56f"} err="failed to get container status \"a1e039573d0917a3f450a88c755de5819b9c2149b5a11abbf23d398330dfb56f\": rpc error: code = NotFound desc = could not find container \"a1e039573d0917a3f450a88c755de5819b9c2149b5a11abbf23d398330dfb56f\": container with ID starting with a1e039573d0917a3f450a88c755de5819b9c2149b5a11abbf23d398330dfb56f not found: ID does not exist" Oct 08 15:02:56 crc kubenswrapper[4735]: I1008 15:02:56.166227 4735 scope.go:117] "RemoveContainer" containerID="66c0c435a2afcd9e48a7e636278d2a64abe19ceeb7f29ace0e66c12b111c7304" Oct 08 15:02:56 crc kubenswrapper[4735]: E1008 15:02:56.167943 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66c0c435a2afcd9e48a7e636278d2a64abe19ceeb7f29ace0e66c12b111c7304\": container with ID starting with 66c0c435a2afcd9e48a7e636278d2a64abe19ceeb7f29ace0e66c12b111c7304 not found: ID does not exist" containerID="66c0c435a2afcd9e48a7e636278d2a64abe19ceeb7f29ace0e66c12b111c7304" Oct 08 15:02:56 crc kubenswrapper[4735]: I1008 15:02:56.168003 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66c0c435a2afcd9e48a7e636278d2a64abe19ceeb7f29ace0e66c12b111c7304"} err="failed to get container status \"66c0c435a2afcd9e48a7e636278d2a64abe19ceeb7f29ace0e66c12b111c7304\": rpc error: code = NotFound desc = could not find container \"66c0c435a2afcd9e48a7e636278d2a64abe19ceeb7f29ace0e66c12b111c7304\": container with ID 
starting with 66c0c435a2afcd9e48a7e636278d2a64abe19ceeb7f29ace0e66c12b111c7304 not found: ID does not exist" Oct 08 15:02:56 crc kubenswrapper[4735]: I1008 15:02:56.168039 4735 scope.go:117] "RemoveContainer" containerID="04e0773f7510926836f11d56cc749f48de869e1d8287f11f1239adca1aa45ebe" Oct 08 15:02:56 crc kubenswrapper[4735]: E1008 15:02:56.169049 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04e0773f7510926836f11d56cc749f48de869e1d8287f11f1239adca1aa45ebe\": container with ID starting with 04e0773f7510926836f11d56cc749f48de869e1d8287f11f1239adca1aa45ebe not found: ID does not exist" containerID="04e0773f7510926836f11d56cc749f48de869e1d8287f11f1239adca1aa45ebe" Oct 08 15:02:56 crc kubenswrapper[4735]: I1008 15:02:56.169082 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04e0773f7510926836f11d56cc749f48de869e1d8287f11f1239adca1aa45ebe"} err="failed to get container status \"04e0773f7510926836f11d56cc749f48de869e1d8287f11f1239adca1aa45ebe\": rpc error: code = NotFound desc = could not find container \"04e0773f7510926836f11d56cc749f48de869e1d8287f11f1239adca1aa45ebe\": container with ID starting with 04e0773f7510926836f11d56cc749f48de869e1d8287f11f1239adca1aa45ebe not found: ID does not exist" Oct 08 15:02:56 crc kubenswrapper[4735]: I1008 15:02:56.909158 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7fc2961-b983-4059-82ce-57957a9e9c3a" path="/var/lib/kubelet/pods/f7fc2961-b983-4059-82ce-57957a9e9c3a/volumes" Oct 08 15:03:02 crc kubenswrapper[4735]: I1008 15:03:02.895086 4735 scope.go:117] "RemoveContainer" containerID="584da6e7d0cb04e94bd7ca6ddf379929301c09a4661e8cd1fc0a739f5f2554cb" Oct 08 15:03:02 crc kubenswrapper[4735]: E1008 15:03:02.896038 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 15:03:17 crc kubenswrapper[4735]: I1008 15:03:17.895371 4735 scope.go:117] "RemoveContainer" containerID="584da6e7d0cb04e94bd7ca6ddf379929301c09a4661e8cd1fc0a739f5f2554cb" Oct 08 15:03:17 crc kubenswrapper[4735]: E1008 15:03:17.896959 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 15:03:23 crc kubenswrapper[4735]: E1008 15:03:23.617328 4735 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.194:47490->38.102.83.194:36769: write tcp 38.102.83.194:47490->38.102.83.194:36769: write: broken pipe Oct 08 15:03:31 crc kubenswrapper[4735]: I1008 15:03:31.894805 4735 scope.go:117] "RemoveContainer" containerID="584da6e7d0cb04e94bd7ca6ddf379929301c09a4661e8cd1fc0a739f5f2554cb" Oct 08 15:03:31 crc kubenswrapper[4735]: E1008 15:03:31.895723 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 15:03:45 crc kubenswrapper[4735]: I1008 15:03:45.895590 4735 scope.go:117] "RemoveContainer" 
containerID="584da6e7d0cb04e94bd7ca6ddf379929301c09a4661e8cd1fc0a739f5f2554cb" Oct 08 15:03:45 crc kubenswrapper[4735]: E1008 15:03:45.896407 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 15:03:58 crc kubenswrapper[4735]: I1008 15:03:58.895375 4735 scope.go:117] "RemoveContainer" containerID="584da6e7d0cb04e94bd7ca6ddf379929301c09a4661e8cd1fc0a739f5f2554cb" Oct 08 15:03:58 crc kubenswrapper[4735]: E1008 15:03:58.896118 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 15:04:09 crc kubenswrapper[4735]: I1008 15:04:09.896377 4735 scope.go:117] "RemoveContainer" containerID="584da6e7d0cb04e94bd7ca6ddf379929301c09a4661e8cd1fc0a739f5f2554cb" Oct 08 15:04:09 crc kubenswrapper[4735]: E1008 15:04:09.897213 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 15:04:22 crc kubenswrapper[4735]: I1008 15:04:22.895390 4735 scope.go:117] 
"RemoveContainer" containerID="584da6e7d0cb04e94bd7ca6ddf379929301c09a4661e8cd1fc0a739f5f2554cb" Oct 08 15:04:22 crc kubenswrapper[4735]: E1008 15:04:22.896175 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 15:04:36 crc kubenswrapper[4735]: I1008 15:04:36.895781 4735 scope.go:117] "RemoveContainer" containerID="584da6e7d0cb04e94bd7ca6ddf379929301c09a4661e8cd1fc0a739f5f2554cb" Oct 08 15:04:36 crc kubenswrapper[4735]: E1008 15:04:36.897130 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 15:04:48 crc kubenswrapper[4735]: I1008 15:04:48.895231 4735 scope.go:117] "RemoveContainer" containerID="584da6e7d0cb04e94bd7ca6ddf379929301c09a4661e8cd1fc0a739f5f2554cb" Oct 08 15:04:48 crc kubenswrapper[4735]: E1008 15:04:48.896018 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9qwgd_openshift-machine-config-operator(2dfd0474-f64a-4d3b-9990-df182d079b10)\"" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" Oct 08 15:05:00 crc kubenswrapper[4735]: I1008 15:05:00.901942 
4735 scope.go:117] "RemoveContainer" containerID="584da6e7d0cb04e94bd7ca6ddf379929301c09a4661e8cd1fc0a739f5f2554cb" Oct 08 15:05:01 crc kubenswrapper[4735]: I1008 15:05:01.312869 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" event={"ID":"2dfd0474-f64a-4d3b-9990-df182d079b10","Type":"ContainerStarted","Data":"1b1b025f3dc38738f281bda84d37af3ba1978202b7e9407225b015c177308724"} Oct 08 15:05:13 crc kubenswrapper[4735]: I1008 15:05:13.430649 4735 generic.go:334] "Generic (PLEG): container finished" podID="e0b71b21-6e14-41d1-97c1-3dc43e232014" containerID="dd06f1c3e37921aebd80e206c6c1a9b582eebc54a08bc3eb3c329a943b964569" exitCode=0 Oct 08 15:05:13 crc kubenswrapper[4735]: I1008 15:05:13.430723 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9xzdj/must-gather-wstqn" event={"ID":"e0b71b21-6e14-41d1-97c1-3dc43e232014","Type":"ContainerDied","Data":"dd06f1c3e37921aebd80e206c6c1a9b582eebc54a08bc3eb3c329a943b964569"} Oct 08 15:05:13 crc kubenswrapper[4735]: I1008 15:05:13.431856 4735 scope.go:117] "RemoveContainer" containerID="dd06f1c3e37921aebd80e206c6c1a9b582eebc54a08bc3eb3c329a943b964569" Oct 08 15:05:14 crc kubenswrapper[4735]: I1008 15:05:14.367314 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-9xzdj_must-gather-wstqn_e0b71b21-6e14-41d1-97c1-3dc43e232014/gather/0.log" Oct 08 15:05:15 crc kubenswrapper[4735]: I1008 15:05:15.389749 4735 scope.go:117] "RemoveContainer" containerID="6b9fc99cd73cb707937434d01a89384145f8fdf14abd62ea8efe79b5853fbf62" Oct 08 15:05:22 crc kubenswrapper[4735]: I1008 15:05:22.632799 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-9xzdj/must-gather-wstqn"] Oct 08 15:05:22 crc kubenswrapper[4735]: I1008 15:05:22.633909 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-9xzdj/must-gather-wstqn" 
podUID="e0b71b21-6e14-41d1-97c1-3dc43e232014" containerName="copy" containerID="cri-o://ead185b1fdb8aefde2f94903677c7b436300060dc4e0e0df2519c50a22ff0c6b" gracePeriod=2 Oct 08 15:05:22 crc kubenswrapper[4735]: I1008 15:05:22.643524 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-9xzdj/must-gather-wstqn"] Oct 08 15:05:23 crc kubenswrapper[4735]: I1008 15:05:23.090747 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-9xzdj_must-gather-wstqn_e0b71b21-6e14-41d1-97c1-3dc43e232014/copy/0.log" Oct 08 15:05:23 crc kubenswrapper[4735]: I1008 15:05:23.091585 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9xzdj/must-gather-wstqn" Oct 08 15:05:23 crc kubenswrapper[4735]: I1008 15:05:23.222840 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e0b71b21-6e14-41d1-97c1-3dc43e232014-must-gather-output\") pod \"e0b71b21-6e14-41d1-97c1-3dc43e232014\" (UID: \"e0b71b21-6e14-41d1-97c1-3dc43e232014\") " Oct 08 15:05:23 crc kubenswrapper[4735]: I1008 15:05:23.222922 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4nbh\" (UniqueName: \"kubernetes.io/projected/e0b71b21-6e14-41d1-97c1-3dc43e232014-kube-api-access-g4nbh\") pod \"e0b71b21-6e14-41d1-97c1-3dc43e232014\" (UID: \"e0b71b21-6e14-41d1-97c1-3dc43e232014\") " Oct 08 15:05:23 crc kubenswrapper[4735]: I1008 15:05:23.228807 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0b71b21-6e14-41d1-97c1-3dc43e232014-kube-api-access-g4nbh" (OuterVolumeSpecName: "kube-api-access-g4nbh") pod "e0b71b21-6e14-41d1-97c1-3dc43e232014" (UID: "e0b71b21-6e14-41d1-97c1-3dc43e232014"). InnerVolumeSpecName "kube-api-access-g4nbh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 15:05:23 crc kubenswrapper[4735]: I1008 15:05:23.326023 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4nbh\" (UniqueName: \"kubernetes.io/projected/e0b71b21-6e14-41d1-97c1-3dc43e232014-kube-api-access-g4nbh\") on node \"crc\" DevicePath \"\"" Oct 08 15:05:23 crc kubenswrapper[4735]: I1008 15:05:23.387707 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0b71b21-6e14-41d1-97c1-3dc43e232014-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "e0b71b21-6e14-41d1-97c1-3dc43e232014" (UID: "e0b71b21-6e14-41d1-97c1-3dc43e232014"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 15:05:23 crc kubenswrapper[4735]: I1008 15:05:23.427525 4735 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e0b71b21-6e14-41d1-97c1-3dc43e232014-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 08 15:05:23 crc kubenswrapper[4735]: I1008 15:05:23.550308 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-9xzdj_must-gather-wstqn_e0b71b21-6e14-41d1-97c1-3dc43e232014/copy/0.log" Oct 08 15:05:23 crc kubenswrapper[4735]: I1008 15:05:23.550837 4735 generic.go:334] "Generic (PLEG): container finished" podID="e0b71b21-6e14-41d1-97c1-3dc43e232014" containerID="ead185b1fdb8aefde2f94903677c7b436300060dc4e0e0df2519c50a22ff0c6b" exitCode=143 Oct 08 15:05:23 crc kubenswrapper[4735]: I1008 15:05:23.550891 4735 scope.go:117] "RemoveContainer" containerID="ead185b1fdb8aefde2f94903677c7b436300060dc4e0e0df2519c50a22ff0c6b" Oct 08 15:05:23 crc kubenswrapper[4735]: I1008 15:05:23.550919 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9xzdj/must-gather-wstqn" Oct 08 15:05:23 crc kubenswrapper[4735]: I1008 15:05:23.578729 4735 scope.go:117] "RemoveContainer" containerID="dd06f1c3e37921aebd80e206c6c1a9b582eebc54a08bc3eb3c329a943b964569" Oct 08 15:05:23 crc kubenswrapper[4735]: I1008 15:05:23.646224 4735 scope.go:117] "RemoveContainer" containerID="ead185b1fdb8aefde2f94903677c7b436300060dc4e0e0df2519c50a22ff0c6b" Oct 08 15:05:23 crc kubenswrapper[4735]: E1008 15:05:23.646631 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ead185b1fdb8aefde2f94903677c7b436300060dc4e0e0df2519c50a22ff0c6b\": container with ID starting with ead185b1fdb8aefde2f94903677c7b436300060dc4e0e0df2519c50a22ff0c6b not found: ID does not exist" containerID="ead185b1fdb8aefde2f94903677c7b436300060dc4e0e0df2519c50a22ff0c6b" Oct 08 15:05:23 crc kubenswrapper[4735]: I1008 15:05:23.646661 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ead185b1fdb8aefde2f94903677c7b436300060dc4e0e0df2519c50a22ff0c6b"} err="failed to get container status \"ead185b1fdb8aefde2f94903677c7b436300060dc4e0e0df2519c50a22ff0c6b\": rpc error: code = NotFound desc = could not find container \"ead185b1fdb8aefde2f94903677c7b436300060dc4e0e0df2519c50a22ff0c6b\": container with ID starting with ead185b1fdb8aefde2f94903677c7b436300060dc4e0e0df2519c50a22ff0c6b not found: ID does not exist" Oct 08 15:05:23 crc kubenswrapper[4735]: I1008 15:05:23.646683 4735 scope.go:117] "RemoveContainer" containerID="dd06f1c3e37921aebd80e206c6c1a9b582eebc54a08bc3eb3c329a943b964569" Oct 08 15:05:23 crc kubenswrapper[4735]: E1008 15:05:23.646956 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd06f1c3e37921aebd80e206c6c1a9b582eebc54a08bc3eb3c329a943b964569\": container with ID starting with 
dd06f1c3e37921aebd80e206c6c1a9b582eebc54a08bc3eb3c329a943b964569 not found: ID does not exist" containerID="dd06f1c3e37921aebd80e206c6c1a9b582eebc54a08bc3eb3c329a943b964569" Oct 08 15:05:23 crc kubenswrapper[4735]: I1008 15:05:23.646986 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd06f1c3e37921aebd80e206c6c1a9b582eebc54a08bc3eb3c329a943b964569"} err="failed to get container status \"dd06f1c3e37921aebd80e206c6c1a9b582eebc54a08bc3eb3c329a943b964569\": rpc error: code = NotFound desc = could not find container \"dd06f1c3e37921aebd80e206c6c1a9b582eebc54a08bc3eb3c329a943b964569\": container with ID starting with dd06f1c3e37921aebd80e206c6c1a9b582eebc54a08bc3eb3c329a943b964569 not found: ID does not exist" Oct 08 15:05:24 crc kubenswrapper[4735]: I1008 15:05:24.910578 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0b71b21-6e14-41d1-97c1-3dc43e232014" path="/var/lib/kubelet/pods/e0b71b21-6e14-41d1-97c1-3dc43e232014/volumes" Oct 08 15:05:44 crc kubenswrapper[4735]: I1008 15:05:44.852008 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8mxz5"] Oct 08 15:05:44 crc kubenswrapper[4735]: E1008 15:05:44.852899 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7fc2961-b983-4059-82ce-57957a9e9c3a" containerName="registry-server" Oct 08 15:05:44 crc kubenswrapper[4735]: I1008 15:05:44.852912 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7fc2961-b983-4059-82ce-57957a9e9c3a" containerName="registry-server" Oct 08 15:05:44 crc kubenswrapper[4735]: E1008 15:05:44.852928 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0b71b21-6e14-41d1-97c1-3dc43e232014" containerName="copy" Oct 08 15:05:44 crc kubenswrapper[4735]: I1008 15:05:44.852933 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0b71b21-6e14-41d1-97c1-3dc43e232014" containerName="copy" Oct 08 15:05:44 crc kubenswrapper[4735]: 
E1008 15:05:44.852941 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7fc2961-b983-4059-82ce-57957a9e9c3a" containerName="extract-content" Oct 08 15:05:44 crc kubenswrapper[4735]: I1008 15:05:44.852949 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7fc2961-b983-4059-82ce-57957a9e9c3a" containerName="extract-content" Oct 08 15:05:44 crc kubenswrapper[4735]: E1008 15:05:44.852957 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7fc2961-b983-4059-82ce-57957a9e9c3a" containerName="extract-utilities" Oct 08 15:05:44 crc kubenswrapper[4735]: I1008 15:05:44.852963 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7fc2961-b983-4059-82ce-57957a9e9c3a" containerName="extract-utilities" Oct 08 15:05:44 crc kubenswrapper[4735]: E1008 15:05:44.853006 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0b71b21-6e14-41d1-97c1-3dc43e232014" containerName="gather" Oct 08 15:05:44 crc kubenswrapper[4735]: I1008 15:05:44.853012 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0b71b21-6e14-41d1-97c1-3dc43e232014" containerName="gather" Oct 08 15:05:44 crc kubenswrapper[4735]: I1008 15:05:44.853200 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0b71b21-6e14-41d1-97c1-3dc43e232014" containerName="gather" Oct 08 15:05:44 crc kubenswrapper[4735]: I1008 15:05:44.853238 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0b71b21-6e14-41d1-97c1-3dc43e232014" containerName="copy" Oct 08 15:05:44 crc kubenswrapper[4735]: I1008 15:05:44.853258 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7fc2961-b983-4059-82ce-57957a9e9c3a" containerName="registry-server" Oct 08 15:05:44 crc kubenswrapper[4735]: I1008 15:05:44.855390 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8mxz5" Oct 08 15:05:44 crc kubenswrapper[4735]: I1008 15:05:44.865232 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eabf4616-2e6c-412c-b059-329195b10ee7-catalog-content\") pod \"community-operators-8mxz5\" (UID: \"eabf4616-2e6c-412c-b059-329195b10ee7\") " pod="openshift-marketplace/community-operators-8mxz5" Oct 08 15:05:44 crc kubenswrapper[4735]: I1008 15:05:44.865420 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpv9h\" (UniqueName: \"kubernetes.io/projected/eabf4616-2e6c-412c-b059-329195b10ee7-kube-api-access-jpv9h\") pod \"community-operators-8mxz5\" (UID: \"eabf4616-2e6c-412c-b059-329195b10ee7\") " pod="openshift-marketplace/community-operators-8mxz5" Oct 08 15:05:44 crc kubenswrapper[4735]: I1008 15:05:44.865622 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eabf4616-2e6c-412c-b059-329195b10ee7-utilities\") pod \"community-operators-8mxz5\" (UID: \"eabf4616-2e6c-412c-b059-329195b10ee7\") " pod="openshift-marketplace/community-operators-8mxz5" Oct 08 15:05:44 crc kubenswrapper[4735]: I1008 15:05:44.873818 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8mxz5"] Oct 08 15:05:44 crc kubenswrapper[4735]: I1008 15:05:44.970547 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eabf4616-2e6c-412c-b059-329195b10ee7-catalog-content\") pod \"community-operators-8mxz5\" (UID: \"eabf4616-2e6c-412c-b059-329195b10ee7\") " pod="openshift-marketplace/community-operators-8mxz5" Oct 08 15:05:44 crc kubenswrapper[4735]: I1008 15:05:44.970942 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-jpv9h\" (UniqueName: \"kubernetes.io/projected/eabf4616-2e6c-412c-b059-329195b10ee7-kube-api-access-jpv9h\") pod \"community-operators-8mxz5\" (UID: \"eabf4616-2e6c-412c-b059-329195b10ee7\") " pod="openshift-marketplace/community-operators-8mxz5" Oct 08 15:05:44 crc kubenswrapper[4735]: I1008 15:05:44.971010 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eabf4616-2e6c-412c-b059-329195b10ee7-utilities\") pod \"community-operators-8mxz5\" (UID: \"eabf4616-2e6c-412c-b059-329195b10ee7\") " pod="openshift-marketplace/community-operators-8mxz5" Oct 08 15:05:44 crc kubenswrapper[4735]: I1008 15:05:44.971130 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eabf4616-2e6c-412c-b059-329195b10ee7-catalog-content\") pod \"community-operators-8mxz5\" (UID: \"eabf4616-2e6c-412c-b059-329195b10ee7\") " pod="openshift-marketplace/community-operators-8mxz5" Oct 08 15:05:44 crc kubenswrapper[4735]: I1008 15:05:44.971412 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eabf4616-2e6c-412c-b059-329195b10ee7-utilities\") pod \"community-operators-8mxz5\" (UID: \"eabf4616-2e6c-412c-b059-329195b10ee7\") " pod="openshift-marketplace/community-operators-8mxz5" Oct 08 15:05:44 crc kubenswrapper[4735]: I1008 15:05:44.991241 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpv9h\" (UniqueName: \"kubernetes.io/projected/eabf4616-2e6c-412c-b059-329195b10ee7-kube-api-access-jpv9h\") pod \"community-operators-8mxz5\" (UID: \"eabf4616-2e6c-412c-b059-329195b10ee7\") " pod="openshift-marketplace/community-operators-8mxz5" Oct 08 15:05:45 crc kubenswrapper[4735]: I1008 15:05:45.180563 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8mxz5" Oct 08 15:05:45 crc kubenswrapper[4735]: I1008 15:05:45.800800 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8mxz5"] Oct 08 15:05:46 crc kubenswrapper[4735]: I1008 15:05:46.779627 4735 generic.go:334] "Generic (PLEG): container finished" podID="eabf4616-2e6c-412c-b059-329195b10ee7" containerID="b32d42931dfac2be87f2f90600b1c76dff6ccb120788f1968cb3fe2d0da43c49" exitCode=0 Oct 08 15:05:46 crc kubenswrapper[4735]: I1008 15:05:46.779849 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8mxz5" event={"ID":"eabf4616-2e6c-412c-b059-329195b10ee7","Type":"ContainerDied","Data":"b32d42931dfac2be87f2f90600b1c76dff6ccb120788f1968cb3fe2d0da43c49"} Oct 08 15:05:46 crc kubenswrapper[4735]: I1008 15:05:46.780231 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8mxz5" event={"ID":"eabf4616-2e6c-412c-b059-329195b10ee7","Type":"ContainerStarted","Data":"f8c4371fed6665b461aab499698a3903d03b27830b7d037c3743c7ff798065dd"} Oct 08 15:05:47 crc kubenswrapper[4735]: I1008 15:05:47.795241 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8mxz5" event={"ID":"eabf4616-2e6c-412c-b059-329195b10ee7","Type":"ContainerStarted","Data":"0e054024fa3525e4ff27ecbbc2e2f2e1215547b01562201171ea9a520836cc78"} Oct 08 15:05:48 crc kubenswrapper[4735]: I1008 15:05:48.805784 4735 generic.go:334] "Generic (PLEG): container finished" podID="eabf4616-2e6c-412c-b059-329195b10ee7" containerID="0e054024fa3525e4ff27ecbbc2e2f2e1215547b01562201171ea9a520836cc78" exitCode=0 Oct 08 15:05:48 crc kubenswrapper[4735]: I1008 15:05:48.805862 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8mxz5" 
event={"ID":"eabf4616-2e6c-412c-b059-329195b10ee7","Type":"ContainerDied","Data":"0e054024fa3525e4ff27ecbbc2e2f2e1215547b01562201171ea9a520836cc78"} Oct 08 15:05:49 crc kubenswrapper[4735]: I1008 15:05:49.822279 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8mxz5" event={"ID":"eabf4616-2e6c-412c-b059-329195b10ee7","Type":"ContainerStarted","Data":"6a5522d5c8c1e59e630d137dca8916addeee66738201e9a2e8a29190710f105e"} Oct 08 15:05:49 crc kubenswrapper[4735]: I1008 15:05:49.846891 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8mxz5" podStartSLOduration=3.30215786 podStartE2EDuration="5.846866672s" podCreationTimestamp="2025-10-08 15:05:44 +0000 UTC" firstStartedPulling="2025-10-08 15:05:46.78319727 +0000 UTC m=+7796.591501814" lastFinishedPulling="2025-10-08 15:05:49.327906082 +0000 UTC m=+7799.136210626" observedRunningTime="2025-10-08 15:05:49.841972168 +0000 UTC m=+7799.650276712" watchObservedRunningTime="2025-10-08 15:05:49.846866672 +0000 UTC m=+7799.655171216" Oct 08 15:05:55 crc kubenswrapper[4735]: I1008 15:05:55.181432 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8mxz5" Oct 08 15:05:55 crc kubenswrapper[4735]: I1008 15:05:55.181990 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8mxz5" Oct 08 15:05:55 crc kubenswrapper[4735]: I1008 15:05:55.232293 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8mxz5" Oct 08 15:05:55 crc kubenswrapper[4735]: I1008 15:05:55.922505 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8mxz5" Oct 08 15:05:55 crc kubenswrapper[4735]: I1008 15:05:55.996201 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-8mxz5"] Oct 08 15:05:57 crc kubenswrapper[4735]: I1008 15:05:57.888215 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8mxz5" podUID="eabf4616-2e6c-412c-b059-329195b10ee7" containerName="registry-server" containerID="cri-o://6a5522d5c8c1e59e630d137dca8916addeee66738201e9a2e8a29190710f105e" gracePeriod=2 Oct 08 15:05:58 crc kubenswrapper[4735]: I1008 15:05:58.449570 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8mxz5" Oct 08 15:05:58 crc kubenswrapper[4735]: I1008 15:05:58.575422 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eabf4616-2e6c-412c-b059-329195b10ee7-catalog-content\") pod \"eabf4616-2e6c-412c-b059-329195b10ee7\" (UID: \"eabf4616-2e6c-412c-b059-329195b10ee7\") " Oct 08 15:05:58 crc kubenswrapper[4735]: I1008 15:05:58.575870 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eabf4616-2e6c-412c-b059-329195b10ee7-utilities\") pod \"eabf4616-2e6c-412c-b059-329195b10ee7\" (UID: \"eabf4616-2e6c-412c-b059-329195b10ee7\") " Oct 08 15:05:58 crc kubenswrapper[4735]: I1008 15:05:58.575933 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jpv9h\" (UniqueName: \"kubernetes.io/projected/eabf4616-2e6c-412c-b059-329195b10ee7-kube-api-access-jpv9h\") pod \"eabf4616-2e6c-412c-b059-329195b10ee7\" (UID: \"eabf4616-2e6c-412c-b059-329195b10ee7\") " Oct 08 15:05:58 crc kubenswrapper[4735]: I1008 15:05:58.578340 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eabf4616-2e6c-412c-b059-329195b10ee7-utilities" (OuterVolumeSpecName: "utilities") pod "eabf4616-2e6c-412c-b059-329195b10ee7" (UID: 
"eabf4616-2e6c-412c-b059-329195b10ee7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 15:05:58 crc kubenswrapper[4735]: I1008 15:05:58.583143 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eabf4616-2e6c-412c-b059-329195b10ee7-kube-api-access-jpv9h" (OuterVolumeSpecName: "kube-api-access-jpv9h") pod "eabf4616-2e6c-412c-b059-329195b10ee7" (UID: "eabf4616-2e6c-412c-b059-329195b10ee7"). InnerVolumeSpecName "kube-api-access-jpv9h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 15:05:58 crc kubenswrapper[4735]: I1008 15:05:58.678927 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eabf4616-2e6c-412c-b059-329195b10ee7-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 15:05:58 crc kubenswrapper[4735]: I1008 15:05:58.678962 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jpv9h\" (UniqueName: \"kubernetes.io/projected/eabf4616-2e6c-412c-b059-329195b10ee7-kube-api-access-jpv9h\") on node \"crc\" DevicePath \"\"" Oct 08 15:05:58 crc kubenswrapper[4735]: I1008 15:05:58.901789 4735 generic.go:334] "Generic (PLEG): container finished" podID="eabf4616-2e6c-412c-b059-329195b10ee7" containerID="6a5522d5c8c1e59e630d137dca8916addeee66738201e9a2e8a29190710f105e" exitCode=0 Oct 08 15:05:58 crc kubenswrapper[4735]: I1008 15:05:58.901876 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8mxz5" Oct 08 15:05:58 crc kubenswrapper[4735]: I1008 15:05:58.921911 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8mxz5" event={"ID":"eabf4616-2e6c-412c-b059-329195b10ee7","Type":"ContainerDied","Data":"6a5522d5c8c1e59e630d137dca8916addeee66738201e9a2e8a29190710f105e"} Oct 08 15:05:58 crc kubenswrapper[4735]: I1008 15:05:58.922017 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8mxz5" event={"ID":"eabf4616-2e6c-412c-b059-329195b10ee7","Type":"ContainerDied","Data":"f8c4371fed6665b461aab499698a3903d03b27830b7d037c3743c7ff798065dd"} Oct 08 15:05:58 crc kubenswrapper[4735]: I1008 15:05:58.922039 4735 scope.go:117] "RemoveContainer" containerID="6a5522d5c8c1e59e630d137dca8916addeee66738201e9a2e8a29190710f105e" Oct 08 15:05:58 crc kubenswrapper[4735]: I1008 15:05:58.948314 4735 scope.go:117] "RemoveContainer" containerID="0e054024fa3525e4ff27ecbbc2e2f2e1215547b01562201171ea9a520836cc78" Oct 08 15:05:58 crc kubenswrapper[4735]: I1008 15:05:58.951321 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eabf4616-2e6c-412c-b059-329195b10ee7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eabf4616-2e6c-412c-b059-329195b10ee7" (UID: "eabf4616-2e6c-412c-b059-329195b10ee7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 15:05:58 crc kubenswrapper[4735]: I1008 15:05:58.971427 4735 scope.go:117] "RemoveContainer" containerID="b32d42931dfac2be87f2f90600b1c76dff6ccb120788f1968cb3fe2d0da43c49" Oct 08 15:05:58 crc kubenswrapper[4735]: I1008 15:05:58.985460 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eabf4616-2e6c-412c-b059-329195b10ee7-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 15:05:59 crc kubenswrapper[4735]: I1008 15:05:59.021032 4735 scope.go:117] "RemoveContainer" containerID="6a5522d5c8c1e59e630d137dca8916addeee66738201e9a2e8a29190710f105e" Oct 08 15:05:59 crc kubenswrapper[4735]: E1008 15:05:59.021570 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a5522d5c8c1e59e630d137dca8916addeee66738201e9a2e8a29190710f105e\": container with ID starting with 6a5522d5c8c1e59e630d137dca8916addeee66738201e9a2e8a29190710f105e not found: ID does not exist" containerID="6a5522d5c8c1e59e630d137dca8916addeee66738201e9a2e8a29190710f105e" Oct 08 15:05:59 crc kubenswrapper[4735]: I1008 15:05:59.021611 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a5522d5c8c1e59e630d137dca8916addeee66738201e9a2e8a29190710f105e"} err="failed to get container status \"6a5522d5c8c1e59e630d137dca8916addeee66738201e9a2e8a29190710f105e\": rpc error: code = NotFound desc = could not find container \"6a5522d5c8c1e59e630d137dca8916addeee66738201e9a2e8a29190710f105e\": container with ID starting with 6a5522d5c8c1e59e630d137dca8916addeee66738201e9a2e8a29190710f105e not found: ID does not exist" Oct 08 15:05:59 crc kubenswrapper[4735]: I1008 15:05:59.021636 4735 scope.go:117] "RemoveContainer" containerID="0e054024fa3525e4ff27ecbbc2e2f2e1215547b01562201171ea9a520836cc78" Oct 08 15:05:59 crc kubenswrapper[4735]: E1008 15:05:59.021955 4735 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e054024fa3525e4ff27ecbbc2e2f2e1215547b01562201171ea9a520836cc78\": container with ID starting with 0e054024fa3525e4ff27ecbbc2e2f2e1215547b01562201171ea9a520836cc78 not found: ID does not exist" containerID="0e054024fa3525e4ff27ecbbc2e2f2e1215547b01562201171ea9a520836cc78" Oct 08 15:05:59 crc kubenswrapper[4735]: I1008 15:05:59.021990 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e054024fa3525e4ff27ecbbc2e2f2e1215547b01562201171ea9a520836cc78"} err="failed to get container status \"0e054024fa3525e4ff27ecbbc2e2f2e1215547b01562201171ea9a520836cc78\": rpc error: code = NotFound desc = could not find container \"0e054024fa3525e4ff27ecbbc2e2f2e1215547b01562201171ea9a520836cc78\": container with ID starting with 0e054024fa3525e4ff27ecbbc2e2f2e1215547b01562201171ea9a520836cc78 not found: ID does not exist" Oct 08 15:05:59 crc kubenswrapper[4735]: I1008 15:05:59.022015 4735 scope.go:117] "RemoveContainer" containerID="b32d42931dfac2be87f2f90600b1c76dff6ccb120788f1968cb3fe2d0da43c49" Oct 08 15:05:59 crc kubenswrapper[4735]: E1008 15:05:59.022385 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b32d42931dfac2be87f2f90600b1c76dff6ccb120788f1968cb3fe2d0da43c49\": container with ID starting with b32d42931dfac2be87f2f90600b1c76dff6ccb120788f1968cb3fe2d0da43c49 not found: ID does not exist" containerID="b32d42931dfac2be87f2f90600b1c76dff6ccb120788f1968cb3fe2d0da43c49" Oct 08 15:05:59 crc kubenswrapper[4735]: I1008 15:05:59.022408 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b32d42931dfac2be87f2f90600b1c76dff6ccb120788f1968cb3fe2d0da43c49"} err="failed to get container status \"b32d42931dfac2be87f2f90600b1c76dff6ccb120788f1968cb3fe2d0da43c49\": rpc error: code = NotFound desc = could 
not find container \"b32d42931dfac2be87f2f90600b1c76dff6ccb120788f1968cb3fe2d0da43c49\": container with ID starting with b32d42931dfac2be87f2f90600b1c76dff6ccb120788f1968cb3fe2d0da43c49 not found: ID does not exist" Oct 08 15:05:59 crc kubenswrapper[4735]: I1008 15:05:59.234307 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8mxz5"] Oct 08 15:05:59 crc kubenswrapper[4735]: I1008 15:05:59.248117 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8mxz5"] Oct 08 15:06:00 crc kubenswrapper[4735]: I1008 15:06:00.931858 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eabf4616-2e6c-412c-b059-329195b10ee7" path="/var/lib/kubelet/pods/eabf4616-2e6c-412c-b059-329195b10ee7/volumes" Oct 08 15:06:05 crc kubenswrapper[4735]: E1008 15:06:05.508266 4735 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/system.slice/systemd-hostnamed.service\": RecentStats: unable to find data in memory cache]" Oct 08 15:07:15 crc kubenswrapper[4735]: I1008 15:07:15.552142 4735 scope.go:117] "RemoveContainer" containerID="f1082ff8f57254f5ae1ba2188440b3f549999a8b5f5958b3380e41ec748394eb" Oct 08 15:07:24 crc kubenswrapper[4735]: I1008 15:07:24.878495 4735 patch_prober.go:28] interesting pod/machine-config-daemon-9qwgd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 15:07:24 crc kubenswrapper[4735]: I1008 15:07:24.879077 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9qwgd" podUID="2dfd0474-f64a-4d3b-9990-df182d079b10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused"